diff --git a/docs/source/guide/project_settings_lse.md b/docs/source/guide/project_settings_lse.md
index 8334a1b75664..d5308f9e0393 100644
--- a/docs/source/guide/project_settings_lse.md
+++ b/docs/source/guide/project_settings_lse.md
@@ -80,7 +80,7 @@ Select how you want to distribute tasks to annotators for labeling.
-*Only available when using Automatic task distribution.*
+*Only available when using Automatic task assignment.*
Select the order in which tasks are presented to annotators.
@@ -97,7 +97,7 @@ Select the order in which tasks are presented to annotators.
-*Only available when using Automatic task distribution.*
+*Only available when using Automatic task assignment.*
Control how long tasks are reserved for annotators.
@@ -105,9 +105,9 @@ Task reservation ensures that the tasks an annotator starts working on are tempo
!!! note
- Task reservation takes the [**Annotations per task minimum**](#overlap) into consideration. For example, if your overlap is `2`, then two annotators can reserve a task simultaneously.
+ Task reservation takes overlap ([**Quality > Annotations per task**](#overlap)) into consideration. For example, if your overlap is `2`, then two annotators can reserve a task simultaneously.
- When [**Distribute Labeling Tasks**](#distribute-tasks) is set to **Manual**, the Task Reservation setting is hidden because it does not apply.
+ When [**Task Assignment**](#distribute-tasks) is set to **Manual**, the Task Reservation setting is hidden because it does not apply.
A task reservation is put in place as soon as an annotator opens the task. The reservation remains until one of the following happens (whichever happens first):
@@ -148,7 +148,7 @@ Configure additional settings for annotators.
| Field | Description |
| ------------- | ------------ |
| **Allow empty annotations** | This determines whether annotators can submit a task without making any annotations on it. If enabled, annotators can submit a task even if they haven't added any labels or regions, resulting in an empty annotation. |
-| **Show the Data Manager to annotators** | When disabled, annotators can only enter the label stream. When enabled, annotators can access the Data Manager. The tasks that are available to them depend on the labeling distribution mode: <br><br>- Automatic distribution: Annotators can only see tasks that they have already completed or have created a draft. <br>- Manual distribution: Annotators can only see the tasks that they have been assigned. <br><br>Note that some information is still hidden from annotators and they can only view a subset of the Data Manager columns. For example, they cannot see columns such as Annotators, Agreement, Reviewers, and more. |
+| **Show Data Manager to annotators** | When disabled, annotators can only enter the label stream. When enabled, annotators can access the Data Manager. The tasks that are available to them depend on how tasks are assigned: <br><br>- Automatic task assignment: Annotators can only see tasks that they have already completed or have created a draft for. <br>- Manual task assignment: Annotators can only see the tasks that they have been assigned. <br><br>Note that some information is still hidden from annotators and they can only view a subset of the Data Manager columns. For example, they cannot see columns such as Annotators, Agreement, Reviewers, and more. |
| **Show only columns used in labeling configuration to Annotators** | (Only available if the previous setting is enabled) <br><br>Hide unused Data Manager columns from Annotators. <br><br>Unused Data Manager columns are columns that contain data that is not being used in the labeling configuration. <br><br>For example, you may include meta or system data that you want to view as part of a project, but you don't necessarily want to expose that data to Annotators. |
@@ -190,10 +190,10 @@ Select how you want to handle skipped tasks. To disallow skipping tasks, you can
If an annotator skips a task, the task is moved to the bottom of their queue. They see the task again as they reach the end of their queue.
-If the annotator exits the label stream without labeling the skipped task, and then later re-enters the label stream, whether they see the task again depends on how task distribution is set up.
+If the annotator exits the label stream without labeling the skipped task, and then later re-enters the label stream, whether they see the task again depends on how task assignments are set up.
-* Automatic distribution: Whether they see the task again depends on if other annotators have since completed the task. If the task is still incomplete when the annotator re-enters the labeling stream, they can update label and re-submit the task.
-* Manual distribution: The annotator will continue to see the skipped task until it is completed.
+* Automatic task assignment: Whether they see the task again depends on whether other annotators have since completed the task. If the task is still incomplete when the annotator re-enters the labeling stream, they can update the labels and re-submit the task.
+* Manual task assignment: The annotator will continue to see the skipped task until it is completed.
Skipped tasks are not marked as completed, and affect the Overall Project Progress calculation visible from the project Dashboard. (Meaning that the progress for a project that has skipped tasks will be less than 100%.)
@@ -206,7 +206,7 @@ Skipped tasks are not marked as completed, and affect the Overall Project Progre
-*Only applies when using Automatic task distribution.*
+*Only applies when using Automatic task assignment.*
If an annotator skips a task, the task is removed from their queue and automatically assigned to a different annotator.
@@ -227,17 +227,17 @@ Note that if you select this option before switching to Manual mode, this option
|
-How this setting works depends on your labeling distribution method.
+How this setting works depends on your task assignment method.
-* Automatic distribution: If an annotator skips a task, the task is removed from the annotator's queue.
+* Automatic task assignment: If an annotator skips a task, the task is removed from the annotator's queue.
- If task overlap (as defined in [**Annotations per task minimum**](#overlap)) is set to 1, then the skipped task is marked as Completed and is not seen again by an annotator. However, if the overlap is greater than 1, then the task is shown to other annotators until the minimum annotations are reached.
+ If task overlap (as defined in [**Quality > Annotations per task**](#overlap)) is set to 1, then the skipped task is marked as Completed and is not seen again by an annotator. However, if the overlap is greater than 1, then the task is shown to other annotators until the minimum annotations are reached.
-* Manual distribution: If the annotator skips a task, it is removed from their queue. But other annotators assigned to the task will still see it in their queue.
+* Manual task assignment: If the annotator skips a task, it is removed from their queue, but other annotators assigned to the task will still see it in their queues.
-For both distribution methods, **Ignore skipped** treats skipped tasks differently when it comes to calculating progress.
+For both assignment methods, **Ignore skipped** treats skipped tasks differently when it comes to calculating progress.
-Unlike the other skip queue options, in this case skipped tasks do not adversely affect the Overall Project Progress calculation visible from the project Dashboard. (Meaning that the progress for a project that has skipped tasks can still be 100%, assuming all tasks are otherwise completed.)
+Unlike the other skip queue options, in this case skipped tasks do not adversely affect the Overall Project Progress calculation visible from the project dashboard. (Meaning that the progress for a project that has skipped tasks can still be 100%, assuming all tasks are otherwise completed.)
|
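+To make the effect on progress concrete, here is a rough sketch of the two behaviors. It is an illustration of the documented behavior only; the exact formula Label Studio uses may differ.
+
+```python
+# Illustrative sketch only: models the documented effect of the
+# "Ignore skipped" option on overall progress. The real calculation
+# inside Label Studio may differ.
+
+def overall_progress(total: int, completed: int, skipped: int,
+                     ignore_skipped: bool) -> float:
+    """Return project progress as a percentage."""
+    if ignore_skipped:
+        # Skipped tasks are excluded, so progress can still reach 100%.
+        denominator = total - skipped
+    else:
+        # Skipped tasks count against progress, so it stays below 100%.
+        denominator = total
+    return 100.0 * completed / denominator if denominator else 100.0
+
+# 100 tasks, 95 completed, 5 skipped:
+print(overall_progress(100, 95, 5, ignore_skipped=False))  # 95.0
+print(overall_progress(100, 95, 5, ignore_skipped=True))   # 100.0
+```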
@@ -338,7 +338,7 @@ If enabled, a reviewer can only see tasks to which they've been assigned. Otherw
When enabled, a reviewer only sees tasks that have been completed by all required annotators.
-If your project is using auto distribution, then this means a reviewer only sees tasks that have met the **Annotations per task minimum** threshold.
+If your project is using auto distribution, then this means a reviewer only sees tasks that have met the **Annotations per task** threshold.
If your project is using manual distribution, then this means a reviewer only sees tasks in which all assigned annotators have submitted an annotation.
@@ -535,28 +535,69 @@ Use these settings to determine task completeness and agreement metrics.
!!! note
Overlap settings only apply when the project is using Automatic distribution mode. If you are using Manual distribution mode, all tasks must be manually assigned - meaning that you are also manually determining overlap.
-By default, each task only needs to be annotated by one annotator. If you want multiple annotators to be able to annotate tasks, increase the **Annotations per task minimum**.
+By default, each task only needs to be annotated by one annotator. If you want multiple annotators to annotate the same tasks, increase the **Annotations per task** setting.
-Us the slider below this field to indicate how the overlap should be enforced. For example, if you want all tasks to be annotated by at least 2 annotators:
-
-- Set the minimum number of annotations to **2**
-- Enforce the overlap for 100% of tasks.
-
-If you want at least half of the tasks to be annotated by at least 3 people:
-
-- Set the minimum number of annotations to **3**
-- Enforce the overlap for 50% of tasks.
-
-The following options supersede what you specified under [**Annotations > Task Sampling**](#task-sampling).
-
-| Field | Description |
-| ------------- | ------------ |
-| **Show tasks with overlap first** | If your overlap enforcement is less than 100% (meaning that only some tasks require multiple annotators), then the tasks that *do* require multiple annotations are shown first. If your overlap is 100%, then this setting has no effect. |
+| Field | Description |
+| ------------- | ------------ |
+| **Annotations per task** | The number of annotations you want to allow per task. <br><br>Note that in certain situations, this may be exceeded. For example, if there are long-standing drafts within a project or you have a very low [task reservation](#task-lock) time. |
+| **Annotations per task coverage** | Only available if **Annotations per task** is ≥ 2. <br><br>This setting controls the percentage of the project tasks for which the overlap is enforced. <br><br>For example, if you want all tasks to be annotated by at least 2 annotators: <br>- Set the number of annotations to **2** <br>- Enforce the overlap for 100% of tasks. <br><br>If you want half of the tasks to be annotated by at least 3 people: <br>- Set the number of annotations to **3** <br>- Enforce the overlap for 50% of tasks. |
+| **Show tasks with overlap first** | If your overlap enforcement is less than 100% (meaning that only some tasks require multiple annotators), then the tasks that *do* require multiple annotations are shown first. If your overlap is 100%, then this setting has no effect. <br><br>Note that if enabled, this setting supersedes what you specified under [**Annotations > Task Ordering Method**](#task-ordering). |
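+If you manage projects programmatically, the overlap settings above can also be updated through the project API. The following is a minimal sketch using Python and `requests`. The host, project ID, and token are placeholders, and the field names (`maximum_annotations`, `overlap_cohort_percentage`, `show_overlap_first`) are taken from the open-source project schema, so verify them against your version's API reference before relying on them.
+
+```python
+# Hedged sketch: update overlap settings via the project API.
+# Field names come from the open-source project schema and may differ
+# across versions; check your API reference before using.
+import requests
+
+LABEL_STUDIO_URL = "https://app.example.com"  # placeholder host
+PROJECT_ID = 1                                # placeholder project ID
+API_TOKEN = "<your-api-token>"                # placeholder token
+
+resp = requests.patch(
+    f"{LABEL_STUDIO_URL}/api/projects/{PROJECT_ID}",
+    headers={"Authorization": f"Token {API_TOKEN}"},
+    json={
+        "maximum_annotations": 2,          # Annotations per task
+        "overlap_cohort_percentage": 100,  # enforce overlap on 100% of tasks
+        "show_overlap_first": True,        # show overlapping tasks first
+    },
+)
+resp.raise_for_status()
+print(resp.json()["maximum_annotations"])
+```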
-Annotation Limit
+Tasks Per Annotator Limit
@@ -564,6 +605,8 @@ Set limits on how many tasks each individual user can annotate. This can be usef
When an annotator reaches their limit, they will see a notification telling them that they have been paused. When paused, an annotator can no longer access the project.
+When **Limit tasks per annotator** is enabled, you will see the following options:
+
| Field | Description |
| ------------- | ------------ |
| **Limit by Number of Tasks** | Set a specific number of tasks each annotator is able to complete before their progress is paused. |
@@ -579,9 +622,11 @@ To unpause annotators:
For more information about pausing annotators, including how to manually pause specific annotators, see [Pause an annotator](quality#Pause-an-annotator).
!!! note
- Pauses affect users in Annotator and Reviewer roles. So, for example, if a Reviewer is also annotating tasks and they hit the annotation limit, they will be unable to regain access to the project to review annotations unless they are unpaused.
+ Pauses are enforced for users in Annotator and Reviewer roles.
+
+ So, for example, if a Reviewer is also annotating tasks and they hit the annotation limit, they will be unable to regain access to the project to review annotations unless they are unpaused.
- Users in the Manager, Administrator, or Owner role are unaffected by pause settings.
+ Users in the Manager, Administrator, or Owner role are unaffected by the task limit.
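+
+Taken together, the limit behaves roughly like the following sketch. The helper is hypothetical and only models the documented rules; it is not Label Studio code.
+
+```python
+# Illustrative sketch of the documented pause-on-limit behavior;
+# a hypothetical helper, not the actual Label Studio implementation.
+EXEMPT_ROLES = {"Manager", "Administrator", "Owner"}
+
+def is_paused(role: str, completed_tasks: int, task_limit: int) -> bool:
+    """A user is paused once they reach the task limit,
+    unless their role is exempt from the limit."""
+    if role in EXEMPT_ROLES:
+        return False
+    return completed_tasks >= task_limit
+
+print(is_paused("Annotator", 50, 50))      # True: limit reached
+print(is_paused("Reviewer", 50, 50))       # True: pauses apply to reviewers too
+print(is_paused("Administrator", 50, 50))  # False: unaffected by the limit
+```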
@@ -610,49 +655,55 @@ When configured, this setting looks at the agreement score for the annotator whe
Use this option to determine what types of tasks annotators will see first.
-* **Ongoing** - Annotators are presented with tasks in the order that is configured under [**Task Sampling**](#task-sampling).
+* **Ongoing** - Annotators are presented with tasks in the order that is configured under [**Task Ordering Method**](#task-ordering).
Keep in mind that ongoing evaluation respects the [annotator overlap](#overlap) you set above. For example, if you set overlap to `2`, then only 2 annotators will be able to complete annotations on ground truth tasks before the task is considered complete and removed from the labeling stream for other users.
* **Onboarding** - Annotators are first presented with tasks that have a ground truth annotation. This ensures that all annotators are evaluated and that they meet your evaluation standards before progressing through the remaining project tasks.
Onboarding evaluation disregards the [annotator overlap](#overlap) for ground truth tasks. For example, if you set overlap to `2`, but you have 10 annotators, all 10 will still be able to add annotations to ground truth tasks.
-**Note:** This setting only appears when the project is configured to [automatically distribute tasks](#distribute-tasks). If you are using Manual distribution, annotators will see tasks ordered by ID number. If you would like them to see ground truth tasks first, you should add ground truth annotations in the same order.
+**Note:** This setting only appears when the project is configured to [automatically assign tasks](#distribute-tasks). If you are using manual task assignment, annotators will see tasks ordered by ID number. If you would like them to see ground truth tasks first, you should add ground truth annotations in the same order.
-**Minimum number of tasks for evaluation**
+**Pause annotator on failed evaluation**
|
-The desired ground truth score threshold will not be assessed until the annotator has completed at least the specified number of ground truth tasks.
+Determines whether annotators should be paused if they do not meet the required score set below. If they fail to meet the score, they are immediately paused and unable to access the project.
+
+If you do not enable pausing, the other **Annotator Evaluation** options are simply calculated in the background and can be reviewed in the [Members Dashboard](dashboard_members).
|
-**Desired ground truth score threshold**
+**Score required to pass evaluation**
|
-The agreement threshold the annotator must meet when evaluated against ground truth annotations.
+This is the agreement score threshold that an annotator must meet when evaluated against ground truth annotations. How agreement is calculated depends on what you select in the [**Agreement** section](#task-agreement).
+
+If they do not meet this score, they are paused.
|
-**Pause annotator on failed evaluation**
+**Number of tasks for evaluation**
|
-If, after completing the minimum number of tasks, the annotator does not meet the ground truth agreement threshold, they will be immediately paused and unable to access the project.
+This is the number of tasks a user has to complete before they can potentially be paused.
+
+For example, if you set this to `10`, even if the annotator gets every single task wrong, they will not be paused until after they have completed 10 ground truth tasks.
-If you do not enable pausing, the other **Annotator Evaluation** options are calculated in the background and can be seen in the Members Dashboard, but annotators are not paused.
+If they reach 10 tasks and meet the required score, they will continue progressing through the remaining ground truth tasks until they either fall below the score (in which case they are paused), or they finish their evaluation and continue on to the rest of the project queue.
|
@@ -663,13 +714,15 @@ You can see which users are paused from the **Members** page. To unpause a user,
For more information about pausing annotators, including how to manually pause specific annotators, see [Pause an annotator](quality#Pause-an-annotator).
!!! note
- Pauses affect users in Annotator and Reviewer roles. So, for example, if a Reviewer is also annotating tasks and they hit the annotation limit, they will be unable to regain access to the project to review annotations unless they are unpaused.
+ Pauses are enforced for users in Annotator and Reviewer roles.
+
+ So, for example, if a Reviewer is also annotating tasks and they fail to meet the required ground truth agreement score, they will be unable to regain access to the project to review annotations unless they are unpaused.
- Users in the Manager, Administrator, or Owner role are unaffected by pause settings.
+ Users in the Manager, Administrator, or Owner role are unaffected by evaluation requirements.
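+
+Putting the evaluation settings above together (the pause toggle, the required score, and the number of evaluation tasks), the pause decision behaves roughly like the sketch below. This is a hypothetical helper that models the documented rules, not the actual implementation.
+
+```python
+# Illustrative sketch of the documented evaluation flow; the real
+# agreement scoring is configured in the Agreement section.
+
+def should_pause(gt_tasks_completed: int, agreement_score: float,
+                 min_tasks: int, required_score: float,
+                 pause_on_failure: bool) -> bool:
+    """Pause an annotator only after they have completed the minimum
+    number of ground truth tasks and fall below the required score."""
+    if not pause_on_failure:
+        return False  # scores are still tracked, but nobody is paused
+    if gt_tasks_completed < min_tasks:
+        return False  # too early to evaluate, even if the score is low
+    return agreement_score < required_score
+
+# With 10 evaluation tasks required and a 70% threshold:
+print(should_pause(9, 0.0, 10, 70.0, True))     # False: fewer than 10 tasks done
+print(should_pause(10, 55.0, 10, 70.0, True))   # True: below threshold at 10 tasks
+print(should_pause(10, 55.0, 10, 70.0, False))  # False: pausing disabled
+```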
-Task Agreement
+Agreement
@@ -700,27 +753,25 @@ Select the [metric](stats#Available-agreement-metrics) that should determine tas
|
-**Low agreement strategy**
+**Assign additional annotator**
|
+Enable this option to automatically assign an additional annotator to any tasks that have a low agreement score.
-Note that to see these options, the project must be set up to [automatically distribute tasks](#distribute-tasks).
-
-You can set a low agreement strategy to ensure that a task is not marked complete until it meets 1) the required [overlap](#overlap) and 2) a minimum agreement level.
+This will ensure that the task is not marked complete until 1) it meets the required [overlap](#overlap) and 2) a minimum agreement score is achieved (this is specified below).
-* **Do nothing** - Tasks with a low agreement can be marked complete; no additional actions are taken.
-* **Assign additional annotator** - Automatically assign an additional annotator to tasks with low agreement.
+Note that to see this setting, the project must be set up with [automatic task assignments](#distribute-tasks).
|
|
-**Desired agreement threshold**
+**Agreement threshold**
|
-Enter the agreement threshold as a percentage (1-100) that a task must have before it can be considered complete.
+Enter the agreement score that a task must meet before it can be considered complete.
|
@@ -731,7 +782,7 @@ Enter the agreement threshold as a percentage (1-100) that a task must have befo
-Enter a maximum number of annotators that can be automatically assigned to the task. If left blank, there is no limit to additional annotators.
+Enter a maximum number of annotators that can be automatically assigned to the task.
Annotators are assigned one at a time until the agreement threshold is achieved.
@@ -740,7 +791,7 @@ Annotators are assigned one at a time until the agreement threshold is achieved.
!!! note
- When configuring **Maximum additional annotators**, be mindful of the number of annotators available in your project. If you have fewer annotators available than the sum of [**Annotations per task minimum**](#overlap) + **Maximum additional annotators**, you might encounter a scenario in which a task with a low agreement score cannot be marked complete.
+ When configuring **Maximum additional annotators**, be mindful of the number of annotators available in your project. If you have fewer annotators available than the sum of [**Annotations per task**](#overlap) + **Maximum additional annotators**, you might encounter a scenario in which a task with a low agreement score cannot be marked complete.
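+
+The head count in this note can be sanity-checked with a small helper. This is a hypothetical function for illustration only.
+
+```python
+# Illustrative arithmetic for the note above (hypothetical helper,
+# not part of Label Studio).
+
+def can_reach_completion(available_annotators: int,
+                         annotations_per_task: int,
+                         max_additional_annotators: int) -> bool:
+    """A low-agreement task may need overlap + additional annotators;
+    if the project has fewer people than that, it can get stuck."""
+    needed_in_worst_case = annotations_per_task + max_additional_annotators
+    return available_annotators >= needed_in_worst_case
+
+# Overlap of 2 plus up to 3 additional annotators needs 5 people:
+print(can_reach_completion(4, 2, 3))  # False: a low-agreement task can stall
+print(can_reach_completion(5, 2, 3))  # True
+```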
@@ -748,7 +799,7 @@ Annotators are assigned one at a time until the agreement threshold is achieved.
-Set custom weights for labels to change the agreement calculation. The options you are given are automatically generated from your labeling interface setup.
+Set custom weights for tags and labels to change the agreement calculation. The options you are given are automatically generated from your labeling interface setup.
Weights set to zero are ignored from calculation.
|
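+
+As a loose illustration of how zero weights drop out, consider the sketch below. This is not Label Studio's actual agreement metric; it is an assumed weighted average used only to show the effect of label weights.
+
+```python
+# Loud assumption: this is NOT Label Studio's agreement metric.
+# It only illustrates how per-label weights could shift a score and
+# why zero-weight labels drop out of the calculation entirely.
+
+def weighted_agreement(per_label_agreement: dict[str, float],
+                       weights: dict[str, float]) -> float:
+    """Weighted average of per-label agreement scores;
+    labels with weight 0 are ignored."""
+    total = 0.0
+    total_weight = 0.0
+    for label, score in per_label_agreement.items():
+        w = weights.get(label, 1.0)
+        if w == 0:
+            continue  # zero-weight labels are ignored
+        total += w * score
+        total_weight += w
+    return total / total_weight if total_weight else 0.0
+
+scores = {"Car": 0.9, "Pedestrian": 0.5, "Background": 0.2}
+print(weighted_agreement(scores, {"Background": 0.0}))  # Background ignored -> 0.7
+```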