Annotator performance dashboard
The annotator performance dashboard provides metrics about a user’s annotation activities over a period of time, including how many tasks an annotator has completed and how quickly.
This can be useful in a number of ways:
- Resource management optimization: If you have a project that is on a rigid timeline, you can determine which annotators have the quickest turnaround and assign them as necessary.
- Annotator payment management: If you are using outside contractors or otherwise paying annotators based on their output, you can use this dashboard to help calculate or verify contractor payments.
- Reduce costs associated with internal analytic reports: With this report, you no longer need to develop complex tools or workflows to track annotator performance.
Access the dashboard
The dashboard is available from the Organization page, meaning that you must have the Owner or Administrator role to view it.
From the organization members list, select the user you want to view. Annotator performance reports are available for users in all roles, not just the Annotator role.
With the user selected, click Annotator Performance Report on the right.
Export data
You can use the Export drop-down to export the following:
- Report - Download the information in the dashboard as CSV or JSON.
- Timeline - Download a detailed timeline of all the user’s annotation actions within the time frame, including when they began and submitted each annotation.
- Comments Received - Download a CSV file with all of the comments that other users have left on the user’s annotations.
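If you post-process these exports, for example to verify contractor payments, a short script can handle the arithmetic. The sketch below is a minimal starting point; the file name, column names, and rate are illustrative assumptions, not the actual export schema:

```python
import pandas as pd

# Load the exported dashboard report. The file name and the column names
# ("annotator", "submitted_annotations") are assumptions for illustration;
# check the headers in your actual export.
report = pd.read_csv("annotator_performance_report.csv")

# Hypothetical per-annotation contractor rate.
RATE_PER_ANNOTATION = 0.15

report["payment"] = report["submitted_annotations"] * RATE_PER_ANNOTATION
print(report[["annotator", "submitted_annotations", "payment"]])
```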
Metrics
Data used
The metrics are calculated from the following data:
- `last_action` - The last action taken on an annotation. This can be submitted, updated, fixed and accepted, accepted, or rejected.
- `lead_times` - The time spent on annotations whose last action matches one of those listed above.
- `submitted_or_reviewed` - Annotations whose last action matches one of those listed above.
- `updated` - Annotations filtered to only include `last_action = Updated`.
- `skipped` - Annotations with `was_cancelled = true`.
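To make these definitions concrete, the sketch below applies the same filters to a handful of annotation records. The record layout and action strings are assumptions for illustration, not Label Studio's internal schema:

```python
from statistics import median

# Toy annotation records. The field names mirror the data described above,
# but the exact layout and action strings are assumed for illustration.
annotations = [
    {"last_action": "submitted", "lead_time": 42.0, "was_cancelled": False},
    {"last_action": "updated", "lead_time": 18.5, "was_cancelled": False},
    {"last_action": "accepted", "lead_time": 60.0, "was_cancelled": False},
    {"last_action": None, "lead_time": 12.0, "was_cancelled": True},
]

# Last actions that count toward submitted_or_reviewed and lead_times.
COUNTED = {"submitted", "updated", "fixed and accepted", "accepted", "rejected"}

submitted_or_reviewed = [a for a in annotations if a["last_action"] in COUNTED]
lead_times = [a["lead_time"] for a in submitted_or_reviewed]
updated = [a for a in annotations if a["last_action"] == "updated"]
skipped = [a for a in annotations if a["was_cancelled"]]

print(len(submitted_or_reviewed), len(updated), len(skipped))  # 3 1 1
print(sum(lead_times), median(lead_times))  # 120.5 42.0
```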
Performance summaries
Metric | Calculation | Description |
---|---|---|
Total Time | Sum of `lead_times` | The total time spent annotating during the selected time frame. This is calculated from annotations that meet the criteria for Submitted Annotations (see below). Every annotation has a `lead_time`, which reflects how much time the user spent labeling from the moment the task was opened until they clicked Submit or Update, including idle time. The total time does not include time spent on annotations that have not been submitted or updated, such as drafts or skipped annotations. However, if the user returns to a draft or a previously skipped annotation and submits it, their earlier time on that annotation is included in the total. |
Submitted Annotations | Sum of `submitted_or_reviewed` | The total number of annotations the user submitted during the selected time frame. This includes annotations that have been submitted and updated. It does not include annotations that have been skipped, nor annotations that were submitted and have since been rejected by a reviewer. However, if the annotator updates a rejected annotation and that fix is then accepted by a reviewer, the corrected annotation is included in the count. Each annotation is counted only once; Label Studio does not count the same annotation twice if it is later updated. |
Total Time (Median) | Sum of `submitted_or_reviewed` * the median of `lead_times` | The number of submitted annotations multiplied by the median annotation time. |
Time per Annotation (Median) | Median of `lead_times` | The median time the user spent on each submitted annotation. |
Time per Annotation (Average) | Average of `lead_times` | The average time the user spent on each submitted annotation. |
Performance Score | Calculated from reviewer actions | The Performance Score reflects the overall performance of the annotator in terms of the review actions (Accept, Reject, Fix+Accept) taken on their annotations. |
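The exact Performance Score formula is not reproduced here, but a common pattern for review-based scores is the share of accepted work, with Fix+Accept counted at a reduced weight. The sketch below only illustrates that pattern; the 0.5 weight is an assumed parameter, not Label Studio's actual formula:

```python
def performance_score(accepted: int, fixed_accepted: int, rejected: int,
                      fix_weight: float = 0.5) -> float:
    """Illustrative review-based score: accepted work over all reviewed work.

    The fix_weight of 0.5 for Fix+Accept outcomes is an assumption, not
    the weight Label Studio uses.
    """
    reviewed = accepted + fixed_accepted + rejected
    if reviewed == 0:
        return 0.0
    return (accepted + fix_weight * fixed_accepted) / reviewed


print(performance_score(accepted=80, fixed_accepted=10, rejected=10))  # 0.85
```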
Graphs
Graph | Description |
---|---|
Annotation Summary | A summary of all annotations completed by the user over the selected time period, broken down by submitted, skipped, and updated. |
Annotations | The same information as the Annotation Summary, but segmented by date. |
Total Time Annotating | The total time spent annotating each day, shown as either the median or the average time spent. |
Time per Annotation | The median and average time per submitted annotation, segmented by date. Note that the date and time are based on when the user completed the annotation (see `last_action` above), not when they began it. |
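To reproduce a graph such as Time per Annotation from the Timeline export, you can group lead times by the date of each annotation's last action, matching how the dashboard dates its graphs. The column names below are assumptions about the export format:

```python
import pandas as pd

# Column names ("last_action_at", "lead_time") are assumptions; check the
# headers in your actual Timeline export.
timeline = pd.read_csv("annotator_timeline.csv", parse_dates=["last_action_at"])

# Segment by the day the annotation was completed (its last action).
per_day = timeline.groupby(timeline["last_action_at"].dt.date)["lead_time"]
print(per_day.agg(["median", "mean"]))
```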