
Members dashboard

Get insight into the productivity, agreement, and performance of members annotating in a project.

While the Project Dashboard provides insight into project task progress and throughput, the Members Dashboard provides more information on annotation agreement and outcome.

This can be useful in a number of ways:

  • Track annotator performance and agreement: View key metrics like agreement rate, review score, and performance score to identify which annotators are aligned or need support.
  • Monitor annotation progress: See how many annotations are finished, skipped, or reviewed, along with annotation time and progress percentages to manage annotator backlogs.
  • Identify quality and consistency issues: Use the agreement matrix and distribution chart to spot disagreements or edge cases that may require further review or clearer guidelines.

For annotator metrics across projects and over time, see the Annotator Performance Dashboard.

Access the dashboard

The dashboard is available from the Members tab inside a project.

Screenshot of Members Dashboard

Annotation Summary

The Annotation Summary shows annotation progress and quality metrics for each member in the project. The values reflect current project data; any changes to annotations or reviews are reflected in the metrics.

The table includes the following columns:

  • Paused: Toggle indicating whether the member is currently paused in the project. Learn more about pausing an annotator.
  • Agreement Score: Average agreement score against other members, calculated as the average of the member’s pairwise agreement scores with the other members in the project. Pairwise agreement scores are found in the Annotator Agreement Matrix.
  • Finished: Total number of submitted annotations. Excludes skipped annotations and does not take annotation updates or review outcomes into account.
  • Skipped: Total number of tasks currently skipped.
  • Accepted: Total number of annotations accepted by reviewers. Only counts annotations whose current review state is ‘Accepted’.
  • Rejected: Total number of annotations rejected by reviewers. Only counts annotations whose current review state is ‘Rejected’.
  • Review Score: Percentage of reviewed annotations that are currently accepted. Offers two options (a simplified calculation sketch follows this table):
      • Overall: (total accepted annotations) / (total reviewed annotations)
      • Per-label: (total accepted annotations where the label is present) / (total reviewed annotations where the label is present)
  • Performance Score: Percentage reflecting the overall performance of the member’s annotations in terms of review outcome (Accept, Reject, Fix+Accept). A higher score indicates better overall annotation quality. See Annotator Performance summaries for more detail on the calculation method. Offers two modes:
      • Overall: considers all reviewed annotations
      • Per-label: considers only reviewed annotations where the label is present
  • Annotation Progress: Member’s annotation progress, calculated based on the project’s label distribution setting:
      • Auto: (total submitted annotations) / (total submitted annotations by the member + total tasks where annotation is not complete - total tasks where the member has a draft annotation)
      • Manual: (total assigned annotations submitted) / (total assigned tasks)
  • Time: Time spent annotating based on lead_time, which includes time to submit and time spent updating. Offers three modes:
      • Mean Time: Average time spent per annotation
      • Median Time: Median time spent on an annotation
      • Total Time: Total time spent across all annotations
  • Ground Truth: Average agreement score against ground truth annotations. Ground truth acts as a reference for assessing the accuracy of other annotations.
  • Predictions: Average agreement score against model predictions. The model used for comparison is selected in the Live Predictions setting.
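
To make these formulas concrete, the following Python sketch mirrors the Review Score and Annotation Progress calculations described above. The function names and example counts are illustrative only and are not part of Label Studio; the dashboard computes these values for you.

```python
# Illustrative only: mirrors the Review Score and Annotation Progress
# formulas described above, using made-up per-member counts.

def review_score(accepted: int, reviewed: int) -> float:
    """Overall Review Score: accepted annotations / reviewed annotations."""
    return accepted / reviewed if reviewed else 0.0

def annotation_progress_manual(submitted_assigned: int, assigned_tasks: int) -> float:
    """Manual label distribution: submitted assigned annotations / assigned tasks."""
    return submitted_assigned / assigned_tasks if assigned_tasks else 0.0

def annotation_progress_auto(submitted: int, incomplete_tasks: int, draft_tasks: int) -> float:
    """Auto label distribution: submitted / (submitted + incomplete tasks - tasks with a draft)."""
    denominator = submitted + incomplete_tasks - draft_tasks
    return submitted / denominator if denominator else 0.0

# Example member with hypothetical counts
print(f"Review Score: {review_score(accepted=42, reviewed=50):.0%}")      # 84%
print(f"Progress (manual): {annotation_progress_manual(120, 200):.0%}")   # 60%
print(f"Progress (auto): {annotation_progress_auto(120, 90, 10):.0%}")    # 60%
```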

Export Annotation Summary table

You can use the Export CSV button to export the Annotation Summary table.
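
If you want to analyze the exported table elsewhere, a short pandas sketch such as the following can serve as a starting point. The file name and column names here are assumptions for illustration; check the header of your exported CSV and adjust them to match.

```python
import pandas as pd

# Assumed file and column names; inspect your exported CSV header first.
df = pd.read_csv("annotation_summary.csv")

# Example: rank members by agreement score.
print(df.sort_values("Agreement Score", ascending=False).head())

# Example: flag members whose review score falls below a threshold
# (adjust the threshold and scale if scores are exported as percentages).
print(df[df["Review Score"] < 0.7])
```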

Annotator Agreement Matrix

The Annotator Agreement Matrix helps you see how consistently different members annotate the same tasks.

  • Agreement scores are shown as percentages between members who have both annotated the same task. Higher percentages reflect stronger alignment in their annotations. See more on how agreement score is calculated (a simplified pairwise example follows this list).
  • Hover over any cell to view more information including the number of tasks where both members made an annotation. If a member made more than one annotation in a task, the additional annotation(s) are also considered.
  • Use the label dropdown to filter and explore agreement when at least one annotation contains the specified label.
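
As a rough illustration of what the matrix captures, the sketch below computes exact-match agreement between annotator pairs over the tasks they have both annotated. This is a simplification for intuition only; the dashboard uses Label Studio's own agreement metric, which depends on your labeling configuration.

```python
from itertools import combinations

# Hypothetical data: task_id -> {annotator: label}
annotations = {
    1: {"alice": "cat", "bob": "cat", "carol": "dog"},
    2: {"alice": "dog", "bob": "dog"},
    3: {"bob": "cat", "carol": "cat"},
}

def pairwise_agreement(annotations):
    """Exact-match agreement (%) for each annotator pair over their shared tasks."""
    annotators = sorted({a for task in annotations.values() for a in task})
    matrix = {}
    for a, b in combinations(annotators, 2):
        shared = [task for task in annotations.values() if a in task and b in task]
        if shared:
            agree = sum(task[a] == task[b] for task in shared)
            matrix[(a, b)] = 100 * agree / len(shared)
    return matrix

for pair, score in pairwise_agreement(annotations).items():
    print(pair, f"{score:.0f}%")
# ('alice', 'bob') 100%, ('alice', 'carol') 0%, ('bob', 'carol') 50%
```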

Agreement Distribution

The Agreement Distribution visualizes how agreement scores vary across tasks in your project. The bar chart displays the number of tasks in each agreement score range (a simplified bucketing example follows the list below).

  • Taller bars toward the right indicate stronger consensus and likely higher data quality.
  • Clusters in the lower agreement ranges may signal ambiguous or difficult tasks, or gaps in the annotation guidelines.
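
For intuition, the sketch below buckets hypothetical per-task agreement scores into 10-point ranges, which is the kind of grouping the distribution chart shows. The scores are made-up values; the actual chart is built from your project's per-task agreement.

```python
from collections import Counter

# Hypothetical per-task agreement scores (percentages)
task_agreement = [95, 88, 72, 100, 64, 91, 55, 83, 97, 40]

# Bucket into 10-point ranges: 0-9, 10-19, ..., 90-100
buckets = Counter(min(score // 10, 9) * 10 for score in task_agreement)

for start in range(0, 100, 10):
    bar = "#" * buckets.get(start, 0)
    print(f"{start:3d}-{start + 9 if start < 90 else 100}%: {bar}")
```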