Label Studio is available to everyone as open source software (Label Studio Community Edition). There are also two paid editions: Starter Cloud and Enterprise.
**User Management**

| Functionality | Community | Starter Cloud | Enterprise |
| --- | :---: | :---: | :---: |
| **Role-based workflows**<br>Role-based automated workflows for annotators and reviewers. | β | β | β |
| **Role-based access control**<br>Control access to workspaces and projects with Admin, Manager, Reviewer, and Annotator roles. | β | β | β |
**Data Management**

| Functionality | Community | Starter Cloud | Enterprise |
| --- | :---: | :---: | :---: |
| **Data management view**<br>View and manage datasets and tasks in a project through the Data Manager view. | β | β | β |
| **Multiple data formats**<br>Label any data type, from text, images, audio, video, and time series to multimodal data. | β | β | β |
| **Import data**<br>Reference data stored in your database, cloud storage buckets, or local storage, and label it in the browser. | β | β | β |
| **Import pre-annotated data**<br>Import pre-annotated data (predictions) into Label Studio for further refinement and assessment. | β | β | β |
| **Export data**<br>Export annotations in common formats such as JSON, COCO, Pascal VOC, and others. | β | β | β |
| **Sync data**<br>Synchronize new and labeled data between projects and your external data storage. | β | β | β |
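As a concrete illustration of the pre-annotation row, the sketch below builds one importable task in Label Studio's task JSON shape (`data`, `predictions`, `result`). The field names follow the documented task format; the text, label names, and model version are invented for illustration.

```python
import json

# Sketch of one importable task carrying a pre-annotation (prediction).
# The keys under "data" must match the variables used in the labeling
# config (e.g. $text); the labels below are illustrative assumptions.
task = {
    "data": {"text": "The new update is fantastic!"},
    "predictions": [
        {
            "model_version": "sentiment-v1",  # hypothetical model name
            "score": 0.91,
            "result": [
                {
                    "from_name": "sentiment",  # control tag in the config
                    "to_name": "text",         # object tag it labels
                    "type": "choices",
                    "value": {"choices": ["Positive"]},
                }
            ],
        }
    ],
}

print(json.dumps(task, indent=2))
```

A list of such objects, saved as a JSON file, can then be imported into a project for reviewers to refine rather than label from scratch.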
**Project Management**

| Functionality | Community | Starter Cloud | Enterprise |
| --- | :---: | :---: | :---: |
| **Organize data in projects**<br>Projects to manage data labeling activities. | β | β | β |
| **Organize projects in workspaces**<br>Organize related projects by team, department, or product. Users can only access workspaces they are associated with. | β | β | β |
| **Personal sandbox workspace**<br>Personal sandbox workspace for project testing and experimentation. | β | β | β |
| **Templates**<br>Templates to set up data labeling projects faster. | β | β | β |
| **Project membership**<br>Only users who are added as members to a project can view it. | β | β | β |
| **Project-level roles**<br>Users can be assigned Annotator or Reviewer roles on a per-project basis. | β | β | β |
| **Project-level user settings**<br>Multiple configuration options for how Annotators and Reviewers interact with tasks and what information they can see. | β | β | β |
**Data Labeling Workflows**

| Functionality | Community | Starter Cloud | Enterprise |
| --- | :---: | :---: | :---: |
| **Assign tasks**<br>Assign tasks to specific annotators or reviewers. | β | β | β |
| **Automatically assign tasks**<br>Set rules and automate how tasks are distributed to annotators. | β | β | β |
| **Simplified interface for Annotators**<br>Annotator-specific labeling view that only shows assigned tasks. | β | β | β |
**Customization & Development**

| Functionality | Community | Starter Cloud | Enterprise |
| --- | :---: | :---: | :---: |
| **Tag library**<br>Use our tag library to customize the labeling interface, either by modifying pre-built templates or by building your own. | β | β | β |
| **White labeling**<br>Use your company colors and logo to give your team a consistent experience (additional cost). | β | β | β |
| **Custom scripts**<br>Use JavaScript to further enhance and customize your labeling interface. | β | β | β |
| **API/SDK & webhooks**<br>APIs, an SDK, and webhooks for programmatically accessing and managing Label Studio. | β | β | β |
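To show what customizing the labeling interface with the tag library looks like, here is a minimal sketch of a labeling configuration for single-choice sentiment classification. `View`, `Text`, `Choices`, and `Choice` are standard Label Studio tags; the `name`/`toName` values and the sentiment labels are assumptions for this example.

```xml
<View>
  <!-- Object tag: displays the task's $text field -->
  <Text name="text" value="$text"/>
  <!-- Control tag: single-choice classification applied to "text" -->
  <Choices name="sentiment" toName="text" choice="single">
    <Choice value="Positive"/>
    <Choice value="Negative"/>
    <Choice value="Neutral"/>
  </Choices>
</View>
```

The `name` of the control tag and the `toName` of the object it labels are the same identifiers that appear as `from_name`/`to_name` in exported annotations and imported predictions.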
**Prompts**

| Functionality | Community | Starter Cloud | Enterprise |
| --- | :---: | :---: | :---: |
| **Automated labeling**<br>Fully automated data labeling using GenAI. | β | β | β |
| **LLM fine-tuning and evaluation**<br>Evaluate and fine-tune LLM prompts against a ground truth dataset. | β | β | β |
| **Bootstrap projects**<br>Bootstrap your labeling project using auto-generated predictions. | β | β | β |
**Machine Learning**

| Functionality | Community | Starter Cloud | Enterprise |
| --- | :---: | :---: | :---: |
| **Custom ML backends**<br>Connect a machine learning model to the backend of a project. | β | β | β |
| **Active learning loops**<br>Accelerate labeling with active learning loops. | β | β | β |
| **Predictions from connected models**<br>Automatically label and sort tasks by prediction score with the ML model backend. | β | β | β |
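Sorting tasks by prediction score is the step that lets annotators focus on the examples a model is least sure about. A minimal stdlib sketch of that ordering (the task structures and scores here are invented, not Label Studio internals):

```python
# Sketch: order tasks by the score of their best prediction, so that
# low-confidence (or unpredicted) tasks can be labeled first.
tasks = [
    {"id": 1, "predictions": [{"score": 0.97}]},
    {"id": 2, "predictions": [{"score": 0.41}]},
    {"id": 3, "predictions": []},  # no model prediction yet
]

def best_score(task):
    """Highest prediction score for a task; no predictions sorts as 0.0."""
    scores = [p.get("score", 0.0) for p in task["predictions"]]
    return max(scores, default=0.0)

# Ascending: least confident tasks come first.
by_uncertainty = sorted(tasks, key=best_score)
print([t["id"] for t in by_uncertainty])  # β [3, 2, 1]
```

Ordering by ascending confidence like this is the core of an active learning loop: label the uncertain tasks, retrain the connected model, and re-sort.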
**Analytics and Reporting**

| Functionality | Community | Starter Cloud | Enterprise |
| --- | :---: | :---: | :---: |
| **Project dashboards**<br>Dashboards for monitoring project progress. | β | β | β |
| **Annotator performance dashboards**<br>Dashboards to review and monitor individual annotator performance. | β | β | β |
| **Activity logs**<br>Activity logs for auditing annotation activity by project. | β | β | β |
**Quality Workflows**

| Functionality | Community | Starter Cloud | Enterprise |
| --- | :---: | :---: | :---: |
| **Assign reviewers**<br>Assign reviewers to review, fix, and update annotations. | β | β | β |
| **Automatic task reassignment**<br>Reassign tasks with low agreement scores to new annotators. | β | β | β |
| **Agreement metrics**<br>Define how annotator consensus is calculated, choosing from pre-defined metrics or customizing your own. | β | β | β |
| **Comments and notifications**<br>Team collaboration features such as comments and notifications on annotation tasks. | β | β | β |
| **Identify ground truths**<br>Mark which annotations should be included in a ground truth dataset. | β | β | β |
| **Overlap configuration**<br>Set how many annotators must label each sample. | β | β | β |
| **Annotator consensus matrices**<br>Matrices that compare labeling results across annotators. | β | β | β |
| **Label distribution charts**<br>Identify possible problems with your dataset distribution, such as an unbalanced dataset. | β | β | β |
**Security and Support**

| Functionality | Community | Starter Cloud | Enterprise |
| --- | :---: | :---: | :---: |
| **SSO**<br>Secure access and authentication of users via SAML SSO or LDAP. | β | β | β |
| **SOC2**<br>SOC2-compliant hosted cloud service or on-premises availability. | β | β | β |
| **Support portal**<br>Access to a dedicated support portal. | β | β | β |
| **Uptime SLA**<br>99.9% uptime SLA. | β | β | β |
| **Customer Success Manager**<br>Dedicated customer success manager to support onboarding, education, and escalations. | β | β | β |