Is your feature request related to a problem? Please describe.
No response
Describe the solution you'd like
We rely on a fairly large annotation team across our projects, so assessing each annotator's performance is currently rather subjective. We would like concrete statistics on each annotator's work, available as a CSV file for download and analysis. To that end, we propose an annotator rating system on the platform based on the following parameters:
- Quality of completed tasks
- Number of completed tasks, and the difference between assigned and completed tasks
- Average time to complete a task
- Annotation speed
- Number of annotated objects
- Number of tasks accepted on the first attempt (and on the second and subsequent attempts)
This rating could be computed at the task or the project level.
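A minimal sketch of how the proposed per-annotator statistics could be aggregated and exported as CSV. All field names and the record shape here are illustrative assumptions, not the platform's actual data model:

```python
import csv
import io
from collections import defaultdict

# Hypothetical per-task records; the keys below are assumptions
# chosen to mirror the parameters listed above.
tasks = [
    {"annotator": "alice", "completed": True,
     "duration_s": 420, "objects": 35, "accept_attempt": 1},
    {"annotator": "alice", "completed": False,
     "duration_s": 0, "objects": 0, "accept_attempt": 0},
    {"annotator": "bob", "completed": True,
     "duration_s": 600, "objects": 50, "accept_attempt": 2},
]

def annotator_stats(records):
    """Aggregate the proposed metrics per annotator."""
    agg = defaultdict(lambda: {"assigned": 0, "completed": 0,
                               "duration_s": 0, "objects": 0,
                               "first_try_accepts": 0})
    for r in records:
        a = agg[r["annotator"]]
        a["assigned"] += 1
        if r["completed"]:
            a["completed"] += 1
            a["duration_s"] += r["duration_s"]
            a["objects"] += r["objects"]
            if r["accept_attempt"] == 1:
                a["first_try_accepts"] += 1
    rows = []
    for name, a in sorted(agg.items()):
        done = a["completed"]
        rows.append({
            "annotator": name,
            "completed": done,
            # difference between assigned and completed tasks
            "backlog": a["assigned"] - done,
            "avg_time_s": a["duration_s"] / done if done else 0,
            "objects": a["objects"],
            # annotation speed as objects per hour of work
            "objects_per_hour": (a["objects"] / (a["duration_s"] / 3600)
                                 if a["duration_s"] else 0),
            "first_try_accepts": a["first_try_accepts"],
        })
    return rows

def to_csv(rows):
    """Render the aggregated rows as downloadable CSV text."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0]))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

print(to_csv(annotator_stats(tasks)))
```

A quality score is deliberately left out of the sketch, since it would presumably come from the platform's review/validation data rather than from raw task records.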
Describe alternatives you've considered
No response
Additional context
No response