yuvalkirstain / PickScore


How to get the Annotation Methodology? #20

Closed · njuzrs closed this issue 8 months ago

njuzrs commented 8 months ago

Hi, thanks for the very nice work. I read the paper and have a question about the annotation methodology. In the paper, you write:

> Annotation Methodology: While piloting the web app, we experimented with different annotation strategies to optimize for data quality, efficiency of collection, and user experience. Specifically, we tested the following annotation options: (1) 4 images, no ties; (2) 2 images, no ties; (3) 2 images, with ties. We found that the latter option (2 images, with ties) exceeds the other two in terms of user engagement and inter-rater agreement.

My question is: how exactly did you evaluate user engagement and inter-rater agreement for the different settings (4 images, no ties; 2 images, no ties; 2 images, with ties)? Did you give the same prompts to different annotators? The paper mentions that prompts are entered by the users themselves.

Looking forward to your reply. Thanks!

yuvalkirstain commented 8 months ago

That's a great question. We used expert annotators to provide feedback based on real user prompts, which allowed us to assess aspects that we were unable to assess with real users.
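For readers unfamiliar with the metric being discussed: the thread does not spell out the exact computation, but below is a minimal sketch of how inter-rater agreement could be measured for a setting like "2 images, with ties", using Fleiss' kappa over preference labels. This is an illustration under assumptions, not the authors' actual pipeline; the label encoding `{0: image A, 1: image B, 2: tie}` and the toy ratings matrix are made up for the example.

```python
# Sketch only: measuring inter-rater agreement with Fleiss' kappa.
# Assumes each item (prompt + image pair) is labeled by the same set
# of expert annotators; higher kappa => stronger agreement.
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# ratings[i, j] = label that annotator j gave to item i,
# with labels 0 = "prefer image A", 1 = "prefer image B", 2 = "tie".
ratings = np.array([
    [0, 0, 2],  # two annotators prefer A, one calls a tie
    [1, 1, 1],  # unanimous preference for B
    [0, 1, 2],  # complete disagreement
    [2, 2, 1],
    [0, 0, 0],
])

# Convert raw per-annotator labels into an items x categories count table,
# then compute Fleiss' kappa over it.
counts, _ = aggregate_raters(ratings)
kappa = fleiss_kappa(counts, method="fleiss")
print(f"Fleiss' kappa: {kappa:.3f}")
```

Running the same computation on labels collected under each of the three settings (mapping the "4 images, no ties" choices to four categories instead of three) would give one way to compare them on inter-rater agreement.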