huggingface / competitions

https://huggingface.co/docs/competitions
Apache License 2.0
115 stars 12 forks

Feature Request: Additional submission instructions in the submissions popup #14

Closed Josephrp closed 9 months ago

Josephrp commented 9 months ago

The bigger problem is that solution.csv does not line up with the user inputs.

So, for example, if my solution.csv is:

id,modelcard,github,benchmark,score
1,tonic/stablemed,josephrp/stablemed/,accuracy,0.8

the user input in the submissions popup still only consists of two descriptive fields.
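
As an illustration only (the file name comes from my example above, and the popup field names are placeholders, not what the library actually uses), the mismatch is essentially this:

```python
import csv

# Columns the host expects, read from the header of the example solution.csv above.
with open("solution.csv", newline="") as f:
    expected_columns = next(csv.reader(f))
# -> ["id", "modelcard", "github", "benchmark", "score"]

# Fields the submission popup currently collects (placeholder names for illustration).
popup_fields = ["submission_file", "submission_comment"]

# Everything the host needs but the popup never asks for.
missing = [c for c in expected_columns if c not in popup_fields and c not in ("id", "score")]
print("Popup does not ask for:", missing)
# -> Popup does not ask for: ['modelcard', 'github', 'benchmark']
```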

To get around that painlessly, the competition host might want to simply describe the inputs required, which they can already do, by the way.

However, experience with public-facing applications that require user inputs tells us that every screen should have a reminder, especially the last screen, if multiple pieces of information are required.

Hope this helps + good first issue!

Some thoughts on how to resolve it in the context of the other, related open issue:

some kind of auto-generating multi-parser that creates the Gradio code for the competition host inside competitions/create.
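
Something along these lines, just a sketch assuming the generator can read the host's solution.csv header (none of this is the library's actual code):

```python
import csv
import gradio as gr

def build_submission_form(solution_csv: str) -> gr.Blocks:
    """Auto-generate one input field per solution.csv column so the popup
    asks for exactly the information the competition host expects."""
    with open(solution_csv, newline="") as f:
        columns = next(csv.reader(f))  # e.g. ["id", "modelcard", "github", "benchmark", "score"]

    with gr.Blocks() as form:
        gr.Markdown("Please provide every field below; they mirror the competition's solution format.")
        # "score" is computed by the evaluator, so we don't ask the user for it.
        boxes = [gr.Textbox(label=col) for col in columns if col != "score"]
        submit = gr.Button("Submit")
        status = gr.Markdown()

        def collect(*values):
            # In the real app this would write the row to the submissions dataset.
            return "Received: " + ", ".join(values)

        submit.click(collect, inputs=boxes, outputs=status)

    return form

# form = build_submission_form("solution.csv")
# form.launch()
```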

Hey, I'm a big fan + already made my first two competitions, keeping them long-running, but they are not yet listed on the competitions page ;-)

abhishekkrthakur commented 9 months ago

Could you please share a bit more? What kind of competition is it? What do you want the user to submit, and how do you evaluate it? Or share the competition link? :)

Hey, I'm a big fan + already made my first two competitions, keeping them long-running, but they are not yet listed on the competitions page ;-)

Thank you! I'll try my best to help you fix the issue you are facing!

Josephrp commented 9 months ago

Yes, sorry, with all my apologies:

And here is a screenshot of the mismatch between the config and the user input: specifically, the SUBMISSION_COLUMNS don't match the user inputs.

[screenshots: competition config (SUBMISSION_COLUMNS) vs. submission popup fields]

Here the model card is not getting refreshed (yet):

[screenshot: model card not refreshed]

abhishekkrthakur commented 9 months ago

How are you evaluating the submissions? I don't see anything in the public leaderboard :) Can you make a sample submission? Or are you not able to make a submission at all?

Here the model card is not getting refreshed (yet):

That part is still manual. We can add the competitions there once it starts working :)

Josephrp commented 9 months ago

So submissions are actually fine; at least they show up in the dataset.

For the script competition, I'm not sure yet how we'll evaluate it, but there should be a time requirement and a problem set, with a public benchmark and an analogous private benchmark. Basically, the topic is math olympiads, so probably all past math olympiads form the public dataset (I still have to make it, I suppose); I also want to share some elements lifted from GAIR/MathPile. For the Monarch32K competition, it's a bit more straightforward: I would like folks to submit their models and the datasets they used; we'll simply use those datasets to test them for precision, and analogous but private datasets for the competition leaderboard.

How does the leaderboard actually work? Is there a fixed solution CSV that all submissions are evaluated against?
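
In other words, is it roughly like this? (The file names, the `label` column, and the public/private split are all hypothetical; precision is just the metric I'd use for Monarch32K.)

```python
import pandas as pd
from sklearn.metrics import precision_score

def score_submission(submission_csv: str, solution_csv: str, public_ids: set) -> dict:
    """Evaluate one submission against a fixed solution file: the public split
    feeds the live leaderboard, the private split decides the final ranking."""
    sub = pd.read_csv(submission_csv).set_index("id")
    sol = pd.read_csv(solution_csv).set_index("id")
    merged = sol.join(sub, lsuffix="_true", rsuffix="_pred").dropna()

    is_public = merged.index.isin(list(public_ids))
    public, private = merged[is_public], merged[~is_public]

    return {
        "public_score": precision_score(public["label_true"], public["label_pred"], average="macro"),
        "private_score": precision_score(private["label_true"], private["label_pred"], average="macro"),
    }
```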


github-actions[bot] commented 9 months ago

This issue is stale because it has been open for 15 days with no activity.

github-actions[bot] commented 9 months ago

This issue was closed because it has been inactive for 2 days since being marked as stale.