Project Title: Easy challenge management on EvalAI
Description:
This project will focus on streamlining the newly adopted GitHub challenge creation pipeline, building APIs to fully automate challenge creation on EvalAI, adding new capabilities to EvalAI's latest frontend for a seamless user experience, and making our backend robust and less error-prone by adding test cases for the different frontend and backend components. As of now, the EvalAI admin has to stay in the loop during challenge creation: scaling worker resources for prediction-based AI challenges, setting up remote evaluation for AI challenges, and, most importantly, setting up code-upload AI challenges. The goal of this project is to take the EvalAI admin out of the loop by fully automating these steps.
Deliverables:
[ ] Add a feature that lets challenge hosts approve participants for a challenge, guarding against unauthorized signups. Give challenge hosts/participants the control to remove participants from a challenge/team.
[ ] Minor: Fix the issue of email IDs entered in capital letters while signing up on EvalAI (a normalization sketch follows this list).
[ ] Add Git bi-directional sync support on EvalAI. See this PR for reference.
[ ] Add support to create new challenge phases, dataset splits, etc. from the GitHub-based challenge creation pipeline after the challenge has been created.
[ ] Add a feature to GitHub-based challenges for managing multiple challenge configs for the same challenge over the years.
[ ] Add a feature to create GitHub repositories for existing challenges on EvalAI, allowing users to migrate from config-based challenge creation to GitHub-based challenge creation.
[ ] Add API and Celery task support to re-run submissions in bulk, along with complete UI changes to allow hosts to re-run all existing submissions for a challenge (a task sketch follows this list).
[ ] Increase the message retention period of the SQS submission queue so that submission messages do not expire when a large number of submissions are pending (see the boto3 sketch after this list).
[ ] Add APIs and UI changes to allow challenge hosts to rename leaderboard metrics without needing to re-run submissions (a rename sketch follows this list).
[ ] Add challenge configuration examples with documentation for code-upload challenges.
[ ] Add examples and documentation for remote challenge setup on EvalAI.
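A minimal sketch of the email-case fix, written as a Django REST Framework serializer hook. The `SignupSerializer` name and field list are illustrative assumptions, not EvalAI's actual signup code:

```python
# Hypothetical sketch: normalize email case at signup so that
# "User@Example.com" and "user@example.com" map to the same account.
from django.contrib.auth.models import User
from rest_framework import serializers


class SignupSerializer(serializers.ModelSerializer):
    class Meta:
        model = User
        fields = ("username", "email", "password")

    def validate_email(self, value):
        # Lowercase first so a mixed-case duplicate is rejected
        # instead of silently creating a second account.
        email = value.lower()
        if User.objects.filter(email__iexact=email).exists():
            raise serializers.ValidationError("Email is already registered.")
        return email
```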
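For the bulk re-run deliverable, a hedged sketch of the Celery side. The `Submission` model, its `SUBMITTED` status constant, the import paths, and the `publish_submission_message` helper are assumptions mirroring EvalAI's general layout, not confirmed APIs:

```python
# Hypothetical sketch: fan out one Celery task per submission so a bulk
# re-run of thousands of submissions never blocks a single worker call.
from celery import shared_task

from jobs.models import Submission  # assumed model path
from jobs.utils import publish_submission_message  # assumed helper


@shared_task
def re_run_submission(submission_pk):
    submission = Submission.objects.get(pk=submission_pk)
    submission.status = Submission.SUBMITTED  # reset so workers pick it up
    submission.save()
    publish_submission_message(submission)  # re-queue for evaluation


@shared_task
def re_run_all_submissions(challenge_pk):
    pks = Submission.objects.filter(
        challenge_phase__challenge__pk=challenge_pk
    ).values_list("pk", flat=True)
    for pk in pks:
        re_run_submission.delay(pk)  # one task per submission
```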
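The SQS deliverable boils down to one queue-attribute update. `set_queue_attributes` and `MessageRetentionPeriod` are real boto3/SQS names; the queue name and region below are placeholders. Note that SQS caps retention at 14 days:

```python
# Raise the submission queue's message retention period to the SQS
# maximum so messages survive long backlogs instead of expiring.
import boto3

sqs = boto3.client("sqs", region_name="us-east-1")  # region is a placeholder
queue_url = sqs.get_queue_url(QueueName="evalai-submission-queue")["QueueUrl"]
sqs.set_queue_attributes(
    QueueUrl=queue_url,
    Attributes={"MessageRetentionPeriod": "1209600"},  # 14 days, as a string
)
```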
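Finally, a sketch of the metric rename, assuming metric names live as labels in the leaderboard's JSON `schema` and as keys in each stored result row; the model fields and the `leaderboarddata_set` reverse relation are assumptions, not confirmed EvalAI internals:

```python
# Hypothetical sketch: rename a metric by rewriting its label in the
# leaderboard schema and the matching key in every stored result row,
# so no submission has to be re-evaluated.
from django.db import transaction


@transaction.atomic
def rename_leaderboard_metric(leaderboard, old_label, new_label):
    labels = leaderboard.schema.get("labels", [])
    if old_label not in labels:
        raise ValueError(f"Unknown metric: {old_label}")
    labels[labels.index(old_label)] = new_label  # mutates schema in place
    leaderboard.save(update_fields=["schema"])

    # Rewrite the key in each stored result that used the old name.
    for entry in leaderboard.leaderboarddata_set.all():
        if old_label in entry.result:
            entry.result[new_label] = entry.result.pop(old_label)
            entry.save(update_fields=["result"])
```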
Mentors: Ram Ramrakhya, Rishabh Jain
Skills Required: Python, Django, AngularJS, AWS
Project size - 175 hours
Difficulty - Medium
Get started: Try to fix some issues in EvalAI (note that there are some issues labeled with GSOC-2022)
Important Links: