YanLiang1102 closed 3 years ago
We do not plan to release the test labels, to avoid overfitting on the test set. You can tune hyperparameters on the released dev set and submit your predictions to the CodaLab leaderboard.
Hi, I am trying to implement an incremental learning algorithm on this dataset. However, without labels it is impossible to evaluate performance on a specific subset of the test set. It would be very helpful if you could release the test-set labels.
Or is the plan to always submit to the leaderboard?