microsoft / CodeXGLUE
MIT License

Ground truth for evaluation #130

Closed tongjiaming closed 1 year ago

tongjiaming commented 2 years ago

Hi, I am trying to evaluate the accuracy of the model, but I found that the evaluation script only compares the given predictions with the answers. Can you provide the original data used for evaluation?
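For context, here is a minimal sketch of what such a prediction-vs-answer comparison might look like. This is illustrative only, not the actual CodeXGLUE evaluator; the function name and exact-match criterion are assumptions.

```python
# Hypothetical exact-match accuracy check: compare each predicted line
# against the corresponding reference answer. Not the real CodeXGLUE
# script; shown only to illustrate the kind of comparison described.
def exact_match_accuracy(predictions, answers):
    """Return the fraction of predictions that exactly match the answers."""
    assert len(predictions) == len(answers), "prediction/answer length mismatch"
    if not answers:
        return 0.0
    correct = sum(p.strip() == a.strip() for p, a in zip(predictions, answers))
    return correct / len(answers)

# Example: two of three predictions match the references.
print(exact_match_accuracy(["foo", "bar", "baz"], ["foo", "bar", "qux"]))
```

A script like this can only score predictions against reference answers it has on hand, which is why the question asks for the original evaluation data.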

celbree commented 2 years ago

As a benchmark dataset, we don't release the ground truth for some tasks. You can participate in our benchmark leaderboard by sending submissions to codexglue@microsoft.com.