smr97 closed this issue 4 years ago.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
I want the numbers too.
You can check the model card for the evaluation results: https://huggingface.co/distilbert-base-uncased-finetuned-sst-2-english
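For example, here is a minimal sketch (assuming the Hugging Face `transformers` library; the model id comes from the linked model card) of loading that checkpoint and running a single prediction:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "distilbert-base-uncased-finetuned-sst-2-english"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Classify one sentence; the checkpoint is fine-tuned on SST-2 (binary sentiment).
inputs = tokenizer("A touching and well-acted film.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(dim=-1).item()])  # "POSITIVE" or "NEGATIVE"
```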
❓ Questions & Help
I need to compare my research against DistilBERT as a baseline for a paper in progress. Going through your publication, I found that you report accuracies on the GLUE dev set rather than the test set. The TinyBERT paper by Huawei tries to reproduce your work, but their numbers are lower.
I would really appreciate some help with this. As far as I understand, I need to distill the student on the full Wikipedia + BookCorpus corpus. Is there any way to skip this step (i.e., load a model that you might already have)?
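To illustrate what I mean by loading a model you might have, here is a rough sketch (assuming the Hugging Face `transformers` library) of starting from the already-distilled `distilbert-base-uncased` checkpoint and fine-tuning per task, instead of re-running the Wikipedia + BookCorpus distillation:

```python
from transformers import DistilBertForSequenceClassification, DistilBertTokenizer

# General-purpose distilled student released on the Hub; a fresh
# classification head is added on top and would be fine-tuned per GLUE task.
tokenizer = DistilBertTokenizer.from_pretrained("distilbert-base-uncased")
model = DistilBertForSequenceClassification.from_pretrained(
    "distilbert-base-uncased",
    num_labels=2,  # e.g. binary tasks like SST-2; adjust per task
)
```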
Alternatively, if you have a recent GLUE test-set submission, it would really help to know those numbers.
Thanks!