IndoNLP / indonlu

The first comprehensive natural language processing benchmark for the Indonesian language. We provide multiple downstream tasks, pre-trained IndoBERT models, and starter code! (AACL-IJCNLP 2020)
https://indobenchmark.com
Apache License 2.0

Pre-training perplexity #39

Closed tryanbot closed 1 year ago

tryanbot commented 2 years ago

Hi, I want to ask about the training diary/behaviour. Do you have notes on it, or at least the final perplexity for each of the BERT variants? It would be helpful for the research community in reproducing your research. Thanks.

gentaiscool commented 1 year ago

Hi @tryanbot, sorry for the late reply. We didn't keep the final perplexity values for the pretrained models, so we cannot release those numbers.
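For anyone trying to reproduce the numbers themselves: perplexity is just the exponential of the mean per-token negative log-likelihood (the masked-LM cross-entropy loss, in nats), so it can be recovered from your own pretraining loss logs or from evaluating a released checkpoint on held-out text. A minimal sketch of the conversion (the loss values here are made-up examples, not IndoBERT's actual numbers):

```python
import math

def perplexity(nll_per_token):
    """Perplexity = exp of the mean per-token negative log-likelihood (nats)."""
    return math.exp(sum(nll_per_token) / len(nll_per_token))

# Hypothetical per-token MLM losses; a loss of 2.0 nats/token gives ppl ~7.39.
losses = [2.0, 2.0, 2.0]
print(round(perplexity(losses), 2))  # 7.39
```

With HuggingFace `transformers`, the same conversion applies to the `loss` returned by a masked-LM forward pass with `labels` set, averaged over evaluation batches.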

gentaiscool commented 1 year ago

Feel free to re-open the issue if you have any follow-up questions.