Thank you for the great work. I was wondering if the model you have for download (model-base-uncased) has already been pre-trained or do we need to do pre-training ourselves?
@hivestrung It has already been pre-trained on Wikipedia + BookCorpus for 40 epochs. You can also continue pre-training on your own data if you are working in another domain.