addiu opened this issue 3 years ago
Thank you for your issue.
We describe the hyperparameter settings in our paper (see Section 5.2).
For the BERT checkpoints after further pre-training, we share a link in our README (see the section "Further Pre-Trained Checkpoints").
After clicking the link, you can find the IMDB-based checkpoint in the file pytorch_model_len128_imdb.bin.
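For reference, below is a minimal sketch of how that downloaded checkpoint could be loaded. This is only an assumption on my side: it uses the Hugging Face transformers API with bert-base-uncased and `strict=False` to tolerate head-name mismatches, which may differ from the loading code in this repo.

```python
# Minimal sketch (assumption): load pytorch_model_len128_imdb.bin as a
# standard PyTorch state dict on top of bert-base-uncased.
import torch
from transformers import BertForSequenceClassification, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased",
    num_labels=2,  # IMDB is binary sentiment classification
)

# strict=False tolerates LM-head / classifier-head keys that differ
# between the further-pretrained checkpoint and this task model.
state_dict = torch.load("pytorch_model_len128_imdb.bin", map_location="cpu")
missing, unexpected = model.load_state_dict(state_dict, strict=False)
print("missing keys:", missing)
print("unexpected keys:", unexpected)
```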
Dear Yige, thanks a lot for sharing the code! I was wondering if you could provide more detail on the "further pre-training" step on the IMDB dataset, e.g., the hyperparameter settings used for it. Alternatively, would it be possible to share the BERT model that was further pre-trained with the LM objective on the IMDB dataset?