Closed xuewyang closed 3 years ago
Thanks for your interest.
If you want to replace BERT with RoBERTa, you need to change the tokenization and the parameter names, and then further pre-train the model. Since RoBERTa's pre-training data is much larger than BERT's, the further pre-training should be done carefully to avoid catastrophic forgetting.
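The parameter-name change mentioned above can be sketched as a simple state-dict key remapping. This is a minimal, hedged example: the `"bert."` / `"roberta."` prefixes match the Hugging Face checkpoint conventions, but the exact key list depends on your checkpoint, so inspect its `state_dict()` keys before relying on this.

```python
def remap_bert_key_to_roberta(key: str) -> str:
    """Rename one BERT state-dict key to its RoBERTa counterpart.

    Only the model prefix differs here; task-head keys (e.g. "classifier.*")
    are left untouched. Adjust for any model-specific differences you find.
    """
    prefix = "bert."
    if key.startswith(prefix):
        return "roberta." + key[len(prefix):]
    return key

# Example keys (illustrative, not taken from this repo's checkpoint):
bert_keys = [
    "bert.embeddings.word_embeddings.weight",
    "bert.encoder.layer.0.attention.self.query.weight",
    "classifier.weight",
]
roberta_keys = [remap_bert_key_to_roberta(k) for k in bert_keys]
```

You would apply this mapping to every key in the loaded BERT checkpoint before loading the weights into a RoBERTa model class, then continue pre-training as described above.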
Hi, I would like to start by congratulating you on your impressive work and promising results! I am planning to fine-tune the model on other datasets, but before doing that I am trying to reproduce your results. Would it be possible to provide the evaluation code (the one that produces precision, recall, and F1-score)? I would also greatly appreciate the code to reproduce the results on QQP. Thank you in advance.
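While the repository's own evaluation script is not shown in this thread, the metrics asked for above can be computed from predictions and gold labels with a short, dependency-free sketch (the binary positive label `1` is an assumption; multi-class tasks would need per-class or averaged variants):

```python
def precision_recall_f1(y_true, y_pred, positive=1):
    """Compute binary precision, recall, and F1 for one positive class."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return precision, recall, f1

# Tiny worked example: one true positive, one false positive, one false negative
p, r, f = precision_recall_f1([1, 1, 0, 0], [1, 0, 1, 0])
# p == 0.5, r == 0.5, f == 0.5
```

Note that for QQP specifically, test-set labels are not public, so official numbers still require submitting predictions to the GLUE server; a sketch like this only works on the dev set.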
@GhaithDek Sorry for the delayed response, we uploaded the prediction files to the GLUE website for evaluation.
Hi,
Thank you for your wonderful work. I am wondering if it is possible to replace BERT with RoBERTa, and if so, how to do it?