Closed: SaiKeshav closed this issue 2 years ago.
The code works perfectly well for the previous checkpoint bleurt-base-128 but seems to fail for the latest checkpoint, BLEURT-20. So the error may be the result of a compatibility issue between the fine-tuning code and the newly trained model?
Hi, thanks a lot for your feedback! Unfortunately this is correct - the fine-tuning library does not work with the latest checkpoints, as it relies on BERT-specific code (BLEURT-20 uses RemBERT, not BERT). We will update the doc to reflect this.
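Note that scoring with BLEURT-20 is unaffected; only fine-tuning depends on the BERT-specific code. A minimal scoring sketch, assuming the checkpoint has been downloaded and unzipped into a local `BLEURT-20/` directory:

```python
from bleurt import score

# Path to the unzipped BLEURT-20 checkpoint directory (assumed local path).
checkpoint = "BLEURT-20"

scorer = score.BleurtScorer(checkpoint)
scores = scorer.score(
    references=["The cat sat on the mat."],
    candidates=["A cat was sitting on the mat."],
)
print(scores)  # one float per (reference, candidate) pair
```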
Thanks for the great work! I'm trying to fine-tune BLEURT-20 and am running into the same problem. Has this issue been solved, or could you point me to where I should modify the code to fix it? Any help would be appreciated~
Thank you for the great work and for open-sourcing it!
I am trying to follow the instructions in https://github.com/google-research/bleurt/blob/master/checkpoints.md#from-an-existing-bleurt-checkpoint to fine-tune the BLEURT-20 model on a customized set of ratings.
However, when I run the suggested command, I get the following error:

I have checked this with both TensorFlow 2.7 and 1.15.
Any help related to this would be appreciated!
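For context, the command follows the fine-tuning recipe in checkpoints.md and looks roughly like this (paths are placeholders for my own checkpoint and ratings files, and flag names are as I recall them from the docs):

```sh
python -m bleurt.finetune \
  -init_bleurt_checkpoint=BLEURT-20 \
  -model_dir=my_finetuned_bleurt \
  -train_set=my_ratings_train.jsonl \
  -dev_set=my_ratings_dev.jsonl \
  -num_train_steps=500
```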