Made it work by editing the source code of bert-score to call tokenizer.encode(max_length=min(512, tokenizer.model_max_length)), since the model does not support sequences longer than 512 tokens, yet tokenizer.model_max_length may somehow exceed that limit (didn't explore further, but at least it worked).
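For illustration, a minimal sketch of the clamping idea described above, applied to a plain Hugging Face tokenizer rather than to bert-score's internals (the text and model name here are placeholders, not from the original report):

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

text = "An example sentence to score."

# Clamp max_length to 512 so encoding never exceeds the model's positional limit,
# even if tokenizer.model_max_length reports a larger (or sentinel) value.
ids = tokenizer.encode(
    text,
    truncation=True,
    max_length=min(512, tokenizer.model_max_length),
)
print(len(ids))  # never more than 512 token ids
```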
Hi. When I followed the readme and ran the evaluation, I encountered the following error. Could you give me some help? Thank you in advance.
Here is the shell script I used.