Closed — telxt closed this issue 1 year ago
Thank you for raising this issue. I am looking into this problem!
This is probably due to the transformers library upgrade. It looks like new input variables were added in the newer version.
Solution 1: Use the transformers version stated in the README file. (But it's a pretty outdated version 😢)
Solution 2: Replace the `bart.py` file with this new one (https://github.com/INK-USC/CrossFit/blob/676f801d7cc2c431ddd0e21b9593183d8e95f580/bart.py). The `**model_kwargs` argument will automatically absorb any new input variables introduced by transformers version changes. I have tested this with Python 3.6.9, transformers 4.10.0, torch 1.7.1. I hope it solves your issue.
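To illustrate why forwarding `**model_kwargs` helps here, a minimal sketch (with a hypothetical wrapper class and a dummy stand-in model, not the actual CrossFit `bart.py`): accepting and passing through `**model_kwargs` lets a wrapper tolerate keyword arguments introduced in newer transformers releases without naming each one explicitly.

```python
# Sketch only: hypothetical names, not the real CrossFit implementation.

class BartWrapper:
    def __init__(self, model):
        self.model = model

    def forward(self, input_ids, attention_mask=None, **model_kwargs):
        # Any extra keywords (e.g. inputs added in a newer transformers
        # version) pass straight through to the underlying model, so the
        # wrapper does not break when the library's call signature grows.
        return self.model(input_ids, attention_mask=attention_mask, **model_kwargs)


def dummy_model(input_ids, attention_mask=None, **kwargs):
    # Stand-in for a transformers model; just echoes what it received.
    return {"input_ids": input_ids, "attention_mask": attention_mask, **kwargs}


wrapper = BartWrapper(dummy_model)
# `head_mask` is not listed in forward()'s signature, yet it still arrives
# at the model via **model_kwargs.
out = wrapper.forward([1, 2, 3], attention_mask=[1, 1, 1], head_mask=None)
```

The same pattern is why the linked `bart.py` survives the version bump: unexpected keyword arguments are collected and forwarded rather than raising a `TypeError`.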
Describe the bug
When I try to run `example_scripts/finetune_boolq.sh`, an error is raised.
System information