Closed jiminbot20 closed 1 year ago
Thanks for letting me know. It looks like the transformers library was upgraded recently, and the `use_amp` attribute is now replaced with `use_cuda_amp` and `use_cpu_amp`. See https://github.com/huggingface/transformers/blob/main/src/transformers/trainer.py, lines 510-513.
In fact, I didn't use mixed precision training in my experiments, so you can simply set `use_amp = False` at line 313. If you want to enable it, you may need to test whether it works with the latest version of transformers.
https://github.com/TideDancer/interspeech21_emotion/blob/6f5851604d5d2367016a020a11949e53f14e0129/run_emotion.py#L355
While running `bash run.sh`, an AttributeError occurs, as captured below.
So I added
`use_amp = True`
at run_emotion.py line 313 temporarily. Is this the right way to train your model?