Open jcuenod opened 5 months ago
I was wondering why both `optim` and `adafactor` were options in the config file. After some digging, it turns out that the `adafactor` option is deprecated in favor of `optim`:
https://github.com/huggingface/transformers/blob/main/src/transformers/training_args.py#L1645
Not much work to do in this repo, but maybe you can drop this line (and remove it from any default configs that you pass around):
https://github.com/sillsdev/silnlp/blob/deaf7695e48bc4819cc342a089d0cb3eff71d17e/silnlp/nmt/hugging_face_config.py#L134
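For anyone hitting this, a sketch of what the swap looks like in a config. This assumes the keys map onto Hugging Face `TrainingArguments` naming, where the boolean `adafactor` flag is deprecated and `"adafactor"` is instead passed as a value of `optim`; the `params:` nesting follows the silnlp wiki's config style and may differ in your setup.

```yaml
# Before (deprecated): enabling Adafactor via the boolean flag
params:
  adafactor: true

# After: selecting the optimizer through `optim` instead
params:
  optim: adafactor
```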
Thanks for investigating this. We will remove it.
Just a note that the wiki should also be updated: https://github.com/sillsdev/silnlp/wiki/Configure-a-model#params