You need to upgrade your transformers version. This param is needed for later versions of the transformers lib. I'll have to go back through the transformers history and see when the code was added, but the parse_xfm code was added in February and transformers 4.11.3 is from October 2021, so it must have been added in a release between those two points.
I'll update amrlib's requirements to reflect the need for a later version of the transformers lib.
The `_internal_call` parameter showed up in transformers v4.16.0. `save_model` must be called with this set to `True` to prevent the code from attempting to push the model to the HF hub on each save during training.
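For reference, a minimal sketch of the kind of guard involved (this is not amrlib's actual code; the `save_checkpoint` wrapper is an illustrative assumption):

```python
# Minimal sketch, not amrlib's actual code: only pass _internal_call when the
# installed transformers version supports it (the kwarg exists in >= 4.16.0).
import transformers
from packaging import version


def save_checkpoint(trainer, output_dir):
    if version.parse(transformers.__version__) >= version.parse("4.16.0"):
        # _internal_call=True stops Trainer.save_model from trying to push the
        # model to the HF hub on every save during training.
        trainer.save_model(output_dir, _internal_call=True)
    else:
        trainer.save_model(output_dir)
```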
@bjascob Thank you for such a quick response and for looking into this issue. This makes my work much easier.
Regarding your requirements.txt commit: may I recommend changing the requirement to:
transformers~=4.16.0
to nail it down to a 4.16 release? Then at least pip will whine if the dependencies are not copacetic when installing new packages, and it protects this code from later releases, since this does appear to be a specific implementation detail rather than a class-contract issue.
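For context, the compatible-release operator (`~=`, PEP 440) pins to the 4.16 series, while a plain minimum bound does not:

```
# Compatible-release pin (PEP 440): equivalent to >=4.16.0, ==4.16.*
transformers~=4.16.0

# A minimum-version bound would instead allow any later release:
# transformers>=4.16.0
```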
My goal is to keep it up to date with the latest releases, not pin it to a specific one. Most people use this for inference, which works fine over a wide range of sub-library versions. I'll consider adding some notes to the readme or install instructions on the "recommended" versions to avoid potential issues.
Sure, but unless you at least change the requirement to 4.16 or above, you may have other people in my situation with an older version already installed.
Thanks again.
I'm training a new corpus from the checkpoint of the pretrained `xfm_base` using `amrlib.models.parse_xfm.Trainer` programmatically. However, there appears to be a kwarg passed that isn't accepted by the HuggingFace `transformers` library (see the stack trace below). I am using `transformers` version `4.11.3`.
Proposed change

I imagine this kwarg was needed for a previous or future version of `transformers`. If that's the case, branching on the `transformers` version would probably be the right choice. For clarity, I am giving the diff that fixed this issue for me:

I'm happy to create a pull request for this if you tell me whether you want the version check and what it should be.
PS: There also might be a way to use Python introspection to get the allowed kwargs for the method.
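A rough sketch of that idea (the `extra_save_kwargs` helper is hypothetical, not part of amrlib or transformers):

```python
# Hypothetical sketch of the introspection approach: inspect the installed
# Trainer.save_model signature and only pass _internal_call if it is accepted.
import inspect
from transformers import Trainer


def extra_save_kwargs():
    params = inspect.signature(Trainer.save_model).parameters
    return {"_internal_call": True} if "_internal_call" in params else {}

# Usage would then be something like:
#   trainer.save_model(output_dir, **extra_save_kwargs())
```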