Closed: yangheng95 closed this issue 1 year ago
I think the old checkpoints are supposed to work for latest transformers versions.
Only if you use the from_pretrained method. You cannot use torch.load_state_dict without strict=False, since the old checkpoints contain keys that we no longer use. In general, from_pretrained is the fully supported way to load models in Transformers.
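A minimal sketch of the difference with plain torch.nn modules (the OldModel/NewModel classes here are hypothetical stand-ins for an old and a new model version, not code from Transformers):

```python
import torch
import torch.nn as nn

class OldModel(nn.Module):
    """Stand-in for an older model version whose state dict
    still contains a registered position_ids buffer."""
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(4, 4)
        self.register_buffer("position_ids", torch.arange(4))

class NewModel(nn.Module):
    """Stand-in for a newer model version that no longer keeps
    that buffer in its state dict."""
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(4, 4)

old_sd = OldModel().state_dict()
new = NewModel()

# strict=True (the default) raises because of the extra "position_ids" key.
try:
    new.load_state_dict(old_sd)
    raised = False
except RuntimeError:
    raised = True

# strict=False loads the shared weights and merely reports the leftover key.
result = new.load_state_dict(old_sd, strict=False)
print(raised)                  # True
print(result.unexpected_keys)  # ['position_ids']
```

from_pretrained performs this kind of tolerant loading (plus warnings about missing/unexpected keys) for you, which is why it keeps working across versions while a strict manual load does not.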
Thanks for your clear explanation! @sgugger
System Info
transformers version: 4.31.0

Who can help?
@sgugger
Information
Tasks
examples folder (such as GLUE/SQuAD, ...)

Reproduction
I am sorry, I encountered this issue in this repo: https://github.com/yangheng95/PyABSA/blob/v2/examples-v2/aspect_polarity_classification/inference.py
To reproduce, you can install this package: pip install pyabsa
Or it may be possible to understand this issue without an installation.
Expected behavior
I think the old checkpoints are supposed to work with the latest transformers versions, but a recent update makes the loading fail. The update affects many models, e.g., https://github.com/huggingface/transformers/blob/8e5d1619b3e57367701d74647e87b95f8dba5409/src/transformers/models/albert/modeling_albert.py#L211
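If I read the linked change correctly, newer versions register buffers such as position_ids with persistent=False, which removes them from the state dict; old checkpoints that still contain the key then fail a strict load. A minimal sketch of that mechanism in plain torch (the class names are illustrative, not from Transformers):

```python
import torch
import torch.nn as nn

class WithPersistentBuffer(nn.Module):
    """Older style: buffer is persistent, so it appears in the state dict."""
    def __init__(self):
        super().__init__()
        self.register_buffer("position_ids", torch.arange(8))

class WithNonPersistentBuffer(nn.Module):
    """Newer style: persistent=False keeps the buffer out of the state dict,
    so checkpoints saved from the old style no longer match strictly."""
    def __init__(self):
        super().__init__()
        self.register_buffer("position_ids", torch.arange(8), persistent=False)

print("position_ids" in WithPersistentBuffer().state_dict())     # True
print("position_ids" in WithNonPersistentBuffer().state_dict())  # False
```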
I have no idea about the context or purpose of this modification. As this is a minor issue, it is fine to ignore and close it.
Thank you for your great work!