huggingface / transformers

🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
https://huggingface.co/transformers
Apache License 2.0

Issue from recent versions: Unexpected in state_dict: embeddings.position_ids #25330

Closed yangheng95 closed 1 year ago

yangheng95 commented 1 year ago

System Info

Who can help?

@sgugger

Information

Tasks

Reproduction

I am sorry, I encountered this issue in this repo: https://github.com/yangheng95/PyABSA/blob/v2/examples-v2/aspect_polarity_classification/inference.py

To reproduce, you can install this package: pip install pyabsa

Or I guess it is possible to understand this issue without installing it.

Expected behavior

I think old checkpoints are supposed to work with the latest transformers versions, but a recent update makes loading them fail. The change affects many models, e.g., https://github.com/huggingface/transformers/blob/8e5d1619b3e57367701d74647e87b95f8dba5409/src/transformers/models/albert/modeling_albert.py#L211

I don't know the context or purpose of this modification. As this is a minor issue, it is fine to ignore it and close this issue.

Thank you for your great work!

sgugger commented 1 year ago

I think the old checkpoints are supposed to work for latest transformers versions.

Only if you use the from_pretrained method. You cannot use load_state_dict without strict=False, since old checkpoints contain keys that we no longer use. In general, from_pretrained is the fully supported way to load models in Transformers.
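To illustrate the behavior described above, here is a minimal sketch (not the actual transformers code) of what happens when `position_ids` is registered as a non-persistent buffer: new `state_dict()` calls omit the key, so an old checkpoint that still contains it fails a strict `load_state_dict` but loads fine with `strict=False`.

```python
import torch
import torch.nn as nn

class Embeddings(nn.Module):
    """Toy module mimicking the pattern in modeling_albert.py."""
    def __init__(self):
        super().__init__()
        self.word_embeddings = nn.Embedding(10, 4)
        # Newer transformers versions register position_ids with
        # persistent=False, so it is excluded from new state_dicts
        # but still present in checkpoints saved by older versions.
        self.register_buffer(
            "position_ids", torch.arange(5).unsqueeze(0), persistent=False
        )

model = Embeddings()

# Simulate an old checkpoint that still contains the buffer.
old_state = model.state_dict()
old_state["position_ids"] = torch.arange(5).unsqueeze(0)

# Strict loading fails on the now-unknown key...
try:
    model.load_state_dict(old_state)
    strict_failed = False
except RuntimeError:
    strict_failed = True

# ...while strict=False simply reports and skips it.
result = model.load_state_dict(old_state, strict=False)
print(strict_failed)               # True
print(result.unexpected_keys)      # ['position_ids']
```

`from_pretrained` handles such mismatched keys internally, which is why it remains the supported loading path.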

yangheng95 commented 1 year ago

Thanks for your clear explanation! @sgugger