huggingface / transformers

🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
https://huggingface.co/transformers
Apache License 2.0

Skip eetq test if it attempts to import shard_checkpoint #34868

Closed MekkCyber closed 6 hours ago

MekkCyber commented 19 hours ago

What does this PR do?

PR inspired by the peft fix from @BenjaminBossan: https://github.com/huggingface/peft/pull/2226.

EETQ attempts to import the shard_checkpoint function from the transformers library, but that function has been removed in the latest version. As a result, importing EETQ currently raises an import error, which makes all EETQ tests fail. This fix ensures the EETQ tests are skipped when that import error occurs.
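For context, a minimal sketch of how such an import guard can look with pytest. The names here (`eetq_available`, `require_eetq`, the test function) are illustrative, not necessarily the helpers used in this PR; the point is that the guard catches the ImportError raised when EETQ tries to pull in the removed shard_checkpoint, and skips instead of erroring.

```python
import pytest

# Try the import that currently breaks: EETQ itself imports shard_checkpoint
# from transformers, so this raises ImportError with recent transformers versions.
try:
    import eetq  # noqa: F401
    eetq_available = True
except ImportError:
    eetq_available = False

# Decorator that skips a test (rather than failing it) when EETQ cannot be imported.
require_eetq = pytest.mark.skipif(not eetq_available, reason="EETQ could not be imported")


@require_eetq
def test_eetq_import_guard():
    # Reaching this point means the import above succeeded.
    assert eetq_available
```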

The issue is reported to EETQ: NetEase-FuXi/EETQ#34.

Who can review?

@SunMarc

HuggingFaceDocBuilderDev commented 18 hours ago

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

BenjaminBossan commented 9 hours ago

I have already started a PR: #34854. Sorry for not communicating that earlier.

MekkCyber commented 6 hours ago

No problem, I'll close this one then.