huggingface / transformers

🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
https://huggingface.co/transformers
Apache License 2.0

Better error message when loading adapter models with peft dependency missing #34733


maxjeblick commented 1 week ago

Feature request

Loading adapter models (such as https://huggingface.co/lightonai/MonoQwen2-VL-v0.1/tree/main) fails when peft isn't installed. The error message `OSError: lightonai/MonoQwen2-VL-v0.1 does not appear to have a file named pytorch_model.bin, model.safetensors, tf_model.h5, model.ckpt or flax_model.msgpack.` is a bit cryptic and requires the user to work out on their own that the repository contains only adapter weights and needs peft to load.

To improve UX, it would be useful to show a different error message such as: "The model lightonai/MonoQwen2-VL-v0.1 is an adapter model. To load it, you need to install peft (hint: run `pip install peft`)."
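As a rough illustration (the function and its signature are hypothetical, not the actual transformers internals), the loading code could check whether the repository only contains an `adapter_config.json` and whether peft is importable before falling back to the generic "no weights found" error:

```python
import importlib.util

# Weight files that from_pretrained currently looks for, per the error message above.
FULL_WEIGHT_FILES = (
    "pytorch_model.bin", "model.safetensors", "tf_model.h5",
    "model.ckpt", "flax_model.msgpack",
)

def missing_weights_message(repo_id, repo_files, peft_available=None):
    """Hypothetical helper: pick a clearer error message when no full
    model weights are found in a repo.

    repo_files: list of filenames present in the repository.
    peft_available: override for testing; defaults to checking the import.
    """
    if peft_available is None:
        peft_available = importlib.util.find_spec("peft") is not None
    # An adapter-only repo ships adapter_config.json instead of full weights.
    if "adapter_config.json" in repo_files and not peft_available:
        return (
            f"The model {repo_id} is an adapter model. To load it, you need "
            "to install peft (hint: run `pip install peft`)."
        )
    # Otherwise keep the existing message.
    return (
        f"{repo_id} does not appear to have a file named "
        + ", ".join(FULL_WEIGHT_FILES) + "."
    )
```

For the repo above, `missing_weights_message("lightonai/MonoQwen2-VL-v0.1", ["adapter_config.json", "adapter_model.safetensors"], peft_available=False)` would then point the user at peft instead of the cryptic file-listing error.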

Motivation

Improve UX. The user may get the impression that the model repository is corrupted.

Your contribution

This feature should probably be implemented by core maintainers who are familiar with the internals of the model-loading code.

LysandreJik commented 1 week ago

That seems fair indeed! Would you like to open a PR with your proposed change? cc @SunMarc @BenjaminBossan