Feature request

Loading adapter models (such as https://huggingface.co/lightonai/MonoQwen2-VL-v0.1/tree/main) fails with a cryptic error message when peft isn't installed:

OSError: lightonai/MonoQwen2-VL-v0.1 does not appear to have a file named pytorch_model.bin, model.safetensors, tf_model.h5, model.ckpt or flax_model.msgpack.

Understanding this error requires the user to know that:

- the model being loaded is a peft adapter
- peft isn't installed in the current environment

To improve UX, it would be useful to show a different error message, such as: "The model lightonai/MonoQwen2-VL-v0.1 is an adapter model. To load it, you need to install peft (hint: run `pip install peft`)."
Motivation
Improve UX. The user may get the impression that the model repository is corrupted.
Your contribution
This feature should probably be implemented by core maintainers that are familiar with the internals of the model loading code.