huggingface / peft

🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
https://huggingface.co/docs/peft
Apache License 2.0

loftq_utils.py depends on huggingface_hub.errors, which doesn't appear in some versions of huggingface_hub #2097

Closed mashoutsider closed 2 weeks ago

mashoutsider commented 1 month ago

System Info

loftq_utils.py refers to huggingface_hub.errors

Should the requirements.txt include huggingface_hub?

I have huggingface_hub version 0.19.4 and it does not have huggingface_hub.errors.

Is there a workaround or another version of huggingface_hub?
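One way to check whether the installed huggingface_hub provides the `errors` submodule, without triggering the import error itself, is a small stdlib-only probe (a diagnostic sketch, not part of PEFT; it assumes nothing beyond the module name from the traceback):

```python
# Quick diagnostic: is a module importable in the current environment?
import importlib.util

def has_module(name: str) -> bool:
    """Return True if `name` could be imported, without actually importing it."""
    try:
        return importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:
        # The parent package itself is not installed.
        return False

if __name__ == "__main__":
    # On huggingface_hub 0.19.4 this is expected to print False.
    print("huggingface_hub.errors available:", has_module("huggingface_hub.errors"))
```

If this prints False, upgrading huggingface_hub to a release that ships the `errors` module should resolve the import.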

Who can help?

No response

Information

Tasks

Reproduction

(Same question as in System Info above.)

Expected behavior

(Same question as in System Info above.)

BenjaminBossan commented 1 month ago

It's true that we don't explicitly list huggingface_hub as a requirement, but it's an indirect requirement (e.g. it's a requirement of accelerate which is a requirement of PEFT). For your specific problem, I think it can easily be solved if the error is imported from huggingface_hub.utils instead. I tested this locally with v0.19.4 and it worked. Would you like to create a PR for this?

saeedesmaili commented 3 weeks ago

I'm also getting ModuleNotFoundError: No module named 'huggingface_hub.errors' when running from peft import AutoPeftModelForCausalLM.

Tried in a fresh env with Python 3.10 and pip install transformers peft. Not sure how to fix this.

BenjaminBossan commented 3 weeks ago

Yeah, it's the same issue most likely, with the fix I mentioned above. What version of huggingface_hub are you using? Could you try downgrading it?

saeedesmaili commented 3 weeks ago

I upgraded transformers and xformers, and one of them fixed the issue.