Closed: EnricoBeltramo closed this issue 1 year ago.
I fixed it with a small workaround. In my environment I use DreamBooth too, and to make it work I need to specify the CUDA version, e.g. pip install bitsandbytes-cuda110. This wasn't an issue for diffusers 0.12, but it raises an error on >= 0.13.
At the moment I have fixed it by installing both versions of bitsandbytes: pip install bitsandbytes-cuda110 bitsandbytes
Is there a better solution?
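The "No package metadata was found" error comes from an importlib.metadata lookup by distribution name, which is why a cuda-suffixed install alone does not satisfy it. A minimal stdlib-only sketch to see what your environment exposes (dist_version is a hypothetical helper name, not part of diffusers):

```python
import importlib.metadata

def dist_version(name):
    """Return the installed version of a distribution, or None if absent."""
    try:
        return importlib.metadata.version(name)
    except importlib.metadata.PackageNotFoundError:
        return None

# diffusers >= 0.13 looks up the literal distribution name "bitsandbytes",
# so an environment containing only bitsandbytes-cuda110 fails that lookup.
print(dist_version("bitsandbytes"))
print(dist_version("bitsandbytes-cuda110"))
```

If the first print shows None while the second shows a version, you have reproduced the mismatch this thread is about.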
Hey @EnricoBeltramo,
I cannot reproduce the issue, sadly.
I know this is not diffusers related, but I get the same issue with transformers. With these two installed:
bitsandbytes-cuda117==0.26.0.post2
transformers[audio,deepspeed,ftfy,onnx,sentencepiece,timm,tokenizers,video,vision]==4.28.1
and without bitsandbytes, I run into the same issue, because bitsandbytes and bitsandbytes-cuda117 behave as two different packages:
File "/opt/conda/lib/python3.8/site-packages/transformers/models/auto/auto_factory.py", line 471, in from_pretrained
return model_class.from_pretrained(
File "/opt/conda/lib/python3.8/site-packages/transformers/modeling_utils.py", line 2177, in from_pretrained
is_8bit_serializable = version.parse(importlib_metadata.version("bitsandbytes")) > version.parse("0.37.2")
File "/opt/conda/lib/python3.8/importlib/metadata.py", line 530, in version
return distribution(distribution_name).version
File "/opt/conda/lib/python3.8/importlib/metadata.py", line 503, in distribution
return Distribution.from_name(distribution_name)
File "/opt/conda/lib/python3.8/importlib/metadata.py", line 177, in from_name
raise PackageNotFoundError(name)
importlib.metadata.PackageNotFoundError: bitsandbytes
Installing both fixes the issue for me.
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.
I also have this problem.
Can someone add a link to a Google Colab that reproduces this issue? I'm using PyPI (and not conda) and have PyTorch 2.x installed. I cannot reproduce the issue.
I think it does not matter now, because bitsandbytes ships with libraries for all CUDA versions, and the right one is picked according to whatever torch is using or what is installed on the system. So in practice the bitsandbytes-cuxxx packages are no longer needed.
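Following that point, a small stdlib-only sketch can flag any leftover per-CUDA packages worth uninstalling (assumption: the legacy packages all share the bitsandbytes-cuda name prefix):

```python
import importlib.metadata

# Modern bitsandbytes wheels bundle binaries for multiple CUDA versions,
# so any leftover bitsandbytes-cudaXXX distribution is a candidate for removal.
leftovers = sorted(
    dist.metadata["Name"]
    for dist in importlib.metadata.distributions()
    if (dist.metadata["Name"] or "").startswith("bitsandbytes-cuda")
)
print(leftovers)  # non-empty -> pip uninstall these, then: pip install -U bitsandbytes
```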
Experiencing it while trying to run this finetuning example: https://huggingface.co/blog/g-ronimo/phinetuning
I have created a small sample colab for folks to repro: https://colab.research.google.com/drive/1e3vfyQx8HCnAiYpmzaca7xH545uvtCC9?usp=sharing
@VikramTiwari You can easily solve the problem by installing:
pip install bitsandbytes
I think the unfamiliar error message confused people.
Try using
pip install bitsandbytes
I ran into the same error and this command fixed it for me.
Hello all,
I have done a simple text-summarization project using the T5 model. I successfully fine-tuned my model and ran inference in a notebook; I have installed bitsandbytes, accelerate, trl, and the rest.
But when I push it to Hugging Face and run inference on the serverless API, I get the error "No package metadata was found for bitsandbytes".
Please suggest how I can resolve this issue.
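For a hosted environment like this, the underlying fix is to make sure bitsandbytes is declared wherever the serving environment installs its dependencies (e.g. a requirements.txt, where supported). In the meantime, a fail-fast guard can make the failure clearer; this is a hedged sketch and require_bitsandbytes is a hypothetical helper, not a transformers API:

```python
import importlib.metadata

def require_bitsandbytes():
    """Raise a readable error if the bitsandbytes distribution is missing."""
    try:
        return importlib.metadata.version("bitsandbytes")
    except importlib.metadata.PackageNotFoundError:
        raise RuntimeError(
            "bitsandbytes is not installed in this environment; "
            "declare 'bitsandbytes' in the deployment's dependencies."
        )

# Call this before from_pretrained(...) on a quantized checkpoint so the
# failure surfaces as one clear message instead of a deep traceback.
```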
Try using
pip install bitsandbytes
I ran into the same error and this command fixed it for me.
I don't know why, but this really works!
Yes, because we need to install bitsandbytes; only then can we use quantization in LLMs.
For me, it says
Traceback (most recent call last):
File "/home/heinzketchup/Documents/manim_gen/test.py", line 4, in <module>
model = AutoModelForCausalLM.from_pretrained("thanhkt/codegemma-7B-ManimGen")
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/heinzketchup/Documents/manim_gen/.env/lib64/python3.12/site-packages/transformers/models/auto/auto_factory.py", line 564, in from_pretrained
return model_class.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/heinzketchup/Documents/manim_gen/.env/lib64/python3.12/site-packages/transformers/modeling_utils.py", line 3657, in from_pretrained
hf_quantizer.validate_environment(
File "/home/heinzketchup/Documents/manim_gen/.env/lib64/python3.12/site-packages/transformers/quantizers/quantizer_bnb_4bit.py", line 74, in validate_environment
raise ImportError(
ImportError: Using `bitsandbytes` 4-bit quantization requires the latest version of bitsandbytes: `pip install -U bitsandbytes`
And when I go to update, pip can't find a newer version. It tells me to update pip, and after doing so, it still can't find the update.
Yes, install bitsandbytes using pip.
Describe the bug
I have a working configuration that loads a text2img diffusers model with diffusers 0.12.1. When I switch to a diffusers version >= 0.13.0, I get an error: PackageNotFoundError: No package metadata was found for bitsandbytes
Have some dependencies changed?
Reproduction
Logs
System Info
diffusers version: 0.13.0