rmac85 closed this issue 1 year ago.
I can confirm the same problem occurs with my attempt at training.
I think I built a wheel, but I don't know how to save it so I can keep using it until the link is updated. The directory it says it was saved to doesn't seem to exist.
same problem here!
Same (on Google Colab):
/usr/local/lib/python3.8/dist-packages/diffusers/models/attention.py:435: UserWarning: Could not enable memory efficient attention. Make sure xformers is installed correctly and a GPU is available: No such operator xformers::efficient_attention_forward_cutlass - did you forget to build xformers with python setup.py develop?
warnings.warn(
It works when you install the wheel, but nobody likes waiting 42 minutes or so, and then, as I said, once it is finally compiled, there is no way to save it. It would be cool if someone could find a way to compile the wheel and then download it or save it to Drive. It claims to be saved to some elusive tmp location, but I can't find it. I don't really know how any of it works, so maybe that's out of the question.
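One possible way to keep the compiled wheel (just a sketch, assuming a standard Colab runtime with Google Drive mounted; the commit hash and the target folder below are only examples taken from this thread):
from google.colab import drive
drive.mount('/content/drive')
# Build the xformers wheel into a folder on Drive instead of pip's temporary
# build directory, so the .whl file survives the session.
!pip wheel --no-deps "git+https://github.com/facebookresearch/xformers@4c06c79#egg=xformers" -w /content/drive/MyDrive/xformers_wheels
On a later session the saved wheel can then be installed directly with %pip install on the .whl file in that folder, skipping the long compile.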
I have the same problem. Is there any way to fix it locally?
Cool, now trying the compile on Colab with:
%pip install git+https://github.com/facebookresearch/xformers@4c06c79#egg=xformers
Problem on Google Colab:
The following values were not passed to accelerate launch and had defaults used instead:
--num_processes was set to a value of 1
--num_machines was set to a value of 1
--mixed_precision was set to a value of 'no'
--num_cpu_threads_per_process was set to 1 to improve out-of-box performance
To avoid this warning pass in values for each of the problematic parameters or run accelerate config.
/usr/local/lib/python3.8/dist-packages/xformers/_C.so: undefined symbol: _ZNK3c104impl13OperatorEntry20reportSignatureErrorENS0_12CppSignatureE
WARNING: /usr/local/lib/python3.8/dist-packages/xformers/_C.so: undefined symbol: _ZNK3c104impl13OperatorEntry20reportSignatureErrorENS0_12CppSignatureE
Need to compile C++ extensions to get sparse attention support. Please run python setup.py build develop
Fetching 15 files: 100% 15/15 [00:50<00:00, 3.38s/it]
/usr/local/lib/python3.8/dist-packages/diffusers/models/attention.py:435: UserWarning: Could not enable memory efficient attention. Make sure xformers is installed correctly and a GPU is available: No such operator xformers::efficient_attention_forward_cutlass - did you forget to build xformers with python setup.py develop?
warnings.warn(
Generating class images: 0% 0/13 [00:06<?, ?it/s]
Traceback (most recent call last):
File "train_dreambooth.py", line 822, in
Traceback (most recent call last):
File "/usr/local/bin/accelerate", line 8, in
You have to comment out line 1 (with #) and then uncomment line 5, then wait about 42 minutes for a wheel to compile. I did that and then did a quick test, and that worked. I'm sure the link will be updated soon.
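For reference, here is roughly what the edited install cell looks like (a sketch only; the exact line numbers, commit hash and wheel URL depend on the notebook version you are using):
# Line 1: the prebuilt wheel, commented out because the link is currently broken
# %pip install -q https://github.com/metrolobo/xformers_wheels/releases/download/4c06c79_various6/xformers-0.0.15.dev0_4c06c79.d20221201-cp38-cp38-linux_x86_64.whl
# Line 5: build xformers from source instead (takes roughly 40-60+ minutes)
%pip install git+https://github.com/facebookresearch/xformers@4c06c79#egg=xformers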
Thank you so much. I hope it will be updated quickly.
Thank you for this tip. It seems to work now - it got past the training step. The compiling of the wheel did take way longer than 40 minutes, though. 🫠
I've done it a few times before; the wheel gets outdated a lot. It takes about 40 minutes to an hour, and shouldn't go much past 60.
is there any update?
I'm still getting the same error with the "metrolobo/xformers_wheels/releases/download/4c06c79_various6/xformers-0.0.15.dev0_4c06c79.d20221201-cp38-cp38-linux_x86_64.whl" update. I'm using the latest notebook with the xformers wheel from this link.
@ShivamShrirao is there any update? Google Colab is still not working properly.
Same problem here. What versions should we pin in the notebook to prevent these kinds of breakages in the future?
Official xformers binaries are available via conda.
https://github.com/facebookresearch/xformers#installing-xformers
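If conda is available in your environment (it isn't on Colab by default), the official package can be pulled in with something like the following; check the README linked above for the current channel and version:
conda install xformers -c xformers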
Has anyone attempted to create a conda environment in the notebook?
I think this is caused by an update to Colab's Python version (not 100% sure). Another workaround is to install a different Python version.
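That would fit: the prebuilt wheels in this thread are tagged cp38-cp38, so they only install on a runtime that is still running Python 3.8. A quick check of what the Colab runtime is currently on:
!python --version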
I've just done a compile that took 1.5 hours 😬 Hope there'll be a fix soon 🙏🏽 Colab: A100-SXM4-40GB, 40536 MiB, 40536 MiB
Is there any other way to use another link for xformers?
I've put the xformers wheels compiled by facebookresearch here (at least the one that works on Google Colab for Tesla T4 and A100, the only ones I've tested):
Thanks for helping, this solved all my problems <3
Wow, thank you so much, that solved it!!
Thanks @brian6091, I have updated it in 58a03e9aafaeeacadb4392f21f5571cc8da8fd6d.
It worked! I've changed the line of code for the xformers block to:
%pip install -q https://github.com/brian6091/xformers-wheels/releases/download/0.0.15.dev0%2B4c06c79/xformers-0.0.15.dev0+4c06c79.d20221205-cp38-cp38-linux_x86_64.whl
Thanks for steering the wheel for us Brian.
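After installing the wheel, a quick sanity check can confirm the compiled ops actually load (just a sketch; run it in a fresh cell after restarting the runtime):
import xformers
import xformers.ops
# If the build doesn't match the installed torch, the imports above typically
# log the same "undefined symbol ... _C.so" warning shown earlier in this thread.
print(xformers.__version__)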
Describe the bug
WARNING:xformers:WARNING: /usr/local/lib/python3.8/dist-packages/xformers/_C.so: undefined symbol: _ZNK3c104impl13OperatorEntry20reportSignatureErrorENS0_12CppSignatureE
Need to compile C++ extensions to get sparse attention support. Please run python setup.py build develop
/usr/local/lib/python3.8/dist-packages/xformers/_C.so: undefined symbol: _ZNK3c104impl13OperatorEntry20reportSignatureErrorENS0_12CppSignatureE
/usr/local/lib/python3.8/dist-packages/diffusers/models/attention.py:435: UserWarning: Could not enable memory efficient attention. Make sure xformers is installed correctly and a GPU is available: No such operator xformers::efficient_attention_forward_cutlass - did you forget to build xformers with python setup.py develop?
warnings.warn(
Reproduction
No response
Logs
System Info
Colab notebook