sangminkim-99 / Sketch-Guided-Text-To-Image

Unofficial implementation of Sketch-Guided Text-to-Image Diffusion Models

Issue with xformers #2

Closed rs2125 closed 5 months ago

rs2125 commented 5 months ago

There is some version incompatibility issue with xformers when I try to train the LEP. Which version of xformers should I install?

NotImplementedError: No operator found for `memory_efficient_attention_forward` with inputs:
     query       : shape=(1, 2, 1, 40) (torch.float32)
     key         : shape=(1, 2, 1, 40) (torch.float32)
     value       : shape=(1, 2, 1, 40) (torch.float32)
     attn_bias   : <class 'NoneType'>
     p           : 0.0
`decoderF` is not supported because:
    xFormers wasn't build with CUDA support
    attn_bias type is <class 'NoneType'>
    operator wasn't built - see python -m xformers.info for more info
`flshattF@0.0.0` is not supported because:
    xFormers wasn't build with CUDA support
    requires device with capability > (8, 0) but your GPU has capability (7, 5) (too old)
    dtype=torch.float32 (supported: {torch.float16, torch.bfloat16})
    operator wasn't built - see python -m xformers.info for more info
`cutlassF` is not supported because:
    xFormers wasn't build with CUDA support
    operator wasn't built - see python -m xformers.info for more info
`smallkF` is not supported because:
    max(query.shape[-1] != value.shape[-1]) > 32
    xFormers wasn't build with CUDA support
    operator wasn't built - see python -m xformers.info for more info
    unsupported embed per head: 40

sangminkim-99 commented 5 months ago

I recently upgraded my desktop setup, prompting me to reinstall this repository from scratch.

My current working setup is outlined below:

conda install -c "nvidia/label/cuda-11.6.2" cuda-toolkit
conda install pytorch==1.12.1 torchvision==0.13.1 torchaudio==0.12.1 cudatoolkit=11.6 -c pytorch -c conda-forge
conda install xformers -c xformers
pip install -r requirements.txt
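
As a sanity check after running the commands above (a sketch; the exact output depends on your machine), you can verify the installed versions and confirm that xFormers was built with its CUDA kernels, using the same diagnostic command the error message points to:

```shell
# Print the PyTorch version, its CUDA build version, and whether a GPU is visible.
python -c "import torch; print(torch.__version__, torch.version.cuda, torch.cuda.is_available())"

# List the memory_efficient_attention operators xFormers was built with.
# In a working install, the operators should not report "operator wasn't built".
python -m xformers.info
```

If `xformers.info` still shows the operators as not built, the xformers package was likely installed as a CPU-only build and should be reinstalled against the matching CUDA toolkit.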
sangminkim-99 commented 5 months ago

Also, for the Gradio demo, I had to upgrade gradio from 3.35.2 to 3.44.4:

pip install gradio==3.44.4
rs2125 commented 5 months ago

Thanks a lot. Really helpful!