Closed: cian0 closed this issue 1 year ago
This is an xformers error. I'm not really familiar with xformers here; maybe @hafriedlander might be able to help? BTW, lora_pti is not tested with xformers right now.
Hmm. So:
- xformers doesn't support the backward pass on the 3090 for certain unet sizes. This is a known issue, and the xformers team isn't likely to fix it.
- When you enable xformers in https://github.com/cloneofsimo/lora/blob/master/training_scripts/train_lora_dreambooth.py, I work around this by only enabling it for the sizes that do work (see the sketch after this list). The specific code is at https://github.com/cloneofsimo/lora/blob/master/lora_diffusion/xformers_utils.py#L42
- But that code isn't ever called by cli_lora_pti - so how is xformers getting enabled?
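For reference, here's a minimal sketch of that kind of guard - not the exact code from xformers_utils.py, just an illustration: probe a tiny forward/backward through xformers' memory-efficient attention at a given size, and only enable it when the probe succeeds.

```python
import torch
import xformers.ops


def xformers_backward_works(size: int, device: str = "cuda") -> bool:
    """Probe whether memory-efficient attention can run a backward pass at this size."""
    q, k, v = (
        torch.randn(1, 4, size, device=device, requires_grad=True) for _ in range(3)
    )
    try:
        out = xformers.ops.memory_efficient_attention(q, k, v)
        out.sum().backward()  # raises on shapes the backward kernels don't support
        return True
    except Exception:
        return False


# Hypothetical usage: only switch the unet over when the probe passes.
# if xformers_backward_works(512):
#     unet.enable_xformers_memory_efficient_attention()
```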
@cian0 What is the exact version of the Diffusers library you are using?
Indeed, lora_pti doesn't support xformers yet...
Weird - when I run pip list it says 0.9.0, but when I run conda list it says 0.7.0.dev0 for my diffusers lib. I'll try to update both and see if that resolves it.
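One quick way to see which copy of diffusers Python is actually importing (a generic diagnostic snippet, nothing repo-specific):

```python
import diffusers

# Print both the version string and the on-disk location of the package that
# actually gets imported; this settles pip-vs-conda discrepancies like the one above.
print(diffusers.__version__)
print(diffusers.__file__)
```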
Cool, that's what I wondered - some versions of Diffusers (0.9.0 and 0.10.0) tried to automatically enable xformers if it was installed. 0.11 definitely doesn't, so try updating to the latest and see how it goes.
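If upgrading weren't an option, one possible workaround would be to switch memory-efficient attention off explicitly on the unet. This is only a sketch, assuming your diffusers build exposes set_use_memory_efficient_attention_xformers on models; the model id below is purely for illustration.

```python
from diffusers import UNet2DConditionModel

# Example model id, just for illustration.
unet = UNet2DConditionModel.from_pretrained(
    "runwayml/stable-diffusion-v1-5", subfolder="unet"
)

# Explicitly turn xformers memory-efficient attention off so nothing
# auto-enables it behind your back.
unet.set_use_memory_efficient_attention_xformers(False)
```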
Fixed now after updating diffusers, thanks!
My env:
- Ubuntu 22
- xformers 0.0.14.dev0
- torch 1.12.1
- diffusers 0.9.0
- Python 3.9.12
- CUDA 11.7
- RTX 3090
Note that this doesn't occur when I uninstall/reinstall the 0.0.7 lora_diffusion dev branch.