google / prompt-to-prompt

Apache License 2.0

Installation issues: ModuleNotFoundError when trying to run on Colab #28

Closed · m-mandel closed this issue 1 year ago

m-mandel commented 1 year ago

I'm trying to run the Null-text inversion code on Colab, but can't get it set up because of xformers issues.

I think I succeeded in installing the xformers package with `!pip install -U --pre xformers`. The package versions are:

- Torch: 1.13.0+cu116
- xformers: 0.0.16rc396
- diffusers: 0.10.0

But I get the following error:


---------------------------------------------------------------------------
ModuleNotFoundError                       Traceback (most recent call last)
[<ipython-input-15-ab2e3648a6a0>](https://localhost:8080/#) in <module>
      8 ldm_stable = StableDiffusionPipeline.from_pretrained("CompVis/stable-diffusion-v1-4", use_auth_token=MY_TOKEN, scheduler=scheduler).to(device)
      9 try:
---> 10     ldm_stable.disable_xformers_memory_efficient_attention()
     11 except AttributeError:
     12     print("Attribute disable_xformers_memory_efficient_attention() is missing")

7 frames
[/usr/local/lib/python3.8/dist-packages/diffusers/pipeline_utils.py](https://localhost:8080/#) in disable_xformers_memory_efficient_attention(self)
    829         Disable memory efficient attention as implemented in xformers.
    830         """
--> 831         self.set_use_memory_efficient_attention_xformers(False)
    832 
    833     def set_use_memory_efficient_attention_xformers(self, valid: bool) -> None:

[/usr/local/lib/python3.8/dist-packages/diffusers/pipeline_utils.py](https://localhost:8080/#) in set_use_memory_efficient_attention_xformers(self, valid)
    846             module = getattr(self, module_name)
    847             if isinstance(module, torch.nn.Module):
--> 848                 fn_recursive_set_mem_eff(module)
    849 
    850     def enable_attention_slicing(self, slice_size: Optional[Union[str, int]] = "auto"):

[/usr/local/lib/python3.8/dist-packages/diffusers/pipeline_utils.py](https://localhost:8080/#) in fn_recursive_set_mem_eff(module)
    840 
    841             for child in module.children():
--> 842                 fn_recursive_set_mem_eff(child)
    843 
    844         module_names, _, _ = self.extract_init_dict(dict(self.config))

[/usr/local/lib/python3.8/dist-packages/diffusers/pipeline_utils.py](https://localhost:8080/#) in fn_recursive_set_mem_eff(module)
    840 
    841             for child in module.children():
--> 842                 fn_recursive_set_mem_eff(child)
    843 
    844         module_names, _, _ = self.extract_init_dict(dict(self.config))

[/usr/local/lib/python3.8/dist-packages/diffusers/pipeline_utils.py](https://localhost:8080/#) in fn_recursive_set_mem_eff(module)
    840 
    841             for child in module.children():
--> 842                 fn_recursive_set_mem_eff(child)
    843 
    844         module_names, _, _ = self.extract_init_dict(dict(self.config))

[/usr/local/lib/python3.8/dist-packages/diffusers/pipeline_utils.py](https://localhost:8080/#) in fn_recursive_set_mem_eff(module)
    840 
    841             for child in module.children():
--> 842                 fn_recursive_set_mem_eff(child)
    843 
    844         module_names, _, _ = self.extract_init_dict(dict(self.config))

[/usr/local/lib/python3.8/dist-packages/diffusers/pipeline_utils.py](https://localhost:8080/#) in fn_recursive_set_mem_eff(module)
    837         def fn_recursive_set_mem_eff(module: torch.nn.Module):
    838             if hasattr(module, "set_use_memory_efficient_attention_xformers"):
--> 839                 module.set_use_memory_efficient_attention_xformers(valid)
    840 
    841             for child in module.children():

[/usr/local/lib/python3.8/dist-packages/diffusers/models/attention.py](https://localhost:8080/#) in set_use_memory_efficient_attention_xformers(self, use_memory_efficient_attention_xformers)
    289     def set_use_memory_efficient_attention_xformers(self, use_memory_efficient_attention_xformers: bool):
    290         if not is_xformers_available():
--> 291             raise ModuleNotFoundError(
    292                 "Refer to https://github.com/facebookresearch/xformers for more information on how to install"
    293                 " xformers",

ModuleNotFoundError: Refer to https://github.com/facebookresearch/xformers for more information on how to install xformers

What am I doing wrong? Any help would be much appreciated.
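For context, the notebook's `try`/`except` only catches `AttributeError` (for older diffusers versions that lack the method), but in diffusers 0.10.0 the `disable_xformers_memory_efficient_attention()` call itself raises `ModuleNotFoundError` when xformers isn't importable, even though the intent is to turn the feature *off*. One workaround (a sketch I put together, not from this thread; `FakePipeline` and `safe_disable_xformers` are hypothetical stand-ins for illustration) is to broaden the `except` clause:

```python
class FakePipeline:
    """Hypothetical stand-in; in the notebook this would be the
    StableDiffusionPipeline instance loaded via from_pretrained()."""

    def disable_xformers_memory_efficient_attention(self):
        # diffusers 0.10.0 raises ModuleNotFoundError here when xformers
        # is not importable, even though we are trying to *disable* it.
        raise ModuleNotFoundError(
            "Refer to https://github.com/facebookresearch/xformers ..."
        )


def safe_disable_xformers(pipe):
    """Try to disable xformers attention, tolerating both a missing method
    (older diffusers) and a missing xformers install (diffusers 0.10.0)."""
    try:
        pipe.disable_xformers_memory_efficient_attention()
    except (AttributeError, ModuleNotFoundError):
        return False
    return True


print(safe_disable_xformers(FakePipeline()))  # prints False: the error is swallowed
```

With xformers never enabled in the first place, swallowing the error should be harmless, since the pipeline falls back to standard attention either way.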

m-mandel commented 1 year ago

Solved! Downgrading diffusers to 0.8.0 gets it to work.
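For anyone hitting the same error on Colab, the downgrade described above would look something like this (environment-setup snippet; restart the runtime afterwards so the old import is dropped):

```shell
# Pin diffusers to 0.8.0, which predates the strict xformers check
pip install diffusers==0.8.0
```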