miccunifi / ladi-vton

[ACM MM 2023] - LaDI-VTON: Latent Diffusion Textual-Inversion Enhanced Virtual Try-On

How can I run this project on an M1 Mac? #42

Open abdurrahmanekr opened 11 months ago

abdurrahmanekr commented 11 months ago

I followed all the instructions this project provides, and I get this error:

ValueError: torch.cuda.is_available() should be True but is False. xformers' memory efficient attention is only available for GPU

After I removed the --enable_xformers_memory_efficient_attention argument from my command, the error changed to:

...src/inference.py", line 226, in main
    generator = torch.Generator("cuda").manual_seed(args.seed)
                ^^^^^^^^^^^^^^^^^^^^^^^
RuntimeError: Device type CUDA is not supported for torch.Generator() api.

I searched for that error and found the PyTorch MPS documentation. I changed the code at src/inference.py:226 to:

generator = torch.Generator("mps").manual_seed(args.seed)

Maybe I'm doing something wrong, because that didn't work either. I was going to try running without a GPU, but I haven't done it yet. Is there a way to disable CUDA/the GPU?
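One possible workaround (a sketch, not tested against this repo): instead of hard-coding `"cuda"` at src/inference.py:226, pick the device at runtime and fall back to a CPU generator if the build doesn't support a generator on the chosen device. The `args.seed` value is replaced here with a literal `42` for illustration.

```python
import torch

# Pick the best available device: CUDA first, then Apple's MPS, then CPU.
if torch.cuda.is_available():
    device = "cuda"
elif getattr(torch.backends, "mps", None) is not None and torch.backends.mps.is_available():
    device = "mps"
else:
    device = "cpu"

# Not every PyTorch build supports torch.Generator on "mps"; falling back
# to a CPU generator keeps seeding working everywhere (42 stands in for args.seed).
try:
    generator = torch.Generator(device).manual_seed(42)
except RuntimeError:
    generator = torch.Generator("cpu").manual_seed(42)
```

Note that even with this change, the rest of the script still has to move the models and tensors to the same device, so this alone may not be enough to run the full pipeline on MPS.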

Thank you

My environment: Macbook Pro 14-inch, 2021 chip: Apple M1 Pro os: 13.6 (22G120) python: 3.11