ironjr / StreamMultiDiffusion

Official code for the paper "StreamMultiDiffusion: Real-Time Interactive Generation with Region-Based Semantic Control."
https://jaerinlee.com/research/streammultidiffusion
MIT License
518 stars 43 forks

Is there a way to run on the CPU? Can I run the Dreamshaper series or other lightweight models? #6

Closed mingooglegit closed 4 months ago

mingooglegit commented 5 months ago

Is there a way to run on the CPU? Can I run the Dreamshaper series or other lightweight models?

ironjr commented 5 months ago
  1. Running on the CPU takes much more time than on a GPU, but if you need this feature, I'll try to add an option for it.
  2. Does Dreamshaper mean the SD1.5-based Dreamshaper? The code supports SD1.5, so you can simply pass the .safetensors file as the argument, as described in the README.
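As a rough sketch of point 2, the invocation might look like the following. The `--model` flag name and the checkpoint filename are assumptions for illustration; the actual argument name is in the repo's README.

```python
# Hypothetical sketch: assemble a demo invocation for a local SD1.5-based
# checkpoint such as a Dreamshaper .safetensors file. The `--model` flag
# name is an assumption; check the README for the real argument.
from pathlib import Path


def build_demo_command(checkpoint: str, device: int = 0) -> list[str]:
    """Build an app.py command line for a local .safetensors checkpoint."""
    path = Path(checkpoint)
    if path.suffix != ".safetensors":
        raise ValueError("expected a .safetensors checkpoint")
    return ["python", "app.py", f"--model={path}", f"--device={device}"]


print(" ".join(build_demo_command("dreamshaper_8.safetensors")))
```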
ironjr commented 4 months ago

I just added an option to force a CPU run. Launch the demo apps with python app.py --device=-1 <other options>. However, I don't recommend it: running Stable Diffusion on the CPU is extremely slow.
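A minimal sketch of the device-selection convention described here: `--device=-1` maps to the CPU, while a non-negative integer selects that CUDA GPU index. The function name is illustrative, not the repo's actual code.

```python
import argparse

def resolve_device(index: int) -> str:
    """Map a --device integer to a torch-style device string.

    -1 (or any negative value) forces the CPU; a non-negative value
    selects the CUDA device with that index.
    """
    return "cpu" if index < 0 else f"cuda:{index}"

parser = argparse.ArgumentParser()
parser.add_argument("--device", type=int, default=0)

# e.g. `python app.py --device=-1` would resolve to the CPU:
args = parser.parse_args(["--device", "-1"])
print(resolve_device(args.device))  # -> cpu
```

The resulting string can be passed directly to `torch.device(...)` when constructing the pipeline.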

For example, demo/semantic_palette runs in about 10 seconds on an RTX 2080 Ti, but takes about 8 minutes on a 96-core Xeon CPU. So I would not dare try it.