Closed: mingooglegit closed this issue 4 months ago
Append `.safetensors` to the argument as described in the README. I also added an option to force CPU execution: run the demo apps with `python app.py --device=-1 <other options>`. That said, I don't recommend it, since running Stable Diffusion on a CPU is extremely slow.
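As a rough sketch of how such a flag could be wired up (the `resolve_device` helper and the treatment of `-1` as CPU are my assumptions, not necessarily what `app.py` actually does):

```python
import argparse

def resolve_device(idx: int) -> str:
    # Hypothetical helper: map a --device index to a torch-style
    # device string; -1 (or any negative index) selects the CPU,
    # a non-negative index selects that CUDA device.
    return "cpu" if idx < 0 else f"cuda:{idx}"

parser = argparse.ArgumentParser()
parser.add_argument("--device", type=int, default=0,
                    help="CUDA device index, or -1 to force CPU")
args = parser.parse_args(["--device", "-1"])
print(resolve_device(args.device))  # → cpu
```

The resulting string can then be passed to e.g. `pipeline.to(device)` in a diffusers-style setup.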
For example, on an RTX 2080 Ti, demo/semantic_palette
runs in about 10 seconds, but on a 96-core Xeon CPU it takes around 8 minutes. So I would not dare try it.
Is there a way to run it on the CPU? Can I run the Dreamshaper series or other lightweight models?