Open johnwick123f opened 5 months ago
Inference on SDXL takes around 6 minutes and requires roughly 37GB of VRAM with the default hyper-parameters. Decreasing num_guidance_steps and/or num_gd_iterations will reduce the runtime, at the cost of output quality.
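For illustration, here is a minimal sketch of what lowering those two hyper-parameters might look like, assuming the repo exposes them as keyword arguments on its inference entry point. Only `num_guidance_steps` and `num_gd_iterations` are taken from the answer above; the function name, defaults, and reduced values are hypothetical placeholders.

```python
# Hypothetical sketch only: the real entry point, signature, and default values
# depend on this repository. num_guidance_steps and num_gd_iterations are the
# hyper-parameters named above; everything else is assumed for illustration.

def run_inference(prompt, num_guidance_steps=50, num_gd_iterations=10):
    """Placeholder for the repo's SDXL inference call."""
    ...

# Lowering both hyper-parameters trades output quality for a shorter runtime.
image = run_inference(
    "a photo of an astronaut riding a horse",
    num_guidance_steps=25,   # half of an assumed default
    num_gd_iterations=5,     # half of an assumed default
)
```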
Same as the title: how much VRAM does it take to run SDXL, and how fast is it? Great project btw!