0xbitches / sd-webui-lcm

Latent Consistency Model for AUTOMATIC1111 Stable Diffusion WebUI

sd-webui-lcm does not select the GPU device according to the platform #22

Closed · edwios closed this 9 months ago

edwios commented 9 months ago

Instead of being hardcoded to use "cuda", it should either take the device from A1111 or use torch.*.is_available() to determine the GPU to use on the platform it is executing on.
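
For example, something along these lines (a minimal sketch; the function name is illustrative):

```python
import torch

def detect_device() -> torch.device:
    """Pick the best available torch device instead of hardcoding "cuda"."""
    if torch.cuda.is_available():           # NVIDIA (or ROCm-built) GPU
        return torch.device("cuda")
    if torch.backends.mps.is_available():   # Apple Silicon GPU
        return torch.device("mps")
    return torch.device("cpu")              # CPU fallback

DEVICE = detect_device()
# e.g. pipe.to(DEVICE) instead of pipe.to("cuda")
```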

ljleb commented 9 months ago

modules.devices exposes the inference device; we should use that, IMO.
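
Roughly like this (a sketch assuming the standard A1111 modules.devices helper; run_lcm_pipeline is a hypothetical wrapper shown only for illustration):

```python
from modules import devices

def run_lcm_pipeline(pipe, *args, **kwargs):
    # Reuse the device A1111 already selected for inference
    # (respects CPU mode, MPS, --device-id, etc.) rather than "cuda".
    pipe.to(devices.device)
    return pipe(*args, **kwargs)
```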

0xbitches commented 9 months ago

Should be fixed in https://github.com/0xbitches/sd-webui-lcm/commit/bddc54285be81b0c45320d6ba9edc8d93fe39806.