0xbitches / sd-webui-lcm

Latent Consistency Model for AUTOMATIC1111 Stable Diffusion WebUI
MIT License

Error in img2img #34

Open ato-zen opened 8 months ago

ato-zen commented 8 months ago

/home/user/stable-diffusion-webui/venv/lib/python3.10/site-packages/diffusers/pipelines/pipeline_utils.py:749: FutureWarning: torch_dtype is deprecated and will be removed in version 0.25.0.
  deprecate("torch_dtype", "0.25.0", "")
/home/user/stable-diffusion-webui/venv/lib/python3.10/site-packages/diffusers/pipelines/pipeline_utils.py:752: FutureWarning: torch_device is deprecated and will be removed in version 0.25.0.
  deprecate("torch_device", "0.25.0", "")
Traceback (most recent call last):
  File "/home/user/stable-diffusion-webui/venv/lib/python3.10/site-packages/gradio/routes.py", line 488, in run_predict
    output = await app.get_blocks().process_api(
  File "/home/user/stable-diffusion-webui/venv/lib/python3.10/site-packages/gradio/blocks.py", line 1431, in process_api
    result = await self.call_function(
  File "/home/user/stable-diffusion-webui/venv/lib/python3.10/site-packages/gradio/blocks.py", line 1103, in call_function
    prediction = await anyio.to_thread.run_sync(
  File "/home/user/stable-diffusion-webui/venv/lib/python3.10/site-packages/anyio/to_thread.py", line 33, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
  File "/home/user/stable-diffusion-webui/venv/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 877, in run_sync_in_worker_thread
    return await future
  File "/home/user/stable-diffusion-webui/venv/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 807, in run
    result = context.run(func, *args)
  File "/home/user/stable-diffusion-webui/venv/lib/python3.10/site-packages/gradio/utils.py", line 707, in wrapper
    response = f(*args, **kwargs)
  File "/home/user/stable-diffusion-webui/venv/lib/python3.10/site-packages/gradio/utils.py", line 707, in wrapper
    response = f(*args, **kwargs)
  File "/home/user/stable-diffusion-webui/extensions/sd-webui-lcm/scripts/main.py", line 172, in generate_i2i
    result = pipe(
  File "/home/user/stable-diffusion-webui/venv/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/home/user/stable-diffusion-webui/extensions/sd-webui-lcm/lcm/lcm_i2i_pipeline.py", line 305, in __call__
    self.scheduler.set_timesteps(strength, num_inference_steps, original_inference_steps)
  File "/home/user/stable-diffusion-webui/venv/lib/python3.10/site-packages/diffusers/schedulers/scheduling_lcm.py", line 382, in set_timesteps
    timesteps = lcm_origin_timesteps[::-skipping_step][:num_inference_steps]
TypeError: slice indices must be integers or None or have an __index__ method

navarisun1982 commented 8 months ago

Same error here, and also in LCM vid2vid.

zenphyl commented 7 months ago

same here

Ehplodor commented 7 months ago

There is a conflict between the extension's code and the current set_timesteps in https://github.com/huggingface/diffusers/blame/main/src/diffusers/schedulers/scheduling_lcm.py

ato-zen commented 7 months ago

@Ehplodor I don't think so. Even if you set denoise to 1, another part of the code fails, and so on. This project has been abandoned, just like the ComfyUI one.

Ehplodor commented 7 months ago

@ato-zen Yes, the parameters of set_timesteps aren't aligned between the two versions of the function (the one in diffusers has evolved, in contrast to the one implemented in the LCM i2i pipeline here).

Ehplodor commented 7 months ago

set_timesteps is called on line 305 of the LCM i2i pipeline, but the call lands in the diffusers version, whose parameters are different.

Ehplodor commented 7 months ago

So the LCM denoise strength is actually passed as num_inference_steps to the diffusers version of set_timesteps().
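
To make that concrete, here is a tiny stand-in with roughly the diffusers parameter order; the real LCMScheduler.set_timesteps has more parameters and its defaults vary between diffusers releases, so this is only a sketch of where the three positional arguments end up.

```python
def set_timesteps(num_inference_steps, device=None, original_inference_steps=None):
    """Stand-in mimicking the approximate diffusers parameter order -- a sketch, not the real API."""
    print("num_inference_steps      =", num_inference_steps)
    print("device                   =", device)
    print("original_inference_steps =", original_inference_steps)

# The extension's call shape, as shown in the traceback above:
strength, num_steps, original_steps = 0.5, 4, 50
set_timesteps(strength, num_steps, original_steps)
# num_inference_steps      = 0.5   <- the denoise strength, a float
# device                   = 4     <- the intended number of inference steps
# original_inference_steps = 50
```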

Ehplodor commented 7 months ago

Hence the error about non-integer slice indices whenever the denoise strength is set to anything other than 1.0.
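
A minimal, self-contained reproduction of that failing step: once a float sits in num_inference_steps, the computed skipping step is a float as well, and Python rejects float slice indices. The snippet only imitates the logic around the line shown in the traceback; it is not the actual scheduler code.

```python
# Values as they arrive inside set_timesteps after the positional mix-up:
num_inference_steps = 0.5                          # actually the denoise strength
lcm_origin_timesteps = list(range(19, 1000, 20))   # illustrative 50-entry timestep schedule

skipping_step = len(lcm_origin_timesteps) // num_inference_steps
print(skipping_step)                               # 100.0 -- floor division by a float gives a float

timesteps = lcm_origin_timesteps[::-skipping_step][:num_inference_steps]
# TypeError: slice indices must be integers or None or have an __index__ method
```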

Ehplodor commented 7 months ago

@ato-zen OK, I understand why you say it's been abandoned. The code is out of date and would need several updates. Thanks for pointing that out. Do you know of an alternative for the A1111 webui?
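
For anyone who would rather patch the extension locally than switch tools, here is a sketch of how the call on line 305 of lcm_i2i_pipeline.py could be rewritten with keyword arguments so each value reaches the parameter it is meant for. Whether the installed LCMScheduler.set_timesteps accepts a strength keyword depends on the diffusers version, so check the local scheduling_lcm.py first; this is untested.

```python
# Hypothetical replacement for the call at lcm/lcm_i2i_pipeline.py, line 305 (untested sketch):
# pass values by keyword so nothing lands in the wrong slot of the diffusers signature.
self.scheduler.set_timesteps(
    num_inference_steps,
    original_inference_steps=original_inference_steps,
    strength=strength,   # only if the installed LCMScheduler.set_timesteps accepts `strength`
)
```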

ato-zen commented 7 months ago

The LCM XL and LCM SD 1.5 checkpoints are blazing fast for upscaling with tile: https://civitai.com/images/3772590