0xbitches / sd-webui-lcm

Latent Consistency Model for AUTOMATIC1111 Stable Diffusion WebUI
MIT License

TypeError: slice indices must be integers or None or have an __index__ method #37

Open lilyzlt opened 10 months ago

lilyzlt commented 10 months ago

diffusers 0.23.0 or 0.24.0.dev0: both versions fail with the error below:

  File "/data/ComfyUI/custom_nodes/ComfyUI-LCM/lcm/lcm_i2i_pipeline.py", line 304, in __call__
    self.scheduler.set_timesteps(strength, num_inference_steps, lcm_origin_steps)
  File "/data/miniconda3/envs/env-novelai/lib/python3.10/site-packages/diffusers/schedulers/scheduling_lcm.py", line 388, in set_timesteps
    timesteps = lcm_origin_timesteps[::-skipping_step][:num_inference_steps]
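From the traceback, the slice in scheduling_lcm.py can only raise this TypeError if `skipping_step` or `num_inference_steps` is not an integer at that point, and the call above passes the float `strength` as the first positional argument. A minimal sketch of the mechanism with made-up values (not the extension's actual code):

```python
# Minimal sketch with made-up values: if a float such as `strength` ends up in the
# slot that the scheduler treats as num_inference_steps, the slice from the
# traceback fails with exactly this TypeError.
lcm_origin_timesteps = list(range(1000))

num_inference_steps = 0.5  # a float (e.g. a strength value) instead of an int
skipping_step = len(lcm_origin_timesteps) // num_inference_steps  # 2000.0, a float

# Raises: TypeError: slice indices must be integers or None or have an __index__ method
timesteps = lcm_origin_timesteps[::-skipping_step][:num_inference_steps]
```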

matichek commented 10 months ago

Yes, I have the same error, but on vid2vid.

fanweiya commented 10 months ago

I have the same issue

PavelMSW commented 10 months ago

Same for me:

LCM inference time: 54.97430491447449 seconds
Loading pipeline components...: 100%|█████████████████████████████████████████████████| 6/6 [00:01<00:00, 4.63steps/s]
Traceback (most recent call last):
  File "D:\sd.webui\system\python\lib\site-packages\gradio\routes.py", line 488, in run_predict
    output = await app.get_blocks().process_api(
  File "D:\sd.webui\system\python\lib\site-packages\gradio\blocks.py", line 1431, in process_api
    result = await self.call_function(
  File "D:\sd.webui\system\python\lib\site-packages\gradio\blocks.py", line 1103, in call_function
    prediction = await anyio.to_thread.run_sync(
  File "D:\sd.webui\system\python\lib\site-packages\anyio\to_thread.py", line 33, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
  File "D:\sd.webui\system\python\lib\site-packages\anyio\_backends\_asyncio.py", line 877, in run_sync_in_worker_thread
    return await future
  File "D:\sd.webui\system\python\lib\site-packages\anyio\_backends\_asyncio.py", line 807, in run
    result = context.run(func, *args)
  File "D:\sd.webui\system\python\lib\site-packages\gradio\utils.py", line 707, in wrapper
    response = f(*args, **kwargs)
  File "D:\sd.webui\webui\extensions\sd-webui-lcm\scripts\main.py", line 291, in generate_v2v
    result = pipe(
  File "D:\sd.webui\system\python\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "D:\sd.webui\webui\extensions\sd-webui-lcm\lcm\lcm_i2i_pipeline.py", line 305, in __call__
    self.scheduler.set_timesteps(strength, num_inference_steps, original_inference_steps)
  File "D:\sd.webui\system\python\lib\site-packages\diffusers\schedulers\scheduling_lcm.py", line 382, in set_timesteps
    timesteps = lcm_origin_timesteps[::-skipping_step][:num_inference_steps]
TypeError: slice indices must be integers or None or have an __index__ method
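A plausible reading of both tracebacks (an assumption, not a confirmed diagnosis): lcm_i2i_pipeline.py still calls set_timesteps(strength, num_inference_steps, ...) in the argument order of the extension's original LCM scheduler, while the LCMScheduler bundled with diffusers 0.23+ expects num_inference_steps as its first parameter, so the float strength lands where an integer step count is required and the slice fails. A quick way to check which signature the installed diffusers build actually exposes:

```python
import inspect

from diffusers import LCMScheduler

# Print the parameter order of set_timesteps for the installed diffusers version,
# to compare it with the positional call made in lcm_i2i_pipeline.py.
print(inspect.signature(LCMScheduler.set_timesteps))
```

If the first parameter turns out to be num_inference_steps rather than strength, pinning diffusers to the version the extension was written against, or updating the extension's call to pass the arguments by keyword, would be the directions to look at.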