It looks like this is caused by your controlnet.py. Does your controlnet.py work well without video loopback? I haven't tested video loopback with ControlNet yet; it's a good idea, I'll try it later.
OK, I made some changes to sd-webui-controlnet by Mikubill. Now, try installing this version: https://github.com/fishslot/sd-webui-controlnet. After you start the WebUI, you have to run ControlNet once without video loopback, and then, if nothing goes wrong, they can work together.
Thanks! But I got another error:
Loop:1/10,Image:1/99
seed:816578386, subseed:1380438853
100%|██████████████████████████████████████████████████████████████████████████████████| 23/23 [00:02<00:00, 8.84it/s]
Error completing request | 207/22770 [25:38<42:27, 8.86it/s]
Arguments: ('task(y30wa4udmbuedoh)', 0, 'masterpiece, best quality, high quality, absurdres', 'worst quality, low quality, medium quality, deleted, lowres, comic, bad anatomy, bad hands, text, error, missing fingers, extra digit, fewer digits, cropped, jpeg artifacts, signature, watermark, username, blurry', [], <PIL.Image.Image image mode=RGBA size=406x720 at 0x1A0A3BA8F10>, None, None, None, None, None, None, 30, 0, 4, 0, 1, False, False, 1, 1, 7, 0.75, -1.0, -1.0, 0, 0, 0, False, 512, 512, 0, 0, 32, 0, '', '', '', 11, True, 'openpose', 'control_sd15_openpose(fef5e48e)', 1, {'image': array([...], dtype=uint8), 'mask': array([...], dtype=uint8)}, False, 'Just Resize', False, '<ul>\n<li><code>CFG Scale</code> should be 2 or lower.</li>\n</ul>\n', True, True, '', '', True, 50, True, 1, 0, False, 4, 1, '<p style="margin-bottom:0.75em">Recommended settings: Sampling Steps: 80-100, Sampler: Euler a, Denoising strength: 0.8</p>', 128, 8, ['left', 'right', 'up', 'down'], 1, 0.05, 128, 4, 0, ['left', 'right', 'up', 'down'], '', 1, True, 100, False, False, False, False, '', '<p style="margin-bottom:0.75em">Will upscale the image by the selected scale factor; use width and height sliders to set tile size</p>', 64, 0, 2, 1, '', 0, '', 0, '', True, False, False, False, 'C:\\Users\\coron\\Desktop\\StableDiffusion\\stable-diffusion-webui\\training-picker\\extracted-frames\\mmd,dance', 'C:\\Users\\coron\\Desktop\\new', False, '', 127, False, 30, 99, 1, 10, 0.25, True, True, '1', False, '', '', '', '', '', '', '', '', '', '', '', 'None', 0.3, 60) {}
Traceback (most recent call last):
File "C:\Users\coron\desktop\StableDiffusion\stable-diffusion-webui\modules\call_queue.py", line 56, in f
res = list(func(*args, **kwargs))
File "C:\Users\coron\desktop\StableDiffusion\stable-diffusion-webui\modules\call_queue.py", line 37, in f
res = func(*args, **kwargs)
File "C:\Users\coron\desktop\StableDiffusion\stable-diffusion-webui\modules\img2img.py", line 160, in img2img
processed = modules.scripts.scripts_img2img.run(p, *args)
File "C:\Users\coron\desktop\StableDiffusion\stable-diffusion-webui\modules\scripts.py", line 362, in run
processed = script.run(p, *script_args)
File "C:\Users\coron\desktop\StableDiffusion\stable-diffusion-webui\extensions\video_loopback_for_webui\scripts\video_loopback.py", line 554, in run
output_img = img_que.blend_batch(
File "C:\Users\coron\desktop\StableDiffusion\stable-diffusion-webui\extensions\video_loopback_for_webui\scripts\video_loopback.py", line 108, in blend_batch
new_img: Image.Image = blend_average(new_imgs)
File "C:\Users\coron\desktop\StableDiffusion\stable-diffusion-webui\extensions\video_loopback_for_webui\scripts\video_loopback_utils\utils.py", line 57, in blend_average
new_img = Image.blend(new_img, img, 1.0 / (i + 1))
File "C:\Users\coron\desktop\StableDiffusion\stable-diffusion-webui\venv\lib\site-packages\PIL\Image.py", line 3324, in blend
im1.load()
AttributeError: 'numpy.ndarray' object has no attribute 'load'
I think the cause is this function in \stable-diffusion-webui\venv\Lib\site-packages\PIL\Image.py:
def blend(im1, im2, alpha):
    """
    Creates a new image by interpolating between two input images, using
    a constant alpha::

        out = image1 * (1.0 - alpha) + image2 * alpha

    :param im1: The first image.
    :param im2: The second image. Must have the same mode and size as
       the first image.
    :param alpha: The interpolation alpha factor. If alpha is 0.0, a
       copy of the first image is returned. If alpha is 1.0, a copy of
       the second image is returned. There are no restrictions on the
       alpha value. If necessary, the result is clipped to fit into
       the allowed output range.
    :returns: An :py:class:`~PIL.Image.Image` object.
    """

    im1.load()
    im2.load()
    return im1._new(core.blend(im1.im, im2.im, alpha))
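For anyone else hitting this AttributeError: Image.blend() calls .load() on both inputs, so it only accepts PIL.Image objects, while blend_batch was handing blend_average raw numpy arrays. A minimal sketch of the kind of guard that avoids it (assuming frames may arrive as either numpy arrays or PIL images; this is an illustration, not necessarily the exact fix that was committed):

import numpy as np
from PIL import Image

def blend_average(imgs):
    # Blend a sequence of frames into their running average:
    # applying Image.blend(acc, img, 1/(i+1)) in order yields the
    # arithmetic mean of all frames seen so far.
    new_img = None
    for i, img in enumerate(imgs):
        # Image.blend() requires PIL images; convert raw arrays first.
        if isinstance(img, np.ndarray):
            img = Image.fromarray(img)
        if new_img is None:
            new_img = img
        else:
            new_img = Image.blend(new_img, img, 1.0 / (i + 1))
    return new_img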
I just fixed the problem. Update the extension and try again; does the problem still occur?
Thanks! This one runs fine.
It does run, but I'm concerned about these errors:
Error running process: C:\Users\coron\desktop\StableDiffusion\stable-diffusion-webui\extensions\sd-webui-controlnet\scripts\controlnet.py
Traceback (most recent call last):
File "C:\Users\coron\desktop\StableDiffusion\stable-diffusion-webui\modules\scripts.py", line 372, in process
script.process(p, *script_args)
File "C:\Users\coron\desktop\StableDiffusion\stable-diffusion-webui\extensions\sd-webui-controlnet\scripts\controlnet.py", line 231, in process
raise RuntimeError(f"model not found: {model}")
RuntimeError: model not found: True
100%|██████████████████████████████████████████████████████████████████████████████████| 23/23 [00:03<00:00, 6.03it/s]
Loop:1/2,Image:5/20  %|██████▌ | 92/920 [00:16<02:16, 6.07it/s]
seed:60970597, subseed:3379688672
Error running process: C:\Users\coron\desktop\StableDiffusion\stable-diffusion-webui\extensions\sd-webui-controlnet\scripts\controlnet.py
Traceback (most recent call last):
File "C:\Users\coron\desktop\StableDiffusion\stable-diffusion-webui\modules\scripts.py", line 372, in process
script.process(p, *script_args)
File "C:\Users\coron\desktop\StableDiffusion\stable-diffusion-webui\extensions\sd-webui-controlnet\scripts\controlnet.py", line 231, in process
raise RuntimeError(f"model not found: {model}")
RuntimeError: model not found: True
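As for the "model not found: True" lines above: ControlNet raises that when the value in its model slot is not a known model name, and receiving the literal boolean True there is commonly a sign that the flat script-args tuple was sliced at the wrong offset (for example after another extension's UI components changed), so a neighbouring value landed in the model parameter. A toy illustration of that failure mode (all names below are hypothetical, not the real webui or ControlNet internals):

KNOWN_MODELS = {"control_sd15_openpose(fef5e48e)"}

def controlnet_process(enabled, module, model, weight):
    # Mirrors the shape of the check that raises in controlnet.py line 231.
    if model not in KNOWN_MODELS:
        raise RuntimeError(f"model not found: {model}")
    print(f"ControlNet ok: {module} / {model} @ {weight}")

# The slice of the flat args tuple that belongs to ControlNet.
controlnet_args = (True, 'openpose', 'control_sd15_openpose(fef5e48e)', 1.0)

controlnet_process(*controlnet_args)      # correct offset: runs fine

# If the offsets are stale, ControlNet reads a slice that starts too early
# and a boolean ends up in the 'model' parameter.
misaligned = (0, 0) + controlnet_args
try:
    controlnet_process(*misaligned[:4])   # model == True here
except RuntimeError as e:
    print(e)                              # -> model not found: True

Since the WebUI catches errors from a script's process hook and keeps generating, this would also explain why the frames still render while ControlNet likely has no effect on them; updating both extensions and restarting the WebUI is the usual first thing to try.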