AUTOMATIC1111 / stable-diffusion-webui

Stable Diffusion web UI
GNU Affero General Public License v3.0

[Bug]: Prompt editing appears to be broken. #10966

Closed amalmin closed 1 year ago

amalmin commented 1 year ago

Is there an existing issue for this?

What happened?

The prompt "a cat, [black:white:0.25]" results in this error:

VisitError: Error trying to process rule "start": not enough values to unpack (expected 4, got 3)

Steps to reproduce the problem

  1. Start webui.
  2. Add prompt "a cat, [black:white:0.25]"
  3. hit Generate

What should have happened?

I think the syntax is correct. It should have worked.

Additionally, other variants don't work:

  [:white:0.25]
  [white:0.25]
  [black::0.25]
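For reference, the [from:to:when] prompt-editing syntax is supposed to swap the fragment at a given sampling step. The following is a toy model of that documented behavior, not the webui's actual prompt_parser (whose lark Transformer is what fails here); the function name is illustrative:

```python
def expand_edit(before: str, after: str, when: float, steps: int):
    """Toy model of [before:after:when]: use `before` until the switch
    step, then `after` for the remaining steps.  A fractional `when` is
    a share of total steps; an integer is an absolute step number."""
    switch = round(when * steps) if when < 1 else int(when)
    return [(switch, before), (steps, after)]

# "[black:white:0.25]" over 20 steps: "black" for steps 1-5, then "white".
print(expand_edit("black", "white", 0.25, 20))  # [(5, 'black'), (20, 'white')]
```

The other variants are shorthand forms of the same rule: [white:0.25] adds "white" from the switch step on, and [black::0.25] drops "black" after it, so all of them exercise the same `scheduled` grammar rule that raises the error.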

Commit where the problem happens

1.3.1 - b6af0a3809ea869fb180633f9affcae4b199ffcf

What Python version are you running on ?

Python 3.9.x (below the recommended version)

What platforms do you use to access the UI ?

Linux

What device are you running WebUI on?

Nvidia GPUs (RTX 20 series or above)

What browsers do you use to access the UI ?

Google Chrome

Command Line Arguments

./webui.sh --xformers --opt-channelslast

List of extensions

a1111-sd-webui-lycoris https://github.com/KohakuBlueleaf/a1111-sd-webui-lycoris.git main b0d24ca6 Wed May 17 00:19:32 2023 unknown
clip-interrogator-ext https://github.com/pharmapsychotic/clip-interrogator-ext.git main 9e6bbd9b Tue Mar 28 03:20:44 2023 unknown
sd-webui-controlnet https://github.com/Mikubill/sd-webui-controlnet.git main 7b707dc1 Tue May 30 09:50:33 2023 unknown
stable-diffusion-webui-promptgen https://github.com/AUTOMATIC1111/stable-diffusion-webui-promptgen.git master 84e58b5d Fri Jan 20 11:15:12 2023 unknown
ultimate-upscale-for-automatic1111 https://github.com/Coyote-A/ultimate-upscale-for-automatic1111.git master 756bb505 Fri May 5 00:22:21 2023 unknown
LDSR built-in None Fri Jun 2 17:43:33 2023
Lora built-in None Fri Jun 2 17:43:33 2023
ScuNET built-in None Fri Jun 2 17:43:33 2023
SwinIR built-in None Fri Jun 2 17:43:33 2023
prompt-bracket-checker built-in None Fri Jun 2 17:43:33 2023

Console logs

Error completing request
Arguments: ('task(mzsbxux3624a09z)', 'a cat, [black::0.25]', 'dog', [], 33, 0, False, False, 1, 1, 7, 2992334299.0, -1.0, 0, 0, 0, False, 512, 512, False, 0.43, 2, '4x_UniversalUpscalerV2-Neutral_115000_swaG', 0, 0, 0, 0, '', '', [], 0, <controlnet.py.UiControlNetUnit object at 0x7f1cb1fea4c0>, <controlnet.py.UiControlNetUnit object at 0x7f1d73eb1190>, <controlnet.py.UiControlNetUnit object at 0x7f1d73ead160>, False, False, 'positive', 'comma', 0, False, False, '', 1, '', [], 0, '', [], 0, '', [], True, False, False, False, 0, None, None, False, None, None, False, None, None, False, 50) {}
Traceback (most recent call last):
  File "/home/amalmin/.local/lib/python3.9/site-packages/lark/visitors.py", line 93, in _call_userfunc
    return f(children)
  File "/home/amalmin/src/stable-diffusion-webui/modules/prompt_parser.py", line 87, in start
    return ''.join(flatten(args))
  File "/home/amalmin/src/stable-diffusion-webui/modules/prompt_parser.py", line 86, in flatten
    yield from flatten(gen)
  File "/home/amalmin/src/stable-diffusion-webui/modules/prompt_parser.py", line 86, in flatten
    yield from flatten(gen)
  File "/home/amalmin/src/stable-diffusion-webui/modules/prompt_parser.py", line 85, in flatten
    for gen in x:
  File "/home/amalmin/src/stable-diffusion-webui/modules/prompt_parser.py", line 76, in scheduled
    before, after, _, when = args
ValueError: not enough values to unpack (expected 4, got 3)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/amalmin/src/stable-diffusion-webui/modules/call_queue.py", line 57, in f
    res = list(func(*args, **kwargs))
  File "/home/amalmin/src/stable-diffusion-webui/modules/call_queue.py", line 37, in f
    res = func(*args, **kwargs)
  File "/home/amalmin/src/stable-diffusion-webui/modules/txt2img.py", line 57, in txt2img
    processed = processing.process_images(p)
  File "/home/amalmin/src/stable-diffusion-webui/modules/processing.py", line 610, in process_images
    res = process_images_inner(p)
  File "/home/amalmin/src/stable-diffusion-webui/extensions/sd-webui-controlnet/scripts/batch_hijack.py", line 42, in processing_process_images_hijack
    return getattr(processing, '__controlnet_original_process_images_inner')(p, *args, **kwargs)
  File "/home/amalmin/src/stable-diffusion-webui/modules/processing.py", line 718, in process_images_inner
    p.setup_conds()
  File "/home/amalmin/src/stable-diffusion-webui/modules/processing.py", line 1096, in setup_conds
    super().setup_conds()
  File "/home/amalmin/src/stable-diffusion-webui/modules/processing.py", line 338, in setup_conds
    self.c = self.get_conds_with_caching(prompt_parser.get_multicond_learned_conditioning, self.prompts, self.steps * self.step_multiplier, self.cached_c)
  File "/home/amalmin/src/stable-diffusion-webui/modules/processing.py", line 328, in get_conds_with_caching
    cache[1] = function(shared.sd_model, required_prompts, steps)
  File "/home/amalmin/src/stable-diffusion-webui/modules/prompt_parser.py", line 208, in get_multicond_learned_conditioning
    learned_conditioning = get_learned_conditioning(model, prompt_flat_list, steps)
  File "/home/amalmin/src/stable-diffusion-webui/modules/prompt_parser.py", line 132, in get_learned_conditioning
    prompt_schedules = get_learned_conditioning_prompt_schedules(prompts, steps)
  File "/home/amalmin/src/stable-diffusion-webui/modules/prompt_parser.py", line 105, in get_learned_conditioning_prompt_schedules
    promptdict = {prompt: get_schedule(prompt) for prompt in set(prompts)}
  File "/home/amalmin/src/stable-diffusion-webui/modules/prompt_parser.py", line 105, in <dictcomp>
    promptdict = {prompt: get_schedule(prompt) for prompt in set(prompts)}
  File "/home/amalmin/src/stable-diffusion-webui/modules/prompt_parser.py", line 103, in get_schedule
    return [[t, at_step(t, tree)] for t in collect_steps(steps, tree)]
  File "/home/amalmin/src/stable-diffusion-webui/modules/prompt_parser.py", line 103, in <listcomp>
    return [[t, at_step(t, tree)] for t in collect_steps(steps, tree)]
  File "/home/amalmin/src/stable-diffusion-webui/modules/prompt_parser.py", line 93, in at_step
    return AtStep().transform(tree)
  File "/home/amalmin/.local/lib/python3.9/site-packages/lark/visitors.py", line 130, in transform
    return self._transform_tree(tree)
  File "/home/amalmin/.local/lib/python3.9/site-packages/lark/visitors.py", line 126, in _transform_tree
    return self._call_userfunc(tree, children)
  File "/home/amalmin/.local/lib/python3.9/site-packages/lark/visitors.py", line 97, in _call_userfunc
    raise VisitError(tree.data, tree, e)
lark.exceptions.VisitError: Error trying to process rule "start":

not enough values to unpack (expected 4, got 3)
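The ValueError comes from modules/prompt_parser.py unpacking exactly four parse-tree children (`before, after, _, when = args`) while the installed lark delivered only three, which suggests a lark version whose token filtering differs from what the grammar expects. Here is a self-contained sketch of that failure mode next to a defensive alternative; the function names and sample child lists are illustrative, not the webui's code:

```python
def scheduled_strict(args):
    # Mirrors the failing line in the trace: requires exactly 4 children.
    before, after, _, when = args
    return before, after, when

def scheduled_lenient(args):
    # Tolerates a punctuation token being filtered out by the parser:
    # the schedule point is always the last child.
    before, after, *rest = args
    return before, after, rest[-1]

three_children = ["black", "white", 0.25]        # what this lark produced
four_children = ["black", "white", ":", 0.25]    # what the code expects

try:
    scheduled_strict(three_children)
except ValueError as e:
    print(e)  # not enough values to unpack (expected 4, got 3)

print(scheduled_lenient(three_children))  # ('black', 'white', 0.25)
print(scheduled_lenient(four_children))   # ('black', 'white', 0.25)
```

Making the installed lark match what the grammar expects (the reinstall later comments describe) is the actual fix; the lenient unpack only illustrates why the child count matters.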

Additional information

No response

amalmin commented 1 year ago

This is now working again. I am not sure what changed. It also works with 1.3.2.

Presumably something within my environment changed, but I can't see what that is.

mp3pintyo commented 1 year ago

It gives me the same error. Any idea what you set up differently?

amalmin commented 1 year ago

> It gives me the same error. Any idea what you set up differently?

Unfortunately, no. I had already updated everything when I posted the issue. Then I stepped away for a day or so, and when I came back it... just worked, infuriatingly. It then continued to work after I upgraded to 1.3.2.

saintty9190 commented 1 year ago

The same bug happened to me. Installing https://gitcode.net/ranting8323/prompt-fusion-extension fixed it, so I suggest trying that extension.

statopre commented 1 year ago

Had the same problem. Deleting venv fixed it.

Saskalex commented 1 year ago

Deleting venv/lib/site-packages/lark* instead of the whole venv should suffice, because the stack trace above points at a problem with the lark parser toolkit. I had the same problem and this fixed it for me.
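One quick sanity check before deleting anything: see which lark copy Python actually loads. Note that the traceback above resolves lark from /home/amalmin/.local/lib/python3.9/site-packages, i.e. a user-site install rather than the webui's venv, which may explain why a stale copy survived upgrades. This snippet is a generic sketch (it only inspects the environment it runs in):

```python
import importlib.util

# Locate the lark package the current interpreter would import.
spec = importlib.util.find_spec("lark")
if spec is None:
    print("lark is not installed; re-launch webui so it reinstalls requirements")
else:
    import lark
    # If this prints a .local (user-site) path instead of the venv,
    # that copy is the one the webui is really using.
    print("lark", getattr(lark, "__version__", "?"), "from", spec.origin)
```

Run it with the venv activated; if the path points outside the venv, removing that user-site copy (or the venv's lark*, as above) and letting the launcher reinstall requirements should resolve the mismatch.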