ljleb / prompt-fusion-extension

auto1111 webui extension for all sorts of prompt interpolations!
MIT License

Txt2img fails with ":o" #40

Closed. R-N closed this issue 1 year ago.

R-N commented 1 year ago

Yes, ":o" is an actual prompt tag: a mouth making an "oo" sound.

Traceback (most recent call last):
  File "/content/nai/stable-diffusion-webui/modules/call_queue.py", line 56, in f
    res = list(func(*args, **kwargs))
  File "/content/nai/stable-diffusion-webui/modules/call_queue.py", line 37, in f
    res = func(*args, **kwargs)
  File "/content/nai/stable-diffusion-webui/modules/txt2img.py", line 56, in txt2img
    processed = process_images(p)
  File "/content/nai/stable-diffusion-webui/modules/processing.py", line 484, in process_images
    res = process_images_inner(p)
  File "/content/nai/stable-diffusion-webui/modules/processing.py", line 616, in process_images_inner
    c = get_conds_with_caching(prompt_parser.get_multicond_learned_conditioning, prompts, p.steps, cached_c)
  File "/content/nai/stable-diffusion-webui/modules/processing.py", line 570, in get_conds_with_caching
    cache[1] = function(shared.sd_model, required_prompts, steps)
  File "/content/nai/stable-diffusion-webui/modules/prompt_parser.py", line 205, in get_multicond_learned_conditioning
    learned_conditioning = get_learned_conditioning(model, prompt_flat_list, steps)
  File "/content/nai/stable-diffusion-webui/extensions/prompt-fusion-extension/lib_prompt_fusion/hijacker.py", line 15, in wrapper
    return function(*args, **kwargs, original_function=self.__original_functions[attribute])
  File "/content/nai/stable-diffusion-webui/extensions/prompt-fusion-extension/scripts/promptlang.py", line 36, in _hijacked_get_learned_conditioning
    tensor_builders = _parse_tensor_builders(prompts, total_steps)
  File "/content/nai/stable-diffusion-webui/extensions/prompt-fusion-extension/scripts/promptlang.py", line 52, in _parse_tensor_builders
    expr = parse_prompt(prompt)
  File "/content/nai/stable-diffusion-webui/extensions/prompt-fusion-extension/lib_prompt_fusion/prompt_parser.py", line 124, in parse_prompt
    return parse_expression(prompt.lstrip())
  File "/usr/local/lib/python3.8/dist-packages/lark/lark.py", line 645, in parse
    return self.parser.parse(text, start=start, on_error=on_error)
  File "/usr/local/lib/python3.8/dist-packages/lark/parser_frontends.py", line 96, in parse
    return self.parser.parse(stream, chosen_start, **kw)
  File "/usr/local/lib/python3.8/dist-packages/lark/parsers/lalr_parser.py", line 41, in parse
    return self.parser.parse(lexer, start)
  File "/usr/local/lib/python3.8/dist-packages/lark/parsers/lalr_parser.py", line 171, in parse
    return self.parse_from_state(parser_state)
  File "/usr/local/lib/python3.8/dist-packages/lark/parsers/lalr_parser.py", line 193, in parse_from_state
    raise e
  File "/usr/local/lib/python3.8/dist-packages/lark/parsers/lalr_parser.py", line 183, in parse_from_state
    for token in state.lexer.lex(state):
  File "/usr/local/lib/python3.8/dist-packages/lark/lexer.py", line 599, in lex
    raise UnexpectedToken(token, e.allowed, state=parser_state, token_history=[last_token], terminals_by_name=self.root_lexer.terminals_by_name)
lark.exceptions.UnexpectedToken: Unexpected token Token('SYMBOL', 'o') at line 1, column 141.
Expected one of: 
    * FREE_FLOAT
Previous tokens: [Token('COLON', ':')]
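
The traceback shows the extension's lark grammar expecting a number (FREE_FLOAT) immediately after a colon, so the o in :o is rejected while parsing. A minimal sketch that reproduces the same kind of failure, using a hypothetical grammar rather than the extension's actual one:

from lark import Lark
from lark.exceptions import UnexpectedToken

# Hypothetical grammar, NOT the extension's real one: it only illustrates
# the same failure mode, where a ':' must be followed by a numeric weight.
GRAMMAR = r"""
start: term+
term: WORD (":" NUMBER)?
WORD: /[a-zA-Z_]+/
NUMBER: /[0-9]*\.?[0-9]+/

%import common.WS
%ignore WS
"""

parser = Lark(GRAMMAR, parser="lalr")

parser.parse("smile :1.2")    # parses fine: the colon is followed by a number

try:
    parser.parse("smile :o")  # fails: 'o' is a WORD, not a NUMBER
except UnexpectedToken as err:
    print(err)                # same class of error as in the traceback above
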
ljleb commented 1 year ago

Ah, right. This goes hand in hand with #16.

A quick workaround is to escape the colon and write \:o instead. You can also fall back to the original webui parser by disabling the extension, either in the settings tab under "Prompt Fusion" or in the extensions tab (followed by clicking "reload ui").
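
For example, assuming an arbitrary surrounding prompt, the escaped version would be written as

    1girl, open mouth, \:o, surprised

instead of

    1girl, open mouth, :o, surprised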

PladsElsker commented 1 year ago

It should be fixed now!