adieyal / sd-dynamic-prompts

A custom script for AUTOMATIC1111/stable-diffusion-webui to implement a tiny template language for random prompt generation
MIT License

Magic Prompt adding unknown prompts and symbols when using Loras #707

Closed: wiseoldowl-66 closed this issue 8 months ago

wiseoldowl-66 commented 8 months ago

Hello,

Magic Prompt appears to be having some issues when enabled if Loras are present in the main prompt.

With some images, not all, it adds random characters (e.g. < and [ ] ) to the prompt. Additionally, the word 'unknown' frequently appears in the final prompt. Example below. The problem seems to disappear when the Loras are removed from the main prompt.

I'm using SDXL and have tried two different base models and a number of different Loras.

The problem seems to be exacerbated by very short prompts. Provided a longer original prompt was used, increasing the 'Max magic prompt length' did not seem to worsen it. Additionally, using the same short prompts without any Loras did not replicate the issue.

I'm running the latest versions of web-ui and dynamic-prompts, with the Gustavosta/MagicPrompt-Stable-Diffusion prompt model.

Examples (main prompt is up to and including the lora):

[two screenshots of generated prompts showing the stray characters and the inserted word 'unknown']

akx commented 8 months ago

Thanks for the report!

I suppose the Magic Prompt models (which aren't part of the extension, to be clear) haven't been trained on prompts that contain LoRAs, so they get confused and output nonsense.

We could strip out LoRA syntax before feeding the prompt to the Magic Prompt and then put it back in afterwards, but at present, that isn't happening.
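
A minimal sketch of that approach, assuming LoRA tags follow the <lora:name:weight> (or <lyco:...>) form; the regex and the generate_magic_prompt callable are placeholders for illustration, not the extension's actual code:

    import re

    # Assumption: LoRA/LyCORIS tags look like <lora:name:weight> or <lyco:...>.
    LORA_TAG_RE = re.compile(r"<(?:lora|lyco):[^>]+>")

    def magic_prompt_preserving_loras(prompt, generate_magic_prompt):
        """Strip LoRA tags, expand the remaining text with the Magic Prompt
        model, then re-attach the original tags to the result."""
        lora_tags = LORA_TAG_RE.findall(prompt)
        stripped = LORA_TAG_RE.sub("", prompt).strip(" ,")
        expanded = generate_magic_prompt(stripped)
        # Append the untouched LoRA tags after the expanded prompt text.
        return ", ".join(filter(None, [expanded, *lora_tags]))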

wiseoldowl-66 commented 8 months ago

Thank you for the reply. It hadn't occurred to me that the prompt model relies on the existing prompt but of course this makes sense.

Your proposed solution sounds like a good idea. I would offer to help but wouldn't know how!

wiseoldowl-66 commented 8 months ago

Thank you for the quick solution!

akx commented 8 months ago

@perspeculum No problem. I merged #708 now, so if you update the extension, this should work better :)

wiseoldowl-66 commented 8 months ago

I'm getting Python errors on generation with the new merge, and no magic prompts being added. This is the traceback:

Traceback (most recent call last):
      File "E:\SD\webui\webui\modules\scripts.py", line 718, in process
        script.process(p, *script_args)
      File "E:\SD\webui\webui\extensions\sd-dynamic-prompts\sd_dynamic_prompts\dynamic_prompting.py", line 481, in process
        all_prompts, all_negative_prompts = generate_prompts(
      File "E:\SD\webui\webui\extensions\sd-dynamic-prompts\sd_dynamic_prompts\helpers.py", line 93, in generate_prompts
        all_prompts = prompt_generator.generate(prompt, num_prompts, seeds=seeds) or [""]
      File "E:\SD\webui\system\python\lib\site-packages\dynamicprompts\generators\magicprompt.py", line 164, in generate
        magic_prompts = self._generate_magic_prompts(prompts)
      File "E:\SD\webui\webui\extensions\sd-dynamic-prompts\sd_dynamic_prompts\magic_prompt.py", line 20, in _generate_magic_prompts
        magic_prompts = super()._generate_magic_prompts(orig_prompts)
      File "E:\SD\webui\system\python\lib\site-packages\dynamicprompts\generators\magicprompt.py", line 210, in _generate_magic_prompts
        prompts = self._generator(
      File "E:\SD\webui\system\python\lib\site-packages\transformers\pipelines\text_generation.py", line 201, in __call__
        return super().__call__(text_inputs, **kwargs)
      File "E:\SD\webui\system\python\lib\site-packages\transformers\pipelines\base.py", line 1120, in __call__
        return self.run_single(inputs, preprocess_params, forward_params, postprocess_params)
      File "E:\SD\webui\system\python\lib\site-packages\transformers\pipelines\base.py", line 1126, in run_single
        model_inputs = self.preprocess(inputs, **preprocess_params)
      File "E:\SD\webui\system\python\lib\site-packages\transformers\pipelines\text_generation.py", line 205, in preprocess
        prefix + prompt_text, padding=False, add_special_tokens=False, return_tensors=self.framework
    TypeError: can only concatenate str (not "tuple") to str
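
For reference, the final frame is transformers' text-generation preprocess doing roughly prefix + prompt_text, which only works when the input is a plain string. A minimal illustration of the same failure, assuming a tuple was passed where a string was expected (not the extension's code):

    prefix = ""
    prompt_text = ("a photo of a cat",)  # a tuple sneaks in instead of a plain str
    prefix + prompt_text  # TypeError: can only concatenate str (not "tuple") to str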

akx commented 8 months ago

Oops... I'll check that out.

akx commented 8 months ago

@perspeculum Okay, fixed. Silly transformers... Can you try again?

wiseoldowl-66 commented 8 months ago

@akx That appears to be working well now, thank you!

There's a minor formatting issue with the double ,, at the beginning and the missing , before the lora tag, but I don't suppose this is affecting the functionality.

[screenshot of the generated prompt showing the double comma]

akx commented 8 months ago

The double commas and missing spaces shouldn't matter :) (We actually already do try to clean up various cruft from the machine-generated prompts, but evidently not double commas!)
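
A cleanup pass along these lines would cover the double commas; purely illustrative, and not the extension's existing cruft-removal code:

    def tidy_prompt(prompt: str) -> str:
        """Collapse repeated commas, drop empty segments, normalise spacing."""
        parts = (part.strip() for part in prompt.split(","))
        return ", ".join(part for part in parts if part)

    # tidy_prompt(",, masterpiece,  best quality ,<lora:foo:0.8>")
    # -> "masterpiece, best quality, <lora:foo:0.8>"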

I'll go ahead and close this as fixed.