Closed wiseoldowl-66 closed 8 months ago
Thanks for the report!
I suppose the Magic Prompt models (which aren't part of the extension, to be clear) haven't been trained on prompts that contain LoRAs, so they get confused and output nonsense.
We could strip out LoRA syntax before feeding the prompt to the Magic Prompt and then put it back in afterwards, but at present, that isn't happening.
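The strip-and-restore idea could be sketched roughly like this (a hypothetical illustration, not the extension's actual implementation; `run_magic_prompt` is a placeholder for the model call):

```python
import re

# Matches LoRA tags of the form <lora:name:weight>
LORA_RE = re.compile(r"<lora:[^>]+>")

def strip_loras(prompt: str) -> tuple[str, list[str]]:
    """Remove LoRA tags from the prompt, returning the cleaned text and the tags."""
    loras = LORA_RE.findall(prompt)
    stripped = LORA_RE.sub("", prompt).strip()
    return stripped, loras

def restore_loras(prompt: str, loras: list[str]) -> str:
    """Append the previously removed LoRA tags back onto the prompt."""
    return " ".join([prompt, *loras]) if loras else prompt

stripped, loras = strip_loras("a cat <lora:catstyle:0.8>")
# magic = run_magic_prompt(stripped)  # placeholder: Magic Prompt never sees the tags
result = restore_loras(stripped, loras)
```

This way the Magic Prompt model only ever sees plain text it was trained on, and the LoRA tags are reattached afterwards.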
Thank you for the reply. It hadn't occurred to me that the prompt model relies on the existing prompt but of course this makes sense.
Your proposed solution sounds like a good idea. I would offer to help but wouldn't know how!
Thank you for the quick solution!
@perspeculum No problem. I merged #708 now, so if you update the extension, this should work better :)
I'm getting Python errors on generation with the new merge, and no magic prompts being added. This is the traceback:
```
Traceback (most recent call last):
  File "E:\SD\webui\webui\modules\scripts.py", line 718, in process
    script.process(p, *script_args)
  File "E:\SD\webui\webui\extensions\sd-dynamic-prompts\sd_dynamic_prompts\dynamic_prompting.py", line 481, in process
    all_prompts, all_negative_prompts = generate_prompts(
  File "E:\SD\webui\webui\extensions\sd-dynamic-prompts\sd_dynamic_prompts\helpers.py", line 93, in generate_prompts
    all_prompts = prompt_generator.generate(prompt, num_prompts, seeds=seeds) or [""]
  File "E:\SD\webui\system\python\lib\site-packages\dynamicprompts\generators\magicprompt.py", line 164, in generate
    magic_prompts = self._generate_magic_prompts(prompts)
  File "E:\SD\webui\webui\extensions\sd-dynamic-prompts\sd_dynamic_prompts\magic_prompt.py", line 20, in _generate_magic_prompts
    magic_prompts = super()._generate_magic_prompts(orig_prompts)
  File "E:\SD\webui\system\python\lib\site-packages\dynamicprompts\generators\magicprompt.py", line 210, in _generate_magic_prompts
    prompts = self._generator(
  File "E:\SD\webui\system\python\lib\site-packages\transformers\pipelines\text_generation.py", line 201, in __call__
    return super().__call__(text_inputs, **kwargs)
  File "E:\SD\webui\system\python\lib\site-packages\transformers\pipelines\base.py", line 1120, in __call__
    return self.run_single(inputs, preprocess_params, forward_params, postprocess_params)
  File "E:\SD\webui\system\python\lib\site-packages\transformers\pipelines\base.py", line 1126, in run_single
    model_inputs = self.preprocess(inputs, **preprocess_params)
  File "E:\SD\webui\system\python\lib\site-packages\transformers\pipelines\text_generation.py", line 205, in preprocess
    prefix + prompt_text, padding=False, add_special_tokens=False, return_tensors=self.framework
TypeError: can only concatenate str (not "tuple") to str
```
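The failure mode in the traceback can be reproduced in isolation: transformers' text-generation preprocessing concatenates `prefix + prompt_text`, which blows up if the prompt is accidentally passed as a tuple instead of a string (a minimal demonstration, not the extension's code):

```python
prefix = ""
prompt_text = ("a cat, masterpiece",)  # a tuple where a str was expected

try:
    prefix + prompt_text  # this is what the pipeline's preprocess step does
except TypeError as exc:
    print(exc)  # prints: can only concatenate str (not "tuple") to str
```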
Oops... I'll check that out.
@perspeculum Okay, fixed. Silly transformers... Can you try again?
@akx That appears to be working well now, thank you!
There's a minor formatting issue: a double `,,` at the beginning and a missing `,` before the
Hello,
Magic Prompt appears to have some issues when it is enabled and LoRAs are present in the main prompt.
With some images, though not all, it adds stray characters (e.g. `<` and `[ ]`) to the prompt. Additionally, the word 'unknown' frequently appears in the final prompt. Example below. The problem seems to disappear when the LoRAs are removed from the main prompt.
I'm using SDXL and have tried two different base models and a number of different Loras.
The problem seems to be exacerbated by very short prompts. Provided a longer original prompt was used, increasing the 'Max magic prompt length' did not seem to worsen it. Additionally, using the same short prompts without any LoRAs did not reproduce the issue.
I'm running the latest versions of the web UI and the dynamic-prompts extension, and using the Gustavosta/MagicPrompt-Stable-Diffusion prompt model.
Examples (the main prompt is everything up to and including the LoRA):