lbeltrame opened 1 year ago
Setting a LoRA instead causes a RecursionError:
12:52:59-997639 ERROR gradio call: RecursionError
╭───────────────────── Traceback (most recent call last) ──────────────────────╮
│ /notebooks/automatic/modules/call_queue.py:34 in f │
│ │
│ 33 │ │ │ try: │
│ ❱ 34 │ │ │ │ res = func(*args, **kwargs) │
│ 35 │ │ │ │ progress.record_results(id_task, res) │
│ │
│ /notebooks/automatic/modules/txt2img.py:66 in txt2img │
│ │
│ 65 │ if processed is None: │
│ ❱ 66 │ │ processed = processing.process_images(p) │
│ 67 │ p.close() │
│ │
│ /notebooks/automatic/modules/processing.py:683 in process_images │
│ │
│ 682 │ │ │ with context_hypertile_vae(p), context_hypertile_unet(p): │
│ ❱ 683 │ │ │ │ res = process_images_inner(p) │
│ 684 │ finally: │
│ │
│ /notebooks/automatic/extensions-builtin/sd-webui-controlnet/scripts/batch_hi │
│ jack.py:42 in processing_process_images_hijack │
│ │
│ 41 │ │ │ # we are not in batch mode, fallback to original function │
│ ❱ 42 │ │ │ return getattr(processing, '__controlnet_original_process_ │
│ 43 │
│ │
│ /notebooks/automatic/modules/processing.py:812 in process_images_inner │
│ │
│ 811 │ │ │ if shared.backend == shared.Backend.ORIGINAL: │
│ ❱ 812 │ │ │ │ uc = get_conds_with_caching(modules.prompt_parser.get │
│ 813 │ │ │ │ c = get_conds_with_caching(modules.prompt_parser.get_ │
│ │
│ ... 2973 frames hidden ... │
│ │
│ /notebooks/automatic/extensions/sd-webui-regional-prompter/scripts/latent.py │
│ :488 in h_Linear_forward │
│ │
│ 487 │ │ if shared.opts.lora_functional: │
│ ❱ 488 │ │ │ return networks.network_forward(self, input, networks.orig │
│ 489 │ │ networks.network_apply_weights(self) │
│ │
│ /notebooks/automatic/extensions-builtin/Lora/networks.py:296 in │
│ network_forward │
│ │
│ 295 │ network_restore_weights_from_backup(module) │
│ ❱ 296 │ network_reset_cached_weight(module) │
│ 297 │ y = original_forward(module, input) │
│ │
│ /notebooks/automatic/extensions-builtin/Lora/networks.py:308 in │
│ network_reset_cached_weight │
│ │
│ 307 def network_reset_cached_weight(self: Union[torch.nn.Conv2d, torch.nn. │
│ ❱ 308 │ self.network_current_names = () │
│ 309 │ self.network_weights_backup = None │
│ │
│ /usr/local/lib/python3.9/dist-packages/torch/nn/modules/module.py:1707 in │
│ __setattr__ │
│ │
│ 1706 │ │ params = self.__dict__.get('_parameters') │
│ ❱ 1707 │ │ if isinstance(value, Parameter): │
│ 1708 │ │ │ if params is None: │
│ │
│ /usr/local/lib/python3.9/dist-packages/torch/nn/parameter.py:10 in │
│ __instancecheck__ │
│ │
│ 9 │ │ return super().__instancecheck__(instance) or ( │
│ ❱ 10 │ │ │ isinstance(instance, torch.Tensor) and getattr(instance, ' │
│ 11 │
╰──────────────────────────────────────────────────────────────────────────────╯
RecursionError: maximum recursion depth exceeded while calling a Python object
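For illustration, the traceback is consistent with a classic monkey-patching pitfall: a wrapper that ends up delegating back to itself because the "original" function it captured was already patched. This is only a minimal, hypothetical sketch of that pattern (the class and function names are invented, not the actual extension code):

```python
class Linear:
    """Stand-in for a torch.nn.Linear whose forward gets hijacked."""
    def forward(self, x):
        return x

def patched_forward(self, x):
    # The wrapper intends to delegate to the unpatched method, but it
    # looks it up after patching, so Linear.forward is the patch itself
    # and every call re-enters this function until the stack overflows.
    return Linear.forward(self, x)

Linear.forward = patched_forward

try:
    Linear().forward(1)
except RecursionError as exc:
    print(type(exc).__name__)  # prints RecursionError
```

In the real setup, two layers of hijacking (regional-prompter's `h_Linear_forward` and the built-in Lora extension's `network_forward`) bounce through `__setattr__` and `__instancecheck__` the same way, so the recursion limit is hit deep inside torch rather than in either extension directly.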
I cannot reproduce; it seems to work just fine with and without a LoRA:
Ok, I see why you cannot reproduce. Can you switch from Attention mode to Latent mode? I forgot to mention this in the original report, but Attention works, while Latent does not.
Yes, I just added a note to the upstream issue. I'll add support for Latent mode soon; right now it's not supported. Attention mode works fine.
Thanks, I'll make sure to use attention mode for now.
Issue Description
When setting up sd-webui-regional-prompter without specifying any LoRA in the prompt, the following exception is raised:
I have no information to determine whether the fault lies in regional-prompter or SD.Next.
Version Platform Description
Ubuntu 20.04, NVIDIA A4000, latest SD.Next master as of 2023-10-20
URL link of the extension
https://github.com/hako-mikan/sd-webui-regional-prompter
URL link of the issue reported in the extension repository
https://github.com/hako-mikan/sd-webui-regional-prompter/issues/255
Acknowledgements