AUTOMATIC1111 / stable-diffusion-webui

Stable Diffusion web UI

[Bug]: Failed to match keys when loading network /home/moon/develop/stable-diffusion-webui/models/Lora/testcar.safetensors: #12448

Closed. p-moon closed this issue 1 year ago.

p-moon commented 1 year ago

Is there an existing issue for this?

What happened?

I trained a testcar.safetensors file using the training script provided by the diffusers library. When I use the webui to load the LoRA and generate images, the following error is reported:

Failed to match keys when loading network /home/moon/develop/stable-diffusion-webui/models/Lora/testcar.safetensors: {'unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.processor.to_k_lora.down.weight': 'unet', 'unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.processor.to_k_lora.up.weight': 'unet', 'unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.processor.to_out_lora.down.weight': 'unet', ...}

(truncated; the full key listing is reproduced in the Console logs section below)

Steps to reproduce the problem

My training script:

#!/bin/bash

export MODEL_NAME="../../models/stable-diffusion/stable-diffusion-2-1"
export INSTANCE_DIR="/home/moon/develop/stable-diffusion/stable-diffusion-train/images"
export CLASS_DIR=""
export OUTPUT_DIR="./new_models"

accelerate launch train_dreambooth_lora.py \
  --pretrained_model_name_or_path=$MODEL_NAME  \
  --instance_data_dir=$INSTANCE_DIR \
  --output_dir=$OUTPUT_DIR \
  --instance_prompt="a photo of aodicar" \
  --resolution=256 \
  --train_batch_size=4 \
  --gradient_accumulation_steps=1 \
  --checkpointing_steps=5 \
  --learning_rate=1e-4 \
  --report_to="wandb" \
  --lr_scheduler="constant" \
  --lr_warmup_steps=0 \
  --max_train_steps=10000 \
  --num_train_epochs=100 \
  --validation_prompt="A photo of blue aodicar in the exhibition hall" \
  --validation_epochs=5 \
  --seed="$RANDOM"

The LoRA-saving section of train_dreambooth_lora.py, which produces the diffusers-style key names seen in the error:

    # Save the lora layers
    accelerator.wait_for_everyone()
    if accelerator.is_main_process:
        unet = accelerator.unwrap_model(unet)
        unet = unet.to(torch.float32)
        unet_lora_layers = unet_attn_processors_state_dict(unet)

        if text_encoder is not None and args.train_text_encoder:
            text_encoder = accelerator.unwrap_model(text_encoder)
            text_encoder = text_encoder.to(torch.float32)
            text_encoder_lora_layers = text_encoder_lora_state_dict(text_encoder)
        else:
            text_encoder_lora_layers = None

        LoraLoaderMixin.save_lora_weights(
            save_directory=args.output_dir,
            unet_lora_layers=unet_lora_layers,
            text_encoder_lora_layers=text_encoder_lora_layers,
            safe_serialization=True,
        )
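
For reference, here is a minimal sketch (not part of the training script) of loading the saved weights back with diffusers, to confirm the file itself is readable; the paths and prompt mirror the training command above, and it assumes a diffusers version that provides pipe.load_lora_weights:

import torch
from diffusers import StableDiffusionPipeline

# Base model the LoRA was trained against (same path as MODEL_NAME above).
pipe = StableDiffusionPipeline.from_pretrained(
    "../../models/stable-diffusion/stable-diffusion-2-1",
    torch_dtype=torch.float16,
).to("cuda")

# load_lora_weights accepts the output directory written by save_lora_weights.
pipe.load_lora_weights("./new_models")

image = pipe("a photo of aodicar").images[0]
image.save("aodicar-test.png")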

What should have happened?

The LoRA should load correctly and be usable for generating images.

Version or Commit where the problem happens

68f336bd994bed5442ad95bad6b6ad5564a5409a

What Python version are you running on?

None

What platforms do you use to access the UI?

Linux

What device are you running WebUI on?

Other GPUs

Cross attention optimization

Automatic

What browsers do you use to access the UI?

No response

Command Line Arguments

python launch.py

List of extensions

Extension | Source | Branch | Commit | Updated | Status
DreamArtist-sd-webui-extension | git@github.com:7eu7d7/DreamArtist-sd-webui-extension.git | master | 12f80775 | Mon Apr 24 05:53:26 2023 | unknown
Stable-Diffusion-Webui-Civitai-Helper | git@github.com:butaixianran/Stable-Diffusion-Webui-Civitai-Helper.git | main | 920ca326 | Tue May 23 11:53:22 2023 | unknown
a1111-sd-webui-locon | git@github.com:KohakuBlueleaf/a1111-sd-webui-locon.git | main | b6911354 | Wed Jun 14 13:56:44 2023 | unknown
a1111-sd-webui-lycoris | git@github.com:KohakuBlueleaf/a1111-sd-webui-lycoris.git | main | 8e97bf54 | Sun Jul 9 07:44:58 2023 | unknown
a1111-sd-webui-tagcomplete | git@github.com:DominikDoom/a1111-sd-webui-tagcomplete.git | main | 7f188563 | Sun Jun 25 10:34:57 2023 | unknown
lora-prompt-tool | git@github.com:a2569875/lora-prompt-tool.git | main | f45a77e7 | Fri Jul 7 09:27:16 2023 | unknown
sd-webui-bilingual-localization | git@github.com:journey-ad/sd-webui-bilingual-localization.git | main | 89a02280 | Thu Apr 27 03:54:25 2023 | unknown
sd-webui-controlnet | git@github.com:Mikubill/sd-webui-controlnet.git | main | b9e09db3 | Mon Jul 3 04:30:20 2023 | unknown
sd-webui-deforum | git@github.com:deforum-art/sd-webui-deforum.git | automatic1111-webui | b58056f9 | Tue Jun 6 21:47:35 2023 | unknown
sd-webui-model-converter | git@github.com:Akegarasu/sd-webui-model-converter.git | main | 2a3834d7 | Wed Jun 28 13:41:44 2023 | unknown
sd-webui-prompt-all-in-one | git@github.com:Physton/sd-webui-prompt-all-in-one.git | main | bfd1929b | Sun Jul 2 01:53:24 2023 | unknown
sd-webui-prompt-history | https://github.com/namkazt/sd-webui-prompt-history | main | bde3e95e | Sat Jul 15 06:23:45 2023 | unknown
sd-webui-segment-anything | https://github.com/continue-revolution/sd-webui-segment-anything.git | master | ffe26315 | Wed Jun 28 18:49:49 2023 | unknown
sd-webui-train-tools | git@github.com:liasece/sd-webui-train-tools.git | main | a5a2b889 | Tue Jun 6 05:47:23 2023 | unknown
sd-webui-xldemo-txt2img | git@github.com:lifeisboringsoprogramming/sd-webui-xldemo-txt2img.git | main | e193c1c1 | Mon Jul 17 00:58:40 2023 | unknown
sdweb-easy-prompt-selector | git@github.com:lijialong1313/sdweb-easy-prompt-selector.git | main | 288e796a | Fri Jun 16 03:04:33 2023 | unknown
sdweb-easy-prompt-selector-zhcn | git@github.com:fiua/sdweb-easy-prompt-selector-zhcn.git | main | 2a5b2594 | Sun Jul 9 23:08:04 2023 | unknown
stable-diffusion-webui-localization-zh_CN | git@github.com:dtlnor/stable-diffusion-webui-localization-zh_CN.git | main | 582ca24d | Thu Mar 30 07:06:14 2023 | unknown
LDSR | built-in | None | | Thu Aug 10 09:00:56 2023 |
Lora | built-in | None | | Thu Aug 10 09:00:56 2023 |
ScuNET | built-in | None | | Thu Aug 10 09:00:56 2023 |
SwinIR | built-in | None | | Thu Aug 10 09:00:56 2023 |
canvas-zoom-and-pan | built-in | None | | Thu Aug 10 09:00:56 2023 |
extra-options-section | built-in | None | | Thu Aug 10 09:00:56 2023 |
mobile | built-in | None | | Thu Aug 10 09:00:56 2023 |
prompt-bracket-checker | built-in | | | |

Console logs

(stable-diffusion-webui) ☁  stable-diffusion-webui [master] ⚡  python launch.py
Python 3.10.6 (main, Oct 24 2022, 16:07:47) [GCC 11.2.0]
Version: v1.5.1
Commit hash: 68f336bd994bed5442ad95bad6b6ad5564a5409a
Installing requirements

Installing requirements for scikit_learn

Installing sd-webui-xl-demo requirements_webui.txt

Launching Web UI with arguments:
Setting ds_accelerator to cuda (auto detect)
2023-08-10 16:59:47.940675: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
no module 'xformers'. Processing without...
no module 'xformers'. Processing without...
No module 'xformers'. Proceeding without it.
Skipping broken symlink: /home/moon/develop/stable-diffusion-webui/models/Stable-diffusion/sd_xl_base_0.9.safetensors
Skipping broken symlink: /home/moon/develop/stable-diffusion-webui/models/Stable-diffusion/sd_xl_refiner_0.9.safetensors
Civitai Helper: Get Custom Model Folder
Civitai Helper: Load setting from: /home/moon/develop/stable-diffusion-webui/extensions/Stable-Diffusion-Webui-Civitai-Helper/setting.json
Additional Network extension not installed, Only hijack built-in lora
LoCon Extension hijack built-in lora successfully
[lora-prompt-tool] Get Custom Model Folder
dirname:  /home/moon/develop/stable-diffusion-webui/localizations
localizations:  {'chinese-all-0512': '/home/moon/develop/stable-diffusion-webui/localizations/chinese-all-0512.json', 'chinese-english-0512': '/home/moon/develop/stable-diffusion-webui/localizations/chinese-english-0512.json', 'zh_CN': '/home/moon/develop/stable-diffusion-webui/extensions/stable-diffusion-webui-localization-zh_CN/localizations/zh_CN.json'}
2023-08-10 16:59:52,649 - ControlNet - INFO - ControlNet v1.1.228
ControlNet preprocessor location: /home/moon/develop/stable-diffusion-webui/extensions/sd-webui-controlnet/annotator/downloads
2023-08-10 16:59:52,713 - ControlNet - INFO - ControlNet v1.1.228
sd-webui-prompt-all-in-one background API service started successfully.
Loading model /home/moon/develop/models/stable-diffusion/stable-diffusion-xl-base-1.0
The config attributes {'add_watermarker': None} were passed to StableDiffusionXLPipeline, but are not expected and will be ignored. Please verify your model_index.json configuration file.
Keyword arguments {'add_watermarker': None} are not expected by StableDiffusionXLPipeline and will be ignored.
The config attributes {'force_upcast': True} were passed to AutoencoderKL, but are not expected and will be ignored. Please verify your config.json configuration file.
Loading model /home/moon/develop/models/stable-diffusion/stable-diffusion-xl-refiner-1.0
The config attributes {'force_upcast': True} were passed to AutoencoderKL, but are not expected and will be ignored. Please verify your config.json configuration file.
Loading weights [dcd690123c] from /home/moon/develop/stable-diffusion-webui/models/Stable-diffusion/v2-1_768-ema-pruned.safetensors
Running on local URL:  http://127.0.0.1:7860

To create a public link, set `share=True` in `launch()`.
Startup time: 24.1s (launcher: 12.3s, import torch: 2.7s, import gradio: 0.7s, setup paths: 1.0s, other imports: 0.8s, opts onchange: 0.8s, load scripts: 3.9s, create ui: 0.7s, gradio launch: 0.8s, app_started_callback: 0.3s).
Creating model from config: /home/moon/develop/stable-diffusion-webui/repositories/stable-diffusion-stability-ai/configs/stable-diffusion/v2-inference-v.yaml
LatentDiffusion: Running in v-prediction mode
DiffusionWrapper has 865.91 M params.
Applying attention optimization: Doggettx... done.
Model loaded in 4.4s (load weights from disk: 0.9s, find config: 1.4s, create model: 0.2s, apply weights to model: 0.8s, apply half(): 0.5s, move model to device: 0.4s, calculate empty prompt: 0.1s).
Failed to match keys when loading network /home/moon/develop/stable-diffusion-webui/models/Lora/testcar.safetensors: {'unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.processor.to_k_lora.down.weight': 'unet', 'unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.processor.to_k_lora.up.weight': 'unet', 'unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.processor.to_out_lora.down.weight': 'unet', 'unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.processor.to_out_lora.up.weight': 'unet', 'unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.processor.to_q_lora.down.weight': 'unet', 'unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.processor.to_q_lora.up.weight': 'unet', 'unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.processor.to_v_lora.down.weight': 'unet', 'unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.processor.to_v_lora.up.weight': 'unet', 'unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.processor.to_k_lora.down.weight': 'unet', 'unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.processor.to_k_lora.up.weight': 'unet', 'unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.processor.to_out_lora.down.weight': 'unet', 'unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.processor.to_out_lora.up.weight': 'unet', 'unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.processor.to_q_lora.down.weight': 'unet', 'unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.processor.to_q_lora.up.weight': 'unet', 'unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.processor.to_v_lora.down.weight': 'unet', 'unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.processor.to_v_lora.up.weight': 'unet', 'unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.processor.to_k_lora.down.weight': 'unet', 'unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.processor.to_k_lora.up.weight': 'unet', 'unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.processor.to_out_lora.down.weight': 'unet', 'unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.processor.to_out_lora.up.weight': 'unet', 'unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.processor.to_q_lora.down.weight': 'unet', 'unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.processor.to_q_lora.up.weight': 'unet', 'unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.processor.to_v_lora.down.weight': 'unet', 'unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.processor.to_v_lora.up.weight': 'unet', 'unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.processor.to_k_lora.down.weight': 'unet', 'unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.processor.to_k_lora.up.weight': 'unet', 'unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.processor.to_out_lora.down.weight': 'unet', 'unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.processor.to_out_lora.up.weight': 'unet', 'unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.processor.to_q_lora.down.weight': 'unet', 'unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.processor.to_q_lora.up.weight': 'unet', 'unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.processor.to_v_lora.down.weight': 'unet', 'unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.processor.to_v_lora.up.weight': 'unet', 'unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.processor.to_k_lora.down.weight': 'unet', 
'unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.processor.to_k_lora.up.weight': 'unet', 'unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.processor.to_out_lora.down.weight': 'unet', 'unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.processor.to_out_lora.up.weight': 'unet', 'unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.processor.to_q_lora.down.weight': 'unet', 'unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.processor.to_q_lora.up.weight': 'unet', 'unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.processor.to_v_lora.down.weight': 'unet', 'unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.processor.to_v_lora.up.weight': 'unet', 'unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.processor.to_k_lora.down.weight': 'unet', 'unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.processor.to_k_lora.up.weight': 'unet', 'unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.processor.to_out_lora.down.weight': 'unet', 'unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.processor.to_out_lora.up.weight': 'unet', 'unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.processor.to_q_lora.down.weight': 'unet', 'unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.processor.to_q_lora.up.weight': 'unet', 'unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.processor.to_v_lora.down.weight': 'unet', 'unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.processor.to_v_lora.up.weight': 'unet', 'unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.processor.to_k_lora.down.weight': 'unet', 'unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.processor.to_k_lora.up.weight': 'unet', 'unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.processor.to_out_lora.down.weight': 'unet', 'unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.processor.to_out_lora.up.weight': 'unet', 'unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.processor.to_q_lora.down.weight': 'unet', 'unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.processor.to_q_lora.up.weight': 'unet', 'unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.processor.to_v_lora.down.weight': 'unet', 'unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.processor.to_v_lora.up.weight': 'unet', 'unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.processor.to_k_lora.down.weight': 'unet', 'unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.processor.to_k_lora.up.weight': 'unet', 'unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.processor.to_out_lora.down.weight': 'unet', 'unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.processor.to_out_lora.up.weight': 'unet', 'unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.processor.to_q_lora.down.weight': 'unet', 'unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.processor.to_q_lora.up.weight': 'unet', 'unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.processor.to_v_lora.down.weight': 'unet', 'unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.processor.to_v_lora.up.weight': 'unet', 'unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.processor.to_k_lora.down.weight': 'unet', 'unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.processor.to_k_lora.up.weight': 'unet', 'unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.processor.to_out_lora.down.weight': 'unet', 'unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.processor.to_out_lora.up.weight': 'unet', 
'unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.processor.to_q_lora.down.weight': 'unet', 'unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.processor.to_q_lora.up.weight': 'unet', 'unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.processor.to_v_lora.down.weight': 'unet', 'unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.processor.to_v_lora.up.weight': 'unet', 'unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.processor.to_k_lora.down.weight': 'unet', 'unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.processor.to_k_lora.up.weight': 'unet', 'unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.processor.to_out_lora.down.weight': 'unet', 'unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.processor.to_out_lora.up.weight': 'unet', 'unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.processor.to_q_lora.down.weight': 'unet', 'unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.processor.to_q_lora.up.weight': 'unet', 'unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.processor.to_v_lora.down.weight': 'unet', 'unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.processor.to_v_lora.up.weight': 'unet', 'unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.processor.to_k_lora.down.weight': 'unet', 'unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.processor.to_k_lora.up.weight': 'unet', 'unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.processor.to_out_lora.down.weight': 'unet', 'unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.processor.to_out_lora.up.weight': 'unet', 'unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.processor.to_q_lora.down.weight': 'unet', 'unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.processor.to_q_lora.up.weight': 'unet', 'unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.processor.to_v_lora.down.weight': 'unet', 'unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.processor.to_v_lora.up.weight': 'unet', 'unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.processor.to_k_lora.down.weight': 'unet', 'unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.processor.to_k_lora.up.weight': 'unet', 'unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.processor.to_out_lora.down.weight': 'unet', 'unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.processor.to_out_lora.up.weight': 'unet', 'unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.processor.to_q_lora.down.weight': 'unet', 'unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.processor.to_q_lora.up.weight': 'unet', 'unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.processor.to_v_lora.down.weight': 'unet', 'unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.processor.to_v_lora.up.weight': 'unet', 'unet.mid_block.attentions.0.transformer_blocks.0.attn1.processor.to_k_lora.down.weight': 'unet', 'unet.mid_block.attentions.0.transformer_blocks.0.attn1.processor.to_k_lora.up.weight': 'unet', 'unet.mid_block.attentions.0.transformer_blocks.0.attn1.processor.to_out_lora.down.weight': 'unet', 'unet.mid_block.attentions.0.transformer_blocks.0.attn1.processor.to_out_lora.up.weight': 'unet', 'unet.mid_block.attentions.0.transformer_blocks.0.attn1.processor.to_q_lora.down.weight': 'unet', 'unet.mid_block.attentions.0.transformer_blocks.0.attn1.processor.to_q_lora.up.weight': 'unet', 'unet.mid_block.attentions.0.transformer_blocks.0.attn1.processor.to_v_lora.down.weight': 'unet', 
'unet.mid_block.attentions.0.transformer_blocks.0.attn1.processor.to_v_lora.up.weight': 'unet', 'unet.mid_block.attentions.0.transformer_blocks.0.attn2.processor.to_k_lora.down.weight': 'unet', 'unet.mid_block.attentions.0.transformer_blocks.0.attn2.processor.to_k_lora.up.weight': 'unet', 'unet.mid_block.attentions.0.transformer_blocks.0.attn2.processor.to_out_lora.down.weight': 'unet', 'unet.mid_block.attentions.0.transformer_blocks.0.attn2.processor.to_out_lora.up.weight': 'unet', 'unet.mid_block.attentions.0.transformer_blocks.0.attn2.processor.to_q_lora.down.weight': 'unet', 'unet.mid_block.attentions.0.transformer_blocks.0.attn2.processor.to_q_lora.up.weight': 'unet', 'unet.mid_block.attentions.0.transformer_blocks.0.attn2.processor.to_v_lora.down.weight': 'unet', 'unet.mid_block.attentions.0.transformer_blocks.0.attn2.processor.to_v_lora.up.weight': 'unet', 'unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.processor.to_k_lora.down.weight': 'unet', 'unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.processor.to_k_lora.up.weight': 'unet', 'unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.processor.to_out_lora.down.weight': 'unet', 'unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.processor.to_out_lora.up.weight': 'unet', 'unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.processor.to_q_lora.down.weight': 'unet', 'unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.processor.to_q_lora.up.weight': 'unet', 'unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.processor.to_v_lora.down.weight': 'unet', 'unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.processor.to_v_lora.up.weight': 'unet', 'unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.processor.to_k_lora.down.weight': 'unet', 'unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.processor.to_k_lora.up.weight': 'unet', 'unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.processor.to_out_lora.down.weight': 'unet', 'unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.processor.to_out_lora.up.weight': 'unet', 'unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.processor.to_q_lora.down.weight': 'unet', 'unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.processor.to_q_lora.up.weight': 'unet', 'unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.processor.to_v_lora.down.weight': 'unet', 'unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.processor.to_v_lora.up.weight': 'unet', 'unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.processor.to_k_lora.down.weight': 'unet', 'unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.processor.to_k_lora.up.weight': 'unet', 'unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.processor.to_out_lora.down.weight': 'unet', 'unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.processor.to_out_lora.up.weight': 'unet', 'unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.processor.to_q_lora.down.weight': 'unet', 'unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.processor.to_q_lora.up.weight': 'unet', 'unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.processor.to_v_lora.down.weight': 'unet', 'unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.processor.to_v_lora.up.weight': 'unet', 'unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.processor.to_k_lora.down.weight': 'unet', 'unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.processor.to_k_lora.up.weight': 'unet', 
'unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.processor.to_out_lora.down.weight': 'unet', 'unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.processor.to_out_lora.up.weight': 'unet', 'unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.processor.to_q_lora.down.weight': 'unet', 'unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.processor.to_q_lora.up.weight': 'unet', 'unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.processor.to_v_lora.down.weight': 'unet', 'unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.processor.to_v_lora.up.weight': 'unet', 'unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.processor.to_k_lora.down.weight': 'unet', 'unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.processor.to_k_lora.up.weight': 'unet', 'unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.processor.to_out_lora.down.weight': 'unet', 'unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.processor.to_out_lora.up.weight': 'unet', 'unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.processor.to_q_lora.down.weight': 'unet', 'unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.processor.to_q_lora.up.weight': 'unet', 'unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.processor.to_v_lora.down.weight': 'unet', 'unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.processor.to_v_lora.up.weight': 'unet', 'unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.processor.to_k_lora.down.weight': 'unet', 'unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.processor.to_k_lora.up.weight': 'unet', 'unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.processor.to_out_lora.down.weight': 'unet', 'unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.processor.to_out_lora.up.weight': 'unet', 'unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.processor.to_q_lora.down.weight': 'unet', 'unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.processor.to_q_lora.up.weight': 'unet', 'unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.processor.to_v_lora.down.weight': 'unet', 'unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.processor.to_v_lora.up.weight': 'unet', 'unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.processor.to_k_lora.down.weight': 'unet', 'unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.processor.to_k_lora.up.weight': 'unet', 'unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.processor.to_out_lora.down.weight': 'unet', 'unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.processor.to_out_lora.up.weight': 'unet', 'unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.processor.to_q_lora.down.weight': 'unet', 'unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.processor.to_q_lora.up.weight': 'unet', 'unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.processor.to_v_lora.down.weight': 'unet', 'unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.processor.to_v_lora.up.weight': 'unet', 'unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.processor.to_k_lora.down.weight': 'unet', 'unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.processor.to_k_lora.up.weight': 'unet', 'unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.processor.to_out_lora.down.weight': 'unet', 'unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.processor.to_out_lora.up.weight': 'unet', 'unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.processor.to_q_lora.down.weight': 'unet', 
'unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.processor.to_q_lora.up.weight': 'unet', 'unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.processor.to_v_lora.down.weight': 'unet', 'unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.processor.to_v_lora.up.weight': 'unet', 'unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.processor.to_k_lora.down.weight': 'unet', 'unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.processor.to_k_lora.up.weight': 'unet', 'unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.processor.to_out_lora.down.weight': 'unet', 'unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.processor.to_out_lora.up.weight': 'unet', 'unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.processor.to_q_lora.down.weight': 'unet', 'unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.processor.to_q_lora.up.weight': 'unet', 'unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.processor.to_v_lora.down.weight': 'unet', 'unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.processor.to_v_lora.up.weight': 'unet', 'unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.processor.to_k_lora.down.weight': 'unet', 'unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.processor.to_k_lora.up.weight': 'unet', 'unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.processor.to_out_lora.down.weight': 'unet', 'unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.processor.to_out_lora.up.weight': 'unet', 'unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.processor.to_q_lora.down.weight': 'unet', 'unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.processor.to_q_lora.up.weight': 'unet', 'unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.processor.to_v_lora.down.weight': 'unet', 'unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.processor.to_v_lora.up.weight': 'unet', 'unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.processor.to_k_lora.down.weight': 'unet', 'unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.processor.to_k_lora.up.weight': 'unet', 'unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.processor.to_out_lora.down.weight': 'unet', 'unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.processor.to_out_lora.up.weight': 'unet', 'unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.processor.to_q_lora.down.weight': 'unet', 'unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.processor.to_q_lora.up.weight': 'unet', 'unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.processor.to_v_lora.down.weight': 'unet', 'unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.processor.to_v_lora.up.weight': 'unet', 'unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.processor.to_k_lora.down.weight': 'unet', 'unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.processor.to_k_lora.up.weight': 'unet', 'unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.processor.to_out_lora.down.weight': 'unet', 'unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.processor.to_out_lora.up.weight': 'unet', 'unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.processor.to_q_lora.down.weight': 'unet', 'unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.processor.to_q_lora.up.weight': 'unet', 'unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.processor.to_v_lora.down.weight': 'unet', 'unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.processor.to_v_lora.up.weight': 'unet', 
'unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.processor.to_k_lora.down.weight': 'unet', 'unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.processor.to_k_lora.up.weight': 'unet', 'unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.processor.to_out_lora.down.weight': 'unet', 'unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.processor.to_out_lora.up.weight': 'unet', 'unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.processor.to_q_lora.down.weight': 'unet', 'unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.processor.to_q_lora.up.weight': 'unet', 'unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.processor.to_v_lora.down.weight': 'unet', 'unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.processor.to_v_lora.up.weight': 'unet', 'unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.processor.to_k_lora.down.weight': 'unet', 'unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.processor.to_k_lora.up.weight': 'unet', 'unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.processor.to_out_lora.down.weight': 'unet', 'unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.processor.to_out_lora.up.weight': 'unet', 'unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.processor.to_q_lora.down.weight': 'unet', 'unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.processor.to_q_lora.up.weight': 'unet', 'unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.processor.to_v_lora.down.weight': 'unet', 'unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.processor.to_v_lora.up.weight': 'unet', 'unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.processor.to_k_lora.down.weight': 'unet', 'unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.processor.to_k_lora.up.weight': 'unet', 'unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.processor.to_out_lora.down.weight': 'unet', 'unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.processor.to_out_lora.up.weight': 'unet', 'unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.processor.to_q_lora.down.weight': 'unet', 'unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.processor.to_q_lora.up.weight': 'unet', 'unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.processor.to_v_lora.down.weight': 'unet', 'unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.processor.to_v_lora.up.weight': 'unet', 'unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.processor.to_k_lora.down.weight': 'unet', 'unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.processor.to_k_lora.up.weight': 'unet', 'unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.processor.to_out_lora.down.weight': 'unet', 'unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.processor.to_out_lora.up.weight': 'unet', 'unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.processor.to_q_lora.down.weight': 'unet', 'unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.processor.to_q_lora.up.weight': 'unet', 'unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.processor.to_v_lora.down.weight': 'unet', 'unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.processor.to_v_lora.up.weight': 'unet', 'unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.processor.to_k_lora.down.weight': 'unet', 'unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.processor.to_k_lora.up.weight': 'unet', 'unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.processor.to_out_lora.down.weight': 'unet', 
'unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.processor.to_out_lora.up.weight': 'unet', 'unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.processor.to_q_lora.down.weight': 'unet', 'unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.processor.to_q_lora.up.weight': 'unet', 'unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.processor.to_v_lora.down.weight': 'unet', 'unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.processor.to_v_lora.up.weight': 'unet', 'unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.processor.to_k_lora.down.weight': 'unet', 'unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.processor.to_k_lora.up.weight': 'unet', 'unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.processor.to_out_lora.down.weight': 'unet', 'unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.processor.to_out_lora.up.weight': 'unet', 'unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.processor.to_q_lora.down.weight': 'unet', 'unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.processor.to_q_lora.up.weight': 'unet', 'unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.processor.to_v_lora.down.weight': 'unet', 'unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.processor.to_v_lora.up.weight': 'unet'}

Additional information

No response

catboxanon commented 1 year ago

diffusers-trained LoRAs aren't compatible out of the box. The webui expects them in the format created by https://github.com/kohya-ss/sd-scripts. Reformatting the JSON returned in the error also makes it pretty obvious that every key in your LoRA maps to 'unet' for some reason.

{
  "unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.processor.to_k_lora.down.weight": "unet",
  "unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.processor.to_k_lora.up.weight": "unet",
  "unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.processor.to_out_lora.down.weight": "unet",
  "unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.processor.to_out_lora.up.weight": "unet",
  "unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.processor.to_q_lora.down.weight": "unet",
  "unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.processor.to_q_lora.up.weight": "unet",
  "unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.processor.to_v_lora.down.weight": "unet",
  "unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.processor.to_v_lora.up.weight": "unet",
  "unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.processor.to_k_lora.down.weight": "unet",
  "unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.processor.to_k_lora.up.weight": "unet",
  "unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.processor.to_out_lora.down.weight": "unet",
  "unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.processor.to_out_lora.up.weight": "unet",
  "unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.processor.to_q_lora.down.weight": "unet",
  "unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.processor.to_q_lora.up.weight": "unet",
  "unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.processor.to_v_lora.down.weight": "unet",
  "unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.processor.to_v_lora.up.weight": "unet",
  "unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.processor.to_k_lora.down.weight": "unet",
  "unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.processor.to_k_lora.up.weight": "unet",
  "unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.processor.to_out_lora.down.weight": "unet",
  "unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.processor.to_out_lora.up.weight": "unet",
  "unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.processor.to_q_lora.down.weight": "unet",
  "unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.processor.to_q_lora.up.weight": "unet",
  "unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.processor.to_v_lora.down.weight": "unet",
  "unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.processor.to_v_lora.up.weight": "unet",
  "unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.processor.to_k_lora.down.weight": "unet",
  "unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.processor.to_k_lora.up.weight": "unet",
  "unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.processor.to_out_lora.down.weight": "unet",
  "unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.processor.to_out_lora.up.weight": "unet",
  "unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.processor.to_q_lora.down.weight": "unet",
  "unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.processor.to_q_lora.up.weight": "unet",
  "unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.processor.to_v_lora.down.weight": "unet",
  "unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.processor.to_v_lora.up.weight": "unet",
  "unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.processor.to_k_lora.down.weight": "unet",
  "unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.processor.to_k_lora.up.weight": "unet",
  "unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.processor.to_out_lora.down.weight": "unet",
  "unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.processor.to_out_lora.up.weight": "unet",
  "unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.processor.to_q_lora.down.weight": "unet",
  "unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.processor.to_q_lora.up.weight": "unet",
  "unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.processor.to_v_lora.down.weight": "unet",
  "unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.processor.to_v_lora.up.weight": "unet",
  "unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.processor.to_k_lora.down.weight": "unet",
  "unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.processor.to_k_lora.up.weight": "unet",
  "unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.processor.to_out_lora.down.weight": "unet",
  "unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.processor.to_out_lora.up.weight": "unet"
}

My guess is that your LoRA's keys are nested and need to be flattened for the webui. I don't have links on hand, but I believe there are a few scripts available that do this; they might be in the diffusers repo.
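
A quick way to check which format a file actually uses is to dump its tensor names; here is a minimal sketch using the safetensors API (the file path is a placeholder):

from safetensors import safe_open

# Print every tensor name stored in the LoRA file. diffusers-style files have
# dotted keys like the ones in the error above, while kohya-style files use
# flat keys such as "lora_unet_down_blocks_0_...lora_down.weight".
with safe_open("testcar.safetensors", framework="pt", device="cpu") as f:
    for key in f.keys():
        print(key)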

p-moon commented 1 year ago

Thank you for your help. I did find a solution yesterday: the script at https://github.com/harrywang/finetune-sd/blob/main/convert-to-safetensors.py converts the checkpoint to safetensors and renames the keys into the format the webui expects:

import argparse
import os
import re

import torch
from safetensors.torch import save_file


def main(args):
    # Load the checkpoint on GPU if available, otherwise on CPU.
    # args.file is expected to be a torch-serialized LoRA checkpoint.
    if torch.cuda.is_available():
        device = 'cuda'
    else:
        device = 'cpu'
    checkpoint = torch.load(args.file, map_location=torch.device(device))
    print(f'device is {device}')

    # Rename every diffusers-style key into the flat naming scheme the webui expects.
    new_dict = dict()
    for key in checkpoint:
        new_key = re.sub(r'\.processor\.', '_', key)              # drop the ".processor." segment
        new_key = re.sub(r'mid_block\.', 'mid_block_', new_key)
        new_key = re.sub(r'_lora\.up\.', '.lora_up.', new_key)    # "..._lora.up." -> "....lora_up."
        new_key = re.sub(r'_lora\.down\.', '.lora_down.', new_key)
        new_key = re.sub(r'\.(\d+)\.', r'_\1_', new_key)          # turn ".0."-style indices into "_0_"
        new_key = re.sub('to_out', 'to_out_0', new_key)
        new_key = 'lora_unet_' + new_key
        new_dict[new_key] = checkpoint[key]

    # Write the converted weights next to the input file.
    file_name = os.path.splitext(args.file)[0]
    new_lora_name = file_name + '_converted.safetensors'
    print('Saving ' + new_lora_name)
    save_file(new_dict, new_lora_name)


def parse_args():
    parser = argparse.ArgumentParser()
    parser.add_argument(
        '--file',
        type=str,
        required=True,
        help='path to the LoRA checkpoint to convert',
    )
    return parser.parse_args()


if __name__ == '__main__':
    args = parse_args()
    main(args)
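
Usage is a single --file argument; the converted file is written next to the input with a _converted.safetensors suffix and can then be placed in the webui's models/Lora directory. For example (the path is illustrative, assuming the default file name diffusers gives the saved LoRA):

python convert-to-safetensors.py --file ./new_models/pytorch_lora_weights.bin
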
StanislawKarnacky commented 1 year ago

The solution above works well for LoRAs trained on the 1.5 and 2.0 base models, but there is still no working conversion script for SDXL.