kohya-ss / sd-scripts


Missing key(s) in state_dict for LoRANetwork #1080

Open sxy771 opened 7 months ago

sxy771 commented 7 months ago
RuntimeError: Error(s) in loading state_dict for LoRANetwork:
    Missing key(s) in state_dict: "lora_te_text_model_encoder_layers_0_self_attn_k_proj.alpha", 
"lora_te_text_model_encoder_layers_0_self_attn_v_proj.alpha", 
"lora_te_text_model_encoder_layers_0_self_attn_q_proj.alpha", 
"lora_te_text_model_encoder_layers_0_self_attn_out_proj.alpha", "lora_te_text_model_encoder_layers_0_mlp_fc1.alpha", 
"lora_te_text_model_encoder_layers_0_mlp_fc2.alpha", "lora_te_text_model_encoder_layers_1_self_attn_k_proj.alpha", 
"lora_te_text_model_encoder_layers_1_self_attn_v_proj.alpha", 
"lora_te_text_model_encoder_layers_1_self_attn_q_proj.alpha", 
"lora_te_text_model_encoder_layers_1_self_attn_out_proj.alpha", "lora_te_text_model_encoder_layers_1_mlp_fc1.alpha", 
"lora_te_text_model_encoder_layers_1_mlp_fc2.alpha", "lora_te_text_model_encoder_layers_2_self_attn_k_proj.alpha", 
"lora_te_text_model_encoder_layers_2_self_attn_v_proj.alpha", 
"lora_te_text_model_encoder_layers_2_self_attn_q_proj.alpha", 
"lora_te_text_model_encoder_layers_2_self_attn_out_proj.alpha", "lora_te_text_model_encoder_layers_2_mlp_fc1.alpha", 
"lora_te_text_model_encoder_layers_2_mlp_fc2.alpha", "lora_te_text_model_encoder_layers_3_self_attn_k_proj.alpha", 
"lora_te_text_model_encoder_layers_3_self_attn_v_proj.alpha", 
"lora_te_text_model_encoder_layers_3_self_attn_q_proj.alpha", 
"lora_te_text_model_encoder_layers_3_self_attn_out_proj.alpha", "lora_te_text_model_encoder_layers_3_mlp_fc1.alpha", 
"lora_te_text_model_encoder_layers_3_mlp_fc2.alpha", "lora_te_text_model_encoder_layers_4_self_attn_k_proj.alpha", 
"lora_te_text_model_encoder_layers_4_self_attn_v_proj.alpha", 
"lora_te_text_model_encoder_layers_4_self_attn_q_proj.alpha", 
"lora_te_text_model_encoder_layers_4_self_attn_out_proj.alpha", "lora_te_text_model_encoder_layers_4_mlp_fc1.alpha", "lora_te_text_model_encoder_layers_4_mlp_fc2.alpha", "lora_te_text_model_encoder_layers_5_self_attn_k_proj.alpha", "lora_te_text_model_encoder_layers_5_self_attn_v_proj.alpha", "lora_te_text_model_encoder_layers_5_self_attn_q_proj.alpha", "lora_te_text_model_encoder_layers_5_self_attn_out_proj.alpha", "lora_te_text_model_encoder_layers_5_mlp_fc1.alpha", 
"lora_te_text_model_encoder_layers_5_mlp_fc2.alpha", "lora_te_text_model_encoder_layers_6_self_attn_k_proj.alpha", 
"lora_te_text_model_encoder_layers_6_self_attn_v_proj.alpha", 
"lora_te_text_model_encoder_layers_6_self_attn_q_proj.alpha", 
"lora_te_text_model_encoder_layers_6_self_attn_out_proj.alpha", "lora_te_text_model_encoder_layers_6_mlp_fc1.alpha", 
"lora_te_text_model_encoder_layers_6_mlp_fc2.alpha", "lora_te_text_model_encoder_layers_7_self_attn_k_proj.alpha", 
"lora_te_text_model_encoder_layers_7_self_attn_v_proj.alpha", 
"lora_te_text_model_encoder_layers_7_self_attn_q_proj.alpha", 
"lora_te_text_model_encoder_layers_7_self_attn_out_proj.alpha", "lora_te_text_model_encoder_layers_7_mlp_fc1.alpha", 
"lora_te_text_model_encoder_layers_7_mlp_fc2.alpha", "lora_te_text_model_encoder_layers_8_self_attn_k_proj.alpha", 
"lora_te_text_model_encoder_layers_8_self_attn_v_proj.alpha", 
"lora_te_text_model_encoder_layers_8_self_attn_q_proj.alpha", 
"lora_te_text_model_encoder_layers_8_self_attn_out_proj.alpha",
 "lora_te_text_model_encoder_layers_8_mlp_fc1.alpha", "lora_te_text_model_encoder_layers_8_mlp_fc2.alpha", 
"lora_te_text_model_encoder_layers_9_self_attn_k_proj.alpha", "lora_te_text_model_encoder_layers_9_self_attn_v_proj.alpha", "lora_te_text_model_encoder_layers_9_self_attn_q_proj.alpha", "lora_te_text_model_encoder_layers_9_self_attn_out_proj.alpha", "lora_te_text_model_encoder_layers_9_mlp_fc1.alpha", "lora_te_text_model_encoder_layers_9_mlp_fc2.alpha", "lora_te_text_model_encoder_layers_10_self_attn_k_proj.alpha", "lora_te_text_model_encoder_layers_10_self_attn_v_proj.alpha", "lora_te_text_model_encoder_layers_10_self_attn_q_proj.alpha", "lora_te_text_model_encoder_layers_10_self_attn_out_proj.alpha", "lora_te_text_model_encoder_layers_10_mlp_fc1.alpha", "lora_te_text_model_encoder_layers_10_mlp_fc2.alpha", "lora_te_text_model_encoder_layers_11_self_attn_k_proj.alpha", 
"lora_te_text_model_encoder_layers_11_self_attn_v_proj.alpha", 
"lora_te_text_model_encoder_layers_11_self_attn_q_proj.alpha", 
"lora_te_text_model_encoder_layers_11_self_attn_out_proj.alpha", 
"lora_te_text_model_encoder_layers_11_mlp_fc1.alpha", "lora_te_text_model_encoder_layers_11_mlp_fc2.alpha", 
"lora_unet_down_blocks_0_attentions_0_proj_in.alpha",
 "lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_q.alpha", 
"lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_k.alpha", 
"lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_v.alpha", 
"lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_out_0.alpha", 
"lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_q.alpha", 
"lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_k.alpha", 
"lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_v.alpha", 
"lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_out_0.alpha", 
"lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_ff_net_0_proj.alpha", 
"lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_ff_net_2.alpha", 
"lora_unet_down_blocks_0_attentions_0_proj_out.alpha", "lora_unet_down_blocks_0_attentions_1_proj_in.alpha", 
"lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_q.alpha", 
"lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_k.alpha", 
"lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_v.alpha", 
"lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_out_0.alpha", 
"lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_q.alpha", 
"lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_k.alpha", 
"lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_v.alpha", 
"lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_out_0.alpha", 
"lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_ff_net_0_proj.alpha", 
"lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_ff_net_2.alpha", 
"lora_unet_down_blocks_0_attentions_1_proj_out.alpha", "lora_unet_down_blocks_1_attentions_0_proj_in.alpha", 
"lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_q.alpha", 
"lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_k.alpha", 
"lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_v.alpha", 
"lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_out_0.alpha", 
"lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_q.alpha", 
"lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_k.alpha", 
"lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_v.alpha", 
"lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_out_0.alpha", 
"lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_ff_net_0_proj.alpha", 
"lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_ff_net_2.alpha", 
"lora_unet_down_blocks_1_attentions_0_proj_out.alpha", "lora_unet_down_blocks_1_attentions_1_proj_in.alpha", 
"lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_q.alpha", 
"lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_k.alpha", 
"lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_v.alpha", 
"lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_out_0.alpha", 
"lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_q.alpha", 
"lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_k.alpha", 
"lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_v.alpha", 
"lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_out_0.alpha", 
"lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_ff_net_0_proj.alpha", 
"lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_ff_net_2.alpha", 
"lora_unet_down_blocks_1_attentions_1_proj_out.alpha", "lora_unet_down_blocks_2_attentions_0_proj_in.alpha", 
"lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_q.alpha", 
"lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_k.alpha", 
"lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_v.alpha", 
"lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_out_0.alpha", 
"lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_q.alpha", 
"lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_k.alpha", 
"lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_v.alpha", 
"lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_out_0.alpha", 
"lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_ff_net_0_proj.alpha", 
"lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_ff_net_2.alpha", 
"lora_unet_down_blocks_2_attentions_0_proj_out.alpha", "lora_unet_down_blocks_2_attentions_1_proj_in.alpha", 
"lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_q.alpha", 
"lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_k.alpha", 
"lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_v.alpha", 
"lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_out_0.alpha", 
"lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_q.alpha", 
"lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_k.alpha", 
"lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_v.alpha", 
"lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_out_0.alpha", 
"lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_ff_net_0_proj.alpha", 
"lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_ff_net_2.alpha", 
"lora_unet_down_blocks_2_attentions_1_proj_out.alpha", "lora_unet_up_blocks_1_attentions_0_proj_in.alpha", 
"lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_q.alpha", 
"lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_k.alpha", "lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_v.alpha", "lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_out_0.alpha", "lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_q.alpha", "lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_k.alpha", "lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_v.alpha", "lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_out_0.alpha", "lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_ff_net_0_proj.alpha", "lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_ff_net_2.alpha", "lora_unet_up_blocks_1_attentions_0_proj_out.alpha", "lora_unet_up_blocks_1_attentions_1_proj_in.alpha", "lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_q.alpha", "lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_k.alpha", "lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_v.alpha", "lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_out_0.alpha", "lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_q.alpha", "lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_k.alpha", "lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_v.alpha", "lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_out_0.alpha", "lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_ff_net_0_proj.alpha", "lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_ff_net_2.alpha", "lora_unet_up_blocks_1_attentions_1_proj_out.alpha", "lora_unet_up_blocks_1_attentions_2_proj_in.alpha", "lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_q.alpha", "lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_k.alpha", "lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_v.alpha", "lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_out_0.alpha", "lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_q.alpha", "lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_k.alpha", "lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_v.alpha", "lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_out_0.alpha", "lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_ff_net_0_proj.alpha", "lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_ff_net_2.alpha", "lora_unet_up_blocks_1_attentions_2_proj_out.alpha", "lora_unet_up_blocks_2_attentions_0_proj_in.alpha", "lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_q.alpha", "lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_k.alpha", "lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_v.alpha", "lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_out_0.alpha", "lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_q.alpha", "lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_k.alpha", "lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_v.alpha", "lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_out_0.alpha", "lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_ff_net_0_proj.alpha", "lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_ff_net_2.alpha", "lora_unet_up_blocks_2_attentions_0_proj_out.alpha", "lora_unet_up_blocks_2_attentions_1_proj_in.alpha", "lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_q.alpha", 
"lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_k.alpha", "lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_v.alpha", "lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_out_0.alpha", "lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_q.alpha", "lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_k.alpha", "lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_v.alpha", "lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_out_0.alpha", "lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_ff_net_0_proj.alpha", "lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_ff_net_2.alpha", "lora_unet_up_blocks_2_attentions_1_proj_out.alpha", "lora_unet_up_blocks_2_attentions_2_proj_in.alpha", "lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_q.alpha", "lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_k.alpha", "lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_v.alpha", "lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_out_0.alpha", "lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_q.alpha", "lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_k.alpha", "lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_v.alpha", "lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_out_0.alpha", "lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_ff_net_0_proj.alpha", "lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_ff_net_2.alpha", "lora_unet_up_blocks_2_attentions_2_proj_out.alpha", "lora_unet_up_blocks_3_attentions_0_proj_in.alpha", "lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_q.alpha", "lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_k.alpha", "lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_v.alpha", "lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_out_0.alpha", "lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_q.alpha", "lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_k.alpha", "lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_v.alpha", "lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_out_0.alpha", "lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_ff_net_0_proj.alpha", "lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_ff_net_2.alpha", "lora_unet_up_blocks_3_attentions_0_proj_out.alpha", "lora_unet_up_blocks_3_attentions_1_proj_in.alpha", "lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_q.alpha", "lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_k.alpha", "lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_v.alpha", "lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_out_0.alpha", "lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_q.alpha", "lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_k.alpha", "lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_v.alpha", "lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_out_0.alpha", "lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_ff_net_0_proj.alpha", "lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_ff_net_2.alpha", "lora_unet_up_blocks_3_attentions_1_proj_out.alpha", "lora_unet_up_blocks_3_attentions_2_proj_in.alpha", "lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_q.alpha", 
"lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_k.alpha", "lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_v.alpha", "lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_out_0.alpha", "lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_q.alpha", "lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_k.alpha", "lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_v.alpha", "lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_out_0.alpha", "lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_ff_net_0_proj.alpha", "lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_ff_net_2.alpha", "lora_unet_up_blocks_3_attentions_2_proj_out.alpha", "lora_unet_mid_block_attentions_0_proj_in.alpha", "lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_q.alpha", "lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_k.alpha", "lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_v.alpha", "lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_out_0.alpha", "lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_q.alpha", "lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_k.alpha", "lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_v.alpha", "lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_out_0.alpha", "lora_unet_mid_block_attentions_0_transformer_blocks_0_ff_net_0_proj.alpha", "lora_unet_mid_block_attentions_0_transformer_blocks_0_ff_net_2.alpha", "lora_unet_mid_block_attentions_0_proj_out.alpha". 

Does anyone know why this happens and how to fix it?
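(For anyone hitting the same error: one quick way to narrow it down is to check whether the `.alpha` entries exist in the LoRA file at all. A minimal diagnostic sketch, assuming the weights are saved as a `.safetensors` file; the path below is a placeholder, not from the original report.)

```python
# Diagnostic sketch (not part of sd-scripts): list the keys stored in a LoRA file
# to see whether the ".alpha" entries the error complains about are present.
from safetensors import safe_open

path = "my_lora.safetensors"  # hypothetical path to the LoRA file that fails to load

with safe_open(path, framework="pt") as f:
    keys = list(f.keys())

alpha_keys = [k for k in keys if k.endswith(".alpha")]
print(f"{len(keys)} keys total, {len(alpha_keys)} '.alpha' keys")
print(keys[:10])  # sample of key names to compare against the expected prefixes
```

If the file contains no `.alpha` keys (or uses different key prefixes), it was likely produced by a different tool or format than the one `LoRANetwork` expects, which would explain the missing-key error.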

kohya-ss commented 7 months ago

Hi, what command causes this error?