comfyanonymous / ComfyUI

The most powerful and modular diffusion model GUI, API, and backend with a graph/nodes interface.
https://www.comfy.org/
GNU General Public License v3.0

ComfyUI Error Loop: New [ unknown ] #2364

Closed ameen-roayan closed 8 months ago

ameen-roayan commented 8 months ago

Not sure what is going on. It's quite a big error, and it gets stuck in a loop.

comfyui.log

ameen-roayan commented 8 months ago

F:\AI\ComfyUI>.\python_embeded\python.exe -s ComfyUI\main.py --windows-standalone-build --enable-cors-header
** ComfyUI startup time: 2023-12-24 12:42:41.590986
** Platform: Windows
** Python version: 3.11.6 (tags/v3.11.6:8b6ee5b, Oct  2 2023, 14:57:12) [MSC v.1935 64 bit (AMD64)]
** Python executable: F:\AI\ComfyUI\python_embeded\python.exe
** Log path: F:\AI\ComfyUI\comfyui.log

Prestartup times for custom nodes:
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\rgthree-comfy
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Manager
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\comfyui-deploy

Total VRAM 24564 MB, total RAM 65303 MB
Set vram state to: NORMAL_VRAM
Device: cuda:0 NVIDIA GeForce RTX 4090 : cudaMallocAsync
VAE dtype: torch.bfloat16
Using pytorch cross attention
Adding extra search path checkpoints F:\AI\Automatic1111\stable-diffusion-webui\models/Stable-diffusion
Adding extra search path configs F:\AI\Automatic1111\stable-diffusion-webui\models/Stable-diffusion
Adding extra search path vae F:\AI\Automatic1111\stable-diffusion-webui\models/VAE
Adding extra search path loras F:\AI\Automatic1111\stable-diffusion-webui\models/Lora
Adding extra search path loras F:\AI\Automatic1111\stable-diffusion-webui\models/LyCORIS
Adding extra search path upscale_models F:\AI\Automatic1111\stable-diffusion-webui\models/ESRGAN
Adding extra search path upscale_models F:\AI\Automatic1111\stable-diffusion-webui\models/RealESRGAN
Adding extra search path upscale_models F:\AI\Automatic1111\stable-diffusion-webui\models/SwinIR
Adding extra search path embeddings F:\AI\Automatic1111\stable-diffusion-webui\embeddings
Adding extra search path hypernetworks F:\AI\Automatic1111\stable-diffusion-webui\models/hypernetworks
Adding extra search path controlnet F:\AI\Automatic1111\stable-diffusion-webui\extensions\sd-webui-controlnet\models
Adding F:\AI\ComfyUI\ComfyUI\custom_nodes to sys.path
Efficiency Nodes: Attempting to add Control Net options to the 'HiRes-Fix Script' Node (comfyui_controlnet_aux add-on)...Success!
Efficiency Nodes: Attempting to add 'AnimatedDiff Script' Node (ComfyUI-AnimateDiff-Evolved add-on)...Success!
Loaded efficiency nodes from F:\AI\ComfyUI\ComfyUI\custom_nodes\efficiency-nodes-comfyui
Loaded ControlNetPreprocessors nodes from F:\AI\ComfyUI\ComfyUI\custom_nodes\comfyui_controlnet_aux
No module named 'control'
Loaded AnimateDiff from F:\AI\ComfyUI\ComfyUI\custom_nodes\comfyui-animatediff\animatediff/sliding_schedule.py
Loaded IPAdapter nodes from F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_IPAdapter_plus
Loaded VideoHelperSuite from F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-VideoHelperSuite
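The "Adding extra search path" lines in the log come from ComfyUI's extra_model_paths.yaml. As a point of reference, a minimal sketch of the section that would produce paths like these (key names follow the bundled extra_model_paths.yaml.example; the exact entries in this particular setup may differ):

```yaml
a111:
    base_path: F:\AI\Automatic1111\stable-diffusion-webui\

    checkpoints: models/Stable-diffusion
    configs: models/Stable-diffusion
    vae: models/VAE
    loras: |
        models/Lora
        models/LyCORIS
    upscale_models: |
        models/ESRGAN
        models/RealESRGAN
        models/SwinIR
    embeddings: embeddings
    hypernetworks: models/hypernetworks
    controlnet: extensions/sd-webui-controlnet/models
```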

Loading: ComfyUI-Impact-Pack (V4.51.1)

Loading: ComfyUI-Impact-Pack (Subpack: V0.3.2)

Loading: ComfyUI-Inspire-Pack (V0.53)

[Impact Pack] Wildcards loading done.
Package diffusers is already installed.

Loading: ComfyUI-Manager (V1.16)

ComfyUI Revision: 1851 [a252963f] | Released on '2023-12-23'

FETCH DATA from: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/custom-node-list.json
FETCH DATA from: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/extension-node-map.json
FETCH DATA from: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/model-list.json
FETCH DATA from: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/alter-list.json
[ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/model-list.json
[ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/alter-list.json
[ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/custom-node-list.json
[ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/extension-node-map.json

Mixlab Nodes: Loaded

Total VRAM 24564 MB, total RAM 65303 MB
Set vram state to: NORMAL_VRAM
Device: cuda:0 NVIDIA GeForce RTX 4090 : cudaMallocAsync
VAE dtype: torch.bfloat16
Torch version: 2.1.1+cu121

Loading: Save img prompt


Comfyroll Custom Nodes v1.47 : 148 Nodes Loaded

FizzleDorf Custom Nodes: Loaded
Using pytorch cross attention
[tinyterraNodes] Loaded

[rgthree] Loaded 20 exciting nodes.
[rgthree] Optimizing ComfyUI recursive execution. If queueing and/or re-queueing seems broken, change "patch_recursive_execution" to false in rgthree_config.json
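For anyone hitting the re-queueing issue the rgthree notice warns about: the opt-out it describes lives in rgthree_config.json inside the rgthree-comfy custom-node folder. A minimal sketch of the setting (the file may contain other keys alongside this one):

```json
{
  "patch_recursive_execution": false
}
```

After changing the value, restart ComfyUI so rgthree-comfy re-reads its config.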

Searge-SDXL v4.3.1 in F:\AI\ComfyUI\ComfyUI\custom_nodes\SeargeSDXL

Import times for custom nodes:
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_IPAdapter_plus
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\gcLatentTunnel.py
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-VideoHelperSuite
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\efficiency-nodes-comfyui
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Image-Selector
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\alkemann.py
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\hakkun_nodes.py
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_Fill-Nodes-main
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\sdxl-recommended-res-calc
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUi-NoodleWebcam
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_GradientDeepShrink
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\sdxl_prompt_styler
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_SimpleTiles
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_TiledIPAdapter
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_Noise
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Coziness
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Logic
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUi_NNLatentUpscale
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\cg-use-everywhere-main
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\save-image-extended-comfyui
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\comfyui_controlnet_aux
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\sd-dynamic-thresholding
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\comfyui-tooling-nodes
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_TiledKSampler
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_JPS-Nodes
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\cg_custom_core
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-SaveImgPrompt
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_toyxyz_test_nodes
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\stability-ComfyUI-nodes
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-WD14-Tagger
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\comfyui-yanc
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_ADV_CLIP_emb
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_experiments
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_ColorMod
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\IPAdapter-ComfyUI
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\mikey_nodes
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Custom-Scripts
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyMath
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\cg-noise
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\images-grid-comfy-plugin
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\comfyui-animatediff
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Prompt-Expansion
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\comfyui-various
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\comfyui-deploy
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Advanced-ControlNet
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-AnimateDiff-Evolved
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_UltimateSDUpscale
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\comfy-image-saver
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Frame-Interpolation
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\Derfuu_ComfyUI_ModdedNodes
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\rgthree-comfy
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_essentials
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_Comfyroll_CustomNodes
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\comfyui-dream-project
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Inspire-Pack
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_smZNodes
0.1 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\NodeGPT-main
0.1 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-LCM
0.1 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_FizzNodes
0.1 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Impact-Pack
0.2 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\comfyui-reactor-node
0.2 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\comfyui_segment_anything
0.2 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Manager
0.2 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\SeargeSDXL
0.4 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_tinyterraNodes
0.4 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\comfyui-mixlab-nodes
0.8 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\comfyui-art-venture

https_key OK: F:\AI\ComfyUI\ComfyUI\custom_nodes\comfyui-mixlab-nodes\https\certificate.crt F:\AI\ComfyUI\ComfyUI\custom_nodes\comfyui-mixlab-nodes\https\private.key
Starting server

To see the GUI go to: http://127.0.0.1:8188
To see the GUI go to: https://127.0.0.1:8189
FETCH DATA from: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Manager\extension-node-map.json
print INPUT_TYPES <class 'comfyui-mixlab-nodes.nodes.Utils.DynamicDelayProcessor'>
got prompt
model_type EPS
adm 2816
Using pytorch attention in VAE
Working with z of shape (1, 4, 32, 32) = 4096 dimensions.
Using pytorch attention in VAE
missing {'cond_stage_model.clip_l.logit_scale', 'cond_stage_model.clip_l.text_projection'}
left over keys: dict_keys(['conditioner.embedders.0.logit_scale', 'conditioner.embedders.0.text_projection'])
lora key not loaded lora_unet_down_blocks_0_attentions_0_proj_in.alpha lora key not loaded lora_unet_down_blocks_0_attentions_0_proj_in.lora_down.weight lora key not loaded lora_unet_down_blocks_0_attentions_0_proj_in.lora_up.weight lora key not loaded lora_unet_down_blocks_0_attentions_0_proj_out.alpha lora key not loaded lora_unet_down_blocks_0_attentions_0_proj_out.lora_down.weight lora key not loaded lora_unet_down_blocks_0_attentions_0_proj_out.lora_up.weight lora key not loaded lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_k.alpha lora key not loaded lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_k.lora_down.weight lora key not loaded lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_k.lora_up.weight lora key not loaded lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_out_0.alpha lora key not loaded lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_out_0.lora_down.weight lora key not loaded lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_out_0.lora_up.weight lora key not loaded lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_q.alpha lora key not loaded lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_q.lora_down.weight lora key not loaded 
lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_q.lora_up.weight lora key not loaded lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_v.alpha lora key not loaded lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_v.lora_down.weight lora key not loaded lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_v.lora_up.weight lora key not loaded lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_k.alpha lora key not loaded lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_k.lora_down.weight lora key not loaded lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_k.lora_up.weight lora key not loaded lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_out_0.alpha lora key not loaded lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_out_0.lora_down.weight lora key not loaded lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_out_0.lora_up.weight lora key not loaded lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_q.alpha lora key not loaded lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_q.lora_down.weight lora key not loaded lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_q.lora_up.weight lora key not loaded lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_v.alpha lora key not loaded lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_v.lora_down.weight lora key not loaded lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_v.lora_up.weight lora key not loaded lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_ff_net_0_proj.alpha lora key not loaded lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_ff_net_0_proj.lora_down.weight lora key not loaded lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_ff_net_0_proj.lora_up.weight lora key not loaded 
lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_ff_net_2.alpha lora key not loaded lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_ff_net_2.lora_down.weight lora key not loaded lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_ff_net_2.lora_up.weight lora key not loaded lora_unet_down_blocks_0_attentions_1_proj_in.alpha lora key not loaded lora_unet_down_blocks_0_attentions_1_proj_in.lora_down.weight lora key not loaded lora_unet_down_blocks_0_attentions_1_proj_in.lora_up.weight lora key not loaded lora_unet_down_blocks_0_attentions_1_proj_out.alpha lora key not loaded lora_unet_down_blocks_0_attentions_1_proj_out.lora_down.weight lora key not loaded lora_unet_down_blocks_0_attentions_1_proj_out.lora_up.weight lora key not loaded lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_k.alpha lora key not loaded lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_k.lora_down.weight lora key not loaded lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_k.lora_up.weight lora key not loaded lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_out_0.alpha lora key not loaded lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_out_0.lora_down.weight lora key not loaded lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_out_0.lora_up.weight lora key not loaded lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_q.alpha lora key not loaded lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_q.lora_down.weight lora key not loaded lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_q.lora_up.weight lora key not loaded lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_v.alpha lora key not loaded lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_v.lora_down.weight lora key not loaded lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_v.lora_up.weight lora key not loaded 
lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_k.alpha lora key not loaded lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_k.lora_down.weight lora key not loaded lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_k.lora_up.weight lora key not loaded lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_out_0.alpha lora key not loaded lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_out_0.lora_down.weight lora key not loaded lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_out_0.lora_up.weight lora key not loaded lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_q.alpha lora key not loaded lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_q.lora_down.weight lora key not loaded lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_q.lora_up.weight lora key not loaded lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_v.alpha lora key not loaded lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_v.lora_down.weight lora key not loaded lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_v.lora_up.weight lora key not loaded lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_ff_net_0_proj.alpha lora key not loaded lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_ff_net_0_proj.lora_down.weight lora key not loaded lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_ff_net_0_proj.lora_up.weight lora key not loaded lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_ff_net_2.alpha lora key not loaded lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_ff_net_2.lora_down.weight lora key not loaded lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_ff_net_2.lora_up.weight lora key not loaded lora_unet_up_blocks_2_attentions_0_proj_in.alpha lora key not loaded lora_unet_up_blocks_2_attentions_0_proj_in.lora_down.weight lora key not loaded 
lora_unet_up_blocks_2_attentions_0_proj_in.lora_up.weight lora key not loaded lora_unet_up_blocks_2_attentions_0_proj_out.alpha lora key not loaded lora_unet_up_blocks_2_attentions_0_proj_out.lora_down.weight lora key not loaded lora_unet_up_blocks_2_attentions_0_proj_out.lora_up.weight lora key not loaded lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_k.alpha lora key not loaded lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_k.lora_down.weight lora key not loaded lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_k.lora_up.weight lora key not loaded lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_out_0.alpha lora key not loaded lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_out_0.lora_down.weight lora key not loaded lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_out_0.lora_up.weight lora key not loaded lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_q.alpha lora key not loaded lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_q.lora_down.weight lora key not loaded lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_q.lora_up.weight lora key not loaded lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_v.alpha lora key not loaded lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_v.lora_down.weight lora key not loaded lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_v.lora_up.weight lora key not loaded lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_k.alpha lora key not loaded lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_k.lora_down.weight lora key not loaded lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_k.lora_up.weight lora key not loaded lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_out_0.alpha lora key not loaded lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_out_0.lora_down.weight 
lora key not loaded lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_out_0.lora_up.weight lora key not loaded lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_q.alpha lora key not loaded lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_q.lora_down.weight lora key not loaded lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_q.lora_up.weight lora key not loaded lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_v.alpha lora key not loaded lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_v.lora_down.weight lora key not loaded lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_v.lora_up.weight lora key not loaded lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_ff_net_0_proj.alpha lora key not loaded lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_ff_net_0_proj.lora_down.weight lora key not loaded lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_ff_net_0_proj.lora_up.weight lora key not loaded lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_ff_net_2.alpha lora key not loaded lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_ff_net_2.lora_down.weight lora key not loaded lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_ff_net_2.lora_up.weight lora key not loaded lora_unet_up_blocks_2_attentions_1_proj_in.alpha lora key not loaded lora_unet_up_blocks_2_attentions_1_proj_in.lora_down.weight lora key not loaded lora_unet_up_blocks_2_attentions_1_proj_in.lora_up.weight lora key not loaded lora_unet_up_blocks_2_attentions_1_proj_out.alpha lora key not loaded lora_unet_up_blocks_2_attentions_1_proj_out.lora_down.weight lora key not loaded lora_unet_up_blocks_2_attentions_1_proj_out.lora_up.weight lora key not loaded lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_k.alpha lora key not loaded lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_k.lora_down.weight lora key not loaded 
lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_k.lora_up.weight lora key not loaded lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_out_0.alpha lora key not loaded lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_out_0.lora_down.weight lora key not loaded lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_out_0.lora_up.weight lora key not loaded lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_q.alpha lora key not loaded lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_q.lora_down.weight lora key not loaded lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_q.lora_up.weight lora key not loaded lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_v.alpha lora key not loaded lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_v.lora_down.weight lora key not loaded lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_v.lora_up.weight lora key not loaded lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_k.alpha lora key not loaded lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_k.lora_down.weight lora key not loaded lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_k.lora_up.weight lora key not loaded lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_out_0.alpha lora key not loaded lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_out_0.lora_down.weight lora key not loaded lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_out_0.lora_up.weight lora key not loaded lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_q.alpha lora key not loaded lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_q.lora_down.weight lora key not loaded lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_q.lora_up.weight lora key not loaded lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_v.alpha lora 
key not loaded lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_v.lora_down.weight lora key not loaded lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_v.lora_up.weight lora key not loaded lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_ff_net_0_proj.alpha lora key not loaded lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_ff_net_0_proj.lora_down.weight lora key not loaded lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_ff_net_0_proj.lora_up.weight lora key not loaded lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_ff_net_2.alpha lora key not loaded lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_ff_net_2.lora_down.weight lora key not loaded lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_ff_net_2.lora_up.weight lora key not loaded lora_unet_up_blocks_2_attentions_2_proj_in.alpha lora key not loaded lora_unet_up_blocks_2_attentions_2_proj_in.lora_down.weight lora key not loaded lora_unet_up_blocks_2_attentions_2_proj_in.lora_up.weight lora key not loaded lora_unet_up_blocks_2_attentions_2_proj_out.alpha lora key not loaded lora_unet_up_blocks_2_attentions_2_proj_out.lora_down.weight lora key not loaded lora_unet_up_blocks_2_attentions_2_proj_out.lora_up.weight lora key not loaded lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_k.alpha lora key not loaded lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_k.lora_down.weight lora key not loaded lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_k.lora_up.weight lora key not loaded lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_out_0.alpha lora key not loaded lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_out_0.lora_down.weight lora key not loaded lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_out_0.lora_up.weight lora key not loaded lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_q.alpha lora key not loaded 
lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_q.lora_down.weight lora key not loaded lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_q.lora_up.weight lora key not loaded lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_v.alpha lora key not loaded lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_v.lora_down.weight lora key not loaded lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_v.lora_up.weight lora key not loaded lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_k.alpha lora key not loaded lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_k.lora_down.weight lora key not loaded lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_k.lora_up.weight lora key not loaded lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_out_0.alpha lora key not loaded lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_out_0.lora_down.weight lora key not loaded lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_out_0.lora_up.weight lora key not loaded lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_q.alpha lora key not loaded lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_q.lora_down.weight lora key not loaded lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_q.lora_up.weight lora key not loaded lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_v.alpha lora key not loaded lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_v.lora_down.weight lora key not loaded lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_v.lora_up.weight lora key not loaded lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_ff_net_0_proj.alpha lora key not loaded lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_ff_net_0_proj.lora_down.weight lora key not loaded 
lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_ff_net_0_proj.lora_up.weight lora key not loaded lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_ff_net_2.alpha lora key not loaded lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_ff_net_2.lora_down.weight lora key not loaded lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_ff_net_2.lora_up.weight lora key not loaded lora_unet_up_blocks_3_attentions_0_proj_in.alpha lora key not loaded lora_unet_up_blocks_3_attentions_0_proj_in.lora_down.weight lora key not loaded lora_unet_up_blocks_3_attentions_0_proj_in.lora_up.weight lora key not loaded lora_unet_up_blocks_3_attentions_0_proj_out.alpha lora key not loaded lora_unet_up_blocks_3_attentions_0_proj_out.lora_down.weight lora key not loaded lora_unet_up_blocks_3_attentions_0_proj_out.lora_up.weight lora key not loaded lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_k.alpha lora key not loaded lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_k.lora_down.weight lora key not loaded lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_k.lora_up.weight lora key not loaded lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_out_0.alpha lora key not loaded lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_out_0.lora_down.weight lora key not loaded lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_out_0.lora_up.weight lora key not loaded lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_q.alpha lora key not loaded lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_q.lora_down.weight lora key not loaded lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_q.lora_up.weight lora key not loaded lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_v.alpha lora key not loaded lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_v.lora_down.weight lora key not loaded 
lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_v.lora_up.weight lora key not loaded
lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_k.alpha lora key not loaded
lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_k.lora_down.weight lora key not loaded
lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_k.lora_up.weight lora key not loaded
lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_out_0.alpha lora key not loaded
lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_out_0.lora_down.weight lora key not loaded
lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_out_0.lora_up.weight lora key not loaded
lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_q.alpha lora key not loaded
lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_q.lora_down.weight lora key not loaded
lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_q.lora_up.weight lora key not loaded
lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_v.alpha lora key not loaded
lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_v.lora_down.weight lora key not loaded
lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_v.lora_up.weight lora key not loaded
lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_ff_net_0_proj.alpha lora key not loaded
lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_ff_net_0_proj.lora_down.weight lora key not loaded
lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_ff_net_0_proj.lora_up.weight lora key not loaded
lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_ff_net_2.alpha lora key not loaded
lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_ff_net_2.lora_down.weight lora key not loaded
lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_ff_net_2.lora_up.weight lora key not loaded
lora_unet_up_blocks_3_attentions_1_proj_in.alpha lora key not loaded
lora_unet_up_blocks_3_attentions_1_proj_in.lora_down.weight lora key not loaded
lora_unet_up_blocks_3_attentions_1_proj_in.lora_up.weight lora key not loaded
lora_unet_up_blocks_3_attentions_1_proj_out.alpha lora key not loaded
lora_unet_up_blocks_3_attentions_1_proj_out.lora_down.weight lora key not loaded
lora_unet_up_blocks_3_attentions_1_proj_out.lora_up.weight lora key not loaded
lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_k.alpha lora key not loaded
lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_k.lora_down.weight lora key not loaded
lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_k.lora_up.weight lora key not loaded
lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_out_0.alpha lora key not loaded
lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_out_0.lora_down.weight lora key not loaded
lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_out_0.lora_up.weight lora key not loaded
lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_q.alpha lora key not loaded
lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_q.lora_down.weight lora key not loaded
lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_q.lora_up.weight lora key not loaded
lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_v.alpha lora key not loaded
lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_v.lora_down.weight lora key not loaded
lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_v.lora_up.weight lora key not loaded
lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_k.alpha lora key not loaded
lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_k.lora_down.weight lora key not loaded
lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_k.lora_up.weight lora key not loaded
lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_out_0.alpha lora key not loaded
lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_out_0.lora_down.weight lora key not loaded
lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_out_0.lora_up.weight lora key not loaded
lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_q.alpha lora key not loaded
lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_q.lora_down.weight lora key not loaded
lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_q.lora_up.weight lora key not loaded
lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_v.alpha lora key not loaded
lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_v.lora_down.weight lora key not loaded
lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_v.lora_up.weight lora key not loaded
lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_ff_net_0_proj.alpha lora key not loaded
lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_ff_net_0_proj.lora_down.weight lora key not loaded
lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_ff_net_0_proj.lora_up.weight lora key not loaded
lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_ff_net_2.alpha lora key not loaded
lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_ff_net_2.lora_down.weight lora key not loaded
lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_ff_net_2.lora_up.weight lora key not loaded
lora_unet_up_blocks_3_attentions_2_proj_in.alpha lora key not loaded
lora_unet_up_blocks_3_attentions_2_proj_in.lora_down.weight lora key not loaded
lora_unet_up_blocks_3_attentions_2_proj_in.lora_up.weight lora key not loaded
lora_unet_up_blocks_3_attentions_2_proj_out.alpha lora key not loaded
lora_unet_up_blocks_3_attentions_2_proj_out.lora_down.weight lora key not loaded
lora_unet_up_blocks_3_attentions_2_proj_out.lora_up.weight lora key not loaded
lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_k.alpha lora key not loaded
lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_k.lora_down.weight lora key not loaded
lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_k.lora_up.weight lora key not loaded
lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_out_0.alpha lora key not loaded
lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_out_0.lora_down.weight lora key not loaded
lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_out_0.lora_up.weight lora key not loaded
lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_q.alpha lora key not loaded
lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_q.lora_down.weight lora key not loaded
lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_q.lora_up.weight lora key not loaded
lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_v.alpha lora key not loaded
lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_v.lora_down.weight lora key not loaded
lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_v.lora_up.weight lora key not loaded
lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_k.alpha lora key not loaded
lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_k.lora_down.weight lora key not loaded
lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_k.lora_up.weight lora key not loaded
lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_out_0.alpha lora key not loaded
lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_out_0.lora_down.weight lora key not loaded
lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_out_0.lora_up.weight lora key not loaded
lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_q.alpha lora key not loaded
lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_q.lora_down.weight lora key not loaded
lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_q.lora_up.weight lora key not loaded
lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_v.alpha lora key not loaded
lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_v.lora_down.weight lora key not loaded
lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_v.lora_up.weight lora key not loaded
lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_ff_net_0_proj.alpha lora key not loaded
lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_ff_net_0_proj.lora_down.weight lora key not loaded
lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_ff_net_0_proj.lora_up.weight lora key not loaded
lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_ff_net_2.alpha lora key not loaded
lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_ff_net_2.lora_down.weight lora key not loaded
lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_ff_net_2.lora_up.weight lora key not loaded
Requested to load SDXLClipModel
Loading 1 new model
WARNING: shape mismatch when trying to apply embedding, embedding will be ignored 768 1280
WARNING: shape mismatch when trying to apply embedding, embedding will be ignored 768 1280
WARNING: shape mismatch when trying to apply embedding, embedding will be ignored 768 1280
WARNING: shape mismatch when trying to apply embedding, embedding will be ignored 768 1280
WARNING: shape mismatch when trying to apply embedding, embedding will be ignored 768 1280
WARNING: shape mismatch when trying to apply embedding, embedding will be ignored 768 1280
WARNING: shape mismatch when trying to apply embedding, embedding will be ignored 768 1280
WARNING: shape mismatch when trying to apply embedding, embedding will be ignored 768 1280
WARNING: shape mismatch when trying to apply embedding, embedding will be ignored 768 1280
WARNING: shape mismatch when trying to apply embedding, embedding will be ignored 768 1280
Using pytorch attention in VAE
Working with z of shape (1, 4, 32, 32) = 4096 dimensions.
Using pytorch attention in VAE
Leftover VAE keys ['model_ema.decay', 'model_ema.num_updates']
Requested to load AutoencoderKL
Loading 1 new model
Requested to load SDXL
Loading 1 new model
ERROR diffusion_model.input_blocks.4.1.transformer_blocks.0.attn2.to_k.weight shape '[640, 2048]' is invalid for input of size 491520
ERROR diffusion_model.input_blocks.4.1.transformer_blocks.0.attn2.to_v.weight shape '[640, 2048]' is invalid for input of size 491520
ERROR diffusion_model.input_blocks.5.1.transformer_blocks.0.attn2.to_k.weight shape '[640, 2048]' is invalid for input of size 491520
ERROR diffusion_model.input_blocks.5.1.transformer_blocks.0.attn2.to_v.weight shape '[640, 2048]' is invalid for input of size 491520
ERROR diffusion_model.input_blocks.7.1.transformer_blocks.0.attn2.to_k.weight shape '[1280, 2048]' is invalid for input of size 983040
ERROR diffusion_model.input_blocks.7.1.transformer_blocks.0.attn2.to_v.weight shape '[1280, 2048]' is invalid for input of size 983040
ERROR diffusion_model.input_blocks.8.1.transformer_blocks.0.attn2.to_k.weight shape '[1280, 2048]' is invalid for input of size 983040
ERROR diffusion_model.input_blocks.8.1.transformer_blocks.0.attn2.to_v.weight shape '[1280, 2048]' is invalid for input of size 983040
ERROR diffusion_model.middle_block.1.transformer_blocks.0.attn2.to_k.weight shape '[1280, 2048]' is invalid for input of size 983040
ERROR diffusion_model.middle_block.1.transformer_blocks.0.attn2.to_v.weight shape '[1280, 2048]' is invalid for input of size 983040
ERROR diffusion_model.output_blocks.3.1.proj_in.weight shape '[640, 640]' is invalid for input of size 1638400
ERROR diffusion_model.output_blocks.3.1.transformer_blocks.0.attn1.to_q.weight shape '[640, 640]' is invalid for input of size 1638400
ERROR diffusion_model.output_blocks.3.1.transformer_blocks.0.attn1.to_k.weight shape '[640, 640]' is invalid for input of size 1638400
ERROR diffusion_model.output_blocks.3.1.transformer_blocks.0.attn1.to_v.weight shape '[640, 640]' is invalid for input of size 1638400
ERROR diffusion_model.output_blocks.3.1.transformer_blocks.0.attn1.to_out.0.weight shape '[640, 640]' is invalid for input of size 1638400
ERROR diffusion_model.output_blocks.3.1.transformer_blocks.0.ff.net.0.proj.weight shape '[5120, 640]' is invalid for input of size 13107200
ERROR diffusion_model.output_blocks.3.1.transformer_blocks.0.ff.net.2.weight shape '[640, 2560]' is invalid for input of size 6553600
ERROR diffusion_model.output_blocks.3.1.transformer_blocks.0.attn2.to_q.weight shape '[640, 640]' is invalid for input of size 1638400
ERROR diffusion_model.output_blocks.3.1.transformer_blocks.0.attn2.to_k.weight shape '[640, 2048]' is invalid for input of size 983040
ERROR diffusion_model.output_blocks.3.1.transformer_blocks.0.attn2.to_v.weight shape '[640, 2048]' is invalid for input of size 983040
ERROR diffusion_model.output_blocks.3.1.transformer_blocks.0.attn2.to_out.0.weight shape '[640, 640]' is invalid for input of size 1638400
ERROR diffusion_model.output_blocks.3.1.proj_out.weight shape '[640, 640]' is invalid for input of size 1638400
ERROR diffusion_model.output_blocks.4.1.proj_in.weight shape '[640, 640]' is invalid for input of size 1638400
ERROR diffusion_model.output_blocks.4.1.transformer_blocks.0.attn1.to_q.weight shape '[640, 640]' is invalid for input of size 1638400
ERROR diffusion_model.output_blocks.4.1.transformer_blocks.0.attn1.to_k.weight shape '[640, 640]' is invalid for input of size 1638400
ERROR diffusion_model.output_blocks.4.1.transformer_blocks.0.attn1.to_v.weight shape '[640, 640]' is invalid for input of size 1638400
ERROR diffusion_model.output_blocks.4.1.transformer_blocks.0.attn1.to_out.0.weight shape '[640, 640]' is invalid for input of size 1638400
ERROR diffusion_model.output_blocks.4.1.transformer_blocks.0.ff.net.0.proj.weight shape '[5120, 640]' is invalid for input of size 13107200
ERROR diffusion_model.output_blocks.4.1.transformer_blocks.0.ff.net.2.weight shape '[640, 2560]' is invalid for input of size 6553600
ERROR diffusion_model.output_blocks.4.1.transformer_blocks.0.attn2.to_q.weight shape '[640, 640]' is invalid for input of size 1638400
ERROR diffusion_model.output_blocks.4.1.transformer_blocks.0.attn2.to_k.weight shape '[640, 2048]' is invalid for input of size 983040
ERROR diffusion_model.output_blocks.4.1.transformer_blocks.0.attn2.to_v.weight shape '[640, 2048]' is invalid for input of size 983040
ERROR diffusion_model.output_blocks.4.1.transformer_blocks.0.attn2.to_out.0.weight shape '[640, 640]' is invalid for input of size 1638400
ERROR diffusion_model.output_blocks.4.1.proj_out.weight shape '[640, 640]' is invalid for input of size 1638400
ERROR diffusion_model.output_blocks.5.1.proj_in.weight shape '[640, 640]' is invalid for input of size 1638400
ERROR diffusion_model.output_blocks.5.1.transformer_blocks.0.attn1.to_q.weight shape '[640, 640]' is invalid for input of size 1638400
ERROR diffusion_model.output_blocks.5.1.transformer_blocks.0.attn1.to_k.weight shape '[640, 640]' is invalid for input of size 1638400
ERROR diffusion_model.output_blocks.5.1.transformer_blocks.0.attn1.to_v.weight shape '[640, 640]' is invalid for input of size 1638400
ERROR diffusion_model.output_blocks.5.1.transformer_blocks.0.attn1.to_out.0.weight shape '[640, 640]' is invalid for input of size 1638400
ERROR diffusion_model.output_blocks.5.1.transformer_blocks.0.ff.net.0.proj.weight shape '[5120, 640]' is invalid for input of size 13107200
ERROR diffusion_model.output_blocks.5.1.transformer_blocks.0.ff.net.2.weight shape '[640, 2560]' is invalid for input of size 6553600
ERROR diffusion_model.output_blocks.5.1.transformer_blocks.0.attn2.to_q.weight shape '[640, 640]' is invalid for input of size 1638400
ERROR diffusion_model.output_blocks.5.1.transformer_blocks.0.attn2.to_k.weight shape '[640, 2048]' is invalid for input of size 983040
ERROR diffusion_model.output_blocks.5.1.transformer_blocks.0.attn2.to_v.weight shape '[640, 2048]' is invalid for input of size 983040
ERROR diffusion_model.output_blocks.5.1.transformer_blocks.0.attn2.to_out.0.weight shape '[640, 640]' is invalid for input of size 1638400
ERROR diffusion_model.output_blocks.5.1.proj_out.weight shape '[640, 640]' is invalid for input of size 1638400
0%| | 0/4 [00:00<?, ?it/s]
Stopped server
25%|█████████████████████ | 1/4 [00:00<00:00, 3.25it/s]--- Logging error ---
Traceback (most recent call last):

I had to stop it; otherwise it goes on forever, writing an infinite stream of //////////////////////////////////

ameen-roayan commented 8 months ago

For some reason ComfyUI-Manager was broken; after removing it, everything worked. I recloned it through git and... same issue. Something broke it.

ameen-roayan commented 8 months ago

OK, after further testing: removing the ComfyUI-Manager folder from custom_nodes makes it work. Something is definitely in conflict with it, though honestly I am not sure what. I will keep this open.

Here is a list of my custom nodes; hopefully someone else has the same issue and we can find the culprit.

0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_IPAdapter_plus
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Image-Selector
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\alkemann.py
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-VideoHelperSuite
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\gcLatentTunnel.py
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\efficiency-nodes-comfyui
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\sdxl-recommended-res-calc
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUi-NoodleWebcam
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\hakkun_nodes.py
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_Fill-Nodes-main
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_GradientDeepShrink
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_TiledIPAdapter
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Logic
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\sdxl_prompt_styler
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_SimpleTiles
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_toyxyz_test_nodes
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_Noise
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Coziness
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\sd-dynamic-thresholding
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_ADV_CLIP_emb
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\cg-use-everywhere-main
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\save-image-extended-comfyui
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_TiledKSampler
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUi_NNLatentUpscale
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\stability-ComfyUI-nodes
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\cg_custom_core
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-SaveImgPrompt
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\comfyui-tooling-nodes
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_JPS-Nodes
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\comfyui_controlnet_aux
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-WD14-Tagger
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\IPAdapter-ComfyUI
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\comfyui-yanc
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_experiments
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_ColorMod
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\mikey_nodes
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Custom-Scripts
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\images-grid-comfy-plugin
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\comfyui-animatediff
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\cg-noise
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\comfyui-deploy
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyMath
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\comfyui-various
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Prompt-Expansion
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Advanced-ControlNet
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_UltimateSDUpscale
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-AnimateDiff-Evolved
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\comfy-image-saver
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Frame-Interpolation
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\rgthree-comfy
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\Derfuu_ComfyUI_ModdedNodes
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_essentials
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_Comfyroll_CustomNodes
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\comfyui-dream-project
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Inspire-Pack
0.0 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_smZNodes
0.1 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\NodeGPT-main
0.1 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-LCM
0.1 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_FizzNodes
0.1 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Impact-Pack
0.2 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\comfyui_segment_anything
0.2 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\comfyui-reactor-node
0.2 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\SeargeSDXL
0.4 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\comfyui-mixlab-nodes
0.5 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_tinyterraNodes
0.7 seconds: F:\AI\ComfyUI\ComfyUI\custom_nodes\comfyui-art-venture

beyastard commented 3 months ago

In your list above, "F:\AI\ComfyUI\ComfyUI\custom_nodes\comfyui-art-venture" is broken. In the file ./module/animatediff/__init__.py you will find the following:

line 9: animatediff_dir_names = ["AnimateDiff", "comfyui-animatediff"]
line 27: module_path = os.path.join(module_path, "animatediff/sliding_schedule.py")

ComfyUI-AnimateDiff-Evolved is the latest incarnation of AnimateDiff, and it cannot be used while comfyui-art-venture is installed.
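To illustrate why those two lines matter, here is a rough, hypothetical sketch of the lookup they imply (function and variable names are mine, not from comfyui-art-venture): the folder search is keyed on the two hard-coded directory names, so a ComfyUI-AnimateDiff-Evolved install, whose folder has a different name, is simply never found.

```python
import os

# From comfyui-art-venture's __init__.py, line 9: only these two folder
# names are recognized as AnimateDiff installs.
ANIMATEDIFF_DIR_NAMES = ["AnimateDiff", "comfyui-animatediff"]


def find_animatediff_schedule(installed_dirs):
    """Return the sliding_schedule.py path for the first recognized
    AnimateDiff folder, or None when only unrecognized forks (such as
    ComfyUI-AnimateDiff-Evolved) are installed.  Hypothetical helper;
    the real code walks custom_nodes on disk."""
    for name in installed_dirs:
        if name in ANIMATEDIFF_DIR_NAMES:
            # Mirrors line 27: the module path is built from the match.
            return os.path.join(name, "animatediff/sliding_schedule.py")
    return None
```

Under this sketch, `find_animatediff_schedule(["ComfyUI-AnimateDiff-Evolved"])` returns None, which would explain why the two node packs cannot coexist without comfyui-art-venture being updated to know the new folder name.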