StartHua / Comfyui_CXH_joy_caption

Recommended ComfyUI nodes for image captioning: Joy_caption + MiniCPMv2_6-prompt-generator + florence2
Apache License 2.0
287 stars · 17 forks

Error: no file named pytorch_model.bin, model.safetensors, tf_model.h5, model.ckpt.index or flax_model.msgpack found in directory C:/comfyui/models/clip\siglip-so400m-patch14-384. #60

Open jiej32228 opened 6 days ago

jiej32228 commented 6 days ago

Could anyone help me figure out how to fix this error? All the required models have been downloaded and placed in the folder: siglip-so400m-patch14-384

JoyCaption Error no file named pytorch_model.bin, model.safetensors, tf_model.h5, model.ckpt.index or flax_model.msgpack found in directory C:/comfyui/models/clip\siglip-so400m-patch14-384.

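The error means `transformers` scanned the local directory and found none of the weight filenames it accepts. A quick way to confirm what is actually in the folder is a small check script (the path below is the one from the error message; `check_model_dir` is a hypothetical helper, not part of the node):

```python
# Sanity check: transformers' from_pretrained() looks for one of these weight
# files inside a local model directory. If none is present, you get exactly
# the OSError reported in this issue.
from pathlib import Path

EXPECTED_WEIGHTS = [
    "pytorch_model.bin",
    "model.safetensors",
    "tf_model.h5",
    "model.ckpt.index",
    "flax_model.msgpack",
]

def check_model_dir(model_dir: str) -> list[str]:
    """Return the expected weight filenames that are present in model_dir."""
    root = Path(model_dir)
    return [name for name in EXPECTED_WEIGHTS if (root / name).is_file()]

found = check_model_dir(r"C:/comfyui/models/clip/siglip-so400m-patch14-384")
if not found:
    print("No weight file found - the model download is likely incomplete")
else:
    print("Found:", found)
```

If this prints nothing found, the folder probably contains only the config/tokenizer files, and `model.safetensors` still needs to be downloaded into it.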

# ComfyUI Error Report

## Error Details

## System Information
- **ComfyUI Version:** v0.2.2
- **Arguments:** C:\AI\ComfyUI_windows_portable\ComfyUI\main.py --auto-launch --preview-method auto --disable-cuda-malloc --fast
- **OS:** nt
- **Python Version:** 3.11.9 (tags/v3.11.9:de54cf5, Apr  2 2024, 10:12:12) [MSC v.1938 64 bit (AMD64)]
- **Embedded Python:** true
- **PyTorch Version:** 2.4.1+cu121
## Devices

- **Name:** cuda:0 NVIDIA GeForce RTX 4070 Ti SUPER : cudaMallocAsync
  - **Type:** cuda
  - **VRAM Total:** 17170825216 bytes (≈16.0 GiB)
  - **VRAM Free:** 15723397120 bytes (≈14.6 GiB)
  - **Torch VRAM Total:** 67108864 bytes (64 MiB)
  - **Torch VRAM Free:** 33554432 bytes (32 MiB)

## Logs

2024-09-15 20:02:01,715 - root - WARNING - lora key not loaded: lora_te1_text_model_encoder_layers_8_self_attn_out_proj.alpha
(the same "lora key not loaded" warning repeats for the .alpha, .lora_down.weight and .lora_up.weight keys of text-encoder layers 8 and 9)
2024-09-15 20:02:02,221 - root - WARNING - clip missing: ['textprojection.weight']
2024-09-15 20:02:10,079 - root - INFO - Requested to load FluxClipModel
2024-09-15 20:02:10,079 - root - INFO - Loading 1 new model
2024-09-15 20:02:12,649 - root - INFO - loaded completely 0.0 9319.23095703125 True
2024-09-15 20:02:13,371 - root - INFO - Requested to load Flux
2024-09-15 20:02:13,371 - root - INFO - Loading 1 new model
2024-09-15 20:02:38,237 - root - INFO - loaded completely 0.0 11350.048889160156 True
2024-09-15 20:03:08,844 - root - INFO - Requested to load AutoencodingEngine
2024-09-15 20:03:08,844 - root - INFO - Loading 1 new model
2024-09-15 20:03:09,398 - root - INFO - loaded completely 0.0 159.87335777282715 True
2024-09-15 20:03:10,047 - root - INFO - Prompt executed in 106.76 seconds
2024-09-15 20:03:32,229 - root - INFO - got prompt
2024-09-15 20:03:33,046 - root - ERROR - !!! Exception during processing !!! Error no file named pytorch_model.bin, model.safetensors, tf_model.h5, model.ckpt.index or flax_model.msgpack found in directory C:/comfyui/models/clip\siglip-so400m-patch14-384.
2024-09-15 20:03:33,046 - root - ERROR - Traceback (most recent call last):
  File "C:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 317, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "C:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 192, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "C:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 169, in _map_node_over_list
    process_inputs(input_dict, i)
  File "C:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 158, in process_inputs
    results.append(getattr(obj, func)(**inputs))
  File "C:\AI\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_HF_Servelress_Inference\nodes\Joy_Caption.py", line 125, in gen
    self.loadCheckPoint()
  File "C:\AI\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_HF_Servelress_Inference\nodes\Joy_Caption.py", line 84, in loadCheckPoint
    clip_model = AutoModel.from_pretrained(
  File "C:\AI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\models\auto\auto_factory.py", line 564, in from_pretrained
    return model_class.from_pretrained(
  File "C:\AI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\modeling_utils.py", line 3460, in from_pretrained
    raise EnvironmentError(
OSError: Error no file named pytorch_model.bin, model.safetensors, tf_model.h5, model.ckpt.index or flax_model.msgpack found in directory C:/comfyui/models/clip\siglip-so400m-patch14-384.

2024-09-15 20:03:33,048 - root - INFO - Prompt executed in 0.80 seconds
2024-09-15 20:03:39,933 - root - INFO - got prompt
2024-09-15 20:03:40,455 - root - ERROR - !!! Exception during processing !!! Error no file named pytorch_model.bin, model.safetensors, tf_model.h5, model.ckpt.index or flax_model.msgpack found in directory C:/comfyui/models/clip\siglip-so400m-patch14-384.
2024-09-15 20:03:40,457 - root - ERROR - Traceback (most recent call last): (identical to the traceback above)

2024-09-15 20:03:40,459 - root - INFO - Prompt executed in 0.51 seconds
2024-09-15 20:03:43,669 - root - INFO - got prompt
2024-09-15 20:03:44,222 - root - ERROR - !!! Exception during processing !!! Error no file named pytorch_model.bin, model.safetensors, tf_model.h5, model.ckpt.index or flax_model.msgpack found in directory C:/comfyui/models/clip\siglip-so400m-patch14-384.
2024-09-15 20:03:44,225 - root - ERROR - Traceback (most recent call last): (identical to the traceback above)

2024-09-15 20:03:44,226 - root - INFO - Prompt executed in 0.54 seconds
2024-09-15 20:04:44,797 - root - INFO - got prompt
2024-09-15 20:04:47,599 - root - INFO - Using pytorch attention in VAE
2024-09-15 20:04:47,602 - root - INFO - Using pytorch attention in VAE
2024-09-15 20:04:48,102 - root - INFO - model weight dtype torch.float8_e5m2, manual cast: torch.bfloat16
2024-09-15 20:04:48,102 - root - INFO - model_type FLUX
2024-09-15 20:05:17,583 - root - WARNING - lora key not loaded: lora_te1_text_model_encoder_layers_0_mlp_fc1.alpha
(the same "lora key not loaded" warning repeats for the .alpha, .lora_down.weight and .lora_up.weight keys of text-encoder layers 0 to 11; the log is truncated here)
lora_te1_text_model_encoder_layers_8_mlp_fc1.lora_down.weight 2024-09-15 20:05:17,593 - root - WARNING - lora key not loaded: lora_te1_text_model_encoder_layers_8_mlp_fc1.lora_up.weight 2024-09-15 20:05:17,593 - root - WARNING - lora key not loaded: lora_te1_text_model_encoder_layers_8_mlp_fc2.alpha 2024-09-15 20:05:17,593 - root - WARNING - lora key not loaded: lora_te1_text_model_encoder_layers_8_mlp_fc2.lora_down.weight 2024-09-15 20:05:17,593 - root - WARNING - lora key not loaded: lora_te1_text_model_encoder_layers_8_mlp_fc2.lora_up.weight 2024-09-15 20:05:17,594 - root - WARNING - lora key not loaded: lora_te1_text_model_encoder_layers_8_self_attn_k_proj.alpha 2024-09-15 20:05:17,594 - root - WARNING - lora key not loaded: lora_te1_text_model_encoder_layers_8_self_attn_k_proj.lora_down.weight 2024-09-15 20:05:17,594 - root - WARNING - lora key not loaded: lora_te1_text_model_encoder_layers_8_self_attn_k_proj.lora_up.weight 2024-09-15 20:05:17,594 - root - WARNING - lora key not loaded: lora_te1_text_model_encoder_layers_8_self_attn_out_proj.alpha 2024-09-15 20:05:17,594 - root - WARNING - lora key not loaded: lora_te1_text_model_encoder_layers_8_self_attn_out_proj.lora_down.weight 2024-09-15 20:05:17,594 - root - WARNING - lora key not loaded: lora_te1_text_model_encoder_layers_8_self_attn_out_proj.lora_up.weight 2024-09-15 20:05:17,594 - root - WARNING - lora key not loaded: lora_te1_text_model_encoder_layers_8_self_attn_q_proj.alpha 2024-09-15 20:05:17,594 - root - WARNING - lora key not loaded: lora_te1_text_model_encoder_layers_8_self_attn_q_proj.lora_down.weight 2024-09-15 20:05:17,594 - root - WARNING - lora key not loaded: lora_te1_text_model_encoder_layers_8_self_attn_q_proj.lora_up.weight 2024-09-15 20:05:17,594 - root - WARNING - lora key not loaded: lora_te1_text_model_encoder_layers_8_self_attn_v_proj.alpha 2024-09-15 20:05:17,594 - root - WARNING - lora key not loaded: lora_te1_text_model_encoder_layers_8_self_attn_v_proj.lora_down.weight 
2024-09-15 20:05:17,594 - root - WARNING - lora key not loaded: lora_te1_text_model_encoder_layers_8_self_attn_v_proj.lora_up.weight 2024-09-15 20:05:17,594 - root - WARNING - lora key not loaded: lora_te1_text_model_encoder_layers_9_mlp_fc1.alpha 2024-09-15 20:05:17,594 - root - WARNING - lora key not loaded: lora_te1_text_model_encoder_layers_9_mlp_fc1.lora_down.weight 2024-09-15 20:05:17,594 - root - WARNING - lora key not loaded: lora_te1_text_model_encoder_layers_9_mlp_fc1.lora_up.weight 2024-09-15 20:05:17,594 - root - WARNING - lora key not loaded: lora_te1_text_model_encoder_layers_9_mlp_fc2.alpha 2024-09-15 20:05:17,594 - root - WARNING - lora key not loaded: lora_te1_text_model_encoder_layers_9_mlp_fc2.lora_down.weight 2024-09-15 20:05:17,594 - root - WARNING - lora key not loaded: lora_te1_text_model_encoder_layers_9_mlp_fc2.lora_up.weight 2024-09-15 20:05:17,594 - root - WARNING - lora key not loaded: lora_te1_text_model_encoder_layers_9_self_attn_k_proj.alpha 2024-09-15 20:05:17,595 - root - WARNING - lora key not loaded: lora_te1_text_model_encoder_layers_9_self_attn_k_proj.lora_down.weight 2024-09-15 20:05:17,595 - root - WARNING - lora key not loaded: lora_te1_text_model_encoder_layers_9_self_attn_k_proj.lora_up.weight 2024-09-15 20:05:17,595 - root - WARNING - lora key not loaded: lora_te1_text_model_encoder_layers_9_self_attn_out_proj.alpha 2024-09-15 20:05:17,595 - root - WARNING - lora key not loaded: lora_te1_text_model_encoder_layers_9_self_attn_out_proj.lora_down.weight 2024-09-15 20:05:17,595 - root - WARNING - lora key not loaded: lora_te1_text_model_encoder_layers_9_self_attn_out_proj.lora_up.weight 2024-09-15 20:05:17,595 - root - WARNING - lora key not loaded: lora_te1_text_model_encoder_layers_9_self_attn_q_proj.alpha 2024-09-15 20:05:17,595 - root - WARNING - lora key not loaded: lora_te1_text_model_encoder_layers_9_self_attn_q_proj.lora_down.weight 2024-09-15 20:05:17,595 - root - WARNING - lora key not loaded: 
lora_te1_text_model_encoder_layers_9_self_attn_q_proj.lora_up.weight 2024-09-15 20:05:17,595 - root - WARNING - lora key not loaded: lora_te1_text_model_encoder_layers_9_self_attn_v_proj.alpha 2024-09-15 20:05:17,595 - root - WARNING - lora key not loaded: lora_te1_text_model_encoder_layers_9_self_attn_v_proj.lora_down.weight 2024-09-15 20:05:17,595 - root - WARNING - lora key not loaded: lora_te1_text_model_encoder_layers_9_self_attn_v_proj.lora_up.weight 2024-09-15 20:05:17,868 - root - WARNING - clip missing: ['textprojection.weight'] 2024-09-15 20:05:21,755 - root - INFO - Requested to load FluxClipModel 2024-09-15 20:05:21,755 - root - INFO - Loading 1 new model 2024-09-15 20:05:24,481 - root - INFO - loaded completely 0.0 9319.23095703125 True 2024-09-15 20:05:24,675 - root - INFO - Requested to load Flux 2024-09-15 20:05:24,675 - root - INFO - Loading 1 new model 2024-09-15 20:05:43,413 - root - INFO - loaded completely 0.0 11350.048889160156 True 2024-09-15 20:06:13,990 - root - INFO - Requested to load AutoencodingEngine 2024-09-15 20:06:13,990 - root - INFO - Loading 1 new model 2024-09-15 20:06:14,106 - root - INFO - loaded completely 0.0 159.87335777282715 True 2024-09-15 20:06:14,439 - root - INFO - Prompt executed in 89.63 seconds 2024-09-15 20:06:20,388 - root - INFO - got prompt 2024-09-15 20:06:27,368 - root - INFO - loaded completely 13317.622642227172 11350.048889160156 True 2024-09-15 20:06:58,566 - root - INFO - Prompt executed in 38.16 seconds 2024-09-15 20:07:06,601 - root - INFO - got prompt 2024-09-15 20:07:10,363 - root - INFO - loaded completely 13317.622642227172 11350.048889160156 True 2024-09-15 20:07:48,106 - root - INFO - Prompt executed in 41.50 seconds 2024-09-15 20:10:45,058 - root - INFO - got prompt 2024-09-15 20:10:45,551 - root - ERROR - !!! Exception during processing !!! 
Error no file named pytorch_model.bin, model.safetensors, tf_model.h5, model.ckpt.index or flax_model.msgpack found in directory C:/comfyui/models/clip\siglip-so400m-patch14-384.
2024-09-15 20:10:45,553 - root - ERROR - Traceback (most recent call last):
  File "C:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 317, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "C:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 192, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "C:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 169, in _map_node_over_list
    process_inputs(input_dict, i)
  File "C:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 158, in process_inputs
    results.append(getattr(obj, func)(**inputs))
  File "C:\AI\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_HF_Servelress_Inference\nodes\Joy_Caption.py", line 125, in gen
    self.loadCheckPoint()
  File "C:\AI\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_HF_Servelress_Inference\nodes\Joy_Caption.py", line 84, in loadCheckPoint
    clip_model = AutoModel.from_pretrained(
  File "C:\AI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\models\auto\auto_factory.py", line 564, in from_pretrained
    return model_class.from_pretrained(
  File "C:\AI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\modeling_utils.py", line 3460, in from_pretrained
    raise EnvironmentError(
OSError: Error no file named pytorch_model.bin, model.safetensors, tf_model.h5, model.ckpt.index or flax_model.msgpack found in directory C:/comfyui/models/clip\siglip-so400m-patch14-384.

2024-09-15 20:10:45,553 - root - INFO - Prompt executed in 0.49 seconds
2024-09-15 20:13:52,938 - root - INFO - got prompt
2024-09-15 20:13:53,387 - root - ERROR - !!! Exception during processing !!! Error no file named pytorch_model.bin, model.safetensors, tf_model.h5, model.ckpt.index or flax_model.msgpack found in directory C:/comfyui/models/clip\siglip-so400m-patch14-384.
2024-09-15 20:13:53,388 - root - ERROR - Traceback (most recent call last):
[... same traceback as above, except that loadCheckPoint is entered via Joy_Caption.py line 152, in gen: joy_pipeline.parent.loadCheckPoint() ...]
OSError: Error no file named pytorch_model.bin, model.safetensors, tf_model.h5, model.ckpt.index or flax_model.msgpack found in directory C:/comfyui/models/clip\siglip-so400m-patch14-384.
2024-09-15 20:13:53,388 - root - INFO - Prompt executed in 0.44 seconds
2024-09-15 20:14:00,968 - root - INFO - got prompt
2024-09-15 20:14:01,288 - root - ERROR - !!! Exception during processing !!! Error no file named pytorch_model.bin, model.safetensors, tf_model.h5, model.ckpt.index or flax_model.msgpack found in directory C:/comfyui/models/clip\siglip-so400m-patch14-384.
2024-09-15 20:14:01,290 - root - ERROR - Traceback (most recent call last):
[... identical traceback repeated ...]

2024-09-15 20:14:01,291 - root - INFO - Prompt executed in 0.31 seconds

## Attached Workflow
Please make sure that workflow does not contain any sensitive information such as API keys or passwords.

{"last_node_id":8,"last_link_id":7,"nodes":[{"id":4,"type":"GoogleTranslateTextNode","pos":{"0":1048.4412841796875,"1":733.2234497070312},"size":{"0":218.39999389648438,"1":171.99998474121094},"flags":{},"order":3,"mode":0,"inputs":[{"name":"text","type":"STRING","link":3,"widget":{"name":"text"},"label":"text"}],"outputs":[{"name":"text","type":"STRING","links":[6],"slot_index":0,"shape":3,"label":"text"}],"properties":{"Node name for S&R":"GoogleTranslateTextNode"},"widgets_values":["auto","zh-cn",false,"Manual Trasnlate","",true]},{"id":6,"type":"LoadImage","pos":{"0":67,"1":335},"size":{"0":315,"1":314},"flags":{},"order":0,"mode":0,"inputs":[],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[5],"shape":3,"label":"图像"},{"name":"MASK","type":"MASK","links":null,"shape":3,"label":"遮罩"}],"properties":{"Node name for S&R":"LoadImage"},"widgets_values":["01.png","image"]},{"id":3,"type":"Joy_caption_load","pos":{"0":544.4412841796875,"1":320.2234191894531},"size":{"0":315,"1":58},"flags":{},"order":1,"mode":0,"inputs":[],"outputs":[{"name":"JoyPipeline","type":"JoyPipeline","links":[1],"slot_index":0,"shape":3,"label":"JoyCaption"}],"properties":{"Node name for S&R":"Joy_caption_load"},"widgets_values":["unsloth/Meta-Llama-3.1-8B-bnb-4bit"]},{"id":7,"type":"easy showAnything","pos":{"0":614.4412841796875,"1":715.2234497070312},"size":[356.4357604980469,250.48460388183594],"flags":{},"order":5,"mode":0,"inputs":[{"name":"anything","type":"","link":6,"label":"输入任何"}],"outputs":[],"properties":{"Node name for S&R":"easy showAnything"}},{"id":8,"type":"easy showAnything","pos":{"0":981,"1":420},"size":[356.4357604980469,250.48460388183594],"flags":{},"order":4,"mode":0,"inputs":[{"name":"anything","type":"","link":7,"label":"输入任何"}],"outputs":[],"properties":{"Node name for S&R":"easy 
showAnything"}},{"id":1,"type":"Joy_caption","pos":{"0":501.4412841796875,"1":439.2234191894531},"size":{"0":400,"1":200},"flags":{},"order":2,"mode":0,"inputs":[{"name":"joy_pipeline","type":"JoyPipeline","link":1,"label":"JoyCaption"},{"name":"image","type":"IMAGE","link":5,"slot_index":1,"label":"图像"}],"outputs":[{"name":"STRING","type":"STRING","links":[3,7],"slot_index":0,"shape":3,"label":"字符串"}],"properties":{"Node name for S&R":"Joy_caption"},"widgets_values":["A descriptive caption for this image",300,0.5,false,true]}],"links":[[1,3,0,1,0,"JoyPipeline"],[3,1,0,4,0,"STRING"],[5,6,0,1,1,"IMAGE"],[6,4,0,7,0,""],[7,1,0,8,0,""]],"groups":[],"config":{},"extra":{"ds":{"scale":0.876922695000001,"offset":[235.12303623020046,-14.780887099634423]},"workspace_info":{"id":"i0k5Joz_uTKx4VlPLlM3h"}},"version":0.4}



## Additional Context
(Please add any additional context or steps to reproduce the error here)
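One thing worth checking: `transformers`' `from_pretrained()` only accepts the weight filenames listed in the error, and the directory in the error (`C:/comfyui/models/clip\siglip-so400m-patch14-384`) is under `C:/comfyui`, while ComfyUI itself runs from `C:\AI\ComfyUI_windows_portable`, so the node may be resolving a different models folder than the one the files were copied into. A minimal diagnostic sketch (the path below is an example; adjust it to your install):

```python
import os

# Weight filenames that transformers' from_pretrained() looks for,
# exactly as listed in the OSError above.
EXPECTED_WEIGHT_FILES = [
    "pytorch_model.bin",
    "model.safetensors",
    "tf_model.h5",
    "model.ckpt.index",
    "flax_model.msgpack",
]

def find_weight_file(filenames):
    """Return the first expected weight filename present in filenames, or None."""
    for name in EXPECTED_WEIGHT_FILES:
        if name in filenames:
            return name
    return None

def check_model_dir(path):
    """Report whether path contains a weight file from_pretrained() can load."""
    files = os.listdir(path)
    found = find_weight_file(files)
    if found:
        print(f"OK: found {found} in {path}")
    else:
        print(f"No loadable weights in {path}; files present: {files}")

# Example (hypothetical path -- use the directory from your own error message):
# check_model_dir(r"C:\comfyui\models\clip\siglip-so400m-patch14-384")
```

If the check reports no loadable weights, either the `model.safetensors` download is incomplete/misnamed, or the files were placed under a different directory than the one the node resolves.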
zrk1117 commented 1 day ago

Same problem here. OP, did you manage to solve it?