Recommended based on ComfyUI node pictures: Joy_caption + MiniCPMv2_6-prompt-generator + florence2
Apache License 2.0 · 444 stars · 26 forks
When loading JoyCaption with Meta-Llama-3.1-8B-bnb-4bit, I get the error `rope_scaling` must be a dictionary with two fields, `type` and `factor`, got {'factor': 8.0, 'high_freq_factor': 4.0, 'low_freq_factor': 1.0, 'original_max_position_embeddings': 8192, 'rope_type': 'llama3'} #67
Exception Message: `rope_scaling` must be a dictionary with two fields, `type` and `factor`, got {'factor': 8.0, 'high_freq_factor': 4.0, 'low_freq_factor': 1.0, 'original_max_position_embeddings': 8192, 'rope_type': 'llama3'}
Stack Trace
File "D:\SD\ComfyUI\ComfyUI\execution.py", line 323, in execute
output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\ComfyUI\execution.py", line 198, in get_output_data
return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\ComfyUI\execution.py", line 169, in _map_node_over_list
process_inputs(input_dict, i)
File "D:\SD\ComfyUI\ComfyUI\execution.py", line 158, in process_inputs
results.append(getattr(obj, func)(**inputs))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\ComfyUI\custom_nodes\Comfyui_CXH_joy_caption\Joy_caption_node.py", line 135, in gen
self.loadCheckPoint()
File "D:\SD\ComfyUI\ComfyUI\custom_nodes\Comfyui_CXH_joy_caption\Joy_caption_node.py", line 110, in loadCheckPoint
text_model = AutoModelForCausalLM.from_pretrained(MODEL_PATH, device_map="auto",trust_remote_code=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\models\auto\auto_factory.py", line 523, in from_pretrained
config, kwargs = AutoConfig.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\models\auto\configuration_auto.py", line 952, in from_pretrained
return config_class.from_dict(config_dict, **unused_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\configuration_utils.py", line 761, in from_dict
config = cls(**config_dict)
^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\models\llama\configuration_llama.py", line 161, in __init__
self._rope_scaling_validation()
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\models\llama\configuration_llama.py", line 181, in _rope_scaling_validation
raise ValueError(
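This validation failure is what older `transformers` releases raise when they meet the Llama 3.1 `rope_scaling` layout: their `LlamaConfig` validator only accepts the two legacy keys `type` and `factor`. A minimal sketch to confirm whether the bundled `transformers` can parse the checkpoint's config (the local folder path and the exact version cutoff are assumptions, not taken from the report):

```python
# Minimal check, assuming the model folder used by the Joy_caption node.
# transformers releases from roughly 4.43 on accept rope_type='llama3';
# older ones raise the same ValueError shown above.
import transformers
from transformers import AutoConfig

print("transformers", transformers.__version__)

MODEL_PATH = r"D:\SD\ComfyUI\ComfyUI\models\LLM\Meta-Llama-3.1-8B-bnb-4bit"  # assumed path
try:
    cfg = AutoConfig.from_pretrained(MODEL_PATH)
    print("rope_scaling parsed OK:", cfg.rope_scaling)
except ValueError as err:
    print("rope_scaling rejected by this transformers version:", err)
```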
2024-09-19 20:08:32,802 - root - INFO - Total VRAM 16380 MB, total RAM 65372 MB
2024-09-19 20:08:32,802 - root - INFO - pytorch version: 2.4.1+cu124
2024-09-19 20:08:32,803 - root - INFO - Set vram state to: NORMAL_VRAM
2024-09-19 20:08:32,803 - root - INFO - Device: cuda:0 NVIDIA GeForce RTX 4060 Ti : cudaMallocAsync
2024-09-19 20:08:33,291 - root - INFO - Using pytorch cross attention
2024-09-19 20:08:34,021 - root - INFO - [Prompt Server] web root: D:\SD\ComfyUI\ComfyUI\web
2024-09-19 20:08:34,023 - root - INFO - Adding extra search path checkpoints D:\SD\sd-webui-aki-v4.8\models/Stable-diffusion
2024-09-19 20:08:34,023 - root - INFO - Adding extra search path configs D:\SD\sd-webui-aki-v4.8\models/Stable-diffusion
2024-09-19 20:08:34,023 - root - INFO - Adding extra search path vae D:\SD\sd-webui-aki-v4.8\models/VAE
2024-09-19 20:08:34,023 - root - INFO - Adding extra search path loras D:\SD\sd-webui-aki-v4.8\models/Lora
2024-09-19 20:08:34,023 - root - INFO - Adding extra search path loras D:\SD\sd-webui-aki-v4.8\models/LyCORIS
2024-09-19 20:08:34,023 - root - INFO - Adding extra search path upscale_models D:\SD\sd-webui-aki-v4.8\models/ESRGAN
2024-09-19 20:08:34,023 - root - INFO - Adding extra search path upscale_models D:\SD\sd-webui-aki-v4.8\models/RealESRGAN
2024-09-19 20:08:34,023 - root - INFO - Adding extra search path upscale_models D:\SD\sd-webui-aki-v4.8\models/SwinIR
2024-09-19 20:08:34,023 - root - INFO - Adding extra search path embeddings D:\SD\sd-webui-aki-v4.8\embeddings
2024-09-19 20:08:34,023 - root - INFO - Adding extra search path hypernetworks D:\SD\sd-webui-aki-v4.8\models/hypernetworks
2024-09-19 20:08:34,023 - root - INFO - Adding extra search path controlnet D:\SD\sd-webui-aki-v4.8\models/ControlNet
2024-09-19 20:08:44,799 - root - WARNING - Traceback (most recent call last):
File "D:\SD\ComfyUI\ComfyUI\nodes.py", line 1994, in load_custom_node
module_spec.loader.exec_module(module)
File "", line 940, in exec_module
File "", line 241, in _call_with_frames_removed
File "D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-BiRefNet-ZHO__init__.py", line 1, in
from . import birefnet
File "D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-BiRefNet-ZHO\birefnet.py", line 7, in
from models.baseline import BiRefNet
File "D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-BiRefNet-ZHO\models\baseline.py", line 16, in
from models.refinement.refiner import Refiner, RefinerPVTInChannels4, RefUNet
File "D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-BiRefNet-ZHO\models\refinement\refiner.py", line 11, in
from dataset import class_labels_TR_sorted
File "D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-BiRefNet-ZHO\dataset.py", line 10, in
from utils import path_to_image
ImportError: cannot import name 'path_to_image' from 'utils' (D:\SD\ComfyUI\ComfyUI\utils__init__.py)
2024-09-19 20:08:44,799 - root - WARNING - Cannot import D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-BiRefNet-ZHO module for custom nodes: cannot import name 'path_to_image' from 'utils' (D:\SD\ComfyUI\ComfyUI\utils__init.py)
2024-09-19 20:08:44,879 - dinov2 - WARNING - xFormers not available
2024-09-19 20:08:44,881 - dinov2 - WARNING - xFormers not available
2024-09-19 20:08:45,374 - root - INFO - Total VRAM 16380 MB, total RAM 65372 MB
2024-09-19 20:08:45,375 - root - INFO - pytorch version: 2.4.1+cu124
2024-09-19 20:08:45,375 - root - INFO - Set vram state to: NORMAL_VRAM
2024-09-19 20:08:45,375 - root - INFO - Device: cuda:0 NVIDIA GeForce RTX 4060 Ti : cudaMallocAsync
2024-09-19 20:08:48,722 - root - INFO - --------------
2024-09-19 20:08:48,723 - root - INFO - ### Mixlab Nodes: Loaded
2024-09-19 20:08:48,723 - root - INFO - ChatGPT.available True
2024-09-19 20:08:48,723 - root - INFO - editmask.available True
2024-09-19 20:08:48,818 - root - INFO - ClipInterrogator.available True
2024-09-19 20:08:48,850 - root - INFO - PromptGenerate.available True
2024-09-19 20:08:48,850 - root - INFO - ChinesePrompt.available True
2024-09-19 20:08:48,850 - root - INFO - RembgNode.available True
2024-09-19 20:08:49,287 - root - INFO - TripoSR.available
2024-09-19 20:08:49,288 - root - INFO - MiniCPMNode.available
2024-09-19 20:08:49,326 - root - INFO - Scenedetect.available
2024-09-19 20:08:49,381 - root - INFO - FishSpeech.available
2024-09-19 20:08:49,381 - root - INFO - --------------
2024-09-19 20:08:50,952 - root - WARNING - Traceback (most recent call last):
File "D:\SD\ComfyUI\ComfyUI\nodes.py", line 1994, in load_custom_node
module_spec.loader.exec_module(module)
File "", line 940, in exec_module
File "", line 241, in _call_with_frames_removed
File "D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI_bitsandbytes_NF4__init__.py", line 6, in
import bitsandbytes
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\bitsandbytes__init.py", line 6, in
from . import cuda_setup, utils, research
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\bitsandbytes\research\init__.py", line 1, in
from . import nn
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\bitsandbytes\research\nn\init.py", line 1, in
from .modules import LinearFP8Mixed, LinearFP8Global
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\bitsandbytes\research\nn\modules.py", line 8, in
from bitsandbytes.optim import GlobalOptimManager
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\bitsandbytes\optim\init__.py", line 6, in
from bitsandbytes.cextension import COMPILED_WITH_CUDA
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\bitsandbytes\cextension.py", line 20, in
raise RuntimeError('''
RuntimeError:
CUDA Setup failed despite GPU being available. Please run the following command to get more information:
python -m bitsandbytes
Inspect the output of the command and see if you can locate CUDA libraries. You might need to add them
to your LD_LIBRARY_PATH. If you suspect a bug, please take the information from python -m bitsandbytes
and open an issue at: https://github.com/TimDettmers/bitsandbytes/issues
2024-09-19 20:08:50,952 - root - WARNING - Cannot import D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI_bitsandbytes_NF4 module for custom nodes:
CUDA Setup failed despite GPU being available. Please run the following command to get more information:
python -m bitsandbytes
Inspect the output of the command and see if you can locate CUDA libraries. You might need to add them
to your LD_LIBRARY_PATH. If you suspect a bug, please take the information from python -m bitsandbytes
and open an issue at: https://github.com/TimDettmers/bitsandbytes/issues
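This bitsandbytes failure is separate from the JoyCaption error: on the Windows embedded Python it usually just means the installed bitsandbytes build has no usable CUDA binary, so the NF4 node is skipped. A small probe, nothing here is specific to this repo, to see what the embedded interpreter actually reports:

```python
# Probe the bitsandbytes install from the embedded interpreter, e.g.
#   D:\SD\ComfyUI\python_embeded\python.exe probe_bnb.py   (assumed layout)
import importlib

try:
    bnb = importlib.import_module("bitsandbytes")
    print("bitsandbytes", getattr(bnb, "__version__", "unknown"), "imported OK")
except Exception as err:  # the RuntimeError from cextension.py lands here
    print("bitsandbytes import failed:", err)
```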
2024-09-19 20:08:52,533 - root - WARNING - Traceback (most recent call last):
File "D:\SD\ComfyUI\ComfyUI\nodes.py", line 1994, in load_custom_node
module_spec.loader.exec_module(module)
File "", line 940, in exec_module
File "", line 241, in _call_with_frames_removed
File "D:\SD\ComfyUI\ComfyUI\custom_nodes\eden_comfy_pipelines__init__.py", line 6, in
from img_utils.img_nodes import *
File "D:\SD\ComfyUI\ComfyUI\custom_nodes\eden_comfy_pipelines\img_utils\img_nodes.py", line 359, in
from sklearn.cluster import KMeans
ModuleNotFoundError: No module named 'sklearn'
2024-09-19 20:08:52,533 - root - WARNING - Cannot import D:\SD\ComfyUI\ComfyUI\custom_nodes\eden_comfy_pipelines module for custom nodes: No module named 'sklearn'
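The eden_comfy_pipelines failure is likewise independent of the main error; it only indicates that scikit-learn is missing from the embedded Python. A quick way to verify (the install command in the comment is an assumption about the usual ComfyUI portable layout):

```python
# Check whether scikit-learn is importable in the interpreter ComfyUI runs with.
import importlib.util

if importlib.util.find_spec("sklearn") is None:
    # Typical fix for the portable build (assumed path):
    #   D:\SD\ComfyUI\python_embeded\python.exe -m pip install scikit-learn
    print("scikit-learn is not installed")
else:
    import sklearn
    print("scikit-learn", sklearn.__version__)
```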
2024-09-19 20:08:53,486 - root - INFO -
Import times for custom nodes:
2024-09-19 20:08:53,486 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\Mask Blackener.py
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\Image List Converter.py
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\get_size_resize.py
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\Mask Applier and Combiner.py
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\example_node.py
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\comfyui-portrait-master-zh-cn
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\sdxl_utility.py
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\websocket_image_save.py
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\AIGODLIKE-ComfyUI-Translation
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Image-Selector
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\comfyUI_TJ_NormalLighting
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-CascadeResolutions
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-ZeroShot-MTrans
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\hakkun_nodes.py
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\SD-Latent-Interposer
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Inpaint-CropAndStitch
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\Comfyui_TTP_CN_Preprocessor
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI_IPAdapter_plus
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Ollama-Describer
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ControlNet-LLLite-ComfyUI
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI_ADV_CLIP_emb
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI_Noise
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUi_NNLatentUpscale
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\efficiency-nodes-comfyui
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\Comfyui_TTP_Toolset
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI_PRNodes
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\cg-use-everywhere
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\FreeU_Advanced
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\klinter_nodes
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI_TiledKSampler
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Chibi-Nodes
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Eagle-PNGInfo
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\stability-ComfyUI-nodes
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\Comfyui_joytag
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-BRIA_AI-RMBG
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\sd-perturbed-attention
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-post-processing-nodes
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\comfyui_controlnet_aux
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\Comfy-Topaz
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\wlsh_nodes
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI_Dave_CustomNode
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-WD14-Tagger
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-AutomaticCFG
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyLiterals
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI_experiments
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-ELLA
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\masquerade-nodes-comfyui
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\MistoControlNet-Flux-dev
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Dickson-Nodes
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\comfyui-inpaint-nodes
2024-09-19 20:08:53,489 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-CUP
2024-09-19 20:08:53,489 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\AuraSR-ComfyUI
2024-09-19 20:08:53,489 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Text_Image-Composite
2024-09-19 20:08:53,489 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-OllamaGemini
2024-09-19 20:08:53,489 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-TiledDiffusion
2024-09-19 20:08:53,489 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\PowerNoiseSuite
2024-09-19 20:08:53,489 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\Comfyui-StableSR
2024-09-19 20:08:53,489 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\comfyui-nodes-docs
2024-09-19 20:08:53,489 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-DDColor
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Kolors-MZ
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Impact-Pack
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\Comfyui_CXH_joy_caption
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\images-grid-comfy-plugin
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\comfyui-various
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\comfy-image-saver
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Custom-Scripts
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI_UltimateSDUpscale
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\mikey_nodes
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI_essentials
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\OneButtonPrompt
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds (IMPORT FAILED): D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI_bitsandbytes_NF4
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds (IMPORT FAILED): D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-BiRefNet-ZHO
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-segment-anything-2
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-KJNodes
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-DepthAnythingV2
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Allor
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\comfyui-sixgod_prompt
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-GGUF
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Advanced-ControlNet
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Florence2
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\Derfuu_ComfyUI_ModdedNodes
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\comfyui_bmad_nodes
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-KwaiKolorsWrapper
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\rgthree-comfy
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Marigold
2024-09-19 20:08:53,491 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-IC-Light
2024-09-19 20:08:53,491 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\All-IN-ONE-style
2024-09-19 20:08:53,491 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI_ExtraModels
2024-09-19 20:08:53,491 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\comfyui-workspace-manager
2024-09-19 20:08:53,491 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI_Comfyroll_CustomNodes
2024-09-19 20:08:53,491 - root - INFO - 0.0 seconds (IMPORT FAILED): D:\SD\ComfyUI\ComfyUI\custom_nodes\eden_comfy_pipelines
2024-09-19 20:08:53,491 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Inspire-Pack
2024-09-19 20:08:53,491 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\comfy_mtb
2024-09-19 20:08:53,491 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-IDM-VTON
2024-09-19 20:08:53,503 - root - INFO - 0.1 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\comfyui_segment_anything
2024-09-19 20:08:53,503 - root - INFO - 0.1 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-BrushNet
2024-09-19 20:08:53,503 - root - INFO - 0.1 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Long-CLIP
2024-09-19 20:08:53,503 - root - INFO - 0.1 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI_LayerStyle
2024-09-19 20:08:53,503 - root - INFO - 0.1 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI_NYJY
2024-09-19 20:08:53,503 - root - INFO - 0.1 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Gemini
2024-09-19 20:08:53,503 - root - INFO - 0.2 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-kimFilter
2024-09-19 20:08:53,503 - root - INFO - 0.2 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI_smZNodes
2024-09-19 20:08:53,503 - root - INFO - 0.2 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Anyline
2024-09-19 20:08:53,503 - root - INFO - 0.2 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-SUPIR
2024-09-19 20:08:53,504 - root - INFO - 0.3 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\comfyui-ollama
2024-09-19 20:08:53,504 - root - INFO - 0.3 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Easy-Use
2024-09-19 20:08:53,504 - root - INFO - 0.4 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Manager
2024-09-19 20:08:53,504 - root - INFO - 0.5 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\batchImg-rembg-ComfyUI-nodes
2024-09-19 20:08:53,504 - root - INFO - 0.9 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\comfyui-mixlab-nodes
2024-09-19 20:08:53,504 - root - INFO - 0.9 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\was-node-suite-comfyui
2024-09-19 20:08:53,504 - root - INFO - 1.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUi-Ollama-YN
2024-09-19 20:08:53,504 - root - INFO - 1.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI_Custom_Nodes_AlekPet
2024-09-19 20:08:53,504 - root - INFO - 2.3 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\comfyui-art-venture
2024-09-19 20:08:53,504 - root - INFO - 2.4 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Inspyrenet-Rembg
2024-09-19 20:08:53,504 - root - INFO - 7.2 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-BiRefNet-Hugo
2024-09-19 20:08:53,504 - root - INFO -
2024-09-19 20:08:53,518 - root - INFO -
2024-09-19 20:08:53,518 - root - INFO -
Starting server
2024-09-19 20:08:53,518 - root - INFO - To see the GUI go to: http://192.168.21.5:8188 or http://127.0.0.1:8188
2024-09-19 20:08:53,518 - root - INFO - To see the GUI go to: https://192.168.21.5:8189 or https://127.0.0.1:8189
2024-09-19 20:09:52,250 - root - INFO - got prompt
2024-09-19 20:09:56,702 - root - ERROR - !!! Exception during processing !!! rope_scaling must be a dictionary with two fields, type and factor, got {'factor': 8.0, 'high_freq_factor': 4.0, 'low_freq_factor': 1.0, 'original_max_position_embeddings': 8192, 'rope_type': 'llama3'}
2024-09-19 20:09:56,703 - root - ERROR - Traceback (most recent call last):
File "D:\SD\ComfyUI\ComfyUI\execution.py", line 323, in execute
output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\ComfyUI\execution.py", line 198, in get_output_data
return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\ComfyUI\execution.py", line 169, in _map_node_over_list
process_inputs(input_dict, i)
File "D:\SD\ComfyUI\ComfyUI\execution.py", line 158, in process_inputs
results.append(getattr(obj, func)(**inputs))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\ComfyUI\custom_nodes\Comfyui_CXH_joy_caption\Joy_caption_node.py", line 135, in gen
self.loadCheckPoint()
File "D:\SD\ComfyUI\ComfyUI\custom_nodes\Comfyui_CXH_joy_caption\Joy_caption_node.py", line 110, in loadCheckPoint
text_model = AutoModelForCausalLM.from_pretrained(MODEL_PATH, device_map="auto",trust_remote_code=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\models\auto\auto_factory.py", line 523, in from_pretrained
config, kwargs = AutoConfig.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\models\auto\configuration_auto.py", line 952, in from_pretrained
return config_class.from_dict(config_dict, **unused_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\configuration_utils.py", line 761, in from_dict
config = cls(**config_dict)
^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\models\llama\configuration_llama.py", line 161, in init
self._rope_scaling_validation()
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\models\llama\configuration_llama.py", line 181, in _rope_scaling_validation
raise ValueError(
ValueError: rope_scaling must be a dictionary with two fields, type and factor, got {'factor': 8.0, 'high_freq_factor': 4.0, 'low_freq_factor': 1.0, 'original_max_position_embeddings': 8192, 'rope_type': 'llama3'}
2024-09-19 20:09:56,704 - root - INFO - Prompt executed in 4.42 seconds
2024-09-19 20:10:01,209 - root - INFO - got prompt
2024-09-19 20:10:03,297 - root - ERROR - !!! Exception during processing !!! D:\SD\ComfyUI\ComfyUI\models\LLM\Meta-Llama-3.1-8B does not appear to have a file named config.json. Checkout 'https://huggingface.co/D:\SD\ComfyUI\ComfyUI\models\LLM\Meta-Llama-3.1-8B/tree/None' for available files.
2024-09-19 20:10:03,402 - root - ERROR - Traceback (most recent call last):
File "D:\SD\ComfyUI\ComfyUI\execution.py", line 323, in execute
output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\ComfyUI\execution.py", line 198, in get_output_data
return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\ComfyUI\execution.py", line 169, in _map_node_over_list
process_inputs(input_dict, i)
File "D:\SD\ComfyUI\ComfyUI\execution.py", line 158, in process_inputs
results.append(getattr(obj, func)(**inputs))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\ComfyUI\custom_nodes\Comfyui_CXH_joy_caption\Joy_caption_node.py", line 135, in gen
self.loadCheckPoint()
File "D:\SD\ComfyUI\ComfyUI\custom_nodes\Comfyui_CXH_joy_caption\Joy_caption_node.py", line 107, in loadCheckPoint
tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH,use_fast=False)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\models\auto\tokenization_auto.py", line 819, in from_pretrained
config = AutoConfig.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\models\auto\configuration_auto.py", line 928, in from_pretrained
config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\configuration_utils.py", line 631, in get_config_dict
config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\configuration_utils.py", line 686, in _get_config_dict
resolved_config_file = cached_file(
^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\utils\hub.py", line 369, in cached_file
raise EnvironmentError(
OSError: D:\SD\ComfyUI\ComfyUI\models\LLM\Meta-Llama-3.1-8B does not appear to have a file named config.json. Checkout 'https://huggingface.co/D:\SD\ComfyUI\ComfyUI\models\LLM\Meta-Llama-3.1-8B/tree/None' for available files.
2024-09-19 20:10:03,411 - root - INFO - Prompt executed in 2.16 seconds
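The OSError above is a different problem from the rope_scaling one: the folder D:\SD\ComfyUI\ComfyUI\models\LLM\Meta-Llama-3.1-8B exists but is not a complete Hugging Face snapshot, so AutoTokenizer.from_pretrained cannot find config.json. A small sanity check (the expected file names are an assumption about a typical Llama snapshot):

```python
# Verify the local Llama folder referenced in the traceback actually holds a
# config.json and tokenizer files (the file list below is an assumption).
from pathlib import Path

model_dir = Path(r"D:\SD\ComfyUI\ComfyUI\models\LLM\Meta-Llama-3.1-8B")
for name in ("config.json", "tokenizer.json", "tokenizer_config.json"):
    print(name, "found" if (model_dir / name).is_file() else "MISSING")
```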
2024-09-19 20:11:23,322 - root - INFO - got prompt
2024-09-19 20:11:25,820 - root - ERROR - !!! Exception during processing !!! rope_scaling must be a dictionary with two fields, type and factor, got {'factor': 8.0, 'high_freq_factor': 4.0, 'low_freq_factor': 1.0, 'original_max_position_embeddings': 8192, 'rope_type': 'llama3'}
2024-09-19 20:11:25,820 - root - ERROR - Traceback (most recent call last):
File "D:\SD\ComfyUI\ComfyUI\execution.py", line 323, in execute
output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\ComfyUI\execution.py", line 198, in get_output_data
return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\ComfyUI\execution.py", line 169, in _map_node_over_list
process_inputs(input_dict, i)
File "D:\SD\ComfyUI\ComfyUI\execution.py", line 158, in process_inputs
results.append(getattr(obj, func)(**inputs))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\ComfyUI\custom_nodes\Comfyui_CXH_joy_caption\Joy_caption_node.py", line 135, in gen
self.loadCheckPoint()
File "D:\SD\ComfyUI\ComfyUI\custom_nodes\Comfyui_CXH_joy_caption\Joy_caption_node.py", line 110, in loadCheckPoint
text_model = AutoModelForCausalLM.from_pretrained(MODEL_PATH, device_map="auto",trust_remote_code=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\models\auto\auto_factory.py", line 523, in from_pretrained
config, kwargs = AutoConfig.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\models\auto\configuration_auto.py", line 952, in from_pretrained
return config_class.from_dict(config_dict, **unused_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\configuration_utils.py", line 761, in from_dict
config = cls(**config_dict)
^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\models\llama\configuration_llama.py", line 161, in init
self._rope_scaling_validation()
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\models\llama\configuration_llama.py", line 181, in _rope_scaling_validation
raise ValueError(
ValueError: rope_scaling must be a dictionary with two fields, type and factor, got {'factor': 8.0, 'high_freq_factor': 4.0, 'low_freq_factor': 1.0, 'original_max_position_embeddings': 8192, 'rope_type': 'llama3'}
2024-09-19 20:11:25,822 - root - INFO - Prompt executed in 2.46 seconds
2024-09-19 20:11:42,913 - root - INFO - got prompt
2024-09-19 20:11:45,240 - root - ERROR - !!! Exception during processing !!! rope_scaling must be a dictionary with two fields, type and factor, got {'factor': 8.0, 'high_freq_factor': 4.0, 'low_freq_factor': 1.0, 'original_max_position_embeddings': 8192, 'rope_type': 'llama3'}
2024-09-19 20:11:45,240 - root - ERROR - Traceback (most recent call last):
File "D:\SD\ComfyUI\ComfyUI\execution.py", line 323, in execute
output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\ComfyUI\execution.py", line 198, in get_output_data
return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\ComfyUI\execution.py", line 169, in _map_node_over_list
process_inputs(input_dict, i)
File "D:\SD\ComfyUI\ComfyUI\execution.py", line 158, in process_inputs
results.append(getattr(obj, func)(**inputs))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\ComfyUI\custom_nodes\Comfyui_CXH_joy_caption\Joy_caption_node.py", line 162, in gen
joy_pipeline.parent.loadCheckPoint()
File "D:\SD\ComfyUI\ComfyUI\custom_nodes\Comfyui_CXH_joy_caption\Joy_caption_node.py", line 110, in loadCheckPoint
text_model = AutoModelForCausalLM.from_pretrained(MODEL_PATH, device_map="auto",trust_remote_code=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\models\auto\auto_factory.py", line 523, in from_pretrained
config, kwargs = AutoConfig.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\models\auto\configuration_auto.py", line 952, in from_pretrained
return config_class.from_dict(config_dict, **unused_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\configuration_utils.py", line 761, in from_dict
config = cls(**config_dict)
^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\models\llama\configuration_llama.py", line 161, in init
self._rope_scaling_validation()
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\models\llama\configuration_llama.py", line 181, in _rope_scaling_validation
raise ValueError(
ValueError: rope_scaling must be a dictionary with two fields, type and factor, got {'factor': 8.0, 'high_freq_factor': 4.0, 'low_freq_factor': 1.0, 'original_max_position_embeddings': 8192, 'rope_type': 'llama3'}
2024-09-19 20:11:45,241 - root - INFO - Prompt executed in 2.30 seconds
2024-09-19 20:12:56,006 - root - INFO - got prompt
2024-09-19 20:12:58,164 - root - ERROR - !!! Exception during processing !!! D:\SD\ComfyUI\ComfyUI\models\LLM\Meta-Llama-3.1-8B does not appear to have a file named config.json. Checkout 'https://huggingface.co/D:\SD\ComfyUI\ComfyUI\models\LLM\Meta-Llama-3.1-8B/tree/None' for available files.
2024-09-19 20:12:58,164 - root - ERROR - Traceback (most recent call last):
File "D:\SD\ComfyUI\ComfyUI\execution.py", line 323, in execute
output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\ComfyUI\execution.py", line 198, in get_output_data
return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\ComfyUI\execution.py", line 169, in _map_node_over_list
process_inputs(input_dict, i)
File "D:\SD\ComfyUI\ComfyUI\execution.py", line 158, in process_inputs
results.append(getattr(obj, func)(**inputs))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\ComfyUI\custom_nodes\Comfyui_CXH_joy_caption\Joy_caption_node.py", line 135, in gen
self.loadCheckPoint()
File "D:\SD\ComfyUI\ComfyUI\custom_nodes\Comfyui_CXH_joy_caption\Joy_caption_node.py", line 107, in loadCheckPoint
tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH,use_fast=False)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\models\auto\tokenization_auto.py", line 819, in from_pretrained
config = AutoConfig.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\models\auto\configuration_auto.py", line 928, in from_pretrained
config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\configuration_utils.py", line 631, in get_config_dict
config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\configuration_utils.py", line 686, in _get_config_dict
resolved_config_file = cached_file(
^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\utils\hub.py", line 369, in cached_file
raise EnvironmentError(
OSError: D:\SD\ComfyUI\ComfyUI\models\LLM\Meta-Llama-3.1-8B does not appear to have a file named config.json. Checkout 'https://huggingface.co/D:\SD\ComfyUI\ComfyUI\models\LLM\Meta-Llama-3.1-8B/tree/None' for available files.
2024-09-19 20:12:58,165 - root - INFO - Prompt executed in 2.13 seconds
2024-09-19 20:13:07,841 - root - INFO - got prompt
2024-09-19 20:13:10,698 - root - ERROR - !!! Exception during processing !!! rope_scaling must be a dictionary with two fields, type and factor, got {'factor': 8.0, 'high_freq_factor': 4.0, 'low_freq_factor': 1.0, 'original_max_position_embeddings': 8192, 'rope_type': 'llama3'}
2024-09-19 20:13:10,699 - root - ERROR - Traceback (most recent call last):
File "D:\SD\ComfyUI\ComfyUI\execution.py", line 323, in execute
output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\ComfyUI\execution.py", line 198, in get_output_data
return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\ComfyUI\execution.py", line 169, in _map_node_over_list
process_inputs(input_dict, i)
File "D:\SD\ComfyUI\ComfyUI\execution.py", line 158, in process_inputs
results.append(getattr(obj, func)(**inputs))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\ComfyUI\custom_nodes\Comfyui_CXH_joy_caption\Joy_caption_node.py", line 135, in gen
self.loadCheckPoint()
File "D:\SD\ComfyUI\ComfyUI\custom_nodes\Comfyui_CXH_joy_caption\Joy_caption_node.py", line 110, in loadCheckPoint
text_model = AutoModelForCausalLM.from_pretrained(MODEL_PATH, device_map="auto",trust_remote_code=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\models\auto\auto_factory.py", line 523, in from_pretrained
config, kwargs = AutoConfig.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\models\auto\configuration_auto.py", line 952, in from_pretrained
return config_class.from_dict(config_dict, **unused_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\configuration_utils.py", line 761, in from_dict
config = cls(**config_dict)
^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\models\llama\configuration_llama.py", line 161, in init
self._rope_scaling_validation()
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\models\llama\configuration_llama.py", line 181, in _rope_scaling_validation
raise ValueError(
ValueError: rope_scaling must be a dictionary with two fields, type and factor, got {'factor': 8.0, 'high_freq_factor': 4.0, 'low_freq_factor': 1.0, 'original_max_position_embeddings': 8192, 'rope_type': 'llama3'}
2024-09-19 20:13:10,700 - root - INFO - Prompt executed in 2.83 seconds
2024-09-19 20:15:26,185 - root - INFO - got prompt
2024-09-19 20:15:28,313 - root - ERROR - !!! Exception during processing !!! D:\SD\ComfyUI\ComfyUI\models\LLM\Meta-Llama-3.1-8B does not appear to have a file named config.json. Checkout 'https://huggingface.co/D:\SD\ComfyUI\ComfyUI\models\LLM\Meta-Llama-3.1-8B/tree/None' for available files.
2024-09-19 20:15:28,314 - root - ERROR - Traceback (most recent call last):
File "D:\SD\ComfyUI\ComfyUI\execution.py", line 323, in execute
output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\ComfyUI\execution.py", line 198, in get_output_data
return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\ComfyUI\execution.py", line 169, in _map_node_over_list
process_inputs(input_dict, i)
File "D:\SD\ComfyUI\ComfyUI\execution.py", line 158, in process_inputs
results.append(getattr(obj, func)(**inputs))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\ComfyUI\custom_nodes\Comfyui_CXH_joy_caption\Joy_caption_node.py", line 135, in gen
self.loadCheckPoint()
File "D:\SD\ComfyUI\ComfyUI\custom_nodes\Comfyui_CXH_joy_caption\Joy_caption_node.py", line 107, in loadCheckPoint
tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH,use_fast=False)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\models\auto\tokenization_auto.py", line 819, in from_pretrained
config = AutoConfig.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\models\auto\configuration_auto.py", line 928, in from_pretrained
config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\configuration_utils.py", line 631, in get_config_dict
config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\configuration_utils.py", line 686, in _get_config_dict
resolved_config_file = cached_file(
^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\utils\hub.py", line 369, in cached_file
raise EnvironmentError(
OSError: D:\SD\ComfyUI\ComfyUI\models\LLM\Meta-Llama-3.1-8B does not appear to have a file named config.json. Checkout 'https://huggingface.co/D:\SD\ComfyUI\ComfyUI\models\LLM\Meta-Llama-3.1-8B/tree/None' for available files.
2024-09-19 20:15:28,315 - root - INFO - Prompt executed in 2.10 seconds
2024-09-19 20:15:31,171 - root - INFO - got prompt
2024-09-19 20:15:34,019 - root - ERROR - !!! Exception during processing !!! rope_scaling must be a dictionary with two fields, type and factor, got {'factor': 8.0, 'high_freq_factor': 4.0, 'low_freq_factor': 1.0, 'original_max_position_embeddings': 8192, 'rope_type': 'llama3'}
2024-09-19 20:15:34,019 - root - ERROR - Traceback (most recent call last):
File "D:\SD\ComfyUI\ComfyUI\execution.py", line 323, in execute
output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\ComfyUI\execution.py", line 198, in get_output_data
return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\ComfyUI\execution.py", line 169, in _map_node_over_list
process_inputs(input_dict, i)
File "D:\SD\ComfyUI\ComfyUI\execution.py", line 158, in process_inputs
results.append(getattr(obj, func)(**inputs))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\ComfyUI\custom_nodes\Comfyui_CXH_joy_caption\Joy_caption_node.py", line 135, in gen
self.loadCheckPoint()
File "D:\SD\ComfyUI\ComfyUI\custom_nodes\Comfyui_CXH_joy_caption\Joy_caption_node.py", line 110, in loadCheckPoint
text_model = AutoModelForCausalLM.from_pretrained(MODEL_PATH, device_map="auto",trust_remote_code=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\models\auto\auto_factory.py", line 523, in from_pretrained
config, kwargs = AutoConfig.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\models\auto\configuration_auto.py", line 952, in from_pretrained
return config_class.from_dict(config_dict, **unused_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\configuration_utils.py", line 761, in from_dict
config = cls(**config_dict)
^^^^^^^^^^^^^^^^^^
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\models\llama\configuration_llama.py", line 161, in init
self._rope_scaling_validation()
File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\models\llama\configuration_llama.py", line 181, in _rope_scaling_validation
raise ValueError(
ValueError: rope_scaling must be a dictionary with two fields, type and factor, got {'factor': 8.0, 'high_freq_factor': 4.0, 'low_freq_factor': 1.0, 'original_max_position_embeddings': 8192, 'rope_type': 'llama3'}
2024-09-19 20:15:34,021 - root - INFO - Prompt executed in 2.81 seconds
## Attached Workflow
Please make sure that workflow does not contain any sensitive information such as API keys or passwords.
{"last_node_id":5,"last_link_id":4,"nodes":[{"id":2,"type":"Joy_caption","pos":{"0":828,"1":498},"size":{"0":400,"1":200},"flags":{},"order":2,"mode":0,"inputs":[{"name":"joy_pipeline","type":"JoyPipeline","link":4,"label":"JoyCaption"},{"name":"image","type":"IMAGE","link":2,"slot_index":1,"label":"图像"}],"outputs":[{"name":"STRING","type":"STRING","links":[3],"slot_index":0,"shape":3,"label":"字符串"}],"properties":{"Node name for S&R":"Joy_caption"},"widgets_values":["A descriptive caption for this image",300,0.5,false,true]},{"id":4,"type":"easy showAnything","pos":{"0":1255,"1":502},"size":{"0":356.4357604980469,"1":250.48460388183594},"flags":{},"order":3,"mode":0,"inputs":[{"name":"anything","type":"","link":3,"label":"输入任何"}],"outputs":[],"properties":{"Node name for S&R":"easy showAnything"},"widgets_values":["of a young girl standing on a lush green lawn, surrounded by tall trees with budding leaves, under a cloudy sky. The girl, approximately 3-5 years old, has light blonde hair and a cheerful expression, smiling with her teeth showing. She wears a white, short-sleeved dress adorned with colorful floral appliqués in shades of pink, yellow, and orange, and a matching white hat with a large pink flower on the side. Her dress has a full skirt and is knee-length, with delicate lace trim along the hem. She also wears white tights and white shoes, enhancing the purity of her attire. In her hands, she carries a bouquet of fresh flowers, including yellow, pink, and white varieties, held close to her chest. The background is softly blurred, emphasizing the girl as the focal point, with the trees and sky providing a serene, natural setting. The overall mood is joyful and whimsical, capturing the innocence and beauty of childhood."]},{"id":3,"type":"LoadImage","pos":{"0":198,"1":577},"size":{"0":570.2863159179688,"1":474.0776062011719},"flags":{},"order":0,"mode":0,"inputs":[],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[2],"shape":3,"label":"图像"},{"name":"MASK","type":"MASK","links":null,"shape":3,"label":"遮罩"}],"properties":{"Node name for S&R":"LoadImage"},"widgets_values":["02368DB0BFA6AA367DC096A4738D5198.png","image"]},{"id":5,"type":"Joy_caption_load","pos":{"0":454,"1":446},"size":{"0":315,"1":58},"flags":{},"order":1,"mode":0,"inputs":[],"outputs":[{"name":"JoyPipeline","type":"JoyPipeline","links":[4],"slot_index":0,"shape":3,"label":"JoyCaption"}],"properties":{"Node name for S&R":"Joy_caption_load"},"widgets_values":["unsloth/Meta-Llama-3.1-8B-bnb-4bit"]}],"links":[[2,3,0,2,1,"IMAGE"],[3,2,0,4,0,""],[4,5,0,2,0,"JoyPipeline"]],"groups":[],"config":{},"extra":{"ds":{"scale":1.1671841070450009,"offset":[-90.3300588520564,-146.23972434239784]}},"version":0.4}
## Additional Context
(Please add any additional context or steps to reproduce the error here)
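One direction worth trying (an assumption based on the error, not a confirmed fix from the node author): either upgrade `transformers` inside the embedded Python to a release that understands `rope_type: 'llama3'`, or, as a stop-gap, rewrite the checkpoint's `rope_scaling` into the legacy two-field form that the old validator accepts. A hedged sketch of the second option follows; back up config.json first, and note it changes long-context RoPE behaviour, so upgrading transformers is the cleaner route.

```python
# Stop-gap sketch (assumed path): collapse the Llama 3.1 rope_scaling dict into
# the legacy {"type", "factor"} shape so an older LlamaConfig stops rejecting it.
import json
from pathlib import Path

cfg_path = Path(r"D:\SD\ComfyUI\ComfyUI\models\LLM\Meta-Llama-3.1-8B-bnb-4bit\config.json")  # assumed
cfg = json.loads(cfg_path.read_text(encoding="utf-8"))

rope = cfg.get("rope_scaling") or {}
if "rope_type" in rope:  # new-style Llama 3.1 layout
    cfg["rope_scaling"] = {"type": "dynamic", "factor": rope.get("factor", 8.0)}
    cfg_path.write_text(json.dumps(cfg, indent=2), encoding="utf-8")
    print("rope_scaling rewritten to the legacy format")
```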
2024-09-19 20:08:52,533 - root - WARNING - Cannot import D:\SD\ComfyUI\ComfyUI\custom_nodes\eden_comfy_pipelines module for custom nodes: No module named 'sklearn'
2024-09-19 20:08:53,486 - root - INFO - Import times for custom nodes:
2024-09-19 20:08:53,486 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\Mask Blackener.py
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\Image List Converter.py
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\get_size_resize.py
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\Mask Applier and Combiner.py
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\example_node.py
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\comfyui-portrait-master-zh-cn
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\sdxl_utility.py
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\websocket_image_save.py
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\AIGODLIKE-ComfyUI-Translation
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Image-Selector
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\comfyUI_TJ_NormalLighting
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-CascadeResolutions
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-ZeroShot-MTrans
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\hakkun_nodes.py
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\SD-Latent-Interposer
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Inpaint-CropAndStitch
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\Comfyui_TTP_CN_Preprocessor
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI_IPAdapter_plus
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Ollama-Describer
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ControlNet-LLLite-ComfyUI
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI_ADV_CLIP_emb
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI_Noise
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUi_NNLatentUpscale
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\efficiency-nodes-comfyui
2024-09-19 20:08:53,487 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\Comfyui_TTP_Toolset
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI_PRNodes
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\cg-use-everywhere
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\FreeU_Advanced
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\klinter_nodes
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI_TiledKSampler
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Chibi-Nodes
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Eagle-PNGInfo
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\stability-ComfyUI-nodes
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\Comfyui_joytag
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-BRIA_AI-RMBG
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\sd-perturbed-attention
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-post-processing-nodes
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\comfyui_controlnet_aux
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\Comfy-Topaz
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\wlsh_nodes
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI_Dave_CustomNode
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-WD14-Tagger
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-AutomaticCFG
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyLiterals
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI_experiments
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-ELLA
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\masquerade-nodes-comfyui
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\MistoControlNet-Flux-dev
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Dickson-Nodes
2024-09-19 20:08:53,488 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\comfyui-inpaint-nodes
2024-09-19 20:08:53,489 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-CUP
2024-09-19 20:08:53,489 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\AuraSR-ComfyUI
2024-09-19 20:08:53,489 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Text_Image-Composite
2024-09-19 20:08:53,489 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-OllamaGemini
2024-09-19 20:08:53,489 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-TiledDiffusion
2024-09-19 20:08:53,489 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\PowerNoiseSuite
2024-09-19 20:08:53,489 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\Comfyui-StableSR
2024-09-19 20:08:53,489 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\comfyui-nodes-docs
2024-09-19 20:08:53,489 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-DDColor
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Kolors-MZ
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Impact-Pack
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\Comfyui_CXH_joy_caption
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\images-grid-comfy-plugin
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\comfyui-various
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\comfy-image-saver
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Custom-Scripts
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI_UltimateSDUpscale
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\mikey_nodes
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI_essentials
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\OneButtonPrompt
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds (IMPORT FAILED): D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI_bitsandbytes_NF4
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds (IMPORT FAILED): D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-BiRefNet-ZHO
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-segment-anything-2
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-KJNodes
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-DepthAnythingV2
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Allor
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\comfyui-sixgod_prompt
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-GGUF
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Advanced-ControlNet
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Florence2
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\Derfuu_ComfyUI_ModdedNodes
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\comfyui_bmad_nodes
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-KwaiKolorsWrapper
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\rgthree-comfy
2024-09-19 20:08:53,490 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Marigold
2024-09-19 20:08:53,491 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-IC-Light
2024-09-19 20:08:53,491 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\All-IN-ONE-style
2024-09-19 20:08:53,491 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI_ExtraModels
2024-09-19 20:08:53,491 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\comfyui-workspace-manager
2024-09-19 20:08:53,491 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI_Comfyroll_CustomNodes
2024-09-19 20:08:53,491 - root - INFO - 0.0 seconds (IMPORT FAILED): D:\SD\ComfyUI\ComfyUI\custom_nodes\eden_comfy_pipelines
2024-09-19 20:08:53,491 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Inspire-Pack
2024-09-19 20:08:53,491 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\comfy_mtb
2024-09-19 20:08:53,491 - root - INFO - 0.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-IDM-VTON
2024-09-19 20:08:53,503 - root - INFO - 0.1 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\comfyui_segment_anything
2024-09-19 20:08:53,503 - root - INFO - 0.1 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-BrushNet
2024-09-19 20:08:53,503 - root - INFO - 0.1 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Long-CLIP
2024-09-19 20:08:53,503 - root - INFO - 0.1 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI_LayerStyle
2024-09-19 20:08:53,503 - root - INFO - 0.1 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI_NYJY
2024-09-19 20:08:53,503 - root - INFO - 0.1 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Gemini
2024-09-19 20:08:53,503 - root - INFO - 0.2 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-kimFilter
2024-09-19 20:08:53,503 - root - INFO - 0.2 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI_smZNodes
2024-09-19 20:08:53,503 - root - INFO - 0.2 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Anyline
2024-09-19 20:08:53,503 - root - INFO - 0.2 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-SUPIR
2024-09-19 20:08:53,504 - root - INFO - 0.3 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\comfyui-ollama
2024-09-19 20:08:53,504 - root - INFO - 0.3 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Easy-Use
2024-09-19 20:08:53,504 - root - INFO - 0.4 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Manager
2024-09-19 20:08:53,504 - root - INFO - 0.5 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\batchImg-rembg-ComfyUI-nodes
2024-09-19 20:08:53,504 - root - INFO - 0.9 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\comfyui-mixlab-nodes
2024-09-19 20:08:53,504 - root - INFO - 0.9 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\was-node-suite-comfyui
2024-09-19 20:08:53,504 - root - INFO - 1.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUi-Ollama-YN
2024-09-19 20:08:53,504 - root - INFO - 1.0 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI_Custom_Nodes_AlekPet
2024-09-19 20:08:53,504 - root - INFO - 2.3 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\comfyui-art-venture
2024-09-19 20:08:53,504 - root - INFO - 2.4 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Inspyrenet-Rembg
2024-09-19 20:08:53,504 - root - INFO - 7.2 seconds: D:\SD\ComfyUI\ComfyUI\custom_nodes\ComfyUI-BiRefNet-Hugo
2024-09-19 20:08:53,504 - root - INFO -
2024-09-19 20:08:53,518 - root - INFO -
2024-09-19 20:08:53,518 - root - INFO - Starting server
2024-09-19 20:08:53,518 - root - INFO - To see the GUI go to: http://192.168.21.5:8188 or http://127.0.0.1:8188
2024-09-19 20:08:53,518 - root - INFO - To see the GUI go to: https://192.168.21.5:8189 or https://127.0.0.1:8189
2024-09-19 20:09:52,250 - root - INFO - got prompt
2024-09-19 20:09:56,702 - root - ERROR - !!! Exception during processing !!! `rope_scaling` must be a dictionary with two fields, `type` and `factor`, got {'factor': 8.0, 'high_freq_factor': 4.0, 'low_freq_factor': 1.0, 'original_max_position_embeddings': 8192, 'rope_type': 'llama3'}
2024-09-19 20:09:56,703 - root - ERROR - Traceback (most recent call last):
  File "D:\SD\ComfyUI\ComfyUI\execution.py", line 323, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "D:\SD\ComfyUI\ComfyUI\execution.py", line 198, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "D:\SD\ComfyUI\ComfyUI\execution.py", line 169, in _map_node_over_list
    process_inputs(input_dict, i)
  File "D:\SD\ComfyUI\ComfyUI\execution.py", line 158, in process_inputs
    results.append(getattr(obj, func)(**inputs))
  File "D:\SD\ComfyUI\ComfyUI\custom_nodes\Comfyui_CXH_joy_caption\Joy_caption_node.py", line 135, in gen
    self.loadCheckPoint()
  File "D:\SD\ComfyUI\ComfyUI\custom_nodes\Comfyui_CXH_joy_caption\Joy_caption_node.py", line 110, in loadCheckPoint
    text_model = AutoModelForCausalLM.from_pretrained(MODEL_PATH, device_map="auto",trust_remote_code=True)
  File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\models\auto\auto_factory.py", line 523, in from_pretrained
    config, kwargs = AutoConfig.from_pretrained(
  File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\models\auto\configuration_auto.py", line 952, in from_pretrained
    return config_class.from_dict(config_dict, **unused_kwargs)
  File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\configuration_utils.py", line 761, in from_dict
    config = cls(**config_dict)
  File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\models\llama\configuration_llama.py", line 161, in __init__
    self._rope_scaling_validation()
  File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\models\llama\configuration_llama.py", line 181, in _rope_scaling_validation
    raise ValueError(
ValueError: `rope_scaling` must be a dictionary with two fields, `type` and `factor`, got {'factor': 8.0, 'high_freq_factor': 4.0, 'low_freq_factor': 1.0, 'original_max_position_embeddings': 8192, 'rope_type': 'llama3'}
2024-09-19 20:09:56,704 - root - INFO - Prompt executed in 4.42 seconds
2024-09-19 20:10:01,209 - root - INFO - got prompt
2024-09-19 20:10:03,297 - root - ERROR - !!! Exception during processing !!! D:\SD\ComfyUI\ComfyUI\models\LLM\Meta-Llama-3.1-8B does not appear to have a file named config.json. Checkout 'https://huggingface.co/D:\SD\ComfyUI\ComfyUI\models\LLM\Meta-Llama-3.1-8B/tree/None' for available files.
2024-09-19 20:10:03,402 - root - ERROR - Traceback (most recent call last):
  File "D:\SD\ComfyUI\ComfyUI\execution.py", line 323, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "D:\SD\ComfyUI\ComfyUI\execution.py", line 198, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "D:\SD\ComfyUI\ComfyUI\execution.py", line 169, in _map_node_over_list
    process_inputs(input_dict, i)
  File "D:\SD\ComfyUI\ComfyUI\execution.py", line 158, in process_inputs
    results.append(getattr(obj, func)(**inputs))
  File "D:\SD\ComfyUI\ComfyUI\custom_nodes\Comfyui_CXH_joy_caption\Joy_caption_node.py", line 135, in gen
    self.loadCheckPoint()
  File "D:\SD\ComfyUI\ComfyUI\custom_nodes\Comfyui_CXH_joy_caption\Joy_caption_node.py", line 107, in loadCheckPoint
    tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH,use_fast=False)
  File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\models\auto\tokenization_auto.py", line 819, in from_pretrained
    config = AutoConfig.from_pretrained(
  File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\models\auto\configuration_auto.py", line 928, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\configuration_utils.py", line 631, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\configuration_utils.py", line 686, in _get_config_dict
    resolved_config_file = cached_file(
  File "D:\SD\ComfyUI\python_embeded\Lib\site-packages\transformers\utils\hub.py", line 369, in cached_file
    raise EnvironmentError(
OSError: D:\SD\ComfyUI\ComfyUI\models\LLM\Meta-Llama-3.1-8B does not appear to have a file named config.json. Checkout 'https://huggingface.co/D:\SD\ComfyUI\ComfyUI\models\LLM\Meta-Llama-3.1-8B/tree/None' for available files.
2024-09-19 20:10:03,411 - root - INFO - Prompt executed in 2.16 seconds
2024-09-19 20:11:23,322 - root - INFO - got prompt
2024-09-19 20:11:25,820 - root - ERROR - !!! Exception during processing !!! `rope_scaling` must be a dictionary with two fields, `type` and `factor`, got {'factor': 8.0, 'high_freq_factor': 4.0, 'low_freq_factor': 1.0, 'original_max_position_embeddings': 8192, 'rope_type': 'llama3'}
2024-09-19 20:11:25,822 - root - INFO - Prompt executed in 2.46 seconds
2024-09-19 20:11:42,913 - root - INFO - got prompt
2024-09-19 20:11:45,240 - root - ERROR - !!! Exception during processing !!! `rope_scaling` must be a dictionary with two fields, `type` and `factor`, got {'factor': 8.0, 'high_freq_factor': 4.0, 'low_freq_factor': 1.0, 'original_max_position_embeddings': 8192, 'rope_type': 'llama3'}
2024-09-19 20:11:45,241 - root - INFO - Prompt executed in 2.30 seconds
2024-09-19 20:12:56,006 - root - INFO - got prompt
2024-09-19 20:12:58,164 - root - ERROR - !!! Exception during processing !!! D:\SD\ComfyUI\ComfyUI\models\LLM\Meta-Llama-3.1-8B does not appear to have a file named config.json. Checkout 'https://huggingface.co/D:\SD\ComfyUI\ComfyUI\models\LLM\Meta-Llama-3.1-8B/tree/None' for available files.
2024-09-19 20:12:58,165 - root - INFO - Prompt executed in 2.13 seconds
2024-09-19 20:13:07,841 - root - INFO - got prompt
2024-09-19 20:13:10,698 - root - ERROR - !!! Exception during processing !!! `rope_scaling` must be a dictionary with two fields, `type` and `factor`, got {'factor': 8.0, 'high_freq_factor': 4.0, 'low_freq_factor': 1.0, 'original_max_position_embeddings': 8192, 'rope_type': 'llama3'}
2024-09-19 20:13:10,700 - root - INFO - Prompt executed in 2.83 seconds
2024-09-19 20:15:26,185 - root - INFO - got prompt
2024-09-19 20:15:28,313 - root - ERROR - !!! Exception during processing !!! D:\SD\ComfyUI\ComfyUI\models\LLM\Meta-Llama-3.1-8B does not appear to have a file named config.json. Checkout 'https://huggingface.co/D:\SD\ComfyUI\ComfyUI\models\LLM\Meta-Llama-3.1-8B/tree/None' for available files.
2024-09-19 20:15:28,315 - root - INFO - Prompt executed in 2.10 seconds
2024-09-19 20:15:31,171 - root - INFO - got prompt
2024-09-19 20:15:34,019 - root - ERROR - !!! Exception during processing !!! `rope_scaling` must be a dictionary with two fields, `type` and `factor`, got {'factor': 8.0, 'high_freq_factor': 4.0, 'low_freq_factor': 1.0, 'original_max_position_embeddings': 8192, 'rope_type': 'llama3'}
2024-09-19 20:15:34,021 - root - INFO - Prompt executed in 2.81 seconds
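Both exceptions above are raised from loadCheckPoint in Comfyui_CXH_joy_caption\Joy_caption_node.py. The `rope_scaling` ValueError is the usual symptom of a transformers build that predates Llama 3.1: its LlamaConfig validator only accepts a dict with exactly `type` and `factor`, while Meta-Llama-3.1-8B(-bnb-4bit) ships the newer layout with `rope_type: 'llama3'`. Upgrading the embedded transformers (for example `D:\SD\ComfyUI\python_embeded\python.exe -m pip install -U "transformers>=4.43"`) normally makes the stock config load. If upgrading is not an option, a commonly used stopgap is to trim the extra keys out of the local copy of the model's config.json. The snippet below is only a minimal sketch of that workaround: the folder name is an assumption based on this log, and the substitute `type` value does not reproduce the llama3 rope behaviour exactly.

```python
# Hypothetical stopgap: rewrite rope_scaling in a *local* config.json so that an older
# transformers LlamaConfig validator (which accepts only {'type', 'factor'}) will load it.
# The path below is assumed from this log; adjust it to the actual model folder.
import json
from pathlib import Path

config_path = Path(r"D:\SD\ComfyUI\ComfyUI\models\LLM\Meta-Llama-3.1-8B-bnb-4bit\config.json")

cfg = json.loads(config_path.read_text(encoding="utf-8"))
rope = cfg.get("rope_scaling") or {}

# Keep only the two fields the old validator allows. "dynamic" is a common substitute
# here, but it is an approximation, not the original llama3 scaling.
cfg["rope_scaling"] = {"type": "dynamic", "factor": rope.get("factor", 8.0)}

config_path.write_text(json.dumps(cfg, indent=2), encoding="utf-8")
print("patched rope_scaling in", config_path)
```

Upgrading transformers remains the cleaner route, since the unmodified Llama 3.1 config then loads as-is and keeps the intended long-context scaling.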
{"last_node_id":5,"last_link_id":4,"nodes":[{"id":2,"type":"Joy_caption","pos":{"0":828,"1":498},"size":{"0":400,"1":200},"flags":{},"order":2,"mode":0,"inputs":[{"name":"joy_pipeline","type":"JoyPipeline","link":4,"label":"JoyCaption"},{"name":"image","type":"IMAGE","link":2,"slot_index":1,"label":"图像"}],"outputs":[{"name":"STRING","type":"STRING","links":[3],"slot_index":0,"shape":3,"label":"字符串"}],"properties":{"Node name for S&R":"Joy_caption"},"widgets_values":["A descriptive caption for this image",300,0.5,false,true]},{"id":4,"type":"easy showAnything","pos":{"0":1255,"1":502},"size":{"0":356.4357604980469,"1":250.48460388183594},"flags":{},"order":3,"mode":0,"inputs":[{"name":"anything","type":"","link":3,"label":"输入任何"}],"outputs":[],"properties":{"Node name for S&R":"easy showAnything"},"widgets_values":["of a young girl standing on a lush green lawn, surrounded by tall trees with budding leaves, under a cloudy sky. The girl, approximately 3-5 years old, has light blonde hair and a cheerful expression, smiling with her teeth showing. She wears a white, short-sleeved dress adorned with colorful floral appliqués in shades of pink, yellow, and orange, and a matching white hat with a large pink flower on the side. Her dress has a full skirt and is knee-length, with delicate lace trim along the hem. She also wears white tights and white shoes, enhancing the purity of her attire. In her hands, she carries a bouquet of fresh flowers, including yellow, pink, and white varieties, held close to her chest. The background is softly blurred, emphasizing the girl as the focal point, with the trees and sky providing a serene, natural setting. The overall mood is joyful and whimsical, capturing the innocence and beauty of childhood."]},{"id":3,"type":"LoadImage","pos":{"0":198,"1":577},"size":{"0":570.2863159179688,"1":474.0776062011719},"flags":{},"order":0,"mode":0,"inputs":[],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[2],"shape":3,"label":"图像"},{"name":"MASK","type":"MASK","links":null,"shape":3,"label":"遮罩"}],"properties":{"Node name for S&R":"LoadImage"},"widgets_values":["02368DB0BFA6AA367DC096A4738D5198.png","image"]},{"id":5,"type":"Joy_caption_load","pos":{"0":454,"1":446},"size":{"0":315,"1":58},"flags":{},"order":1,"mode":0,"inputs":[],"outputs":[{"name":"JoyPipeline","type":"JoyPipeline","links":[4],"slot_index":0,"shape":3,"label":"JoyCaption"}],"properties":{"Node name for S&R":"Joy_caption_load"},"widgets_values":["unsloth/Meta-Llama-3.1-8B-bnb-4bit"]}],"links":[[2,3,0,2,1,"IMAGE"],[3,2,0,4,0,""],[4,5,0,2,0,"JoyPipeline"]],"groups":[],"config":{},"extra":{"ds":{"scale":1.1671841070450009,"offset":[-90.3300588520564,-146.23972434239784]}},"version":0.4}