StartHua / Comfyui_CXH_joy_caption

Recommended ComfyUI node workflow: Joy_caption + MiniCPMv2_6-prompt-generator + florence2

Joy_caption_load error #29

Open hexi2024hexi opened 2 months ago

hexi2024hexi commented 2 months ago

Error occurred when executing Joy_caption_load:

```
D:\ComfyUI\models\LLM\Meta-Llama-3.1-8B does not appear to have a file named config.json. Checkout 'https://huggingface.co/D:\ComfyUI\models\LLM\Meta-Llama-3.1-8B/tree/None' for available files.
```

```
File "D:\ComfyUI\execution.py", line 317, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
File "D:\ComfyUI\execution.py", line 192, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
File "D:\ComfyUI\execution.py", line 169, in _map_node_over_list
    process_inputs(input_dict, i)
File "D:\ComfyUI\execution.py", line 158, in process_inputs
    results.append(getattr(obj, func)(**inputs))
File "D:\ComfyUI\custom_nodes\Comfyui_CXH_joy_caption\Joy_caption_node.py", line 117, in gen
    self.loadCheckPoint()
File "D:\ComfyUI\custom_nodes\Comfyui_CXH_joy_caption\Joy_caption_node.py", line 89, in loadCheckPoint
    tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH, use_fast=False)
File "D:\ComfyUI\python\lib\site-packages\transformers\models\auto\tokenization_auto.py", line 853, in from_pretrained
    config = AutoConfig.from_pretrained(
File "D:\ComfyUI\python\lib\site-packages\transformers\models\auto\configuration_auto.py", line 972, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
File "D:\ComfyUI\python\lib\site-packages\transformers\configuration_utils.py", line 632, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
File "D:\ComfyUI\python\lib\site-packages\transformers\configuration_utils.py", line 689, in _get_config_dict
    resolved_config_file = cached_file(
File "D:\ComfyUI\python\lib\site-packages\transformers\utils\hub.py", line 373, in cached_file
    raise EnvironmentError(
```

Tanglinling commented 2 months ago

[screenshot] Use this one, or download https://huggingface.co/unsloth/Meta-Llama-3.1-8B/tree/main locally.
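To pull the whole repo into the folder the node expects, one option is `huggingface_hub.snapshot_download` (a minimal sketch; the target path is an assumption based on the error above):

```python
from huggingface_hub import snapshot_download

# downloads config.json, the tokenizer files, and the weight shards
snapshot_download(
    repo_id="unsloth/Meta-Llama-3.1-8B",
    local_dir=r"D:\ComfyUI\models\LLM\Meta-Llama-3.1-8B",  # assumed target folder
)
```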

markawonge commented 2 months ago

Tried that; after downloading it, same problem.

xiaminiu commented 2 months ago

That's because you selected the wrong model: you need to use unsloth/Meta-Llama-3.1-8B-bnb-4bit.
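For reference, the -bnb-4bit repo is a pre-quantized checkpoint whose quantization settings are stored in its config.json, so loading it directly (outside the node) looks roughly like this, assuming bitsandbytes is installed:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL = "unsloth/Meta-Llama-3.1-8B-bnb-4bit"

# the 4-bit quantization config is read from the checkpoint's config.json,
# so no explicit BitsAndBytesConfig needs to be passed here
model = AutoModelForCausalLM.from_pretrained(MODEL, device_map="auto")
tokenizer = AutoTokenizer.from_pretrained(MODEL, use_fast=False)
```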

sxcbing commented 2 months ago

Is the model folder name correct?

markawonge commented 2 months ago

With Meta-Llama-3.1-8B, VRAM usage sits at 88% and the GPU at 100%.

```
/home/kwan/ComfyUI/models/clip/siglip-so400m-patch14-384
/home/kwan/ComfyUI/models/LLM/Meta-Llama-3.1-8B
We will use 90% of the memory on device 0 for storing the model, and 10% for the buffer to avoid OOM. You can set max_memory in to a higher value to use more memory (at your own risk).
Loading checkpoint shards: 100%|█████████████████████████████████| 4/4 [00:09<00:00, 2.36s/it]
Some parameters are on the meta device device because they were offloaded to the cpu.
```

Then it just keeps the resources maxed out and never outputs a prompt.
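The last log line is the clue: part of the model was offloaded to the CPU, which makes generation crawl. If the GPU has headroom, one workaround (a sketch, assuming roughly a 24 GB card; the memory budgets are assumptions to adjust) is to pass an explicit max_memory map so fewer layers are offloaded:

```python
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "/home/kwan/ComfyUI/models/LLM/Meta-Llama-3.1-8B",
    device_map="auto",
    # raise the per-device budgets so fewer layers land on the CPU;
    # values here are assumptions for a ~24 GB GPU
    max_memory={0: "22GiB", "cpu": "48GiB"},
)
```

Alternatively, the unsloth/Meta-Llama-3.1-8B-bnb-4bit checkpoint suggested above fits the 8B model into far less VRAM and avoids the offload entirely.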