Open ohahaps2 opened 17 hours ago
After running `pip install requirements.txt`, a new error appeared:
`rope_scaling` must be a dictionary with two fields, `type` and `factor`, got {'factor': 8.0, 'high_freq_factor': 4.0, 'low_freq_factor': 1.0, 'original_max_position_embeddings': 8192, 'rope_type': 'llama3'}
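For context, this validation error usually means the installed transformers release predates the Llama 3.1 `rope_scaling` schema, in which `rope_type` replaced `type` and extra frequency fields were added. A minimal sketch of the version gate; the 4.43 cutoff is my understanding of when support landed, not something taken from this log:

```python
# Llama 3.1 configs ship rope_scaling with rope_type='llama3'; older
# transformers only validate the legacy {'type', 'factor'} form and
# raise exactly this ValueError. The (4, 43) threshold is an assumption.
def supports_llama3_rope(transformers_version: str) -> bool:
    major, minor = (int(p) for p in transformers_version.split(".")[:2])
    return (major, minor) >= (4, 43)

print(supports_llama3_rope("4.38.2"))  # False: fails validation
print(supports_llama3_rope("4.44.0"))  # True: accepts 'llama3'
```

The two workarounds people report are upgrading transformers (`pip install -U transformers`) or editing the model's `config.json` back to the legacy two-field form.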
Exception Message: `rope_scaling` must be a dictionary with two fields, `type` and `factor`, got {'factor': 8.0, 'high_freq_factor': 4.0, 'low_freq_factor': 1.0, 'original_max_position_embeddings': 8192, 'rope_type': 'llama3'}
File "H:\sd3\ComfyUI\execution.py", line 323, in execute
output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "H:\sd3\ComfyUI\execution.py", line 198, in get_output_data
return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "H:\sd3\ComfyUI\execution.py", line 169, in _map_node_over_list
process_inputs(input_dict, i)
File "H:\sd3\ComfyUI\execution.py", line 158, in process_inputs
results.append(getattr(obj, func)(**inputs))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "H:\sd3\ComfyUI\custom_nodes\ComfyUI_SLK_joy_caption_two\joy_caption_two_node.py", line 374, in generate
text_model = joy_two_pipeline.llm.load_llm_model(joy_two_pipeline.model)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "H:\sd3\ComfyUI\custom_nodes\ComfyUI_SLK_joy_caption_two\joy_caption_two_node.py", line 172, in load_llm_model
text_model = AutoModelForCausalLM.from_pretrained(text_model_path,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "H:\sd3\python_embeded\Lib\site-packages\transformers\models\auto\auto_factory.py", line 523, in from_pretrained
config, kwargs = AutoConfig.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "H:\sd3\python_embeded\Lib\site-packages\transformers\models\auto\configuration_auto.py", line 958, in from_pretrained
return config_class.from_dict(config_dict, **unused_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "H:\sd3\python_embeded\Lib\site-packages\transformers\configuration_utils.py", line 768, in from_dict
config = cls(**config_dict)
^^^^^^^^^^^^^^^^^^
File "H:\sd3\python_embeded\Lib\site-packages\transformers\models\llama\configuration_llama.py", line 161, in __init__
self._rope_scaling_validation()
File "H:\sd3\python_embeded\Lib\site-packages\transformers\models\llama\configuration_llama.py", line 182, in _rope_scaling_validation
raise ValueError(
2024-10-12 20:19:38,924 - root - INFO - got prompt
2024-10-12 20:19:40,746 - root - INFO - Requested to load SiglipVisionTransformer
2024-10-12 20:19:40,747 - root - INFO - Loading 1 new model
2024-10-12 20:19:41,034 - root - INFO - loaded completely 0.0 1618.345947265625 True
2024-10-12 20:19:41,569 - root - INFO - Requested to load ImageAdapter
2024-10-12 20:19:41,569 - root - INFO - Loading 1 new model
2024-10-12 20:19:41,583 - root - INFO - loaded completely 0.0 82.078125 True
2024-10-12 20:19:41,937 - root - ERROR - !!! Exception during processing !!! `rope_scaling` must be a dictionary with two fields, `type` and `factor`, got {'factor': 8.0, 'high_freq_factor': 4.0, 'low_freq_factor': 1.0, 'original_max_position_embeddings': 8192, 'rope_type': 'llama3'}
2024-10-12 20:19:41,940 - root - ERROR - Traceback (most recent call last): (identical to the stack trace above, ending in the same ValueError)
2024-10-12 20:19:41,943 - root - INFO - Prompt executed in 3.00 seconds
2024-10-12 20:20:47,288 - root - INFO - got prompt
2024-10-12 20:20:47,538 - root - ERROR - !!! Exception during processing !!! (same `rope_scaling` error; identical traceback omitted)
2024-10-12 20:20:47,540 - root - INFO - Prompt executed in 0.23 seconds
2024-10-12 20:21:45,770 - root - INFO - got prompt
2024-10-12 20:21:45,928 - root - ERROR - !!! Exception during processing !!! (same `rope_scaling` error; identical traceback omitted)
2024-10-12 20:21:45,931 - root - INFO - Prompt executed in 0.14 seconds
2024-10-12 20:23:21,945 - root - INFO - got prompt
2024-10-12 20:23:22,178 - root - ERROR - !!! Exception during processing !!! (same `rope_scaling` error; identical traceback omitted)
2024-10-12 20:23:22,181 - root - INFO - Prompt executed in 0.21 seconds
## Attached Workflow
This looks like an incomplete model download or something similar. I'd suggest downloading the entire directory from a domestic (China) mirror and then copying it into the specified directory; my screenshot shows the exact location. Please try again.
That one shouldn't need downloading, right? Since you said to use the bnb-4bit variant for low VRAM, I didn't download the large model. You wrote: "Copy the entire folder's contents into models\LLM\Meta-Llama-3.1-8B-Instruct-bnb-4bit"
Now I'm getting "Input type (torch.cuda.HalfTensor) and weight type (torch.FloatTensor) should be the same". What is this problem?
I re-downloaded everything and still hit the same second error.
Following a fix I found online, I edited models\clip\siglip-so400m-patch14-384\config.json to read "rms_norm_eps": 1e-05, "rope_scaling": { "factor": 8.0, "type": "dynamic" }, and now it works. Apparently some `rope_scaling` parameters changed in newer transformers versions.
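The manual `config.json` edit described above can also be scripted. A minimal sketch, assuming you want to rewrite any new-style `rope_scaling` block into the legacy two-field form; the `downgrade_rope_scaling` helper and the `'dynamic'` fallback type are my own choices, not part of the node:

```python
import json

def downgrade_rope_scaling(config_path: str) -> dict:
    """Rewrite a Llama-3.1-style rope_scaling block into the legacy
    {'type', 'factor'} form that older transformers versions accept.
    Hypothetical helper; back up config.json before running it."""
    with open(config_path, "r", encoding="utf-8") as f:
        cfg = json.load(f)
    rs = cfg.get("rope_scaling") or {}
    if "rope_type" in rs:  # new-style block, e.g. rope_type='llama3'
        cfg["rope_scaling"] = {"type": "dynamic",
                               "factor": rs.get("factor", 8.0)}
        with open(config_path, "w", encoding="utf-8") as f:
            json.dump(cfg, f, indent=2)
    return cfg["rope_scaling"]
```

Note this only papers over the version mismatch: the dropped `high_freq_factor` / `low_freq_factor` fields mean the model no longer uses the exact Llama 3.1 scaling, so upgrading transformers is the cleaner fix.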
ComfyUI Error Report
Error Details
- Node Type: Joy_caption_two
- Exception Type: RuntimeError
- Exception Message: Input type (torch.cuda.HalfTensor) and weight type (torch.FloatTensor) should be the same
Stack Trace
Your screenshot isn't visible. This is probably caused by the default tensor creation type being inconsistent. My system is fairly old, so I can't test this thoroughly; could you provide a more detailed screenshot so I can update the code? My machine is an RTX 2080 with 8 GB, and it works by default there, but I've hit something similar before, and now it's showing up in your environment.
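For reference, that RuntimeError means the input tensor and the layer's weights disagree on dtype and/or device: torch.cuda.HalfTensor is fp16 on GPU, torch.FloatTensor is fp32 on CPU. A toy sketch reproducing just the dtype half of the mismatch on CPU; this is not the node's actual code:

```python
import torch

layer = torch.nn.Linear(4, 4)              # weights are fp32 by default
x = torch.randn(1, 4, dtype=torch.float16)

try:
    layer(x)                               # fp16 input vs fp32 weights
except RuntimeError as e:
    print("mismatch:", e)

# Fix: align both sides before the forward pass, either by casting the
# input (x.float()) or the module (layer.half() / layer.to(device)).
y = layer(x.float())
print(y.dtype)  # torch.float32
```

In ComfyUI terms, it usually means the image features were moved to GPU in half precision while some submodule's weights were left in fp32 on CPU, so casting both to the same dtype and device is the fix.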
I deleted the manually downloaded files and let it auto-download models\clip\siglip-so400m-patch14-384, but I still get the same error. What should I do?