zhongpei / Comfyui_image2prompt

image to prompt by vikhyatk/moondream1
GNU General Public License v3.0

help #61

Open · jokero3answer opened 2 months ago

jokero3answer commented 2 months ago

Clicked execute and it never responded (screenshot attached). Set low_memory, and it's still the same (screenshot attached).

Diagnostics-1714320959.log

jokero3answer commented 2 months ago

(screenshots attached) I downloaded the files manually; is it still incomplete? The automatic download also seems to have problems. @zhongpei, please help.

jokero3answer commented 2 months ago

Error occurred when executing LoadImage2TextModel:

You can't pass `load_in_4bit` or `load_in_8bit` as a kwarg when passing `quantization_config` argument at the same time.

File "D:\ComfyUI\execution.py", line 151, in recursive_execute output_data, output_ui = get_output_data(obj, input_data_all) File "D:\ComfyUI\execution.py", line 81, in get_output_data return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True) File "D:\ComfyUI\execution.py", line 74, in map_node_over_list results.append(getattr(obj, func)(**slice_dict(input_data_all, i))) File "D:\ComfyUI\custom_nodes\Comfyui_image2prompt\src\image2text.py", line 54, in get_model return (Llama3vModel(device=device,low_memory=low_memory),) File "D:\ComfyUI\custom_nodes\Comfyui_image2prompt\src\llama3_model.py", line 67, in init self.model = AutoModelForCausalLM.from_pretrained( File "D:\ComfyUI\venv\lib\site-packages\transformers\models\auto\auto_factory.py", line 556, in from_pretrained return model_class.from_pretrained( File "D:\ComfyUI\venv\lib\site-packages\transformers\modeling_utils.py", line 2952, in from_pretrained raise ValueError(

jokero3answer commented 2 months ago

The model is OK now, but now it reports this error.

jokero3answer commented 2 months ago

(screenshot attached)

zhenyuanzhou commented 1 month ago

Same here; this error seems to come from a wrong version of transformers.
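If it is a version mismatch, a quick way to see what ComfyUI's venv is actually using (the thread does not state which transformers version the node expects, so this only checks, it does not fix anything):

```python
# Run with the Python that ComfyUI uses (per the traceback above: D:\ComfyUI\venv\Scripts\python.exe)
import transformers

print("transformers version:", transformers.__version__)
```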