gokayfem / ComfyUI_VLM_nodes

Custom ComfyUI nodes for Vision Language Models, Large Language Models, Image to Music, Text to Music, Consistent and Random Creative Prompt Generation
Apache License 2.0
297 stars 23 forks

ValueError: could not broadcast input array from shape (32000,) into shape (0,) #95

Open ultimatech-cn opened 3 weeks ago

ultimatech-cn commented 3 weeks ago

When executing the example workflow, the following error occurs (workflow screenshot attached):

```
!!! Exception during processing!!! could not broadcast input array from shape (32000,) into shape (0,)
Traceback (most recent call last):
  File "E:\training\ComfyUI\execution.py", line 151, in recursive_execute
    output_data, output_ui = get_output_data(obj, input_data_all)
  File "E:\training\ComfyUI\execution.py", line 81, in get_output_data
    return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
  File "E:\training\ComfyUI\execution.py", line 74, in map_node_over_list
    results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
  File "E:\training\ComfyUI\custom_nodes\ComfyUI_VLM_nodes\nodes\llavaloader.py", line 99, in generate_text
    response = llm.create_chat_completion(
  File "E:\training\python_embeded\Lib\site-packages\llama_cpp\llama.py", line 1734, in create_chat_completion
    return handler(
  File "E:\training\python_embeded\Lib\site-packages\llama_cpp\llama_chat_format.py", line 2712, in __call__
    completion_or_chunks = llama.create_completion(
  File "E:\training\python_embeded\Lib\site-packages\llama_cpp\llama.py", line 1570, in create_completion
    completion: Completion = next(completion_or_chunks)  # type: ignore
  File "E:\training\python_embeded\Lib\site-packages\llama_cpp\llama.py", line 1095, in _create_completion
    for token in self.generate(
  File "E:\training\python_embeded\Lib\site-packages\llama_cpp\llama.py", line 723, in generate
    self.eval(tokens)
  File "E:\training\python_embeded\Lib\site-packages\llama_cpp\llama.py", line 569, in eval
    self.scores[n_past : n_past + n_tokens, :].reshape(-1)[: :] = logits
ValueError: could not broadcast input array from shape (32000,) into shape (0,)
```
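For what it's worth, the failing assignment in `llama.py` can be reproduced in isolation with NumPy. This is a sketch under the assumption that `self.scores` is a `(n_ctx, n_vocab)` buffer: if `n_past` has already reached the end of that buffer, the destination slice is empty, and writing 32000 logits into it raises exactly this error. The sizes below (`n_ctx = 512`) are hypothetical, chosen only to trigger the same shape mismatch:

```python
import numpy as np

# Hypothetical sizes mirroring the traceback: a vocab of 32000 logits and
# a scores buffer with one row per context-window slot (n_ctx).
n_vocab = 32000
n_ctx = 512
scores = np.zeros((n_ctx, n_vocab), dtype=np.float32)

# Once n_past reaches n_ctx, the destination slice has zero rows.
n_past, n_tokens = n_ctx, 1
logits = np.ones(n_vocab, dtype=np.float32)

try:
    # Same pattern as llama.py line 569 in the traceback above.
    scores[n_past : n_past + n_tokens, :].reshape(-1)[::] = logits
except ValueError as e:
    print(e)  # could not broadcast input array from shape (32000,) into shape (0,)
```

So the symptom is consistent with the prompt (text plus image tokens) overrunning the allocated context, which a wrong mmproj could plausibly cause.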

Any workaround for this?

gokayfem commented 3 weeks ago

first time seeing this error. are you sure you downloaded the right mmproj for the model? you cannot use one model's mmproj with a different model.