Closed: Michael-Dawkins closed this issue 3 months ago
I am trying to use the LLava model loader, but I always get this error:
```
[2024-04-02 22:42] got prompt
[2024-04-02 22:42] !!! Exception during processing !!!
[2024-04-02 22:42] Traceback (most recent call last):
  File "C:\Users\miked\Documents\projects\ComfyUI_windows_portable\ComfyUI\execution.py", line 151, in recursive_execute
    output_data, output_ui = get_output_data(obj, input_data_all)
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\miked\Documents\projects\ComfyUI_windows_portable\ComfyUI\execution.py", line 81, in get_output_data
    return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\miked\Documents\projects\ComfyUI_windows_portable\ComfyUI\execution.py", line 74, in map_node_over_list
    results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\miked\Documents\projects\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_VLM_nodes\nodes\llavaloader.py", line 57, in load_clip_checkpoint
    clip = Llava15ChatHandler(clip_model_path = clip_path, verbose=False)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\miked\Documents\projects\ComfyUI_windows_portable\python_embeded\Lib\site-packages\llama_cpp\llama_chat_format.py", line 1072, in __init__
    self.clip_ctx = self._llava_cpp.clip_model_load(
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\miked\Documents\projects\ComfyUI_windows_portable\python_embeded\Lib\site-packages\llama_cpp\llava_cpp.py", line 174, in clip_model_load
    return _libllava.clip_model_load(fname, verbosity)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
OSError: [WinError -529697949] Windows Error 0xe06d7363
[2024-04-02 22:42] Prompt executed in 0.11 seconds
[2024-04-02 22:42] Exception ignored in: <function Llava15ChatHandler.__del__ at 0x000001D864616480>
[2024-04-02 22:42] Traceback (most recent call last):
[2024-04-02 22:42]   File "C:\Users\miked\Documents\projects\ComfyUI_windows_portable\python_embeded\Lib\site-packages\llama_cpp\llama_chat_format.py", line 1078, in __del__
[2024-04-02 22:42]     if self.clip_ctx is not None and self._clip_free is not None:
[2024-04-02 22:42]        ^^^^^^^^^^^^^
[2024-04-02 22:42] AttributeError: 'Llava15ChatHandler' object has no attribute 'clip_ctx'
```
I am running ComfyUI on Windows 11 with the embedded Python it ships with. I tried updating all dependencies. I get the same result with:
Did I miss an installation step, or is something wrong with my Python install? Sorry, I am new to ComfyUI / Python.
I simply had to read the documentation more attentively: I was providing the model checkpoint instead of the mmproj CLIP projector. It works now :)
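For anyone hitting the same OSError, here is a minimal sketch of how llama-cpp-python expects the two files to be split; the file names and paths below are just illustrative examples, not the actual node configuration. The mmproj projector GGUF goes to Llava15ChatHandler, while the main LLaVA model GGUF is loaded separately by Llama:

```python
from llama_cpp import Llama
from llama_cpp.llama_chat_format import Llava15ChatHandler

# The CLIP projector (mmproj-*.gguf) is what Llava15ChatHandler expects;
# passing the main model checkpoint here is what triggers the
# clip_model_load crash above. Paths are hypothetical.
chat_handler = Llava15ChatHandler(
    clip_model_path="models/clip/mmproj-model-f16.gguf",
    verbose=False,
)

# The main LLaVA model GGUF is loaded separately and wired to the handler.
llm = Llama(
    model_path="models/llm/llava-v1.5-7b.Q4_K_M.gguf",
    chat_handler=chat_handler,
    n_ctx=2048,  # leave room in the context for image embeddings
)
```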