Closed — vivi-gomez closed this issue 9 months ago
Are you sure this node is from VLM nodes? Can you send me a picture of the workflow?
Could it be from these nodes? https://github.com/ceruleandeep/ComfyUI-LLaVA-Captioner
The llama_cpp_python package is broken: llama-cpp-python==0.2.44 works, but 0.2.50 fails with the same error even outside of ComfyUI. I'm not sure which version started the breakage. The installer always picks up the latest release:
lcpp_version = latest_lamacpp()
But if you install it yourself, it looks like the installer skips pulling the latest, because of this check:
imported = package_is_installed("llama-cpp-python") or package_is_installed("llama_cpp")
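For anyone curious how a check like that typically behaves, here is a minimal sketch of a `package_is_installed` helper (this is an assumption about the node's implementation, not its actual code — it just treats a package as installed if its metadata can be found):

```python
from importlib.metadata import version, PackageNotFoundError


def package_is_installed(name: str) -> bool:
    # Hypothetical re-implementation of the check quoted above:
    # a distribution counts as installed if pip metadata exists for it.
    try:
        version(name)
        return True
    except PackageNotFoundError:
        return False


# Mirrors the line from the node: either spelling of the package counts.
imported = package_is_installed("llama-cpp-python") or package_is_installed("llama_cpp")
```

So manually pre-installing a pinned version makes `imported` true, and the node then leaves your version alone.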
This is the solution for me. By now there is a llama-cpp-python==0.2.52 that works.
I didn't even know I had that addon installed. However, the ComfyUI workflow was stopping at LLava Sampler Simple, which is identified as one of the VLM_nodes.
Thank you
Great! That's the first version that fixes it with this commit: https://github.com/abetlen/llama-cpp-python/commit/8383a9e5620f5df5a88f62da16813eac200dd706
Whatever I try, I always get this error:

  File "venv/lib/python3.10/site-packages/llama_cpp/llama_chat_format.py", line 1959, in __call__
    self._llava_cpp.llava_image_embed_make_with_bytes(
TypeError: this function takes at least 4 arguments (0 given)
Can someone shed some light on how to solve it?