RamonGuthrie opened this issue 2 months ago
I have the same issue.
Install llama-cpp-python using a .whl file from https://github.com/abetlen/llama-cpp-python/releases. Choose the prebuilt wheel matching the CUDA version your ComfyUI uses.
How do I find my CUDA version?
Look at the console log when you start ComfyUI.
It says this:
My UI is different for some reason.
That's the error I keep getting.
Check this folder.
My ComfyUI env:
That's the error I keep getting.
Thanks so much!
I had this problem and had to make sure "Visual Studio C++" was installed on Windows; then it worked.
How do I find my CUDA version?
Type this command into your Command Prompt (Windows CLI):
nvcc --version
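If it helps, here's a small sketch that pulls the release number out of `nvcc --version` output so you can match it to a wheel tag like cu124. The `cuda_release` helper and the `sample` string are illustrations, not part of any tool:

```python
import re

def cuda_release(nvcc_output: str) -> str:
    """Extract the CUDA release (e.g. '12.4') from `nvcc --version` output."""
    match = re.search(r"release (\d+\.\d+)", nvcc_output)
    if not match:
        raise ValueError("no CUDA release found in nvcc output")
    return match.group(1)

# Illustrative nvcc output line (yours will show your own version):
sample = "Cuda compilation tools, release 12.4, V12.4.131"
print(cuda_release(sample))  # 12.4
```

The release it prints (e.g. 12.4) corresponds to the "cu124"-style tag you'd look for in a prebuilt wheel's name.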
Seems like I have a similar problem after updating to nightly ComfyUI portable. I can find the .whl file and download it, but where do I put it? Or is it a different problem? Here is my output from the terminal:

Installing llama-cpp-python...
Looking in indexes: https://jllllll.github.io/llama-cpp-python-cuBLAS-wheels/AVX/cu124
ERROR: Could not find a version that satisfies the requirement llama-cpp-python (from versions: none)
ERROR: No matching distribution found for llama-cpp-python
Traceback (most recent call last):
  File "C:\AI\01_Comfy_nightly\ComfyUI\nodes.py", line 1879, in load_custom_node
    module_spec.loader.exec_module(module)
  File "", line 995, in exec_module
  File "", line 488, in _call_with_frames_removed
  File "C:\AI\01_Comfy_nightly\ComfyUI\custom_nodes\ComfyUI_VLM_nodes\__init__.py", line 44, in <module>
    install_llama(system_info)
  File "C:\AI\01_Comfy_nightly\ComfyUI\custom_nodes\ComfyUI_VLM_nodes\install_init.py", line 111, in install_llama
    install_package("llama-cpp-python", custom_command=custom_command)
  File "C:\AI\01_Comfy_nightly\ComfyUI\custom_nodes\ComfyUI_VLM_nodes\install_init.py", line 91, in install_package
    subprocess.check_call(command)
  File "subprocess.py", line 413, in check_call
subprocess.CalledProcessError: Command '['C:\AI\01_Comfy_nightly\python_embeded\python.exe', '-m', 'pip', 'install', 'llama-cpp-python', '--no-cache-dir', '--force-reinstall', '--no-deps', '--index-url=https://jllllll.github.io/llama-cpp-python-cuBLAS-wheels/AVX/cu124']' returned non-zero exit status 1.

Cannot import C:\AI\01_Comfy_nightly\ComfyUI\custom_nodes\ComfyUI_VLM_nodes module for custom nodes: Command '['C:\AI\01_Comfy_nightly\python_embeded\python.exe', '-m', 'pip', 'install', 'llama-cpp-python', '--no-cache-dir', '--force-reinstall', '--no-deps', '--index-url=https://jllllll.github.io/llama-cpp-python-cuBLAS-wheels/AVX/cu124']' returned non-zero exit status 1.
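On the "where do I put it" part: the .whl doesn't need to go anywhere special — install it with the embedded interpreter's pip from wherever you downloaded it, then check that it imports. A sketch; the download path and wheel filename are placeholders, not real names:

```shell
REM Install the downloaded wheel with ComfyUI's embedded Python (example paths):
C:\AI\01_Comfy_nightly\python_embeded\python.exe -m pip install C:\Downloads\llama_cpp_python-<version>-win_amd64.whl

REM Then verify it imports:
C:\AI\01_Comfy_nightly\python_embeded\python.exe -c "import llama_cpp; print(llama_cpp.__version__)"
```

Using the embedded python.exe (not a system Python) matters, because the portable build keeps its own site-packages.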
Do not compile it yourself. Use a precompiled .whl file. Two ways (examples — adjust for your PC):
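The two ways presumably look something like this. The cu124 index tag and the wheel filename are examples, not guaranteed to match your setup — substitute your own CUDA and Python versions:

```shell
REM Way 1: let pip pick a prebuilt CUDA wheel from the project's wheel index
REM (cu124 is an example -- use the tag matching your CUDA version):
python_embeded\python.exe -m pip install llama-cpp-python ^
    --extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cu124

REM Way 2: download a .whl from the GitHub releases page and install the file
REM directly (filename is a placeholder):
python_embeded\python.exe -m pip install llama_cpp_python-<version>-cp311-cp311-win_amd64.whl
```

Way 1 is less error-prone since pip resolves the matching wheel for you; Way 2 is useful when the index doesn't carry your exact combination.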
This seems to be the error message I'm getting. I hope it makes sense.