ceruleandeep / ComfyUI-LLaVA-Captioner

A ComfyUI extension for chatting with your images with LLaVA. Runs locally, no external services, no filter.
GNU General Public License v3.0

No module named 'llama_cpp' #1

Open theonetwoone opened 8 months ago

theonetwoone commented 8 months ago

After installing and moving the models to the right folder I still get this when starting Comfyui:

Traceback (most recent call last):
  File "D:\AI-Programmer\ComfyUI\ComfyUI\nodes.py", line 1813, in load_custom_node
    module_spec.loader.exec_module(module)
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "D:\AI-Programmer\ComfyUI\ComfyUI\custom_nodes\ComfyUI-LLaVA-Captioner\__init__.py", line 1, in <module>
    from .llava import NODE_CLASS_MAPPINGS, NODE_DISPLAY_NAME_MAPPINGS
  File "D:\AI-Programmer\ComfyUI\ComfyUI\custom_nodes\ComfyUI-LLaVA-Captioner\llava.py", line 12, in <module>
    from llama_cpp import Llama
ModuleNotFoundError: No module named 'llama_cpp'

Cannot import D:\AI-Programmer\ComfyUI\ComfyUI\custom_nodes\ComfyUI-LLaVA-Captioner module for custom nodes: No module named 'llama_cpp'
slavakurilyak commented 8 months ago

I got the same ModuleNotFoundError which I resolved by installing llama-cpp-python inside ComfyUI folder.

Here's the fix:

cd ComfyUI
pip install llama-cpp-python

llama-cpp-python by @abetlen is a Python binding for llama.cpp by @ggerganov.
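One caveat with this fix: pip may install into a different interpreter than the one that launches ComfyUI (common with the portable/embedded build). A quick diagnostic sketch (my own, not part of the extension) to check whether a module is visible to a given interpreter:

```python
import importlib.util

def is_importable(module_name: str) -> bool:
    """Return True if the running interpreter can find `module_name`."""
    return importlib.util.find_spec(module_name) is not None

if __name__ == "__main__":
    # If this prints False when run with the same Python that starts
    # ComfyUI, the package landed in a different interpreter.
    print("llama_cpp importable:", is_importable("llama_cpp"))
```

Run it with the exact Python that starts ComfyUI (for the portable build, something like `.\python_embeded\python.exe check.py`) rather than whatever `python` happens to be on your PATH.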

pixelass commented 8 months ago

@slavakurilyak thanks for the tip, but it didn't work for me. llama-cpp-python is installed now but still can't be found. This repo needs a decent installation guide, or better yet, should handle installs on its own.

theonetwoone commented 8 months ago

I found that this is likely a problem only for users of the pre-compiled (portable) ComfyUI. Installing llama-cpp manually and then copying the dependencies from %appdata% local/programs/python/python310/lib/site-packages into comfyui/python_embeded/lib/site-packages fixes it; the node works fine afterwards.
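If you go the copying route, it helps to confirm exactly which site-packages directory the embedded interpreter actually reads from. A small diagnostic sketch (mine, not from the thread):

```python
import sys
import sysconfig

def site_packages_dir() -> str:
    """Return the site-packages directory of the *running* interpreter.

    Dependencies must end up here (e.g. python_embeded/Lib/site-packages
    for the portable ComfyUI build) to be importable by custom nodes.
    """
    return sysconfig.get_paths()["purelib"]

if __name__ == "__main__":
    print("Interpreter  :", sys.executable)
    print("site-packages:", site_packages_dir())
```

A less error-prone alternative to copying files by hand is to install straight into the embedded interpreter, e.g. `.\python_embeded\python.exe -m pip install llama-cpp-python`, so pip resolves the transitive dependencies itself.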

pixelass commented 8 months ago

I mean, yeah, one way to get rid of open issues is just closing them. 🤷

@theonetwoone I appreciate your help but that does not work either.

> manually then copy the dependencies from

You must be kidding, right? This will end in an endless round of manually moving dependencies:

ModuleNotFoundError: No module named 'diskcache'
theonetwoone commented 8 months ago

Sorry, didn't mean to close it, my bad!

pixelass commented 8 months ago

I got it to work by manually copying all dependencies (for me, only 'diskcache' was still missing). But something is still off: captioning takes about 20 s, so I guess it's running on the CPU.

This could be a great tool if we put in some more work.

AlexYez commented 5 months ago

llama-cpp-python with GPU support:

CUDA 11.8:

.\python_embeded\python.exe -m pip install llama-cpp-python --prefer-binary --no-cache-dir --extra-index-url=https://jllllll.github.io/llama-cpp-python-cuBLAS-wheels/AVX2/cu118/

CUDA 12.2:

.\python_embeded\python.exe -m pip install llama-cpp-python --prefer-binary --no-cache-dir --extra-index-url=https://jllllll.github.io/llama-cpp-python-cuBLAS-wheels/AVX2/cu122
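The only difference between the two commands is the wheel index matching your CUDA toolkit version. A tiny illustrative helper (my sketch, not part of the extension) that makes the mapping explicit:

```python
# Map a CUDA version to the matching cuBLAS wheel index from the
# commands above; these are the only two indexes listed in the thread.
WHEEL_INDEXES = {
    "11.8": "https://jllllll.github.io/llama-cpp-python-cuBLAS-wheels/AVX2/cu118/",
    "12.2": "https://jllllll.github.io/llama-cpp-python-cuBLAS-wheels/AVX2/cu122",
}

def wheel_index(cuda_version: str) -> str:
    """Return the --extra-index-url for a supported CUDA version."""
    if cuda_version not in WHEEL_INDEXES:
        raise ValueError(f"no prebuilt wheel index listed for CUDA {cuda_version}")
    return WHEEL_INDEXES[cuda_version]

if __name__ == "__main__":
    print(wheel_index("11.8"))
```

Check your installed toolkit with `nvcc --version` (or `nvidia-smi`) before picking an index; a wheel built for the wrong CUDA version will import but fail to offload.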

andreszs commented 5 months ago

In my case the llama installation fails directly:

[screenshot of the failed installation]

PS: only now do I see that Visual Studio is required to build that llama package from source.