Nuked88 / ComfyUI-N-Nodes

A suite of custom nodes for ComfyUI that includes GPT text-prompt generation, LoadVideo, SaveVideo, LoadFramesFromFolder, and FrameInterpolator
MIT License

[BUG] #79

Open highfiiv opened 2 weeks ago

highfiiv commented 2 weeks ago

I was told to move this issue (https://github.com/comfyanonymous/ComfyUI/issues/5510) to the ComfyUI-N-Nodes repo.

Expected Behavior

I'm not sure if Ollama models are required in any way, but I do see errors related to them and would like the healthiest setup possible.

The project also looks for the llama-cpp-python wheel here:
https://github.com/abetlen/llama-cpp-python/releases/download/v0.3.1/llama_cpp_python-0.3.1-cp311-cp311-macosx_15_0_arm64.whl

But the wheel is only published here, under a different release tag and macOS platform tag:
https://github.com/abetlen/llama-cpp-python/releases/download/v0.3.1-metal/llama_cpp_python-0.3.1-cp311-cp311-macosx_11_0_arm64.whl
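The mismatch between the two URLs can be made concrete with a small sketch of the tagging logic the installer appears to use, reconstructed from the debug log below. The function name and exact URL scheme are my assumptions, not the repo's actual code:

```python
def build_wheel_url(version: str, py_tag: str, macos_major: int) -> str:
    """Hypothetical reconstruction of the installer's wheel-URL logic.

    It tags the wheel with the *running* macOS major version (15 on
    current macOS), but the published wheels live under the
    v0.3.1-metal release and carry the macosx_11_0 tag, hence the 404.
    """
    platform_tag = f"{py_tag}-{py_tag}-macosx_{macos_major}_0_arm64"
    return (
        "https://github.com/abetlen/llama-cpp-python/releases/download/"
        f"v{version}/llama_cpp_python-{version}-{platform_tag}.whl"
    )

# The URL the installer requests on macOS 15, which returns HTTP 404:
print(build_wheel_url("0.3.1", "cp311", 15))
```

Wheel filenames encode the *minimum* compatible macOS version, so an installer that substitutes the host's current version will miss any wheel built against an older deployment target.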

Actual Behavior

When running the project I see "Error fetching Ollama models".

Steps to Reproduce

Install on macOS and run the project.

Debug Logs

There are 18,000 lines, but here is a key snippet:

Total VRAM 32768 MB, total RAM 32768 MB
pytorch version: 2.5.0
Set vram state to: SHARED
Device: mps
ComfyUI_FluxPromptGen loaded successfully
------------------------------------------
### N-Suite Revision: ae7cc848 
Current version of packaging: 23.1
Version of cpuinfo: Not found
Current version of git: 3.1.43
Current version of moviepy: 1.0.3
Current version of cv2: 4.10.0
Current version of skbuild: 0.18.1
Version of typing: Not found
Current version of diskcache: 5.6.3
Installing llama_cpp...
Error while checking AVX2 support: 'flags'
Python version: 311
OS: Darwin
OS bit: 64
Platform tag: cp311-cp311-macosx_15_0_arm64
Collecting llama-cpp-python==0.3.1
  ERROR: HTTP error 404 while getting https://github.com/abetlen/llama-cpp-python/releases/download/v0.3.1/llama_cpp_python-0.3.1-cp311-cp311-macosx_15_0_arm64.whl
ERROR: Could not install requirement llama-cpp-python==0.3.1 from https://github.com/abetlen/llama-cpp-python/releases/download/v0.3.1/llama_cpp_python-0.3.1-cp311-cp311-macosx_15_0_arm64.whl because of HTTP error 404 Client Error: Not Found for url: https://github.com/abetlen/llama-cpp-python/releases/download/v0.3.1/llama_cpp_python-0.3.1-cp311-cp311-macosx_15_0_arm64.whl for URL https://github.com/abetlen/llama-cpp-python/releases/download/v0.3.1/llama_cpp_python-0.3.1-cp311-cp311-macosx_15_0_arm64.whl
Error while installing LLAMA: Command '['/usr/local/bin/python3.11', '-m', 'pip', 'install', 'https://github.com/abetlen/llama-cpp-python/releases/download/v0.3.1/llama_cpp_python-0.3.1-cp311-cp311-macosx_15_0_arm64.whl']' returned non-zero exit status 1.
Current version of timm: 1.0.9
Traceback (most recent call last):
  File "/Users/home/dev/ComfyUI/custom_nodes/ComfyUI-N-Nodes/__init__.py", line 64, in <module>
    spec.loader.exec_module(module)
  File "<frozen importlib._bootstrap_external>", line 940, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/Users/home/dev/ComfyUI/custom_nodes/ComfyUI-N-Nodes/py/gptcpp_node.py", line 4, in <module>
    from llama_cpp import Llama
ModuleNotFoundError: No module named 'llama_cpp'
!!!VExpress path was added to /Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/VExpress.pth
 if meet `No module` error,try `python main.py` again, don't be foolish to pip install modules

Other

I'm wondering if this is causing other issues, such as conflicts and instability for various custom nodes.

highfiiv commented 2 weeks ago

Still working on this... Here's where I'm at

  1. We may want to improve how the Python path is resolved on macOS
  2. NLTK data downloads on macOS require an SSL certificate fix
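On the NLTK point, a common stopgap is to relax SSL verification for the downloader session. This is a generic workaround (less secure than running Install Certificates.command, which is the proper fix described below), not something the repo ships:

```python
import ssl

# macOS framework Python ships its own CA bundle; until
# "Install Certificates.command" has been run, HTTPS downloads
# (including nltk.download) can fail with CERTIFICATE_VERIFY_FAILED.
# As a temporary workaround, fall back to an unverified HTTPS context:
try:
    _unverified_context = ssl._create_unverified_context
except AttributeError:
    pass  # very old Pythons did not verify certificates by default
else:
    ssl._create_default_https_context = _unverified_context

# afterwards: import nltk; nltk.download("punkt")
```

Running the certificate installer is strictly better; the patch above only papers over the missing CA bundle for the current process.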

Here's a breakdown of the issues and fixes:

  1. Primary Issue - Wrong llama-cpp-python URL

    • The ComfyUI-N-Nodes installer constructs a llama-cpp-python wheel URL that does not exist (HTTP 404)
    • Fix: installed the correct macOS ARM wheel manually from:
      https://github.com/abetlen/llama-cpp-python/releases/download/v0.3.1-metal/llama_cpp_python-0.3.1-cp311-cp311-macosx_11_0_arm64.whl
  2. SSL Certificate Issues with NLTK

    • Original Error: SSL certificate verification failed when downloading NLTK data
    • Fix: Updated macOS certificates using the Python certificate installer:
      /Applications/Python\ 3.11/Install\ Certificates.command
  3. Python Environment Confusion

    • Issue: Conflicts between conda Python (/opt/miniconda3/bin/python3) and framework Python (/Library/Frameworks/Python.framework/Versions/3.11/bin/python3)
    • Fix: Explicitly used framework Python path for installations and commands
    • Additionally resolved a dependency conflict: cached-path 1.6.3 requires huggingface-hub<0.24.0, but the environment had huggingface-hub 0.26.2
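The last two points can be combined: invoke the framework interpreter explicitly and pin huggingface-hub to satisfy cached-path. A sketch of the command, building on the interpreter path from this thread (actually running it via subprocess is left to the reader):

```python
# Address the framework-vs-conda Python confusion by naming the
# interpreter explicitly, so /opt/miniconda3/bin/python3 is never
# picked up by accident.
FRAMEWORK_PYTHON = "/Library/Frameworks/Python.framework/Versions/3.11/bin/python3"

def pin_huggingface_hub(python: str = FRAMEWORK_PYTHON) -> list[str]:
    # cached-path 1.6.3 requires huggingface-hub<0.24.0, so pin below that.
    return [python, "-m", "pip", "install", "huggingface-hub<0.24.0"]

cmd = pin_huggingface_hub()
# e.g. subprocess.run(cmd, check=True) would perform the install
print(" ".join(cmd))
```

Using `python -m pip` against an explicit interpreter path guarantees the package lands in that interpreter's site-packages, which is exactly what the "Python Environment Confusion" fix above relies on.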