Nuked88 / ComfyUI-N-Nodes

A suite of custom nodes for ComfyUI that includes GPT text-prompt generation, LoadVideo, SaveVideo, LoadFramesFromFolder and FrameInterpolator
MIT License

[BUG] import failed #67

Open derpmagician opened 4 months ago

derpmagician commented 4 months ago

This happens after installation with the ComfyUI Manager:

File "/home/zeus/miniconda3/envs/cloudspace/lib/python3.10/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/home/zeus/miniconda3/envs/cloudspace/lib/python3.10/site-packages/llama_cpp/__init__.py", line 1, in <module>
    from .llama_cpp import *
  File "/home/zeus/miniconda3/envs/cloudspace/lib/python3.10/site-packages/llama_cpp/llama_cpp.py", line 88, in <module>
    _lib = _load_shared_library(_lib_base_name)
  File "/home/zeus/miniconda3/envs/cloudspace/lib/python3.10/site-packages/llama_cpp/llama_cpp.py", line 77, in _load_shared_library
    raise RuntimeError(f"Failed to load shared library '{_lib_path}': {e}")
RuntimeError: Failed to load shared library '/home/zeus/miniconda3/envs/cloudspace/lib/python3.10/site-packages/llama_cpp/lib/libllama.so': /usr/lib/x86_64-linux-gnu/libstdc++.so.6: version `GLIBCXX_3.4.29' not found (required by /home/zeus/miniconda3/envs/cloudspace/lib/python3.10/site-packages/llama_cpp/lib/libllama.so)

Cannot import /teamspace/studios/this_studio/ComfyUI/custom_nodes/ComfyUI-N-Nodes module for custom nodes: Failed to load shared library '/home/zeus/miniconda3/envs/cloudspace/lib/python3.10/site-packages/llama_cpp/lib/libllama.so': /usr/lib/x86_64-linux-gnu/libstdc++.so.6: version `GLIBCXX_3.4.29' not found (required by /home/zeus/miniconda3/envs/cloudspace/lib/python3.10/site-packages/llama_cpp/lib/libllama.so)
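
The error indicates that the bundled libllama.so was compiled against a newer libstdc++ than the one installed on the system (GLIBCXX_3.4.29 ships with GCC 11). A quick diagnostic sketch, assuming the Debian/Ubuntu library path shown in the error message:

    # list the GLIBCXX symbol versions the system libstdc++ actually provides
    strings /usr/lib/x86_64-linux-gnu/libstdc++.so.6 | grep GLIBCXX
    # if GLIBCXX_3.4.29 is absent from the output, the system toolchain is
    # older than the one used to build the prebuilt llama_cpp binary
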
secretivebanana commented 3 months ago

I had a similar (possibly the same) error and fixed it as follows (a command sketch for the Linux setup above follows these steps):

  1. Close ComfyUI.
  2. Activate the ComfyUI virtual environment (venv), e.g. via venv/scripts/activate.bat on Windows.
  3. In the command window, run: pip uninstall llama-cpp-python
  4. Then, in the same window, run: pip install llama-cpp-python
  5. Re-open ComfyUI.