Describe the bug

Hello,

I'm running Text-Generation-WebUI with Pinokio. The installation goes fine, and I then added this model in the Model > Download section. The download works fine, but when I try to load the model, I get this error:

OSError: CUDA_HOME environment variable is not set. Please set it to your CUDA install root.

Note: I get the same error with other models as well.
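For reference, a quick sanity check from the Python environment the WebUI uses (a sketch on my part; the installer_files/env path in the traceback below is simply where that environment lives on my machine):

import os

import torch

# CUDA toolkit version this PyTorch build was compiled against
# (prints None on a CPU-only / macOS build).
print("torch.version.cuda:", torch.version.cuda)

# Whether a CUDA device is actually usable at runtime.
print("torch.cuda.is_available():", torch.cuda.is_available())

# The environment variable the error message refers to
# (prints None when it is not set).
print("CUDA_HOME:", os.environ.get("CUDA_HOME"))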
Reproduction

In Pinokio, look for text-generation-webui, then download and install it.
Once it is installed, download this model and try to load it.
Screenshot
No response
Logs
File "/Users/rob/pinokio/api/oobabooga.pinokio.git/text-generation-webui/modules/ui_model_menu.py", line 232, in load_model_wrapper
shared.model, shared.tokenizer = load_model(selected_model, loader)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/rob/pinokio/api/oobabooga.pinokio.git/text-generation-webui/modules/models.py", line 93, in load_model
output = load_func_map[loader](model_name)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/rob/pinokio/api/oobabooga.pinokio.git/text-generation-webui/modules/models.py", line 313, in ExLlamav2_HF_loader
from modules.exllamav2_hf import Exllamav2HF
File "/Users/rob/pinokio/api/oobabooga.pinokio.git/text-generation-webui/modules/exllamav2_hf.py", line 7, in <module>
from exllamav2 import (
File "/Users/rob/pinokio/api/oobabooga.pinokio.git/text-generation-webui/installer_files/env/lib/python3.11/site-packages/exllamav2/__init__.py", line 3, in <module>
from exllamav2.model import ExLlamaV2
File "/Users/rob/pinokio/api/oobabooga.pinokio.git/text-generation-webui/installer_files/env/lib/python3.11/site-packages/exllamav2/model.py", line 35, in <module>
from exllamav2.config import ExLlamaV2Config
File "/Users/rob/pinokio/api/oobabooga.pinokio.git/text-generation-webui/installer_files/env/lib/python3.11/site-packages/exllamav2/config.py", line 5, in <module>
from exllamav2.stloader import STFile, cleanup_stfiles
File "/Users/rob/pinokio/api/oobabooga.pinokio.git/text-generation-webui/installer_files/env/lib/python3.11/site-packages/exllamav2/stloader.py", line 5, in <module>
from exllamav2.ext import none_tensor, exllamav2_ext as ext_c
File "/Users/rob/pinokio/api/oobabooga.pinokio.git/text-generation-webui/installer_files/env/lib/python3.11/site-packages/exllamav2/ext.py", line 276, in <module>
exllamav2_ext = load \
^^^^^^
File "/Users/rob/pinokio/api/oobabooga.pinokio.git/text-generation-webui/installer_files/env/lib/python3.11/site-packages/torch/utils/cpp_extension.py", line 1312, in load
return _jit_compile(
^^^^^^^^^^^^^
File "/Users/rob/pinokio/api/oobabooga.pinokio.git/text-generation-webui/installer_files/env/lib/python3.11/site-packages/torch/utils/cpp_extension.py", line 1722, in _jit_compile
_write_ninja_file_and_build_library(
File "/Users/rob/pinokio/api/oobabooga.pinokio.git/text-generation-webui/installer_files/env/lib/python3.11/site-packages/torch/utils/cpp_extension.py", line 1811, in _write_ninja_file_and_build_library
extra_ldflags = _prepare_ldflags(
^^^^^^^^^^^^^^^^^
File "/Users/rob/pinokio/api/oobabooga.pinokio.git/text-generation-webui/installer_files/env/lib/python3.11/site-packages/torch/utils/cpp_extension.py", line 1900, in _prepare_ldflags
if (not os.path.exists(_join_cuda_home(extra_lib_dir)) and
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/rob/pinokio/api/oobabooga.pinokio.git/text-generation-webui/installer_files/env/lib/python3.11/site-packages/torch/utils/cpp_extension.py", line 2416, in _join_cuda_home
raise OSError('CUDA_HOME environment variable is not set. '
OSError: CUDA_HOME environment variable is not set. Please set it to your CUDA install root.
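If I read the traceback right, the failure happens while torch.utils.cpp_extension tries to JIT-compile the exllamav2 extension, and that step expects a CUDA toolkit to be present. A minimal check of what PyTorch itself resolves for CUDA_HOME (a sketch, assuming it is run from the same bundled environment):

# torch.utils.cpp_extension resolves CUDA_HOME when it is imported;
# on a machine without a CUDA toolkit this comes back as None.
from torch.utils.cpp_extension import CUDA_HOME

print("resolved CUDA_HOME:", CUDA_HOME)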
System Info