oobabooga / text-generation-webui

A Gradio web UI for Large Language Models.
GNU Affero General Public License v3.0

Failing to launch using llama-cpp due to missing exllama dependency #3983

Closed CubeTheThird closed 1 year ago

CubeTheThird commented 1 year ago

Describe the bug

When attempting to launch using a setup based on requirements_nocuda.txt, an error reports a missing exllama dependency, which was previously not required for llama-cpp. After cloning the exllama repo into the repositories directory, a similar message is displayed, but with a different stack trace.
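For context, the failure mode in the logs below is a hard import of the exllama module at load time. One common way to make such a dependency optional is a guarded module-level import; this is only a sketch of that pattern (the names `HAS_EXLLAMA` and `make_generator` are illustrative, not the webui's actual code):

```python
# Sketch of an optional-dependency guard: backends that don't need
# exllama (e.g. llama-cpp) can still start when it is not installed.
try:
    from exllama.generator import ExLlamaGenerator  # optional dependency
    HAS_EXLLAMA = True
except ImportError:
    ExLlamaGenerator = None
    HAS_EXLLAMA = False

def make_generator(*args, **kwargs):
    """Create a generator, failing with a clear message if exllama is absent."""
    if not HAS_EXLLAMA:
        raise RuntimeError(
            "exllama is not installed; clone it into repositories/ "
            "or install the package to use this loader."
        )
    return ExLlamaGenerator(*args, **kwargs)
```

With this guard, importing the module never fails; the error surfaces only when an exllama loader is actually selected.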

Is there an existing issue for this?

Reproduction

Screenshot

No response

Logs

## Log without exllama

2023-09-17 20:34:09 WARNING:exllama module failed to import. Will attempt to import from repositories/.
2023-09-17 20:34:09 ERROR:Could not find repositories/exllama. Please ensure that exllama (https://github.com/turboderp/exllama) is cloned inside repositories/ and is up to date.
Traceback (most recent call last):
  File "/path/text-generation-webui/modules/exllama.py", line 13, in <module>
    from exllama.generator import ExLlamaGenerator
ModuleNotFoundError: No module named 'exllama'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/path/text-generation-webui/server.py", line 29, in <module>
    from modules import (
  File "/path/text-generation-webui/modules/ui_default.py", line 3, in <module>
    from modules import logits, shared, ui, utils
  File "/path/text-generation-webui/modules/logits.py", line 4, in <module>
    from modules.exllama import ExllamaModel
  File "/path/text-generation-webui/modules/exllama.py", line 22, in <module>
    from generator import ExLlamaGenerator
ModuleNotFoundError: No module named 'generator'

## Log with exllama in repositories directory

2023-09-17 20:34:43 WARNING:exllama module failed to import. Will attempt to import from repositories/.
2023-09-17 20:34:43 ERROR:Could not find repositories/exllama. Please ensure that exllama (https://github.com/turboderp/exllama) is cloned inside repositories/ and is up to date.
Traceback (most recent call last):
  File "/path/text-generation-webui/modules/exllama.py", line 13, in <module>
    from exllama.generator import ExLlamaGenerator
ModuleNotFoundError: No module named 'exllama'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/path/text-generation-webui/server.py", line 29, in <module>
    from modules import (
  File "/path/text-generation-webui/modules/ui_default.py", line 3, in <module>
    from modules import logits, shared, ui, utils
  File "/path/text-generation-webui/modules/logits.py", line 4, in <module>
    from modules.exllama import ExllamaModel
  File "/path/text-generation-webui/modules/exllama.py", line 22, in <module>
    from generator import ExLlamaGenerator
  File "/path/text-generation-webui/repositories/exllama/generator.py", line 1, in <module>
    import cuda_ext
  File "/path/text-generation-webui/repositories/exllama/cuda_ext.py", line 43, in <module>
    exllama_ext = load(
  File "/home/user/.conda/envs/textgen2/lib/python3.10/site-packages/torch/utils/cpp_extension.py", line 1284, in load
    return _jit_compile(
  File "/home/user/.conda/envs/textgen2/lib/python3.10/site-packages/torch/utils/cpp_extension.py", line 1509, in _jit_compile
    _write_ninja_file_and_build_library(
  File "/home/user/.conda/envs/textgen2/lib/python3.10/site-packages/torch/utils/cpp_extension.py", line 1601, in _write_ninja_file_and_build_library
    extra_ldflags = _prepare_ldflags(
  File "/home/user/.conda/envs/textgen2/lib/python3.10/site-packages/torch/utils/cpp_extension.py", line 1699, in _prepare_ldflags
    extra_ldflags.append(f'-L{_join_cuda_home("lib64")}')
  File "/home/user/.conda/envs/textgen2/lib/python3.10/site-packages/torch/utils/cpp_extension.py", line 2223, in _join_cuda_home
    raise EnvironmentError('CUDA_HOME environment variable is not set. '
OSError: CUDA_HOME environment variable is not set. Please set it to your CUDA install root.
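Note that this second traceback is no longer about the missing module: PyTorch's JIT build of exllama's CUDA extension requires the `CUDA_HOME` environment variable. On a machine that actually has an NVIDIA CUDA toolkit, it can be set before launching; the path below is illustrative only (the real toolkit root varies by distro, e.g. `/usr/local/cuda` on many systems):

```shell
# Illustrative only: point CUDA_HOME at the CUDA toolkit root.
export CUDA_HOME=/opt/cuda
export PATH="$CUDA_HOME/bin:$PATH"
```

On the reporter's AMD RX 6700XT (ROCm) setup there is no CUDA toolkit to point at, so setting `CUDA_HOME` is not a fix here; the real resolution was making the exllama import optional (see below).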

System Info

Arch Linux
AMD RX 6700XT
AMD Ryzen 5800X CPU
CubeTheThird commented 1 year ago

Fixed with https://github.com/oobabooga/text-generation-webui/commit/b062d50c451ced2fdf884e2e030954f98a419b55
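The linked commit resolves the issue. A common shape for this kind of fix (a sketch of the general technique, not necessarily what the commit does) is deferring the import until the model class is instantiated, so loaders that never use exllama never touch it:

```python
# Sketch of a deferred import: the optional dependency is imported
# only when the loader is actually constructed, so merely importing
# this module does not require exllama to be installed.
class ExllamaModel:
    def __init__(self, model_path):
        # Import inside the constructor; an ImportError is raised only
        # if a user actually selects the exllama loader.
        from exllama.generator import ExLlamaGenerator
        self.generator_cls = ExLlamaGenerator  # illustrative wiring
        self.model_path = model_path
```

Defining the class succeeds even without exllama installed; the dependency is only resolved on first use.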