turboderp / exui

Web UI for ExLlamaV2
MIT License

NameError: name 'exllamav2_ext' is not defined #61

Closed: jwax33 closed this issue 1 month ago

jwax33 commented 1 month ago

```
(venv) F:\AI-Language\exui>python server.py
Traceback (most recent call last):
  File "F:\AI-Language\exui\server.py", line 11, in <module>
    from backend.models import update_model, load_models, get_model_info, list_models, remove_model, load_model, unload_model, get_loaded_model
  File "F:\AI-Language\exui\backend\models.py", line 5, in <module>
    from exllamav2 import(
  File "F:\AI-Language\exui\venv\lib\site-packages\exllamav2\__init__.py", line 3, in <module>
    from exllamav2.model import ExLlamaV2
  File "F:\AI-Language\exui\venv\lib\site-packages\exllamav2\model.py", line 35, in <module>
    from exllamav2.config import ExLlamaV2Config
  File "F:\AI-Language\exui\venv\lib\site-packages\exllamav2\config.py", line 5, in <module>
    from exllamav2.fasttensors import STFile
  File "F:\AI-Language\exui\venv\lib\site-packages\exllamav2\fasttensors.py", line 6, in <module>
    from exllamav2.ext import exllamav2_ext as ext_c
  File "F:\AI-Language\exui\venv\lib\site-packages\exllamav2\ext.py", line 291, in <module>
    ext_c = exllamav2_ext
NameError: name 'exllamav2_ext' is not defined
```

I get this when I run python server.py. I have exllamav2 installed; I've tried installing it from the linked precompiled wheels as well as from pip, and I still get the error. Not sure what to do next.
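
For reference, a minimal diagnostic sketch like the one below (it assumes the prebuilt wheel installs a top-level exllamav2_ext module) usually surfaces the real failure, such as a DLL load error or ABI mismatch, that ends up reported as this NameError:

```python
# Diagnostic sketch (assumption: the prebuilt exllamav2 wheel installs a
# top-level exllamav2_ext module): importing it directly shows the real error.
import torch
print("torch", torch.__version__, "cuda", torch.version.cuda)

import exllamav2_ext  # fails with the underlying loader error if the wheel is broken
print("extension loaded from", exllamav2_ext.__file__)
```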

turboderp commented 1 month ago

Which wheel version did you use and what's your version of PyTorch?

jwax33 commented 1 month ago

Hello,

Wheel version: exllamav2-0.2.0+cu118.torch2.2.0-cp310-cp310-win_amd64.whl
PyTorch: 2.2.0+cu121
Python: 3.10.6
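
As a cross-check, something along these lines prints the local versions to compare against the wheel's cuXXX / torchY.Z / cpXXX tags:

```python
# Environment check to compare against the wheel tag, e.g.
# exllamav2-0.2.0+cu118.torch2.2.0-cp310-cp310-win_amd64.whl
import sys, torch
print("python", sys.version.split()[0])   # compare with the cp310 tag
print("torch", torch.__version__)         # compare with the torch2.2.0 tag
print("cuda", torch.version.cuda)         # compare with the cu118 tag
```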

turboderp commented 1 month ago

That all looks reasonable.

You may have a broken extension cache, I've seen that a few times. Try deleting the C:\Users\<username>\AppData\Local\torch_extensions folder.
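
If it's easier than hunting down the folder by hand, a small sketch along these lines (assuming the default cache location and no TORCH_EXTENSIONS_DIR override) clears it:

```python
# Sketch: remove the torch extension build cache so it is rebuilt on next import.
# Assumes the default Windows location; adjust if TORCH_EXTENSIONS_DIR is set.
import os, shutil

cache_dir = os.path.join(os.environ["LOCALAPPDATA"], "torch_extensions")
if os.path.isdir(cache_dir):
    shutil.rmtree(cache_dir)
    print("removed", cache_dir)
else:
    print("no cache found at", cache_dir)
```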

You may also want to update Python to 3.11 and Torch to 2.4.

What's your GPU, by the way?

jwax33 commented 1 month ago

I deleted the extension cache without success. I've upgraded to Python 3.11 and Torch 2.4. Now I'm getting this error:

```
(venv) F:\AI-Language\exui>python server.py
Traceback (most recent call last):
  File "F:\AI-Language\exui\server.py", line 11, in <module>
    from backend.models import update_model, load_models, get_model_info, list_models, remove_model, load_model, unload_model, get_loaded_model
  File "F:\AI-Language\exui\backend\models.py", line 5, in <module>
    from exllamav2 import(
  File "F:\AI-Language\exui\venv\Lib\site-packages\exllamav2\__init__.py", line 3, in <module>
    from exllamav2.model import ExLlamaV2
  File "F:\AI-Language\exui\venv\Lib\site-packages\exllamav2\model.py", line 35, in <module>
    from exllamav2.config import ExLlamaV2Config
  File "F:\AI-Language\exui\venv\Lib\site-packages\exllamav2\config.py", line 5, in <module>
    from exllamav2.fasttensors import STFile
  File "F:\AI-Language\exui\venv\Lib\site-packages\exllamav2\fasttensors.py", line 3, in <module>
    from safetensors import safe_open
  File "F:\AI-Language\exui\venv\Lib\site-packages\safetensors\__init__.py", line 2, in <module>
    from ._safetensors_rust import (  # noqa: F401
ModuleNotFoundError: No module named 'safetensors._safetensors_rust'
```

turboderp commented 1 month ago

You may have to reinstall libraries like safetensors after updating Torch.
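
For instance, after a forced reinstall inside the venv (pip install --upgrade --force-reinstall safetensors), a quick check like this confirms the native extension loads again:

```python
# Quick check that the safetensors native (Rust) extension imports cleanly
# after reinstalling it against the updated Torch environment.
from safetensors import safe_open  # raises ModuleNotFoundError if the install is still broken
import safetensors
print("safetensors", safetensors.__version__, "imported OK")
```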