Closed: jwax33 closed this issue 1 month ago.
Which wheel version did you use and what's your version of PyTorch?
Hello,

- Wheel version: `exllamav2-0.2.0+cu118.torch2.2.0-cp310-cp310-win_amd64.whl`
- PyTorch: 2.2.0+cu121
- Python: 3.10.6
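As an aside, a quick way to sanity-check that a prebuilt wheel matches the environment is to parse the version tags embedded in its filename and compare them with the installed interpreter and Torch build (`torch.__version__` and `torch.version.cuda`). This is a minimal sketch using only the standard library; the filename format is the one used by the exllamav2 release wheels above:

```python
import re

def parse_exllamav2_wheel(filename: str) -> dict:
    """Split an exllamav2 wheel filename into its version/tag components."""
    # Example: exllamav2-0.2.0+cu118.torch2.2.0-cp310-cp310-win_amd64.whl
    m = re.match(
        r"(?P<pkg>\w+)-(?P<version>[\d.]+)"
        r"\+(?P<cuda>cu\d+)\.torch(?P<torch>[\d.]+)"
        r"-(?P<python>cp\d+)-cp\d+-(?P<platform>\w+)\.whl",
        filename,
    )
    if m is None:
        raise ValueError(f"unrecognized wheel filename: {filename}")
    return m.groupdict()

info = parse_exllamav2_wheel(
    "exllamav2-0.2.0+cu118.torch2.2.0-cp310-cp310-win_amd64.whl"
)
# Compare info["torch"] against torch.__version__ and info["cuda"]
# against torch.version.cuda in the target environment.
print(info)
```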
That all looks reasonable.
You may have a broken extension cache; I've seen that a few times. Try deleting the `C:\Users\<username>\AppData\Local\torch_extensions` folder.
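For reference, here is a hedged sketch of locating (and optionally clearing) that cache programmatically. It assumes the default per-user location PyTorch uses for JIT-built extensions, which the `TORCH_EXTENSIONS_DIR` environment variable can override; the exact default path may differ across Torch versions:

```python
import os
import shutil

def torch_extensions_cache() -> str:
    """Best-effort guess at the torch JIT-extension cache directory.

    TORCH_EXTENSIONS_DIR overrides the default; otherwise assume the
    per-user cache location seen in this thread (%LOCALAPPDATA% on
    Windows, ~/.cache elsewhere).
    """
    override = os.environ.get("TORCH_EXTENSIONS_DIR")
    if override:
        return override
    if os.name == "nt":
        base = os.environ.get("LOCALAPPDATA", os.path.expanduser("~"))
    else:
        base = os.path.expanduser("~/.cache")
    return os.path.join(base, "torch_extensions")

def clear_extension_cache(dry_run: bool = True) -> str:
    """Delete the cache directory (or just report it when dry_run is True)."""
    path = torch_extensions_cache()
    if not dry_run and os.path.isdir(path):
        shutil.rmtree(path, ignore_errors=True)
    return path
```

Deleting the folder is safe in the sense that Torch simply rebuilds the extensions on the next import, so a stale or corrupted build gets replaced.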
You may also want to update Python to 3.11 and Torch to 2.4.
What's your GPU, by the way?
I deleted the extension cache without success. I've upgraded to Python 3.11 and Torch 2.4. Now I'm getting this error:
```
(venv) F:\AI-Language\exui>python server.py
Traceback (most recent call last):
  File "F:\AI-Language\exui\server.py", line 11, in <module>
    from backend.models import update_model, load_models, get_model_info, list_models, remove_model, load_model, unload_model, get_loaded_model
  File "F:\AI-Language\exui\backend\models.py", line 5, in <module>
    from exllamav2 import (
  File "F:\AI-Language\exui\venv\Lib\site-packages\exllamav2\__init__.py", line 3, in <module>
    from exllamav2.model import ExLlamaV2
  File "F:\AI-Language\exui\venv\Lib\site-packages\exllamav2\model.py", line 35, in <module>
    from exllamav2.config import ExLlamaV2Config
  File "F:\AI-Language\exui\venv\Lib\site-packages\exllamav2\config.py", line 5, in <module>
    from exllamav2.fasttensors import STFile
  File "F:\AI-Language\exui\venv\Lib\site-packages\exllamav2\fasttensors.py", line 3, in <module>
    from safetensors import safe_open
  File "F:\AI-Language\exui\venv\Lib\site-packages\safetensors\__init__.py", line 2, in <module>
    from ._safetensors_rust import (  # noqa: F401
ModuleNotFoundError: No module named 'safetensors._safetensors_rust'
```
You may have to reinstall libraries like safetensors after updating Torch.
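One way to confirm that diagnosis without triggering the full import chain is to probe for the compiled component directly. A generic sketch; `safetensors._safetensors_rust` is simply the native module named in the traceback, and the suggested remedy is pip's standard `--force-reinstall` flag:

```python
import importlib.util

def has_native_module(name: str) -> bool:
    """Return True if `name` resolves to an importable module spec."""
    try:
        return importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:
        # Parent package missing or broken, e.g. a stale install whose
        # compiled component was built for a different interpreter.
        return False

# If this prints False, force-reinstall the package, e.g.:
#   pip install --force-reinstall safetensors
print(has_native_module("safetensors._safetensors_rust"))
```

Reinstalling pulls a wheel built for the current Python version, which is what goes stale when the interpreter is upgraded (here, 3.10 to 3.11).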
I get this when I run `python server.py`. I have exllamav2 installed; I've tried installing it both from the precompiled wheels linked above and via pip, and I still get this error. Not sure what to do next.