turboderp / exui

Web UI for ExLlamaV2
MIT License
449 stars · 43 forks

Launching throws error with Exllamav2 stating "Tokenizers" are not found #47

Open Adzeiros opened 7 months ago

Adzeiros commented 7 months ago

Followed the install instructions; launching server.py in exui throws the error below:


```
  File "G:\_AI\exui\server.py", line 11, in <module>
    from backend.models import update_model, load_models, get_model_info, list_models, remove_model, load_model, unload_model, get_loaded_model
  File "G:\_AI\exui\backend\models.py", line 5, in <module>
    from exllamav2 import(
  File "C:\Users\Adzei\AppData\Local\Programs\Python\Python310\lib\site-packages\exllamav2\__init__.py", line 9, in <module>
    from exllamav2.tokenizer.tokenizer import ExLlamaV2Tokenizer
  File "C:\Users\Adzei\AppData\Local\Programs\Python\Python310\lib\site-packages\exllamav2\tokenizer\__init__.py", line 5, in <module>
    from exllamav2.tokenizer.hf import ExLlamaV2TokenizerHF
  File "C:\Users\Adzei\AppData\Local\Programs\Python\Python310\lib\site-packages\exllamav2\tokenizer\hf.py", line 4, in <module>
    from tokenizers import Tokenizer
ModuleNotFoundError: No module named 'tokenizers'
```
SirTurlock commented 6 months ago

A workaround is to install the "tokenizers" package, which does not seem to be included in the provided requirements.txt. Just run `pip install tokenizers` in your venv.

turboderp commented 6 months ago

`tokenizers` isn't included as a requirement because it isn't needed for all models, only those that weren't made with SentencePiece or don't include the `tokenizer.model` SPM file. I guess I could add a check and a more helpful error message.
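A minimal sketch of what such a check could look like (hypothetical, not the actual exllamav2 code; the helper name `require_module` and the message text are made up for illustration) wraps the optional import and raises an actionable error instead of a bare `ModuleNotFoundError`:

```python
import importlib


def require_module(name: str, hint: str):
    """Import an optional dependency, raising a clearer error if it is missing."""
    try:
        return importlib.import_module(name)
    except ModuleNotFoundError:
        # Re-raise with a message telling the user how to fix the problem.
        raise ModuleNotFoundError(
            f"Missing optional dependency '{name}'. {hint}"
        ) from None


# Hypothetical usage at the top of a module like tokenizer/hf.py:
# tokenizers = require_module(
#     "tokenizers",
#     "It is needed for models without a SentencePiece tokenizer.model file; "
#     "install it with: pip install tokenizers",
# )
```

This keeps `tokenizers` out of requirements.txt while making the failure self-explanatory for models that do need it.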