ParisNeo / lollms-webui

Lord of Large Language Models Web User Interface
https://lollms.com
Apache License 2.0
4.33k stars 544 forks

Unable to use exllama binding #403

Open athos54 opened 1 year ago

athos54 commented 1 year ago

Expected Behavior

Be able to use exllama binding

Current Behavior

Impossible to use the exllama binding.

I get the error:

Traceback (most recent call last):
  File "E:\data_test\bindings_zoo\exllama2\__init__.py", line 114, in build_model
    from exllamav2 import (
ModuleNotFoundError: No module named 'exllamav2'

Steps to Reproduce

1. Clone the repo.
2. Run scripts\windows\win_install.bat.
3. Once the app runs, select the exllama2 binding.

Context

Windows 10, RTX 3090

Screenshots

(video of the entire process: https://youtu.be/UpXPpFnTC8A)

tye-singwa commented 1 year ago

FYI, it seems I've found a workaround for now: you can manually add the exllamav2 dependency to bindings_zoo/exllama2/requirements.txt, something like:

...
websockets
regex
exllamav2 # add here

Then reload the exllama2 binding from the UI.
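As a quick sanity check after editing requirements.txt, you can verify that the package actually resolves in the environment lollms runs in. This is a stdlib-only sketch; binding_dependency_available is a hypothetical helper, not part of lollms:

```python
import importlib.util

def binding_dependency_available(module_name):
    """Return True if the named module can be imported from this environment."""
    return importlib.util.find_spec(module_name) is not None

# If this prints False, the requirement was installed into a different
# environment than the one running lollms-webui.
print(binding_dependency_available("exllamav2"))
```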

athos54 commented 1 year ago

Thanks very much for your reply. I think we're close, but now I'm getting another error:

Traceback (most recent call last):
  File "E:\finalLollms\lollms-webui\env\lib\site-packages\exllamav2\ext.py", line 14, in <module>
    import exllamav2_ext
ModuleNotFoundError: No module named 'exllamav2_ext'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "E:\finalLollms\data\bindings_zoo\exllama2\__init__.py", line 114, in build_model
    from exllamav2 import (
  File "E:\finalLollms\lollms-webui\env\lib\site-packages\exllamav2\__init__.py", line 3, in <module>
    from exllamav2.model import ExLlamaV2
  File "E:\finalLollms\lollms-webui\env\lib\site-packages\exllamav2\model.py", line 11, in <module>
    from exllamav2.cache import ExLlamaV2Cache
  File "E:\finalLollms\lollms-webui\env\lib\site-packages\exllamav2\cache.py", line 2, in <module>
    from exllamav2.ext import exllamav2_ext as ext_c
  File "E:\finalLollms\lollms-webui\env\lib\site-packages\exllamav2\ext.py", line 124, in <module>
    exllamav2_ext = load \
  File "E:\finalLollms\lollms-webui\env\lib\site-packages\torch\utils\cpp_extension.py", line 1308, in load
    return _jit_compile(
  File "E:\finalLollms\lollms-webui\env\lib\site-packages\torch\utils\cpp_extension.py", line 1736, in _jit_compile
    return _import_module_from_library(name, build_directory, is_python_module)
  File "E:\finalLollms\lollms-webui\env\lib\site-packages\torch\utils\cpp_extension.py", line 2136, in _import_module_from_library
    module = importlib.util.module_from_spec(spec)
ImportError: DLL load failed while importing exllamav2_ext: No se puede encontrar el módulo especificado. (The specified module could not be found.)

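That "DLL load failed" comes from torch's JIT build of the exllamav2_ext C++/CUDA extension; it usually means the build toolchain or the CUDA toolkit it links against is missing from the machine. A stdlib-only diagnostic sketch (the helper name and the exact set of checks are my assumptions; the real requirements depend on your torch/CUDA versions):

```python
import os
import shutil

def jit_build_environment_report():
    """Collect facts relevant to 'DLL load failed' when torch JIT-compiles an extension."""
    return {
        # MSVC's cl.exe (Windows) or g++ (Linux) must be reachable for torch.utils.cpp_extension.load
        "compiler_on_path": shutil.which("cl") or shutil.which("g++"),
        # nvcc is needed to build the CUDA parts of the extension
        "nvcc_on_path": shutil.which("nvcc"),
        # common environment variables used to locate the CUDA toolkit
        "cuda_home": os.environ.get("CUDA_HOME") or os.environ.get("CUDA_PATH"),
    }

print(jit_build_environment_report())
```

If any of these come back empty on the machine that hit the error, that is the first thing to fix before retrying the binding.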
ParisNeo commented 1 year ago

Hi, and sorry for the late reply. Last week I worked on major upgrades to the UI, and it now has a new, more stable structure. Please consider reinstalling using the latest stable release installer. This should be the final structure.

athos54 commented 1 year ago

Hi @ParisNeo , thanks for your reply.

Now I have exllama installed, but I've run into another error when trying to download a model. To be able to download a model I had to do two things.

First, I had to comment out some code in the get_file_size function because it was blowing up:

image
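For the get_file_size crash, a defensive variant could simply return None on any failure instead of raising, so the UI keeps working when a size lookup fails. This is an illustrative sketch, not the real lollms implementation:

```python
import urllib.request

def get_file_size(url):
    """Return the remote file size in bytes via a HEAD request, or None on any failure."""
    try:
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req, timeout=10) as resp:
            size = resp.headers.get("Content-Length")
            return int(size) if size is not None else None
    except Exception:
        return None  # fail soft instead of crashing the download flow
```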

Second:

image

I had to parse the repo variable because it contained the complete URL, so the if on line 402 (line number in my copy, because of the prints I added) received https://huggingface.co/TheBloke/airoboros-33B-GPT4-2.0-GPTQ/resolve/main/model.safetensors instead of TheBloke/airoboros-33B-GPT4-2.0-GPTQ.

I'm not a Python programmer and I don't understand the structure of the code very well, so I don't think I can make a PR, but I hope this information is useful to you.

Here is the function to parse the URL. Sorry about the language; ChatGPT wanted to write code in Spanish today :)

def obtener_nombre_modelo_desde_url(url):
    """Extract the model path from a URL ('get model name from URL')."""
    from urllib.parse import urlparse
    parsed_url = urlparse(url)
    # Split the URL path and drop empty segments, then rejoin with '/'.
    # Note: this returns the full path (owner/name/resolve/main/file), not just owner/name.
    path_segments = [segment for segment in parsed_url.path.split('/') if segment != '']
    nombre_modelo = '/'.join(path_segments)
    return nombre_modelo
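If the download code expects just the owner/name repo id (e.g. TheBloke/airoboros-33B-GPT4-2.0-GPTQ rather than the full file path), a variant that truncates to the first two path segments would look like this. The function name and the assumption about what the check on line 402 expects are mine:

```python
from urllib.parse import urlparse

def repo_id_from_url(url):
    """Reduce a full Hugging Face file URL to its owner/name repo id."""
    segments = [s for s in urlparse(url).path.split('/') if s]
    # Keep only the first two segments: the owner and the repository name.
    return '/'.join(segments[:2])

url = "https://huggingface.co/TheBloke/airoboros-33B-GPT4-2.0-GPTQ/resolve/main/model.safetensors"
print(repo_id_from_url(url))  # TheBloke/airoboros-33B-GPT4-2.0-GPTQ
```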