ParisNeo / lollms-webui

Lord of Large Language Models Web User Interface
https://lollms.com
Apache License 2.0

Can't install binding VLLm - Key error 'enable_gpu' #471

Closed ba2512005 closed 7 months ago

ba2512005 commented 8 months ago

Expected Behavior

The vLLM binding should install without errors.

Current Behavior

Installing the binding fails with the error message `Couldn't build binding: ['enable_gpu']`.

Steps to Reproduce


  1. Install lollms through `linux_install.sh` on Ubuntu 22
  2. Go to Settings and try to install the vLLM binding to run an AWQ model
  3. The installation fails with the error above

Possible Solution


Seems like the `enable_gpu` key is read from `self.config`, which is pulling from `LOLLMSConfig`, imported from `lollms.binding` instead of `lollms.config`. I'm not able to find a binding-specific config in there.
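As a side note on why this surfaces as a raw `KeyError` rather than something a default could absorb: the traceback shows the config's `__getattr__` doing a bare dict lookup (`return self.config[key]`). In that case even `getattr(config, "enable_gpu", False)` cannot help, because `getattr`'s default only catches `AttributeError`, not `KeyError`. A minimal sketch of the failure mode (the `BadConfig`/`GoodConfig` names are hypothetical, not lollms code):

```python
class BadConfig:
    """Mimics a config whose __getattr__ indexes a dict directly."""
    def __init__(self, data):
        object.__setattr__(self, "config", data)

    def __getattr__(self, key):
        return self.config[key]  # raises KeyError for missing keys


class GoodConfig(BadConfig):
    def __getattr__(self, key):
        try:
            return self.config[key]
        except KeyError:
            # Raising AttributeError lets getattr(obj, key, default) work.
            raise AttributeError(key) from None


bad = BadConfig({"model": "awq"})
good = GoodConfig({"model": "awq"})

# getattr's default only suppresses AttributeError, so the KeyError escapes:
try:
    getattr(bad, "enable_gpu", False)
except KeyError:
    print("KeyError escapes even with a getattr default")

print(getattr(good, "enable_gpu", False))  # prints False
```

So either the installed config needs to actually contain `enable_gpu`, or the config class would need to translate missing keys into `AttributeError` before callers could fall back to a default.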

Context

```
---------------------- Installing vLLM ----------------------
Couldn't build binding: ['enable_gpu']
Traceback (most recent call last):
  File "/home//ai/lollms-webui/scripts/linux/lollms-webui/lollms_core/lollms/server/endpoints/lollms_binding_infos.py", line 163, in install_binding
    lollmsElfServer.binding = BindingBuilder().build_binding(lollmsElfServer.config, lollmsElfServer.lollms_paths, InstallOption.FORCE_INSTALL, lollmsCom=lollmsElfServer)
  File "/home//ai/lollms-webui/scripts/linux/lollms-webui/lollms_core/lollms/binding.py", line 565, in build_binding
    return binding(
  File "/home//ai/lollms-webui/scripts/linux/lollms-webui/zoos/bindings_zoo/vLLM/__init__.py", line 80, in __init__
    super().__init__(
  File "/home//ai/lollms-webui/scripts/linux/lollms-webui/lollms_core/lollms/binding.py", line 82, in __init__
    self.install()
  File "/home//ai/lollms-webui/scripts/linux/lollms-webui/zoos/bindings_zoo/vLLM/__init__.py", line 194, in install
    check_and_install_torch(self.config.enable_gpu, version=2.1)
  File "/home//ai/lollms-webui/scripts/linux/lollms-webui/lollms_core/lollms/config.py", line 298, in __getattr__
    return self.config[key]
KeyError: 'enable_gpu'
```
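A possible stopgap, assuming the lollms config supports dict-style item assignment (the traceback shows attribute access falls through to `self.config[key]`, i.e. a dict lookup): seed the missing key before triggering the install. The snippet below only illustrates the idea with a plain dict standing in for the real config object; how lollms persists its config is project-specific.

```python
# Stand-in for the lollms config: the traceback shows attribute access
# delegates to self.config[key], i.e. an ordinary dict lookup.
config = {"binding_name": "vLLM"}

# Seed the key the vLLM installer expects before it is read.
# (Whether True is the right value depends on your GPU setup.)
config.setdefault("enable_gpu", True)

print(config["enable_gpu"])  # prints True
```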
