turboderp / exllamav2

A fast inference library for running LLMs locally on modern consumer-class GPUs
MIT License

Added a mention of lollms-webui as another possible webui that can be used with exllamav2 as a backend #352

Closed by ParisNeo 4 months ago

ParisNeo commented 4 months ago

Hi, I just added a single line to the supported web UIs section mentioning lollms, which has actually supported exllamav2 as a backend for a long time :)
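For context, a backend integration like this one essentially wraps exllamav2's Python generator API. Here is a minimal sketch of that flow, adapted from the library's own examples; the model path and sampling settings are placeholders:

```python
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

# Point the config at a local model directory (placeholder path;
# an EXL2-quantized model directory works the same way).
config = ExLlamaV2Config()
config.model_dir = "/path/to/model"
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy = True)
model.load_autosplit(cache)  # split weights across available GPUs
tokenizer = ExLlamaV2Tokenizer(config)

# Simple one-shot generation.
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)
settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8  # illustrative value

print(generator.generate_simple("Hello, my name is", settings, num_tokens = 64))
```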


ParisNeo commented 4 months ago

I just added EXL2 support too and updated the binding card in lollms.

turboderp commented 4 months ago

Alright.