A fast inference library for running LLMs locally on modern consumer-class GPUs
Added a mention of lollms-webui as another possible webui that can be used with exllamav2 as a backend #352
Closed
ParisNeo closed 4 months ago
Hi, I just added a single line to the supported webuis section mentioning lollms, which has actually supported exllamav2 as a backend for a very long time :)