Closed RandomNiiChan closed 9 months ago
Never heard of LM Studio, but that looks interesting. Since it exposes an OpenAI-compatible API, this should be really easy to add. Great suggestion! I will add it as soon as I find the time 😄
Thanks again for the cool idea!
Just released a version that supports local LLMs. Just select "Custom (e.g. Local)" and throw the URL into the "Custom URL" field. I tried it with LM Studio running StableLM Zephyr 3B
and it worked 👍 Feel free to test it and check whether anything isn't working as expected 😀
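For anyone who wants to verify their setup outside the app first, here is a minimal sketch of hitting LM Studio's OpenAI-compatible endpoint directly. The base URL reflects LM Studio's default local server address, but yours may differ, and the model name is just a placeholder:

```python
# Minimal sketch: querying a local LM Studio server through its
# OpenAI-compatible chat completions endpoint.
# Assumptions: LM Studio's default server address (http://localhost:1234/v1)
# and a placeholder model name -- adjust both for your setup.
import json
import urllib.request

BASE_URL = "http://localhost:1234/v1"  # what you would paste into "Custom URL"

def build_chat_request(prompt, model="stablelm-zephyr-3b"):
    """Build an OpenAI-style chat completion request (model name is a placeholder)."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

# With the server running, send the request and print the reply:
# with urllib.request.urlopen(build_chat_request("Hello!")) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

If this returns a normal chat completion, the same URL should work in the "Custom URL" field.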
LM Studio makes it easy to run any Hugging Face GGUF LLM locally (Llama, Mistral, etc.) and expose it via an API endpoint.
Lightweight models give satisfying results and run comfortably on 16 GB RAM and 8 GB VRAM. Would it be possible to add support for a self-hosted API server in the AI section?