AINXTGENStudio opened 3 weeks ago
Hi, thank you so much <3. Actually, there is an Ollama integration for now. But yes, LM Studio could be a good addition too.
Thanks for the consideration. Most of these UIs use llama.cpp under the hood, just like Ollama. LM Studio's UI is very user friendly, can load any GGUF model (and its quant variants) directly from a Hugging Face URL, and also includes a local server feature that your project could connect to; by default it listens at http://localhost:1234/v1.
I believe this project has huge potential if it offers a fully local and private option. For example, an LM Studio integration would allow using any downloaded LLM or VLM, such as Phi-3 Vision, and integrating xVASynth or XTTS would cover local TTS.
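To illustrate the server feature mentioned above: LM Studio's local server exposes an OpenAI-compatible HTTP API, so the integration would mostly mean POSTing chat-completion requests to that endpoint. Here is a minimal sketch of building such a request; the base URL is LM Studio's documented default, while the function name and the "local-model" placeholder are just illustrative assumptions, not part of any existing codebase.

```python
import json

# LM Studio's default local server address (configurable in its UI).
BASE_URL = "http://localhost:1234/v1"

def build_chat_request(prompt, model="local-model"):
    """Build the JSON body for a POST to {BASE_URL}/chat/completions.

    The "model" field is a placeholder here: LM Studio typically serves
    whichever model the user has loaded locally.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }

body = build_chat_request("Describe this project in one sentence.")
# Send with any HTTP client, e.g. urllib.request or requests:
#   POST f"{BASE_URL}/chat/completions"
#   headers: {"Content-Type": "application/json"}
#   data: json.dumps(body).encode()
print(json.dumps(body, indent=2))
```

Because the endpoint mirrors the OpenAI API shape, existing OpenAI client code can usually be pointed at it just by overriding the base URL, which keeps everything local and private.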