Closed velteyn closed 7 months ago
Hello! Thank you for your interest! It should be an easy change to add support for LM Studio. I will add support within the next week.
Ollama has announced Windows support (along with support for AMD and Intel GPUs!)
I investigated supporting LM Studio, and I think the best way will be via support for custom OpenAI-compatible endpoints. This should allow a variety of endpoints, but because Ollama is adding Windows support I will not be working on this as urgently.
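A minimal sketch of what a custom OpenAI-compatible endpoint looks like from the client side. LM Studio's local server exposes the same `/v1/chat/completions` route as the OpenAI API; the base URL, port, and model name below are assumptions (both are configurable in LM Studio), and this is just an illustration, not this project's actual implementation:

```python
import json
import urllib.request

# Assumed defaults: LM Studio's local server typically listens on
# localhost:1234; the model name is whatever you loaded in the UI.
BASE_URL = "http://localhost:1234/v1"


def build_chat_request(base_url, model, prompt):
    """Build an OpenAI-style chat completion request (url, payload)."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    return f"{base_url}/chat/completions", payload


def send_chat_request(url, payload):
    """POST the payload; a local server needs no real API key."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read())
    return body["choices"][0]["message"]["content"]


# Example (hypothetical model name):
url, payload = build_chat_request(BASE_URL, "mistral-7b-instruct", "Hello!")
```

Because the request and response shapes match the OpenAI API, the same client code would work against OpenAI, LM Studio, or any other compatible endpoint just by swapping the base URL.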
If you would like to submit a pull request for LM Studio or OpenAI-compatible endpoints, I will review it.
Hello, as you certainly know, LM Studio is a simple tool for testing LLMs on a local PC. LM Studio also has a server that mimics the OpenAI API. Is it possible to configure this tool to use, for example, Mistral on the local PC? Thank you