Open flowingblaze opened 5 months ago
Thanks for raising a feature request!
Looks like LM Studio exposes an OpenAI-compatible API, so this should be straightforward to implement. It would take a little re-engineering of the settings UI, which I need to think through a bit.
If you need this immediately, you can fork the repo, replace the OpenAI API calls' base URL with LM Studio's local URL, and run the plugin in development mode on your machine.
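A minimal sketch of that base-URL swap, assuming the plugin hits the Chat Completions endpoint: `http://localhost:1234/v1` is LM Studio's default local server address, and `local-model` is just a placeholder name (a local server serves whatever model you have loaded), so adjust both to your setup:

```typescript
// Point OpenAI-style requests at the local LM Studio server
// instead of https://api.openai.com/v1.
const LM_STUDIO_BASE_URL = "http://localhost:1234/v1"; // LM Studio default

// Build the request separately from sending it, so the URL and body
// are easy to inspect and test.
function buildChatRequest(baseUrl: string, prompt: string) {
  return {
    url: `${baseUrl}/chat/completions`,
    init: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      // No API key is needed for a local server; the model name is a
      // placeholder since LM Studio uses the currently loaded model.
      body: JSON.stringify({
        model: "local-model",
        messages: [{ role: "user", content: prompt }],
      }),
    },
  };
}

async function chat(prompt: string): Promise<string> {
  const { url, init } = buildChatRequest(LM_STUDIO_BASE_URL, prompt);
  const res = await fetch(url, init);
  const data = await res.json();
  return data.choices[0].message.content;
}
```

Since the request shape matches OpenAI's, the rest of the plugin's response handling should work unchanged.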
Otherwise, if you'd like to play around with local models in the meantime, check out Ollama mode!
The plugin currently offers two options, Ollama and ChatGPT, but I was wondering if you could add support for LM Studio?