kevinseabourne opened 7 months ago
You can do this!
If you go to cmd+shift+p > "Cursor Settings", then under "OpenAI API", toggle "OpenAI Base URL" and set the base URL you want (don't add a trailing "/").
You can use local models, other API providers (Together, Fireworks, etc.), or your own OpenAI-compatible endpoints for running Mistral/Mixtral.
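Since a trailing "/" breaks the override, it can help to normalize the URL before pasting it in. A minimal sketch (the helper name is hypothetical, not part of Cursor):

```python
def normalize_base_url(url: str) -> str:
    """Strip trailing slashes so the base URL matches what Cursor expects."""
    return url.rstrip("/")

# e.g. a Together-style OpenAI-compatible endpoint
print(normalize_base_url("https://api.together.xyz/v1/"))  # https://api.together.xyz/v1
```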
"Hello! Inferences happens through our backend which cannot access servers running locally on your computer. You’ll need to provide a publicly accessible URL." https://forum.cursor.sh/t/unable-to-use-lm-studio-with-override/2637/5
Add Mistral models: instead of only being able to use OpenAI models, allow people to choose other models?