Generate and brainstorm ideas while creating your notes using Large Language Models (LLMs) from Ollama, LM Studio, Anthropic, Google Gemini, Mistral AI, OpenAI, and more for Obsidian.
I can't seem to get this extension to work with LM Studio. I've successfully used my server with other software, so I know the server works.
I have CORS enabled. I'm serving on the local network. I've set the port and ensured that the URL in BMO Chatbot includes the correct port for my LM Studio server. No glory. It can't detect the model. I just see No Model in the dropdown, and refreshing results in:
[ERROR] Unexpected endpoint or method. (GET /api/tags). Returning 200 anyway
I've tried multiple LLMs and I've tried every prompt template in the book. Am I missing something, or is this plugin broken for LM Studio?
Never mind, ChatGPT to the rescue. Apparently I had misunderstood that there are two different URLs, and you need to set them both up. Once I set up
Ollama Rest API URL: http://localhost:/v1/chat/completions
and
REST API URL: http://localhost:4444/v1
I have victory!
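For anyone else hitting this: the error in the log above is the giveaway. `GET /api/tags` is Ollama's model-listing endpoint, which LM Studio's OpenAI-compatible server doesn't implement — it lists models at `/v1/models` instead. The sketch below just spells out the paths involved; the port 4444 is taken from the working config above, and the Ollama port shown is that tool's common default, so adjust both to your own setup:

```python
# Base URL from the working "REST API URL" setting above (port is an example).
base_url = "http://localhost:4444/v1"

# LM Studio's OpenAI-compatible server exposes these paths:
models_url = base_url + "/models"           # GET  -> lists the loaded models
chat_url = base_url + "/chat/completions"   # POST -> chat requests

# Ollama lists models at a different path entirely, which is why a plugin
# probing an LM Studio server with an Ollama-style request produces
# "[ERROR] Unexpected endpoint or method. (GET /api/tags)":
ollama_tags_url = "http://localhost:11434/api/tags"  # Ollama's common default port

print(models_url)  # http://localhost:4444/v1/models
print(chat_url)    # http://localhost:4444/v1/chat/completions
```

So the "No Model" dropdown just means the plugin was asking the wrong kind of server for its model list until both URL settings pointed at the right endpoints.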