longy2k / obsidian-bmo-chatbot

Generate and brainstorm ideas while creating your notes using Large Language Models (LLMs) from Ollama, LM Studio, Anthropic, Google Gemini, Mistral AI, OpenAI, and more for Obsidian.
https://ko-fi.com/longy2k
MIT License
352 stars · 45 forks

LM Studio returns Unexpected endpoint or method. #113

Open skunkmonkey opened 1 week ago

skunkmonkey commented 1 week ago

I can't seem to get this extension to work with LM Studio. I've successfully used my server with other software, so I know the server works. I have CORS enabled. I'm serving on the local network. I've set the port and ensured that the URL in BMO Chatbot includes the correct port for my LM Studio server. No glory. It can't detect the model. I just see No Model in the dropdown, and refreshing results in: [ERROR] Unexpected endpoint or method. (GET /api/tags). Returning 200 anyway

I've tried multiple LLMs and I've tried every prompt template in the book. Am I missing something, or is this plugin broken for LM Studio?
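The error message itself points at the mismatch: /api/tags is Ollama's model-list route, while LM Studio only serves OpenAI-compatible routes such as /v1/models and /v1/chat/completions. A minimal sketch of querying the route LM Studio does serve (the base URL and port here are illustrative assumptions, not values from this thread):

```python
import json
import urllib.request

def models_url(base_url: str) -> str:
    """Join an OpenAI-compatible base URL (e.g. http://localhost:1234/v1,
    an assumed example) with the /models route that LM Studio serves."""
    return base_url.rstrip("/") + "/models"

def list_models(base_url: str) -> list[str]:
    """Fetch model ids from an OpenAI-compatible server's /v1/models route."""
    with urllib.request.urlopen(models_url(base_url)) as resp:
        return [m["id"] for m in json.load(resp)["data"]]

# Ollama's /api/tags route does not exist on LM Studio, which is why a
# model refresh pointed at an LM Studio server logs
# "Unexpected endpoint or method" instead of returning a model list.
```

The key point is the route, not the client: any client that asks an LM Studio server for /api/tags will hit the same log line.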

skunkmonkey commented 1 week ago

Never mind, ChatGPT to the rescue. Apparently I misunderstood: there are two different URLs, and you need to set them both. Once I set Ollama Rest API URL to http://localhost:/v1/chat/completions and REST API URL to http://localhost:4444/v1, I have victory!
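For anyone hitting the same wall: you can confirm the server side before touching the plugin settings by probing both routes directly. Port 4444 follows the setup described above; adjust it to your own server. These are environment-dependent diagnostic commands, not captured output from this thread.

```shell
# The OpenAI-compatible model list that LM Studio serves; this should
# return JSON with a "data" array of model ids when the server is up.
curl http://localhost:4444/v1/models

# Ollama's model-list route, which LM Studio does not implement;
# requesting it triggers the "Unexpected endpoint or method" log line.
curl http://localhost:4444/api/tags
```

If the first command returns models but the plugin still shows "No Model", the problem is the plugin's URL settings rather than the server.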