longy2k / obsidian-bmo-chatbot

Generate and brainstorm ideas while creating your notes using Large Language Models (LLMs) from Ollama, LM Studio, Anthropic, Google Gemini, Mistral AI, OpenAI, and more for Obsidian.

Openrouter.ai support #28

Closed: twalderman closed this issue 9 months ago

twalderman commented 10 months ago

Title: Integrate Open Router API Support in BMO Chat Client

User Story: As a user of BMO, the chat client for Obsidian, I want to connect to openrouter.ai as an API provider and select from the providers and models it supports, so that I can use a diverse range of AI capabilities within my chat experience.

Acceptance Criteria:

Technical Notes:

Tasks:

Definition of Done:

longy2k commented 10 months ago

In v1.8.0, I have updated the OpenAI base URL handling so that models are pulled correctly.

  1. Go to https://openrouter.ai and obtain an API Key.
  2. Insert the API key in the BMO settings.
  3. Update the 'OPENAI BASE URL' to https://openrouter.ai/api/v1
  4. You should see all the models populate within the model dropdown.
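
If you want to double-check the key and base URL outside of Obsidian first, here is a minimal TypeScript sketch (the `OPENROUTER_API_KEY` environment variable name is just an example) that lists the models the dropdown should show:

```typescript
// Minimal check that the OpenRouter key and base URL work:
// lists the models that should appear in BMO's model dropdown.
// OPENROUTER_API_KEY is an example environment variable name.
const baseUrl = "https://openrouter.ai/api/v1";

async function listModels(): Promise<void> {
  const response = await fetch(`${baseUrl}/models`, {
    headers: { Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}` },
  });
  if (!response.ok) {
    throw new Error(`Request failed: ${response.status} ${response.statusText}`);
  }
  const data = await response.json();
  // Each entry has an "id" such as "openai/gpt-4o" or "anthropic/claude-3-haiku".
  for (const model of data.data ?? []) {
    console.log(model.id);
  }
}

listModels().catch(console.error);
```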

Let me know if that works!

longy2k commented 9 months ago

https://github.com/longy2k/obsidian-bmo-chatbot/wiki/How-to-setup-with-Openrouter.ai
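
Since OpenRouter exposes an OpenAI-compatible API, the same base URL also handles chat completions, so any OpenAI-style request should work once the base URL is swapped. A rough sketch, with an example model id (use whichever model you picked in the dropdown):

```typescript
// Rough sketch of a chat completion against the OpenRouter base URL.
// The model id below is only an example; use any id from the model dropdown.
const baseUrl = "https://openrouter.ai/api/v1";

async function chat(prompt: string): Promise<string> {
  const response = await fetch(`${baseUrl}/chat/completions`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "openai/gpt-3.5-turbo", // example model id
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!response.ok) {
    throw new Error(`Request failed: ${response.status}`);
  }
  const data = await response.json();
  return data.choices[0].message.content;
}

chat("Summarize my note in one sentence.").then(console.log).catch(console.error);
```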