logancyang / obsidian-copilot

A ChatGPT Copilot in Obsidian
https://www.obsidiancopilot.com/
GNU Affero General Public License v3.0
2.31k stars 154 forks

OpenRouter's API doesn't work #178

Closed intari closed 5 months ago

intari commented 8 months ago

I entered the base URL as https://openrouter.ai/api/v1 and configured an API key. Attempting to use OpenRouter results in "Please set an "HTTP-Referer" header with the URL of your app". It doesn't matter whether I select LocalAI as the model or leave regular OpenAI. According to OpenRouter's docs at https://openrouter.ai/docs#format, HTTP-Referer can be https://localhost, but it must be set.
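For reference, a minimal sketch of what a request to OpenRouter's OpenAI-compatible endpoint needs to look like, with the required HTTP-Referer header included (the helper name, settings shape, and model id are illustrative, not the plugin's actual code):

```typescript
// Hypothetical helper, not from obsidian-copilot: builds the request options
// for OpenRouter's OpenAI-compatible chat completions endpoint.
const OPENROUTER_BASE_URL = "https://openrouter.ai/api/v1";

interface RequestOptions {
  method: string;
  headers: Record<string, string>;
  body: string;
}

function buildOpenRouterRequest(
  apiKey: string,
  model: string,
  prompt: string
): RequestOptions {
  return {
    method: "POST",
    headers: {
      "Authorization": `Bearer ${apiKey}`,
      "Content-Type": "application/json",
      // OpenRouter rejects requests missing this header; per its docs,
      // https://localhost is an acceptable value for local apps.
      "HTTP-Referer": "https://localhost",
    },
    body: JSON.stringify({
      model,
      messages: [{ role: "user", content: prompt }],
    }),
  };
}

// Usage (model id is illustrative):
// fetch(`${OPENROUTER_BASE_URL}/chat/completions`,
//       buildOpenRouterRequest(apiKey, "openai/gpt-3.5-turbo", "Hello"));
```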

intari commented 8 months ago

My proposed fix is provided in https://github.com/logancyang/obsidian-copilot/issues/179

logancyang commented 6 months ago

Thanks for the PR, and sorry for the late reply. I closed it, but I will have a solution for openrouter.ai. According to their docs, it seems the way to go is:

  1. put https://openrouter.ai/api/v1 at OpenAI Proxy Base URL in the setting
  2. pass a custom model name to it
  3. I'll test if there's issue with the referer header and come up with a solution
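Steps 1 and 2 above can be sketched as follows; the settings shape and field names here are assumptions for illustration, not the plugin's actual settings:

```typescript
// Hypothetical settings shape, not obsidian-copilot's real interface.
interface CopilotSettings {
  openAIProxyBaseUrl: string; // e.g. the OpenAI Proxy Base URL setting
  customModelName: string;    // custom model name passed through to the API
}

// Resolve the chat completions endpoint: if a proxy base URL is set,
// it replaces the default OpenAI endpoint.
function resolveChatEndpoint(settings: CopilotSettings): string {
  const base = settings.openAIProxyBaseUrl || "https://api.openai.com/v1";
  return `${base.replace(/\/+$/, "")}/chat/completions`;
}

const settings: CopilotSettings = {
  openAIProxyBaseUrl: "https://openrouter.ai/api/v1",
  customModelName: "mistralai/mixtral-8x7b-instruct", // illustrative model id
};
// resolveChatEndpoint(settings)
// → "https://openrouter.ai/api/v1/chat/completions"
```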

I'm moving away from LocalAI in favor of LM Studio. I will probably repurpose the LocalAI model name field so it works for all third-party OpenAI API replacements.

logancyang commented 5 months ago

OpenRouter is added as a separate option in the model dropdown and settings in v2.4.9. Please try it out!