intari closed this issue 5 months ago
My proposed fix is provided in https://github.com/logancyang/obsidian-copilot/issues/179
Thanks for the PR, and sorry for the late reply. I closed it, but I will have a solution for openrouter.ai. According to their docs, it seems the way to go is to set
https://openrouter.ai/api/v1
as the OpenAI Proxy Base URL in the settings. I'm moving away from LocalAI in favor of LM Studio, and will probably repurpose the LocalAI model name field to work for all third-party OpenAI API replacements.
OpenRouter is added as a separate option in the model dropdown and settings in v2.4.9, please try it out!
I entered the base URL as https://openrouter.ai/api/v1 and configured the API key. Attempting to use OpenRouter results in: "Please set an "HTTP-Referer" header with the URL of your app". It doesn't matter whether I select LocalAI as the model or leave it on regular OpenAI. According to OpenRouter's docs at https://openrouter.ai/docs#format, HTTP-Referer can be https://localhost, but it must be set.
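For reference, a minimal sketch of what the error suggests is missing: the request to OpenRouter's OpenAI-compatible endpoint needs an `HTTP-Referer` header alongside the usual `Authorization` header. The API key, model name, and referer value below are placeholders, not values confirmed by this thread.

```typescript
// Sketch: building headers for OpenRouter's OpenAI-compatible API.
// OpenRouter rejects requests that lack an HTTP-Referer header; per
// https://openrouter.ai/docs#format, a localhost URL is acceptable.
const OPENROUTER_BASE_URL = "https://openrouter.ai/api/v1";

function buildOpenRouterHeaders(apiKey: string): Record<string, string> {
  return {
    "Content-Type": "application/json",
    Authorization: `Bearer ${apiKey}`,
    // The header OpenRouter's error message asks for:
    "HTTP-Referer": "https://localhost",
  };
}

// Illustrative request body for the chat completions endpoint
// (model name is a placeholder).
const requestBody = JSON.stringify({
  model: "openai/gpt-3.5-turbo",
  messages: [{ role: "user", content: "Hello" }],
});

// Usage (not executed here):
// fetch(`${OPENROUTER_BASE_URL}/chat/completions`, {
//   method: "POST",
//   headers: buildOpenRouterHeaders("YOUR_API_KEY"),
//   body: requestBody,
// });
```

If the plugin's existing OpenAI client doesn't expose a way to inject custom headers, that would explain why switching the model selection makes no difference.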