pfrankov / obsidian-local-gpt

Local Ollama and OpenAI-like GPT assistance for maximum privacy and offline access
MIT License

Error while using OpenRouter models #22

Closed · wenlzhang closed this 1 month ago

wenlzhang commented 1 month ago

I configured the plugin to use OpenRouter models with the following server URL: https://openrouter.ai/api. I was able to select a model from the dropdown menu.

However, I get the following error when trying to use an action:

Could not JSON parse stream message : OPENROUTER PROCESSING SyntaxError: Unexpected token ':', ": OPENROUT"... is not valid JSON

I also tried to use the same prompt and action via Ollama, and it worked fine.

wenlzhang commented 1 month ago

When using models from OpenAI, it works.

pfrankov commented 1 month ago

The most likely cause of this problem is a mismatch between the OpenRouter API and the OpenAI API. I would raise the issue of streaming support with the OpenRouter authors.
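For context on the mismatch: per the SSE specification, lines beginning with a colon are comments, and OpenRouter emits `: OPENROUTER PROCESSING` as a keep-alive while the request is queued. A stream reader that calls `JSON.parse` on every line will choke on exactly that comment, matching the error above. A minimal sketch of a tolerant line parser (the function name and shape are illustrative, not the plugin's actual code):

```typescript
// Sketch: parse one line of an OpenAI-style SSE stream, returning the
// decoded JSON chunk or null for lines that carry no data.
function parseStreamLine(line: string): unknown | null {
  const trimmed = line.trim();
  // SSE comments (e.g. ": OPENROUTER PROCESSING" keep-alives) start with
  // a colon and must be skipped, not JSON-parsed.
  if (trimmed === "" || trimmed.startsWith(":")) return null;
  // Data lines are prefixed with "data:"; strip the prefix before parsing.
  const payload = trimmed.startsWith("data:") ? trimmed.slice(5).trim() : trimmed;
  // OpenAI-style streams end with a literal "[DONE]" sentinel.
  if (payload === "[DONE]") return null;
  return JSON.parse(payload);
}
```

With this filtering in place, the keep-alive comments OpenRouter sends are dropped instead of being fed to `JSON.parse`.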