Closed: wenlzhang closed this issue 1 month ago
When using the models from OpenAI, it works.
The most likely cause of this problem is a mismatch between the OpenRouter API and the OpenAI API. I would raise the streaming-support issue with the OpenRouter authors.
I configured the plugin to use OpenRouter models with the following endpoint setting:
https://openrouter.ai/api
I was able to select a model from the dropdown menu. However, I get the following errors when trying to use an action:
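For context, a minimal sketch of the streaming chat-completion request an OpenAI-compatible client would send through OpenRouter. The `/v1/chat/completions` path and the model name are assumptions for illustration, not taken from this issue; the streaming flag is where an OpenRouter/OpenAI mismatch would typically surface:

```python
# Sketch only: builds the request an OpenAI-compatible client would send.
# Assumptions (not from the issue): the /v1/chat/completions path and model name.
import json

BASE_URL = "https://openrouter.ai/api/v1"  # OpenRouter's OpenAI-compatible base URL

def build_stream_request(model: str, prompt: str) -> tuple[str, bytes]:
    """Return the URL and JSON body for a streaming chat completion."""
    url = f"{BASE_URL}/chat/completions"
    body = json.dumps({
        "model": model,  # illustrative model identifier
        "messages": [{"role": "user", "content": prompt}],
        "stream": True,  # streaming responses are where API mismatches often appear
    }).encode()
    return url, body

url, body = build_stream_request("openai/gpt-4o", "Hello")
```

If streaming fails but non-streaming works, dropping `"stream": True` from the body is a quick way to narrow the problem down to streaming support.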
I also tried to use the same prompt and action via Ollama, and it worked fine.