OpenRouter models are listed and do work if I do `llm keys set openrouter`, but not with the `OPENROUTER_KEY` environment variable.

I've tried this on GitHub Workflows, on a local Ubuntu 22.04 container, and on my Windows 11 laptop. The OpenRouter models aren't visible or usable. In the Ubuntu container, for example, I see this:
On Windows 11 and Python 3.11.4 in both PowerShell and the mintty terminal installed by Git:
Trying to use the models on the CLI or through the Python API doesn't work either; this is what I get from the Python API (here in a GitHub build):
I also tried, without success:

`pip install llm ; llm install llm-openrouter`

`apt install -y git ; pip install llm@git+https://github.com/simonw/llm@main llm-openrouter@git+https://github.com/simonw/llm-openrouter@main`
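For what it's worth, this is the key lookup I expected, sketched as a plain-Python stand-in (a hypothetical helper for illustration, not llm's actual code): a key stored via `llm keys set openrouter` should win, and the `OPENROUTER_KEY` environment variable should be the fallback — it's that fallback that doesn't seem to be consulted.

```python
import os


def resolve_key(stored_keys, env_var="OPENROUTER_KEY"):
    """Hypothetical lookup sketch: stored key first, then the environment.

    `stored_keys` stands in for whatever `llm keys set openrouter` writes.
    """
    key = stored_keys.get("openrouter")
    if key:
        return key
    # This environment fallback is the part that doesn't seem to
    # happen for the plugin in my tests.
    return os.environ.get(env_var)
```

With an empty key store, this sketch returns the environment value — the behavior I'd expect from the plugin.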
I also tried other llm plugins; here are my results after running `llm install llm-<plugin> ; llm models` without setting any API keys. :-1: means the models from the plugin aren't included in the list, and :100: means they are on the list.

llm-mistral
llm-gemini
llm-claude
llm-claude-3
llm-command-r
llm-reka
llm-perplexity
llm-groq
llm-anyscale-endpoints
llm-replicate
llm-fireworks
llm-palm
llm-openrouter
llm-cohere
llm-bedrock-anthropic
llm-bedrock-meta
llm-together
(`llm.errors.NeedsKeyException`)
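To rule out the variable simply not reaching the process at all, here is a quick sanity check (plain stdlib, nothing llm-specific) that can be run in the same environment where the models are missing:

```python
import os


def env_key_present(name="OPENROUTER_KEY"):
    """Return True if the variable is set to a non-blank value."""
    return bool(os.environ.get(name, "").strip())
```

If this returns True in the failing environment, the variable is visible to the process and the problem is in the plugin's key lookup itself.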