simonw / llm-mistral

LLM plugin providing access to Mistral models using the Mistral API
Apache License 2.0

Silent failure when Mistral API key not defined #10

Open danielcorin opened 1 month ago

danielcorin commented 1 month ago

If no API key is available in the keys.json file or the LLM_MISTRAL_KEY environment variable, get_model_ids() fails silently when called from the register_models hook. The result is what appears to be a successful plugin installation, but with no Mistral models available.
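The failure mode looks roughly like this (a minimal self-contained sketch — these function bodies are simplified stand-ins for the plugin's actual code, not excerpts from it):

```python
def get_model_ids(key):
    """Stand-in for the plugin's API call that lists available models."""
    if key is None:
        raise RuntimeError("No Mistral API key configured")
    return ["mistral-tiny", "mistral-small", "mistral-large"]

def register_models(register, key):
    """Stand-in for the register_models hook: the API error is swallowed."""
    try:
        for model_id in get_model_ids(key):
            register(model_id)
    except Exception:
        pass  # silent failure: nothing registered, nothing logged

registered = []
register_models(registered.append, key=None)  # simulates a missing key
print(registered)  # -> []
```

Because the hook swallows the exception, `llm` sees a plugin that simply registers zero models, and the user gets "not a known model" instead of a key error.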

❯ llm -m "mistral-large" hey                                               
Error: 'mistral-large' is not a known model

I think this behavior is reasonable, but it is hard for an end user to understand. I propose either documenting that this is what happens when the API key is missing, or registering the models from the DEFAULT_ALIASES dict rather than relying on the API to generate the model list.

For the former, documenting that LLM_MISTRAL_KEY is the environment variable the plugin reads would help. I was using the Python API, so I skipped the `llm keys set mistral` step and naively assumed that setting the API key as MISTRAL_API_KEY would work.
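To make the pitfall concrete (the key value here is a placeholder, and the variable must be set before the plugin's models are loaded):

```python
import os

# What I tried — the plugin never reads this variable:
os.environ["MISTRAL_API_KEY"] = "sk-placeholder"

# What the plugin actually reads, so models register:
os.environ["LLM_MISTRAL_KEY"] = "sk-placeholder"
```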

With the latter approach, the root cause would be more obvious to the end user, since the models would register without the API key and then fail on the inference API request:

❯ llm -m "mistral-large" hey
Error: Client error '401 Unauthorized' for url 'https://api.mistral.ai/v1/chat/completions'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/401
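The fallback could look something like this (a sketch only — the dict contents here are an illustrative subset, and `get_model_ids`/`register_models` are simplified stand-ins for the plugin's real functions):

```python
# Illustrative subset — the real DEFAULT_ALIASES lives in the plugin source.
DEFAULT_ALIASES = {
    "mistral-tiny": "m-tiny",
    "mistral-small": "m-small",
    "mistral-large": "m-large",
}

def get_model_ids(key):
    """Stand-in for the API call; raises when no key is configured."""
    if key is None:
        raise RuntimeError("No Mistral API key configured")
    return list(DEFAULT_ALIASES)

def register_models(register, key):
    try:
        model_ids = get_model_ids(key)
    except Exception:
        # Proposed fallback: register the known models anyway, so a
        # missing key surfaces as a 401 at request time instead of
        # "not a known model" at lookup time.
        model_ids = list(DEFAULT_ALIASES)
    for model_id in model_ids:
        register(model_id)
```

With this change, `llm -m mistral-large hey` would find the model even without a key, and the 401 response above would point the user directly at the authentication problem.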