Closed — lactoseintolerant closed this 6 months ago
hi @lactoseintolerant, a recent change to the engine settings means the API key needs to be re-entered in the engine settings. Please check this out: https://github.com/janhq/jan/issues/2736#issuecomment-2058166270
Thanks @Van-QA. I had already tried entering the API key in settings (sorry, I didn't explain that clearly above). However, I figured out how to get it to work: I have to update the endpoint URL in settings to https://openrouter.ai/api/v1/chat/completions. Previously, I believe you only had to set the endpoint in the individual model json file, and the endpoint in the engine json file was left unchanged.
oh, it's in the UI now!
For anyone who is confused: click the gear icon in the lower left of the UI -> select OpenAI Inference Engine -> update the Chat Completions Endpoint and the API Key
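If the UI settings still don't seem to take effect, one way to check whether the key and endpoint themselves are valid is to call the OpenRouter chat completions endpoint directly, outside of Jan. A rough sketch (assumes your key is in the `OPENROUTER_API_KEY` environment variable; `openrouter/auto` is just an example model id):

```shell
# Sanity-check the OpenRouter endpoint and API key directly.
# A 200 response with a completion means the key works and the
# problem is in the Jan engine settings; a 401 means the key itself
# is the issue.
curl https://openrouter.ai/api/v1/chat/completions \
  -H "Authorization: Bearer $OPENROUTER_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "openrouter/auto",
    "messages": [{"role": "user", "content": "Say hello"}]
  }'
```

If this works but Jan still fails, the endpoint URL in the engine settings is the likely culprit.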
OpenRouter API authentication no longer works. Following the instructions at https://jan.ai/docs/remote-inference/router doesn't help. There also used to be a box beneath the OpenRouter models in the main UI where you could enter the API key; that is gone. Entering the API key in the UI under the new settings/OpenAI extension input doesn't appear to work either.
To reproduce, try to run any OpenRouter model.