Version
v1.41.1731027960
Describe the bug
Openrouter does not work with Cody
Added this config in settings.json:
{ "provider": "groq", // keep groq as provider "model": "qwen/qwen-2.5-coder-32b-instruct", "inputTokens": 128000, "outputTokens": 8192, "apiKey": "",
"apiEndpoint": "https://openrouter.ai/api/v1/chat/completions"
},
Using groq as the provider with the OpenRouter endpoint, since groq is the provider that has to be used for an OpenAI-compatible API.
And this is the response:
Request Failed: HTTP 400 Bad Request: {"error":{"message":"qwen-2.5-coder-32b-instruct is not a valid model ID","code":400}}
It seems Cody is sending the model as qwen-2.5-coder-32b-instruct instead of qwen/qwen-2.5-coder-32b-instruct, i.e. stripping the qwen/ provider prefix that OpenRouter requires in its model IDs.
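To confirm the endpoint itself is fine, here is a minimal sketch of a direct request to OpenRouter with the full, slash-prefixed model ID (Node 18+ with built-in fetch; OPENROUTER_API_KEY is a placeholder environment variable, not from the original report):

```ts
// Sketch: call OpenRouter's OpenAI-compatible chat completions endpoint
// directly. The full "qwen/..." model ID should succeed; sending the
// stripped "qwen-2.5-coder-32b-instruct" reproduces the HTTP 400 above.
async function main() {
  const res = await fetch("https://openrouter.ai/api/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // OPENROUTER_API_KEY is a placeholder; substitute your own key.
      Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}`,
    },
    body: JSON.stringify({
      model: "qwen/qwen-2.5-coder-32b-instruct", // keep the provider prefix
      messages: [{ role: "user", content: "Say hello" }],
    }),
  });
  console.log(res.status, JSON.stringify(await res.json()));
}

main();
```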
Expected behavior
Cody should send the full model ID, qwen/qwen-2.5-coder-32b-instruct, and a response should come back from the endpoint.
Additional context
No response