Closed tapansstardog closed 6 months ago
Hi team,
Is there a way to provide a different model name to each provider in a fallback config? I could not find an example of this.
Thanks!

Hey @tapansstardog - you can use override_params in each target to override any LLM param for that target. Here is an example showing the usage: https://docs.portkey.ai/docs/cookbooks/tackling-rate-limiting#id-2.-fallback-to-alternative-llms
In your case, this is how the config will look:
{
  "strategy": {
    "mode": "fallback"
  },
  "targets": [
    {
      "provider": "openai",
      "api_key": "dummy",
      "override_params": {
        "model": "gpt-3.5-turbo-instruct"
      }
    },
    {
      "provider": "together-ai",
      "api_key": "dummy",
      "override_params": {
        "model": "codellama/CodeLlama-34b-Python-hf"
      }
    }
  ]
}
This worked @VisargD. Thanks!
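As a side note, a config like the one in this thread can also be built programmatically before passing it to the gateway. The sketch below is illustrative only: the make_fallback_config helper is a hypothetical name, not part of any Portkey SDK; the gateway just needs the resulting JSON, with one override_params block per target so each provider can use a different model.

```python
import json

def make_fallback_config(targets):
    """Build a fallback-style config dict from (provider, api_key, model) tuples.

    Hypothetical helper for illustration; each target gets its own
    override_params block so every provider can use a different model name.
    """
    return {
        "strategy": {"mode": "fallback"},
        "targets": [
            {
                "provider": provider,
                "api_key": api_key,
                "override_params": {"model": model},
            }
            for provider, api_key, model in targets
        ],
    }

config = make_fallback_config([
    ("openai", "dummy", "gpt-3.5-turbo-instruct"),
    ("together-ai", "dummy", "codellama/CodeLlama-34b-Python-hf"),
])
print(json.dumps(config, indent=2))
```

This produces the same JSON shown above, and makes it easy to add or reorder fallback targets without hand-editing the config.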