Portkey-AI / gateway

A Blazing Fast AI Gateway with integrated Guardrails. Route to 200+ LLMs, 50+ AI Guardrails with 1 fast & friendly API.
https://portkey.ai/features/ai-gateway

Providing different models in Config #305

Closed. tapansstardog closed this issue 6 months ago

tapansstardog commented 6 months ago

Hi team,

Is there a way to provide different model names to the providers? I could not find an example. Something like this:

{
    "strategy": {
        "mode": "fallback"
    },
    "targets": [
        {
            "provider": "openai",
            "model": "gpt-3.5-turbo-instruct",
            "api_key": "dummy"
        },
        {
            "provider": "together-ai",
            "model": "codellama/CodeLlama-34b-Python-hf",
            "api_key": "dummy"
        }
    ]
}

Thanks!

VisargD commented 6 months ago

Hey @tapansstardog - You can use override_params in each target to override any LLM param for that target. Here is an example showing the usage: https://docs.portkey.ai/docs/cookbooks/tackling-rate-limiting#id-2.-fallback-to-alternative-llms

In your case, the config would look like this:

{
    "strategy": {
        "mode": "fallback"
    },
    "targets": [
        {
            "provider": "openai",
            "api_key": "dummy",
            "override_params": {
                "model": "gpt-3.5-turbo-instruct"
            }
        },
        {
            "provider": "together-ai",
            "api_key": "dummy",
            "override_params": {
                "model": "codellama/CodeLlama-34b-Python-hf"
            }
        }
    ]
}
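
For completeness, here is a minimal sketch of passing a config like this through the Python SDK. This is illustrative, not from the thread: it assumes the portkey-ai package, placeholder keys, and a reachable gateway; the config can also be referenced by a saved config ID instead of an inline dict.

# Sketch only: assumes the portkey-ai Python SDK and placeholder keys.
from portkey_ai import Portkey

config = {
    "strategy": {"mode": "fallback"},
    "targets": [
        {
            "provider": "openai",
            "api_key": "OPENAI_API_KEY",  # placeholder
            "override_params": {"model": "gpt-3.5-turbo-instruct"},
        },
        {
            "provider": "together-ai",
            "api_key": "TOGETHER_API_KEY",  # placeholder
            "override_params": {"model": "codellama/CodeLlama-34b-Python-hf"},
        },
    ],
}

# The client accepts the config inline as a dict (or a saved config ID).
portkey = Portkey(api_key="PORTKEY_API_KEY", config=config)

# No model is passed in the request itself; each target's override_params
# supplies it, so the fallback switches models along with providers.
completion = portkey.completions.create(
    prompt="Write a haiku about gateways.",
    max_tokens=64,
)
print(completion.choices[0].text)
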
tapansstardog commented 6 months ago

This worked, @VisargD. Thanks!