Portkey-AI / gateway

A Blazing Fast AI Gateway with integrated Guardrails. Route to 200+ LLMs, 50+ AI Guardrails with 1 fast & friendly API.
https://portkey.ai/features/ai-gateway
MIT License
6.3k stars · 452 forks

Feature Request: Support for Configuration Files in Portkey's AI Gateway #127

Open levie-vans opened 10 months ago

levie-vans commented 10 months ago

Hello Portkey Team,

I'm currently using Portkey's AI Gateway to integrate various LLMs into my application. First off, I want to extend my appreciation for the excellent work you've done with the gateway. It's been a game-changer in terms of performance and ease of use.

However, I've encountered a limitation that I believe could be improved. Currently, the configuration for the AI Gateway is passed directly through the request header using the `x-portkey-config` key. While this works, it's not the most convenient approach, especially for API clients that strictly adhere to the OpenAI standard and don't allow easy modification of request headers.
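To illustrate the friction: today the entire routing policy has to be serialized into a single header on every call. A minimal sketch of what that looks like (the `strategy`/`targets` shape below is my assumption of the config schema and may differ from the current gateway version):

```python
import json

# Hypothetical routing config; the exact schema ("strategy", "targets")
# is an assumption and may not match the gateway's current format.
config = {
    "strategy": {"mode": "fallback"},
    "targets": [
        {"provider": "openai", "api_key": "sk-..."},
        {"provider": "azure-openai", "api_key": "azure-..."},
    ],
}

# The whole config must be flattened into one request header,
# which every client has to be able to set.
headers = {
    "Content-Type": "application/json",
    "x-portkey-config": json.dumps(config),
}
```

Clients that cannot attach custom headers have no way to supply this at all.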

To improve this, I'd like to suggest the addition of support for external configuration files. This feature would allow users to define their configurations in a separate file, making it easier to manage and update settings without altering the codebase. It would be particularly helpful for scenarios where different environments (development, staging, production) require different configurations.

Here are a few key benefits I believe this feature would offer:

  1. Simplified Configuration Management: Keeping configurations in separate files makes it easier to maintain and understand.
  2. Environment-Specific Configurations: Facilitates the use of different settings for various deployment environments.
  3. Cleaner Code: Reduces the clutter in the main application code, leading to better readability and maintenance.
  4. Consistency with Industry Standards: Many developers are accustomed to managing configurations through external files, making this a familiar and welcome feature.
  5. Improved Security: Embedding multiple API keys directly in request headers not only bloats those headers but also raises potential security risks.
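As a sketch of what the feature could look like from the user's side, the gateway could read a per-environment file at startup. Nothing below exists in Portkey today; the file layout and loader are purely hypothetical:

```python
import json
from pathlib import Path

def load_gateway_config(env: str, config_dir: str = "config") -> dict:
    """Load a per-environment gateway config from an external file.

    Hypothetical layout: config/development.json, config/staging.json,
    config/production.json -- not an existing Portkey feature.
    """
    path = Path(config_dir) / f"{env}.json"
    return json.loads(path.read_text())
```

With something like this, switching from staging to production would be a file swap rather than a code or header change.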

I understand that implementing this feature would require effort and resources, but I believe the long-term benefits to users like myself would be significant.

Thank you for considering this request. I'm looking forward to any updates, and I'm more than willing to provide further input or clarification if needed.

ayush-portkey commented 10 months ago

Hi @levie-vans,

Thank you for your excellent suggestion regarding the use of config files similar to Nginx. We've considered this and agree that it's beneficial for scenarios where one or a small set of configs are tied to a deployment. However, for our use case, we believe a different approach is more suitable:

Flexibility per Request: Our system is designed to allow flexibility per LLM call. For example, if an OpenAI call requires a fallback to Azure, or an Anthropic or Llama call necessitates different rules, this setup allows for that customization.

Use Case Variability: Consider a scenario where load balancing between different LLMs is needed. In such cases, being able to override parameters at the request level is crucial.

Stateless Deployment: We recommend deploying a single, stateless gateway. This lets applications decide the best configuration per request, enhancing the gateway's adaptability and efficiency.
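To make the per-request flexibility concrete: the same stateless gateway can receive a different policy on every call, because the policy travels with the request. A hedged sketch (the `strategy`/`targets` schema here is assumed, not guaranteed):

```python
import json

# Two different policies for two calls to the same stateless gateway.
# The config rides along in the request rather than living in the deployment.
fallback_config = {
    "strategy": {"mode": "fallback"},
    "targets": [{"provider": "openai"}, {"provider": "azure-openai"}],
}
loadbalance_config = {
    "strategy": {"mode": "loadbalance"},
    "targets": [
        {"provider": "openai", "weight": 0.7},
        {"provider": "anthropic", "weight": 0.3},
    ],
}

def headers_for(config: dict) -> dict:
    """Build per-request headers carrying the routing policy."""
    return {"x-portkey-config": json.dumps(config)}
```

A static file can't express this call-by-call variation without some extra lookup layer.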

In this specific use case, I believe that configuration settings are more appropriately managed at the request level rather than at the deployment level.

Your input is highly valued, and we're open to further discussion to refine our solution. Looking forward to more thoughts and improving our gateway.

levie-vans commented 10 months ago

Hi @ayush-portkey ,

Thank you for your prompt and detailed response. I understand and appreciate the merits of your approach, emphasizing flexibility and adaptability at the request level. However, I'd like to reiterate the advantages of external configuration files in certain scenarios.

Many users of Portkey's AI Gateway, including myself, often integrate it into third-party customer service platforms or client applications. In such environments, modifying the core code to change configurations for each request isn't always feasible or practical. External configuration files would offer a streamlined solution in these cases, allowing us to manage settings without altering the application's core code.

For instance, in a customer support scenario using a third-party platform, it might not be possible or practical to modify the request headers for each interaction. An external configuration file can provide a more elegant solution, allowing for easier management of different settings based on the environment (development, staging, production), without the need for direct code intervention.
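The workaround available today in these locked-down setups is a thin shim in front of the gateway that injects the header from a file. A rough sketch of that glue (everything here is hypothetical, not a Portkey feature, which is exactly why built-in support would be nicer):

```python
import json
from pathlib import Path

def inject_config(request_headers: dict, config_path: str) -> dict:
    """Return a copy of the headers with x-portkey-config injected from an
    external file, unless the caller already supplied one.

    Hypothetical glue code a shim/proxy would run for clients that
    cannot set custom headers themselves.
    """
    headers = dict(request_headers)
    if "x-portkey-config" not in headers:
        headers["x-portkey-config"] = Path(config_path).read_text().strip()
    return headers
```

Native config-file support in the gateway would make this extra hop unnecessary.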

While the current system's flexibility is invaluable for certain use cases, having the option to use external configuration files would significantly enhance usability in scenarios where modifying request headers is cumbersome or restricted. This approach would also align with common industry practices, making it more intuitive for a broader range of developers.

I hope this additional context sheds light on why this feature could be a valuable addition to your gateway. I'm more than willing to participate in further discussions or beta testing to help develop this feature.

Looking forward to your thoughts on this.

garmoned commented 2 days ago

I would also appreciate a feature like this. It would make this project more competitive with https://www.litellm.ai/, where, when self-hosting, you can configure everything statically. That way each client can send a minimal request, and there's no need for a database to store configs.