BerriAI / litellm

Python SDK, Proxy Server to call 100+ LLM APIs using the OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Feature]: Helm chart shouldn't set Master Key in plain text #4911

Open chris-sanders opened 1 month ago

chris-sanders commented 1 month ago

The Feature

I spent a while trying to figure out why I couldn't authenticate. Since it's open source, I was able to work it out: my master key didn't match, even though I'm using:

```yaml
environmentSecrets:
  - api-keys

general_settings:
  master_key: os.environ/PROXY_MASTER_KEY
```

It took some digging before I found that if the MASTER_KEY isn't set in plain text, one is generated, and that generated key overrides the one set in my secret.

I was able to work around it once I got this far: I renamed the MASTER_KEY entry in my secret and used that name in the `os.environ/` reference, and it's happy now.
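A minimal sketch of that workaround, assuming a Secret named `api-keys` and a renamed variable `LITELLM_MASTER_KEY` (both names are arbitrary here; the point is that the variable name differs from the `PROXY_MASTER_KEY` name the chart hardcodes):

```yaml
# Secret holding the master key under a name the chart does not override.
apiVersion: v1
kind: Secret
metadata:
  name: api-keys
type: Opaque
stringData:
  LITELLM_MASTER_KEY: sk-my-master-key   # hypothetical key value
```

```yaml
# values.yaml excerpt: load the secret and reference the renamed variable.
environmentSecrets:
  - api-keys

general_settings:
  master_key: os.environ/LITELLM_MASTER_KEY
```

Because `LITELLM_MASTER_KEY` is not the variable the chart auto-populates, the auto-generated value no longer shadows the key from the secret.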

You could just document how to do that, but encouraging the MASTER_KEY to be in plain text doesn't seem great. Maybe consider telling people to set it in the secret, and only auto-generating one if no secret is used?

Motivation, pitch

Make it easier for people not deeply familiar with helm to do the right thing and avoid accidental leaks of their secrets.

Twitter / LinkedIn details

No response

krrishdholakia commented 1 month ago

hey @chris-sanders thanks for the callout, how did you set your master key?

chris-sanders commented 1 month ago

Just noticed this. I set the master key in the same secret that your examples use for setting other API keys. I used the PROXY_MASTER_KEY env variable, as the example shows, but the chart hardcodes that value; to use it, you have to set your master key in plain text.

I just changed to a different variable and adjusted the general settings to use it, and it works fine. So you can set your own, but the chart kind of pushes you toward setting it in your values file in plain text.