Closed krrishdholakia closed 8 months ago
cc: @yujonglee @shauryr @ishaan-jaff @adriensas @nsbradford @mrT23 if y'all have additional thoughts on this
what's the goal of adding key management? what's the overlap with a SecretOps platform e.g. Doppler?
I have similar questions. Not sure why you need this vs. Doppler / Infisical.
It might simplify adding new LLMs via a UI (just add key + model settings and go to prod). Not sure how it helps currently though.
thoughts @ranjancse26
@krrishdholakia The idea is not to rely on 3rd-party providers for key management, for various reasons: vendor dependency, concurrency issues, request throttling, etc. As for the public cloud providers (AWS, Azure, GCP), they each have their own way of managing secrets via a Key Management Service. Most on-premises systems do not rely on 3rd-party providers.
It's possible to build secure key management using certificate-based encryption/decryption. Just a thought on one possible approach.
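For illustration, the certificate-based idea above could be sketched with asymmetric (public/private key) encryption of a stored provider key. This is a minimal sketch using the third-party `cryptography` package; the key size, padding choice, and variable names are assumptions, not a prescribed design:

```python
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# In practice the key pair would come from an X.509 certificate / PKI;
# here we generate one in-process purely for demonstration.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

# Encrypt the provider API key before writing it to storage...
secret = b"sk-example-provider-api-key"
ciphertext = public_key.encrypt(secret, oaep)

# ...and decrypt only when a request needs to be forwarded upstream.
assert private_key.decrypt(ciphertext, oaep) == secret
```

Only the holder of the private key (e.g. the proxy server) can recover the plaintext, which is the property the comment above is after.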
Please feel free to add your thoughts.
I came across the links below, and the concept seems to be standard practice:
https://docs.helicone.ai/features/advanced-usage/vault
https://www.loom.com/share/122024dcb1c74cba887a49ce5516a6a1?sid=4fc886f9-2808-4a7b-8965-92d120eb48c1
looks identical to what we built in the proxy - https://github.com/BerriAI/liteLLM-proxy
https://github.com/BerriAI/liteLLM-proxy/blob/bbe0f62e3a413c184607a188ec1b9ca931fef040/main.py#L116
Is this what you're hoping for:

from flask import Flask, request, jsonify
import litellm
from litellm import completion

app = Flask(__name__)

@app.route("/chat/completions", methods=["POST"])
def chat_completion():
    data = request.get_json()
    user_key = data["user_key"]
    response = completion(model=data["model"], messages=data["messages"], user=user_key)
    return response

@app.route("/key/new", methods=["POST"])
def create_key():
    litellm.save_key("user_key", "actual_api_key")
    return jsonify({"status": "created"})
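The save_key call in the snippet above is pseudocode. A stdlib-only sketch of the underlying vault pattern — user-facing proxy keys mapped to real provider keys held server-side — might look like this (the `KeyVault` class and the `sk-proxy-` prefix are illustrative, not litellm's actual API):

```python
import secrets

class KeyVault:
    """Maps generated proxy keys to real provider API keys (in memory)."""

    def __init__(self):
        self._store = {}  # proxy_key -> provider_key

    def save_key(self, provider_key: str) -> str:
        # Issue an opaque proxy key the end user can hold instead of the real key.
        proxy_key = "sk-proxy-" + secrets.token_hex(16)
        self._store[proxy_key] = provider_key
        return proxy_key

    def resolve(self, proxy_key: str) -> str:
        # Look up the real provider key when forwarding a request upstream.
        return self._store[proxy_key]

vault = KeyVault()
user_key = vault.save_key("sk-real-openai-key")
assert vault.resolve(user_key) == "sk-real-openai-key"
```

A production version would persist the mapping and encrypt the stored provider keys at rest; the point here is only that the real key never leaves the proxy.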
closing due to inactivity
The Feature
Have litellm help manage my keys in a secure way.
cc: @ranjancse26 what does this look like?
Motivation, pitch
User request