BerriAI / litellm

Call all LLM APIs using the OpenAI format. Use Bedrock, Azure, OpenAI, Cohere, Anthropic, Ollama, Sagemaker, HuggingFace, Replicate (100+ LLMs)
https://docs.litellm.ai/docs/

[Feature]: Support key management #424

Closed · krrishdholakia closed this 8 months ago

krrishdholakia commented 9 months ago

The Feature

Have litellm help manage my keys in a secure way.

cc: @ranjancse26 what does this look like?

Motivation, pitch

User request

Twitter / LinkedIn details

No response

krrishdholakia commented 9 months ago

cc: @yujonglee @shauryr @ishaan-jaff @adriensas @nsbradford @mrT23 if y'all have additional thoughts on this

nsbradford commented 9 months ago

what's the goal of adding key management? what's the overlap with a SecretOps platform e.g. Doppler?

krrishdholakia commented 9 months ago

I have similar questions. Not sure why you need this vs. Doppler / Infisical.

It might simplify adding new LLMs via a UI (just add key + model settings and go to prod). Not sure how it helps currently though.

thoughts @ranjancse26?

ranjancse26 commented 9 months ago

@krrishdholakia The idea is not to rely on third-party providers for key management, for various reasons: vendor lock-in, concurrency issues, request throttling, etc. Public cloud providers like AWS, Azure, and GCP each have their own way of managing secrets through a Key Management Service, and most on-premises systems do not rely on third-party providers at all.
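
For example, a minimal sketch of reading a stored provider key from AWS Secrets Manager with boto3 (the secret name and region here are made up):

import boto3

# fetch the stored provider key from AWS Secrets Manager
client = boto3.client("secretsmanager", region_name="us-east-1")
secret = client.get_secret_value(SecretId="prod/openai_api_key")
openai_api_key = secret["SecretString"]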

It should be possible to build secure key management using certificate-based encryption/decryption. Just a thought on what one could do.
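
As a rough sketch of the certificate-based idea (the key pair is generated inline here; a real setup would load it from a certificate/key store):

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# in practice, load the key pair from a certificate / key store
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# encrypt the provider API key at rest; decrypt it only when making a call
encrypted = public_key.encrypt(b"sk-actual-api-key", oaep)
assert private_key.decrypt(encrypted, oaep) == b"sk-actual-api-key"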

Please feel free to add your thoughts.

ranjancse26 commented 9 months ago

I came across the links below, and the concept seems to be standard practice:

https://docs.helicone.ai/features/advanced-usage/vault
https://www.loom.com/share/122024dcb1c74cba887a49ce5516a6a1?sid=4fc886f9-2808-4a7b-8965-92d120eb48c1

krrishdholakia commented 9 months ago

looks identical to what we built in the proxy - https://github.com/BerriAI/liteLLM-proxy

https://github.com/BerriAI/liteLLM-proxy/blob/bbe0f62e3a413c184607a188ec1b9ca931fef040/main.py#L116

krrishdholakia commented 9 months ago

Is this what you're hoping for:

from flask import Flask, request, jsonify
import litellm

app = Flask(__name__)

@app.route("/chat/completions", methods=["POST"])
def chat_completion():
    data = request.get_json()
    # look up the caller by their user key and forward the call to the LLM
    response = litellm.completion(model=data["model"], messages=data["messages"], user=data["user_key"])
    return jsonify(response.model_dump())  # ModelResponse -> plain dict

@app.route("/key/new", methods=["POST"])
def create_key():
    # proposed helper (not an existing litellm API): map a user key to the provider key
    litellm.save_key("user_key", "actual_api_key")
    return jsonify({"status": "created"})

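A client call against this sketch could then look like (using the requests library, with a made-up user_key):

import requests

# hypothetical client call against the sketch above
resp = requests.post(
    "http://localhost:5000/chat/completions",
    json={
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": "Hi"}],
        "user_key": "uk-123",
    },
)
print(resp.json())
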
ishaan-jaff commented 8 months ago

closing due to inactivity