BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: AWS STS credentials not cached #5142

Open nabuskey opened 3 months ago

nabuskey commented 3 months ago

What happened?

AWS credentials returned by STS are not cached in most situations. This usually leads to excessive calls to the STS service, which can get throttled.

It looks like credentials are cached for the web identity token exchange, but not for other credential flows: https://github.com/BerriAI/litellm/blob/9f0a05d406bb8b0f4f7bc80b9038b35febc5028f/litellm/llms/bedrock_httpx.py#L458
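
Something like the sketch below could work: cache assumed-role credentials in memory and only call STS again shortly before they expire. This is just a minimal illustration using the public boto3 API; the function and cache names are hypothetical, not litellm's actual internals.

```python
import time
import threading

import boto3

# Hypothetical sketch: reuse STS assume-role credentials until shortly
# before their Expiration instead of calling STS on every request.
_CREDENTIAL_CACHE: dict = {}
_CACHE_LOCK = threading.Lock()
_REFRESH_MARGIN_SECONDS = 300  # refresh 5 minutes before expiry


def get_assumed_role_credentials(role_arn: str, session_name: str) -> dict:
    """Return cached temporary credentials, calling STS only when needed."""
    cache_key = (role_arn, session_name)
    with _CACHE_LOCK:
        cached = _CREDENTIAL_CACHE.get(cache_key)
        if (
            cached is not None
            and cached["Expiration"].timestamp() - time.time() > _REFRESH_MARGIN_SECONDS
        ):
            return cached  # still valid; no STS call

        # Cache miss or near expiry: fetch fresh credentials from STS.
        sts = boto3.client("sts")
        response = sts.assume_role(
            RoleArn=role_arn,
            RoleSessionName=session_name,
        )
        credentials = response["Credentials"]  # includes an Expiration datetime
        _CREDENTIAL_CACHE[cache_key] = credentials
        return credentials
```

Keying the cache on the role ARN and session name, and refreshing a few minutes before `Expiration`, should avoid repeated STS calls without risking the use of expired credentials.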

I'd be happy to work on this.


krrishdholakia commented 3 months ago

Great point - yes, a PR here is welcome.