BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Help]: Bedrock invalid auth error #6260

Open yarimarchetti opened 18 hours ago

yarimarchetti commented 18 hours ago

What happened?

Just tested the vanilla Bedrock example reported here, and I keep getting invalid-credentials errors on every call, even though the credentials are valid (I tested them via the AWS CLI and they work).

LiteLLM==1.49.5 boto3==1.35.41

```python
import os
from litellm import completion

os.environ["AWS_ACCESS_KEY_ID"] = ""
os.environ["AWS_SECRET_ACCESS_KEY"] = ""
os.environ["AWS_REGION_NAME"] = "us-west-2"

response = completion(
    model="bedrock/anthropic.claude-3-sonnet-20240229-v1:0",
    messages=[{"content": "Hello, how are you?", "role": "user"}],
)
```
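One common cause of this exact "security token included in the request is invalid" message is a stale `AWS_SESSION_TOKEN` lingering in the environment: boto3 (which LiteLLM uses under the hood for Bedrock) sends the session token along with the key pair, and AWS rejects the mismatched set. A minimal stdlib-only sketch to see which AWS variables the process actually has set (the helper name is made up for illustration, not part of LiteLLM):

```python
import os

# AWS-related variables that boto3/LiteLLM may pick up. A leftover
# AWS_SESSION_TOKEN from an earlier temporary-credentials session can
# cause "The security token included in the request is invalid".
AWS_ENV_KEYS = [
    "AWS_ACCESS_KEY_ID",
    "AWS_SECRET_ACCESS_KEY",
    "AWS_SESSION_TOKEN",
    "AWS_PROFILE",
    "AWS_REGION_NAME",
    "AWS_REGION",
]

def aws_env_report():
    """Report which AWS environment variables are set, without printing secrets."""
    return {k: ("set" if os.environ.get(k) else "unset") for k in AWS_ENV_KEYS}

print(aws_env_report())
```

If `AWS_SESSION_TOKEN` shows up as set but the keys are long-lived IAM user keys, unsetting it before calling `completion` is worth trying.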

Relevant log output

```
raise AuthenticationError(
litellm.exceptions.AuthenticationError: litellm.AuthenticationError: BedrockException Invalid Authentication - {"message":"The security token included in the request is invalid."}
```
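Since the keys work via the AWS CLI, it is also worth confirming that the CLI and the script are using the same credentials: the CLI may be reading a profile from `~/.aws/credentials`, while the script relies on environment variables. A small stdlib sketch to read the access key id the CLI would use for a given profile, for comparison against the exported `AWS_ACCESS_KEY_ID` (the helper and its defaults are illustrative, not part of LiteLLM):

```python
import configparser
import os

def cli_access_key(path="~/.aws/credentials", profile="default"):
    """Return the aws_access_key_id the AWS CLI would use for `profile`,
    or None if the file or profile is missing. Compare the result against
    the AWS_ACCESS_KEY_ID exported in the environment."""
    cfg = configparser.ConfigParser()
    cfg.read(os.path.expanduser(path))  # read() silently skips missing files
    if profile not in cfg:
        return None
    return cfg[profile].get("aws_access_key_id")
```

If the two key ids differ, the CLI test was exercising a different credential set than the LiteLLM call.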

Twitter / LinkedIn details

No response

krrishdholakia commented 15 hours ago

@yarimarchetti can you run it with `litellm.set_verbose=True` and confirm the correct credentials are being used?

krrishdholakia commented 15 hours ago

Unable to repro; this works for me. Just tested on a Google Colab instance.

[Screenshot: successful Bedrock completion run, 2024-10-16 8:49 AM]