Closed xandernewton closed 3 months ago
Hey @xandernewton, can you re-run with
```python
response = completion(
    model="bedrock/anthropic.claude-3-haiku-20240307-v1:0",
    messages=[{"content": "Hello, how are you?", "role": "user"}],
    aws_profile_name=<profile_name_here>,
    aws_region_name="us-east-1",
    stream=True,
)
```
and let me know if that fixes it
Unfortunately I get the same error:
```
KeyError                                  Traceback (most recent call last)
~/miniconda3/envs/<>/lib/python3.11/site-packages/litellm/utils.py in ?(self, chunk)
  11848     traceback_exception = traceback.format_exc()
  11849     e.message = str(e)
  11850     raise exception_type(
  11851         model=self.model,

KeyError: 'text'
```
can you share the full stacktrace? i see a key error but i can't see the originating line
```
DEBUG:botocore.parsers:Response headers: {':event-type': 'chunk', ':content-type': 'application/json', ':message-type': 'event'}
DEBUG:botocore.parsers:Response body:
b'{"bytes":"eyJ0eXBlIjoiY29udGVudF9ibG9ja19zdGFydCIsImluZGV4IjowLCJjb250ZW50X2Jsb2NrIjp7InR5cGUiOiJ0ZXh0IiwidGV4dCI6IiJ9fQ=="}'
17:40:21 - LiteLLM:DEBUG: utils.py:1107 - PROCESSED CHUNK PRE CHUNK CREATOR: {'chunk': {'bytes': b'{"type":"content_block_start","index":0,"content_block":{"type":"text","text":""}}'}}; custom_llm_provider: bedrock
DEBUG:LiteLLM:PROCESSED CHUNK PRE CHUNK CREATOR: {'chunk': {'bytes': b'{"type":"content_block_start","index":0,"content_block":{"type":"text","text":""}}'}}; custom_llm_provider: bedrock
17:40:21 - LiteLLM:DEBUG: utils.py:1107 - Logging Details: logger_fn - None | callable(logger_fn) - False
DEBUG:LiteLLM:Logging Details: logger_fn - None | callable(logger_fn) - False
17:40:21 - LiteLLM:DEBUG: utils.py:1107 - Logging Details LiteLLM-Failure Call: []
DEBUG:LiteLLM:Logging Details LiteLLM-Failure Call: []
17:40:21 - LiteLLM:DEBUG: utils.py:1107 - Logging Details: logger_fn - None | callable(logger_fn) - False
DEBUG:LiteLLM:Logging Details: logger_fn - None | callable(logger_fn) - False
ERROR:__main__:BedrockException - 'text'
Traceback (most recent call last):
  File "<>miniconda3/envs/<>/lib/python3.11/site-packages/litellm/utils.py", line 11455, in chunk_creator
    completion_obj["content"] = response_obj["text"]
                                ~~~~~~~~~~~~^^^^^^^^
KeyError: 'text'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<>miniconda3/envs/<>/lib/python3.11/site-packages/litellm/utils.py", line 11920, in __next__
    response: Optional[ModelResponse] = self.chunk_creator(chunk=chunk)
                                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<>miniconda3/envs/<>/lib/python3.11/site-packages/litellm/utils.py", line 11850, in chunk_creator
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "<>miniconda3/envs/<>/lib/python3.11/site-packages/litellm/utils.py", line 10067, in exception_type
    raise e
  File "<>/miniconda3/envs/<>/lib/python3.11/site-packages/litellm/utils.py", line 10042, in exception_type
    raise APIConnectionError(
litellm.exceptions.APIConnectionError: 'text'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/var/folders/tb/jqnvgtqd32nbl137g6p0m9gr0000gr/T/ipykernel_83762/884545882.py", line 2, in <module>
    for x in response:
  File "<>miniconda3/envs/<>/lib/python3.11/site-packages/litellm/utils.py", line 11975, in __next__
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "<>/miniconda3/envs/<>/lib/python3.11/site-packages/litellm/utils.py", line 10067, in exception_type
    raise e
  File "<>/miniconda3/envs/<>/lib/python3.11/site-packages/litellm/utils.py", line 9048, in exception_type
    raise ServiceUnavailableError(
litellm.exceptions.ServiceUnavailableError: BedrockException - 'text'
PROCESSED CHUNK PRE CHUNK CREATOR: {'chunk': {'bytes': b'{"type":"content_block_start","index":0,"content_block":{"type":"text","text":""}}'}}; custom_llm_provider: bedrock
```
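Decoding the base64 `bytes` payload from the botocore debug output above shows exactly what the stream delivered (a quick stdlib-only sketch; the raw event is copied from the logs):

```python
import base64
import json

# Raw Bedrock event payload, copied from the botocore debug logs above.
raw = b'{"bytes":"eyJ0eXBlIjoiY29udGVudF9ibG9ja19zdGFydCIsImluZGV4IjowLCJjb250ZW50X2Jsb2NrIjp7InR5cGUiOiJ0ZXh0IiwidGV4dCI6IiJ9fQ=="}'

event = json.loads(raw)
chunk = json.loads(base64.b64decode(event["bytes"]))
print(chunk)
# {'type': 'content_block_start', 'index': 0, 'content_block': {'type': 'text', 'text': ''}}

# There is no top-level "text" key: the (empty) text sits under
# chunk["content_block"], which is what trips response_obj["text"].
print("text" in chunk)  # False
```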
The error comes from the `KeyError: 'text'` raised at litellm/utils.py, line 11455.
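For reference, a hypothetical guard (not the fix that shipped in LiteLLM) would look the text up by event type, since Anthropic's stream places it differently per event:

```python
# Hypothetical guard (illustrative only, not LiteLLM's actual code): pick the
# text field based on the Anthropic stream event type instead of assuming a
# top-level "text" key as chunk_creator did.
def extract_text(chunk: dict) -> str:
    event_type = chunk.get("type")
    if event_type == "content_block_start":
        # Text (usually empty) lives under "content_block" here.
        return chunk.get("content_block", {}).get("text", "")
    if event_type == "content_block_delta":
        # Incremental text lives under "delta" here.
        return chunk.get("delta", {}).get("text", "")
    # Fall back to the flat shape the original lookup assumed.
    return chunk.get("text", "")

print(extract_text({"type": "content_block_start", "index": 0,
                    "content_block": {"type": "text", "text": ""}}))   # ""
print(extract_text({"type": "content_block_delta", "index": 0,
                    "delta": {"type": "text_delta", "text": "Hello"}}))  # "Hello"
```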
hey @xandernewton that would make sense in the initial case with the boto3 client, but i don't see how that happens in case 2.
in case 2 the call is made via httpx in this file, not boto3 - https://github.com/BerriAI/litellm/blob/d5a1cc282e21ecffc2e05fca1e7e57009aa4eaa2/litellm/llms/bedrock_httpx.py#L1791
your logs show it went to boto3
```
DEBUG:botocore.parsers:Response headers: {':event-type': 'chunk', ':content-type': 'application/json', ':message-type': 'event'}
DEBUG:botocore.parsers:Response body:
```
please can you confirm you ran case 2 without passing a boto3 client?
Apologies. Removing `aws_bedrock_client` and using:
```python
response = completion(
    model="bedrock/anthropic.claude-3-haiku-20240307-v1:0",
    messages=[{"content": "Hello, how are you?", "role": "user"}],
    aws_region_name="us-east-1",
    aws_profile_name="<>",
    stream=True,
)
```
works as expected.
So it seems like passing `aws_bedrock_client` is the cause of the problem?
yep
it's a deprecated flow - we've added this to our docs as well
What happened?
Running this code:
returns this error:
LiteLLM version: 1.40.7
Python version: 3.11.9
Without streaming it works as normal.
Relevant log output
No response