langchain-ai / langchain-aws

Build LangChain Applications on AWS

Streaming tool calling with ChatBedrockConverse throws an error with some models #140

Closed danielatchobanikova closed 2 months ago

danielatchobanikova commented 2 months ago

[Screenshot of the error attached, 2024-08-03 at 17:13]

Hi,

I have a LangGraph agent that uses an LLM via langchain_aws.ChatBedrockConverse. When I run my project in LangGraph Studio and submit a HUMAN message, I receive the error below. However, when I use, for example, ChatBedrock instead of ChatBedrockConverse, the issue does not reproduce.
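For context, here is a stripped-down sketch of the kind of setup that hits this for me. The tool and prompt are placeholders, not my actual graph, and the model ID is the one I name further down in this thread:

```python
from langchain_aws import ChatBedrockConverse
from langchain_core.tools import tool


@tool
def get_weather(city: str) -> str:
    """Return the current weather for a city (placeholder tool)."""
    return f"Sunny in {city}"


llm = ChatBedrockConverse(model="mistral.mistral-large-2402-v1:0")
llm_with_tools = llm.bind_tools([get_weather])

# Streaming a request with tools bound is what triggers the
# ConverseStream ValidationException shown below; swapping in
# ChatBedrock for the same graph avoids it for me.
for chunk in llm_with_tools.stream("What's the weather in Sofia?"):
    print(chunk)
```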

I installed LangGraph Studio for the first time on Friday and really loved it. I would appreciate it if you could help me resolve this issue. Is it a configuration issue on my side?

Below is the error message I am receiving:

```
langgraph-api-1 |   File "/usr/local/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 5060, in invoke
langgraph-api-1 |     return self.bound.invoke(
langgraph-api-1 |            ^^^^^^^^^^^^^^^^^^
langgraph-api-1 |   File "/usr/local/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 274, in invoke
langgraph-api-1 |     self.generate_prompt(
langgraph-api-1 |   File "/usr/local/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 714, in generate_prompt
langgraph-api-1 |     return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
langgraph-api-1 |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
langgraph-api-1 |   File "/usr/local/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 571, in generate
langgraph-api-1 |     raise e
langgraph-api-1 |   File "/usr/local/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 561, in generate
langgraph-api-1 |     self._generate_with_cache(
langgraph-api-1 |   File "/usr/local/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 781, in _generate_with_cache
langgraph-api-1 |     for chunk in self._stream(messages, stop=stop, **kwargs):
langgraph-api-1 |   File "/usr/local/lib/python3.12/site-packages/langchain_aws/chat_models/bedrock_converse.py", line 427, in _stream
langgraph-api-1 |     response = self.client.converse_stream(
langgraph-api-1 |                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
langgraph-api-1 |   File "/usr/local/lib/python3.12/site-packages/botocore/client.py", line 565, in _api_call
langgraph-api-1 |     return self._make_api_call(operation_name, kwargs)
langgraph-api-1 |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
langgraph-api-1 |   File "/usr/local/lib/python3.12/site-packages/botocore/client.py", line 1017, in _make_api_call
langgraph-api-1 |     raise error_class(parsed_response, operation_name)
langgraph-api-1 | botocore.errorfactory.ValidationException: An error occurred (ValidationException) when calling the ConverseStream operation: This model doesn't support tool use in streaming mode.
langgraph-api-1 | 2024-08-03T14:01:03.326001Z [info ] 192.168.65.1:52399 - "POST /threads/2b6d9e7d-83f9-48e4-ae81-5f074a05a9fa/history HTTP/1.1" 200 [uvicorn.access] filename=httptools_impl.py func_name=send lineno=466
langgraph-api-1 | 2024-08-03T14:01:03.355357Z [info ] 192.168.65.1:52399 - "GET /assistants/f69d7c70-ea79-56d5-9773-67796ece2bec/schemas HTTP/1.1" 200 [uvicorn.access] filename=httptools_impl.py func_name=send lineno=466
```
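A quick way to check whether this restriction comes from Bedrock itself rather than from LangGraph would be to call the ConverseStream API directly with boto3. This is an untested sketch (the region and tool schema are placeholders); if the limitation is on the API side, I would expect it to raise the same ValidationException:

```python
import boto3

# Placeholder region and tool schema; the point is only whether
# ConverseStream accepts a toolConfig for this model at all.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

try:
    client.converse_stream(
        modelId="mistral.mistral-large-2402-v1:0",
        messages=[{"role": "user", "content": [{"text": "What's the weather in Sofia?"}]}],
        toolConfig={
            "tools": [{
                "toolSpec": {
                    "name": "get_weather",
                    "description": "Return the current weather for a city.",
                    "inputSchema": {"json": {
                        "type": "object",
                        "properties": {"city": {"type": "string"}},
                        "required": ["city"],
                    }},
                }
            }]
        },
    )
except client.exceptions.ValidationException as err:
    print(err)
```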

My current LangGraph Studio version is 0.0.11. Thank you for your assistance.

Best Regards, Daniela

hwchase17 commented 2 months ago

tagging in @efriis - this is an issue with the LangChain/AWS integration

baskaryan commented 2 months ago

which model are you using?

efriis commented 2 months ago

Moving this issue to the langchain-aws repo, where this integration is co-maintained with the AWS folks!

danielatchobanikova commented 2 months ago

> which model are you using?

Good question! The issue reproduces with the `mistral.mistral-large-2402-v1:0` model. I just tested with `anthropic.claude-3-5-sonnet-20240620-v1:0`, and the issue does not reproduce.
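For anyone else hitting this in the meantime: newer langchain-core releases add a `disable_streaming` option on chat models. If your versions support it, something like this untested sketch should force non-streaming calls whenever tools are bound, which would sidestep the ConverseStream limitation:

```python
from langchain_aws import ChatBedrockConverse

# Untested sketch: `disable_streaming` is a newer langchain-core option
# on chat models; "tool_calling" disables streaming only for requests
# that have tools bound, so plain chat responses can still stream.
llm = ChatBedrockConverse(
    model="mistral.mistral-large-2402-v1:0",
    disable_streaming="tool_calling",
)
```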