langchain-ai / langchain-aws

Build LangChain Applications on AWS
MIT License

Creating an agent with tools or `bind_tools` does not implement the correct chain with Llama models in Bedrock #175

Open rsgrewal-aws opened 2 months ago

rsgrewal-aws commented 2 months ago

Using the ChatBedrock class, if we call `bind_tools` or create the agent while using the Converse API (the `beta_converse_api` flag), the call fails. The traceback goes through `RunnableBindingBase.invoke` in `langchain_core/runnables/base.py` and ends in `botocore/client.py` with:

```
ValidationException: An error occurred (ValidationException) when calling the Converse operation: This model doesn't support tool use.
```

If we turn the Converse API flag to False, `bind_tools` produces a runnable with no tool definitions attached -- it becomes just `client=<botocore.client.BedrockRuntime object at 0x10bc9ee10> model_id='meta.llama3-8b-instruct-v1:0'` -- no tool definitions here -- and invocation works but returns generic results.
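For comparison, when tool binding does work (e.g. with Claude models), the tool definitions end up in the Converse API's `toolConfig` request block. Below is a minimal sketch of that payload shape for the `search_medical_policy` tool mentioned in this issue; the description and JSON schema are illustrative assumptions, not taken from the reporter's code:

```python
# Sketch of the toolConfig block the Bedrock Converse API expects when tools
# are bound. The tool name is from this issue; description and input schema
# are illustrative assumptions.
tool_config = {
    "tools": [
        {
            "toolSpec": {
                "name": "search_medical_policy",
                "description": "Search medical policy documents for a query.",
                "inputSchema": {
                    "json": {
                        "type": "object",
                        "properties": {
                            "query": {"type": "string"},
                        },
                        "required": ["query"],
                    }
                },
            }
        }
    ]
}

# When the printed chain shows no tool definitions, a block like this is
# absent from the request, so the model answers generically.
print(tool_config["tools"][0]["toolSpec"]["name"])
```

If a block like this is missing from the outgoing request, the model has no way to know the tools exist, which matches the generic results observed above.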

Second issue -- using `create_tool_calling_agent(....)` with Llama, we do not see the tools in the chain definition -- see below:

```
[SystemMessagePromptTemplate(prompt=PromptTemplate(input_variables=[], template="\n\nUse the following format:\nQuestion: the input question you must answer\nThought: you should always think about what to do, Also try to follow steps mentioned above\nAction: the action to take, should be one of ['search_medical_policy',]\nAction Input: the input to the action\nObservation: the result of the action\n... (this Thought/Action/Action Input/Observation can repeat N times)\nThought: I now know the final answer\nFinal Answer: the final answer to the original input question\n\n")), MessagesPlaceholder(variable_name='chat_history', optional=True), HumanMessagePromptTemplate(prompt=PromptTemplate(input_variables=['input'], template='{input}')), MessagesPlaceholder(variable_name='agent_scratchpad', optional=True)])
| ChatBedrock(client=<botocore.client.BedrockRuntime object at 0x10bc9ee10>, model_id='meta.llama3-8b-instruct-v1:0', model_kwargs={'temperature': 0.0, 'top_p': 0.5, 'max_gen_len': 200})
| ToolsAgentOutputParser()
```

There is no tool definition anywhere in this chain.
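Note that the system prompt in the dump above is a ReAct-style format: for models without native tool use, the tool names are injected into the prompt text rather than into the request payload. A minimal sketch of rendering such a prompt (pure Python; the helper name is illustrative, and the format string mirrors the template shown above):

```python
# Render a ReAct-style system prompt for models lacking native tool-use
# support: tool names go into the prompt text, not a toolConfig payload.
# The function name is a hypothetical helper for illustration.
def render_react_system_prompt(tools: list[dict]) -> str:
    tool_names = ", ".join(repr(t["name"]) for t in tools)
    return (
        "Use the following format:\n"
        "Question: the input question you must answer\n"
        "Thought: you should always think about what to do\n"
        f"Action: the action to take, should be one of [{tool_names}]\n"
        "Action Input: the input to the action\n"
        "Observation: the result of the action\n"
        "... (this Thought/Action/Action Input/Observation can repeat N times)\n"
        "Thought: I now know the final answer\n"
        "Final Answer: the final answer to the original input question\n"
    )

tools = [{"name": "search_medical_policy"}]
prompt = render_react_system_prompt(tools)
print("'search_medical_policy'" in prompt)  # True
```

This is why the agent can still run with the Converse flag off: the model is only *prompted* to emit `Action:` lines, and the output parser extracts them, but nothing enforces a real tool-call schema.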

With Claude models these calls work fine.

danielatchobanikova commented 1 month ago

Hi, I am experiencing issue 1. My code works with Anthropic Sonnet, but when I replace the model with Llama 3.2 I get this message: This model doesn't support tool use.