abhishek-atigro opened 3 months ago
Hi there, same problem here.
I've been investigating; this is my implementation:
self.model = ChatBedrock(
    model_id="meta.llama3-70b-instruct-v1:0",
    model_kwargs={"temperature": temperature, "top_k": 5, "max_tokens": 3000},
    region_name=aws_region,
    callbacks=[stdout_callback_handler],
)
Upon execution the following message is returned: "Error to process request: Stop sequence key name for meta is not supported."
Some people have been reporting this as a workaround, but I am not sure it works:
self.model.provider_stop_sequence_key_name_map = {
    'anthropic': 'stop_sequences',
    'amazon': 'stopSequences',
    'ai21': 'stop_sequences',
    'cohere': 'stop_sequences',
    'mistral': 'stop',
    'meta': '',
}
Note the 'meta' key included with an empty value.
There is a LangChain issue thread where people are discussing this.
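For anyone curious why the empty 'meta' entry silences the error, here is a minimal sketch of the kind of provider lookup involved. The map and error message mirror the ones above; the function name and exact logic are illustrative assumptions, not the actual langchain-aws internals:

```python
# Sketch of a provider -> stop-sequence-key lookup (hypothetical code,
# mirroring the map and error message from this thread).
provider_stop_sequence_key_name_map = {
    'anthropic': 'stop_sequences',
    'amazon': 'stopSequences',
    'ai21': 'stop_sequences',
    'cohere': 'stop_sequences',
    'mistral': 'stop',
}

def stop_sequence_key(provider: str) -> str:
    key = provider_stop_sequence_key_name_map.get(provider)
    if key is None:
        # With no 'meta' entry, requests for Meta models die here.
        raise ValueError(
            f"Stop sequence key name for {provider} is not supported."
        )
    return key

# Adding 'meta': '' makes .get() return a (empty but non-None) value,
# so the check passes and the error disappears.
```

This also explains why the workaround only moves the problem: the empty key satisfies the lookup but does nothing to make stop sequences actually work for Meta models.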
Following up.
I've updated the code and removed the model kwargs, since I was getting an error about unsupported parameters.
self.model = ChatBedrock(
    model_id="meta.llama3-70b-instruct-v1:0",
    region_name=aws_region,
    callbacks=[stdout_callback_handler],
)
When I include this line:
self.model.provider_stop_sequence_key_name_map = {
    'anthropic': 'stop_sequences',
    'amazon': 'stopSequences',
    'ai21': 'stop_sequences',
    'cohere': 'stop_sequences',
    'mistral': 'stop',
    'meta': '',
}
I stop getting the error: "Error to process request: Stop sequence key name for meta is not supported."
But now I am getting this error:
"kwargs key generation_token_count already exists in left dict and value has unsupported type <class 'int'>."
The title of the issue indicates llama3.1 but it looks like you're using llama3? Does it occur when using a model_id of meta.llama3-1-70b-instruct-v1:0?
Same error as reported before.
Cheers,
Could we get an update? This error happens with the Llama 3.1 instruct model, meta.llama3-8b-instruct-v1:0.
Creating the chat model like this:

ChatBedrock(
    model_id=modelId,
    client=boto3_bedrock,
    model_kwargs=model_parameter,
    beta_use_converse_api=True,
)

gives the error: ValidationException: An error occurred (ValidationException) when calling the Converse operation: This model doesn't support tool use.
If we do not use the converse flag, with the same model id but without any model arguments like temperature, it works fine:

ChatBedrock(
    model_id=modelId,
    client=boto3_bedrock,
)
Hello,
Still looking into the issue with ChatBedrock, but the LangChain docs recommend using ChatBedrockConverse outside of custom models, and it appears to correctly support tool calling with Llama 3.1:
from langchain_aws import ChatBedrockConverse
from langchain_core.tools import tool

@tool
def magic_function(input: int) -> int:
    """Call a magic function."""
    pass

llm = ChatBedrockConverse(
    model="meta.llama3-1-70b-instruct-v1:0"
).bind_tools([magic_function])
response = llm.invoke("What is the value of magic_function(3)?")
response.tool_calls

[{'name': 'magic_function',
  'args': {'input': 3},
  'id': 'tooluse_j8ZTuYegQUOoUD8qXd8mAA',
  'type': 'tool_call'}]
See the LangSmith trace.
A couple of points -
1/ It's interesting that this works with the ChatBedrockConverse class.
2/ My understanding was that the ChatBedrockConverse class is deprecated in favor of ChatBedrock. Is that not accurate?
Hi @rsgrewal-aws,
For (1), could you confirm you're using meta.llama3-1-70b-instruct-v1:0? This snippet returns tool calls for me (using langchain-aws==0.1.17):
from langchain_aws import ChatBedrock
from langchain_core.tools import tool

@tool
def magic_function(input: int) -> int:
    """Call a magic function."""
    pass

llm = ChatBedrock(
    model_id="meta.llama3-1-70b-instruct-v1:0",
    beta_use_converse_api=True,
).bind_tools([magic_function])
response = llm.invoke("What is the value of magic_function(3)?")
response.tool_calls
Regarding (2): it's not the case that ChatBedrockConverse is deprecated in favor of ChatBedrock. When AWS originally released Bedrock, ChatBedrock was implemented and contained logic internally to handle differences among providers. See for example this snippet: https://github.com/langchain-ai/langchain-aws/blob/d89fcf80c61174c21c88ef8d1cfbf3a55c7d84c5/libs/aws/langchain_aws/llms/bedrock.py#L307-L316
AWS then released the Bedrock Converse API, which was standardized across providers. ChatBedrockConverse interacts with this API and works well for many purposes.
The Converse API does not yet support all needed features (a big one is native async execution; currently ChatBedrockConverse delegates to asyncio.get_running_loop().run_in_executor for async). Once that gap is closed, the classes may be merged somehow, likely favoring the Converse API internally.
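The delegation mentioned above is a generic pattern worth understanding: the "async" method just runs the blocking call in a thread-pool executor, so the event loop stays free but no true non-blocking I/O happens underneath. A minimal sketch (generic pattern, not the actual ChatBedrockConverse source):

```python
import asyncio
from functools import partial

def _invoke_sync(prompt: str) -> str:
    # Stand-in for the blocking Bedrock API call.
    return f"response to: {prompt}"

async def ainvoke(prompt: str) -> str:
    loop = asyncio.get_running_loop()
    # None -> default ThreadPoolExecutor; the event loop keeps running,
    # but a worker thread is still blocked for the call's duration.
    return await loop.run_in_executor(None, partial(_invoke_sync, prompt))

result = asyncio.run(ainvoke("hello"))
```

This is why "native async" is a real gap: under load, every in-flight request still ties up an executor thread, unlike a genuinely async HTTP client.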
Just a note that ChatBedrock.with_structured_output() hard-codes everything to Claude, unless I'm reading this wrong: https://github.com/langchain-ai/langchain-aws/blob/feb8f09134e383827f343ef81534679b691f0406/libs/aws/langchain_aws/chat_models/bedrock.py#L789
Maybe that's a red herring for the tool-use problem, but it's what I hit first when trying to get Llama 3.1 running on Bedrock with LangChain and tools.
Also, just a note that the exception created on this line never actually gets raised, so it appears to the caller that things are functioning as expected.
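To illustrate that silent-failure pattern: constructing an exception without `raise` just creates an ordinary object and discards it, so execution continues as if nothing were wrong. A tiny self-contained demo (the function names and provider check are made up for illustration):

```python
# A constructed-but-unraised exception is a no-op: the object is
# created and immediately discarded, and the caller sees no error.
def validate_quietly(provider: str) -> str:
    if provider != "anthropic":
        ValueError(f"{provider} is not supported")  # created, never raised
    return "ok"

# The fixed version actually raises, so the caller is informed.
def validate_loudly(provider: str) -> str:
    if provider != "anthropic":
        raise ValueError(f"{provider} is not supported")
    return "ok"
```

So `validate_quietly("meta")` happily returns "ok", which matches the behavior described above: the unsupported case sails through and the failure surfaces later, somewhere confusing.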
Hi @ccurme ,
Hi @rsgrewal-aws,
For (1), could you confirm you're using meta.llama3-1-70b-instruct-v1:0? This snippet returns tool calls for me (using langchain-aws==0.1.17):
Does Bedrock support Llama 3.1? I don't see it yet; I only see Llama 3 models. I don't believe Llama 3 70B had tool-calling support. I also don't think that should matter if the tools are being handled in LangChain and the LLM is just few-shot or zero-shotting a response. But when I try to run your code, I get a model id error from the AWS side.
OK, turns out it's just my region: us-east-1 doesn't have Llama 3.1 models yet. Moving over to the west coast with us-west-1, they all show up. That doesn't change the underlying Bedrock Python and tool-binding issue, but my guess is that the code using ChatBedrockConverse might now work. Will verify when I have model access and update.
Just an update: I can confirm that ChatBedrockConverse does seem to work, while ChatBedrock does not. Here are my MWEs with output:
Same error here. With init_chat_model and provider=bedrock_converse, the tool shows up in the metadata thanks to .structured_output, but with Llama 3.1 70B the tool is not called. It works fine with Claude 3.5 through Bedrock Converse.
any update on this @3coins ?
any updates @3coins
Any updates on supporting Llama 3.1+ tool calling natively, rather than the "old way"?
Tool calling and structured output both fail in ChatBedrock and ChatBedrockConverse when using the Llama 3.1 model. Both work fine with Claude.