Open bigbernnn opened 4 months ago
Hi, is there anything we can do to get this PR merged?
Hi, is the `tool_choice` argument of the `bind_tools` method functioning? I would like to force the LLM to use a tool, but it does not seem to work. Are there any workarounds? Thank you in advance.
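For context, a minimal sketch of what forcing a tool looks like at the request level for Claude models on Bedrock (assumptions: the Anthropic Messages API `tool_choice` semantics; the tool name `get_weather` and the helper are hypothetical). This is the payload shape that a working `bind_tools(..., tool_choice=...)` would need to emit:

```python
def build_forced_tool_request(tools, tool_name):
    """Return a request body that forces the model to call `tool_name`."""
    return {
        "tools": tools,
        # "auto" lets the model decide; {"type": "tool", "name": ...}
        # forces the named tool (Anthropic tool_choice semantics).
        "tool_choice": {"type": "tool", "name": tool_name},
    }

tools = [{
    "name": "get_weather",
    "description": "Get the current weather for a city.",
    "input_schema": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}]

request = build_forced_tool_request(tools, "get_weather")
print(request["tool_choice"])  # {'type': 'tool', 'name': 'get_weather'}
```

Until ChatBedrock passes `tool_choice` through, one workaround is to inject this field yourself via `model_kwargs`.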
For those coming across this PR in search of a solution: I had an issue where the tools were not actually being called. I have opened a PR against @bigbernnn's fork, which can be found here. It resolved my related issue here.
Can this please be reviewed and merged? I can't switch my existing codebase to Bedrock because of this error.
```python
from langchain_anthropic import ChatAnthropic
from langchain_aws import ChatBedrock
from langchain_openai import ChatOpenAI

# llm = ChatOpenAI(model="gpt-4o", openai_api_key=OPENAI_API_KEY)  # works fine
# llm = ChatAnthropic(model="claude-3-opus-20240229")  # works fine
llm = ChatBedrock(
    model_id="anthropic.claude-3-sonnet-20240229-v1:0",
    model_kwargs=dict(temperature=0),
)  # fails when invoked
```

```
ValueError: System message must be a string, instead was: <class 'list'>
```
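The error arises because ChatBedrock expects the system prompt as a plain string, while some callers pass it as a list of content blocks. As a hedged workaround sketch (the helper name is mine, not part of LangChain), you can flatten list-style system content to a string before handing messages to ChatBedrock:

```python
def flatten_system_content(content):
    """Collapse a list-of-content-blocks system message into one string.

    ChatBedrock raises `ValueError: System message must be a string`
    when `content` is a list, so we join the text blocks ourselves.
    """
    if isinstance(content, str):
        return content
    parts = []
    for block in content:
        if isinstance(block, str):
            parts.append(block)
        elif isinstance(block, dict) and block.get("type") == "text":
            parts.append(block["text"])
    return "\n".join(parts)

print(flatten_system_content([{"type": "text", "text": "You are helpful."}]))
# You are helpful.
```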
The proposed changes include:
1/ Ability to use tools with `.generate()`
2/ Returning `stop_reason` and `tool_calls` to `AIMessage` in the response metadata when using tools

Support for function calls using the `generate` function is not directly implemented in ChatBedrock.
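To illustrate change 2/, here is a minimal sketch (assumptions: the Anthropic-style response body returned by Bedrock's InvokeModel for Claude; the helper name and the sample IDs are mine) of extracting `stop_reason` and tool calls so they can be attached to the `AIMessage` metadata:

```python
def extract_tool_metadata(response_body):
    """Pull `stop_reason` and tool calls out of an Anthropic-style
    response body (as returned by Claude models on Bedrock)."""
    tool_calls = [
        {"name": block["name"], "args": block["input"], "id": block["id"]}
        for block in response_body.get("content", [])
        if block.get("type") == "tool_use"
    ]
    return {
        "stop_reason": response_body.get("stop_reason"),
        "tool_calls": tool_calls,
    }

body = {
    "stop_reason": "tool_use",
    "content": [
        {"type": "tool_use", "id": "toolu_01", "name": "get_weather",
         "input": {"city": "Paris"}},
    ],
}
meta = extract_tool_metadata(body)
print(meta["stop_reason"])  # tool_use
```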