With the most recent version, ChatBedrock appears to fail when calling a llama3 model via chain.ainvoke({}).
\langchain_aws\llms\bedrock.py", line 867, in _aprepare_input_and_invoke_stream
body = json.dumps(input_body)
^^^^^^^^^^
UnboundLocalError: cannot access local variable 'input_body' where it is not associated with a value
It appears that the code never assigns input_body when the model is not claude-3:
if "claude-3" in self._get_model():
    if _tools_in_params(params):
        input_body = LLMInputOutputAdapter.prepare_input(
            provider=provider,
            model_kwargs=params,
            prompt=prompt,
            system=system,
            messages=messages,
            tools=params["tools"],
        )
    else:
        input_body = LLMInputOutputAdapter.prepare_input(
            provider=provider,
            prompt=prompt,
            system=system,
            messages=messages,
            model_kwargs=params,
        )
body = json.dumps(input_body)
This seems to be a result of the following commit; perhaps the else is at the wrong level? https://github.com/langchain-ai/langchain-aws/commit/018ed897c79ed51008fef98ef4b2818e7306e044
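A minimal, self-contained sketch of what I suspect the intended control flow is: prepare_input should run for every provider, with the tools branch reserved for claude-3. The helper names below (prepare_input, _tools_in_params, build_body) are stand-ins for the real langchain-aws code, not its actual API; this is just an assumption about where the else belongs.

```python
import json

def _tools_in_params(params):
    # stand-in for the library's tools check
    return "tools" in params

def prepare_input(provider, params, tools=None):
    # stand-in for LLMInputOutputAdapter.prepare_input
    body = {"provider": provider, **params}
    if tools is not None:
        body["tools"] = tools
    return body

def build_body(model, provider, params):
    # In the buggy version, the else is nested under the claude-3 check,
    # so input_body is never assigned for llama3 and json.dumps raises
    # UnboundLocalError. Folding the claude-3 test into the tools
    # condition ensures input_body is assigned on every path:
    if "claude-3" in model and _tools_in_params(params):
        input_body = prepare_input(provider, params, tools=params["tools"])
    else:
        input_body = prepare_input(provider, params)
    return json.dumps(input_body)
```

With this shape, a llama3 call like build_body("meta.llama3-70b-instruct-v1:0", "meta", {"prompt": "hi"}) serializes without error, while claude-3 calls with tools still take the tools path.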