langchain-ai / langchain-aws

Build LangChain Applications on AWS
MIT License

llama3-70b chat no longer functioning with Bedrock #90

Closed: jefflavallee closed this issue 2 months ago

jefflavallee commented 3 months ago

With the most recent version, it appears that ChatBedrock no longer works when I invoke a chain that uses a llama3 model via chain.ainvoke({}).

\langchain_aws\llms\bedrock.py", line 867, in _aprepare_input_and_invoke_stream
    body = json.dumps(input_body)
                      ^^^^^^^^^^
UnboundLocalError: cannot access local variable 'input_body' where it is not associated with a value
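A minimal sketch of the kind of call that hits this path (the model_id, prompt, and streaming flag below are illustrative, not my exact application code):

import asyncio

from langchain_aws import ChatBedrock
from langchain_core.prompts import ChatPromptTemplate

# Illustrative setup: streaming=True routes the async call through
# _aprepare_input_and_invoke_stream, where the UnboundLocalError occurs.
llm = ChatBedrock(
    model_id="meta.llama3-70b-instruct-v1:0",
    streaming=True,
)
prompt = ChatPromptTemplate.from_messages([("human", "Say hello.")])
chain = prompt | llm

# On the affected version this raises UnboundLocalError, because input_body
# is never assigned for non-claude-3 models before json.dumps(input_body).
asyncio.run(chain.ainvoke({}))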

It appears that the code never sets input_body when the model is not claude-3:

if "claude-3" in self._get_model():
            if _tools_in_params(params):
                input_body = LLMInputOutputAdapter.prepare_input(
                    provider=provider,
                    model_kwargs=params,
                    prompt=prompt,
                    system=system,
                    messages=messages,
                    tools=params["tools"],
                )
            else:
                input_body = LLMInputOutputAdapter.prepare_input(
                    provider=provider,
                    prompt=prompt,
                    system=system,
                    messages=messages,
                    model_kwargs=params,
                )
body = json.dumps(input_body)

This seems to be a result of the following commit; perhaps the else is at the wrong level? https://github.com/langchain-ai/langchain-aws/commit/018ed897c79ed51008fef98ef4b2818e7306e044
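For illustration, de-indenting the else so that input_body is prepared for every provider, with the tools path applying only to claude-3, would look roughly like this (just a sketch of the idea, not a tested patch):

if "claude-3" in self._get_model() and _tools_in_params(params):
    # claude-3 with tools: pass the tools through to prepare_input
    input_body = LLMInputOutputAdapter.prepare_input(
        provider=provider,
        model_kwargs=params,
        prompt=prompt,
        system=system,
        messages=messages,
        tools=params["tools"],
    )
else:
    # every other case, including non-claude-3 models such as llama3
    input_body = LLMInputOutputAdapter.prepare_input(
        provider=provider,
        prompt=prompt,
        system=system,
        messages=messages,
        model_kwargs=params,
    )
body = json.dumps(input_body)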

ccurme commented 3 months ago

@laithalsaadoon would you mind taking a look at this? Thank you!

3coins commented 2 months ago

@jefflavallee @laithalsaadoon Can you take a look at #115?