langchain-ai / langchain-aws

Build LangChain Applications on AWS
MIT License

ChatBedrock doesn't work with Cohere Command R and Command R+ models #66

Open Adam-Thometz opened 4 months ago

Adam-Thometz commented 4 months ago

Running ChatBedrock with Cohere's new Command R and Command R+ models throws a ValidationException.

When you invoke the following model:

from langchain_aws import ChatBedrock

llm = ChatBedrock(
    model_id="cohere.command-r-v1:0",
    model_kwargs={
        "temperature": 0.3,
        "max_tokens": 4096,
    },
)

You will get the following error:

ERROR: Error raised by bedrock service: An error occurred (ValidationException) when calling the InvokeModel operation: Malformed input request: #: extraneous key [prompt] is not permitted, please reformat your input and try again.

This is likely because the new Command R models do not accept "prompt" as an input key; they accept "message" instead. (Source).
ChatBedrock should be refactored to pass a "message" key instead of "prompt" when a Command R or R+ model is used.
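The key difference can be sketched locally without calling Bedrock. Below is a hypothetical helper (the function name and the model-id prefix check are my assumptions, based on the key mismatch described above) showing how a request body could be switched between the legacy "prompt" key and the Command R "message" key:

```python
def build_cohere_body(model_id: str, text: str, **model_kwargs) -> dict:
    """Sketch: build a Bedrock InvokeModel body for a Cohere model.

    Assumption from this issue: Command R / R+ models expect a
    "message" key, while older Command models expect "prompt".
    """
    if model_id.startswith("cohere.command-r"):
        return {"message": text, **model_kwargs}
    return {"prompt": text, **model_kwargs}

# A "prompt" key sent to a Command R model triggers the
# ValidationException shown above; "message" is required instead.
body = build_cohere_body("cohere.command-r-v1:0", "Hello!", temperature=0.3)
```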

keshavd commented 3 months ago

The Command R models also no longer accept "stream" as a parameter. I opened a pull request to fix this (https://github.com/langchain-ai/langchain-aws/pull/101).

3coins commented 2 months ago

@Adam-Thometz To use ChatBedrock with the Command models, use the Converse API. Here is some sample code:

from langchain_aws import ChatBedrock

params = {
    "region_name": "us-west-2",
    "model_id": "cohere.command-r-v1:0",
    "beta_use_converse_api": True,
}
model = ChatBedrock(**params)
response = model.invoke("Hello!")
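For context on why this works: the Converse API uses a single provider-agnostic message shape, so the Cohere-specific "prompt"/"message" mismatch never comes up. A minimal sketch of that message structure (built locally here, no AWS call; the helper function is my own illustration):

```python
def to_converse_messages(text: str) -> list:
    """Sketch: wrap user text in the message shape the Bedrock
    Converse API expects, regardless of the underlying model."""
    return [{"role": "user", "content": [{"text": text}]}]

messages = to_converse_messages("Hello!")
```

Because the translation to each provider's native request format happens inside Bedrock, switching between Command, Command R, and other model families needs no client-side key changes.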