langchain-ai / langchain

🦜🔗 Build context-aware reasoning applications
https://python.langchain.com
MIT License
94.41k stars 15.26k forks

Stop Sequences Not Supported by AWS Bedrock #20095

Closed nick-velocity closed 3 months ago

nick-velocity commented 7 months ago

Checked other resources

Example Code

    import os

    import boto3
    from langchain import hub
    from langchain.agents import AgentExecutor, create_structured_chat_agent
    from langchain.tools import tool
    from langchain_community.llms import Bedrock

    AWS_ACCESS_KEY = os.getenv('AWS_ACCESS_KEY_ID')
    AWS_SECRET_ACCESS_KEY = os.getenv('AWS_SECRET_ACCESS_KEY')
    AWS_REGION = os.getenv('AWS_REGION', 'us-east-1')

    bedrock = boto3.client(
        service_name='bedrock-runtime',
        aws_access_key_id=AWS_ACCESS_KEY,
        aws_secret_access_key=AWS_SECRET_ACCESS_KEY,
        region_name=AWS_REGION,
    )

    llm = Bedrock(
        credentials_profile_name="default",
        model_id="mistral.mistral-large-2402-v1:0",
    )

    @tool
    def multiply(a: int, b: int) -> int:
        """Multiply two integers."""
        return a * b

    tools = [multiply]
    prompt = hub.pull("hwchase17/structured-chat-agent")
    agent = create_structured_chat_agent(llm, tools, prompt)
    agent_executor = AgentExecutor(agent=agent, tools=tools)
    result = agent_executor.invoke({"input": "what is 123 * 456"})
    print(result)

Error Message and Stack Trace (if applicable)

    File ".../env/lib/python3.11/site-packages/langchain_community/llms/bedrock.py", line 654, in _prepare_input_and_invoke_stream
        raise ValueError(f"Error raised by bedrock service: {e}")
    ValueError: Error raised by bedrock service: An error occurred (ValidationException) when calling the InvokeModelWithResponseStream operation: Malformed input request: #: extraneous key [stop_sequences] is not permitted, please reformat your input and try again.

Description

I'm using AWS Bedrock for an agent application. It throws an error because LangChain passes a stop-sequence parameter that the AWS API doesn't accept for this model.

The error can be mitigated by commenting out lines 611-619 of langchain_community/llms/bedrock.py:

    # if stop:
    #     if provider not in self.provider_stop_sequence_key_name_map:
    #         raise ValueError(
    #             f"Stop sequence key name for {provider} is not supported."
    #         )

    #     # stop sequence from _generate() overrides
    #     # stop sequences in the class attribute
    #     _model_kwargs[self.provider_stop_sequence_key_name_map.get(provider)] = stop
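For context, the Mistral models on Bedrock name this parameter `stop`, which appears to be why the injected `stop_sequences` key is rejected. A minimal sketch of building a request body by hand (the `mistral_body` helper is an illustrative assumption, not part of any library):

```python
import json

def mistral_body(prompt, stop=None):
    # Hypothetical helper: Mistral-on-Bedrock request bodies use "stop"
    # (a list of strings), not "stop_sequences" -- sending the latter
    # triggers the ValidationException shown above.
    body = {"prompt": prompt, "max_tokens": 256}
    if stop:
        body["stop"] = stop
    return json.dumps(body)

# Example call (requires AWS credentials and the boto3 client from the issue):
# bedrock.invoke_model(modelId="mistral.mistral-large-2402-v1:0",
#                      body=mistral_body("What is 123 * 456?", stop=["Observation:"]))
```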

System Info

    langchain==0.1.14
    langchain-community==0.0.31
    langchain-core==0.1.40
    langchain-openai==0.0.3
    langchain-text-splitters==0.0.1
    langchainhub==0.1.14
    boto3==1.34.79
    botocore==1.34.79

mikamboo commented 7 months ago

Same issue for me with CrewAI; need help.

rabejens commented 7 months ago

I am having the same problem, and commenting out these lines just seems to freeze the pipeline.

kateruksha commented 6 months ago

Same issue when combining any Bedrock model with Langchain Agents.

Keerththipan commented 6 months ago

Try this: in the langchain_community/llms/bedrock.py file, at line 331, change `"mistral": "stop_sequences"` to `"mistral": "stop"`:

    provider_stop_sequence_key_name_map: Mapping[str, str] = {
        "anthropic": "stop_sequences",
        "amazon": "stopSequences",
        "ai21": "stop_sequences",
        "cohere": "stop_sequences",
        "mistral": "stop_sequences",
    }

to

    provider_stop_sequence_key_name_map: Mapping[str, str] = {
        "anthropic": "stop_sequences",
        "amazon": "stopSequences",
        "ai21": "stop_sequences",
        "cohere": "stop_sequences",
        "mistral": "stop",
    }

(Screenshot: Mistral AI models - Amazon Bedrock documentation)
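The same fix can likely be applied at runtime instead of editing the installed package, assuming the `provider_stop_sequence_key_name_map` field is assignable on the `Bedrock` instance (a sketch, not a tested patch):

```python
# Corrected provider -> stop-key mapping, with Mistral's key renamed to "stop".
corrected_map = {
    "anthropic": "stop_sequences",
    "amazon": "stopSequences",
    "ai21": "stop_sequences",
    "cohere": "stop_sequences",
    "mistral": "stop",
}

try:
    from langchain_community.llms import Bedrock

    llm = Bedrock(
        credentials_profile_name="default",
        model_id="mistral.mistral-large-2402-v1:0",
    )
    # Assumption: the pydantic field can be overridden per instance.
    llm.provider_stop_sequence_key_name_map = corrected_map
except Exception:
    # langchain-community not installed or AWS credentials unavailable;
    # the mapping above still documents the intended change.
    pass
```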

raajChit commented 2 months ago

How do I implement this for the Llama models? The bedrock.py file doesn't seem to mention anything regarding them.