langchain-ai / langchain

🦜🔗 Build context-aware reasoning applications
https://python.langchain.com

Error: Stop sequence key name for {meta or mistral or any other model} is not supported with AWS Bedrock #20053

Open · david7joy opened this issue 7 months ago

david7joy commented 7 months ago


Example Code

I am trying to use AWS Bedrock models such as Llama / Mistral with LangChain libraries such as SQLDatabaseToolkit.

from langchain_community.agent_toolkits import SQLDatabaseToolkit
from langchain_community.agent_toolkits.sql.base import create_sql_agent
from langchain_community.llms import Bedrock
from langchain_community.utilities import SQLDatabase
from langchain_core.callbacks import StreamingStdOutCallbackHandler

model = Bedrock(credentials_profile_name="my-profile",
                model_id="meta.llama2-70b-chat-v1",
                model_kwargs={"temperature": 0.5},
                streaming=True,
                callbacks=[StreamingStdOutCallbackHandler()])

db = SQLDatabase.from_uri('database url')
toolkit = SQLDatabaseToolkit(llm=model, db=db)
agent_executor = create_sql_agent(llm=model, toolkit=toolkit, verbose=True, handle_parsing_errors=True)

# Query Module
prompt = "..."  # any natural-language question for the SQL agent
result = agent_executor.invoke(prompt)

Error Message and Stack Trace (if applicable)

This errors out with the following.

  File "/opt/homebrew/lib/python3.11/site-packages/langchain_community/llms/bedrock.py", line 833, in _call
    for chunk in self._stream(
  File "/opt/homebrew/lib/python3.11/site-packages/langchain_community/llms/bedrock.py", line 613, in _prepare_input_and_invoke_stream
    raise ValueError(
ValueError: Stop sequence key name for meta is not supported.

Description

I have tried the same code with OpenAI, Ollama Mistral/Llama, and Google GenAI models, and none of them show this error. This seems to be something in the way the Bedrock integration works in LangChain, or in the Bedrock service itself.

Is there a workaround I can use to get this to work?

System Info

System Information

OS: Darwin
OS Version: Darwin Kernel Version 23.3.0: Wed Dec 20 21:30:44 PST 2023
Python Version: 3.11.7 (main, Dec 4 2023, 18:10:11) [Clang 15.0.0 (clang-1500.1.0.2.5)]

Package Information

langchain_core: 0.1.40
langchain: 0.1.14
langchain_community: 0.0.31
langsmith: 0.1.40
langchain_experimental: 0.0.56
langchain_openai: 0.0.5
langchain_text_splitters: 0.0.1

Packages not installed (Not Necessarily a Problem)

The following packages were not found:

langgraph
langserve

jonathancaevans commented 7 months ago

Bedrock Meta models don't currently support stop sequences.

Mistral models have a parameter-mapping issue, addressed in the PR linked above.

If you want to use any of the three Mistral models for an agent, you can use the model_kwargs settings to define the stop sequences at the client level, and then, until the PR is merged, modify the agent_executor object so it does not pass any stop sequences over to the LLM class.

Like so, roughly:

from langchain_community.llms import Bedrock

llm = Bedrock(
    model_id="mistral.mixtral-8x7b-instruct-v0:1",
    model_kwargs={'stop': ['Stop!']}  # stop sequences defined at the client level
)

llm.invoke('<s>[INST]Can you say `Stop!`?[/INST]')

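For the "don't let the agent pass stop sequences" half, one option is a small wrapper that discards the per-call stop argument before it reaches the request builder. This is only a sketch under my own assumptions, not an official langchain class: the subclass name is made up, and it relies on Bedrock._call dispatching to _stream internally when streaming=True (as the traceback above shows), so a direct .stream() call would need _stream overridden the same way.

from typing import Any, List, Optional

from langchain_community.llms import Bedrock
from langchain_core.callbacks import CallbackManagerForLLMRun

class BedrockIgnoreStop(Bedrock):  # hypothetical helper, not part of langchain
    """Drop the per-call `stop` list an agent injects and rely on the
    stop sequences already set in model_kwargs, so no unsupported
    stop-sequence key is added to the Bedrock request body."""

    def _call(
        self,
        prompt: str,
        stop: Optional[List[str]] = None,
        run_manager: Optional[CallbackManagerForLLMRun] = None,
        **kwargs: Any,
    ) -> str:
        # Discard `stop` before delegating to the normal Bedrock request path.
        return super()._call(prompt, stop=None, run_manager=run_manager, **kwargs)
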
t-mac81 commented 7 months ago

Thanks for the PR; I'm surprised this wasn't tested before the Bedrock wrapper was released.

david7joy commented 7 months ago

@jonathancaevans Thanks. I did try this before, but it doesn't work either, especially when I am using an agent_executor.

prompt = "How much money do I have in my account. Final Result should show amount. Think step by step."

model = Bedrock(credentials_profile_name="crl-revenue",
                model_id="mistral.mistral-7b-instruct-v0:2",
                model_kwargs={'stop' : ['Stop!']},
                streaming=True,
                callbacks=[StreamingStdOutCallbackHandler()])

db = SQLDatabase.from_uri('database url')
toolkit = SQLDatabaseToolkit(llm=model,db=db)
agent_executor = create_sql_agent(llm=model, toolkit=toolkit, verbose=True, handle_parsing_errors=True)

# Query Modeule 
result = agent_executor.invoke(prompt)
File "/opt/homebrew/lib/python3.11/site-packages/langchain_community/llms/bedrock.py", line 654, in _prepare_input_and_invoke_stream
    raise ValueError(f"Error raised by bedrock service: {e}")
ValueError: Error raised by bedrock service: An error occurred (ValidationException) when calling the InvokeModelWithResponseStream operation: Malformed input request: #: extraneous key [stop_sequences] is not permitted, please reformat your input and try again.

I tried other services, like OpenAI models and Gemini models via Vertex AI, and none of them have this issue. Hopefully that PR merge will help.

t-mac81 commented 7 months ago

Fairly simple workaround for now until the PR is merged: after you declare your llm:

llm.provider_stop_sequence_key_name_map = {'anthropic': 'stop_sequences',
                                           'amazon': 'stopSequences',
                                           'ai21': 'stop_sequences',
                                           'cohere': 'stop_sequences',
                                           'mistral': 'stop'}
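
In context, that looks like the sketch below (assuming the Mistral model from the report above, with db and the toolkit set up exactly as in the original code; the profile name is illustrative). The override has to happen after the llm is constructed and before the agent runs, so the agent's stop tokens get mapped to the 'stop' key the Bedrock Mistral schema expects:

llm = Bedrock(credentials_profile_name="my-profile",
              model_id="mistral.mistral-7b-instruct-v0:2")
llm.provider_stop_sequence_key_name_map = {'anthropic': 'stop_sequences',
                                           'amazon': 'stopSequences',
                                           'ai21': 'stop_sequences',
                                           'cohere': 'stop_sequences',
                                           'mistral': 'stop'}
agent_executor = create_sql_agent(llm=llm, toolkit=SQLDatabaseToolkit(llm=llm, db=db),
                                  verbose=True, handle_parsing_errors=True)
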
wild-thomas commented 6 months ago

> Fairly simple workaround for now until the PR is merged: after you declare your llm:
>
> llm.provider_stop_sequence_key_name_map = {'anthropic': 'stop_sequences', 'amazon': 'stopSequences', 'ai21': 'stop_sequences', 'cohere': 'stop_sequences', 'mistral': 'stop'}

What do we have to change for 'meta' here? I'm getting the same error for 'meta.llama3-8b-instruct-v1:0'.

t-mac81 commented 6 months ago

> What do we have to change for 'meta' here? I'm getting the same error for 'meta.llama3-8b-instruct-v1:0'.

Meta is not in the key map. Maybe you can add the stop sequences in as kwargs; you'll have to check what boto3 supports with Meta models.

bqmackay commented 6 months ago

I added "meta": "" into the list. It doesn't crash, but I doubt that's the right way to do it. The fact that Llama3 doesn't have a stop sequence makes me want to believe that leaving it blank is ok.
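
Concretely, that means extending the workaround map above with a blank entry, as in this sketch (a stopgap only; note the follow-up below reporting that it did not work in every setup):

llm.provider_stop_sequence_key_name_map = {'anthropic': 'stop_sequences',
                                           'amazon': 'stopSequences',
                                           'ai21': 'stop_sequences',
                                           'cohere': 'stop_sequences',
                                           'mistral': 'stop',
                                           'meta': ''}  # blank: no stop-sequence key for Meta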

wild-thomas commented 6 months ago

llm.provider_stop_sequence_key_name_map = {...} is not the solution for the initial bug reported.

#19220 seems to address the problem at the create_react_agent() function.

AyushSonuu commented 4 months ago

"meta": ""

ValueError: Stop sequence key name for meta is not supported. getting this when using meta llama3

verissimomanoel commented 3 months ago

I'm having the same problem using create_sql_agent. The only models that can be used are from Anthropic.

raajChit commented 2 months ago

Is there any other way to use a ReAct agent? If not, when will this bug be fixed on the AWS side?