david7joy opened this issue 7 months ago
Bedrock Meta models don't currently support stop sequences. Mistral models have a parameter mapping issue addressed in the above PR.
If you want to use any of the three Mistral models with an agent, you can use the model_kwargs setting to define stop sequences at the client level, and then modify the agent_executor object to remove any stop sequences it would otherwise pass to the LLM class, until the PR is merged.
Like so, more or less:
```python
from langchain_community.llms import Bedrock

llm = Bedrock(
    model_id="mistral.mixtral-8x7b-instruct-v0:1",
    model_kwargs={'stop': ['Stop!']},
)

llm('<s>[INST]Can you say `Stop!`?[/INST]')
```
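For the "remove any stop sequences" part, here is a minimal sketch of one way to do it, assuming the agent only passes stop words through the standard `_call` path (the `BedrockNoStop` name is mine, for illustration; this is not from the PR):

```python
# Hypothetical sketch: discard the stop sequences the agent executor
# passes at call time, so the wrapper never serializes a stop_sequences
# key that Bedrock Mistral/Meta models reject. Actual stopping behavior
# then comes from the client-level model_kwargs set above.
class BedrockNoStop(Bedrock):
    def _call(self, prompt, stop=None, run_manager=None, **kwargs):
        # Ignore agent-supplied `stop`.
        return super()._call(prompt, stop=None, run_manager=run_manager, **kwargs)
```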
Thanks for the PR, I'm surprised this wasn't tested before the Bedrock wrapper was released.
@jonathancaevans Thanks. I did try this before, but it doesn't work either, especially when I am using an agent_executor.
```python
from langchain_community.llms import Bedrock
from langchain_community.utilities import SQLDatabase
from langchain_community.agent_toolkits import SQLDatabaseToolkit, create_sql_agent
from langchain_core.callbacks import StreamingStdOutCallbackHandler

prompt = "How much money do I have in my account. Final Result should show amount. Think step by step."

model = Bedrock(
    credentials_profile_name="crl-revenue",
    model_id="mistral.mistral-7b-instruct-v0:2",
    model_kwargs={'stop': ['Stop!']},
    streaming=True,
    callbacks=[StreamingStdOutCallbackHandler()],
)

db = SQLDatabase.from_uri('database url')
toolkit = SQLDatabaseToolkit(llm=model, db=db)
agent_executor = create_sql_agent(llm=model, toolkit=toolkit, verbose=True, handle_parsing_errors=True)

# Query module
result = agent_executor.invoke(prompt)
```
File "/opt/homebrew/lib/python3.11/site-packages/langchain_community/llms/bedrock.py", line 654, in _prepare_input_and_invoke_stream
raise ValueError(f"Error raised by bedrock service: {e}")
ValueError: Error raised by bedrock service: An error occurred (ValidationException) when calling the InvokeModelWithResponseStream operation: Malformed input request: #: extraneous key [stop_sequences] is not permitted, please reformat your input and try again.
I tried other services (OpenAI models, Gemini models via Vertex) and none of them have this issue. Hopefully that PR merge will help.
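For reference, here is a sketch of the underlying constraint via plain boto3 (the body fields follow the Bedrock Mistral request format as I understand it, so treat the details as assumptions): Mistral models take a `stop` list, and sending `stop_sequences` instead produces the ValidationException quoted above.

```python
import json
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

# Mistral on Bedrock accepts "stop"; "stop_sequences" is rejected as an
# extraneous key, which is exactly the error in the stack trace above.
body = {
    "prompt": "<s>[INST]Can you say `Stop!`?[/INST]",
    "max_tokens": 64,
    "stop": ["Stop!"],
}
response = client.invoke_model(
    modelId="mistral.mistral-7b-instruct-v0:2",
    body=json.dumps(body),
)
print(json.loads(response["body"].read()))
```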
Fairly simple workaround for now until the PR is merged: after you declare your llm:
```python
llm.provider_stop_sequence_key_name_map = {
    'anthropic': 'stop_sequences',
    'amazon': 'stopSequences',
    'ai21': 'stop_sequences',
    'cohere': 'stop_sequences',
    'mistral': 'stop',
}
```
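As I read it, the wrapper uses that map roughly like this (a paraphrase inferred from the attribute name and the error above, not the actual library source): the provider is the `model_id` prefix, and the mapped value names the stop-sequence field in the request body.

```python
# Paraphrased sketch of the lookup; not the real implementation.
provider = "mistral.mistral-7b-instruct-v0:2".split(".")[0]   # -> "mistral"
stop_key = llm.provider_stop_sequence_key_name_map[provider]  # -> "stop"
request_body = {"prompt": "<s>[INST]...[/INST]", stop_key: ["Stop!"]}
```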
What do we have to change for 'meta' here? I'm getting the same error for 'meta.llama3-8b-instruct-v1:0'.
Meta is not in the key map; maybe you can add stop sequences as kwargs. You'll have to check what boto3 supports with Meta models.
I added "meta": ""
into the list. It doesn't crash, but I doubt that's the right way to do it. The fact that Llama3 doesn't have a stop sequence makes me want to believe that leaving it blank is ok.
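In code, that suggestion is just the following (unverified; whether requests then succeed depends on the wrapper dropping the empty key name):

```python
# Unverified workaround from this thread: give "meta" an empty stop key
# name so the map lookup stops raising. Llama 3 on Bedrock appears to
# take no stop-sequence parameter at all.
llm.provider_stop_sequence_key_name_map["meta"] = ""
```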
`llm.provider_stop_sequence_key_name_map = {...}` is not the solution for the initial bug reported.
"meta": ""
ValueError: Stop sequence key name for meta is not supported. getting this when using meta llama3
I'm having the same problem using create_sql_agent; the only models that can be used are from Anthropic. Is there any other way to use a ReAct agent? If not, when will this bug be fixed on the AWS side?
Example Code
I am trying to use AWS Bedrock models such as Llama / Mistral with LangChain libraries such as SQLDatabaseToolkit; the reproduction snippet using create_sql_agent appears earlier in this thread.
Error Message and Stack Trace (if applicable)
This errors out with the ValidationException (extraneous key [stop_sequences]) shown in the stack trace earlier in this thread.
Description
I have tried the same code with OpenAI, Ollama (Mistral/Llama), and Google GenAI models, and none of them show this error. This seems like something in the way the Bedrock library works in LangChain, or in the Bedrock service itself. Is there a workaround I can use to get this to work?