Open shayanrayamzn opened 4 hours ago
If there are any workarounds for this, please let me know. Adding a provider_stop_sequence_key_name_map works for other issues, but not in this case: when guardrails intervene and execution enters the msg_type == "message_delta" block, the map is not consulted.
```python
llm.provider_stop_sequence_key_name_map = {
    'anthropic': 'stop_sequences',
    'amazon': 'stopSequences',
    'ai21': 'stop_sequences',
    'cohere': 'stop_sequences',
    'mistral': 'stop',
}
```
Hello,

In langchain_aws/llms/bedrock.py, the "stop_sequence" key is hardcoded for the msg_type == "message_delta" branch. Shouldn't this key be resolved from the LLM provider type via self._get_provider() instead?

This breaks integration with Bedrock guardrails and LangChain agents when guardrails intervene.
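A minimal sketch of what provider-aware resolution could look like. This is not the actual langchain_aws implementation; the helper names (stop_key_for_provider, extract_stop_reason) are hypothetical, and the map simply mirrors the provider_stop_sequence_key_name_map shown above, falling back to the currently hardcoded "stop_sequence":

```python
# Hypothetical sketch: resolve the stop-sequence key from the provider
# instead of hardcoding "stop_sequence" in the message_delta branch.

PROVIDER_STOP_KEY_MAP = {
    "anthropic": "stop_sequences",
    "amazon": "stopSequences",
    "ai21": "stop_sequences",
    "cohere": "stop_sequences",
    "mistral": "stop",
}

def stop_key_for_provider(provider: str) -> str:
    # Fall back to "stop_sequence" (the current hardcoded value) for
    # providers not present in the map.
    return PROVIDER_STOP_KEY_MAP.get(provider, "stop_sequence")

def extract_stop_value(delta: dict, provider: str):
    # In the msg_type == "message_delta" branch, look up the
    # provider-specific key rather than assuming one payload shape.
    return delta.get(stop_key_for_provider(provider))
```

With something like this, a guardrails-intervened message_delta event from a non-Anthropic provider would still resolve the correct key instead of silently missing it.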