langchain-ai / langchain

🦜🔗 Build context-aware reasoning applications
https://python.langchain.com
MIT License

When BedrockChat model is initialized with guardrails argument, _prepare_input_and_invoke raises "Unknown parameter in input: "guardrail" exception" #21107

Closed: gnanda17 closed this issue 6 months ago

gnanda17 commented 6 months ago

Checked other resources

Example Code

import boto3

from langchain_community.chat_models import BedrockChat
from langchain_core.prompts import ChatPromptTemplate, HumanMessagePromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain.schema import SystemMessage

bedrock_client = boto3.client(service_name="bedrock-runtime")
bedrock_model = BedrockChat(
    client=bedrock_client,
    model_id="anthropic.claude-3-sonnet-20240229-v1:0",
    model_kwargs={"temperature": 0},
    guardrails={"id": "<ModelID>", "version": "1", "trace": True},  # "id" is the guardrail ID (placeholder)
)

human_message_template = HumanMessagePromptTemplate.from_template(
    "Input: ```{activity_note_input}```\nOutput: "
)
messages = [
    SystemMessage(content="<Prompt>"),
    human_message_template,
]
prompt = ChatPromptTemplate.from_messages(messages)

activity_note_input = "FOOBAR"
chain = prompt | bedrock_model | StrOutputParser()
response = chain.invoke({"activity_note_input": activity_note_input})

Error Message and Stack Trace (if applicable)

2024-04-30 14:06:46,160 ERROR:request:
Traceback (most recent call last):
  File "/Users/girishnanda/Code/python-mono/lasagna/venv/lib/python3.12/site-packages/langchain_community/llms/bedrock.py", line 546, in _prepare_input_and_invoke
    response = self.client.invoke_model(**request_options)
  File "/Users/girishnanda/Code/python-mono/lasagna/venv/lib/python3.12/site-packages/botocore/client.py", line 565, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "/Users/girishnanda/Code/python-mono/lasagna/venv/lib/python3.12/site-packages/botocore/client.py", line 974, in _make_api_call
    request_dict = self._convert_to_request_dict(
  File "/Users/girishnanda/Code/python-mono/lasagna/venv/lib/python3.12/site-packages/botocore/client.py", line 1048, in _convert_to_request_dict
    request_dict = self._serializer.serialize_to_request(
  File "/Users/girishnanda/Code/python-mono/lasagna/venv/lib/python3.12/site-packages/botocore/validate.py", line 381, in serialize_to_request
    raise ParamValidationError(report=report.generate_report())
botocore.exceptions.ParamValidationError: Parameter validation failed: Unknown parameter in input: "guardrail", must be one of: body, contentType, accept, modelId, trace, guardrailIdentifier, guardrailVersion

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/girishnanda/Code/python-mono/lasagna/venv/lib/python3.12/site-packages/flask/app.py", line 1484, in full_dispatch_request
    rv = self.dispatch_request()
  File "/Users/girishnanda/Code/python-mono/lasagna/venv/lib/python3.12/site-packages/flask/app.py", line 1469, in dispatch_request
    return self.ensure_sync(self.view_functions[rule.endpoint])(**view_args)
  File "/Users/girishnanda/Code/python-mono/lasagna/venv/lib/python3.12/site-packages/flask/views.py", line 109, in view
    return current_app.ensure_sync(self.dispatch_request)(**kwargs)
  File "/Users/girishnanda/Code/python-mono/lasagna/venv/lib/python3.12/site-packages/flask/views.py", line 190, in dispatch_request
    return current_app.ensure_sync(meth)(**kwargs)
  File "/Users/girishnanda/Code/python-mono/lasagna/lasagna/clients/artichoke.py", line 28, in inner
    return func(*args, **kwargs)
  File "/Users/girishnanda/Code/python-mono/lasagna/lasagna/clients/launchdarkly.py", line 29, in inner
    return func(*args, **kwargs)
  File "/Users/girishnanda/Code/python-mono/lasagna/lasagna/api/v3/activity_summary.py", line 79, in post
    resp = retry_with_backoff(
  File "/Users/girishnanda/Code/python-mono/lasagna/lasagna/utilities/retry.py", line 25, in retry_with_backoff
    raise last_exception
  File "/Users/girishnanda/Code/python-mono/lasagna/lasagna/utilities/retry.py", line 18, in retry_with_backoff
    return func(*args)
  File "/Users/girishnanda/Code/python-mono/lasagna/lasagna/api/v3/activity_summary.py", line 139, in fetch_activity_summary
    response = chain.invoke({"activity_note_input": activity_note_input})
  File "/Users/girishnanda/Code/python-mono/lasagna/venv/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 2499, in invoke
    input = step.invoke(
  File "/Users/girishnanda/Code/python-mono/lasagna/venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 158, in invoke
    self.generate_prompt(
  File "/Users/girishnanda/Code/python-mono/lasagna/venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 560, in generate_prompt
    return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
  File "/Users/girishnanda/Code/python-mono/lasagna/venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 421, in generate
    raise e
  File "/Users/girishnanda/Code/python-mono/lasagna/venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 411, in generate
    self._generate_with_cache(
  File "/Users/girishnanda/Code/python-mono/lasagna/venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 632, in _generate_with_cache
    result = self._generate(
  File "/Users/girishnanda/Code/python-mono/lasagna/venv/lib/python3.12/site-packages/langchain_community/chat_models/bedrock.py", line 294, in _generate
    completion, usage_info = self._prepare_input_and_invoke(
  File "/Users/girishnanda/Code/python-mono/lasagna/venv/lib/python3.12/site-packages/langchain_community/llms/bedrock.py", line 553, in _prepare_input_and_invoke
    raise ValueError(f"Error raised by bedrock service: {e}")
ValueError: Error raised by bedrock service: Parameter validation failed: Unknown parameter in input: "guardrail", must be one of: body, contentType, accept, modelId, trace, guardrailIdentifier, guardrailVersion

Description

I am trying to use AWS Bedrock Guardrails with the BedrockChat model. If I set the guardrails parameter when instantiating BedrockChat, a ValueError is raised when the chain is invoked.
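
To make the mismatch concrete: the Bedrock runtime rejects a top-level "guardrail" key and instead expects the flat guardrailIdentifier / guardrailVersion / trace parameters listed in the ParamValidationError. Below is a minimal sketch of the translation the wrapper would need to perform; the helper name and the shape of the guardrails dict mirror the example above and are otherwise assumptions, not the actual langchain_community implementation.

def guardrails_to_request_params(guardrails: dict) -> dict:
    """Hypothetical helper: map a BedrockChat-style guardrails config
    ({"id": ..., "version": ..., "trace": ...}) onto the parameter names
    the bedrock-runtime invoke_model API accepts after the Guardrails GA."""
    params = {
        "guardrailIdentifier": guardrails["id"],
        "guardrailVersion": guardrails["version"],
    }
    if guardrails.get("trace"):
        # trace is a string enum on the API side, not a boolean
        params["trace"] = "ENABLED"
    return params


# The failing request effectively sends a nested {"guardrail": {...}} parameter,
# which the API rejects, while it expects flat parameters like these:
print(guardrails_to_request_params({"id": "<GuardrailID>", "version": "1", "trace": True}))
# {'guardrailIdentifier': '<GuardrailID>', 'guardrailVersion': '1', 'trace': 'ENABLED'}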

System Info

langchain==0.1.17rc1
langchain-community==0.0.34
langchain-core==0.1.47
langchain-openai==0.0.8
langchain-text-splitters==0.0.1
openinference-instrumentation-langchain==0.1.14

mattsalt123 commented 6 months ago

Same issue here. Bedrock officially released Guardrails in the last week and, in the process, changed the parameters for invoke_model and invoke_model_with_response_stream.

response = client.invoke_model_with_response_stream(
    body=b'bytes'|file,
    contentType='string',
    accept='string',
    modelId='string',
    trace='ENABLED'|'DISABLED',
    guardrailIdentifier='string',
    guardrailVersion='string'
)
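
For anyone blocked on this, one stopgap until the wrapper is fixed is to call the runtime client directly with the new parameter names. A minimal sketch, assuming an Anthropic messages-style payload and a placeholder guardrail ID; adjust the body to your model:

import json

import boto3

client = boto3.client(service_name="bedrock-runtime")

# Model-specific request body (Anthropic messages format on Bedrock).
body = json.dumps({
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "FOOBAR"}],
})

# Guardrail settings are now flat, top-level arguments on the call itself.
response = client.invoke_model(
    body=body,
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    contentType="application/json",
    accept="application/json",
    guardrailIdentifier="<GuardrailID>",  # placeholder
    guardrailVersion="1",
    trace="ENABLED",
)
print(json.loads(response["body"].read()))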

gnanda17 commented 6 months ago

https://github.com/langchain-ai/langchain/pull/20216#issuecomment-2088006582 suggests that the langchain-community Bedrock integrations are deprecated and that the current implementations live at https://github.com/langchain-ai/langchain-aws.

Found relevant issue at https://github.com/langchain-ai/langchain-aws/issues/25 with an open PR at https://github.com/langchain-ai/langchain-aws/pull/26
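
For anyone landing here, a minimal migration sketch to the langchain-aws package. The guardrails key names below follow the new invoke_model parameters and may differ from what the linked PR settles on, so treat them as assumptions and check the langchain-aws docs:

import boto3
from langchain_aws import ChatBedrock
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate

bedrock_client = boto3.client(service_name="bedrock-runtime")

# guardrails config keys are an assumption; see the langchain-aws issue/PR above
bedrock_model = ChatBedrock(
    client=bedrock_client,
    model_id="anthropic.claude-3-sonnet-20240229-v1:0",
    model_kwargs={"temperature": 0},
    guardrails={"guardrailIdentifier": "<GuardrailID>", "guardrailVersion": "1", "trace": True},
)

prompt = ChatPromptTemplate.from_messages([
    ("system", "<Prompt>"),
    ("human", "Input: ```{activity_note_input}```\nOutput: "),
])
chain = prompt | bedrock_model | StrOutputParser()
print(chain.invoke({"activity_note_input": "FOOBAR"}))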

Going to close this one then.