aws-samples / aws-genai-llm-chatbot

A modular and comprehensive solution to deploy a Multi-LLM and Multi-RAG powered chatbot (Amazon Bedrock, Anthropic, HuggingFace, OpenAI, Meta, AI21, Cohere, Mistral) using AWS CDK on AWS
https://aws-samples.github.io/aws-genai-llm-chatbot/
MIT No Attribution

Issue: ERROR Incompatibility of Callback Handler in `AzureChatOpenAI` with Custom Callback Class #572

Closed michel-heon closed 2 months ago

michel-heon commented 2 months ago

Description

After cloning the repository git@github.com:aws-samples/aws-genai-llm-chatbot.git and running the code, I encountered an issue where my custom callback handler (LLMStartHandler) causes a validation error with AzureChatOpenAI. This same handler works correctly with BedrockLLM. The error suggests that AzureChatOpenAI strictly expects a BaseCallbackManager for the callback_manager field. This issue began after commit 6242c59d0a9be8910d049e42dc03af5d4d614de1.

Error Context

The error trace was extracted from the CloudWatch logs of the Lambda function GenAIChatBotStack-LangchainInterfaceRequest. The issue occurs when the function is invoked after cloning the repository and deploying the stack.

The following commands can be used to reproduce the issue:

git clone git@github.com:aws-samples/aws-genai-llm-chatbot.git
cd aws-genai-llm-chatbot
git checkout 6242c59d0a9be8910d049e42dc03af5d4d614de1

Here’s the error trace from CloudWatch logs:

pydantic.v1.error_wrappers.ValidationError: 1 validation error for AzureChatOpenAI
callback_manager
  instance of BaseCallbackManager expected (type=type_error.arbitrary_type; expected_arbitrary_type=BaseCallbackManager)

This error appears immediately after executing the Lambda function GenAIChatBotStack-LangchainInterfaceRequest with the custom callback handler, while the same handler works fine with BedrockLLM.
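
For illustration, the arbitrary-type check that the trace reports can be mimicked with stand-in classes. These are not LangChain's actual classes; this is only a sketch of why a bare list of handlers is rejected by a field that expects a manager instance:

```python
# Stand-ins for the real LangChain classes; names match for readability only.
class BaseCallbackManager:
    def __init__(self, handlers):
        self.handlers = handlers

class LLMStartHandler:  # stand-in for the custom handler from the report
    pass

class StrictModel:
    """Mimics a pydantic model whose callback_manager field only accepts
    an instance of BaseCallbackManager (an "arbitrary type" check)."""
    def __init__(self, callback_manager=None):
        if callback_manager is not None and not isinstance(
            callback_manager, BaseCallbackManager
        ):
            raise TypeError("instance of BaseCallbackManager expected")
        self.callback_manager = callback_manager

# A bare list of handlers is rejected, matching the trace above:
try:
    StrictModel(callback_manager=[LLMStartHandler()])
    outcome = "accepted"
except TypeError:
    outcome = "rejected"
print(outcome)  # rejected

# Wrapping the handlers in a manager instance satisfies the check:
StrictModel(callback_manager=BaseCallbackManager([LLMStartHandler()]))
```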

Affected Files and Paths

The issue impacts the following files located at these paths:

Reproduction Steps

  1. Clone the repository:

    git clone git@github.com:aws-samples/aws-genai-llm-chatbot.git
    cd aws-genai-llm-chatbot
    git checkout 6242c59d0a9be8910d049e42dc03af5d4d614de1
  2. Deploy the Lambda function and execute it, using a custom callback handler in AzureChatOpenAI:

    import os
    from langchain_community.chat_models import AzureChatOpenAI

    return AzureChatOpenAI(
        openai_api_key=os.environ.get("AZURE_OPENAI_API_KEY"),
        callbacks=[self.callback_handler],  # Custom callback handler (LLMStartHandler)
    )
  3. Check CloudWatch for the error logs. The validation error occurs for the callback_manager.

  4. Test the same handler in the following Bedrock files, where it works without issue:

    • /lib/model-interfaces/langchain/functions/request-handler/adapters/bedrock/bedrock_mistral.py
    • /lib/model-interfaces/langchain/functions/request-handler/adapters/bedrock/bedrock_cohere.py
    • /lib/model-interfaces/langchain/functions/request-handler/adapters/bedrock/ai21_j2.py
    return BedrockLLM(
        client=bedrock,
        model_id=self.model_id,
        callbacks=[self.callback_handler],  # same custom handler, accepted here
    )
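
For context, here is a hypothetical minimal version of the custom LLMStartHandler referenced in the steps above. The repository's actual handler may differ, and the import is deferred so the sketch stands alone without langchain-core installed:

```python
def make_llm_start_handler():
    """Build a minimal LLM-start callback handler (hypothetical sketch)."""
    # Deferred import: langchain-core is only needed when this is called.
    from langchain_core.callbacks import BaseCallbackHandler

    class LLMStartHandler(BaseCallbackHandler):
        def on_llm_start(self, serialized, prompts, **kwargs):
            # Log how many prompts the LLM was invoked with.
            print(f"LLM start: {len(prompts)} prompt(s)")

    return LLMStartHandler()
```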

Expected Behavior

The custom callback handler should be compatible with both AzureChatOpenAI and BedrockLLM without throwing a validation error.

Actual Behavior

AzureChatOpenAI raises the pydantic ValidationError shown above (callback_manager: instance of BaseCallbackManager expected), while BedrockLLM accepts the same handler without error.

Commit Information

This issue started after commit 6242c59d0a9be8910d049e42dc03af5d4d614de1. The commit likely introduced stricter validation for AzureChatOpenAI.

Suggested Fix

  1. Align the behavior of AzureChatOpenAI with BedrockLLM by relaxing the type validation for the callback_manager field.

charles-marion commented 2 months ago

Hi @michel-heon ,

Based on my understanding, the validation error is triggered by LangChain itself, which makes it difficult to change the validation on our side.

I commented out the line that sets the callback_handler and the issue still occurred: https://github.com/aws-samples/aws-genai-llm-chatbot/blob/main/lib/model-interfaces/langchain/functions/request-handler/adapters/azureopenai/azuregpt.py#L36

This might be related to the class being incompatible with the current Langchain version.

`from langchain_community.chat_models import AzureChatOpenAI` (langchain-community) is deprecated in favor of `from langchain_openai import AzureChatOpenAI` (I tried with langchain-openai==0.1.25): https://github.com/aws-samples/aws-genai-llm-chatbot/blob/main/lib/shared/layers/common/requirements.txt

After changing the library and updating the parameter from openai_api_base (deprecated) to azure_endpoint, it got past the error. Unfortunately, I was not able to test further because my environment is not set up.
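
That migration might look like the following sketch. The env var names and the function wrapper are assumptions, and the import is deferred because constructing the client needs the langchain-openai package and Azure credentials:

```python
import os

def build_azure_chat_llm(callback_handler):
    """Sketch of the migrated adapter: langchain_openai instead of the
    deprecated langchain_community import, azure_endpoint instead of the
    deprecated openai_api_base parameter."""
    from langchain_openai import AzureChatOpenAI  # replaces langchain_community.chat_models
    return AzureChatOpenAI(
        azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # was openai_api_base
        openai_api_key=os.environ["AZURE_OPENAI_API_KEY"],
        # an openai_api_version argument may also be required depending on setup
        callbacks=[callback_handler],  # a plain list of handlers is accepted here
    )
```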

To unblock you, you could try an older commit with an older version of LangChain (it was updated in #553), but I would recommend upgrading AzureChatOpenAI to use the new library instead.

Also note that BedrockLLM, bedrock_cohere.py, and bedrock_mistral.py are no longer used in the project; they were recently replaced by ChatBedrockConverse.
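
Under that recent change, an adapter would construct ChatBedrockConverse instead. A minimal sketch follows; the function name and parameters are assumptions, and the import is deferred because it requires the langchain-aws package and AWS credentials:

```python
def build_bedrock_chat_llm(callback_handler, model_id):
    """Sketch of a ChatBedrockConverse-based setup replacing BedrockLLM."""
    from langchain_aws import ChatBedrockConverse  # requires langchain-aws
    return ChatBedrockConverse(
        model_id=model_id,
        callbacks=[callback_handler],  # same callbacks parameter as before
    )
```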

charles-marion commented 2 months ago

@michel-heon I added a fix here: https://github.com/aws-samples/aws-genai-llm-chatbot/pull/574

charles-marion commented 2 months ago

Closing. Please re-open if you are still having issues.