second-opinion-ai / second-opinion


Add support for Anthropic model #39

Open branhoff opened 2 months ago

branhoff commented 2 months ago

Description: Currently, our car diagnostic application only supports OpenAI's GPT models for generating diagnostic responses. We would like to extend it to also support Anthropic's language models, such as Claude, so users have more flexibility in choosing a provider.

Desired Functionality

Implementation Details

  1. Add the langchain_anthropic package to requirements.txt.

  2. Update the llm_interaction module (see the first sketch after this list):

    • Import the necessary classes from langchain_anthropic, such as ChatAnthropic and AnthropicEmbeddings.
    • Modify the get_llm function to return a ChatAnthropic instance when the LLM_TYPE environment variable is set to "anthropic".
    • Modify the get_embeddings function to return an AnthropicEmbeddings instance when the EMBEDDINGS_TYPE environment variable is set to "anthropic".
  3. Update the get_context function in the util.py module (see the second sketch after this list):

    • Check the LLM_TYPE environment variable to determine the selected LLM type.
    • If LLM_TYPE is set to "anthropic", create the agent_executor without specifying the agent parameter to avoid passing unsupported arguments.
    • For other LLM types (e.g., OpenAI), create the agent_executor with the agent parameter set to "openai-functions".
  4. Update the unit tests in tests/test_llm_interaction.py to include test cases for Anthropic models and embeddings (an example test appears after this list).

  5. Update the application's documentation and README to reflect the added support for Anthropic models and provide instructions on how to configure the environment variables for using Anthropic.
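
For step 2, a minimal sketch of the get_llm dispatch, assuming the module reads LLM_TYPE from the environment and that the existing OpenAI path uses langchain_openai's ChatOpenAI; the model names are placeholders. get_embeddings would branch the same way on EMBEDDINGS_TYPE, assuming the installed langchain_anthropic version provides the embeddings class named above.

```python
import os

from langchain_anthropic import ChatAnthropic
from langchain_openai import ChatOpenAI


def get_llm():
    """Return a chat model chosen by the LLM_TYPE environment variable."""
    llm_type = os.getenv("LLM_TYPE", "openai").lower()
    if llm_type == "anthropic":
        # Placeholder model name; use whichever Claude model the project targets.
        return ChatAnthropic(model="claude-3-haiku-20240307", temperature=0)
    # Fall back to the existing OpenAI path.
    return ChatOpenAI(model="gpt-4o-mini", temperature=0)
```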
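
For step 3, a sketch of the conditional in get_context, assuming the executor is built from keyword arguments; create_agent_executor below is a stand-in for whatever constructor util.py already calls, not a real API.

```python
import os


def _agent_executor_kwargs() -> dict:
    """Extra keyword arguments for building the agent executor.

    Anthropic models do not accept the OpenAI function-calling agent type,
    so the `agent` argument is only passed on the non-Anthropic path.
    """
    if os.getenv("LLM_TYPE", "openai").lower() == "anthropic":
        return {}
    return {"agent": "openai-functions"}


# Inside get_context, the existing call would then look roughly like:
#   agent_executor = create_agent_executor(llm, tools, **_agent_executor_kwargs())
```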
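
For step 4, an example test along the lines the issue describes, assuming get_llm is importable from an llm_interaction module; the dummy API key only keeps client construction independent of real credentials, and no request is made.

```python
from langchain_anthropic import ChatAnthropic

from llm_interaction import get_llm


def test_get_llm_returns_anthropic(monkeypatch):
    # Select the Anthropic path and provide a dummy key so the client
    # can be constructed without real credentials.
    monkeypatch.setenv("LLM_TYPE", "anthropic")
    monkeypatch.setenv("ANTHROPIC_API_KEY", "test-key")

    llm = get_llm()

    assert isinstance(llm, ChatAnthropic)
```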

Benefits

Considerations