Open jarib opened 2 months ago
To address the issue of using Anthropic Claude 3 with LangChain.js, particularly with the `ChatVertexAI` class, it's important to note that the current implementation focuses on Gemini models, which leads to the error you encountered. However, you can integrate Claude 3 by adapting the approach LangChain.js already uses for Anthropic models, specifically the `ChatAnthropic` class.
Here's a concise guide to get you started:
1. **Ensure dependencies:** Make sure you have both LangChain and the Anthropic SDK installed in your project.
2. **Use the `ChatAnthropic` class:** Import and configure the `ChatAnthropic` class from LangChain to work with Claude 3. You'll need to specify the model name, API key, and other relevant settings like temperature and max tokens.
3. **Invoke the model:** Use the `invoke` method to send prompts to Claude 3 and receive responses. For interactive applications, consider exploring streaming responses.
4. **Customize as needed:** Tailor the integration to fit your application's needs, adjusting prompt formatting, response handling, or integration with other services as required.
This approach allows you to leverage Claude 3's capabilities within LangChain.js applications. For detailed implementation and examples, refer to the `ChatAnthropic` class in the LangChain.js repository and the associated documentation.
Keep in mind, this solution circumvents the `ChatVertexAI` class's current limitations with non-Gemini models. The LangChain.js team is working on expanding support for a broader range of models, so staying updated with the repository may provide more streamlined solutions in the future.
Similar to #2562
The issue is that Claude in Vertex AI is neither the same protocol as Gemini, nor quite the same as what is offered through Anthropic. So we need something that combines the two.
Doable, but it hasn't risen in priority.
My general thoughts about how to do this would be:

- `baseMessageToContent`, `safeResponseToChatGeneration`, and `safeResponseToChatResult` need to call their equivalents in this class.
- `GoogleAIConnection`'s `modelFamily()` should add "claude" or something along those lines, and `GoogleLLMModelFamily` should (probably) be adjusted as well.
- `buildUrlVertex()` needs a change to handle non-Google models (possibly determined based on the model family?).
- `GoogleAIConnection` or `AbstractGoogleLLMConnection` to handle the Claude layout.

But I haven't dug into the details. The good news is that the authentication part is handled by default with the Connection classes.
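As a rough illustration of the `modelFamily()` point above, the family check could key off the model-name prefix. The names here are hypothetical; the actual logic lives in `langchain-google-common/src/utils/common.ts` and may differ in shape.

```typescript
// Hypothetical sketch of extending the model-family check to recognize
// Claude; not the actual langchain-google-common implementation.
type GoogleLLMModelFamily = "gemini" | "claude" | null;

function modelToFamily(modelName?: string): GoogleLLMModelFamily {
  if (!modelName) return null;
  if (modelName.startsWith("gemini")) return "gemini";
  // Vertex AI model-as-a-service ids look like "claude-3-sonnet@20240229".
  if (modelName.startsWith("claude")) return "claude";
  return null;
}

console.log(modelToFamily("gemini-1.5-pro"));           // "gemini"
console.log(modelToFamily("claude-3-sonnet@20240229")); // "claude"
```

`buildUrlVertex()` could then branch on the returned family, e.g. to choose the publisher segment of the Vertex URL (`publishers/google` vs. `publishers/anthropic`), though that detail is an assumption here.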
+1, would be a very useful capability. Seems like it's already supported in `BedrockChat`, which is great.
+1, especially given the renewed interest in Claude 3.5 Sonnet
+1 for Claude 3.5 Sonnet
+1 another vote - Claude 3.5 support through Vertex AI is a key integration for companies on GCP to adopt Langchain.
I see the +1s - this is now next on my priority list!
I'm aiming to get #5835 into a stable state ASAP so it can be merged, since it contains some updates that would be useful for this effort as well. The approach I outlined above still mostly holds.
Sorry I couldn't get this in place for I/O Connect, but I hope to have good news soon!
Could you extend to other LLMs in the model garden such as Llama 3?
Yes... but...
In the Vertex AI Model Garden, Llama 3 requires you to deploy the model to an endpoint you control, and the details of that API are poorly documented. Claude has fuller API support with "model-as-a-service" and a standard endpoint.
My current thinking is that I'll be able to provide direct support through the current classes we have (i.e., you'll just be able to specify a Claude model and it will work), while you may need to do some work for other models in the Model Garden, including deploying an endpoint.
Checked other resources
Example Code
Error Message and Stack Trace (if applicable)
Description
I am trying to use ChatVertexAI with Anthropic Claude 3, but it seems this class only supports Gemini models and returns the above error message.
This appears to be a deliberate choice in the code:
https://github.com/langchain-ai/langchainjs/blob/b9d86b16bd0788144f8f27cf5d18960c43da49af/libs/langchain-google-common/src/utils/common.ts#L125-L132
I've verified that using Claude on Vertex AI with Anthropic's Vertex SDK works fine:
Output:
System Info
Platform: Mac
Node: 20.11.0