Closed samching closed 5 months ago
Have you tried ChatAnthropic?
Working with ConversationBufferWindowMemory and Anthropic LLMs will also throw a lot of warnings, since the base summarizer template `_DEFAULT_SUMMARIZER_TEMPLATE` has "AI:" examples in it. The summarization will still work in the end, however.
Hi, @samching,
I'm helping the LangChain team manage their backlog and am marking this issue as stale. From what I understand, you raised a concern about Langchain not supporting the chat format for Anthropic, and the proposed solution was to create a wrapper class and add a function to translate `AIMessage` and `HumanMessage` to `anthropic.AI_PROMPT` and `anthropic.HUMAN_PROMPT`. In the comments, there were suggestions to try ChatAnthropic, and warnings were mentioned when working with ConversationBufferWindowMemory and Anthropic LLMs due to examples in the base template.
Could you please confirm if this issue is still relevant to the latest version of the LangChain repository? If it is, please let the LangChain team know by commenting on the issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days.
Thank you for your understanding and cooperation.
Problem
Langchain currently doesn't support the chat format for Anthropic (e.g. being able to use the `HumanMessage` and `AIMessage` classes). Currently, testing the same prompt across both Anthropic and OpenAI chat models requires rewriting the same prompt, even though they fundamentally use the same `Human: ... AI: ...` structure. This means duplicating prompts across `2 * n` chains (and more if you write separate prompts for `turbo-3.5` and `4`, and likewise for `instant` and `v1.2` for Claude), making it very unwieldy to test and scale the number of chains.
Potential Solution
Create a wrapper class like `ChatClaude` and add a function to translate `AIMessage` and `HumanMessage` to `anthropic.AI_PROMPT` and `anthropic.HUMAN_PROMPT` respectively. But definitely also open to other solutions which could work here.
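A minimal sketch of what that translation function could look like. This is hypothetical illustration code, not actual LangChain source: the prompt constants are hardcoded here to mirror `anthropic.HUMAN_PROMPT` / `anthropic.AI_PROMPT`, and the dataclasses stand in for LangChain's `HumanMessage` / `AIMessage` so the example is self-contained.

```python
# Hypothetical sketch of the proposed message-to-prompt translation.
from dataclasses import dataclass

# Mirrors anthropic.HUMAN_PROMPT and anthropic.AI_PROMPT from the anthropic SDK.
HUMAN_PROMPT = "\n\nHuman:"
AI_PROMPT = "\n\nAssistant:"


@dataclass
class HumanMessage:
    """Stand-in for langchain's HumanMessage."""
    content: str


@dataclass
class AIMessage:
    """Stand-in for langchain's AIMessage."""
    content: str


def messages_to_anthropic_prompt(messages) -> str:
    """Translate a list of chat messages into a single Claude prompt string."""
    parts = []
    for m in messages:
        if isinstance(m, HumanMessage):
            parts.append(f"{HUMAN_PROMPT} {m.content}")
        elif isinstance(m, AIMessage):
            parts.append(f"{AI_PROMPT} {m.content}")
        else:
            raise ValueError(f"Unsupported message type: {type(m)!r}")
    # Claude's completion API expects the prompt to end with the AI prompt
    # marker so the model knows it should produce the next assistant turn.
    return "".join(parts) + AI_PROMPT
```

With something like this, the same list of `HumanMessage`/`AIMessage` objects could drive both the OpenAI chat models and Claude without duplicating prompts.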