langchain4j / langchain4j

Java version of LangChain
https://docs.langchain4j.dev
Apache License 2.0
3.85k stars 729 forks

[FEATURE] MessageWindowChatMemory support conversational exchange #1326

Open noday opened 1 week ago

noday commented 1 week ago

Is your feature request related to a problem? Please describe.

When maxMessages evicts the oldest messages, the window can start with an AiMessage, e.g. `AI: ... Human: ... AI: ...`, and the LLM then throws an exception: 'messages must start with a Human message'.
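A minimal, self-contained sketch of the eviction behavior being described (plain Java; `Msg` and the windowing logic here are illustrative stand-ins, not the actual langchain4j classes):

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.List;

public class WindowDemo {
    // Illustrative stand-in for a chat message; not a langchain4j type.
    record Msg(String role, String text) {}

    // Per-message window: keeps only the last maxMessages messages,
    // evicting one message at a time, regardless of role.
    static List<Msg> window(List<Msg> history, int maxMessages) {
        Deque<Msg> window = new ArrayDeque<>();
        for (Msg m : history) {
            window.addLast(m);
            if (window.size() > maxMessages) {
                window.removeFirst(); // may evict a human message alone
            }
        }
        return List.copyOf(window);
    }

    public static void main(String[] args) {
        List<Msg> history = List.of(
                new Msg("human", "q1"), new Msg("ai", "a1"),
                new Msg("human", "q2"), new Msg("ai", "a2"),
                new Msg("human", "q3"));
        // With maxMessages = 4 the oldest human message is evicted, so the
        // window begins with an AI message, which some providers reject.
        List<Msg> w = window(history, 4);
        System.out.println(w.get(0).role()); // prints "ai"
    }
}
```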

Describe the solution you'd like

Something like LangChain's ConversationBufferWindowMemory(k=...), where k counts conversational exchanges (Human/AI pairs) rather than individual messages.

Describe alternatives you've considered

Additional context

(screenshot attached)

langchain4j commented 1 week ago

Hi @noday, which LLM are you using that has this restriction?

noday commented 1 week ago

@langchain4j qwen and bedrock-anthropic both have this restriction. Here is a test: (screenshot attached)

noday commented 1 week ago

If maxMessages is set to 4, it always throws this exception.

langchain4j commented 1 week ago

This issue has been fixed for Anthropic: https://github.com/langchain4j/langchain4j/pull/1197. We could port the fix to qwen and bedrock as well.

noday commented 4 days ago

Perhaps having a ConversationMessageWindowChatMemory would be better.
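A hedged sketch of what such a memory might do: evict in whole Human/AI exchanges so the window always starts with a human message. The class name and logic below are assumptions based on this thread, not an existing langchain4j API:

```java
import java.util.ArrayList;
import java.util.List;

public class ExchangeWindowDemo {
    // Illustrative stand-in for a chat message; not a langchain4j type.
    record Msg(String role, String text) {}

    // Keeps at most k conversational exchanges (Human/AI pairs), similar in
    // spirit to LangChain's ConversationBufferWindowMemory(k=...). Eviction
    // drops the oldest exchange as a unit, so the window never starts with "ai".
    static List<Msg> lastKExchanges(List<Msg> history, int k) {
        List<Msg> result = new ArrayList<>(history);
        // Count exchanges by counting human messages; drop whole exchanges
        // from the front until at most k human messages remain.
        while (countHuman(result) > k) {
            result.remove(0); // drop the oldest human message
            while (!result.isEmpty() && !result.get(0).role().equals("human")) {
                result.remove(0); // and its paired AI reply
            }
        }
        return result;
    }

    static long countHuman(List<Msg> msgs) {
        return msgs.stream().filter(m -> m.role().equals("human")).count();
    }

    public static void main(String[] args) {
        List<Msg> history = List.of(
                new Msg("human", "q1"), new Msg("ai", "a1"),
                new Msg("human", "q2"), new Msg("ai", "a2"),
                new Msg("human", "q3"));
        // k = 2 keeps the last two exchanges; the window starts with "human".
        List<Msg> w = lastKExchanges(history, 2);
        System.out.println(w.get(0).role()); // prints "human"
    }
}
```

The design choice is that the eviction unit matches the provider's validation unit (the exchange), which avoids the "messages must start with a Human message" error by construction.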