spring-projects / spring-ai

An Application Framework for AI Engineering
https://docs.spring.io/spring-ai/reference/index.html
Apache License 2.0

How can I implement the ChatMemory in an agentic workflow? #1408

Open FredLiu0813 opened 1 month ago

FredLiu0813 commented 1 month ago

I am working on an agentic workflow project using Spring AI, and I have a question/problem with ChatMemory.

Question How can I implement chat memory in a workflow?

Example I have an agentic workflow that includes the following steps:

  1. Perform a GoogleSearch based on user input.
  2. Send the Google results to an LLM (e.g., OpenAI) to summarize them into a short article.
  3. Return the summarized result.

Workflow: Start→GoogleSearch→OpenAI LLM→End

Problem When I run this workflow and enter "NBA" the first time, it returns an article about the NBA. However, when I then input "please reduce the content of previous answer to 20 words", the workflow does not retain any memory. Instead, it sends "please reduce the content of previous answer to 20 words" to Google and generates a new article.

Is there any way to solve this problem? I would greatly appreciate any responses.

alexcheng1982 commented 1 month ago

Spring AI provides some built-in advisors to support chat memory. In your case, you can use either MessageChatMemoryAdvisor or PromptChatMemoryAdvisor. The configuration may look like the following (untested code):

@Configuration
public class AppConfiguration {
  @Bean
  ChatMemory chatMemory() {
    return new InMemoryChatMemory(); // Use in-memory chat memory
  }

  @Bean
  MessageChatMemoryAdvisor messageChatMemoryAdvisor(ChatMemory chatMemory) {
    return new MessageChatMemoryAdvisor(chatMemory); // create the advisor
  }

  @Bean
  ChatClient chatClient(ChatClient.Builder builder, MessageChatMemoryAdvisor advisor) {
    return builder.defaultAdvisors(advisor).build(); // create a ChatClient
  }
}
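
What the advisor does can be illustrated with a small framework-free sketch (no Spring AI types; `ConversationMemory` and the message strings are hypothetical): it keeps the message history per conversation id and prepends that history to every new prompt, which is why the model can resolve "the previous answer".

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Minimal sketch of message-based chat memory: history is stored per
// conversation id and prepended to each new prompt.
class ConversationMemory {
    private final Map<String, List<String>> store = new HashMap<>();

    // Build the full prompt for this turn: prior messages plus the new one.
    List<String> buildPrompt(String conversationId, String userMessage) {
        List<String> history = store.computeIfAbsent(conversationId, id -> new ArrayList<>());
        history.add("USER: " + userMessage);
        return new ArrayList<>(history);
    }

    // Record the model's reply so the next turn can see it.
    void recordReply(String conversationId, String reply) {
        store.get(conversationId).add("ASSISTANT: " + reply);
    }
}

public class MemorySketch {
    public static void main(String[] args) {
        ConversationMemory memory = new ConversationMemory();

        // Turn 1: the prompt contains only the first user message.
        List<String> turn1 = memory.buildPrompt("conv-1", "NBA");
        memory.recordReply("conv-1", "An article about the NBA...");

        // Turn 2: the prompt now carries the earlier question and answer.
        List<String> turn2 = memory.buildPrompt("conv-1",
                "please reduce the content of previous answer to 20 words");

        System.out.println(turn1.size()); // 1
        System.out.println(turn2.size()); // 3
    }
}
```

Note that in Spring AI the memory is keyed per conversation, so the caller must supply a stable conversation id across turns for the history to accumulate.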

FredLiu0813 commented 1 month ago

Thanks for your answers.

I am considering a solution: before invoking the Flow, I send the Flow's description along with the user's question to the LLM. Let the LLM determine if the question is suitable for the Flow.

If it is not suitable, I will use the LLM's response directly. If it is suitable, I will proceed with the Flow.

In this process, each round of Chat uses ChatMemory and MessageChatMemoryAdvisor.

I am not sure if this approach is correct.

alexcheng1982 commented 1 month ago

From your description, you only need to determine whether the flow should be used the first time the user interacts with your service. You can use another advisor to call the LLM and make that determination, storing the result somewhere. Based on the result, you can choose to bypass ChatMemory in the advisor implementation.


FredLiu0813 commented 1 month ago

How do I use "another advisor to call the LLM and determine if the flow should be used"? Could you help me with some sample code?

matosoma123 commented 2 weeks ago

@alexcheng1982, is it possible to set a MessageChatMemoryAdvisor for BedrockAnthropic3ChatModel?