JoshuaC215 / agent-service-toolkit

Full toolkit for running an AI agent service built with LangGraph, FastAPI and Streamlit
https://agent-service-toolkit.streamlit.app
MIT License
278 stars · 43 forks

Background task UI notification #38

Open peterkeppert opened 6 days ago

peterkeppert commented 6 days ago

I have been using the toolkit as a template to build a RAG-based agent. This agent performs 2-3 background tasks before generating a visible response for the user. Background tasks either retrieve structured output from an LLM that informs the agent's decision-making, or perform programmatic tasks like document retrieval from a vector DB. Each of the steps is fairly fast, but together, before the streamed tokens from the final response begin to appear, the waiting time is easily >30s. It might not be that much, but I can easily imagine a longer-running background task (like waiting for human input or performing complicated retrieval).

I looked around and couldn't find a good way to notify the user of the progress of these tasks and make the app more interactive. I considered repurposing the tool-calling functionality, but ultimately decided against it, because tool calls work differently: they basically start and end in the LLM and are tied to a given AI response message, whereas for background tasks the LLM is just an optional part, and they usually precede the response-generation step.

In the end, I made it work by creating a new TaskMessage based on LangChain's ChatMessage, extending the Agent Service Toolkit's ChatMessage to support task-oriented fields, and using custom events to stream the notifications. With small additions to the UI, background tasks now appear in a dedicated container between the messages, where, in my opinion, they logically belong (see screenshot).

background_tasks

Is this something somebody else has had to tackle as well, and perhaps has a better solution at hand? @JoshuaC215, is this functionality you would wish to incorporate into the toolkit, or would it bloat the template unnecessarily?

FDA-1 commented 6 days ago

I've recently started using this template, and this addition is exactly what I've been thinking of building. IMO, it would make perfect sense to add to the template.

JoshuaC215 commented 5 days ago

Nice! This sounds cool

I wonder if using something like langchain_core.callbacks.adispatch_custom_event would be the right approach for these progress updates, similar to here.

The st.status container used to render the tool output also supports updating the label with intermediate calls. So if you have multiple serial steps, another approach instead of showing multiple containers would be to keep one "running" container and update the label to show the current step.

I am nervous about bloat but open to this if it doesn't require a huge refactor or a ton of additional code or branching logic. If we find it's too much, also happy to link to a separate branch / PR from the main README or something so it's more discoverable. What do you think?

peterkeppert commented 5 days ago

Yes, my solution uses langchain_core.callbacks.adispatch_custom_event.

Initially, I preferred multiple containers so I could show the inputs/outputs of the background tasks in them. On the other hand, I was concerned that they might take up too much space and disrupt the conversation flow. After using the solution for a few conversations, I like the idea of one "running" container better, and I think I also found a clear way to display the data from all background tasks in a single container. I'll clean it up and submit a pull request.
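One plausible way to display the data from all background tasks in a single container is to accumulate a line per task inside the one status body. This is a guess at the structure, not the actual pull request:

```python
# Hypothetical sketch: collapse every background task's record into one
# body rendered inside a single status container (e.g. via st.markdown).
def summarize_tasks(task_records):
    """Render all task records as lines for one status container."""
    lines = [f"{rec['name']}: {rec['result']}" for rec in task_records]
    return "\n".join(lines)

summary = summarize_tasks([
    {"name": "classify_query", "result": "intent=search"},
    {"name": "retrieve_documents", "result": "5 chunks retrieved"},
])
```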