run-llama / llama_index

LlamaIndex is a data framework for your LLM applications
https://docs.llamaindex.ai
MIT License

[Feature Request]: Add streaming for OpenAIAssistantAgent #13015

Open tioans opened 4 months ago

tioans commented 4 months ago

Feature Description

The OpenAI Assistants API now supports streaming; it would be great to expose this feature through the OpenAIAssistantAgent wrapper as well.

Reason

Currently, the LlamaIndex agent only supports standard, blocking message passing via chat. There appears to be some scaffolding for async conversations, but it is not implemented.

Value of Feature

Having this feature available for agents would significantly improve the user experience, especially for complex interactions that require longer processing, since tokens could be shown as they arrive instead of after the full run completes.
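To make the requested interface concrete, here is a minimal sketch of the difference between the current blocking `chat` and the requested `stream_chat`. Everything below is illustrative: `StubAssistantAgent` and its `_tokens` helper are hypothetical stand-ins, not the real LlamaIndex or OpenAI client code; the method names just mirror LlamaIndex's existing agent streaming convention.

```python
from typing import Iterator


class StubAssistantAgent:
    """Hypothetical stand-in illustrating the interface this issue asks for:
    a stream_chat() that yields response tokens as they arrive, instead of
    blocking until the whole assistant run has finished."""

    def chat(self, message: str) -> str:
        # Current behavior: block, then return the complete response at once.
        return "".join(self._tokens(message))

    def stream_chat(self, message: str) -> Iterator[str]:
        # Requested behavior: yield tokens incrementally as they arrive.
        yield from self._tokens(message)

    def _tokens(self, message: str) -> Iterator[str]:
        # Placeholder for the OpenAI Assistants streaming run; a real
        # implementation would yield text deltas from the API event stream.
        for tok in ("Echo", ": ", message):
            yield tok


agent = StubAssistantAgent()
print(agent.chat("hi"))                       # prints "Echo: hi" all at once
for token in agent.stream_chat("hi"):         # tokens become available one by one
    print(token, end="")
```

The point of the sketch is only the shape of the contract: same final text, but `stream_chat` hands control back to the caller after every token.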

justinzyw commented 4 months ago

+1 this would be very useful

naingthet commented 4 months ago

I'd love to take a look at this if there's interest? @logan-markewich

logan-markewich commented 4 months ago

Definitely @naingthet go for it! It's been on my today list to update the openai assistant, since it's basically broken right now 😅

naingthet commented 4 months ago

Awesome! This would be my first contribution, so it may take me a bit to get oriented with the process. If anyone else is able to get to it before me, please feel free!

mpereira commented 3 months ago

FTR I'm giving this a go. Will maybe open a PR within the next day or so.

mpereira commented 3 months ago

I have a working implementation here: https://github.com/mpereira/llama_index/pull/2/files

I don't think it's ready to upstream yet. I still need to figure out how to make OpenAIAssistantAgent.stream_chat return a StreamingAgentChatResponse instead of an Iterator[AssistantStreamEvent].
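One plausible shape for that adaptation is a queue-backed wrapper: a background thread drains the assistant event stream into a queue while a generator yields tokens to the caller and accumulates the final response text, which is roughly how LlamaIndex's StreamingAgentChatResponse consumes its delta generator. The sketch below is stdlib-only and hypothetical: TextDelta stands in for an AssistantStreamEvent text delta, and StreamingResponse is a simplified analogue of StreamingAgentChatResponse, not the real class.

```python
import queue
import threading
from dataclasses import dataclass
from typing import Iterator


@dataclass
class TextDelta:
    """Hypothetical stand-in for an AssistantStreamEvent carrying a text delta."""
    text: str


class StreamingResponse:
    """Simplified analogue of StreamingAgentChatResponse: wraps an event
    iterator so callers get a token generator plus the accumulated text."""

    _DONE = object()  # sentinel marking the end of the stream

    def __init__(self, events: Iterator[TextDelta]) -> None:
        self._queue: queue.Queue = queue.Queue()
        self.response = ""  # accumulated full text, filled as tokens are read
        # Drain the event stream on a background thread so the producer
        # (the OpenAI run) and the consumer (the caller) can overlap.
        threading.Thread(target=self._pump, args=(events,), daemon=True).start()

    def _pump(self, events: Iterator[TextDelta]) -> None:
        for event in events:
            self._queue.put(event.text)
        self._queue.put(self._DONE)

    def response_gen(self) -> Iterator[str]:
        """Yield tokens in arrival order, accumulating them into .response."""
        while True:
            token = self._queue.get()
            if token is self._DONE:
                return
            self.response += token
            yield token


# Usage: feed in any iterator of delta events, stream tokens out.
resp = StreamingResponse(iter([TextDelta("Hel"), TextDelta("lo")]))
print("".join(resp.response_gen()))  # prints "Hello"
```

A real implementation would additionally need to write the finished message back into the agent's chat history once the stream is exhausted, the way StreamingAgentChatResponse does.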