In chat_db, PROMPT_NEED_STREAM_OUT defaults to False in the prompt.py file. I hope the chat data module will implement streaming output like the other modules do. When PROMPT_NEED_STREAM_OUT = True, a bug occurs in the page output.
What you expected to happen
When PROMPT_NEED_STREAM_OUT = True, chat db should stream its output: stream the text first and then display the table, or display the table first and then stream the text.
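A minimal sketch of the expected behavior, assuming a simple generator-based streaming interface (all names below are hypothetical illustrations, not DB-GPT's actual API): yield the explanatory text in small chunks first, then yield the table as one final structured payload.

```python
from typing import Iterator, Union

# str = an incremental text chunk, dict = the final table payload
Chunk = Union[str, dict]

def stream_chat_db_response(text: str, table_rows: list) -> Iterator[Chunk]:
    """Stream the text chunk by chunk, then emit the table.

    Hypothetical illustration of "stream text first, then display table";
    not DB-GPT's real interface.
    """
    chunk_size = 8
    for i in range(0, len(text), chunk_size):
        yield text[i:i + chunk_size]              # incremental text chunks
    yield {"type": "table", "rows": table_rows}   # final structured table

if __name__ == "__main__":
    for part in stream_chat_db_response("The query returned 2 rows.",
                                        [["id", "name"], [1, "a"]]):
        print(part)
```

The front end would render string chunks as they arrive and switch to a table widget when the structured payload shows up; the "table first" variant would simply yield the dict before the text chunks.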
How to reproduce
In chat db, set PROMPT_NEED_STREAM_OUT = True; the bug then occurs.
Has the problem been solved?
I set PROMPT_NEED_STREAM_OUT = True in the dbgpt/app/scene/chat_db/auto_execute/prompt.py file, but the front-end page still did not show streaming output.
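For reference, the change I made is a one-line edit (the file path is from this report; the rest of that module is omitted here):

```python
# dbgpt/app/scene/chat_db/auto_execute/prompt.py
PROMPT_NEED_STREAM_OUT = True  # repo default is False
```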
Search before asking
Operating system information
MacOS(M1, M2...)
Python version information
DB-GPT version
latest release
Related scenes
Installation Information
[X] Installation From Source
[ ] Docker Installation
[ ] Docker Compose Installation
[ ] Cluster Installation
[ ] AutoDL Image
[ ] Other
Device information
CPU: M1
Models information
LLM: glm-4, Embedding model: text2vec-large-chinese
Additional context
No response
Are you willing to submit PR?