Open lianghsun opened 9 months ago
Waiting for it as well. I can see the LLM response streaming in the terminal, but it currently looks like the final result cannot be streamed.
The IOStream module should support this now.
Hello, I was wondering if there's any development in this possible feature.
I tried a simple test with Streamlit:

```python
import streamlit as st
from io import StringIO
import autogen

# Title
st.title("Hello World")
```
Just importing autogen causes a segmentation fault.

Environment: Python 3.12.2, streamlit 1.38.0, pyautogen 0.3.0

Should I revert to a previous version of Python, or of autogen, to stop this code from crashing?

Thanks, Jonathan
I have a question about the application of streaming beyond the terminal. Is it possible to integrate streaming with existing chat UI frameworks, such as Streamlit or Chainlit? I am curious to know if streaming can be effectively merged with these frameworks to enhance user interaction and interface capabilities.
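To illustrate the general pattern such an integration would use: AutoGen routes console output through its `IOStream` protocol (installable via `IOStream.set_global_default`), whose `print()` receives each streamed chunk. The sketch below is a self-contained, hypothetical stand-in for that hook, it does not import autogen, and the `UIStream` class and `on_chunk` callback are illustrative names, not part of any library. In a real app the callback would re-render a Streamlit placeholder (e.g. via `st.empty().markdown(...)`) or update a Chainlit message.

```python
# Hedged sketch of forwarding streamed chunks to a chat UI callback.
# UIStream mimics the shape of an IOStream-style print() hook; it is
# an assumption for illustration, not pyautogen's actual class.
from typing import Callable


class UIStream:
    """Collects printed chunks and pushes the accumulated text to a
    UI callback (e.g. a Streamlit placeholder's .markdown())."""

    def __init__(self, on_chunk: Callable[[str], None]) -> None:
        self.on_chunk = on_chunk
        self._buffer: list[str] = []

    def print(self, *objects, sep: str = " ", end: str = "\n",
              flush: bool = False) -> None:
        # Join the printed objects exactly as built-in print() would,
        # append to the running transcript, and re-render the whole text.
        chunk = sep.join(map(str, objects)) + end
        self._buffer.append(chunk)
        self.on_chunk("".join(self._buffer))


# Usage: collect each re-render instead of drawing a UI.
collected = []
stream = UIStream(collected.append)
stream.print("Hello,", "world")
stream.print("second chunk")
# collected[-1] holds the full accumulated transcript.
```

The key design point is that the UI layer re-renders the accumulated text on every chunk rather than appending DOM nodes, which matches how Streamlit placeholders and Chainlit message updates work.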