Open asmith26 opened 3 months ago
Hi Taipy / @AlexandreSajus,

Just wondering, is it possible to stream the output from an LLM to the GUI chat message?

Many thanks for any help, and for this lib! :)

---

AlexandreSajus replied:

It should be possible. What is your exact use case? If your use case is sending prompts to and receiving responses from an LLM hosted somewhere, take a look at this tutorial: https://docs.taipy.io/en/latest/tutorials/visuals/5_multithreading/
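For reference, the linked multithreading tutorial boils down to: run the LLM call in a background thread, and forward each partial token to the GUI so the bound chat variable updates incrementally. Below is a minimal stdlib-only sketch of that streaming loop. The `fake_llm_stream` generator and `on_token` callback are hypothetical stand-ins; with Taipy you would instead call `invoke_callback(gui, state_id, callback, args)` from the worker thread so the state update happens on the GUI side, as the tutorial shows.

```python
import threading
import queue
import time

def fake_llm_stream():
    # Hypothetical stand-in for a streaming LLM client
    # (e.g. iterating over chunked API responses).
    for token in ["Hello", ", ", "world", "!"]:
        time.sleep(0.01)  # simulate network latency per chunk
        yield token

def stream_to_ui(on_token):
    """Consume the LLM stream from a background thread and forward
    each token to a UI-side callback (in Taipy: invoke_callback)."""
    q = queue.Queue()

    def producer():
        for token in fake_llm_stream():
            q.put(token)
        q.put(None)  # sentinel: stream finished

    threading.Thread(target=producer, daemon=True).start()

    # Main loop: drain tokens as they arrive and hand them to the UI.
    while True:
        token = q.get()
        if token is None:
            break
        on_token(token)  # in Taipy: append to the bound chat variable

# Usage: accumulate streamed tokens the way a chat message would grow.
message = []
stream_to_ui(message.append)
print("".join(message))  # -> Hello, world!
```

The queue decouples the network-bound producer from the UI update loop, which is the same shape the tutorial uses; only the `on_token` side needs to become a Taipy state update for the chat control to re-render per token.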