Avaiga / demo-chatbot

A template to create LLM inference web apps using Python only

Is it possible to stream output? #6

asmith26 commented 3 months ago

Hi Taipy / @AlexandreSajus,

Just wondering, is it possible to stream the output from an LLM to the GUI/chat message?

Many thanks for any help, and for this lib! :)

AlexandreSajus commented 3 months ago

It should be possible. What is your exact use case? If it is sending prompts to and receiving responses from an LLM hosted somewhere, take a look at this tutorial: https://docs.taipy.io/en/latest/tutorials/visuals/5_multithreading/
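For reference, here is a minimal sketch of what streaming could look like using the `invoke_callback` pattern from that tutorial: a worker thread consumes the token stream and forwards each token back to the GUI's state. It assumes the OpenAI Python client (v1) with `stream=True`; the model name, page content, and helper names (`update_conversation`, `stream_response`, `on_ask`) are illustrative and not part of the demo-chatbot template.

```python
from threading import Thread

from openai import OpenAI
from taipy.gui import Gui, State, get_state_id, invoke_callback

client = OpenAI()  # reads OPENAI_API_KEY from the environment

conversation = ""  # bound to the text control on the page below


def update_conversation(state: State, token: str):
    # Runs on the GUI side: append the new token to the visible text.
    state.conversation += token


def stream_response(gui: Gui, state_id: str, prompt: str):
    # Runs on a worker thread: iterate over the streamed completion and
    # push each token to the GUI via invoke_callback.
    stream = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[{"role": "user", "content": prompt}],
        stream=True,
    )
    for chunk in stream:
        token = chunk.choices[0].delta.content
        if token:
            invoke_callback(gui, state_id, update_conversation, [token])


def on_ask(state: State):
    # Start streaming on a background thread so the GUI stays responsive.
    Thread(
        target=stream_response,
        args=(gui, get_state_id(state), "Tell me a joke"),  # illustrative prompt
    ).start()


page = """
<|{conversation}|text|>
<|Ask|button|on_action=on_ask|>
"""

gui = Gui(page)
gui.run()
```

The key point is that the worker thread never touches `state` directly; `invoke_callback` marshals each update back to the right client session, so the bound `conversation` text refreshes token by token as the response streams in.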