Open noobHappylife opened 1 week ago
@AlexandreSajus Could you check this issue?
EDIT: My bad, I always thought you said the code works when debug was off. I don't really know how I could help here. Maybe R&D has an idea on what can be causing this.
We already discussed this on Discord. I think this is expected behavior. Debug mode has to consume performance somewhere (R&D should know more), and this causes any real-time application to be slower. I'm not sure this is an issue.
Kindly allow me to help you solve this bug.
@KunjShah95 You are already assigned to another issue. For hacktoberfest, we only assign issues one at a time. Please submit a PR on the other issue first, or remove your assignment.
Thank you.
I want to work on this issue, as I have removed myself from my previous one.
What went wrong? 🤔
I'm working on an LLM chatbot example, using update_content to update a partial while streaming the response from the LLM. It works, but only well in debug mode. With debug mode turned off, the update becomes "chunky" (see the videos attached).
Debug off, streaming and update_content seems to be very "chunky/jumpy" https://github.com/user-attachments/assets/39cebd87-95eb-4a65-8e38-1581032a7686
Debug on, streaming and update_content works https://github.com/user-attachments/assets/7fb47928-b991-4ee5-b5be-caabd1954386
Env: Taipy installed from source (commit 2f33ab1e3cdbc2f91553fe16ff60ea8eeab73422), Ubuntu Server 20.04 (also tested on Windows 10).
P.S. I'm not using the chat control, because I can't get the streaming response to work with it.
Expected Behavior
No response
Steps to Reproduce Issue
Here is the sample code
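(The sample code itself did not survive in this copy of the issue. Below is a minimal hypothetical sketch of the pattern described above: a callback that streams chunks from an LLM and calls Partial.update_content on each one. All identifiers other than Taipy's own API, such as stream_llm_response and chat_partial, are assumptions, not the reporter's actual code. The pure chunk-accumulation helper runs on its own; the Taipy wiring is shown in comments since it needs a running GUI.)

```python
# Hypothetical reconstruction of the reported pattern; identifiers marked
# "assumed" are illustrative, not the reporter's actual code.

def accumulate_stream(chunks):
    """Pure helper: yield the full text received so far after each chunk."""
    text = ""
    for chunk in chunks:
        text += chunk
        yield text

# The Taipy wiring would look roughly like this (assumed layout and names):
#
#     from taipy.gui import Gui
#
#     gui = Gui(page="<|part|partial={chat_partial}|>")
#     chat_partial = gui.add_partial("")  # assumed empty initial content
#
#     def on_ask(state):
#         for partial_text in accumulate_stream(stream_llm_response()):  # assumed LLM client
#             # Re-render the partial with the text received so far;
#             # this is the call that appears "chunky" with debug off.
#             chat_partial.update_content(state, partial_text)
#
#     gui.run(debug=False)  # reportedly smooth only with debug=True

if __name__ == "__main__":
    # Simulated stream: each yielded value is the growing partial text.
    for text in accumulate_stream(["Hel", "lo", "!"]):
        print(text)
```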
Solution Proposed
No response
Screenshots
Runtime Environment
No response
Browsers
No response
OS
No response
Version of Taipy
No response
Additional Context
No response
Acceptance Criteria
Code of Conduct