Before submitting your bug report
Relevant environment info
Description
I'm chatting in a session and the LLM is streaming a long response, which takes a while. While the response is still streaming, I switch to another (historical) session, and the content being returned for the previous session is rendered on top of this historical session. When I switch back to the previous session, part of the response is lost, because its data was incorrectly written into the historical session.
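This looks like streamed chunks are being routed to whichever session is currently on screen rather than to the session that started the request. A minimal sketch of the expected behavior (assumed names, not Continue's actual API): capture the originating session ID when the request starts and always append chunks to that session's history, rendering only when it is still the active one.

```typescript
type SessionId = string;

class ChatStore {
  private histories = new Map<SessionId, string[]>();
  private activeSession: SessionId | null = null;

  switchTo(id: SessionId): void {
    this.activeSession = id;
    if (!this.histories.has(id)) this.histories.set(id, []);
  }

  // Called per streamed chunk; originSession is captured when the request
  // was started, not read from the current UI state.
  onChunk(originSession: SessionId, chunk: string): void {
    const history = this.histories.get(originSession);
    if (!history) return;
    history.push(chunk); // always append to the originating session
    if (originSession === this.activeSession) {
      // render the chunk in the UI only if that session is still visible
    }
  }

  historyOf(id: SessionId): string {
    return (this.histories.get(id) ?? []).join("");
  }
}

const store = new ChatStore();
store.switchTo("A");
store.onChunk("A", "Hello, ");
store.switchTo("B");              // user switches mid-stream
store.onChunk("A", "world!");     // late chunk still lands in session A
console.log(store.historyOf("A")); // prints "Hello, world!"
console.log(store.historyOf("B")); // prints "" (nothing leaked into B)
```

With the buggy behavior described above, the late chunk would instead end up in session B's view and be missing from session A.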
To reproduce
1. Open a chat session and send a prompt that produces a long streaming response.
2. While the response is still streaming, switch to a different historical session.
3. Observe the streamed content being rendered into the historical session.
4. Switch back to the original session: part of the streamed response is missing.
Log output
No response