continuedev / continue

⏩ Continue is the leading open-source AI code assistant. You can connect any models and any context to build custom autocomplete and chat experiences inside VS Code and JetBrains
https://docs.continue.dev/
Apache License 2.0

Switching to a historical session causes the current session's streamed LLM response to be displayed in the wrong session. #2800

Open xielingdm opened 1 week ago

xielingdm commented 1 week ago


Relevant environment info

- OS: macOS
- Continue version: v0.9.225
- IDE version: VSCode 1.94
- Model: 
- config.json:

Description

I'm having a conversation in a session, and the LLM's streamed response is long, so it takes a little while to finish. If I switch to another (historical) session while the response is still streaming, the content being returned to the previous session is rendered on top of the historical session. When I switch back to the original session, part of its data is missing, because that data was incorrectly written into the historical session.

To reproduce

  1. Create a new session (session01) and start a chat.
  2. While the LLM is still streaming its response,
  3. switch to a historical session.
  4. The streamed content is displayed in the wrong session.
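The underlying race is that streamed chunks are appended to whichever session is currently visible, rather than to the session that issued the request. A minimal sketch of the distinction is below; all names (`SessionStore`, `appendToActive`, `appendToOrigin`) are hypothetical and are not Continue's actual API:

```typescript
type SessionId = string;

class SessionStore {
  private histories = new Map<SessionId, string[]>();
  activeSession: SessionId = "";

  newSession(id: SessionId): void {
    this.histories.set(id, []);
    this.activeSession = id; // switching sessions changes the active one
  }

  // Buggy pattern: the chunk lands in whatever session is active *now*,
  // so switching mid-stream misroutes the rest of the response.
  appendToActive(chunk: string): void {
    this.histories.get(this.activeSession)?.push(chunk);
  }

  // Fixed pattern: the stream captures its originating session ID when
  // the request starts and always writes there, even after a switch.
  appendToOrigin(origin: SessionId, chunk: string): void {
    this.histories.get(origin)?.push(chunk);
  }

  history(id: SessionId): string[] {
    return this.histories.get(id) ?? [];
  }
}

const store = new SessionStore();
store.newSession("session01");
const origin = store.activeSession; // captured at request time

store.appendToOrigin(origin, "Hello, ");
store.newSession("session02");          // user switches mid-stream
store.appendToOrigin(origin, "world!"); // still lands in session01

console.log(store.history("session01").join("")); // "Hello, world!"
console.log(store.history("session02").length);   // 0
```

The same idea applies however the state is stored (Redux, a webview, etc.): tag each in-flight stream with the session ID that started it, and either route late chunks back to that session or abort the stream when the user navigates away.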

[Screenshot (WeCom capture): 24354590-9ba6-43bd-a250-e3ca1b88cf29]

[Screenshot (WeCom capture): 355fdf86-66a2-4b6c-8434-a9c5b8577706]

Log output

No response