Validations
[x] I believe this is a way to improve. I'll try to join the Continue Discord for questions
[x] I'm not able to find an open issue that requests the same enhancement
Problem
Currently, while a response is streaming from the LLM, the chat automatically scrolls to keep the end of the response in view.
For longer responses, this keeps pulling the text out from under the user and makes it impossible to read the chat while the response is still being generated. I would prefer (or would like an option) to let the response generate without the chat auto-scrolling.
As it stands, I have to wait for the response to finish generating and then scroll back up to the beginning of it before I can start reading.
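For illustration, here is a minimal sketch of the kind of scroll policy I have in mind, assuming a React-style chat panel; the hook, ref, and parameter names are hypothetical and not taken from Continue's actual GUI code. The idea is to auto-scroll only while the user is already at (or near) the bottom, and to stop as soon as they scroll up to read.

```typescript
import { useEffect, useRef } from "react";

// Sketch: auto-scroll the chat container on new streamed text only if the
// user has not scrolled away from the bottom. Names here are illustrative.
function useStickyAutoScroll(streamedText: string) {
  const containerRef = useRef<HTMLDivElement | null>(null);
  // Whether the user was at (or near) the bottom before the latest update.
  const pinnedToBottom = useRef(true);

  // Called from the container's onScroll: record whether the user is still
  // close enough to the bottom to be considered "following" the stream.
  const onScroll = () => {
    const el = containerRef.current;
    if (!el) return;
    const distanceFromBottom = el.scrollHeight - el.scrollTop - el.clientHeight;
    pinnedToBottom.current = distanceFromBottom < 40; // small pixel tolerance
  };

  // When new streamed text arrives, scroll down only if still pinned.
  useEffect(() => {
    const el = containerRef.current;
    if (el && pinnedToBottom.current) {
      el.scrollTop = el.scrollHeight;
    }
  }, [streamedText]);

  return { containerRef, onScroll };
}
```

Wired up as `<div ref={containerRef} onScroll={onScroll}>…</div>`, this would pause auto-scrolling the moment the user scrolls up to read and resume it once they scroll back to the bottom.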
Solution
No response