JHubi1 / ollama-app

A modern and easy-to-use client for Ollama
Apache License 2.0

Enable scrolling while prompt response is being generated #42

Open DevGoran opened 2 months ago

DevGoran commented 2 months ago

Description

As prompt responses can be quite fast, I'd suggest letting the user scroll while the response is being generated. This is possible in the official ChatGPT app, for instance. It's a quality-of-life improvement: if a very large response gets generated, one cannot start reading until the LLM has finished its response.

Possible solution

I'm not sure how it works internally, but possibly disable auto-scrolling when new content arrives?

Additional context

No response

JHubi1 commented 2 months ago

Hi there 👋 Sorry for not answering for so long. I looked into it and, as far as I can tell, it's easily possible — well, kinda. In one of the recent commits, I already implemented most of what has to be done. One problem I'm currently facing is that the underlying library I'm using has no option to disable the auto-scrolling. I'll see if there's anything I can do, and I'll keep you updated. Thanks for reporting!
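
For anyone curious, the usual workaround when a chat library always scrolls on new content is to track whether the user is "pinned" to the bottom and only follow new content in that case. The sketch below is a minimal Flutter/Dart illustration of that pattern, assuming a plain `ScrollController`; the class and threshold are hypothetical and not part of ollama-app or its chat library.

```dart
import 'package:flutter/widgets.dart';

/// Hypothetical helper: follow new chat content only while the user
/// is already at (or near) the bottom of the scroll view.
class PinnedScroll {
  final ScrollController controller;
  bool _pinned = true; // true => keep auto-scrolling on new content

  PinnedScroll(this.controller) {
    controller.addListener(() {
      final pos = controller.position;
      // Treat the user as "pinned" while they are within ~40 px of the
      // bottom; scrolling further up disables auto-scroll until they
      // return to the bottom themselves.
      _pinned = pos.maxScrollExtent - pos.pixels < 40;
    });
  }

  /// Call whenever a new chunk of the streamed response arrives.
  void onNewContent() {
    if (!_pinned) return;
    // maxScrollExtent is only correct after layout, so defer to the
    // end of the current frame before jumping.
    WidgetsBinding.instance.addPostFrameCallback((_) {
      if (controller.hasClients) {
        controller.jumpTo(controller.position.maxScrollExtent);
      }
    });
  }
}
```

Whether this is feasible here depends on whether the chat library exposes its `ScrollController` or otherwise lets the app take over scrolling; if it scrolls unconditionally on its own, the option really would have to be added upstream.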

travist85 commented 1 month ago

Just thought I'd drop a +1 for this. Every time I ask a question, I have to wait for the response to finish, then scroll back up and start reading it from the top again. Disabling auto-scroll would be a big improvement!