Closed suhail-singh closed 9 months ago
Hi @suhail-singh. Thank you for your feedback. I will see how to do it.
@suhail-singh do you need auto-scrolling in chat mode only, or would it also be useful in other cases?
@suhail-singh Add (setq ellama-auto-scroll t) to your configuration. It works for the ellama chat buffer only and is disabled by default. It will be available on package archives soon.
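For example, a minimal init-file snippet (the use-package form is just one common convention for structuring this; it is not required by ellama):

```elisp
;; Enable auto-scrolling in the *ellama* chat buffer.
;; `ellama-auto-scroll' is nil (disabled) by default.
(setq ellama-auto-scroll t)

;; Or, equivalently, if you manage packages with use-package:
(use-package ellama
  :custom
  (ellama-auto-scroll t))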
The second part relates to https://github.com/ahyatt/llm/issues/6
You can also try to create a model from an existing one in ollama with an additional stop parameter, as described here: https://github.com/jmorganca/ollama/blob/main/docs/modelfile.md#modelfiles-in-ollamaailibrary
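A sketch of what such a Modelfile might look like, per the linked ollama docs (the base model name and stop string below are illustrative placeholders, not values from this thread):

```
# Modelfile: derive a new model from an existing one,
# adding a stop parameter to cut off runaway generations.
FROM starling-lm
PARAMETER stop "<stop-token-here>"
```

You would then build it with `ollama create my-model -f Modelfile` and point ellama at `my-model`.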
> Or would it also be useful in other cases?

Could you please explain some of the other "cases" you were wondering about? I.e., what are the commands where the ellama-auto-scroll setting currently wouldn't be respected?
All commands not related to chat in the *ellama* buffer.
Sorry, but I don't understand what "not related to chat" specifically means in the present context.

If I understand you correctly, ellama-ask would be covered by ellama-auto-scroll, since it's an alias for ellama-chat. Would ellama-summarize currently be scrollable as well? If not, then yes, it would be useful for the output there to be scrollable as well (for the same reasons as noted above).
Btw, since I haven't yet tried the update, a point worth mentioning: the desirable ability is for the user to be able to scroll at will while the model is still responding to the query. I.e., if setting ellama-auto-scroll to t makes the point always stay at the most recent utterance (as the output is being generated), that too would be problematic from a usability perspective.
The user can scroll even without that latest change.
> Would ellama-summarize currently be scrollable as well? If not, then yes, it would be useful for the output there to be scrollable as well (for the same reasons as noted above).
ellama-summarize is not covered by this change.
> The user can scroll even without that latest change.
Not when I created this issue, and not on tag 0.4.0. You can scroll over past responses, but not over the response currently being generated. Any attempt to move the point past the beginning of the currently generated response resets the point back to the beginning of that response.

If the above is not what you're observing on your machine, I can look into it further via emacs -Q.
When the response is still being generated (as of commit d089d66), this is what I observe:

- With ellama-auto-scroll as nil, the situation is the same as I described here.
- With ellama-auto-scroll as t, the point (and thus the *ellama* buffer) follows the most recently generated token, which is also problematic.

I.e., the issue still remains and isn't resolved (from my perspective). Please let me know if you have trouble reproducing.
Now I realize what the problem is. I will fix it.
@suhail-singh check version 0.4.3; it should be fixed.
@s-kostyaev 0.4.3 fixes #22. thanks!
Currently, when a model is generating a response, it doesn't seem possible to scroll past the beginning of the response. Any attempt to do so results in point being reverted to the beginning of the response.

Context and motivation
Some models can generate long responses that extend past the visible area. It is convenient to be able to scroll the output and read the response as it is being generated.
Additionally, some models (e.g. starling-lm) can generate runaway responses. In such cases it may be necessary, at times, to review the existing output to determine whether or not the model needs to be forcibly stopped.