s-kostyaev / ellama

Ellama is a tool for interacting with large language models from Emacs.
GNU General Public License v3.0
378 stars 27 forks

Possible? A hook that is called once streaming output is done #26

Closed. tvraman closed this issue 6 months ago.

tvraman commented 7 months ago

Not sure how we would build that; perhaps it needs to be part of the llm.el package?

s-kostyaev commented 7 months ago

It is in the llm package. Are you sure that you need it in ellama? Can you explain your use case? It can be implemented in ellama, but I don't understand why it should be here.
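
For reference, a rough sketch of what this looks like on the llm side (my-provider is a placeholder, and the callback order should be double-checked against the llm docs):

```elisp
;; Sketch only: llm-chat-streaming takes a partial-text callback, a
;; response callback fired once streaming is done, and an error callback.
(require 'llm)

(llm-chat-streaming
 my-provider                                   ; placeholder provider object
 (llm-make-simple-chat-prompt "Hello")
 (lambda (partial) (message "streamed %d chars so far" (length partial)))
 (lambda (response) (message "done: %d chars" (length response)))
 (lambda (type msg) (message "llm error %s: %s" type msg)))
```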

tvraman commented 7 months ago

Agreed that it likely belongs in LLM, but I asked because:

  1. My use-case is to produce auditory feedback (emacspeak), e.g. a short auditory tick as the results stream in.

  2. That feels like a front-end thing, and llm.el is designed to be separate from various front-ends.

See https://github.com/tvraman/emacspeak

Also, I'll create a separate issue for the next question below so you can track it:

Should ellama-mode derive from comint-mode? We'd then get the ability to navigate through the various steps in a conversation for free.
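
A rough illustration of what that could mean (hypothetical mode name, not ellama's actual code):

```elisp
;; Hypothetical sketch: a chat mode derived from comint-mode inherits
;; prompt navigation (comint-previous-prompt / comint-next-prompt) and
;; input history for free.
(define-derived-mode ellama-chat-mode comint-mode "ellama-chat"
  "Major mode for ellama conversations, derived from `comint-mode'."
  (setq comint-prompt-read-only t))
```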


s-kostyaev commented 7 months ago

@tvraman I see. So, based on your use case, this callback should be a function with the generated text as an argument?

tvraman commented 7 months ago

That would be one possibility. Another would be to provide a function that I can advise via defadvice.

There are two possible types of behavior I'd like to be able to implement.

  1. Content is automatically spoken as it is streamed -- might get annoying and not always desirable.
  2. The user hears a short auditory icon, and when streaming stops, hits a key to hear the accumulated response for the last query (a sketch follows below).
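
Whichever mechanism exposes the completion point (a callback argument or an advisable function), the emacspeak side might look roughly like this (emacspeak-auditory-icon and dtk-speak are emacspeak's; the rest is hypothetical glue):

```elisp
;; Hypothetical sketch for behavior 2: play a short auditory icon when a
;; response completes and stash the text so a later keypress can speak it.
(defvar my/ellama-last-response nil
  "Text of the most recently completed ellama response.")

(defun my/ellama-on-done (text)
  "Signal completion audibly and remember TEXT for later playback."
  (setq my/ellama-last-response text)
  (emacspeak-auditory-icon 'task-done))

(defun my/ellama-speak-last-response ()
  "Speak the most recently completed ellama response."
  (interactive)
  (when my/ellama-last-response
    (dtk-speak my/ellama-last-response)))
```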


s-kostyaev commented 7 months ago

@tvraman in version 0.4.5 the function ellama-stream supports an :on-done callback parameter. Does it help you?
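
For example, a minimal sketch (assuming :on-done is called with the generated text, as discussed above; emacspeak-auditory-icon is emacspeak's):

```elisp
;; Sketch: play an auditory icon and report length once streaming is done.
(ellama-stream
 "Summarize the current buffer"
 :on-done (lambda (text)
            (emacspeak-auditory-icon 'task-done)
            (message "ellama: response ready (%d chars)" (length text))))
```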

tvraman commented 7 months ago

Not directly -- I will need to think it through. It would be better if the callback could be specified by setting a variable.
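
Something like a hook variable would do it (hypothetical names, not current ellama API):

```elisp
;; Hypothetical sketch of the variable-based approach: an abnormal hook
;; that ellama would run with the generated text after streaming finishes,
;; so users can add-hook instead of passing :on-done at each call site.
(defcustom my/ellama-stream-done-functions nil
  "Functions called with the generated text when ellama finishes streaming."
  :type 'hook
  :group 'ellama)

;; Consumer side (emacspeak):
(add-hook 'my/ellama-stream-done-functions
          (lambda (_text) (emacspeak-auditory-icon 'task-done)))

;; Ellama side would then run:
;; (run-hook-with-args 'my/ellama-stream-done-functions generated-text)
```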
