Closed: tvraman closed this issue 10 months ago.
It is in the llm package. Are you sure you need it in ellama? Can you explain your use case? It could be implemented in ellama, but I don't understand why it should be here.
Sergey Kostyaev writes:
Agreed that it likely belongs in LLM, but I asked because:
My use case is to produce auditory feedback (emacspeak), e.g. a short auditory tick as the results stream in.
That feels like a front-end thing, and llm.el is designed to be separate from various front-ends.
See https://github.com/tvraman/emacspeak
Also, I'll create a separate issue for the next question below so you can track it:
Should ellama-mode derive from comint-mode? We'd then get the ability to navigate through the various steps in a conversation for free.
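For context, a minimal sketch of what deriving from comint-mode might look like. The mode name and prompt regexp here are hypothetical, not taken from ellama:

```elisp
(require 'comint)

;; Hypothetical sketch: deriving ellama's chat buffer mode from
;; comint-mode would inherit input history and prompt navigation
;; (M-p / M-n, C-c C-p / C-c C-n) with no extra code.
(define-derived-mode ellama-comint-mode comint-mode "Ellama"
  "Major mode for ellama conversations, derived from `comint-mode'."
  ;; Assume "> " at the start of a line marks a conversation prompt.
  (setq-local comint-prompt-regexp "^> ")
  (setq-local comint-prompt-read-only t))
```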
@tvraman I see. So, based on your use case, this callback should be a function with the generated text as an argument?
Sergey Kostyaev writes:
That would be one possibility. Another possibility would be to provide a function that I can advise via defadvice.
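A sketch of the advice-based approach using the modern advice-add mechanism. The function name ellama--insert-chunk is an assumption for illustration, not an existing ellama symbol; emacspeak-auditory-icon is emacspeak's standard entry point for auditory icons:

```elisp
;; Hypothetical sketch: if ellama exposed a function called once per
;; streamed chunk (the name `ellama--insert-chunk' is an assumption),
;; emacspeak could advise it to play a short auditory tick.
(defun my-ellama-auditory-tick (&rest _args)
  "Give brief auditory feedback as results stream in."
  (when (featurep 'emacspeak)
    (emacspeak-auditory-icon 'progress)))

(advice-add 'ellama--insert-chunk :after #'my-ellama-auditory-tick)
```

The advantage of advising a named function over passing a callback is that the front end (emacspeak) can hook in without every call site having to thread the callback through.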
There are two possible types of behavior I'd like to be able to implement.
@tvraman In version 0.4.5, the function ellama-stream supports an :on-done callback function parameter. Does it help you?
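A sketch of how the :on-done parameter might be used, assuming (as the thread implies) that the callback receives the generated text; the prompt string and lambda body are illustrative only:

```elisp
;; Sketch: pass :on-done to `ellama-stream' (available since ellama
;; 0.4.5, per the message above). Assumes the callback is called with
;; the full generated response once streaming finishes.
(ellama-stream
 "Summarize this buffer"
 :on-done (lambda (text)
            (message "ellama finished: %d chars" (length text))))
```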
Sergey Kostyaev writes:
Not directly -- I'll need to think it through. It would be better if the callback could be specified by setting a variable.
Not sure how we would build that; perhaps it needs to be part of the llm.el package?
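One way the variable-based approach could look, sketched as an abnormal hook in llm.el style. The name llm-stream-done-functions is an assumption for illustration, not an existing llm.el variable:

```elisp
;; Hypothetical sketch: a hook variable that a front end such as
;; emacspeak could set globally, instead of passing a callback at
;; every call site.
(defcustom llm-stream-done-functions nil
  "Abnormal hook run with the generated text when streaming finishes."
  :type 'hook
  :group 'llm)

;; llm.el (or ellama) would then run it from its :on-done handling:
;; (run-hook-with-args 'llm-stream-done-functions text)
```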