Closed stephenwithav closed 6 months ago
Not yet. But it's possible in principle. I will think about it.
Could you explain your use case more?
I am working around that by saving the session's log to my disk. When I come back, I say hi to the LLM and then copy-paste the log into the *ellama* buffer, but I think it could be much more streamlined.
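That manual workaround could be wrapped in a small helper. A minimal sketch, assuming the chat lives in a buffer named *ellama* and using a hypothetical ~/ellama-logs/ directory (both are assumptions, not part of ellama's API):

```elisp
;; Sketch of the manual workaround: save the *ellama* chat buffer to a
;; dated log file on disk.  The "*ellama*" buffer name and the log
;; directory are assumptions.
(defun my/ellama-save-log ()
  "Save the *ellama* buffer to a dated file in ~/ellama-logs/."
  (interactive)
  (when-let ((buf (get-buffer "*ellama*")))
    (let ((file (expand-file-name
                 (format-time-string "standup-%Y-%m-%d.org")
                 "~/ellama-logs/")))
      (make-directory "~/ellama-logs/" t)
      (with-current-buffer buf
        (write-region (point-min) (point-max) file))
      (message "Saved ellama log to %s" file))))
```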
I will see how to implement it. Thank you for the request.
> Could you explain your use case more?
Here's a gist of this morning's fake standup. If prompting in the morning with ellama-ask with yesterday's conversation allowed me to resume, it would be helpful. The context window for llama2 maxes out at 4096 tokens, I think, so this may be a useless request.
Do you paste the whole log, @tusharhero? Or do you C-x h M-x ellama-ask-about?
Ollama supports LoRAs. Would it be possible to add to the LoRA after each interaction?
I don't think it will work inside one session without resending previous messages.
You're right that LoRAs aren't the answer; I was thinking about that in bed this morning. LoRAs are more for creating a derivative model. They're cool, but not quite what we need here.
> Do you paste the whole log, @tusharhero? Or do you C-x h M-x ellama-ask-about?
I just paste the whole thing.
@stephenwithav A feature to manage ellama sessions is planned; I will think more about how to implement it. For now, you can try saving your standups into a file, opening it, writing a new question, and using the new command ellama-complete, which will continue your buffer. Not sure if it will work well, but you can try it and return with feedback.
Thanks, @s-kostyaev. I'll test when it's available in MELPA.
It's there. Update ellama and try it out.
Doom Emacs was the culprit. I didn't realize doom sync required a -u flag to update to the newest packages.
My current approach is here, so I'll test tomorrow.
Do you still want me to try pasting in the previous days standup each morning?
Try opening the file with your standups, adding what you want, and calling ellama-complete to get a response from the model. Maybe you need to insert the model nick first, but you can write an elisp helper for that. And return with feedback; I would like to know if it works for you.
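Such an elisp helper might look like the following sketch. The file path and the heading format used for the "model nick" are my guesses, not a confirmed convention of ellama:

```elisp
;; Hypothetical helper: open the standup file, append a question under
;; a user heading, then continue the buffer with `ellama-complete'.
;; The path and the "User:"/"Assistant:" headings are assumptions.
(defun my/ellama-resume-standup (question)
  "Append QUESTION to the standup file and ask ellama to complete it."
  (interactive "sQuestion: ")
  (find-file "~/standups/standup.org")   ; path is an assumption
  (goto-char (point-max))
  (insert "\n** User:\n" question "\n** Assistant:\n")
  (ellama-complete))
```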
Thanks for the tip, I'll try that.
One thing I find useful is to just M-: and evaluate the following s-expression: (while t (call-interactively #'ellama-ask)). It's as close as I can get to a Bard/ChatGPT-like experience. Is there a plan to incorporate something like that?
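That loop can be made slightly friendlier by wrapping it in a named command that exits cleanly on C-g. The command name is mine, not part of ellama; only ellama-ask comes from the package:

```elisp
;; Repeatedly prompt with `ellama-ask' until the user quits with C-g.
;; `ellama-ask' is ellama's command; the wrapper itself is a sketch.
(defun my/ellama-chat-loop ()
  "Call `ellama-ask' in a loop for a chat-style back-and-forth."
  (interactive)
  (condition-case nil
      (while t
        (call-interactively #'ellama-ask))
    (quit (message "ellama chat loop stopped"))))
```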
M-x ellama-ask-selection just summarized the existing selection. ellama-complete did as well.
That's bad. So you can wait until I implement proper session saving and loading; it will take some time. Thank you for the feedback.
Fixed in #49
Feedback is welcome
The session management definitely has potential. While I'm unable to resume earlier sessions, it's useful to have a history of earlier chats.

My initial thoughts:

- doom-docs-mode instead of org-mode? The Suggest edits and Help links don't work.
- org-journal works, would be useful. Each daily standup conversation begins with "Good morning", so C-c e a i reloads the earlier session if I accidentally save it.
- ellama-code-review buffers aren't interactive. Should they be?
- .org files in ~/.emacs.d/.local/cache open as Doom Docs. Changing ellama-sessions-directory to ~/.emacs.d/.ellama-sessions resolved this issue.
- SPC u as an alternative to C-u ... I didn't realize that.
- The *scratch* buffer is never saved due to changes. It would be useful, imo, for ellama to have similar buffers that are never saved. Suppose you can't remember something, like the name of/keybinding from a new-to-you package; a quick C-c e i d for a disposable buffer could be helpful.

Thank you for ellama!
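The disposable-buffer idea could be prototyped along these lines. Everything here is speculative: the command name, the buffer name, and the interaction with ellama's session saving are all assumptions about how a real implementation might look:

```elisp
;; Sketch of a "disposable" ellama chat: a throwaway buffer that is
;; never written to `ellama-sessions-directory'.  All names here are
;; hypothetical, not part of ellama's API.
(defun my/ellama-disposable-chat ()
  "Open a throwaway ellama buffer that is never saved to disk."
  (interactive)
  (let ((buf (get-buffer-create "*ellama-disposable*")))
    (with-current-buffer buf
      (org-mode)
      (setq-local buffer-offer-save nil)   ; never prompt to save
      (setq buffer-file-name nil))         ; nothing to save to
    (pop-to-buffer buf)))
```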
I'm a solo developer, so I decided to test a daily standup with ellama this morning (ollama backend). Would it be possible to resume this session tomorrow? I'm assuming not, because Ollama's responses aren't static, but it would be kinda cool.