Closed: thanhvg closed this issue 1 year ago.
Yes, I would also like to have this functionality.
@thanhvg Thanks for the PR!
The reason I haven't implemented persistence yet is because I don't know a good way to do it. Right now gptel uses text properties to determine who said what, i.e. to distinguish between what you typed and what ChatGPT generated. These are not saved to disk along with the buffer contents, so upon loading the buffer this information is lost, making the conversation unrecoverable.
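To make the text-property mechanism concrete, here is a minimal sketch of the approach (the property name `my-chat-role` is hypothetical, chosen for illustration; it is not gptel's actual internal name):

```elisp
;; Sketch of tracking responses with buffer text properties.
;; NOTE: `my-chat-role' is a hypothetical property name for illustration.
(defun my/mark-response (beg end)
  "Mark the region from BEG to END as LLM-generated output."
  (put-text-property beg end 'my-chat-role 'response))

(defun my/response-at-p (pos)
  "Return non-nil if the text at POS was marked as a response."
  (eq (get-text-property pos 'my-chat-role) 'response))
```

Because `put-text-property` stores this data only in the live buffer object, none of it survives `save-buffer`: the file on disk contains plain text, which is exactly the persistence problem described here.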
Other ChatGPT packages deal with this by adding structure to the conversation, where what you type and the generated response are formatted differently: e.g. wrapping responses in `#+begin_src ai` blocks, inserting separators (`----`) before and after responses, etc. My goal is to have freeform interaction in any buffer without markup constraints; this lets the user format things any way they want. I'm almost there now: you can select a region of text in any buffer and call `gptel-send`. If the region includes past responses from ChatGPT, those are identified as responses and sent as well.
So I'll need to think about persistence a little more. The only other idea I have is to store the metadata on disk separately and reapply it when opening the file (or when calling `gptel-enable-current-buffer`), but this is a fragile solution that breaks easily depending on how the file is modified.
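As a rough illustration of that sidecar-file idea (every name here is hypothetical, not real gptel code), one could serialize the response ranges next to the file and reapply them on load:

```elisp
;; Hypothetical sketch: persist marked-response ranges to a sidecar file.
;; Assumes responses were marked with the illustrative `my-chat-role' property.
(defun my/save-response-ranges (file)
  "Write the (BEG . END) ranges marked as responses to FILE.el."
  (let (ranges (pos (point-min)))
    ;; Walk the buffer, collecting each contiguous marked span.
    (while (setq pos (text-property-not-all pos (point-max)
                                            'my-chat-role nil))
      (let ((end (or (text-property-any pos (point-max)
                                        'my-chat-role nil)
                     (point-max))))
        (push (cons pos end) ranges)
        (setq pos end)))
    ;; Serialize the ranges as a readable Lisp form.
    (with-temp-file (concat file ".el")
      (prin1 (nreverse ranges) (current-buffer)))))
```

The fragility is visible in the data itself: the stored buffer positions stop lining up as soon as the file is edited by anything that doesn't also update the sidecar, which is the breakage mentioned above.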
If you have any ideas, please let me know.
Adding metadata below the headlines doesn't sound bad to me. Org `:PROPERTIES:` drawers seem like a good fit for this.
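For instance, a drawer-based scheme might look something like the following (the property names are made up for illustration; gptel does not necessarily use these):

```org
* Conversation
:PROPERTIES:
:GPTEL_MODEL: gpt-3.5-turbo
:END:

** What does mapcar do in Elisp?

*** Response
:PROPERTIES:
:GPTEL_ROLE: assistant
:END:
```

Since drawers are part of the saved file, the role metadata would survive a round trip to disk without any external bookkeeping.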
Yes, that is a good solution for org-mode buffers. For markdown, text or prog-mode buffers, it's looking like the only possible solutions are
I would just introduce similar metadata blocks within the document (source blocks, comments, ...). You could also put the metadata at the end of the document, but that would make the diffs noisier. The same applies to metadata stored outside the document; I would avoid that.
I found a more feature-rich package: https://github.com/CarlQLange/chatgpt-arcana.el. It saves conversations in ~/.emacs.d/.local/cache/chatgpt-arcana/sessions, which is quite useful. Just FYI.
Hi,
Thank you for your comprehensive review of GPT clients for Emacs on Reddit. I like yours the most :).
This merge request allows us to save the current session as a Markdown file. Later, when we open the file again, `gptel-enable-current-buffer` will resume the session. This solution may not be the best; perhaps a derived mode based on markdown-mode and gptel for a dedicated chat file (for example, a .gpt extension) would be a better solution. However, it scratches the itch for me.