karthink / gptel

A simple LLM client for Emacs
GNU General Public License v3.0

[FR] To isolate the response from the next and previous text #103

Closed Ypot closed 9 months ago

Ypot commented 1 year ago

To keep the response from getting mixed in with the original text, I would add a "Response to:" option.

Maybe one of these:

  • block (-b)
  • quote (-q)

That way the response is separated from the previous and next text. Example:

  • Conquest of Egypt

The Fatimid conquest of Egypt took place in 969, as the troops of the Fatimid Caliphate under the general Jawhar captured Egypt, then ruled by the autonomous Ikhshidid dynasty in the name of the Abbasid Caliphate. The Fatimids had launched repeated unsuccessful invasions of Egypt soon after coming to power in Ifriqiya in 909.

#+BEGIN_QUOTE
I apologize for any confusion, but it seems like there might have been a misunderstanding. The Fatimid conquest of Egypt in 969 is not directly related to the early 20th-century labor strikes 
(ChatGPT 3.5 Turbo, 2023-08-19)

#+END_QUOTE

By the 960s, the collapse of the Ikhshidid regime, and an economic crisis and factional infighting in Egypt, allowed Fatimid caliph al-Mu'izz (coin pictured) to organize a large expedition to conquer the country, aided by the activity of a network of Fatimid agents there. When the Fatimid invasion came, the Ikhshidid elites negotiated a peaceful surrender, and the brief resistance of the Ikhshidid soldiery was overcome. Jawhar took control of Fustat, the Egyptian capital, on 6 July. Jawhar served as viceroy of Egypt until 973, when al-Mu'izz arrived and took up residence in a new capital, Cairo, which became the seat of the Fatimid Caliphate.

Ypot commented 1 year ago

Or maybe use inline tasks. You need to evaluate (require 'org-inlinetask) first. Then:

*** My headline.
Body of the headline

The question (the sent text) would be here.

(And the answer would be as follows):

*************** COMMENT [LLM]. ([year]). /[Response's first words]/. In ([GPTEL TOPIC] or [My headline])
:PROPERTIES:
:GPTEL_MODEL: gpt-3.5-turbo
:GPTEL_TOPIC: 
:GPTEL_SYSTEM: 
:END:

CHATGPT response here
*************** END

Here the body of the headline could continue

...
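A note on the setup this assumes: org-inlinetask ships with Org itself and only needs to be loaded; its org-inlinetask-min-level option defaults to 15, which is why the inline task above starts with fifteen asterisks. A minimal init sketch:

;; Minimal sketch: load inline-task support after Org itself loads.
;; `org-inlinetask-min-level' (default 15) controls how many leading
;; asterisks mark an inline task, hence the fifteen stars above.
(with-eval-after-load 'org
  (require 'org-inlinetask))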
karthink commented 1 year ago

To keep the response from getting mixed in with the original text, I would add a "Response to:" option.

Maybe one of these:

  • block (-b)
  • quote (-q)

That way the response is separated from the previous and next text. Example:

I don't understand your example. You want ChatGPT's response to be enclosed inside (say) quotes, so that the interaction looks like the following:

*** My question here

#+begin_quote
ChatGPT's response here.
#+end_quote

*** My next question here.

#+begin_quote
ChatGPT's response here.
#+end_quote

Is this correct?

Ypot commented 1 year ago

Yes, and with automatic signature.

I am thinking of the case where there is no dedicated buffer for ChatGPT, but you are working inside an already existing buffer, where responses get mixed with your existing text.

But I think I prefer using inline tasks. There, the PROPERTIES drawer could hold information about the ChatGPT version and its configuration. And, if marked COMMENT, they are not exported.

Ypot commented 1 year ago
*** My headline.
Body of the headline

My question here

*************** COMMENT OpenAI. (2023). /ChatGPT: Response's first words/ (Aug 20 version). In ("GPTEL TOPIC" or "My headline")
:PROPERTIES:
:GPTEL_MODEL: gpt-3.5-turbo
:GPTEL_TOPIC: 
:GPTEL_SYSTEM: 
:END:

CHATGPT response here
*************** END

The body of the headline continues here

...

If using quotes instead:

*** My headline.
Text here.
My question here

#+begin_quote
ChatGPT's response here.
-- (ChatGPT 3.5 Turbo, 2023-08-19)
#+end_quote

More text here

My next question here
...
karthink commented 9 months ago

@Ypot This is now possible via #142, at least when gptel-mode is turned on. See the documentation for the variables gptel-prompt-prefix-alist and gptel-response-prefix-alist.
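A minimal sketch of how those variables are meant to be used (the mode and strings below are illustrative assumptions, not necessarily gptel's shipped defaults): each alist maps a major mode to a string that gptel inserts, when gptel-mode is enabled, before the next prompt and before the response, respectively.

;; Illustrative sketch with assumed strings, not gptel's defaults
(setf (alist-get 'markdown-mode gptel-prompt-prefix-alist) "### ")
(setf (alist-get 'markdown-mode gptel-response-prefix-alist) "> ")

An org-mode configuration along these lines, producing the quote-block layout proposed above, appears in karthink's final comment below.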

Ypot commented 9 months ago

Thanks. I don't know how to do it yet, but I am closing this issue as I believe it is now possible. Best!

karthink commented 9 months ago

Thanks. I don't know how to do it yet, but I am closing this issue as I believe it is now possible.

*** My headline.
Text here.
My question here

#+begin_quote
ChatGPT's response here.
-- (ChatGPT 3.5 Turbo, 2023-08-19)
#+end_quote

More text here

My next question here
...

To reproduce this format in org-mode, for example,

;; Place ChatGPT response between quotes
(setf (alist-get 'org-mode gptel-prompt-prefix-alist)
      "#+end_quote\n")
(setf (alist-get 'org-mode gptel-response-prefix-alist)
      "#+begin_quote\n")

;; Add LLM model info and date at the end of the response
(add-hook 'gptel-post-response-hook
          (defun my/gptel-add-model-info ()
            (save-excursion
              ;; Find the "#+end_quote" that the prompt prefix inserts
              ;; after the response
              (search-forward
               (alist-get 'org-mode gptel-prompt-prefix-alist)
               nil t)
              ;; Move point back up three lines, into the quote block
              (previous-line 3)
              ;; Append a signature like "-- (ChatGPT: gpt-3.5-turbo, 2023-08-19)"
              (insert
               "\n-- (" (gptel-backend-name gptel-backend)
               ": " gptel-model
               (format-time-string ", %Y-%m-%d)")))))
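
With gptel-mode enabled in the org buffer, each gptel-send should then leave the response wrapped in a #+begin_quote ... #+end_quote block with the signature line appended, roughly as in the mock-up quoted above. One caveat: in more recent gptel releases gptel-model may be a symbol rather than a string, in which case the insert call above would need something like (format "%s" gptel-model).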