karthink / gptel

A simple LLM client for Emacs
GNU General Public License v3.0

LaTeX rendering in GPTel buffer #266

Closed daedsidog closed 3 months ago

daedsidog commented 3 months ago

I really think the dedicated chat buffer should offer LaTeX rendering out of the box like most online LLM interfaces.

(screenshot: LaTeX rendered in the chat buffer)

Right now I have this implemented in my config via a hack: I switch to Org mode, call `(org-latex-preview)`, then switch back to the previous major mode (I use Markdown for my chat because I feel the models tend to "know" it better).

The only thing that isn't preserved across the mode switch is the text scale, which I save and restore manually. I don't know what else may break. All this jumping through hoops is because org-latex-preview doesn't work outside Org mode.
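For reference, a minimal sketch of the workaround described above might look like the following. The function name is hypothetical, and the actual code on the wiki may differ; `'(16)` is the double-prefix argument that tells `org-latex-preview` to process the whole buffer.

```elisp
(defun my/gptel-preview-latex ()
  "Render LaTeX previews in a non-Org gptel chat buffer (hack).
Temporarily switches to Org mode so `org-latex-preview' can run,
then restores the previous major mode and text scale."
  (interactive)
  (let ((prev-mode major-mode)
        ;; `text-scale-mode-amount' is lost when the mode changes,
        ;; so save it before switching.
        (scale (if (boundp 'text-scale-mode-amount)
                   text-scale-mode-amount
                 0)))
    (org-mode)
    (org-latex-preview '(16))   ; preview all fragments in the buffer
    (funcall prev-mode)         ; restore Markdown (or whatever it was)
    (text-scale-set scale)))    ; restore the saved text scale
```

This is only a sketch of the approach; any other buffer-local state reset by the mode switch would need the same save/restore treatment.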

This works fine (see https://github.com/karthink/gptel/wiki#rendering-latex-in-dedicated-chat-buffer), and if it is up to your standard, I can integrate it (all the pertaining code is already on the wiki page).

karthink commented 3 months ago

@daedsidog Here's the same example in my Emacs without any hacks, post-response hook or additional processing -- I just opened a chat buffer and typed in the question. This is the response:

https://github.com/karthink/gptel/assets/8607532/0bd3d2dc-11c8-4a07-bc74-4a5d57b8fbd8

As you can see, the LaTeX experience is pretty good, easily on par with online LLM interfaces.

It's actually better than them, because I can then do this:

https://github.com/karthink/gptel/assets/8607532/1ef6e120-bba5-474b-9310-40bcbc7ece8a

I will explain later today -- the LaTeX preview situation in Emacs is quite complicated right now. I want to make sure I have the right context for my answer, so I have a question for you: are you aware of Org's upcoming LaTeX preview system rewrite?

daedsidog commented 3 months ago

@karthink Yes I am aware of it actually... however, even though I have the latest Org 9.7 from the repository, I definitely don't have the LaTeX experience you showcased! I figured what you showcased was not even available on the latest branch, and was just a configuration you had on top of the latest version.

karthink commented 3 months ago

> @karthink Yes I am aware of it actually... however, even though I have the latest Org 9.7 from the repository, I definitely don't have the LaTeX experience you showcased! I figured what you showcased was not even available on the latest branch, and was just a configuration you had on top of the latest version.

That is correct, except that it's not a configuration -- we rewrote the LaTeX preview system from scratch, because the version that exists now is extremely inefficient from the ground up. We get 100x-800x speedups, previewing is now non-blocking, and we can do live updates like in the video above.

It's available as a fork of Org mode here until we merge it, and I wrote some detailed usage instructions here.

It's written to be modular and not pull in all of Org mode, with the idea that other major modes can provide "adapters" to use it as well. Here's an example of using it in prog-mode and calc. (Proof of concept only.)

It's taking a while to merge because it's a pretty deep change to Org, and we have a lot of loose ends to tie up, like fixing LaTeX in the org-export system.


With that said, there's nothing for gptel to do: eventually live LaTeX previews will work automatically in Org and (assuming someone writes an adapter) in Markdown, and not just in dedicated chat buffers. In the meantime, your code on the wiki is a welcome addition; thanks for adding it!

I'll close this issue now, please reopen if there's something else specific to gptel+LaTeX you'd like to discuss.

daedsidog commented 3 months ago

@karthink Thanks for the fork, I'll definitely be checking it out.

Would still be nice to be able to use previews in Markdown as well.

EDIT: This really rocks. ~I'll get back to using Org mode for now in the dedicated chat buffer because of this.~ Actually, this also works in Markdown with the hack, but now with much less code. Thank you.