karthink / gptel

A simple LLM client for Emacs
GNU General Public License v3.0

gptel: Add `gptel-add-context` function #292

Closed · munen closed this 1 week ago

munen commented 2 months ago

Hi @karthink

Coming back to our short discussion 5 hours ago.

> I haven't looked around to see what other LLM clients offer since ~May 2023. Surely someone must have written something more featureful and seamless than gptel? A quick google search indicates that there are multiple (paid) commercial offerings for VSCode, I can't imagine them doing worse than gptel. This package is just a fancy wrapper around Curl.

Maybe I'm just an old, grumpy dude, but the commercial offerings I checked out didn't hold water. Cody was quite OK, but I couldn't get it to work with my own LLM API keys (OpenAI, for example). It does work well with Claude 2.0 and Ollama, though.

Having said that, I prefer the functionality of gptel. I appreciate that it has a small and simple API surface which nonetheless covers all the genuinely interesting interaction patterns. Add Emacs and the GPL into the mix, and it's a hard-to-beat combination. There was only one important feature missing for me, so I built it. Couldn't have done that with any of those proprietary systems(;

> The main reason is that gptel is a one-weekend-per-month project for me and that time is eaten up fixing bugs!

Thank you for your effort and wonderful project :pray: :bow:

> There are three features on the roadmap that I haven't had the time to work on: attaching context (this PR), supporting function calling, and multimodal support (mainly vision) in chats.

Happy to hear :+1: Maybe you'll like this implementation of attaching context. I tried to keep the implementation small but flexible. I don't have much experience using it so far, but I did dogfood it on this PR from the very beginning(; I'm also very much looking forward to using it to tackle some bigger projects next week.

> Copilot-style completion-at-point, a fourth feature, seems very difficult to do via the Chat APIs so I've shelved that plan for now.

Agreed. I think it's hard to do with the chat APIs. In my tests of the proprietary alternatives, they also didn't really yield better results than using the chat modality directly (when given enough context). So I don't mind intentionally omitting this feature.

> Yes, keeping gptel simple and focused is the only way for it to remain maintainable given my time constraints.

That's a smart choice! Apart from the time constraints, these AI-based projects tend to have a lot of feature creep and a short half-life. I appreciate your effort to keep this maintainable while at the same time making it all the more usable. Thank you :pray:

> @munen Love your Emacs work and presentations! Thank you for organice as well, it's very handy!

Thank you for your kind words :pray: I haven't had much time for FOSS lately, but this time away was needed so that I can come back to it in the long term. I'm looking forward to that time!

daedsidog commented 1 month ago

This could perhaps benefit from the features I have in my PR, i.e. removing context, highlighting context, a dedicated context buffer, etc.? I think those don't really conflict with the vision of this library, as long as they are opt-in and customizable, which isn't yet the case in my PR.

daedsidog commented 3 weeks ago

#256 is close to fruition, and I think it adds everything you want, making this PR obsolete in my opinion.

karthink commented 1 week ago

@munen A pretty expansive version of this feature has now been added, thanks to @daedsidog's efforts. Please let me know after trying it if it's missing any features you might want. If you have trouble finding/using the available features (after updating gptel), that will also serve as very useful feedback for us.
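In case a quick orientation helps once you update: the workflow, roughly, is to mark some text or visit a buffer/file and add it to the context before sending a request. A minimal sketch follows; the command names below are my reading of current master, not a definitive reference, so double-check them in your installed version (the transient menu via M-x gptel-menu is the easiest way to discover what's actually available):

```emacs-lisp
;; Rough sketch of the merged context workflow -- command names are an
;; assumption based on current master; verify against your install.
;;
;; 1. Select a region (or simply visit a buffer) you want the LLM to see.
;; 2. M-x gptel-add      ; add the region/buffer to gptel's context
;; 3. Use gptel as usual -- the added context is sent along with your prompt.
;;
;; The transient menu (M-x gptel-menu) also exposes inspecting and
;; removing context entries.
```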

I will close this now, but I'm looking forward to how you find the feature, so please let us know in this thread!