karthink / gptel

A simple LLM client for Emacs

[Looking for feedback] Add support for Function Calls #209

Open · isaacphi opened 4 months ago

isaacphi commented 4 months ago

Demo

I made a quick screen recording showing the proof of concept: https://screenapp.io/app/#/shared/6d146099-4bc3-4b34-afd7-bff3d16ee0ed

Overview

I've added support for Function Calling to gptel. This is just a proof of concept that I wanted to share before going any further.

First, I want to thank you for making this package! It's become a seamless part of my Emacs workflow because of how well it blends in, in particular using gptel-send for some of my own scripts. I think supporting function calls would fit in well with the design philosophy of this package, allowing users to define their own callable functions for their own use cases; I can think of several offhand.

I should note that function calls are not specific to OpenAI; they're available in many, if not all, of the other models you support.
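To make the shape concrete, here's a minimal sketch of the request body OpenAI expects when a function is declared, written as an elisp plist that `json-encode` turns into JSON. The field names (`functions`, `name`, `description`, `parameters`) come from OpenAI's Chat Completions API; the cowsay function is just the demo from this PR, and the model name is arbitrary. Newer revisions of the API use `tools`/`tool_calls` in place of `functions`/`function_call`, but the structure is the same.

```elisp
(require 'json)

;; Request body declaring one callable function.  `json-encode' maps the
;; keyword keys to JSON object keys and the vectors to JSON arrays.
(json-encode
 '(:model "gpt-3.5-turbo"
   :messages [(:role "user" :content "Have a cow say hello")]
   :functions [(:name "cowsay"
                :description "Make a cow say something"
                :parameters (:type "object"
                             :properties (:text (:type "string"
                                                 :description "The text the cow should say"))
                             :required ["text"]))]))
```

When the model decides to call the function, its response contains a `function_call` object holding the function name and a JSON string of arguments; the client runs the function and sends the result back in a follow-up message with role `function` so the model can produce its final answer.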

Before moving on, I'd like to know if this is even something you'd be interested in including in gptel. I'd also like your thoughts on how the user would interact with this feature; I have a rough approach in mind so far.

How to test this PR

  1. Run all the elisp in the file I included called test-user-config.el (a rough sketch of the kind of thing it sets up is shown after these steps)
  2. Open a chat using the model I called "OpenAI with function calls" (I turned off streaming because only non-streaming requests are supported for now)
  3. Ask the chat to create new files or to have a cow say something. If it doesn't recognize your prompt as relating to either of those topics, it behaves as it normally would.
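For reference, here is roughly the kind of thing test-user-config.el sets up: a handler for each function plus an OpenAI-style spec describing it. The sketch below is illustrative rather than a copy of the attached file; in particular, `my/gptel-function-specs` and the `:handler` key are placeholders for however the PR actually wires the specs into the "OpenAI with function calls" backend.

```elisp
;; Plain elisp handlers for the two demo functions.
(defun my/create-file (path contents)
  "Create PATH containing CONTENTS and return a short status string."
  (with-temp-file path (insert contents))
  (format "Created %s" path))

(defun my/cowsay (text)
  "Return TEXT as spoken by a cow (requires the cowsay executable)."
  (shell-command-to-string (format "cowsay %s" (shell-quote-argument text))))

;; OpenAI-style specs paired with their handlers.  How these reach the
;; backend is up to the PR; this variable is only a placeholder.
(defvar my/gptel-function-specs
  `((:name "create_file"
     :description "Create a new file with the given contents"
     :parameters (:type "object"
                  :properties (:path (:type "string")
                               :contents (:type "string"))
                  :required ["path" "contents"])
     :handler ,#'my/create-file)
    (:name "cowsay"
     :description "Have a cow say something"
     :parameters (:type "object"
                  :properties (:text (:type "string"))
                  :required ["text"])
     :handler ,#'my/cowsay)))
```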

Let me know what you think! Feel free to open a discussion if you think that's a more appropriate venue for this. Related issue: #76

karthink commented 4 months ago

This is fantastic -- and quite a small change too! Thanks for the PR, this looks quite promising.

I'm not familiar with the OpenAI function-calling API. I'll take a look at it soon and then think about the UI; it'll probably be a few days from now.