ahyatt / llm

A package abstracting llm capabilities for emacs.
GNU General Public License v3.0

Use Plz in OpenAI provider #27

Closed: r0man closed this 6 months ago

r0man commented 6 months ago

Hi @ahyatt

This changes the OpenAI provider to use plz. It is still WIP, but I'm opening it to get the ball rolling and so you can see how I imagine migrating the providers.

This is based on the other PR where I added the plz extensions.
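As a rough illustration of the shape of the change (not the PR's actual code), a plz-based chat request could look something like the sketch below, assuming plz's standard keywords (`:headers`, `:body`, `:as`, `:then`, `:else`). The function name, model, and payload shape are placeholders:

```elisp
(require 'plz)
(require 'json)

(defun my-openai-chat-async (key prompt callback)
  "Send PROMPT to the OpenAI chat endpoint using KEY; call CALLBACK with the reply text."
  (plz 'post "https://api.openai.com/v1/chat/completions"
    :headers `(("Authorization" . ,(concat "Bearer " key))
               ("Content-Type" . "application/json"))
    :body (json-encode
           `(("model" . "gpt-3.5-turbo")
             ("messages" . [(("role" . "user") ("content" . ,prompt))])))
    ;; Parse the response body as JSON before handing it to :then.
    :as #'json-read
    :then (lambda (response)
            ;; The parsed body is an alist; pull out the first choice's text.
            (let* ((choice (aref (alist-get 'choices response) 0))
                   (msg (alist-get 'message choice)))
              (funcall callback (alist-get 'content msg))))
    :else (lambda (err) (message "OpenAI request failed: %S" err))))
```

If I remember plz's defaults correctly, a synchronous variant just omits `:then`, in which case `plz` blocks and returns the parsed body directly.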

Some open questions:

Wdyt?

ahyatt commented 6 months ago

I've finished the OpenAI provider, and it seems to work; the code is shorter too, which is super nice.

The Gemini / Vertex provider, however, is not working, and I'm stuck on it. As I mentioned, it uses streaming JSON, so I don't think the current plz code can handle something like that, unless I'm mistaken. At minimum I need the text chunks, but ideally it would be the entire text so far. Let me know if that is possible and how, or if not, whether you plan to add it.
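To make the problem concrete: the Gemini/Vertex streaming endpoint returns one long JSON array of response objects rather than SSE events, so the client has to pull complete objects out of a growing buffer as data trickles in. Below is a rough sketch of that kind of incremental parsing, not the package's actual code; the helper names are placeholders, and the response shape (`candidates` / `content` / `parts` / `text`) follows my reading of the Gemini API:

```elisp
(require 'json)

(defun my-gemini-parse-new-objects (buffer)
  "Parse any newly completed JSON objects in BUFFER and return them as a list.
Point in BUFFER marks how far the stream has been successfully parsed."
  (with-current-buffer buffer
    (let (objects)
      (catch 'partial
        (while t
          ;; Skip the array punctuation between objects: [ , ] and whitespace.
          (skip-chars-forward "[,] \t\n\r")
          (when (eobp) (throw 'partial nil))
          (let ((start (point)))
            (condition-case nil
                ;; `json-read' moves point past one complete value.
                (push (json-read) objects)
              ;; Incomplete object: rewind and wait for more data to arrive.
              (error (goto-char start)
                     (throw 'partial nil))))))
      (nreverse objects))))

(defun my-gemini-chunk-text (object)
  "Extract the text delta from one streamed Gemini response OBJECT."
  (let* ((candidate (aref (alist-get 'candidates object) 0))
         (parts (alist-get 'parts (alist-get 'content candidate))))
    (mapconcat (lambda (part) (alist-get 'text part)) parts "")))
```

Each time new data arrives, it would be appended to the buffer and `my-gemini-parse-new-objects` called again; the text deltas from the returned objects can then be concatenated to produce "the entire text so far."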

r0man commented 6 months ago

Hi @ahyatt, nice! I will try the OpenAI provider soon. I'm also going to look into how I can support you with the JSON format from Gemini/Vertex.

ahyatt commented 6 months ago

Cool, thanks! FYI, I will probably change the OpenAI provider back to something closer to how you designed it initially, since I realized the Claude provider I just added uses event streaming with completely different and much more complicated event types.

ahyatt commented 6 months ago

FYI, I've now changed it so that there's a special method for event-source streaming, with more of the logic on the client side. I'll port Claude over as well today.
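For comparison with the Gemini case above, the client-side half of event-source (SSE) parsing is sketched below: each complete `data: ...` line carries one JSON event, and OpenAI terminates its stream with `data: [DONE]`. The helper name and return convention are illustrative only, not the package's actual method:

```elisp
(require 'json)

(defun my-sse-extract-events (accumulated)
  "Split ACCUMULATED raw response text into parsed SSE data events.
Return a cons (EVENTS . LEFTOVER), where LEFTOVER is the trailing
partial line to prepend to the next chunk of data."
  (let* ((lines (split-string accumulated "\n"))
         ;; The final element may be an incomplete line; keep it for later.
         (leftover (car (last lines)))
         (complete (butlast lines))
         events)
    (dolist (line complete)
      (when (string-prefix-p "data: " line)
        (let ((payload (substring line (length "data: "))))
          ;; OpenAI signals the end of the stream with "[DONE]".
          (unless (string= payload "[DONE]")
            (push (json-read-from-string payload) events)))))
    (cons (nreverse events) leftover)))
```

The LEFTOVER string would be carried between calls from whatever chunk callback the plz extensions expose, so that events split across network chunks are parsed only once complete.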