matrixgpt / matrix-chatgpt-bot

Talk to ChatGPT via any Matrix client!
GNU Affero General Public License v3.0

Switch to use the official API #75

Closed · bertybuttface closed 1 year ago

bertybuttface commented 1 year ago

It is now possible to access the ChatGPT model via the official API. Right now this is undocumented and unreleased.

Changes are minimal from a code point of view, but from a user's point of view they will have to add an API key.

We will therefore bump the major version number.
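For context, a minimal sketch of what the official-API path might look like from the bot's side, assuming the user supplies their key via an OPENAI_API_KEY environment variable and that the ChatGPT model is reachable through the standard completions endpoint; the model name below is a placeholder, not something confirmed in this thread:

```typescript
// Minimal sketch, not the bot's actual implementation.
// Assumptions: OPENAI_API_KEY env var, /v1/completions endpoint, placeholder model name.
const apiKey = process.env.OPENAI_API_KEY; // the new user-facing requirement

async function askChatGpt(prompt: string): Promise<string> {
  const res = await fetch("https://api.openai.com/v1/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model: "text-chat-davinci-002", // placeholder for the (then-undocumented) ChatGPT model
      prompt,
      max_tokens: 1000,
    }),
  });
  const data = await res.json();
  return data.choices[0].text;
}
```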

max298 commented 1 year ago

I've already changed the bot to use the official API. I'll make a pull request as soon as I've finished testing.

bertybuttface commented 1 year ago

Excellent, thanks. I'm guessing you used this: https://github.com/waylaidwanderer/node-chatgpt-api

We will also need to tidy the documentation up after this change.

bertybuttface commented 1 year ago

We are limited to a certain number of tokens for both the request to ChatGPT and the response from ChatGPT.

If we use node-chatgpt-api, it uses the same conversationId and parentMessageId that the ChatGPT Web UI does. This makes sense to ease the transition, but there are potentially other things we could do.

If you look at: https://github.com/waylaidwanderer/node-chatgpt-api/blob/c0903e220b18a131e9a6b94310b24c7b6b58c2ff/src/ChatGPTClient.js#L107

node-chatgpt-api uses the following algorithm to work out what to include as context for the message:

// Iterate through messages, building an array based on the parentMessageId.
// Each message has an id and a parentMessageId. The parentMessageId is the id of the message that this message is a reply to.
// The array will contain the messages in the order they should be displayed, starting with the root message.

That could produce too long a prompt, so they trim it down like this:

// I decided to limit conversations to 3097 tokens, leaving 1000 tokens for the response.
// Iterate backwards through the messages, adding them to the prompt until we reach the max token count.

That seems reasonable enough for now, but we could possibly do something smarter on our end.
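To make that concrete, here is a rough sketch of the approach in TypeScript (not the library's actual code; the token counter is a crude stand-in for whatever tokenizer node-chatgpt-api really uses):

```typescript
interface ChatMessage {
  id: string;
  parentMessageId?: string;
  role: "User" | "ChatGPT";
  text: string;
}

const MAX_PROMPT_TOKENS = 3097; // leaves ~1000 of the context window for the response

// Stand-in for a real tokenizer (e.g. a GPT BPE encoder); rough 4-chars-per-token estimate.
function countTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

// Walk parentMessageId links from the newest message back to the root,
// then reverse so the conversation reads oldest-to-newest.
function orderedThread(messages: ChatMessage[], latestId: string): ChatMessage[] {
  const byId = new Map(messages.map((m): [string, ChatMessage] => [m.id, m]));
  const chain: ChatMessage[] = [];
  let current = byId.get(latestId);
  while (current) {
    chain.push(current);
    current = current.parentMessageId ? byId.get(current.parentMessageId) : undefined;
  }
  return chain.reverse();
}

// Iterate backwards through the ordered thread, prepending messages to the
// prompt until the token budget is exhausted (oldest messages get dropped first).
function buildPrompt(messages: ChatMessage[], latestId: string): string {
  const thread = orderedThread(messages, latestId);
  let prompt = "";
  let tokens = 0;
  for (let i = thread.length - 1; i >= 0; i--) {
    const line = `${thread[i].role}: ${thread[i].text}\n`;
    const cost = countTokens(line);
    if (tokens + cost > MAX_PROMPT_TOKENS) break;
    prompt = line + prompt;
    tokens += cost;
  }
  return prompt;
}
```

Dropping the oldest messages first keeps the most recent context intact, which is usually what matters most for a chat reply.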

max298 commented 1 year ago

I didn't use the library you mentioned; I just bumped the version of the existing library, which also just added support for the official API. I've linked the pull request :)

max298 commented 1 year ago

So, OpenAI just turned off the API, making my PR obsolete (at least for now; I think this will become relevant again as soon as the subscription goes live). While it would be easy to change the model as suggested in the issue mentioned above or in the node-chatgpt-api repository, I think that misses the point of this bot, as it wouldn't really be using ChatGPT.

I've changed my PR back to WIP for now; maybe something changes soon. However, you can also close the PR, as I don't think I'll look into this until an official API comes back.