ujjwall-R / wrapGPT

Chatbot API built over GPT-3 which memorizes previous chat prompts/responses.

max token #1

Open ahmedRSA opened 1 year ago

ahmedRSA commented 1 year ago

if the history reaches 4000 tokens the api will not work; you either have to delete the full history or some older messages manually

what you can do is add a timestamp to each message and delete messages after a certain amount of time, so we never reach the 4000-token limit
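The timestamp idea could look something like this sketch (the `prune_history` helper and the `{"ts": ..., "text": ...}` storage shape are illustrative assumptions, not wrapGPT's actual schema):

```python
import time

def prune_history(history, max_age_s, now=None):
    """Keep only messages newer than max_age_s seconds.

    `history` is assumed to be a list of {"ts": ..., "text": ...} dicts;
    this shape is illustrative, not wrapGPT's actual schema.
    """
    now = time.time() if now is None else now
    return [m for m in history if now - m["ts"] <= max_age_s]

history = [
    {"ts": 0, "text": "old message"},
    {"ts": 3000, "text": "recent message"},
]
# With a one-hour window measured at t=4000, only the recent message survives.
print(prune_history(history, max_age_s=3600, now=4000))
```

Pruning on read (before building the prompt) rather than on a timer keeps the logic simple: no background job is needed.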

ahmedRSA commented 1 year ago

maybe something like this? this way we will never hit the 4000-token limit [screenshot attached]

ujjwall-R commented 1 year ago

You mean the prompt size? Actually, I used this wrapper at a company I am working for, and the moment the prompt size increases beyond a certain value an error is thrown; we used that as a check for the subscription.

Here, I see we can handle the database manually and erase the history, or the previous (maybe half of the) prompt text, from the database. What do you think?
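A minimal sketch of the "erase the previous half" fallback, assuming the history is stored as a chronological list (the helper name is hypothetical):

```python
def drop_older_half(history):
    """Discard the older half of a chronological message list."""
    return history[len(history) // 2:]

# The newest messages survive, so recent context is kept.
print(drop_older_half(["m1", "m2", "m3", "m4"]))  # keeps ["m3", "m4"]
```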

ahmedRSA commented 1 year ago

instead of deleting half the database history, you can just add a key to the POST route, like:

{
    "userid": "",
    "prompt": "",
    "size":  3 // take last 3 messages from chat history in prompt
}
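The proposed `size` key could be handled server-side roughly like this (`build_prompt` is an illustrative helper matching the JSON above, not wrapGPT's actual code):

```python
def build_prompt(history, prompt, size=3):
    """Prepend only the last `size` messages from history to the new prompt."""
    recent = history[-size:] if size > 0 else []
    return "\n".join(recent + [prompt])

history = ["hi", "hello!", "how are you?", "fine, thanks"]
# Only the last 3 stored messages are included before the new prompt.
print(build_prompt(history, "what did I say first?", size=3))
```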
ahmedRSA commented 1 year ago

what do you think?

ujjwall-R commented 1 year ago

I think using this approach removes the sole purpose of chat memorization. There are very common use cases where we want the bot to remember the last hour, or even the last day, of chat.

ahmedRSA commented 1 year ago

> I think using this approach removes the sole purpose of chat memorization. There are very common use cases where we want the bot to remember the last hour, or even the last day, of chat.

what if in the last hour the user/bot have a 100+ message chat history? how will you process it? you still have to pass a limited number of messages to the prompt.
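One common answer to the "100+ messages still have to fit" problem is to trim by token budget rather than message count: walk from the newest message backwards and keep whatever fits. A sketch, assuming a rough ~4-characters-per-token estimate (a real implementation would use the model's tokenizer):

```python
def estimate_tokens(text):
    # Very rough heuristic: ~4 characters per token for English text.
    return max(1, len(text) // 4)

def fit_to_budget(history, budget):
    """Keep the newest messages whose estimated token cost fits within budget."""
    kept, used = [], 0
    for msg in reversed(history):          # newest first
        cost = estimate_tokens(msg)
        if used + cost > budget:
            break                          # everything older is dropped too
        kept.append(msg)
        used += cost
    return list(reversed(kept))            # restore chronological order

history = ["x" * 400, "short reply", "another short one"]
# The long old message blows the budget, so only the two recent ones survive.
print(fit_to_budget(history, budget=10))
```

This keeps as much recent context as the limit allows, whether that is 3 messages or 30.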

ahmedRSA commented 1 year ago

ChatGPT uses the same approach, I think; it does not remember prompts and their responses older than 4-5 turns.

ahmedRSA commented 1 year ago

??

ujjwall-R commented 1 year ago

Are you sure?

But in one use case I created a QnA over a research paper, where the previous prompts were needed. So I think there might be several use cases like that.