flynnoct / chatgpt-telegram-bot

Telegram bot implemented with the official OpenAI ChatGPT API (gpt-3.5-turbo, released on 2023-03-01)
MIT License
180 stars 43 forks

Automatically clear the conversation context when the token limit is exceeded #41

Open emperorjoker opened 1 year ago

emperorjoker commented 1 year ago

This model's maximum context length is 4096 tokens. However, your messages resulted in 4119 tokens. Please reduce the length of the messages. Sorry, I am not feeling well. Please try again.

After exceeding the token limit, the only way to restore functionality is to run /clear to clear the conversation context.

Since this seems to be the only option, would it be possible to automatically execute the /clear command when a message is sent after the limit has been reached?
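
A minimal sketch of what that could look like, assuming the legacy openai-python 0.x client and a simple in-memory per-chat history; the names here (e.g. `conversations`, `ask`) are illustrative, not this repo's actual code:

```python
# Sketch: retry once with a cleared context when the request exceeds
# the model's context window, mimicking an automatic /clear.
import openai

conversations = {}  # chat_id -> list of {"role": ..., "content": ...}

def ask(chat_id, user_message):
    history = conversations.setdefault(chat_id, [])
    history.append({"role": "user", "content": user_message})
    try:
        response = openai.ChatCompletion.create(
            model="gpt-3.5-turbo",
            messages=history,
        )
    except openai.error.InvalidRequestError:
        # Context too long: behave like /clear and retry with only the new message.
        history.clear()
        history.append({"role": "user", "content": user_message})
        response = openai.ChatCompletion.create(
            model="gpt-3.5-turbo",
            messages=history,
        )
    reply = response["choices"][0]["message"]["content"]
    history.append({"role": "assistant", "content": reply})
    return reply
```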

peteisacat commented 1 year ago

Or automatically delete the earliest messages in the conversation?
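
A sliding-window variant of that idea, as a hedged sketch: drop the oldest messages until the history fits a token budget. It assumes tiktoken is available; the 4096-token limit and reserved completion budget are illustrative numbers, not the bot's actual configuration.

```python
# Sketch: trim the oldest messages until the history fits the budget,
# instead of clearing the whole conversation.
import tiktoken

MAX_CONTEXT_TOKENS = 4096
RESERVED_FOR_REPLY = 1024

def trim_history(messages, model="gpt-3.5-turbo"):
    enc = tiktoken.encoding_for_model(model)

    def count(msgs):
        # Rough count: content tokens only, ignoring per-message overhead.
        return sum(len(enc.encode(m["content"])) for m in msgs)

    budget = MAX_CONTEXT_TOKENS - RESERVED_FOR_REPLY
    while messages and count(messages) > budget:
        # Drop the earliest message first; a real implementation would
        # probably keep any system prompt pinned at the front.
        messages.pop(0)
    return messages
```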

flynnoct commented 1 year ago

We will implement this later.

em108 commented 1 year ago

I think this can also be done with embeddings. Another advantage is that they can store days' worth of conversations if you need long-term memory, and they can bring the cost of conversations down by up to 5x, depending on usage. I've been reading about ways to implement it from the sources below; a rough sketch of the retrieval idea follows after the links.

Article: https://towardsdatascience.com/generative-question-answering-with-long-term-memory-c280e237b144

Implementation examples: https://github.com/openai/openai-cookbook/blob/main/examples/Question_answering_using_embeddings.ipynb

There's also a guide on Pinecone, a service you can use for the vector database: https://docs.pinecone.io/docs/openai
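
A rough sketch of the retrieval idea described in those links, not this repo's code: embed each past message, then pull back only the few most relevant ones and prepend them to the prompt. The `memory`, `remember`, and `recall` names are hypothetical, and the legacy openai-python 0.x Embedding/ChatCompletion API is assumed.

```python
# Sketch: embeddings-based long-term memory with simple cosine-similarity retrieval.
import openai
import numpy as np

memory = []  # list of (text, embedding) pairs

def embed(text):
    resp = openai.Embedding.create(model="text-embedding-ada-002", input=[text])
    return np.array(resp["data"][0]["embedding"])

def remember(text):
    memory.append((text, embed(text)))

def recall(query, k=3):
    # Score every stored message by cosine similarity to the query.
    q = embed(query)
    scored = [
        (float(np.dot(q, e) / (np.linalg.norm(q) * np.linalg.norm(e))), t)
        for t, e in memory
    ]
    return [t for _, t in sorted(scored, reverse=True)[:k]]

def answer(query):
    # Only the retrieved snippets go into the prompt, keeping it small.
    context = "\n".join(recall(query))
    messages = [
        {"role": "system", "content": f"Relevant earlier conversation:\n{context}"},
        {"role": "user", "content": query},
    ]
    resp = openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=messages)
    return resp["choices"][0]["message"]["content"]
```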

emperorjoker commented 1 year ago

> I think this can also be done with embeddings. Another advantage is that they can store days' worth of conversations if you need long-term memory, and they can bring the cost of conversations down by up to 5x, depending on usage. I've been reading about ways to implement it from these sources

This seems like a much better and more elegant solution.