emperorjoker opened 1 year ago
Or should the earlier conversation be deleted automatically?
We will implement this later.
I think it could also be done with embeddings. Another advantage is that they can store days' worth of conversations, in case you need long-term memory, and can bring the cost of conversations down by roughly 5x depending on usage. I've been reading about ways to implement it from these sources:
Article: https://towardsdatascience.com/generative-question-answering-with-long-term-memory-c280e237b144
Implementation examples: https://github.com/openai/openai-cookbook/blob/main/examples/Question_answering_using_embeddings.ipynb
There's also a guide on Pinecone, a service you can use for the vector database: https://docs.pinecone.io/docs/openai
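A minimal sketch of the embeddings approach from the sources above: embed each past message, store the vectors, and retrieve the most similar ones when building a prompt. The `embed()` function here is a toy bag-of-words stand-in so the example is self-contained; in practice you would call the OpenAI embeddings endpoint or a Pinecone index as the guides describe.

```python
import math

def embed(text: str, dim: int = 64) -> list[float]:
    """Toy embedding: hash each word into a fixed-size, L2-normalized vector.
    Placeholder for a real embedding model (e.g. OpenAI embeddings)."""
    vec = [0.0] * dim
    for word in text.lower().split():
        vec[hash(word.strip(".,!?:")) % dim] += 1.0
    norm = math.sqrt(sum(x * x for x in vec)) or 1.0
    return [x / norm for x in vec]

def cosine(a: list[float], b: list[float]) -> float:
    # Vectors are already normalized, so the dot product is cosine similarity.
    return sum(x * y for x, y in zip(a, b))

class ConversationMemory:
    """Store past messages with their embeddings; recall the most relevant."""

    def __init__(self) -> None:
        self.entries: list[tuple[str, list[float]]] = []

    def add(self, text: str) -> None:
        self.entries.append((text, embed(text)))

    def recall(self, query: str, k: int = 2) -> list[str]:
        q = embed(query)
        ranked = sorted(self.entries, key=lambda e: cosine(q, e[1]), reverse=True)
        return [text for text, _ in ranked[:k]]

memory = ConversationMemory()
memory.add("We discussed deploying the bot to a VPS yesterday.")
memory.add("The user prefers short answers.")
memory.add("Lunch plans: noodles on Friday.")
print(memory.recall("What are the lunch plans?", k=1))
```

Only the recalled messages (plus the recent turns) go into the prompt, which is where the cost saving comes from: old context is fetched on demand instead of being resent on every request.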
This seems like a much better and more elegant solution.
When the limit is exceeded, the bot replies with the API error "This model's maximum context length is 4096 tokens. However, your messages resulted in 4119 tokens. Please reduce the length of the messages." followed by "Sorry, I am not feeling well. Please try again."
After exceeding the token limit, we can only run /clear to clear the conversation context and restore normal function.
Since this seems to be the only option, would it be possible to automatically run the /clear command when a message is sent after reaching the limit?
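One hedged sketch of that auto-recovery idea: instead of a full /clear, drop the oldest non-system messages until the history fits again. This assumes the OpenAI chat message format; `estimate_tokens()` is a crude character-based placeholder, and a real bot would count with tiktoken and trigger this on the API's context-length error.

```python
MAX_TOKENS = 4096  # model context limit from the error message above

def estimate_tokens(messages: list[dict]) -> int:
    """Crude estimate: ~1 token per 4 characters plus per-message overhead.
    Placeholder for an exact count with tiktoken."""
    return sum(len(m["content"]) // 4 + 4 for m in messages)

def trim_history(messages: list[dict], max_tokens: int = MAX_TOKENS) -> list[dict]:
    """Drop the oldest non-system messages until the history fits,
    so the user never has to run /clear manually."""
    messages = list(messages)  # don't mutate the caller's list
    while estimate_tokens(messages) > max_tokens and len(messages) > 1:
        # Preserve a leading system prompt, if any; drop the next-oldest message.
        drop_index = 1 if messages[0].get("role") == "system" else 0
        del messages[drop_index]
    return messages
```

An automatic /clear would be the degenerate case of this (drop everything), but trimming keeps the most recent turns, so the reply after recovery still has some context.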