debanjandhar12 / logseq-chatgpt-plugin

A tightly integrated ChatGPT plugin for Logseq.
GNU Affero General Public License v3.0

Question about token limit implementation #29

Closed — sumergoconicio closed this 1 year ago

sumergoconicio commented 1 year ago

Discussed in https://github.com/debanjandhar12/logseq-chatgpt-plugin/discussions/28

Originally posted by **sumergoconicio** July 6, 2023

First off, I LOVE this plugin. Thank you for making it and sharing it for free. It has changed my workflows, and it is a tool I rely on daily nowadays.

I have a question about an error that arises after a long conversation. ChatGPT stops responding, either cutting off mid-sentence or refusing to process the request with an error that the token limit has been exceeded. My understanding is that the LLM only looks at the last 8,000 tokens of the conversation, right? So is there a hard limit on the length of a conversation (e.g. a maximum of ~20 back-and-forth chats within an 8,000-token budget), OR is there effectively no limit, since a rolling window of the last 8,000 tokens is used even if the overall conversation has grown beyond that limit?

[screenshot: token-limit error message]

I want to understand why I get this error, and whether it can be counteracted with any settings on my side or code updates on your side.
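To be clear about what I mean by a "rolling window": something like the sketch below, which keeps only the most recent messages that fit within a token budget. This is purely illustrative, not the plugin's actual code, and it uses a crude character-based token estimate (real implementations use a tokenizer such as tiktoken).

```typescript
type Message = { role: "user" | "assistant"; content: string };

// Rough token estimate: ~1 token per 4 characters (a common heuristic,
// not an exact count).
function estimateTokens(msg: Message): number {
  return Math.ceil(msg.content.length / 4);
}

// Walk the history backwards from the newest message, keeping messages
// until the token budget is exhausted; older messages are dropped.
function rollingWindow(history: Message[], maxTokens: number): Message[] {
  const kept: Message[] = [];
  let used = 0;
  for (let i = history.length - 1; i >= 0; i--) {
    const cost = estimateTokens(history[i]);
    if (used + cost > maxTokens) break;
    kept.unshift(history[i]);
    used += cost;
  }
  return kept;
}
```

With something like this in place, a conversation could in principle continue indefinitely, with the model simply losing sight of the oldest turns.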
debanjandhar12 commented 1 year ago

There is actually a rolling window, but it is broken at the moment due to a bug introduced while updating the langchain dependency.

I will fix that and do a release on the weekend.

debanjandhar12 commented 1 year ago

I believe the error is now fixed (v2.0.3). If the error reoccurs, please re-open the issue.

sumergoconicio commented 1 year ago

Thanks Deban!


sumergoconicio commented 1 year ago

Hi Deban

I believe there is still an error in the context-window implementation. In my last few chats, when the context limit is exceeded, the error message no longer pops up, BUT the responses become incoherent, with the bot diverging wildly from the actual topic under discussion in that chat.

It was a pretty weird error, actually: the bot answered as if it had been asked a prompt from one of my earlier chats, as if it had reverted to some cache to continue the conversation.