transitive-bullshit / agentic

AI agent stdlib that works with any LLM and TypeScript AI SDK.
https://agentic.so
MIT License

Are conversations that exceed 4000 tokens automatically culled by oldest messages? #547

Closed · strich closed 1 year ago

strich commented 1 year ago

Describe the feature

Based on a cursory glance at the code, it seems that chatgpt-api doesn't cull old messages/text from conversations when they hit the token limit. Is this correct? If so, how do we manage this?

transitive-bullshit commented 1 year ago

Yes, any tokens that would push the prompt over the limit are automatically culled.

Ideally, this would be easier to customize, since you may want to do things like history summarization.

But this library is really aimed at being a simple wrapper around OpenAI's chat completion APIs, so the current logic is hard-coded.
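For anyone looking to handle this themselves, the general idea looks something like the sketch below. This is not the library's actual implementation; `ChatMessage`, `countTokens`, and `cullOldestMessages` are placeholder names for illustration, and you'd want a real tokenizer for accurate counts.

```ts
// Rough sketch (not chatgpt-api's actual code): drop the oldest messages
// until the estimated prompt size fits under the model's token budget.

interface ChatMessage {
  role: 'system' | 'user' | 'assistant'
  content: string
}

// Crude approximation (~4 chars per token); swap in a real tokenizer
// (e.g. a BPE encoder) for accurate counts.
function countTokens(text: string): number {
  return Math.ceil(text.length / 4)
}

function cullOldestMessages(
  messages: ChatMessage[],
  maxTokens = 4000
): ChatMessage[] {
  const result = [...messages]
  let total = result.reduce((sum, m) => sum + countTokens(m.content), 0)

  // Remove from the front (oldest) until we fit, always keeping the
  // most recent message so the prompt isn't emptied entirely.
  while (total > maxTokens && result.length > 1) {
    const removed = result.shift()!
    total -= countTokens(removed.content)
  }

  return result
}
```

You could swap the culling step for a summarization pass over the removed messages if you want to preserve long-range context instead of dropping it.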

Closing for now, as this is less of an issue and more of a question and/or feature request. Feel free to continue the conversation in our Discord: https://www.chatgpthackers.dev