signebedi / gptty

ChatGPT wrapper in your TTY

[model] allow `forgetting` older conversations within same context when approaching token limit #36

Closed · signebedi closed this 1 year ago

signebedi commented 1 year ago

With the ChatCompletion method (#31), keyword tokenization (#25) doesn't really work: each element of the conversation is divided into its constituent parts, which renders frequency counters much less useful.

Instead, we should add a `forget_old_context` bool config that tells gptty.context:get_context to drop the oldest elements of a conversation as it approaches `max_context_length`.
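A minimal sketch of the idea, assuming a chronologically ordered message list and a whitespace word count as a stand-in tokenizer (neither the helper names nor the message format here are confirmed gptty internals):

```python
def count_tokens(text: str) -> int:
    # Crude placeholder for a real tokenizer: whitespace word count.
    return len(text.split())

def trim_context(messages: list[dict], max_context_length: int) -> list[dict]:
    """Forget the oldest messages until the conversation fits the budget."""
    trimmed = list(messages)
    total = sum(count_tokens(m["content"]) for m in trimmed)
    # Drop from the front (oldest first); always keep the newest message.
    while len(trimmed) > 1 and total > max_context_length:
        total -= count_tokens(trimmed.pop(0)["content"])
    return trimmed

# Example: a 6-"token" budget forces the first exchange to be forgotten.
history = [
    {"role": "user", "content": "hello there"},
    {"role": "assistant", "content": "hi, how can I help?"},
    {"role": "user", "content": "what is gptty?"},
]
print(trim_context(history, max_context_length=6))
```

When `forget_old_context` is false, get_context would simply fall back to its current behavior.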

signebedi commented 1 year ago

Honestly, we can just make this the default behavior (unless we want a config for this) by reversing the order in which we read lines in gptty.context:get_context...
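That reverse-read approach might look roughly like this (again assuming a line-per-message context and a placeholder token count, not actual gptty internals):

```python
def get_context_reversed(lines: list[str], max_context_length: int) -> list[str]:
    """Walk the conversation newest-first, stop once the budget is spent,
    then restore chronological order for the request."""
    kept: list[str] = []
    total = 0
    for line in reversed(lines):
        cost = len(line.split())  # placeholder token count
        if total + cost > max_context_length:
            break  # everything older than this line is forgotten
        kept.append(line)
        total += cost
    return list(reversed(kept))
```

Reading newest-first means the loop naturally keeps the most recent turns and silently drops whatever no longer fits, which is why it could be the default rather than an opt-in config.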