Closed: cabbagecabbage closed this issue 3 weeks ago
Could you please let me know whether the recursive summarization method described in Section 2.2 of the MemGPT paper has been implemented to handle cases where the context exceeds a predefined length? For instance, if I set the maximum context length to 4k, then when total_tokens reaches 3k (75% of 4k), would 50% of the context be summarized, with the summary replacing the first half of the original context? This would shrink the context and relieve pressure on the LLM.

Yes, recursive summarization is implemented. You can see the implementation of the summarizer here: https://github.com/cpacker/MemGPT/blob/main/letta/agent.py#L993

You can configure the percentage of tokens that gets truncated here: https://github.com/cpacker/MemGPT/blob/main/letta/constants.py#L123C1-L123C33
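For readers who want a concrete picture of the mechanism being discussed, here is a minimal sketch of threshold-triggered summarization, assuming a 4k context window, a 75% trigger, and a 50% truncation fraction. Every name in it (`maybe_summarize`, `count_tokens`, `warning_frac`, `trunc_frac`, the toy summarizer) is illustrative only and is not the MemGPT/Letta API; the actual trigger and truncation constants live in the `letta/constants.py` file linked above, and the real summarizer is in `letta/agent.py`.

```python
# Hypothetical sketch of threshold-triggered summarization; not the MemGPT/Letta code.
from typing import Callable, List


def count_tokens(text: str) -> int:
    """Crude token estimate (~4 characters per token); a real agent would use a tokenizer."""
    return max(1, len(text) // 4)


def maybe_summarize(
    messages: List[str],
    summarize: Callable[[List[str]], str],
    context_window: int = 4096,
    warning_frac: float = 0.75,  # trigger once usage reaches 75% of the window
    trunc_frac: float = 0.5,     # summarize roughly the oldest 50% of the tokens
) -> List[str]:
    """If total tokens exceed warning_frac * context_window, replace the oldest
    trunc_frac of the tokens with a single summary message."""
    total = sum(count_tokens(m) for m in messages)
    if total <= int(warning_frac * context_window):
        return messages  # still under the trigger threshold, nothing to do

    # Walk from the oldest message until roughly trunc_frac of the tokens are collected.
    budget = max(1, int(total * trunc_frac))
    cut, used = 0, 0
    while cut < len(messages) - 1 and used < budget:
        used += count_tokens(messages[cut])
        cut += 1

    summary = summarize(messages[:cut])  # e.g. an LLM call that compresses the old turns
    return [f"[summary of {cut} earlier messages] {summary}"] + messages[cut:]


if __name__ == "__main__":
    # Toy summarizer: keep only the first few characters of each message.
    toy = lambda msgs: " / ".join(m[:20] for m in msgs)
    history = [f"message {i}: " + "lorem ipsum " * 50 for i in range(25)]
    compacted = maybe_summarize(history, toy)
    print(len(history), "->", len(compacted), "messages")
```

Because the summary message stays at the head of the history, the next time the threshold is crossed it gets folded into the new summary along with later messages, which is what makes the process recursive in the sense of the paper.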