princeton-nlp / TRIME

[EMNLP 2022] Training Language Models with Memory Augmentation https://arxiv.org/abs/2205.12674

How can I use TRIME with ChatGPT to improve understanding of context? Any advice? #7

Closed: SeanXu1219 closed this issue 1 year ago

a3616001 commented 1 year ago

Hi, thanks for your interest! Sorry, I'm not sure I fully understand your question. It would be helpful if you could elaborate on what "improve understanding of context" means here!

SeanXu1219 commented 1 year ago

As we all know, ChatGPT (GPT-3.5) has a 4K-token context window, while GPT-4 has 8K/32K. I have implemented my own chat AI on top of the API plus embedding-based retrieval, but its performance in multi-turn conversations is not satisfactory. The TRIME memory-augmentation method inspired me, and I wonder whether it could replace the "recent N turns + embedding retrieval" approach (which is heavily limited by token length) to enhance memory and improve the performance of a private-domain chat AI in multi-turn conversations.
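
For concreteness, the retrieval setup I'm describing looks roughly like the sketch below. `embed` here is a toy stand-in for whatever embedding model or API one actually uses; it is not part of TRIME:

```python
import numpy as np

def embed(text: str, dim: int = 256) -> np.ndarray:
    """Toy hash-based bag-of-words encoder; swap in a real embedding model/API."""
    v = np.zeros(dim)
    for tok in text.lower().split():
        v[hash(tok) % dim] += 1.0
    return v

def retrieve_relevant_turns(history: list[str], query: str, k: int = 3) -> list[str]:
    """Return the k past turns most similar to the current query (cosine similarity)."""
    q = embed(query)
    q = q / (np.linalg.norm(q) + 1e-8)
    vecs = np.stack([embed(t) for t in history])
    vecs = vecs / (np.linalg.norm(vecs, axis=1, keepdims=True) + 1e-8)
    scores = vecs @ q
    top = np.argsort(scores)[::-1][:k]
    return [history[i] for i in sorted(top)]  # keep chronological order

history = [
    "user: my order number is 4417",
    "assistant: got it, order 4417.",
    "user: what's the weather like today?",
    "assistant: sunny, around 25C.",
]
# Only the retrieved turns are prepended to the prompt,
# keeping the prompt within the model's token limit.
print(retrieve_relevant_turns(history, "remind me of my order number", k=2))
```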

danqi commented 1 year ago

@SeanXu1219 For long-context modeling, I think you should look at the long-term memory variant of our approach, TRIME_long. I think it is a promising way to incorporate long-range context with proper training, although our paper only reports perplexity so far.
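
For anyone who wants the gist of the training objective: at each position, the next-token probability pools the standard output-embedding logit for the gold token with similarities to memory keys whose paired token matches it. Here is a minimal single-position sketch, simplified from the paper; the temperature `tau` and the plain dot-product similarity are assumptions of this sketch, and the repo contains the actual implementation:

```python
import torch

def trime_loss(h, target, out_emb, mem_keys, mem_values, tau=1.0):
    """Simplified single-position TRIME-style loss.

    h          : (d,)   hidden state at the prediction position
    target     : int    index of the gold next token
    out_emb    : (V, d) output token-embedding matrix
    mem_keys   : (M, d) hidden states stored in memory (local/long/external)
    mem_values : (M,)   token ids paired with each memory key
    tau        : temperature on memory similarities (an assumption here)
    """
    vocab_logits = out_emb @ h            # (V,) standard LM logits
    mem_logits = (mem_keys @ h) / tau     # (M,) similarities to memory keys
    # Denominator: all vocabulary logits plus all memory logits.
    log_denom = torch.logsumexp(torch.cat([vocab_logits, mem_logits]), dim=0)
    # Numerator: gold-token logit plus memory entries labeled with the gold token.
    pos = torch.cat([vocab_logits[target].unsqueeze(0),
                     mem_logits[mem_values == target]])
    log_num = torch.logsumexp(pos, dim=0)
    return log_denom - log_num            # negative log-likelihood

d, V, M = 16, 100, 8
loss = trime_loss(torch.randn(d), 7, torch.randn(V, d),
                  torch.randn(M, d), torch.randint(V, (M,)))
```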