tsb0601 / i-CTRL


Inquiry about "model-epoch1000.pt" in Your Paper on Incremental Learning #2

Open TuWei6 opened 4 months ago

TuWei6 commented 4 months ago

Hi Shengbang,

I hope this message finds you well! My name is Zhengji Li, and I am a first-year graduate student at Dalian University of Technology, currently working on continual learning. I recently came across your excellent paper, "Incremental Learning of Structured Memory via Closed-Loop Transcription," and I must say I'm thoroughly impressed. Your work is incredibly inspiring!

I'm reaching out to you regarding a specific aspect of your paper: the pre-trained file "model-epoch1000.pt" mentioned in your code repository. Based on my understanding of both the paper and the provided code, I attempted to replicate your results. To do this, I ran the first task for 1000 epochs without loading the pre-trained file, generating my own version of "model-epoch1000.pt." However, when I substituted this file for the original one in your code, the results did not match those reported in your paper.
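For context, my replication attempt followed the usual PyTorch checkpoint round-trip. The sketch below is generic and not your actual training script (the model, optimizer, and checkpoint keys are placeholders I chose for illustration); it just shows how I saved my own "model-epoch1000.pt" after task 1 and verified that it reloads bit-for-bit before swapping it into your code:

```python
import torch
import torch.nn as nn

# Fix the seed so two runs of "task 1" start from identical weights.
torch.manual_seed(0)

# Placeholder network standing in for the actual i-CTRL model.
model = nn.Linear(8, 4)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# ... train on the first task for 1000 epochs ...

# Save everything needed to resume or compare runs.
torch.save({"epoch": 1000,
            "model": model.state_dict(),
            "optim": opt.state_dict()},
           "model-epoch1000.pt")

# Reload into a fresh instance and confirm the weights match exactly.
model2 = nn.Linear(8, 4)
ckpt = torch.load("model-epoch1000.pt")
model2.load_state_dict(ckpt["model"])
same = all(torch.equal(a, b)
           for a, b in zip(model.state_dict().values(),
                           model2.state_dict().values()))
print(same)
```

The round-trip itself checks out on my end, so the discrepancy seems to come from how the original checkpoint was trained rather than from saving/loading.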

I'm curious about the process you followed to train the original "model-epoch1000.pt" file. Did you use any specific settings, hyperparameters, or additional data that might not be immediately apparent from the paper or the code repository?

I truly appreciate the effort and innovation you've put into this research, and any insights you could share would be incredibly valuable to my work. Also, it's motivating to see your accomplishments, and it makes me look forward to the future of my own research career.

Thank you so much for your time, and I look forward to your response.

Best regards, Zhengji Li Graduate Student, Dalian University of Technology

lllzt47 commented 1 month ago

Hello, I've encountered the same issue. Have you resolved it in the end?

TuWei6 commented 1 month ago

> Hello, I've encountered the same issue. Have you resolved it in the end?

Unfortunately, I haven't. It seems that I don't quite understand how this model was pre-trained. Feel free to reach out to me at lzj0130@gmail.com if you'd like to talk more about it.