mims-harvard / TFC-pretraining

Self-supervised contrastive learning for time series via time-frequency consistency
https://zitniklab.hms.harvard.edu/projects/TF-C/
MIT License

Fine-tuning SleepEEG -> Epilepsy #2

Closed · RomanKoshkin closed this 1 year ago

RomanKoshkin commented 2 years ago

In the paper, you say that for the SleepEEG -> Epilepsy transfer you fine-tune on only 60 samples of the target data (Epilepsy). For how many epochs? Is this really correct, given that the classifier has 16,578 parameters? Also, for how many epochs did you pre-train the TF-C feature extractor on the SleepEEG data? One last question: I diffed this repo against the one at https://anonymous.4open.science/r/TFC-pretraining-6B07/README.md and they are identical (at least the code implementing the model, data augmentation, and loading). Do you have a more recent version of the code? I can't fine-tune on the 60 samples. Thanks!
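For context on that parameter count: a two-layer head of shape 256 -> 64 -> 2 comes to exactly 16,578 parameters. A minimal sketch; the 256-dimensional joint time-frequency embedding, 64 hidden units, and 2 classes are my reading of the repo's classifier, not confirmed here by the authors:

```python
import torch.nn as nn

# Hypothetical reconstruction of the fine-tuning head discussed above:
# a 256-d joint time-frequency embedding -> 64 hidden units -> 2 classes.
head = nn.Sequential(
    nn.Linear(256, 64),  # 256*64 + 64 = 16,448 parameters
    nn.ReLU(),
    nn.Linear(64, 2),    # 64*2 + 2 = 130 parameters
)

print(sum(p.numel() for p in head.parameters()))  # 16578
```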

dongjiaxiang commented 2 years ago

Appendix E says a batch size of 64 and 40 training epochs were used.

RomanKoshkin commented 2 years ago

> Appendix E says a batch size of 64 and 40 training epochs were used.

Okay, but if the number of fine-tuning samples is 60 (Section 5.1, Setup, last line), how can you make batches of size 64? Sorry, I must be missing something.

dongjiaxiang commented 2 years ago

In config_files, you can see that the batch size is 60. One epoch means one batch in this setting (the number of fine-tuning samples, 60, is less than 64). But there are still some other questions: I think the code is not the latest version, because I can't get the same results when I fine-tune from the pre-trained checkpoint.
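This matches PyTorch's default DataLoader behavior: with 60 samples, `batch_size=64`, and `drop_last=False`, each epoch yields a single batch containing all 60 samples. A minimal sketch, with dummy tensors standing in for the Epilepsy fine-tuning set:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Dummy stand-in for the 60-sample Epilepsy fine-tuning set:
# 60 univariate series of 178 time steps (the Epilepsy segment length).
x = torch.randn(60, 1, 178)
y = torch.randint(0, 2, (60,))

loader = DataLoader(TensorDataset(x, y), batch_size=64, shuffle=True)

for xb, yb in loader:  # exactly one iteration per epoch
    print(xb.shape)    # torch.Size([60, 1, 178])
```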

XZhang97666 commented 1 year ago

> In config_files, you can see that the batch size is 60. One epoch means one batch in this setting (the number of fine-tuning samples, 60, is less than 64). But there are still some other questions: I think the code is not the latest version, because I can't get the same results when I fine-tune from the pre-trained checkpoint.

Same here. I ran the SleepEEG -> Epilepsy transfer, but the Precision, Recall, and F1 results are quite different from those reported in the paper.

943fansi commented 1 year ago

> Same here. I ran the SleepEEG -> Epilepsy transfer, but the Precision, Recall, and F1 results are quite different from those reported in the paper.

What results did you get?

XZhang97666 commented 1 year ago

> > Same here. I ran the SleepEEG -> Epilepsy transfer, but the Precision, Recall, and F1 results are quite different from those reported in the paper.
>
> What results did you get?

I got an F1 of around 65 across 5 random seeds.
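For anyone comparing numbers, a minimal sketch of a multi-seed evaluation loop like the one behind this figure; `run_finetune` is a hypothetical stand-in for the repo's fine-tuning entry point and returns dummy scores here just so the loop runs:

```python
import numpy as np
from sklearn.metrics import f1_score

def run_finetune(seed: int) -> float:
    """Hypothetical stand-in: fine-tune from the checkpoint under this
    seed and return the test F1. Dummy labels keep the sketch runnable."""
    rng = np.random.default_rng(seed)
    y_true = rng.integers(0, 2, size=200)
    y_pred = rng.integers(0, 2, size=200)
    return 100 * f1_score(y_true, y_pred, average="macro")

scores = [run_finetune(seed) for seed in range(5)]
print(f"F1 over 5 seeds: {np.mean(scores):.1f} +/- {np.std(scores):.1f}")
```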

xiangzhang1015 commented 1 year ago

Hi all,

I'm so sorry for the previous messy version. I was transitioning to a new job and didn't have time to clean up the code.

We have updated the TF-C implementation; it should now be free of the earlier bugs. Please see the "Updates on Jan 2023" section of the repo README for more details. In summary:

Hi @RomanKoshkin, to answer your questions specifically: