mims-harvard / TFC-pretraining

Self-supervised contrastive learning for time series via time-frequency consistency
https://zitniklab.hms.harvard.edu/projects/TF-C/
MIT License
440 stars 81 forks

Availability of pre-trained weights #13

Closed angy50 closed 1 year ago

angy50 commented 1 year ago

Thanks for sharing your work. Could you please clarify whether the pre-trained weights are available? If they are, could you share the link? If not, are you planning to release them? Thanks in advance.

xiangzhang1015 commented 1 year ago

Hi,

We just released an example of a pre-trained model. The model weights can be found at `TFC-pretraining/code/experiments_logs/SleepEEG_2_Epilepsy/run1/pre_train_seed_42_2layertransformer/saved_models/ckp_last.pt`. However, please note that, for quick debugging, the model is pre-trained on only a subset of SleepEEG (1280 samples, less than 1% of the whole dataset). This example shows what the model looks like, but it cannot be expected to transfer broadly (unlike the large pre-trained models in CV and NLP). We currently don't plan to release weights pre-trained on the full dataset; we will update this repo if that changes.
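
For anyone who wants to try the example checkpoint, here is a minimal sketch of how it could be inspected and loaded with PyTorch. The path relative to the repo root and the structure of the saved object (plain state dict vs. a dict with a key such as `"model_state_dict"`) are assumptions; please check the saving code in the repo for the exact keys.

```python
import torch

# Assumed path of the released example checkpoint (relative to the repo root).
ckpt_path = (
    "code/experiments_logs/SleepEEG_2_Epilepsy/run1/"
    "pre_train_seed_42_2layertransformer/saved_models/ckp_last.pt"
)

# Load on CPU so no GPU is required just to inspect the file.
checkpoint = torch.load(ckpt_path, map_location="cpu")

# Inspect what the checkpoint contains before trying to restore it.
print(type(checkpoint))
if isinstance(checkpoint, dict):
    print(list(checkpoint.keys()))

# If it is a plain state dict, it can be loaded into the TFC model defined
# in the repo, e.g. (hypothetical, depends on how the checkpoint was saved):
# model = TFC(configs)
# model.load_state_dict(checkpoint)                      # plain state dict
# model.load_state_dict(checkpoint["model_state_dict"])  # nested dict variant
```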