wgcban / adamae

[CVPR'23] AdaMAE: Adaptive Masking for Efficient Spatiotemporal Learning with Masked Autoencoders
https://www.wgcban.com/research/adamae
MIT License
72 stars · 8 forks

Pre-trained models' epoch values don't match paper #2

Closed Salarios77 closed 1 year ago

Salarios77 commented 1 year ago

Hi, thanks for sharing the great work. When I loaded the provided pre-trained models and printed their 'epoch' values, I observed 1199 for SSv2 and 1899 for K400. Could you please confirm that the provided models correspond to 800 epochs of pre-training, as stated in the paper?
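For reference, the check described above can be sketched as follows. This is a minimal illustration, not the repo's own code: `checkpoint_epoch` is a hypothetical helper, and the demo saves a stand-in checkpoint dict rather than the real AdaMAE weights (which are downloaded separately).

```python
import torch

def checkpoint_epoch(path):
    """Return the value stored under the 'epoch' key of a checkpoint file."""
    # map_location='cpu' lets us inspect metadata without a GPU
    ckpt = torch.load(path, map_location="cpu")
    return ckpt.get("epoch")

# Demonstration with a stand-in checkpoint dict (hypothetical path);
# for the real check, point this at the downloaded SSv2/K400 .pth files.
torch.save({"epoch": 1199, "model": {}}, "/tmp/demo_ckpt.pth")
print(checkpoint_epoch("/tmp/demo_ckpt.pth"))  # -> 1199
```

Note that if the training loop saves a zero-based epoch counter, a stored value of 1199 would correspond to 1200 completed epochs, which still would not match the 800 stated in the paper.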