lxtGH / CAE

This is a PyTorch implementation of "Context AutoEncoder for Self-Supervised Representation Learning".

Will you provide CAE's training logs? #1

Closed bsxxdw closed 2 years ago

bsxxdw commented 2 years ago

I'm planning to train CAE on the ImageNet-1K dataset, but one epoch takes me about 1 hour, which is much longer than MAE (about 4 minutes per epoch). Is this normal? How long did it take you to train CAE for 800 epochs?

SelfSup-MIM commented 2 years ago

Hi, it takes about 6.5 minutes per epoch (batch size = 2048) to train CAE-base on 32 V100 GPUs. Training logs will be uploaded soon.
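For context, the reported epoch time implies a rough throughput and total training budget. This is a back-of-the-envelope sketch, assuming ImageNet-1K's ~1,281,167 training images and the 6.5 min/epoch figure above; actual wall-clock time will vary with data loading and hardware.

```python
# Rough throughput estimate from the numbers quoted above (assumptions, not measured).
NUM_IMAGES = 1_281_167   # ImageNet-1K training set size
EPOCH_MINUTES = 6.5      # reported time per epoch on 32 V100 GPUs
NUM_GPUS = 32
EPOCHS = 800

imgs_per_sec = NUM_IMAGES / (EPOCH_MINUTES * 60)   # aggregate throughput
per_gpu = imgs_per_sec / NUM_GPUS                  # per-GPU throughput
total_hours = EPOCHS * EPOCH_MINUTES / 60          # full pretraining budget

print(f"~{imgs_per_sec:.0f} img/s total, ~{per_gpu:.0f} img/s per GPU")
print(f"800 epochs ~= {total_hours:.0f} hours (~{total_hours / 24:.1f} days)")
```

By this estimate, a 1-hour epoch (as in the question above) would point to a data-loading or configuration bottleneck rather than expected model cost.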

bsxxdw commented 2 years ago

Thanks for your reply, and your great work!

SelfSup-MIM commented 2 years ago

@bsxxdw Training logs of base models are available: https://drive.google.com/drive/folders/1wwhg7nj2GQuU9uthVuQLkEEXEjx90G7g?usp=sharing.