Closed: bsxxdw closed this issue 2 years ago.
Hi, it takes about 6.5 minutes (batch size = 2048) to train CAE-base for one epoch on 32 V100 GPUs. Training logs will be uploaded soon.
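For a rough back-of-the-envelope estimate of the full 800-epoch run from the per-epoch time quoted above (a sketch only, assuming the ~6.5 min/epoch rate holds throughout and ignoring warmup, evaluation, and checkpointing overhead):

```python
# Rough estimate of total pre-training time, not taken from the actual logs.
minutes_per_epoch = 6.5   # CAE-base, batch size 2048, 32 V100 GPUs (figure quoted above)
epochs = 800

total_minutes = minutes_per_epoch * epochs
total_hours = total_minutes / 60
print(f"~{total_hours:.1f} hours (~{total_hours / 24:.1f} days)")
# -> ~86.7 hours (~3.6 days)
```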
Thanks for your reply, and your great work!
@bsxxdw Training logs of base models are available: https://drive.google.com/drive/folders/1wwhg7nj2GQuU9uthVuQLkEEXEjx90G7g?usp=sharing.
I'm planning to train CAE on the ImageNet-1K dataset, but it takes about 1 hour to train CAE for one epoch, which is much longer than MAE (about 4 minutes per epoch). Is this normal? How long did it take you to train CAE for 800 epochs?