facebookresearch / mae

PyTorch implementation of MAE: https://arxiv.org/abs/2111.06377

Is it possible to enable FP16 or TF32 in pretraining? #167

Open Wongboo opened 1 year ago

Wongboo commented 1 year ago

Is it possible to enable FP16 or TF32 in pretraining?

forever208 commented 5 months ago

I have the same question. Does anyone know?
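
For reference, below is a minimal sketch of how TF32 and FP16 mixed precision are typically enabled in PyTorch. The model, optimizer, and loss are placeholders for illustration, not code from this repo; adapting it to the MAE pretraining loop is left to the reader.

```python
import torch
import torch.nn as nn

# TF32 (Ampere or newer GPUs): lets FP32 matmuls and convolutions run on
# tensor cores with reduced-precision mantissa. Both flags are standard
# PyTorch settings.
torch.backends.cuda.matmul.allow_tf32 = True
torch.backends.cudnn.allow_tf32 = True

# FP16 via automatic mixed precision (torch.cuda.amp). Everything below is a
# stand-in training loop, not the repo's main_pretrain.py.
device = "cuda"
model = nn.Sequential(nn.Linear(768, 768), nn.GELU(), nn.Linear(768, 768)).to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=1.5e-4)
scaler = torch.cuda.amp.GradScaler()

for step in range(10):
    x = torch.randn(64, 768, device=device)   # dummy batch
    optimizer.zero_grad()
    with torch.cuda.amp.autocast():            # forward + loss in mixed precision
        loss = (model(x) - x).pow(2).mean()    # dummy reconstruction-style loss
    scaler.scale(loss).backward()              # scale loss to avoid FP16 underflow
    scaler.step(optimizer)                     # unscale gradients, then step
    scaler.update()                            # adjust the loss scale
```

TF32 only changes how FP32 math is executed on the GPU and needs no code changes beyond the two flags; FP16 requires wrapping the forward pass in autocast and using a GradScaler so that small gradients are not flushed to zero.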