Julie-tang00 / Point-BERT

[CVPR 2022] Pre-Training 3D Point Cloud Transformers with Masked Point Modeling
MIT License

Pre-training and Fine-tuning times #9

Closed eliahuhorwitz closed 2 years ago

eliahuhorwitz commented 2 years ago

Hey, thanks for publishing the code, these are indeed very impressive results! Could you please elaborate on how long the pre-training and fine-tuning took, and on what type of hardware?

Thanks!

yuxumin commented 2 years ago

Hi, thanks for your interest in our work. For pre-training, it takes about 22 hours on two Nvidia 2080Ti or 28 hours on one Nvidia 3090. For fine-tuning, it takes about 9 hours on one Nvidia 2080 Ti or 5 hours on one Nvidia 3090.

eliahuhorwitz commented 2 years ago

Thanks for the quick response! Do these times include the dVAE training, or are they just for the MPM part?

yuxumin commented 2 years ago

The 22 hours for pre-training covers only the MPM part. Training the dVAE takes another 20 hours on one Nvidia 2080Ti beforehand.

eliahuhorwitz commented 2 years ago

Thanks!