chaytonmin / Occupancy-MAE

Official implementation of our TIV'23 paper: Occupancy-MAE: Self-supervised Pre-training Large-scale LiDAR Point Clouds with Masked Occupancy Autoencoders
Apache License 2.0

About dataset during training #10

Closed: Yontara closed this issue 2 years ago

Yontara commented 2 years ago

Did you use different data for pretraining and fine-tuning? If so, what percentage of the total data did you use for pretraining?

chaytonmin commented 2 years ago

> Did you use different data for pretraining and fine-tuning? If so, what percentage of the total data did you use for pretraining?

We use the same training data for pretraining and fine-tuning.
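The two-stage schedule described in the reply, with the same data reused in both stages, can be sketched as follows. This is a minimal illustration only; the function and key names are hypothetical placeholders and not the repo's actual API, and the "weights" are simulated by step counters rather than real model parameters.

```python
# Hypothetical sketch: the SAME training set is used first for
# self-supervised masked-occupancy pretraining, then for supervised
# fine-tuning. All names are illustrative, not Occupancy-MAE's API.

def pretrain(weights, dataset):
    # Stage 1: self-supervised reconstruction of masked occupancy.
    # Weight updates are simulated here by counting passes over the data.
    for _scan in dataset:
        weights["pretrain_steps"] += 1
    return weights

def finetune(weights, dataset):
    # Stage 2: initialize the downstream model from the pretrained
    # encoder and fine-tune on the same (now labeled) data.
    for _scan in dataset:
        weights["finetune_steps"] += 1
    return weights

training_set = ["scan_%03d" % i for i in range(100)]  # placeholder LiDAR scans

weights = {"pretrain_steps": 0, "finetune_steps": 0}
weights = pretrain(weights, training_set)   # 100% of the training split
weights = finetune(weights, training_set)   # the same split again
print(weights["pretrain_steps"], weights["finetune_steps"])
```

The key point is simply that no held-out portion of the training split is reserved for pretraining: both stages iterate over the full training set.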