chaytonmin / Occupancy-MAE

Official implementation of our TIV'23 paper: Occupancy-MAE: Self-supervised Pre-training Large-scale LiDAR Point Clouds with Masked Occupancy Autoencoders
Apache License 2.0

Overly smooth training loss #33

Open huixiancheng opened 10 months ago

huixiancheng commented 10 months ago

Hi, Min @chaytonmin

I visualized the training loss in TensorBoard, and it looks too smooth and almost batch-independent. I'm not sure this is correct; would you like to share your insights?

[image: TensorBoard training-loss curve]
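One common cause of an overly smooth, batch-independent loss curve (offered here as a hypothesis, not a claim about this repo's code) is logging a running average of the loss instead of the raw per-batch value. The hypothetical sketch below contrasts the two on synthetic losses; `roughness` is an ad-hoc smoothness measure, not anything from the project:

```python
import random

random.seed(0)

# Synthetic per-batch losses: a slow decay plus batch-to-batch noise.
batch_losses = [1.0 / (1 + 0.01 * i) + random.uniform(-0.1, 0.1)
                for i in range(500)]

# Cumulative running average of the same losses -- what gets plotted
# if a training loop accidentally logs its averaged loss meter.
running_avg = []
total = 0.0
for i, loss in enumerate(batch_losses, start=1):
    total += loss
    running_avg.append(total / i)

def roughness(xs):
    """Mean absolute step-to-step change: a crude smoothness measure."""
    return sum(abs(a - b) for a, b in zip(xs, xs[1:])) / (len(xs) - 1)

print(roughness(batch_losses))  # noisy raw curve
print(roughness(running_avg))   # far smoother averaged curve
```

If the curve in the screenshot resembles the second series, it may be worth checking whether the value written to TensorBoard is a per-batch loss or an epoch/running average.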