Nightmare-n / GD-MAE

GD-MAE: Generative Decoder for MAE Pre-training on LiDAR Point Clouds (CVPR 2023)
Apache License 2.0
Loss fluctuation during pretraining #21

Open sinatayebati opened 8 months ago

sinatayebati commented 8 months ago

After pretraining the model on KITTI and plotting the loss, I noticed large fluctuations between roughly 0.1 (min) and 0.8 (max); the loss neither stabilizes nor saturates. Is this a usual trend, and did you observe the same behavior during training?
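
For reference, here is a minimal sketch of how one might smooth the raw per-iteration loss to check whether the underlying trend is actually decreasing (the file name `pretrain_loss.txt` and its one-value-per-line format are hypothetical; adapt to however your training log is stored):

```python
# Minimal sketch: smooth a noisy per-iteration loss curve with a moving average
# to see whether the underlying trend is decreasing despite the fluctuations.
# Assumes raw loss values were dumped one per line to "pretrain_loss.txt" (hypothetical).
import numpy as np
import matplotlib.pyplot as plt

losses = np.loadtxt("pretrain_loss.txt")   # raw per-iteration loss values
window = 200                               # smoothing window, in iterations
kernel = np.ones(window) / window
smoothed = np.convolve(losses, kernel, mode="valid")

plt.plot(losses, alpha=0.3, label="raw loss")
plt.plot(np.arange(window - 1, len(losses)), smoothed,
         label=f"moving average (w={window})")
plt.xlabel("iteration")
plt.ylabel("loss")
plt.legend()
plt.show()
```

Per-iteration MAE losses can be noisy simply because each batch masks different points, so looking at a smoothed curve (or per-epoch averages) may give a clearer picture of convergence.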

Nightmare-n commented 7 months ago

Hi, I have trained the model on KITTI-360 before, and from what I remember the loss converged normally.