TiagoCortinhal / SalsaNext

Uncertainty-aware Semantic Segmentation of LiDAR Point Clouds for Autonomous Driving
MIT License

Questions about uncertainty results #45

Open · fcyeh opened this issue 3 years ago

fcyeh commented 3 years ago
  1. Is your SalsaNext result the same as SalsaNextUncertainty? I downloaded your pretrained model and ran inference on the KITTI validation set, sequence 08. At frame 0, SalsaNextUncertainty looks worse than SalsaNext. [screenshots: SalsaNext origin, SalsaNextUncertainty, uncert]

  2. What are your data and model uncertainty numbers? For the KITTI validation set, sequence 08: in the log_var folder, file 000000.label, the first 10 values are [-0.01334856, 0.00081099, -0.02349238, -0.02350008, -0.02083466, -0.0205034, -0.02259915, -0.01983254, -0.01969955, -0.00620079]; in the uncert folder, file 000000.label, the first 10 values are [0.2471824, 0.21661215, 0.38824964, 0.36705604, 0.38694522, 0.3721131, 0.35623956, 0.33999357, 0.33527, 0.3765529].

Are those numbers correct? I would expect the log_var values to be lower, since you set the data variance to 2e-7 (if log_var stores log(σ²), that would correspond to log(2e-7) ≈ -15.4, far below the -0.013 I see), and the uncert values to be lower as well, since the paper reports a model variance of roughly 1e-4.

  3. SalsaNextUncertainty outputs weird results on some frames, for example the KITTI validation set, sequence 08, frame 21: the predictions are all 0, and the uncert and log_var values are all NaN (see the sketch below for how I read the files). [screenshot: frame 21 prediction]
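For reference, this is roughly how I read the files. The paths are placeholders, and I am assuming the outputs are raw float32 dumps with one value per point:

```python
import numpy as np

# Placeholder paths to the SalsaNextUncertainty inference output;
# assumed to be raw float32 arrays with one value per point.
log_var = np.fromfile("log_var/000000.label", dtype=np.float32)
uncert = np.fromfile("uncert/000000.label", dtype=np.float32)
print("log_var[:10] =", log_var[:10])
print("uncert[:10]  =", uncert[:10])

# Frame 21, where the predictions are all zero:
for name in ("log_var", "uncert"):
    arr = np.fromfile(f"{name}/000021.label", dtype=np.float32)
    print(name, "NaN fraction:", np.isnan(arr).mean())
```

My guess is that the NaNs come from exp() overflowing on the predicted log-variance; clamping log_var before exponentiating (e.g. torch.clamp(log_var, -20, 20)) would confirm that, but I have not checked the code.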

Thank you!

TiagoCortinhal commented 3 years ago
  1. We reported the scores on the leaderboard without the uncertainty estimation, yes.
  2. I will need to check that. Last time we discussed the uncertainty I tested it and didn't see any issue. I will try to check this weekend =)
finnSartoris commented 2 years ago

Any updates on this?