Is your SalsaNext result the same as SalsaNextUncertainty? I downloaded your pretrained models and ran inference on the KITTI validation set, sequence 08. SalsaNextUncertainty looks worse than SalsaNext at frame 0.
(Attached: prediction screenshots at frame 0, SalsaNext vs. SalsaNextUncertainty.)
What are your data and model uncertainty numbers?
For KITTI validation sequence 08:

In log_var/000000.label, the first 10 values are:
[-0.01334856, 0.00081099, -0.02349238, -0.02350008, -0.02083466, -0.0205034, -0.02259915, -0.01983254, -0.01969955, -0.00620079]

In uncert/000000.label, the first 10 values are:
[0.2471824, 0.21661215, 0.38824964, 0.36705604, 0.38694522, 0.3721131, 0.35623956, 0.33999357, 0.33527, 0.3765529]
Are those numbers correct? I would expect the log_var values to be much lower: if log_var is the log of the data variance and you set the data variance to 2e-7, the values should be around log(2e-7) ≈ -15.4, not close to 0 as above. I would also expect the uncert values to be lower, since the paper reports a model variance of roughly 1e-4.
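For reference, this is roughly how I read the files (a minimal sketch; the one-float32-per-point layout is my assumption, by analogy with the SemanticKITTI .label format):

```python
import numpy as np

# Assumption: the .label files in log_var/ and uncert/ hold one
# float32 per point in scan order (same layout as SemanticKITTI
# .label files, but float-valued instead of uint32).
log_var = np.fromfile("log_var/000000.label", dtype=np.float32)
uncert = np.fromfile("uncert/000000.label", dtype=np.float32)

print(log_var[:10])
print(uncert[:10])
```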
SalsaNextUncertainty also outputs strange results on some frames. For example, on KITTI validation sequence 08, frame 21: the predictions are all 0, and the uncert and log_var values are all NaN.
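A quick way to check how many frames in the sequence are affected (same float32-per-point read assumption as above):

```python
import numpy as np
from pathlib import Path

# Scan the whole sequence for broken frames; prints every frame
# whose uncertainty output is entirely NaN.
for path in sorted(Path("uncert").glob("*.label")):
    u = np.fromfile(str(path), dtype=np.float32)
    if np.isnan(u).all():
        print(path.name, "is all NaN")
```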
Thank you!