TiagoCortinhal / SalsaNext

Uncertainty-aware Semantic Segmentation of LiDAR Point Clouds for Autonomous Driving
MIT License
417 stars 102 forks

Slow inference with uncertainty #63

Open RaphaelLorenzo opened 3 years ago

RaphaelLorenzo commented 3 years ago

Hi,

I tried inferring labels with the pretrained model, and it worked great without uncertainty:

Network seq 00 scan 000000.label in 0.7649648189544678 sec
KNN Infered seq 00 scan 000000.label in 0.0006988048553466797 sec
Network seq 00 scan 000001.label in 0.049358367919921875 sec
KNN Infered seq 00 scan 000001.label in 0.0001277923583984375 sec
Network seq 00 scan 000002.label in 0.03960108757019043 sec
KNN Infered seq 00 scan 000002.label in 0.00012493133544921875 sec
Network seq 00 scan 000003.label in 0.03216409683227539 sec
KNN Infered seq 00 scan 000003.label in 0.00012087821960449219 sec
Network seq 00 scan 000004.label in 0.03187704086303711 sec
KNN Infered seq 00 scan 000004.label in 0.00012159347534179688 sec
Network seq 00 scan 000005.label in 0.03268098831176758 sec
KNN Infered seq 00 scan 000005.label in 0.0001220703125 sec
Network seq 00 scan 000006.label in 0.035898447036743164 sec
KNN Infered seq 00 scan 000006.label in 0.0001232624053955078 sec
Network seq 00 scan 000007.label in 0.03408312797546387 sec
KNN Infered seq 00 scan 000007.label in 0.0001232624053955078 sec
Network seq 00 scan 000008.label in 0.032814741134643555 sec
KNN Infered seq 00 scan 000008.label in 0.00012636184692382812 sec
Network seq 00 scan 000009.label in 0.0343012809753418 sec
KNN Infered seq 00 scan 000009.label in 0.0001857280731201172 sec

Using -u (which also requires a small fix, as mentioned here: https://github.com/Halmstad-University/SalsaNext/issues/12#issuecomment-885596970), it seems to work and generates the log_var and uncert label files, but it is much slower: more than 6 seconds per scan, i.e. roughly 200 times slower with the default 30 MC iterations.

Infered seq 00 scan 000000.label in 7.025984764099121 sec 7.025984764099121
Infered seq 00 scan 000001.label in 6.477065801620483 sec 6.751525282859802
Infered seq 00 scan 000002.label in 6.456561803817749 sec 6.653204123179118
Infered seq 00 scan 000003.label in 6.463520765304565 sec 6.60578328371048
Infered seq 00 scan 000004.label in 6.522738695144653 sec 6.589174365997314
Infered seq 00 scan 000005.label in 6.484813451766968 sec 6.571780880292256
Infered seq 00 scan 000006.label in 6.5031116008758545 sec 6.56197098323277
Infered seq 00 scan 000007.label in 6.512105464935303 sec 6.555737793445587
Infered seq 00 scan 000008.label in 6.457853555679321 sec 6.544861767027113
Infered seq 00 scan 000009.label in 6.514480829238892 sec 6.541823673248291
Infered seq 00 scan 000010.label in 6.4799792766571045 sec 6.536201455376365

I also reduced the number of Monte Carlo iterations with -mc 10 instead of the default 30, but inference still takes around 3 seconds per scan.
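
If I understand correctly, the -u path draws several stochastic forward passes (MC dropout) and averages them, so the network cost alone should scale with the sample count (30 × ~0.035 s ≈ 1 s here); the rest of the gap presumably comes from the extra per-sample post-processing. A minimal sketch of that pattern, purely for illustration (model, proj_in and the helper name are placeholders, not SalsaNext's actual API):

```python
# Illustrative MC-dropout inference loop (not SalsaNext's code; names are placeholders).
import torch

def mc_dropout_infer(model, proj_in, n_samples=30):
    model.eval()
    # Keep dropout layers stochastic at test time so each forward pass differs.
    for m in model.modules():
        if isinstance(m, (torch.nn.Dropout, torch.nn.Dropout2d)):
            m.train()
    probs = []
    with torch.no_grad():
        for _ in range(n_samples):  # n_samples full forward passes per scan
            probs.append(torch.softmax(model(proj_in), dim=1))
    probs = torch.stack(probs)      # (n_samples, batch, classes, H, W)
    return probs.mean(dim=0), probs.var(dim=0)  # averaged prediction, predictive variance
```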

Is there a particular reason for such a difference?

Gatsby23 commented 2 years ago

Sorry, I'm new to this topic. Could you please help me with how to run inference on the point cloud using the pretrained model? I have only downloaded the 00 sequence to my disk.
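
As far as I can tell, each scan under sequences/00/velodyne in the SemanticKITTI layout is just raw float32 points, something like the sketch below (the path is only an example), but I'm not sure how to feed them to the pretrained model:

```python
# Example of reading one raw SemanticKITTI scan; adjust the path to wherever
# sequence 00 was extracted.
import numpy as np

scan = np.fromfile("dataset/sequences/00/velodyne/000000.bin", dtype=np.float32)
points = scan.reshape(-1, 4)  # x, y, z, remission for each LiDAR return
print(points.shape)
```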

Xavier-wa commented 9 months ago

Same question! Did you solve it?