osrf / subt

This repository contains software for the virtual track of the DARPA SubT Challenge. Within this repository you will find Gazebo simulation assets, ROS interfaces, support scripts and plugins, and documentation needed to compete in the SubT Virtual Challenge.

Strange LIDAR returns for various robots #888

Open rsawtell opened 3 years ago

rsawtell commented 3 years ago

Looking through some of the robots, I've noticed that the LIDAR returns are sometimes wonky, typically (but not always) when part of the scan intersects the robot's own geometry. I haven't exhaustively checked them all, but below are a few for which it is particularly noticeable. For the UGVs the self-intersecting scan is somehow displaced from the actual robot geometry, and for the UAVs there is a weird box-shaped artifact.

CSIRO_DATA61_DTR: [screenshot: DTR]

EXPLORER_R2: [screenshot: explorer_r2]

MARBLE_QAV500: [screenshot: qav500]

CERBERUS_Gagarin: [screenshot: gagarin]

nkoenig commented 3 years ago

The boxes seen in the gagarin and qav500 are an effect of the minimum range associated with the laser sensors, which is currently set to 0.8 meters.

The returns for EXPLORER_R2 match the robot model in simulation. The robot description seen in rviz is not an accurate reflection of the physical robot.

The strange returns for CSIRO_DATA61_DTR are likewise an effect of its minimum range, which is 0.3 meters.
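
For reference, the minimum (and maximum) range is a per-sensor setting in each robot's SDF. Below is a minimal sketch of where that value lives, with illustrative element values rather than ones copied from a specific SubT model (depending on the SDF version the inner element may be `<ray>` instead of `<lidar>`):

```xml
<sensor name="front_laser" type="gpu_lidar">
  <lidar>
    <scan>
      <horizontal>
        <samples>440</samples>
        <min_angle>-3.1416</min_angle>
        <max_angle>3.1416</max_angle>
      </horizontal>
    </scan>
    <range>
      <!-- Returns closer than <min> are treated as no-return.
           Values here are placeholders, not a specific robot's config. -->
      <min>0.8</min>
      <max>100.0</max>
      <resolution>0.01</resolution>
    </range>
  </lidar>
</sensor>
```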

rsawtell commented 3 years ago

Is the minimum distance using the L1 norm, then? I would expect the minimum-range cutoff to form a spherical pattern, not a box.

nkoenig commented 3 years ago

I left out an important piece of information. The lidar is simulated using a camera with cube maps. We set the camera near clip distance to the laser minimum range. The effect is the box pattern you see.

A fix would require changing the internal camera implementation to use a smaller near-clip and then clip the ranges after the GPU rendering process.
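
A rough sketch of what that post-render clipping step could look like, written in Python with NumPy purely for illustration (the real change would live in the C++ GPU lidar code in ign-rendering; the function and variable names here are hypothetical):

```python
import numpy as np

def clip_ranges(raw_ranges, min_range, max_range):
    """Clamp GPU-rendered ranges to the sensor limits after rendering.

    raw_ranges is assumed to come from a render pass whose near clip
    distance is smaller than min_range, so nearby geometry is still
    rendered instead of being cut off by the camera frustum (the cause
    of the box pattern).
    """
    ranges = np.asarray(raw_ranges, dtype=float).copy()
    # Report out-of-range readings as +inf ("no return"); the real
    # implementation might use -inf below the minimum or drop them.
    ranges[ranges < min_range] = np.inf
    ranges[ranges > max_range] = np.inf
    return ranges
```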

malcolmst commented 3 years ago

I was going to ask the same question about the Marble HD2 lidar returns, but I think that answers it. Thanks for the info!

[screenshot: marble hd2 lidar]

peci1 commented 3 years ago

Couldn't it at least partly be caused by https://github.com/ignitionrobotics/ign-sensors/issues/128?

This is CTU_CRAS_NORLAB_SPOT_SENSOR_CONFIG_1:

Robot parallel with the world axes: [screenshot: spot_ok]

Robot about 30° to the world axes: [screenshot: spot_half_bad]

Robot 45° to the world axes: [screenshot: spot_all_bad]

Yellow points mark those closer than 0.2 m from the sensor.

peci1 commented 3 years ago

Ahh, some of the problems here could actually be consequences of https://github.com/ignitionrobotics/ign-sensors/issues/131. Mitigation should be simple: just crop the point cloud 1 cm after the minimum range and 1 cm before the maximum range.
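
A minimal sketch of that workaround on the consumer side, assuming the cloud arrives as an N×3 NumPy array of XYZ points in the sensor frame (the array layout and helper name are assumptions; the 1 cm margin is simply the value suggested above):

```python
import numpy as np

def crop_by_range(points, min_range, max_range, margin=0.01):
    """Keep only points strictly inside (min_range + margin, max_range - margin).

    points: (N, 3) array of XYZ coordinates in the sensor frame.
    Points sitting right at the range limits are the ones most likely
    to be clip-plane artifacts, so the margin removes them as well.
    """
    dist = np.linalg.norm(points, axis=1)  # Euclidean range of each point
    keep = (dist > min_range + margin) & (dist < max_range - margin)
    return points[keep]
```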

iche033 commented 3 years ago

Created ignitionrobotics/ign-rendering#356 to fix the incorrect near clip plane issue. Here are the point clouds I get with the above robots. The lidar points now form a spherical pattern at the bottom, and the incorrect points that didn't correspond to any robot geometry are gone.

[screenshots: csiro_data61_points, cerberus_gagarin_points, marble_qav500_points, marble_hd2_points]