Field-Robotics-Japan / UnitySensors

ROS/ROS2 enabled Sensor models (Assets) on Unity
Apache License 2.0

Lidar points are not accurate towards scene representation. #153

Open sriceumich opened 4 months ago

sriceumich commented 4 months ago

We are evaluating this as an alternative to our lidar simulator and are running into a few issues with the generated points. The lidar points do not accurately represent the scene and have a curve to them; points closer to the lidar curve more. The noise also does not look representative of a real lidar in the scene, which makes detection appear inaccurate, and adjusting the noise settings does not compensate for it. I tried the various lidar models provided and all of them exhibit this issue.

Curved floor with wide noise paths (screenshot from 2024-02-07 18-50-59).

Extra-thick points on what should be a wall (screenshot from 2024-02-07 17-37-05).

RyodoTanaka commented 4 months ago

@sriceumich Thank you for your issue. I think this problem comes from the depth buffer. In the implementation, we use the depth buffer image as raw data and transform that image into distance values (roughly the flow sketched below).

@Autumn60 If you have any opinion (mainly about the depth-buffer transform part), I would be glad to hear it.
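
For reference, a minimal sketch of that depth-buffer pipeline, not the repository's actual code: it assumes the depth camera writes scene depth into the red channel of its `targetTexture` (e.g., via a depth-writing shader), and the class and field names here are illustrative.

```csharp
using UnityEngine;

public class DepthReadbackSketch : MonoBehaviour
{
    // Illustrative assumption: this camera renders scene depth into the red
    // channel of its targetTexture (for example, via a depth-writing shader).
    public Camera depthCamera;
    private Texture2D readbackTexture;

    void Start()
    {
        RenderTexture rt = depthCamera.targetTexture;
        readbackTexture = new Texture2D(rt.width, rt.height, TextureFormat.RFloat, false);
    }

    float[] ReadDepthPixels()
    {
        // Copy the GPU-side RenderTexture into a CPU-readable Texture2D.
        RenderTexture.active = depthCamera.targetTexture;
        readbackTexture.ReadPixels(
            new Rect(0, 0, readbackTexture.width, readbackTexture.height), 0, 0);
        readbackTexture.Apply();
        RenderTexture.active = null;

        // Each red-channel value is a non-linear depth in [0, 1]; it still has
        // to be converted into a metric range for each laser direction.
        Color[] pixels = readbackTexture.GetPixels();
        float[] depth = new float[pixels.Length];
        for (int i = 0; i < pixels.Length; i++) depth[i] = pixels[i].r;
        return depth;
    }
}
```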

Autumn60 commented 4 months ago

@sriceumich Could you tell me which script file (RaycastLiDARSensor.cs or DepthBufferLiDARSensor.cs) you are using for LiDAR simulation?

sriceumich commented 4 months ago

@Autumn60 I am currently using the DepthBufferLiDARSensor for this, with the 32-beam configuration. The RaycastLiDARSensor presents its own set of challenges. We noticed some additional issues in the sensor this past week as we prepared for a paper: once we started moving the ego vehicle, the sensor began exhibiting broken detail the farther away the beams reach. We attempted to use the points for our methods, but with this noise we were unable to produce good results from them.

(screenshot)

Autumn60 commented 4 months ago

@sriceumich

As you say, DepthBufferLiDAR has poor accuracy at long distances. This is because the DepthBuffer values provided by Unity are not linear with respect to the actual distance.
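To illustrate that non-linearity, here is a hedged sketch of the usual linearization, assuming a conventional (non-reversed) perspective depth buffer with values in [0, 1]; it is not the code path used in DepthBufferLiDARSensor.cs.

```csharp
using UnityEngine;

// Hedged sketch: converting a non-linear perspective depth-buffer value into a
// metric range along a LiDAR ray. near/far are the depth camera's clip planes.
public static class DepthLinearizationSketch
{
    public static float DepthBufferToEyeDepth(float d, float near, float far)
    {
        // Perspective projection stores roughly 1/z, so the resolution per depth
        // step shrinks quickly with distance: long-range samples become coarse.
        return near * far / (far - d * (far - near));
    }

    public static float EyeDepthToRayDistance(float eyeDepth, Vector3 rayDirLocal)
    {
        // Eye depth is measured along the optical axis (+Z in camera space);
        // the range along an off-axis laser ray is longer by 1 / cos(theta).
        float cosTheta = Vector3.Dot(rayDirLocal.normalized, Vector3.forward);
        return eyeDepth / cosTheta;
    }
}
```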

Also, the lower the resolution of the RenderTexture used, the less accurate the point cloud. It may be difficult to provide enough pixels for a high-resolution LiDAR such as the VLS-128.
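
As a back-of-envelope illustration (the numbers are assumptions, not the repository's defaults), compare the depth image's angular resolution with a dense LiDAR's channel spacing:

```csharp
using System;

public static class ResolutionBudgetSketch
{
    public static void Main()
    {
        const float verticalFovDeg = 40f;  // assumed depth-camera vertical FOV
        const int   textureHeight  = 256;  // assumed RenderTexture height in pixels

        float degPerRow = verticalFovDeg / textureHeight;  // ~0.16 deg per pixel row
        Console.WriteLine($"Depth image: {degPerRow:F3} deg per row");

        // The VLS-128's finest vertical channel spacing is about 0.11 deg, so with
        // these settings several beams would be quantized onto the same pixel row.
    }
}
```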

Autumn60 commented 4 months ago

It is possible that the part that calculates the corresponding pixel index from the laser direction is wrong, but I don't know the exact cause yet. (Perhaps the calculation of the camera distortion matrix...? 🤔)

Autumn60 commented 4 months ago

Based on some experiments, I am convinced that the calculation to find the texture pixels from the laser direction is wrong.

https://github.com/Field-Robotics-Japan/UnitySensors/blob/master/Assets/UnitySensors/Runtime/Scripts/Sensors/LiDAR/DepthBufferLiDAR/DepthBufferLiDARSensor.cs#L93-L118
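
For context, this is roughly the kind of mapping that span performs, sketched here with a plain pinhole model and illustrative names; the repository's actual math (including any distortion handling) may differ, and that difference is where the suspected bug would live.

```csharp
using UnityEngine;

// Hedged sketch: project a laser direction (given in the depth camera's local
// frame, with +Z forward) onto the depth image and return a flattened pixel index.
public static class DirectionToPixelSketch
{
    public static int DirectionToPixelIndex(
        Vector3 dirCam, float verticalFovDeg, int width, int height)
    {
        // Focal length in pixels from the vertical field of view; square pixels assumed.
        float fy = 0.5f * height / Mathf.Tan(0.5f * verticalFovDeg * Mathf.Deg2Rad);
        float fx = fy;

        // Perspective projection: divide by the forward (z) component.
        // Assumes dirCam.z > 0 (the ray points into the camera's frustum) and
        // that v increases upward; the texture's row convention is an assumption.
        float u = fx * dirCam.x / dirCam.z + 0.5f * width;
        float v = fy * dirCam.y / dirCam.z + 0.5f * height;

        int px = Mathf.Clamp(Mathf.RoundToInt(u), 0, width - 1);
        int py = Mathf.Clamp(Mathf.RoundToInt(v), 0, height - 1);

        // Row-major index into the flattened depth image.
        return py * width + px;
    }
}
```

If the projection here disagrees with the repository's own direction-to-pixel math for off-axis rays, that would produce exactly the kind of curvature and smearing shown in the screenshots above.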