ouster-lidar / ouster-sdk

Ouster, Inc. sample code

How to handle self-obstruction? Or how to reproject 3D points into the 2D scan to create a mask #577

Closed mathislm closed 8 months ago

mathislm commented 8 months ago

Description of my question

Hello, I'm trying to remove points from the LiDAR scan that belong to my robot. I know exactly where my robot is w.r.t. the LiDAR's sensor coordinate frame (in 3D). With a pinhole camera I could just use the intrinsic projection matrix to figure out where the robot's shape should be in the image.
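(For reference, a minimal sketch of that pinhole projection, with illustrative names that are not part of the SDK: a 3D point in the camera frame maps to pixel coordinates through the intrinsic parameters fx, fy, cx, cy.)

// Illustrative pinhole projection, not SDK code: map a 3D point in the
// camera frame (z > 0) to pixel coordinates using intrinsics fx, fy, cx, cy.
struct PixelUV { double u, v; };

PixelUV pinholeProject(double x, double y, double z,
                       double fx, double fy, double cx, double cy) {
    // Perspective divide, then scale and offset by the intrinsics.
    return {fx * x / z + cx, fy * y / z + cy};
}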

From my understanding, the equivalent of the intrinsic matrix is the XYZLut lookup table produced by make_xyz_lut; however, I'm not sure how I can use it to project 3D points into the LiDAR scan.

How could I generate a mask from the estimate of where my robot is w.r.t. the LiDAR, so that the robot itself is not added to the map?


Samahu commented 8 months ago

Is the lidar mounted on the robot itself, or is it mounted to a fixed object separate from the robot?

mathislm commented 8 months ago

It is mounted on the robot itself

Samahu commented 8 months ago

If I understood your question correctly, you want to project the points of your robot onto the 2D range image and see where they land, so that you can use the result as a mask for other 2D sources.

If you are looking for an intrinsic matrix, I think the closest to what you need would be the matrix saved in sensor_info::beam_to_lidar_transform. It doesn't fully match a camera intrinsic, though, because the lenses are mounted with some offset from the lidar origin. You can learn more about it here. If you use the beam_to_lidar_transform matrix and account for the lens offset, you should be able to project any 3D point onto a 2D surface.
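As a rough illustration of that projection, here is a minimal, non-SDK sketch that maps a 3D point in the lidar frame to approximate (row, col) coordinates of the range image. It assumes uniformly spaced azimuth columns and a nearest-beam row lookup, and it ignores the beam-to-lidar offset; exact sign conventions and column staggering depend on the sensor configuration, so treat it as a starting point only.

#include <cmath>
#include <cstddef>
#include <vector>

struct ScanPixel { int row; int col; };

// Map a 3D point (lidar frame) to approximate range-image coordinates.
// beam_altitude_deg is the per-beam altitude list from the sensor metadata
// (e.g. sensor_info::beam_altitude_angles, in degrees); scan_width is the
// number of columns per rotation (e.g. 1024).
ScanPixel projectToScan(double x, double y, double z,
                        const std::vector<double>& beam_altitude_deg,
                        int scan_width) {
    const double kPi = 3.14159265358979323846;

    // Column from the azimuth angle, wrapped into [0, scan_width).
    double azimuth = std::atan2(y, x);
    if (azimuth < 0.0) azimuth += 2.0 * kPi;
    int col = static_cast<int>(azimuth / (2.0 * kPi) * scan_width) % scan_width;

    // Row from the elevation angle: pick the beam with the closest altitude.
    double elevation = std::atan2(z, std::hypot(x, y)) * 180.0 / kPi;
    int row = 0;
    double best = std::abs(beam_altitude_deg[0] - elevation);
    for (std::size_t i = 1; i < beam_altitude_deg.size(); ++i) {
        double d = std::abs(beam_altitude_deg[i] - elevation);
        if (d < best) { best = d; row = static_cast<int>(i); }
    }
    return {row, col};
}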

Hope this helps!

mathislm commented 8 months ago

This is not exactly what I'm trying to do. I figured it out by reading the function that computes the LUT.

For others that might be looking to do the same thing: I basically looped over all the ray directions computed by the LUT and removed (set to zero) each one that pointed towards the shape of my robot.

// Build the per-pixel ray-direction lookup table from the sensor metadata.
ouster::XYZLut lut = ouster::make_xyz_lut(info_);

// Zero out every ray direction that hits the robot's own geometry.
// (Range-based iteration over rowwise() requires Eigen 3.4+.)
for (auto rayDirection : lut.direction.rowwise()) {
    if (rayIntersectingRobot(rayDirection)) {
        rayDirection.setZero();
    }
}

After that, I filter out all the points in the cloud that are (0, 0, 0). I used some basic raycasting against planes to decide whether a ray hits my robot. It's probably not optimal, but it works fine for me.
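For completeness, here is a minimal sketch of what a rayIntersectingRobot helper could look like. It is hypothetical, not the author's code: instead of ray-plane tests, it uses a slab test against an assumed axis-aligned bounding box around the robot in sensor coordinates, with made-up box extents.

#include <Eigen/Dense>
#include <algorithm>
#include <cmath>
#include <limits>
#include <utility>

// Hypothetical helper: slab test of a ray from the sensor origin against an
// axis-aligned bounding box around the robot (sensor frame). The box extents
// below are placeholders; substitute your robot's actual bounds.
template <typename Derived>
bool rayIntersectingRobot(const Eigen::ArrayBase<Derived>& ray) {
    const double dir[3] = {ray(0), ray(1), ray(2)};
    const double box_min[3] = {-0.6, -0.4, -1.2};  // assumed bounds (metres)
    const double box_max[3] = {0.6, 0.4, -0.1};

    double t_near = 0.0;  // only intersections in front of the sensor count
    double t_far = std::numeric_limits<double>::infinity();
    for (int i = 0; i < 3; ++i) {
        if (std::abs(dir[i]) < 1e-12) {
            // Ray parallel to this slab: the origin must lie inside it.
            if (box_min[i] > 0.0 || box_max[i] < 0.0) return false;
        } else {
            double t0 = box_min[i] / dir[i];  // ray origin is (0, 0, 0)
            double t1 = box_max[i] / dir[i];
            if (t0 > t1) std::swap(t0, t1);
            t_near = std::max(t_near, t0);
            t_far = std::min(t_far, t1);
            if (t_near > t_far) return false;
        }
    }
    return true;
}

A box test like this is conservative but cheap to evaluate once per beam direction; ray-plane tests against the robot's actual faces, as used above, can fit the shape more tightly.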

Best,