norlab-ulaval / libpointmatcher

An Iterative Closest Point (ICP) library for 2D and 3D mapping in Robotics
BSD 3-Clause "New" or "Revised" License

Organized point clouds #376

Open YoshuaNava opened 4 years ago

YoshuaNava commented 4 years ago

(This issue continues the discussion started in #342 about organized point clouds)

Organized point clouds are an attractive way to represent 3D point sets coming from range sensors. Their main advantage is that they are akin to matrices, and thus easier to traverse. Furthermore, the neighboring set of each point is known a priori, so operations like normal computation could potentially skip the K-NN step, reducing computation time and the CPU/memory cost of building a K-D tree.
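To make the last point concrete, here is a minimal sketch of grid-based normal estimation: since the four grid neighbors of a point are known from the organization, a surface normal can be taken as the cross product of the horizontal and vertical tangents, with no K-NN query at all. All names here are illustrative, not libpointmatcher API.

```cpp
#include <array>
#include <cmath>

using Vec3 = std::array<double, 3>;

// Cross product of two 3D vectors.
Vec3 cross(const Vec3& a, const Vec3& b) {
    return {a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]};
}

// Hypothetical sketch: estimate a normal from the grid 4-neighborhood
// (left/right along a row, up/down along a column) instead of K-NN.
Vec3 normalFromGridNeighbors(const Vec3& left, const Vec3& right,
                             const Vec3& up, const Vec3& down) {
    const Vec3 du = {right[0] - left[0], right[1] - left[1], right[2] - left[2]};
    const Vec3 dv = {down[0] - up[0], down[1] - up[1], down[2] - up[2]};
    Vec3 n = cross(du, dv);
    const double norm = std::sqrt(n[0] * n[0] + n[1] * n[1] + n[2] * n[2]);
    if (norm > 0.0) { n[0] /= norm; n[1] /= norm; n[2] /= norm; }
    return n;
}
```

In a real filter one would still need to handle missing neighbors (NaN cells) and degenerate tangents, but the key saving stands: neighborhood lookup is O(1) instead of a tree query.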

Existing implementations of organized point clouds (pcl, sensor_msgs/PointCloud2) represent the range measurements in a flat data structure akin to a matrix, where each cell corresponds to a range sensor bin; the sensor's FOV is thus discretized horizontally and vertically. Missing points are usually filled with NaN.
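A rough sketch of that flat layout, assuming a row-major width × height grid with NaN marking empty bins (types and names are illustrative, not the actual pcl or PointCloud2 definitions):

```cpp
#include <cmath>
#include <cstddef>
#include <limits>
#include <vector>

struct Point { float x, y, z; };

// Hypothetical organized cloud: one cell per sensor bin, stored row-major,
// missing returns filled with quiet NaN as in pcl / sensor_msgs/PointCloud2.
struct OrganizedCloud {
    std::size_t width, height;
    std::vector<Point> data;  // size == width * height

    OrganizedCloud(std::size_t w, std::size_t h)
        : width(w), height(h),
          data(w * h, {std::numeric_limits<float>::quiet_NaN(),
                       std::numeric_limits<float>::quiet_NaN(),
                       std::numeric_limits<float>::quiet_NaN()}) {}

    // Matrix-style access: traverse by (row, col) instead of a 1D scan.
    Point& at(std::size_t row, std::size_t col) {
        return data[row * width + col];
    }

    static bool isValid(const Point& p) {
        return !std::isnan(p.x) && !std::isnan(p.y) && !std::isnan(p.z);
    }
};
```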

I'm opening this issue to discuss how organized point clouds could be represented within the framework of libpointmatcher. Some points for discussion I can think of right now:

YoshuaNava commented 3 years ago

I'm currently working on speeding up surface normals computation on DataPoints structures. I'm planning to implement an 'Index Image', a structure that maps the cells of a 2D matrix (an organized point cloud grid) to 1D element indices (DataPoints columns). This approach is inspired by Serafin et al.
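A minimal sketch of what such an index image could look like, assuming each grid cell stores the column index of its point in the DataPoints matrix, with -1 for empty cells (names are illustrative, not the actual libpointmatcher implementation):

```cpp
#include <vector>

// Hypothetical 'Index Image': a row-major 2D grid whose cells hold the 1D
// index of the corresponding point in a DataPoints matrix, or -1 where the
// sensor returned nothing. The flat DataPoints layout stays untouched;
// the grid only adds a neighborhood structure on top of it.
struct IndexImage {
    int rows, cols;
    std::vector<int> indices;  // -1 marks an empty cell

    IndexImage(int r, int c) : rows(r), cols(c), indices(r * c, -1) {}

    int& at(int row, int col) { return indices[row * cols + col]; }

    // Collect the valid 4-neighborhood of a cell; for normal estimation
    // these indices would replace the output of a K-NN query.
    std::vector<int> neighbors4(int row, int col) {
        std::vector<int> out;
        const int dr[] = {-1, 1, 0, 0};
        const int dc[] = {0, 0, -1, 1};
        for (int k = 0; k < 4; ++k) {
            const int r = row + dr[k], c = col + dc[k];
            if (r < 0 || r >= rows || c < 0 || c >= cols) continue;
            const int idx = at(r, c);
            if (idx >= 0) out.push_back(idx);
        }
        return out;
    }
};
```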

I consider it a better option than depth images / depth buffers: it seems more general and straightforward to build the mapping in 3D space with a direction table than to project point clouds from arbitrary sensors (for example, dome LiDARs) into depth images.

We plan to use this as a means to speed up feature sampling filters, initially.