microsoft / Azure-Kinect-Sensor-SDK

A cross platform (Linux and Windows) user mode SDK to read data from your Azure Kinect device.
https://Azure.com/Kinect
MIT License

How to realize real-time 3D point cloud reconstruction using three azure kinects #1778

Open NotFound157 opened 2 years ago

NotFound157 commented 2 years ago

How can I achieve real-time 360° 3D point cloud reconstruction using three Azure Kinects with KinectFusion-style technology? Current thinking:

  1. First, capture the target's color and depth images from the multiple Kinects simultaneously, keeping the devices synchronized over a 3.5 mm audio cable.
  2. Place the three Kinects at 120° to each other, use the calibration tool in MATLAB to obtain the intrinsic and extrinsic parameters of each Kinect, and calibrate the Kinects in pairs to establish the transformations between the three coordinate systems. Then align and superimpose the corresponding point cloud data and apply simple filtering.
  3. Use the iterative closest point (ICP) algorithm to stitch the point clouds from the three viewpoints, and use a KD-tree to eliminate redundant data, yielding a more complete point cloud model and achieving real-time reconstruction.

Has anyone done this?
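The merging in step 2 can be sketched in pure Python. Everything here is illustrative, not the actual pipeline: the 4×4 extrinsics, voxel size, and function names are assumptions, and a simple voxel hash grid stands in for the KD-tree redundancy check mentioned in step 3:

```python
import math

def apply_extrinsic(points, T):
    """Apply a 4x4 rigid transform T (row-major nested lists) to a list of
    (x, y, z) points, mapping one camera's cloud into the common frame."""
    out = []
    for x, y, z in points:
        out.append((
            T[0][0] * x + T[0][1] * y + T[0][2] * z + T[0][3],
            T[1][0] * x + T[1][1] * y + T[1][2] * z + T[1][3],
            T[2][0] * x + T[2][1] * y + T[2][2] * z + T[2][3],
        ))
    return out

def merge_and_dedup(clouds, extrinsics, voxel=0.005):
    """Merge per-camera clouds into one frame, then drop redundant points
    that fall into an already-occupied voxel (a hash-grid stand-in for the
    KD-tree nearest-neighbor test; voxel size in meters is illustrative)."""
    seen, merged = set(), []
    for pts, T in zip(clouds, extrinsics):
        for p in apply_extrinsic(pts, T):
            key = tuple(int(math.floor(c / voxel)) for c in p)
            if key not in seen:
                seen.add(key)
                merged.append(p)
    return merged
```

With the three pairwise-calibrated extrinsics from MATLAB, each camera's cloud would be passed through `merge_and_dedup` once per frame; overlapping regions between adjacent cameras collapse to a single set of points.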
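The ICP stitching in step 3 can be illustrated with a deliberately simplified, translation-only variant (full ICP also estimates rotation, typically via SVD over matched pairs, and would use a KD-tree instead of the brute-force nearest-neighbor scan below; all names here are hypothetical):

```python
def nearest(p, cloud):
    """Brute-force nearest neighbor; a KD-tree would replace this
    linear scan in a real-time pipeline."""
    return min(cloud, key=lambda q: sum((a - b) ** 2 for a, b in zip(p, q)))

def icp_translation(src, dst, iters=20):
    """Translation-only ICP sketch: repeatedly match each source point to
    its nearest destination point, then shift the whole source cloud by
    the mean residual until the update becomes negligible."""
    src = [list(p) for p in src]
    for _ in range(iters):
        dx = dy = dz = 0.0
        for p in src:
            q = nearest(p, dst)
            dx += q[0] - p[0]; dy += q[1] - p[1]; dz += q[2] - p[2]
        n = len(src)
        dx, dy, dz = dx / n, dy / n, dz / n
        for p in src:
            p[0] += dx; p[1] += dy; p[2] += dz
        if dx * dx + dy * dy + dz * dz < 1e-12:  # converged
            break
    return [tuple(p) for p in src]
```

This converges when the initial misalignment is small relative to point spacing, which is why the MATLAB extrinsic calibration in step 2 matters: it provides the coarse alignment that ICP then refines.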
zener90818 commented 2 years ago

Did you do it? Could you share the results?

ChristopherRemde commented 2 years ago

You can take a look at this project; it does all the things you described: https://github.com/MarekKowalski/LiveScan3D (you need to switch to the AzureKinect branch).