Erol444 opened 2 years ago
Azure Kinect also has this, but it only works up to 6 meters. If Luxonis supports longer distances, that would be a winning feature.
Sweet! Yes, this would be great!
This would be a very useful feature. Between the IMU data of each camera and the ability to define the relative orientations of the cameras, the required information is already there, but an example of the appropriate way to do this would be very useful.
plus one!
This would be a great feature for us as well.
Progress: Multi-cam calibration & spatial detection fusion
Initial demo here: https://github.com/luxonis/depthai-experiments/tree/master/gen2-multiple-devices/rgbd-pointcloud-fusion cc @MarekKot1 @LyceanEM @mhdtahawi @andrewwetzel23
When I try to run the multi-cam calibration script, it keeps getting stuck in the still-image capture loop. Nothing is moving in the camera frame, so is there anything else that could be going wrong?
CC: @MaticTonin
Start with the **why**: For folks who want to use our cameras for 3D object scanning, using multiple cameras would be crucial.
Move to the **what**: Create a script that lets you calibrate multiple cameras looking at the same scene. This will provide the extrinsics of the cameras relative to each other, which would allow aligning multiple point clouds (produced by different cameras). We could then do some additional filtering of this combined point cloud.
Move to the **how**:
1. Calibrating
2. Alignment
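For step 1, one common approach (an assumption here, not necessarily what the demo script does) is to have every camera observe a shared calibration target, estimate each camera's pose relative to that target, and then chain the poses to get camera-to-camera extrinsics:

```python
import numpy as np

def relative_extrinsic(T_target_to_a: np.ndarray,
                       T_target_to_b: np.ndarray) -> np.ndarray:
    """Given each camera's 4x4 pose of a shared target, return the
    transform mapping points from camera B's frame into camera A's:
    T_b_to_a = T_target_to_a @ inv(T_target_to_b)."""
    return T_target_to_a @ np.linalg.inv(T_target_to_b)

# Toy example with identity rotations: the target is 1 m in front of A,
# while B sees it offset 1 m to the side as well.
T_target_to_a = np.eye(4); T_target_to_a[:3, 3] = [0.0, 0.0, 1.0]
T_target_to_b = np.eye(4); T_target_to_b[:3, 3] = [1.0, 0.0, 1.0]
T_b_to_a = relative_extrinsic(T_target_to_a, T_target_to_b)

# Sanity check: the target origin, expressed in B's frame, should map
# to the same location the target has in A's frame.
target_in_b = T_target_to_b @ np.array([0.0, 0.0, 0.0, 1.0])
target_in_a = T_b_to_a @ target_in_b
```

The per-camera target poses themselves would typically come from a PnP solve against a known board (e.g. a charuco pattern); the sketch only covers the pose-chaining step.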