Closed: SyahirMuzni closed this issue 1 year ago.
Since the cameras usually don't share the complete FOV, we calibrate each camera-lidar pair separately (e.g., camera1 vs. lidar and camera2 vs. lidar).
Examples of how to set up the launch files can be found at: https://github.com/tier4/CalibrationTools/blob/tier4/universe/sensor/extrinsic_calibration_manager/launch/aip_xx1/tag_based_sensor_kit.launch.xml and https://github.com/tier4/CalibrationTools/blob/tier4/universe/sensor/docs/how_to_extrinsic_tag_based.md
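A minimal sketch of what a combined launch file for this setup could look like, written as a ROS 2 Python launch description. All file paths, argument names, and frame names below are placeholders for illustration, not the actual CalibrationTools interface; the linked `tag_based_sensor_kit.launch.xml` is the authoritative reference.

```python
# Hypothetical sketch: include one tag-based calibration launch per
# camera-lidar pair. Paths, argument names, and frames are placeholders.
from launch import LaunchDescription
from launch.actions import IncludeLaunchDescription
from launch.launch_description_sources import AnyLaunchDescriptionSource


def generate_launch_description():
    # Each camera is calibrated against the same (shared) lidar separately.
    pairs = [
        ("camera1", "lidar_top"),  # placeholder names
        ("camera2", "lidar_top"),
    ]
    includes = [
        IncludeLaunchDescription(
            # Placeholder path to a single-pair calibration launch file.
            AnyLaunchDescriptionSource("tag_based_calibrator.launch.xml"),
            launch_arguments={
                "camera_name": cam,    # assumed argument names
                "lidar_frame": lidar,
            }.items(),
        )
        for cam, lidar in pairs
    ]
    return LaunchDescription(includes)
```

The point is simply that one top-level launch file can instantiate the same single-pair calibration once per camera, so both calibrations reuse the same lidar.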
Ok, understood! Thanks for the response. One more thing: could you please elaborate on the lidartag-placement.svg?
Sure thing.
Regarding the schematic you mention (https://raw.githubusercontent.com/tier4/CalibrationTools/tier4/universe/sensor/docs/images/camera-lidar/lidartag-placement.svg):
If you have a 360-degree FOV lidar and several cameras arranged around that FOV, you can calibrate all the camera extrinsics with a setting like the one previously mentioned. The number of tags in the schematic is limited by how many can be placed without disturbing each other, but you need at least three for reasonably well-constrained camera extrinsics. If you want to proceed with an approach like this, the only thing you would need is the appropriate launch file, with which we can give some assistance if you provide us with more details.
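To illustrate why at least three (non-collinear) tag detections are needed: recovering a 6-DoF rigid transform from point correspondences is only fully constrained with three or more non-collinear points. A standard way to solve it, given such correspondences, is the Kabsch/Umeyama least-squares alignment. This is a generic sketch of that technique, not the solver CalibrationTools actually uses internally:

```python
import numpy as np


def estimate_rigid_transform(src, dst):
    """Kabsch/Umeyama alignment: least-squares rotation R and translation t
    such that R @ src[i] + t ~= dst[i]. Needs >= 3 non-collinear points."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)   # centroids
    H = (src - cs).T @ (dst - cd)                 # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the least-squares solution.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t


# Usage: recover a known pose from three non-collinear "tag" points.
theta = np.deg2rad(30.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.5, -1.0, 2.0])
pts = np.array([[0.0, 0.0, 0.0],
                [1.0, 0.0, 0.0],
                [0.0, 1.0, 0.0]])
obs = pts @ R_true.T + t_true
R_est, t_est = estimate_rigid_transform(pts, obs)
```

With only one or two points (or three collinear ones), the rotation about the axis through the points is unobservable, which is why the schematic spreads several tags across the FOV.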
In our case, we do not take the static approach; instead, we move a single lidartag through the shared FOV of the sensors, since we haven't printed many tags yet. Doing it this way also lets us collect more points, which in practice yields a slightly better calibration.
Dear @SyahirMuzni, due to inactivity, we will be closing this issue in about a week.
Closed due to inactivity
Hi,
How can I calibrate two cameras with one lidar? How should I set up all the sensors?
Thank you.