MIT-SPARK / Kimera-Multi-Data

A large-scale multi-robot dataset for multi-robot SLAM
MIT License

add kmd_tools ros package #9

Open plusk01 opened 11 months ago

plusk01 commented 11 months ago

This ROS package is particularly useful for extrinsic calibrations.

This also provides code for publishing odometry and pose graphs from the output of past Kimera-Multi experiments. The original purpose was to benchmark new algorithms against previous runs of Kimera-Multi / DPGO in a deterministic way.
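As a rough illustration of the replay idea, here is a minimal, self-contained sketch of loading a recorded trajectory before republishing it. Note that the CSV layout (`stamp,x,y,z,qx,qy,qz,qw`) and the names `PoseStamped` / `load_trajectory` are hypothetical placeholders, not the actual kmd_tools or DPGO log format:

```python
import csv
import io
from dataclasses import dataclass
from typing import List

@dataclass
class PoseStamped:
    stamp: float
    position: tuple      # (x, y, z)
    orientation: tuple   # quaternion (qx, qy, qz, qw)

def load_trajectory(f) -> List[PoseStamped]:
    """Parse a recorded trajectory log into timestamped poses."""
    poses = []
    for row in csv.reader(f):
        vals = [float(v) for v in row]
        poses.append(PoseStamped(vals[0], tuple(vals[1:4]), tuple(vals[4:8])))
    return poses

# Example with an in-memory log (two poses; the format is hypothetical)
log = io.StringIO("0.0,0,0,0,0,0,0,1\n0.1,0.05,0,0,0,0,0,1\n")
traj = load_trajectory(log)
print(len(traj))  # → 2
```

In an actual replay node, each `PoseStamped` would then be stamped and published at the recorded rate so downstream algorithms see a deterministic input.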

plusk01 commented 11 months ago

FYI @yuluntian: there are many details that I'm sure I'm not remembering here, and they are unfortunately undocumented. Perhaps not all of this is useful. If there are specific questions, I may be able to help.

Also, I recall that I (1) made the rosbags significantly smaller by replaying them and re-recording the depth topic as compressed, (2) used a specific data directory structure that may be assumed in some of this code, and (3) relied on the output of DPGO logs for some of the functionality in the C++ nodes.

Re: extrinsics, as it currently stands I think I would only trust acl_jackal and acl_jackal2. It looks like I made some attempts on the ARL vehicles, but they may not be "high quality". I could not calibrate the SPARK vehicles; something to do with how the data was recorded, I think. It's probably worth inspecting the extrinsics using https://github.com/plusk01/lidar2camera

LimHyungTae commented 2 months ago

Hello! Thanks for sharing this nice dataset.

However, I recently found that the extrinsics of apis, sobek, and thoth might be wrong, especially the LiDAR-to-camera extrinsics. For the Ouster OS1-64, the forward direction is -X (see https://static.ouster.dev/sensor-docs/image_route1/image_route2/sensor_data/sensor-data.html), so each rotation matrix in the extrinsics should be similar to `[0 1 0; 0 0 -1; -1 0 0]`. However, they are currently set like `[0 -1 0; 0 0 -1; 1 0 0]`, which would mostly be correct if the sensor were a Velodyne type.
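To sanity-check the two candidate rotations, here is a small dependency-free sketch (matrix values taken from the discussion above, written as 3x3 row-major matrices) verifying that both are proper rotations and that they differ by exactly a 180° yaw about the sensor's Z axis, which is consistent with Ouster's -X-forward vs. Velodyne's +X-forward convention:

```python
def matmul(A, B):
    """Multiply two 3x3 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def transpose(M):
    return [list(row) for row in zip(*M)]

def det3(M):
    return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
          - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
          + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

I3 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]

R_ouster   = [[0, 1, 0], [0, 0, -1], [-1, 0, 0]]   # expected for Ouster (-X forward)
R_velodyne = [[0, -1, 0], [0, 0, -1], [1, 0, 0]]   # what the dataset currently uses

# Both are proper rotations: orthonormal, determinant +1
for R in (R_ouster, R_velodyne):
    assert matmul(R, transpose(R)) == I3
    assert det3(R) == 1

# They differ by a 180-degree rotation about the sensor's Z axis
Rz_180 = [[-1, 0, 0], [0, -1, 0], [0, 0, 1]]
print(matmul(R_ouster, Rz_180) == R_velodyne)  # → True
```

In other words, the published extrinsics look like the expected Ouster rotation with an extra 180° yaw, exactly what you would get by assuming a +X-forward (Velodyne-style) sensor frame.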

I was not here when you did the experiments, so please let me know what I should do to fix the extrinsics issue. The attached figure shows the coordinate frame of an Ouster LiDAR sensor.

(image: coordinate frame of an Ouster LiDAR sensor)

LimHyungTae commented 2 months ago

And @plusk01, I tried to fix this calibration issue by following the calibration repo https://github.com/plusk01/lidar2camera. However, it did not show anything :(

```shell
# terminal 1
roslaunch lidar2camera manual_calib.launch camera:=/apis/forward/color points:=/apis/lidar_points
# terminal 2
roslaunch lidar2camera projection.launch camera:=/apis/forward/color points:=/apis/lidar_points
```

The only message I could see is the following:

```
Warning: TF_OLD_DATA ignoring data from the past (Possible reasons are listed at http://wiki.ros.org/tf/Errors%20explained) for frame apis/base (parent apis/realsense_base) at time 1670531082.067386 according to authority unknown_publisher
         at line 278 in /tmp/binarydeb/ros-noetic-tf2-0.7.7/src/buffer_core.cpp
```
plusk01 commented 2 months ago

> I found that the extrinsics of apis, sobek, and thoth might be wrong, especially lidar-to-camera extrinsics.

yeah, this is because

> I could not calibrate spark vehicles - something to do with how the data was recorded i think

Regarding the RViz/TF errors, that is probably something to do with ROS on your computer, maybe related to timing / `use_sim_time`, or some frames are not set up correctly. Try looking at the rqt TF tree visualizer.
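For what it's worth, `TF_OLD_DATA` typically appears when transforms carry bag timestamps (here, from December 2022) while ROS is running on wall-clock time. A common debugging sequence looks like the following sketch; the bag filename is a placeholder:

```shell
# Tell ROS nodes to use simulated time from the /clock topic
rosparam set use_sim_time true

# Replay the bag while publishing /clock from its recorded timestamps
rosbag play --clock apis_session.bag

# Inspect the TF tree to spot missing or misconfigured frames
rosrun rqt_tf_tree rqt_tf_tree
```

If the warning persists after this, the remaining suspects are usually a node stamping transforms with wall-clock time, or a looping bag replay resetting the clock.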

LimHyungTae commented 2 months ago

(images: the above figures show the coordinate frames of the LiDAR sensors)

Thanks for your response @plusk01! As far as I can tell, some robots' TFs do not follow the convention, and I guess that's the cause. I'll try manual calibration. Thanks!