PRBonn / ir-mcl

IR-MCL: Implicit Representation-Based Online Global Localization https://arxiv.org/abs/2210.03113

Production of data sets #2

Open TaoXu-HNU opened 1 year ago

TaoXu-HNU commented 1 year ago

Could the authors or anyone else tell me how to convert a rosbag (.bag) file into the JSON format, and how to split the training/validation/testing sets, when creating my own dataset?

KuangHaofei commented 1 year ago

Thanks for your question! To convert a rosbag into a JSON file, we assume the 2D LiDAR data (sensor_msgs/LaserScan.msg) and the pose data (nav_msgs/Odometry.msg) are already included in your bag file.
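If it helps, here is a minimal sketch of reading the two message streams with the ROS1 Python rosbag API; the bag file name and the topic names "/scan" and "/odom" are placeholders you should replace with the ones used in your own recording:

```python
# Read LaserScan and Odometry messages from a bag file (ROS1 Python API).
import rosbag

scans, odoms = [], []
with rosbag.Bag('my_recording.bag') as bag:  # placeholder file name
    for topic, msg, t in bag.read_messages(topics=['/scan', '/odom']):
        if topic == '/scan':   # sensor_msgs/LaserScan
            scans.append(msg)
        else:                  # nav_msgs/Odometry
            odoms.append(msg)
```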

Regarding the JSON format, the LiDAR's parameters can be found in the LaserScan message:

- "num_beams": the length of ranges
- "angle_min": angle_min
- "angle_max": angle_max
- "angle_res": angle_increment
- "field_of_view": angle_max - angle_min
- "max_range": range_max
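As a sketch, these parameters could be filled directly from the first LaserScan message (reusing `scans` from the reading sketch above); the dictionary name is just for illustration:

```python
# Build the LiDAR parameter block from the first LaserScan message.
first = scans[0]
lidar_info = {
    'num_beams': len(first.ranges),
    'angle_min': first.angle_min,
    'angle_max': first.angle_max,
    'angle_res': first.angle_increment,
    'field_of_view': first.angle_max - first.angle_min,
    'max_range': first.range_max,
}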

For the "scan" in the json file:

"timestamp": in the header from LaserScan message "range_readings": ranges from LaserScan message "odom": pose from Odometry message, you need convert it to (x, y, yaw) format "transform_matrix": converting the "odom" to the transformation matrix

There are two points you should be careful about:

  1. Make sure the pose data has been synchronized with the LiDAR data (see the sketch after this list);
  2. Make sure the pose message is expressed in the LiDAR's frame; otherwise, you should transform it into the LiDAR's frame (you can get the transformation from the tf messages).
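For point 1, a simple nearest-timestamp association could look like the following sketch (interpolating between the two surrounding odometry poses would be more accurate); the output file name and the top-level JSON keys are placeholders, not the repository's exact schema:

```python
import json

def nearest_odom(scan_msg, odoms):
    """Return the odometry message whose timestamp is closest to the scan's."""
    t_scan = scan_msg.header.stamp.to_sec()
    return min(odoms, key=lambda o: abs(o.header.stamp.to_sec() - t_scan))

scan_entries = [make_scan_entry(scan, nearest_odom(scan, odoms))
                for scan in scans]

# For point 2: if the odometry pose is the robot base pose rather than the
# LiDAR pose, right-multiply its matrix by the static base-to-LiDAR transform
# (obtainable from the tf messages) before storing it.

# Dump everything to a JSON file (placeholder keys and file name).
with open('my_sequence.json', 'w') as f:
    json.dump({'lidar_info': lidar_info, 'scans': scan_entries}, f, indent=2)
```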

We will provide an example later. Thanks!

TaoXu-HNU commented 1 year ago

Thank you very much for your patient and professional reply. I will follow your instructions and make further attempts.

KuangHaofei commented 1 year ago

Hi,

I have provided a toolbox for data preprocessing, including data format conversion. You can check INSTRUCTIONS.md in the dev branch now.

Because the toolbox is not finished yet, we have put it in the dev branch for now. If you find any bugs in these tools, please let me know! We will merge it into main after all modules are completed.

Thanks!