RuanJY / SLAMesh

ICRA2023, A real-time LiDAR simultaneous localization and meshing method.

KITTI with Ground Truth Poses #15

Open hungdche opened 10 months ago

hungdche commented 10 months ago

Hi! Thank you for such awesome work.

I saw that you have the parameter grt_available and a GroundTruthCallback, but I didn't see any interface or mention in the README for feeding ground-truth poses to SLAMesh so that only the meshing part is tested. I'm wondering whether I have to feed the ground-truth poses through rosbag play, or load them the way you load lidar points for KITTI.

Any help is much appreciated. Thanks!

RuanJY commented 9 months ago

Hi, thank you for your interest, and sorry for the late reply.

Currently, the ground-truth message is only used to align the first pose of the SLAM trajectory with it, so that we can visualize the drift of SLAM online. We also save the ground-truth poses for offline evaluation.

I understand your need. If you want to do this, you can try using the ground-truth message as the odometry message (remap the topic, set odom_available to true and grt_available to false), and then set the parameter register_times to 0 to turn off scan registration.
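
A minimal launch-file sketch of that setup might look like the following. The parameter names (odom_available, grt_available, register_times) are the ones mentioned above, but the package/node names, the topic name, and the way SLAMesh actually loads its parameters are placeholders, so adapt them to the real launch file:

 <launch>
     <node pkg="slamesh" type="slamesh_node" name="slamesh" output="screen">
         <!-- feed the ground-truth poses in as if they were odometry -->
         <remap from="/odom" to="/your_ground_truth_topic" />
         <!-- use the external "odometry" and ignore the ground-truth input -->
         <param name="odom_available" value="true" />
         <param name="grt_available" value="false" />
         <!-- skip scan registration so only the meshing part runs -->
         <param name="register_times" value="0" />
     </node>
 </launch>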

hungdche commented 9 months ago

Thanks for the reply. I will try that and get back to you with more questions if I have any.

Rustli11 commented 4 months ago

Hi, thank you so much for providing this excellent program. I am trying to use my own vertical radar scan data for model reconstruction, which should give a cleaner mesh, but the matching always drifts. I followed the solution from issue #15 and then tried using the pose data obtained from the horizontal radar of the same device to correct it. It runs without errors but never draws the mesh. The output looks like this:

[ INFO] [1709195543.923199405]: PointCloud seq: [3428]
[ INFO] [1709195543.973204881]: PointCloud seq: [3429]
[ INFO] [1709195544.024206786]: PointCloud seq: [3430]
[ INFO] [1709195544.074742817]: PointCloud seq: [3431]

Hopefully I can get some luck with your help.

RuanJY commented 4 months ago

Do you mean radar rather than lidar? Can you show me a frame of your data, both vertical and horizontal?


Rustli11 commented 4 months ago

Sorry, my bad! It's lidar from a backpack lidar scanner. Here is a frame of my data: [image]

RuanJY commented 1 month ago

Sorry for the delayed response. If you only see lines like [ INFO] [1709195543.923199405]: PointCloud seq: [3428] printed repeatedly, SLAM has not started. That message comes from the point-cloud callback function: https://github.com/RuanJY/SLAMesh/blob/200216576f6196a0285234d667375192177617de/src/slamesher_node.cpp#L729

I suspect your odometry topic is not set correctly, so the algorithm is waiting for it. You can uncomment this line and check whether you receive the print from the odometry callback function: https://github.com/RuanJY/SLAMesh/blob/200216576f6196a0285234d667375192177617de/src/slamesher_node.cpp#L723

You should remap the odometry topic in the launch file like:

 <remap from="/odom" to="/your_slam_odometry" />
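
Note that in roslaunch a remap only affects a node if it appears inside that node's <node> element or before the node at the top level of the launch file; as a sketch (the package and executable names below are placeholders):

 <node pkg="slamesh" type="slamesh_node" name="slamesh" output="screen">
     <!-- point SLAMesh's odometry input at your odometry topic -->
     <remap from="/odom" to="/your_slam_odometry" />
 </node>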

By the way, I have made some changes to the code. Could you please test the new code again using the vertical lidar directly and check whether it still drifts?