PRBonn / kiss-icp

A LiDAR odometry pipeline that just works
https://www.ipb.uni-bonn.de/wp-content/papercite-data/pdf/vizzo2023ral.pdf
MIT License

Does KISS-ICP assume something about the frame orientation, or can we change it? #99

Closed: MblRobotics closed this issue 1 year ago

MblRobotics commented 1 year ago

I'm trying to test KISS-ICP on a TurtleBot3 point cloud generated from a depth image. The point cloud comes out along the z axis of the frame (the ground plane is (x, z) rather than (x, y)), so when the robot moves forward along x the model thinks it moved along z, and when I rotate the robot it thinks the rotation is around x or y, which produces very strange behaviour. Thank you for your help.

(Attached: Screenshot from 2023-03-23 13-58-29)

nachovizzo commented 1 year ago

Hello @MblRobotics, we do not assume anything in particular about the sensor orientation.

If you have a specific setup, then you should write your own preprocessing pipeline or your own dataloader. There are plenty of examples in this repo and in the issues. If the axes are flipped, there is absolutely no way we can "guess" the axis change for all the possible combinations in the world.
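For what it's worth, a common case with depth cameras is that the cloud arrives in the optical convention (x right, y down, z forward) while the robot expects x forward, y left, z up. Below is a minimal preprocessing sketch of that remapping; it is not part of KISS-ICP itself, and the permutation is an assumption about the TurtleBot3 depth camera that has to be checked against your own TF tree.

```python
import numpy as np

def optical_to_robot_frame(points: np.ndarray) -> np.ndarray:
    """Remap an (N, 3) cloud from a depth-camera optical frame
    (x right, y down, z forward) to a body frame (x forward, y left, z up).

    NOTE: this fixed rotation is an assumption about the sensor mounting;
    verify it before feeding the remapped cloud to the odometry.
    """
    R = np.array([
        [0.0, 0.0, 1.0],   # x_body  =  z_optical
        [-1.0, 0.0, 0.0],  # y_body  = -x_optical
        [0.0, -1.0, 0.0],  # z_body  = -y_optical
    ])
    return points @ R.T
```

A custom dataloader would simply apply a function like this to every frame before passing it on to the pipeline.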

If the estimation is not great, that is independent of the axis setup anyway. You can play around with the max_range and voxel_size params, which are the most important parameters in the pipeline. For depth sensors we have seen the pipeline work, although it was never great. Make sure you set max_range to something more meaningful than the default value (which is 100 meters), for example 4 or 5 meters.
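To make that concrete, here is a small sketch of adjusting those two parameters from Python, assuming the package's `KISSConfig` and `KissICP` classes as they appear in this repository (the exact attribute paths may differ between versions, so treat it as an illustration rather than the definitive API):

```python
from kiss_icp.config import KISSConfig
from kiss_icp.kiss_icp import KissICP

# Assumed layout: max_range / min_range live under config.data and
# voxel_size under config.mapping, as in the repository's config module.
config = KISSConfig()
config.data.max_range = 5.0       # a depth camera sees a few meters, not 100 m
config.data.min_range = 0.3       # drop points right in front of the lens
config.mapping.voxel_size = 0.05  # roughly max_range / 100, finer than the LiDAR default

odometry = KissICP(config=config)
```

The same values can of course be set in the YAML config file instead of in code.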

Next time you can also share a bag file so we can try it ourselves and see what's going on. Best of luck!