PigletPh opened this issue 3 years ago

The work is excellent and I'm honored to study it, but I have two questions about it:

1. How can the algorithm run in real time without the "velodyne_bin" data?
2. Is it compatible with solid-state LiDARs such as Livox?

I'm looking forward to your answers, thank you.
Yours sincerely
Hey @jiu876, thank you for following our work!

- Currently the algorithm runs in real time only after it converges. Once it has converged, it only needs to compute the difference between the current LiDAR range image and the one rendered from the local map, which is very fast.
- Theoretically it should also work with different LiDAR scanners, including Livox. The only part that needs to change is the projection function for generating the range image; the way a Livox scanner generates a range image differs from Velodyne/Ouster. A rough sketch of such a projection is given below.

I hope this helps.
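For reference, here is a minimal sketch of the kind of spherical projection used to turn a Velodyne/Ouster-style scan into a range image. The function name and FOV values are illustrative, not the ones used in this repo; for Livox you would mainly adapt the projection to its narrower field of view and different scan pattern.

```python
import numpy as np

def project_to_range_image(points, h=64, w=900,
                           fov_up_deg=3.0, fov_down_deg=-25.0):
    """Project an (N, 3) point cloud into an (h, w) range image.

    Illustrative spherical projection; the FOV values roughly match a
    64-beam spinning LiDAR and would have to be changed for a Livox
    scanner with its different field of view and scan pattern.
    """
    fov_up, fov_down = np.radians(fov_up_deg), np.radians(fov_down_deg)
    fov = fov_up - fov_down

    depth = np.linalg.norm(points[:, :3], axis=1)
    valid = depth > 0
    x, y, z, depth = (points[valid, 0], points[valid, 1],
                      points[valid, 2], depth[valid])

    yaw = np.arctan2(y, x)                            # horizontal angle
    pitch = np.arcsin(np.clip(z / depth, -1.0, 1.0))  # vertical angle

    # Normalize the angles to pixel coordinates.
    u = np.clip(np.floor(0.5 * (1.0 - yaw / np.pi) * w), 0, w - 1).astype(int)
    v = np.clip(np.floor((1.0 - (pitch - fov_down) / fov) * h), 0, h - 1).astype(int)

    # Write far points first so the closest return per pixel wins.
    order = np.argsort(depth)[::-1]
    range_image = np.full((h, w), -1.0, dtype=np.float32)
    range_image[v[order], u[order]] = depth[order]
    return range_image
```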
Thanks for your answers. My first question may have been a bit confusing: I understand that after convergence the localization can run in real time. But how can I localize in real time without the provided LiDAR dataset, i.e., feed in my own rosbag file to do the positioning? Does implementing that require partially modifying the source code? Another question: can I directly construct a mesh map from the point cloud map output by SLAM?

I'm looking forward to your answers, thank you.
Yours sincerely
> I understand that after convergence the localization can run in real time. But how can I localize in real time without the provided LiDAR dataset, i.e., feed in my own rosbag file to do the positioning? Does implementing that require partially modifying the source code?
Yes, currently we don't have a plan to develop a ROS wrapper; you may write a ROS API following this.
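In case it helps, a rough sketch of what such a ROS 1 wrapper could look like. The topic name `/velodyne_points` and the `DummyLocalizer.update()` interface are placeholders for whatever you wire up to the localization code, not an existing API of this repo.

```python
import numpy as np
import rospy
from sensor_msgs.msg import PointCloud2
import sensor_msgs.point_cloud2 as pc2


class DummyLocalizer:
    """Stand-in for the actual localization code; replace with the real interface."""

    def update(self, points):
        # points: (N, 3) array in the sensor frame; return some pose estimate.
        return np.zeros(3)


class RangeLocalizationNode:
    """Sketch of a ROS 1 wrapper feeding live scans to the localizer."""

    def __init__(self, localizer):
        self.localizer = localizer
        rospy.Subscriber("/velodyne_points", PointCloud2,
                         self.scan_callback, queue_size=1)

    def scan_callback(self, msg):
        # Convert the PointCloud2 message into an (N, 3) numpy array.
        points = np.array(list(pc2.read_points(
            msg, field_names=("x", "y", "z"), skip_nans=True)),
            dtype=np.float32)
        pose = self.localizer.update(points)
        rospy.loginfo("estimated pose: %s", pose)


if __name__ == "__main__":
    rospy.init_node("range_localization_wrapper")
    RangeLocalizationNode(DummyLocalizer())
    rospy.spin()
```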
> Another question: can I directly construct a mesh map from the point cloud map output by SLAM?
To build a mesh map, you need the normals for each point. If your SLAM system provides the normal information, it would work. You could also try our mesh-based SLAM method PUMA, which runs SLAM and generates a mesh map simultaneously.
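For what it's worth, a rough Open3D sketch of going from a point cloud with normals to a triangle mesh could look like this. The file names and parameters are only examples, and this is not the PUMA pipeline itself.

```python
import open3d as o3d

# Example: build a triangle mesh map from a SLAM point cloud map.
pcd = o3d.io.read_point_cloud("slam_map_with_normals.ply")  # example file name

# Estimate and orient normals if the SLAM output does not provide them.
if not pcd.has_normals():
    pcd.estimate_normals(
        search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.5, max_nn=30))
    pcd.orient_normals_consistent_tangent_plane(15)

# Poisson surface reconstruction from the oriented point cloud.
mesh, densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(
    pcd, depth=10)
o3d.io.write_triangle_mesh("mesh_map.ply", mesh)
```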
Thank you for your answer. One more question: can I directly convert the point cloud map PCD file generated by SLAM into a PLY file with the PCL library and then use it as a new mesh map? I found that PCL can write PLY files that include the normals of the cloud points.
You're welcome. I would appreciate it a lot if you could star this repo if my answers helped.
The PCL way sounds doable and you could give it a try. I usually use Open3D, but I think it should work similarly with PCL.
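If you end up trying Open3D instead of (or alongside) PCL, the conversion could look roughly like this; the file names and the neighborhood radius are just examples.

```python
import open3d as o3d

# Example conversion: SLAM output (PCD) -> PLY with per-point normals.
pcd = o3d.io.read_point_cloud("slam_map.pcd")  # example file name
pcd.estimate_normals(
    search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.5, max_nn=30))
# ASCII output makes it easy to check that the nx/ny/nz fields are present.
o3d.io.write_point_cloud("slam_map_with_normals.ply", pcd, write_ascii=True)
```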
OK, I'll give it a try. Thanks!