fangcywangermu opened this issue 4 years ago
The general approach is described in this paper:
M. Labbé and F. Michaud, “RTAB-Map as an Open-Source Lidar and Visual SLAM Library for Large-Scale and Long-Term Online Operation,” in Journal of Field Robotics, vol. 36, no. 2, pp. 416–446, 2019. (Wiley)
There are also other papers listed on the main page that cover specific aspects of RTAB-Map, such as the memory management approach and loop closure detection. For understanding the code, a good starting point is the main function Rtabmap::process(), which is called for each frame.
cheers, Mathieu
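To make the per-frame idea above concrete, here is a toy sketch of the pattern that a function like Rtabmap::process() follows: each incoming frame is checked for a loop closure against recent frames, and a bounded working memory (WM) transfers old frames to long-term memory (LTM) so per-frame cost stays bounded. The types and names below (Frame, WorkingMemory) are illustrative only, not the actual RTAB-Map API; the real implementation is far more involved.

```cpp
#include <cassert>
#include <deque>
#include <vector>

// Stand-in for one sensor frame; 'signature' plays the role of a
// visual bag-of-words descriptor in the real system.
struct Frame {
    int id;
    int signature;
};

class WorkingMemory {
public:
    explicit WorkingMemory(std::size_t capacity) : capacity_(capacity) {}

    // Process one frame: return the id of a matched past frame
    // (a "loop closure" in this toy model) or -1 if none.
    int process(const Frame& f) {
        int loopId = -1;
        for (const Frame& past : wm_) {
            if (past.signature == f.signature) { loopId = past.id; break; }
        }
        wm_.push_back(f);
        // Memory management: when WM exceeds its cap, move the oldest
        // frame to LTM so the loop-closure search stays bounded
        // (a much-simplified version of RTAB-Map's WM/LTM split).
        if (wm_.size() > capacity_) {
            ltm_.push_back(wm_.front());
            wm_.pop_front();
        }
        return loopId;
    }

    std::size_t wmSize() const { return wm_.size(); }
    std::size_t ltmSize() const { return ltm_.size(); }

private:
    std::size_t capacity_;
    std::deque<Frame> wm_;   // recent frames, searched for loop closures
    std::vector<Frame> ltm_; // transferred frames, not searched here
};
```

In the real library, the WM/LTM transfer is governed by a real-time constraint rather than a fixed capacity, and frames can be retrieved back from LTM when a loop closure hypothesis points at them; the Journal of Field Robotics paper above covers both mechanisms.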
Can you give any suggestions for RTAB-Map positioning by combining ORB-SLAM2 with lidar?
@fangcywangermu ORB-SLAM2 is mainly visual-based SLAM, so it needs to be backed by a sensor with image output (i.e. a webcam, stereo camera, or RGB-D module). By lidar you probably have in mind the rotary-mounted plane-scanning kind; there are other implementations that use that specific approach, e.g. in robot vacuum cleaners or lawn mowers. https://github.com/simondlevy/BreezySLAM
I am a graduate student, and my graduation thesis is on this topic using ORB-SLAM2, but I can't understand the code. Could you please help me?