MAM3SLAM is a new fully centralized multi-agent, multi-map monocular Visual Simultaneous Localization And Mapping (VSLAM) framework based on ORB-SLAM3. This README therefore contains both the original ORB-SLAM3 README information provided by ORB-SLAM3's authors and information specific to MAM3SLAM, including build and usage instructions and the reference paper. Content shared with ORB-SLAM3 has been copied verbatim from ORB-SLAM3's README.
Authors: Carlos Campos, Richard Elvira, Juan J. Gómez Rodríguez, José M. M. Montiel, Juan D. Tardos.
The Changelog describes the features of each version.
ORB-SLAM3 is the first real-time SLAM library able to perform Visual, Visual-Inertial and Multi-Map SLAM with monocular, stereo and RGB-D cameras, using pin-hole and fisheye lens models. In all sensor configurations, ORB-SLAM3 is as robust as the best systems available in the literature, and significantly more accurate.
We provide examples to run ORB-SLAM3 in the EuRoC dataset using stereo or monocular, with or without IMU, and in the TUM-VI dataset using fisheye stereo or monocular, with or without IMU. Videos of some example executions can be found at ORB-SLAM3 channel.
This software is based on ORB-SLAM2 developed by Raul Mur-Artal, Juan D. Tardos, J. M. M. Montiel and Dorian Galvez-Lopez (DBoW2).
<img src="https://img.youtube.com/vi/HyLNq-98LRo/0.jpg" alt="ORB-SLAM3" width="240" height="180" border="10" />
[ORB-SLAM3] Carlos Campos, Richard Elvira, Juan J. Gómez Rodríguez, José M. M. Montiel and Juan D. Tardós, ORB-SLAM3: An Accurate Open-Source Library for Visual, Visual-Inertial and Multi-Map SLAM, IEEE Transactions on Robotics 37(6):1874-1890, Dec. 2021. PDF.
[IMU-Initialization] Carlos Campos, J. M. M. Montiel and Juan D. Tardós, Inertial-Only Optimization for Visual-Inertial Initialization, ICRA 2020. PDF
[ORBSLAM-Atlas] Richard Elvira, J. M. M. Montiel and Juan D. Tardós, ORBSLAM-Atlas: a robust and accurate multi-map system, IROS 2019. PDF.
[ORBSLAM-VI] Raúl Mur-Artal, and Juan D. Tardós, Visual-inertial monocular SLAM with map reuse, IEEE Robotics and Automation Letters, vol. 2 no. 2, pp. 796-803, 2017. PDF.
[Stereo and RGB-D] Raúl Mur-Artal and Juan D. Tardós. ORB-SLAM2: an Open-Source SLAM System for Monocular, Stereo and RGB-D Cameras. IEEE Transactions on Robotics, vol. 33, no. 5, pp. 1255-1262, 2017. PDF.
[Monocular] Raúl Mur-Artal, José M. M. Montiel and Juan D. Tardós. ORB-SLAM: A Versatile and Accurate Monocular SLAM System. IEEE Transactions on Robotics, vol. 31, no. 5, pp. 1147-1163, 2015. (2015 IEEE Transactions on Robotics Best Paper Award). PDF.
[DBoW2 Place Recognition] Dorian Gálvez-López and Juan D. Tardós. Bags of Binary Words for Fast Place Recognition in Image Sequences. IEEE Transactions on Robotics, vol. 28, no. 5, pp. 1188-1197, 2012. PDF
ORB-SLAM3 is released under GPLv3 license. For a list of all code/library dependencies (and associated licenses), please see Dependencies.md.
For a closed-source version of ORB-SLAM3 for commercial purposes, please contact the authors: orbslam (at) unizar (dot) es.
If you use ORB-SLAM3 in an academic work, please cite:
@article{ORBSLAM3_TRO,
title={{ORB-SLAM3}: An Accurate Open-Source Library for Visual, Visual-Inertial
and Multi-Map {SLAM}},
author={Campos, Carlos AND Elvira, Richard AND G\'omez, Juan J. AND Montiel,
Jos\'e M. M. AND Tard\'os, Juan D.},
journal={IEEE Transactions on Robotics},
volume={37},
number={6},
pages={1874-1890},
year={2021}
}
If you use MAM3SLAM in an academic work, please cite:
@article{ORBSLAM3_TRO,
title={{ORB-SLAM3}: An Accurate Open-Source Library for Visual, Visual-Inertial
and Multi-Map {SLAM}},
author={Campos, Carlos AND Elvira, Richard AND G\'omez, Juan J. AND Montiel,
Jos\'e M. M. AND Tard\'os, Juan D.},
journal={IEEE Transactions on Robotics},
volume={37},
number={6},
pages={1874-1890},
year={2021}
}
@article{MAM,
title={{MAM$^3$SLAM}: Towards underwater-robust multi-agent visual {SLAM}},
author={Drupt, Juliette AND Comport, Andrew I. AND Dune, Claire AND Hugel, Vincent},
journal={Ocean Engineering},
year={2024}
}
The current code has been developed and tested using Ubuntu 18.04.
We use the new thread and chrono functionalities of C++11.
We use Pangolin for visualization and user interface. Download and install instructions can be found at: https://github.com/stevenlovegrove/Pangolin. The current code has been tested using version 0.8, which you can find here: https://github.com/stevenlovegrove/Pangolin/tree/aff6883c83f3fd7e8268a9715e84266c42e2efe3.
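One possible way to obtain and build exactly that Pangolin commit is sketched below (a typical out-of-source CMake build; the `-j4` parallelism and system-wide install are illustrative, and Pangolin's own instructions take precedence):

```shell
git clone https://github.com/stevenlovegrove/Pangolin.git
cd Pangolin
git checkout aff6883c83f3fd7e8268a9715e84266c42e2efe3   # version 0.8, as referenced above
mkdir build && cd build
cmake ..
make -j4
sudo make install
```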
We use OpenCV to manipulate images and features. Download and install instructions can be found at: http://opencv.org. At least version 3.0 is required.
Eigen3 is required by g2o (see below). Download and install instructions can be found at: http://eigen.tuxfamily.org. At least version 3.1.0 is required.
We use modified versions of the DBoW2 library to perform place recognition and g2o library to perform non-linear optimizations. Both modified libraries (which are BSD) are included in the Thirdparty folder.
Python is required to calculate the alignment of the trajectory with the ground truth; the Numpy module is required as well.
sudo apt install libpython2.7-dev
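For context, trajectory-to-ground-truth alignment of this kind is typically a closed-form rigid-body (Horn/Umeyama-style) fit between time-associated camera positions (monocular evaluation additionally estimates a scale factor). The following is a minimal Numpy sketch of the rigid case only; it is illustrative and is not the repository's actual evaluation script, and all names in it are ours:

```python
import numpy as np

def align_trajectory(est, gt):
    """Closed-form rigid-body alignment of an estimated trajectory
    onto ground truth (Horn/Umeyama-style, without scale).
    est, gt: (N, 3) arrays of time-associated camera positions.
    Returns R, t such that gt_i ~= R @ est_i + t."""
    mu_e, mu_g = est.mean(axis=0), gt.mean(axis=0)
    E, G = est - mu_e, gt - mu_g              # centered point sets
    U, _, Vt = np.linalg.svd(E.T @ G)         # SVD of the 3x3 covariance
    S = np.eye(3)
    if np.linalg.det(U @ Vt) < 0:             # guard against reflections
        S[2, 2] = -1.0
    R = Vt.T @ S @ U.T                        # best-fit rotation
    t = mu_g - R @ mu_e                       # best-fit translation
    return R, t

# Toy check: a trajectory rotated 90 degrees about z and translated.
gt = np.array([[0.0, 0, 0], [1, 0, 0], [2, 1, 0], [3, 1, 1]])
Rz = np.array([[0.0, -1, 0], [1, 0, 0], [0, 0, 1]])
t0 = np.array([5.0, -2.0, 1.0])
est = gt @ Rz.T + t0                          # simulated drift-free estimate
R, t = align_trajectory(est, gt)
aligned = est @ R.T + t                       # should recover gt
```

After alignment, the absolute trajectory error is simply the per-pose distance between `aligned` and `gt`.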
The code is interfaced with ROS and has been tested with ROS Melodic. An example file for a two-agent scenario is provided, which can easily be modified to increase the number of agents.
Clone the repository:
git clone https://github.com/LaboratoireCosmerTOULON/MAM3SLAM.git
We provide a script build.sh to build the Thirdparty libraries and MAM3SLAM. Please make sure you have installed all required dependencies. Execute:
cd MAM3SLAM
chmod +x build.sh
./build.sh
Build the ROS nodes:
Add the path including Examples/ROS/MAM3SLAM to the ROS_PACKAGE_PATH environment variable. Open the .bashrc file:
gedit ~/.bashrc
and add at the end the following line. Replace PATH by the folder where you cloned MAM3SLAM:
export ROS_PACKAGE_PATH=${ROS_PACKAGE_PATH}:PATH/MAM3SLAM/Examples/ROS
Execute build_ros.sh
script:
chmod +x build_ros.sh
./build_ros.sh
In the current version, the cameras need to be interfaced with ROS. The steps needed to use your own cameras are:
Calibrate your camera
Run MAM3SLAM. For a two-agent scenario, you can run:
rosrun MAM3SLAM MonoMulti2 path_to_vocabulary path_to_settings_1 topic_1 path_to_settings_2 topic_2 [is_mono]
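As an illustration, a hypothetical two-agent invocation could look as follows. The vocabulary and settings paths assume the layout inherited from ORB-SLAM3 (extract Vocabulary/ORBvoc.txt first), and the image topic names are placeholders to be replaced by your own cameras' topics:

```shell
rosrun MAM3SLAM MonoMulti2 \
    Vocabulary/ORBvoc.txt \
    Examples/Monocular/EuRoC.yaml /agent1/cam0/image_raw \
    Examples/Monocular/EuRoC.yaml /agent2/cam0/image_raw
```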