
Working copy of ORB-SLAM3 to include multi-robot input
GNU General Public License v3.0

MAM3SLAM

MAM3SLAM is a new fully centralized multi-agent and multi-map monocular Visual Simultaneous Localization And Mapping (VSLAM) framework based on ORB-SLAM3. This README therefore includes both the README information of ORB-SLAM3 provided by its authors and information specific to MAM3SLAM, including build and usage instructions and the reference paper. Content shared with ORB-SLAM3 has been copied from ORB-SLAM3's README.

1 - ORB-SLAM3

V1.0, December 22nd, 2021

Authors: Carlos Campos, Richard Elvira, Juan J. Gómez Rodríguez, José M. M. Montiel, Juan D. Tardos.

The Changelog describes the features of each version.

ORB-SLAM3 is the first real-time SLAM library able to perform Visual, Visual-Inertial and Multi-Map SLAM with monocular, stereo and RGB-D cameras, using pin-hole and fisheye lens models. In all sensor configurations, ORB-SLAM3 is as robust as the best systems available in the literature, and significantly more accurate.

We provide examples to run ORB-SLAM3 in the EuRoC dataset using stereo or monocular, with or without IMU, and in the TUM-VI dataset using fisheye stereo or monocular, with or without IMU. Videos of some example executions can be found at ORB-SLAM3 channel.

This software is based on ORB-SLAM2 developed by Raul Mur-Artal, Juan D. Tardos, J. M. M. Montiel and Dorian Galvez-Lopez (DBoW2).

<img src="https://img.youtube.com/vi/HyLNq-98LRo/0.jpg" alt="ORB-SLAM3" width="240" height="180" border="10" />

Related Publications:

[ORB-SLAM3] Carlos Campos, Richard Elvira, Juan J. Gómez Rodríguez, José M. M. Montiel and Juan D. Tardós, ORB-SLAM3: An Accurate Open-Source Library for Visual, Visual-Inertial and Multi-Map SLAM, IEEE Transactions on Robotics 37(6):1874-1890, Dec. 2021. PDF.

[IMU-Initialization] Carlos Campos, J. M. M. Montiel and Juan D. Tardós, Inertial-Only Optimization for Visual-Inertial Initialization, ICRA 2020. PDF

[ORBSLAM-Atlas] Richard Elvira, J. M. M. Montiel and Juan D. Tardós, ORBSLAM-Atlas: a robust and accurate multi-map system, IROS 2019. PDF.

[ORBSLAM-VI] Raúl Mur-Artal, and Juan D. Tardós, Visual-inertial monocular SLAM with map reuse, IEEE Robotics and Automation Letters, vol. 2 no. 2, pp. 796-803, 2017. PDF.

[Stereo and RGB-D] Raúl Mur-Artal and Juan D. Tardós. ORB-SLAM2: an Open-Source SLAM System for Monocular, Stereo and RGB-D Cameras. IEEE Transactions on Robotics, vol. 33, no. 5, pp. 1255-1262, 2017. PDF.

[Monocular] Raúl Mur-Artal, José M. M. Montiel and Juan D. Tardós. ORB-SLAM: A Versatile and Accurate Monocular SLAM System. IEEE Transactions on Robotics, vol. 31, no. 5, pp. 1147-1163, 2015. (2015 IEEE Transactions on Robotics Best Paper Award). PDF.

[DBoW2 Place Recognition] Dorian Gálvez-López and Juan D. Tardós. Bags of Binary Words for Fast Place Recognition in Image Sequences. IEEE Transactions on Robotics, vol. 28, no. 5, pp. 1188-1197, 2012. PDF

License

ORB-SLAM3 is released under GPLv3 license. For a list of all code/library dependencies (and associated licenses), please see Dependencies.md.

For a closed-source version of ORB-SLAM3 for commercial purposes, please contact the authors: orbslam (at) unizar (dot) es.

If you use ORB-SLAM3 in an academic work, please cite:

@article{ORBSLAM3_TRO,
  title={{ORB-SLAM3}: An Accurate Open-Source Library for Visual, Visual-Inertial 
           and Multi-Map {SLAM}},
  author={Campos, Carlos AND Elvira, Richard AND G\'omez, Juan J. AND Montiel, 
          Jos\'e M. M. AND Tard\'os, Juan D.},
  journal={IEEE Transactions on Robotics}, 
  volume={37},
  number={6},
  pages={1874-1890},
  year={2021}
 }

2 - MAM3SLAM

If you use MAM3SLAM in an academic work, please cite:

@article{ORBSLAM3_TRO,
  title={{ORB-SLAM3}: An Accurate Open-Source Library for Visual, Visual-Inertial 
           and Multi-Map {SLAM}},
  author={Campos, Carlos AND Elvira, Richard AND G\'omez, Juan J. AND Montiel, 
          Jos\'e M. M. AND Tard\'os, Juan D.},
  journal={IEEE Transactions on Robotics}, 
  volume={37},
  number={6},
  pages={1874-1890},
  year={2021}
 }

@article{MAM,
  title={{MAM$^3$SLAM}: Towards underwater-robust multi-agent visual {SLAM}},
  author={Drupt, Juliette AND Comport, Andrew I. AND Dune, Claire AND Hugel, Vincent},
  journal={Ocean Engineering}, 
  year={2024}
 }

Prerequisites

The current code has been developed and tested on Ubuntu 18.04.

C++11 or C++0x Compiler

We use the new thread and chrono functionalities of C++11.

Pangolin

We use Pangolin for visualization and user interface. Download and install instructions can be found at: https://github.com/stevenlovegrove/Pangolin. The current code has been tested with version 0.8, which you can find here: https://github.com/stevenlovegrove/Pangolin/tree/aff6883c83f3fd7e8268a9715e84266c42e2efe3.
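
As a convenience, the tested revision can be fetched and built roughly as follows (a sketch assuming a standard CMake build; refer to Pangolin's own instructions if anything differs on your system):

git clone https://github.com/stevenlovegrove/Pangolin.git
cd Pangolin
git checkout aff6883c83f3fd7e8268a9715e84266c42e2efe3
mkdir build && cd build
cmake ..
make -j4
sudo make install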

OpenCV

We use OpenCV to manipulate images and features. Download and install instructions can be found at: http://opencv.org. Required at least 3.0.
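
On Ubuntu 18.04 the packaged OpenCV 3.2 already satisfies this requirement; one possible way to install it and check the version (an assumption, any OpenCV >= 3.0 installation works):

sudo apt install libopencv-dev
pkg-config --modversion opencv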

Eigen3

Required by g2o (see below). Download and install instructions can be found at: http://eigen.tuxfamily.org. Required at least 3.1.0.
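
On Ubuntu 18.04 the packaged Eigen 3.3 is recent enough; for example (a suggestion, a source install from the link above works equally well):

sudo apt install libeigen3-dev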

DBoW2 and g2o (Included in Thirdparty folder)

We use modified versions of the DBoW2 library to perform place recognition and g2o library to perform non-linear optimizations. Both modified libraries (which are BSD) are included in the Thirdparty folder.

Python

Required to calculate the alignment of the trajectory with the ground truth. Requires the Numpy module.
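
On Ubuntu 18.04, where the system Python used by ROS Melodic is Python 2, Numpy can be installed with, for instance (an assumption; a pip install works as well):

sudo apt install python-numpy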

ROS

The code is interfaced with ROS and has been tested with ROS Melodic. An example file for a 2-agent scenario is provided, which can easily be modified to increase the number of agents.
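
Remember to source the ROS environment in every terminal used to build or run the ROS nodes, e.g. (assuming a standard Melodic installation):

source /opt/ros/melodic/setup.bash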

Building MAM3SLAM

Clone the repository:

git clone https://github.com/LaboratoireCosmerTOULON/MAM3SLAM.git

We provide a script build.sh to build the Thirdparty libraries and ORB-SLAM3. Please make sure you have installed all required dependencies. Execute:

cd MAM3SLAM
chmod +x build.sh
./build.sh

Build the ROS nodes:

  1. Add the path including Examples/ROS/MAM3SLAM to the ROS_PACKAGE_PATH environment variable. Open the .bashrc file:

    gedit ~/.bashrc

    and add the following line at the end. Replace PATH with the folder where you cloned MAM3SLAM:

    export ROS_PACKAGE_PATH=${ROS_PACKAGE_PATH}:PATH/MAM3SLAM/Examples/ROS
  2. Execute build_ros.sh script:

    chmod +x build_ros.sh
    ./build_ros.sh
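
After the build finishes, you can reload your shell configuration and check that ROS can locate the package (an optional sanity check):

source ~/.bashrc
rospack find MAM3SLAM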

Running MAM3SLAM with your camera input

In the current version, the camera needs to be interfaced with ROS. The steps needed to use your own camera are:

  1. Calibrate your camera

  2. Run MAM3SLAM. For a 2-agent scenario, you can run:

rosrun MAM3SLAM MonoMulti2 path_to_vocabulary path_to_settings_1 topic_1 path_to_settings_2 topic_2 [is_mono]
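
For example, with two agents publishing monocular images, a call might look like the line below (the vocabulary path, settings files and image topics are placeholders for illustration; each settings file contains the calibration of the corresponding camera in ORB-SLAM3's YAML settings format, for which the example settings shipped with ORB-SLAM3 are a good starting point):

rosrun MAM3SLAM MonoMulti2 Vocabulary/ORBvoc.txt agent1.yaml /agent1/image_raw agent2.yaml /agent2/image_raw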