
iRotate: an Active Visual SLAM Approach

iRotate: Active Visual SLAM for Omnidirectional Robots --- Published in Robotics and Autonomous Systems

Active Visual SLAM with Independently Rotating Camera --- Published in ECMR2021

Check our teasers here and here to get a quick overview of the project.



This repository contains the code of iRotate, a three-layered active V-SLAM method for ground robots.

It combines concepts of long-term planning (frontier exploration), viewpoint refinement (Next-Best-View, receding-horizon planners), and online refinement (feature tracking) in a seamless, structured framework. The result is a continuous refinement of the robot's heading that is always based on the latest map entropy and feature information. Thanks to that, we are able to fully explore an environment with paths that are up to 39% shorter!
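To make the layering concrete, below is a minimal, self-contained Python sketch of how the three layers can nest. All names and numbers in it (select_frontier, refine_heading_online, the 0.3 correction gain, the toy frontier/feature data) are hypothetical illustrations, not the implementation in this repository.

import math

# Toy stand-ins: frontiers as (x, y, expected_info_gain), features as (x, y).
frontiers = [(4.0, 1.0, 0.7), (1.0, 5.0, 0.9)]
features = [(2.0, 2.0), (2.5, 1.5)]
robot_xy = (0.0, 0.0)

def select_frontier(frontiers):
    # Layer 1 (long-term planning): pick the frontier with the highest gain.
    return max(frontiers, key=lambda f: f[2])

def refine_viewpoint(start, goal, steps=5):
    # Layer 2 (viewpoint refinement): waypoints toward the goal; here each
    # waypoint simply faces the goal, a stand-in for the NBV/entropy term.
    gx, gy, _ = goal
    x0, y0 = start
    for i in range(1, steps + 1):
        t = float(i) / steps
        x, y = x0 + t * (gx - x0), y0 + t * (gy - y0)
        yield x, y, math.atan2(gy - y, gx - x)

def refine_heading_online(waypoint, features):
    # Layer 3 (online refinement): nudge the heading toward the centroid
    # of the currently tracked features.
    x, y, theta = waypoint
    cx = sum(f[0] for f in features) / len(features)
    cy = sum(f[1] for f in features) / len(features)
    desired = math.atan2(cy - y, cx - x)
    return x, y, theta + 0.3 * (desired - theta)  # small correction gain

goal = select_frontier(frontiers)
for wp in refine_viewpoint(robot_xy, goal):
    x, y, theta = refine_heading_online(wp, features)
    print("waypoint: x=%.2f y=%.2f heading=%.1f deg" % (x, y, math.degrees(theta)))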

To allow continuous rotational movement, the system was first developed on an omnidirectional platform, our Robotino, which let us avoid kinematic constraints and plan freely in 3DOF.

Requiring an omnidirectional platform is, however, a big limitation. Therefore, by adding an independent camera rotation mechanism, we extended our algorithm to non-omnidirectional ground robots.

This project has been developed within the Robot Perception Group at the Max Planck Institute for Intelligent Systems, Tübingen.

The published papers are available in open access: iRotate and Independent Camera -- please cite us if you find this work useful.

As a bonus, here you can find a Gazebo controller for omnidirectional robots.
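For context, an omnidirectional base takes decoupled linear and angular velocity commands, which is what makes the independent heading refinement possible. Below is a minimal rospy sketch of such a command; the /cmd_vel topic name is an assumption, so check the controller's launch files for the actual topic.

#!/usr/bin/env python
# Minimal sketch: commanding an omnidirectional base in ROS.
# The /cmd_vel topic name is an assumption, not taken from this repository.
import rospy
from geometry_msgs.msg import Twist

rospy.init_node("omni_cmd_example")
pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)
rate = rospy.Rate(10)  # publish at 10 Hz

cmd = Twist()
cmd.linear.x = 0.2   # forward velocity [m/s]
cmd.linear.y = 0.1   # lateral velocity [m/s], only meaningful on omnidirectional bases
cmd.angular.z = 0.5  # heading rate [rad/s], decoupled from the direction of travel

while not rospy.is_shutdown():
    pub.publish(cmd)
    rate.sleep()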


The code can be run on both ROS Melodic/Ubuntu 18.04 and ROS Noetic/Ubuntu 20.04.


Getting Started

These instructions will help you set up both the simulation environment and a real-robot scenario for development and testing purposes.

Branch descriptions

Installation

Please check the detailed instructions here.

How to run

You can find more detailed instructions here. The following is just an example of what we ran during our experiments; not all commands are mandatory.

Example command list:

roslaunch robotino_simulations world.launch name:="cafe" gui:=true  # Gazebo simulation world
roslaunch robotino_mpc robotino_mpc.launch  # NMPC robot controller
roslaunch robotino_simulations rtabmap.launch delete:=-d  # RTAB-Map SLAM (delete:=-d clears the previous database)
[optional, recorder] rosrun robotino_simulations rtabmap_eval.py  # records statistics for later evaluation
[optional, timer] python src/pause.py && ./back.sh 'main_folder' 'experiment_name' 'session_number'  # or python3; timed pause, then back up the results
roslaunch active_slam active_node.launch  # active SLAM / frontier exploration node
roslaunch robotino_camera_heading best_heading.launch  # camera heading (NBV) refinement
roslaunch robotino_fsm robotino_fsm.launch kind:=7 only_last_set:=false pre_fix:=false mid_optimizer:=false weighted_avg:=false  # finite-state machine coordinating the experiment
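If you prefer a single entry point over several terminals, a simple Python wrapper around the commands above could look like the sketch below. This is a convenience illustration only; the 5-second stagger between processes is an arbitrary choice, not a documented requirement.

#!/usr/bin/env python
# Hypothetical convenience launcher for the command list above.
import subprocess
import time

commands = [
    ["roslaunch", "robotino_simulations", "world.launch", "name:=cafe", "gui:=true"],
    ["roslaunch", "robotino_mpc", "robotino_mpc.launch"],
    ["roslaunch", "robotino_simulations", "rtabmap.launch", "delete:=-d"],
    ["roslaunch", "active_slam", "active_node.launch"],
    ["roslaunch", "robotino_camera_heading", "best_heading.launch"],
    ["roslaunch", "robotino_fsm", "robotino_fsm.launch", "kind:=7",
     "only_last_set:=false", "pre_fix:=false", "mid_optimizer:=false",
     "weighted_avg:=false"],
]

procs = []
for cmd in commands:
    procs.append(subprocess.Popen(cmd))
    time.sleep(5)  # crude stagger so each component has time to come up

try:
    for p in procs:
        p.wait()
except KeyboardInterrupt:
    for p in procs:
        p.terminate()  # shut everything down on Ctrl-C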

Custom robot system

You might want to run the code on your own robot. Here you can find some hints on what needs to be adapted.

Evaluation

Reproducing results

Results achieved in real-world experiments always depend on the hardware in question as well as on environmental factors on the day of the experiment. However, our simulated results were averaged over a large number of identical experiments and should be reproducible by third parties.

We have uploaded our data here for the iRotate experiments, and here for the independent camera rotation ones.

If you want to keep track of your own results, you may want to check this document, which contains our evaluation procedure.
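As an illustration of how metrics can be averaged over repeated runs, here is a small Python sketch. The file name experiment_results.csv and the path_length column are hypothetical placeholders, not the output format of rtabmap_eval.py or of our evaluation procedure.

#!/usr/bin/env python3
# Hypothetical sketch: averaging a metric over repeated simulated runs.
import csv
import statistics

path_lengths = []
with open("experiment_results.csv") as f:  # hypothetical file, one row per run
    for row in csv.DictReader(f):
        path_lengths.append(float(row["path_length"]))

print("runs:", len(path_lengths))
print("mean path length: %.2f m" % statistics.mean(path_lengths))
print("std dev:          %.2f m" % statistics.stdev(path_lengths))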

Included external packages / sources

Citation

If you find this work useful, or you use part of the developed code, please cite us.

@article{BONETTO2022104102,
  title = {iRotate: Active visual SLAM for omnidirectional robots},
  journal = {Robotics and Autonomous Systems},
  pages = {104102},
  year = {2022},
  issn = {0921-8890},
  doi = {10.1016/j.robot.2022.104102},
  url = {https://www.sciencedirect.com/science/article/pii/S0921889022000550},
  author = {Elia Bonetto and Pascal Goldschmid and Michael Pabst and Michael J. Black and Aamir Ahmad},
  keywords = {View planning for SLAM, Vision-based navigation, SLAM}
}

@inproceedings{9568791,
  author = {Bonetto, Elia and Goldschmid, Pascal and Black, Michael J. and Ahmad, Aamir},
  booktitle = {2021 European Conference on Mobile Robots (ECMR)},
  title = {Active Visual SLAM with Independently Rotating Camera},
  year = {2021},
  pages = {1-8},
  doi = {10.1109/ECMR50962.2021.9568791}
}

License

All code in this repository, unless otherwise stated in local license files or code headers, is

Copyright 2021 Max Planck Institute for Intelligent Systems, Tübingen.

Licensed under the terms of the GNU General Public License (GPL) v3 or higher. See: https://www.gnu.org/licenses/gpl-3.0.en.html