
Fast-Planner

Fast-Planner is developed aiming to enable quadrotor fast flight in complex unknown environments. It contains a rich set of carefully designed planning algorithms.

News:

Authors: Boyu Zhou and Shaojie Shen from the HKUST Aerial Robotics Group, Fei Gao from ZJU FAST Lab.

Complete videos: video1, video2, video3. Demonstrations of this work have been reported on IEEE Spectrum: page1, page2, page3 (search for HKUST in the pages).

To run this project in minutes, check Quick Start. Check other sections for more detailed information.

Please kindly star :star: this project if it helps you. We put great effort into developing and maintaining it :grin::grin:.

Table of Contents

1. Quick Start

The project has been tested on Ubuntu 16.04 (ROS Kinetic) and 18.04 (ROS Melodic). Taking Ubuntu 18.04 as an example, run the following commands to set up the project:

  sudo apt-get install libarmadillo-dev ros-melodic-nlopt
  cd ${YOUR_WORKSPACE_PATH}/src
  git clone https://github.com/HKUST-Aerial-Robotics/Fast-Planner.git
  cd ../ 
  catkin_make

You may check the detailed instructions to set up the project. After compilation, you can start the visualization by:

  source devel/setup.bash && roslaunch plan_manage rviz.launch

and start a simulation (run in a new terminal):

  source devel/setup.bash && roslaunch plan_manage kino_replan.launch

You will find the random map and the drone in Rviz. You can select goals for the drone to reach using the 2D Nav Goal tool. A sample simulation is shown here.

2. Algorithms and Papers

The project contains a collection of robust and computationally efficient algorithms for quadrotor fast flight:

These methods are detailed in our papers listed below.

Please cite at least one of our papers if you use this project in your research: Bibtex.

All planning algorithms along with other key modules, such as mapping, are implemented in __fast_planner__:

Besides the folder fast_planner, a lightweight uav_simulator is used for testing.

3. Setup and Config

Prerequisites

  1. Our software is developed and tested on Ubuntu 16.04 (ROS Kinetic) and 18.04 (ROS Melodic). Follow the official documentation to install Kinetic or Melodic according to your Ubuntu version.

  2. We use NLopt to solve the non-linear optimization problem. The __uav_simulator__ depends on the C++ linear algebra library __Armadillo__. Both dependencies can be installed with the following command, in which ${ROS_VERSION_NAME} is the name of your ROS release.

    sudo apt-get install libarmadillo-dev ros-${ROS_VERSION_NAME}-nlopt

Build on ROS

After the prerequisites are satisfied, you can clone this repository to your catkin workspace and catkin_make. A new workspace is recommended:

  cd ${YOUR_WORKSPACE_PATH}/src
  git clone https://github.com/HKUST-Aerial-Robotics/Fast-Planner.git
  cd ../
  catkin_make

If you encounter problems in this step, please first refer to existing issues, pull requests and Google before raising a new issue.

Now you are ready to run a simulation.

Use GPU Depth Rendering (optional)

This step is not mandatory for running the simulations. However, if you want to use the more realistic depth camera in __uav_simulator__, the CUDA Toolkit needs to be installed. Otherwise, a less realistic depth sensor model will be used.

The local_sensing package in __uav_simulator__ can use either the GPU or the CPU to render the depth sensor measurements. By default, the CPU version is selected in its CMakeLists:

 set(ENABLE_CUDA false)
 # set(ENABLE_CUDA true)

However, we STRONGLY recommend the GPU version, because it generates depth images much more like a real depth camera. To enable GPU depth rendering, set ENABLE_CUDA to true, and remember to change the 'arch' and 'code' flags according to your graphics card. You can check the right code here.

    set(CUDA_NVCC_FLAGS 
      -gencode arch=compute_61,code=sm_61;
    ) 
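
For example, a Turing-class GPU such as an RTX 2080 (compute capability 7.5) would use the flags below. This is only an illustration; look up the compute capability of your own card before copying it:

    # illustrative flags for a compute capability 7.5 card (e.g. RTX 2080)
    set(CUDA_NVCC_FLAGS 
      -gencode arch=compute_75,code=sm_75;
    )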

For installation of CUDA, please refer to the CUDA Toolkit page.

4. Run Simulations

First, run Rviz with our configuration:

  <!-- go to your workspace and run: -->
  source devel/setup.bash
  roslaunch plan_manage rviz.launch

Then run the quadrotor simulator and Fast-Planner. Several examples are provided below:

Kinodynamic Path Searching & B-spline Optimization

In this method, kinodynamic path searching finds a safe, dynamically feasible, and minimum-time initial trajectory in the discretized control space. Then the smoothness and clearance of the trajectory are improved by B-spline optimization. To test this method, run:

  <!-- open a new terminal, go to your workspace and run: -->
  source devel/setup.bash
  roslaunch plan_manage kino_replan.launch

Normally, you will find the randomly generated map and the drone model in Rviz. At this time, you can trigger the planner using the 2D Nav Goal tool. When a point is clicked in Rviz, a new trajectory will be generated immediately and executed by the drone. A sample is displayed below:
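
If you prefer to trigger the planner from the command line instead of clicking in Rviz, you can publish a goal pose directly. The sketch below assumes the default Rviz 2D Nav Goal topic /move_base_simple/goal and a world frame; check the launch files for the actual topic and frame names used in your setup:

  <!-- open a new terminal, source the workspace, then publish a goal once; -->
  <!-- topic name and frame_id are assumptions and may need adjusting -->
  rostopic pub -1 /move_base_simple/goal geometry_msgs/PoseStamped '{header: {frame_id: "world"}, pose: {position: {x: 5.0, y: 0.0, z: 1.0}, orientation: {w: 1.0}}}'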

Related algorithms are detailed in this paper.

Topological Path Searching & Path-guided Optimization

This method features searching for multiple trajectories in distinct topological classes. Thanks to this strategy, the solution space is explored more thoroughly, avoiding local minima and yielding better solutions. Similarly, run:

  <!-- open a new terminal, go to your workspace and run: -->
  source devel/setup.bash
  roslaunch plan_manage topo_replan.launch

Then you will find the randomly generated map and can use the 2D Nav Goal tool to trigger the planner:

Related algorithms are detailed in this paper.

Perception-aware Replanning

The code will be released after the publication of the associated paper.

5. Use in Your Application

If you have successfully run the simulation and want to use Fast-Planner in your own project, please explore the files kino_replan.launch and topo_replan.launch. They contain and document the important parameters that you may need to change for your application.

Note that in our configuration, the depth image size is 640x480. For higher map fusion efficiency we downsample it (in kino_algorithm.xml, skip_pixel = 2). If you use depth images with a lower resolution (such as 256x144), you may disable the downsampling by setting skip_pixel = 1. Also, the depth_scaling_factor is set to 1000, which may need to be changed according to your device.
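
As a rough sketch, the relevant entries in kino_algorithm.xml might look like the following; the parameter names are taken from the paragraph above, but the actual namespaces and attribute layout in your copy of the file may differ:

  <!-- illustrative only: disable downsampling and keep the default depth scaling -->
  <param name="skip_pixel" value="1" type="int"/>
  <param name="depth_scaling_factor" value="1000.0" type="double"/>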

Finally, for setup problems, such as compilation errors caused by different versions of ROS/Eigen, please first refer to existing issues, pull requests, and Google before raising a new issue. Insignificant issues will receive no reply.

6. Updates

Known issues

Compilation issue

When running this project on Ubuntu 20.04, C++14 is required. Please add the following line to all CMakeLists.txt files:

set(CMAKE_CXX_STANDARD 14)
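
For example, near the top of a package's CMakeLists.txt (after the project() call) this could look like the sketch below; it is not the exact content of the project files, and if a file already forces an older standard such as -std=c++11, update or remove that flag as well:

  cmake_minimum_required(VERSION 2.8.3)
  project(bspline_opt)

  # Ubuntu 20.04 ships compiler/ROS versions that require C++14
  set(CMAKE_CXX_STANDARD 14)
  set(CMAKE_CXX_STANDARD_REQUIRED ON)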

Unexpected crash

If the planner dies after triggering a 2D Nav Goal, it is possibly caused by the ros-nlopt library. In this case, we recommend uninstalling it and installing NLopt following the official documentation. Then, in the CMakeLists.txt of the bspline_opt package, change the associated lines to link against the NLopt library:

find_package(NLopt REQUIRED)
set(NLopt_INCLUDE_DIRS ${NLOPT_INCLUDE_DIR})

...

include_directories( 
    SYSTEM 
    include 
    ${catkin_INCLUDE_DIRS}
    ${Eigen3_INCLUDE_DIRS} 
    ${PCL_INCLUDE_DIRS}
    ${NLOPT_INCLUDE_DIR}
)

...

add_library( bspline_opt 
    src/bspline_optimizer.cpp 
    )
target_link_libraries( bspline_opt
    ${catkin_LIBRARIES} 
    ${NLOPT_LIBRARIES}
    # /usr/local/lib/libnlopt.so
    )  
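
For reference, a typical from-source NLopt installation (following the steps in the official documentation; the clone location and default install prefix below are assumptions) looks like:

  # build and install NLopt from source
  git clone https://github.com/stevengj/nlopt.git
  cd nlopt
  mkdir build && cd build
  cmake ..
  make
  sudo make install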

Acknowledgements

We use NLopt for non-linear optimization.

Licence

The source code is released under GPLv3 license.

Disclaimer

This is research code; it is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of merchantability or fitness for a particular purpose.