Deep-PANTHER deployed in different environments. The policy in all the videos above is the same one; it was trained using an obstacle that followed a trefoil-knot trajectory. The green pyramid represents the camera's field of view.
When using Deep-PANTHER, please cite Deep-PANTHER: Learning-Based Perception-Aware Trajectory Planner in Dynamic Environments (pdf and video):
@article{tordesillas2023deep,
title={{Deep-PANTHER}: Learning-based perception-aware trajectory planner in dynamic environments},
author={Tordesillas, Jesus and How, Jonathan P},
journal={IEEE Robotics and Automation Letters},
year={2023},
publisher={IEEE}
}
Two pull requests solve the issues that appear with newer versions of gym (gymnasium), CasADi, and MATLAB. They also introduce a way to use Deep-PANTHER in 2D (and not just 3D). Both pull requests have been merged into the develop branch.
Deep-PANTHER has been tested with Ubuntu 20.04/ROS Noetic. Other Ubuntu/ROS versions may need some minor modifications; feel free to create an issue if you have any problems.
The instructions below assume that you have ROS Noetic installed on your Linux machine.
Note: the instructions below are partly taken from here
sudo apt-get install gcc g++ gfortran git cmake liblapack-dev pkg-config --install-recommends
sudo apt-get install coinor-libipopt1v5 coinor-libipopt-dev
sudo apt-get remove swig swig3.0 swig4.0 #If you don't do this, the compilation of casadi may fail with the error "swig error : Unrecognized option -matlab"
mkdir ~/installations && cd ~/installations
git clone https://github.com/jaeandersson/swig
cd swig
git checkout -b matlab-customdoc origin/matlab-customdoc
sh autogen.sh
sudo apt-get install gcc-7 g++-7 bison byacc
sudo apt-get install libpcre3 libpcre3-dev
./configure CXX=g++-7 CC=gcc-7
make
sudo make install
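After `sudo make install`, it can save debugging time later to confirm that the SWIG you just built (from the matlab-customdoc branch, which CasADi's MATLAB bindings need) is the one resolved on your PATH. This is only an optional sanity check, not part of the original instructions:

```shell
# Check whether a swig binary is on PATH and, if so, report which one and its version.
if command -v swig >/dev/null 2>&1; then
  SWIG_STATUS="found: $(command -v swig) ($(swig -version 2>/dev/null | head -n 2 | tail -n 1))"
else
  SWIG_STATUS="not found on PATH"
fi
echo "swig $SWIG_STATUS"
```

If it reports a system SWIG (e.g. from apt) instead of `/usr/local/bin/swig`, the earlier `apt-get remove swig ...` step was likely skipped.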
cd ~/installations && mkdir casadi && cd casadi
git clone https://github.com/casadi/casadi
cd casadi
#cd build && make clean && cd .. && rm -rf build #Only if you want to clean any previous installation/compilation
mkdir build && cd build
cmake -DCMAKE_BUILD_TYPE=Release -DWITH_IPOPT=ON -DWITH_MATLAB=OFF -DWITH_PYTHON=ON -DWITH_DEEPBIND=ON ..
#You may need to run the command above twice until the output says that `Ipopt` has been detected (although `IPOPT` is also being detected when you run it for the first time)
make -j20
sudo make install
sudo apt-get install python3-venv
cd ~/installations && mkdir venvs_python && cd venvs_python
python3 -m venv ./my_venv
printf '\nalias activate_my_venv="source ~/installations/venvs_python/my_venv/bin/activate"' >> ~/.bashrc
source ~/.bashrc
activate_my_venv
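All the `pip install` commands that follow assume the virtual environment is active. If you are unsure, a quick stdlib-only check (not part of the original instructions) is to compare `sys.prefix` against `sys.base_prefix`, which differ inside a venv:

```python
import sys

def in_virtualenv():
    """Return True when running inside a virtual environment.

    Inside a venv, sys.prefix points at the venv directory while
    sys.base_prefix still points at the base interpreter.
    """
    return sys.prefix != getattr(sys, "base_prefix", sys.prefix)

print("virtualenv active:", in_virtualenv())
```

Running this after `activate_my_venv` should print `virtualenv active: True`.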
And finally download the repo and compile it:
sudo apt-get install git-lfs ccache
cd ~/Desktop/
mkdir ws && cd ws && mkdir src && cd src
git clone https://github.com/mit-acl/deep_panther
cd deep_panther
git lfs install
git submodule init && git submodule update
cd panther_compression/imitation
pip install numpy Cython wheel seals rospkg defusedxml empy pyquaternion pytest
pip install -e .
sudo apt-get install python3-catkin-tools #To use catkin build
sudo apt-get install ros-"${ROS_DISTRO}"-rviz-visual-tools ros-"${ROS_DISTRO}"-pybind11-catkin ros-"${ROS_DISTRO}"-tf2-sensor-msgs ros-"${ROS_DISTRO}"-jsk-rviz-plugins
cd ~/Desktop/ws/
catkin build
printf '\nsource PATH_TO_YOUR_WS/devel/setup.bash' >> ~/.bashrc #Remember to change PATH_TO_YOUR_WS
printf '\nexport PYTHONPATH="${PYTHONPATH}:$(rospack find panther)/../panther_compression"' >> ~/.bashrc
source ~/.bashrc
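The `PYTHONPATH` export above appends the `panther_compression` folder so its Python modules resolve at import time. As a rough sketch of what that line does (the paths below are hypothetical examples, not the ones on your machine):

```python
import os

def extend_pythonpath(existing, new_entry):
    """Return a PYTHONPATH value with new_entry appended, using the
    platform's path separator (':' on Linux), mirroring the export line."""
    return new_entry if not existing else existing + os.pathsep + new_entry

# Hypothetical example paths:
print(extend_pythonpath("/opt/ros/noetic/lib/python3/dist-packages",
                        "/home/user/Desktop/ws/src/deep_panther/panther_compression"))
```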
Simply use:
roslaunch panther simulation.launch
Wait until the terminal says Planner initialized. Then, you can press G (or click the option 2D Nav Goal on the top bar of RVIZ) and click any goal for the drone. By default, simulation.launch will use the policy Hung_dynamic_obstacles.pt (which was trained with trefoil-knot trajectories). You can change the trajectory followed by the obstacle during testing using the type_of_obst_traj field of the launch file.
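For intuition, the trefoil-knot trajectory mentioned above can be written parametrically. The sketch below uses the standard trefoil parametrization with illustrative scale/offset values, not the exact ones used in the launch file:

```python
import math

def trefoil_point(t, scale=1.0, center=(0.0, 0.0, 0.0)):
    """Point on a trefoil knot at parameter t (standard parametrization;
    scale and center are illustrative, not the simulator's values)."""
    x = math.sin(t) + 2.0 * math.sin(2.0 * t)
    y = math.cos(t) - 2.0 * math.cos(2.0 * t)
    z = -math.sin(3.0 * t)
    cx, cy, cz = center
    return (cx + scale * x, cy + scale * y, cz + scale * z)

# Sample the closed curve over one period [0, 2*pi)
trajectory = [trefoil_point(2.0 * math.pi * k / 100) for k in range(100)]
```

The curve is periodic, so an obstacle following it loops forever through the same knotted path.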
You can also use policies trained using a static obstacle. Simply change the field student_policy_path of simulation.launch. The available policies have the format A_epsilon_B.pt, where A is the algorithm used (Hungarian, i.e., LSA; RWTAc; or RWTAr) and B is the epsilon used. Note that this epsilon is irrelevant for the LSA algorithm. Check the paper for further details.
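The A_epsilon_B.pt naming convention can be parsed mechanically. This is a small illustrative helper (the filenames below are examples of the format, not necessarily files shipped in the repo):

```python
def parse_policy_name(filename):
    """Split a policy filename of the form A_epsilon_B.pt into
    (algorithm, epsilon), e.g. 'RWTAc_epsilon_0.05.pt' -> ('RWTAc', 0.05)."""
    stem = filename[:-3] if filename.endswith(".pt") else filename
    algorithm, epsilon = stem.split("_epsilon_")
    return algorithm, float(epsilon)

print(parse_policy_name("RWTAc_epsilon_0.05.pt"))
```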
If you want to...
Use the expert: You first need to install a linear solver (see instructions below). Then, you can use the expert by simply setting use_expert: true, use_student: false, and pause_time_when_replanning: true in panther.yaml and running roslaunch panther simulation.launch.
Modify the optimization problem: You will need to have MATLAB installed (specifically, you will need the Symbolic Math Toolbox and the Phased Array System Toolbox), and to follow the steps detailed in the MATLAB section below. You can then make any modification in the optimization problem by modifying the file main.m and running it. This will generate all the necessary .casadi files in the casadi_generated_files folder, which will be read by the C++ code.
Train the policy: You first need to install a linear solver (see instructions below). Then, you can train a new policy by simply running python3 policy_compression_train.py inside the panther_compression folder.