Simulation code for my undergraduate thesis:
Perception-constrained Visual Servoing Based NMPC for Quadrotor Flight
https://github.com/user-attachments/assets/533cb507-51b4-43ee-a9b6-e20464969bbf
In this project, I use Image-Based Visual Servoing (IBVS) to control the translation and rotation of a quadrotor UAV: the desired velocity v is computed from the image-moment error, and that velocity is tracked by a Nonlinear Model Predictive Controller (NMPC). The field-of-view constraint is also taken into account to prevent the loss of visual features.
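The IBVS part above maps an image-moment error to a desired camera velocity. The following is a minimal, hypothetical sketch of that idea in plain NumPy (the function name, gain, and the centroid/area feature choice are illustrative; the actual repository derives its moments from the detected circle or ArUco marker):

```python
import numpy as np

def ibvs_velocity(features_px, desired_px, focal=1.0, depth=1.0, gain=0.5):
    """Proportional IBVS law on simple image-moment features.

    features_px / desired_px: Nx2 current / desired pixel coordinates of the
    tracked marker points (hypothetical interface, for illustration only).
    Returns a desired camera-frame translational velocity [vx, vy, vz].
    """
    cur = np.asarray(features_px, dtype=float)
    des = np.asarray(desired_px, dtype=float)
    # Centroid (first-order moments) and a bounding-box "area" feature that
    # plays the role of the zeroth-order moment for depth regulation.
    cx, cy = cur.mean(axis=0)
    dx, dy = des.mean(axis=0)
    a_cur = np.sqrt(np.prod(cur.max(axis=0) - cur.min(axis=0)))
    a_des = np.sqrt(np.prod(des.max(axis=0) - des.min(axis=0)))
    # Feature error s - s*, scaled to metric units with an assumed depth.
    err = np.array([(cx - dx) * depth / focal,
                    (cy - dy) * depth / focal,
                    a_cur - a_des])
    # Classical exponential error decrease: v = -gain * error.
    return -gain * err
```

This desired velocity is then handed to the NMPC as the tracking reference, rather than being applied to the drone directly.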
Python 3
OpenCV 4+
ROS Noetic
Gazebo - ROS Noetic
wget -c https://raw.githubusercontent.com/qboticslabs/ros_install_noetic/master/ros_install_noetic.sh && chmod +x ./ros_install_noetic.sh && ./ros_install_noetic.sh
sudo apt-get install ros-noetic-desktop-full ros-noetic-joy ros-noetic-octomap-ros ros-noetic-mavlink protobuf-compiler libgoogle-glog-dev ros-noetic-control-toolbox python3-wstool python3-catkin-tools
sudo curl https://bootstrap.pypa.io/get-pip.py | sudo python3   # pip for root
curl https://bootstrap.pypa.io/get-pip.py | python3             # pip for the current user
echo 'export PATH=$PATH:"$HOME/.local/bin"' >> ~/.bashrc
source ~/.bashrc
sudo python3 -m pip install -U rosdep catkin_pkg future   # system-wide tools
python3 -m pip install -U rosdep catkin_pkg future empy defusedxml numpy matplotlib imageio opencv-python   # user-level Python dependencies
git clone https://gitee.com/Hang_SJTU/ibvs_nmpc_px4_ws.git
cd ibvs_nmpc_px4_ws
catkin build
Note: after compiling the workspace, you will find 'librealsense_gazebo_plugin.so' in the workspace's devel/lib directory (it may instead be named 'librealsense_ros_gazebo.so'; if so, rename it to 'librealsense_gazebo_plugin.so'). Copy it to /opt/ros/noetic/lib/.
source devel/setup.bash
Spawn the drone and the circle/aruco world:
roslaunch simulation_iris_circle.launch # for circle world
roslaunch simulation_iris_aruco.launch # for aruco world
Detect the target:
roslaunch ibvs_pkg ibvs_circle.launch # for circle world
roslaunch ibvs_pkg ibvs_aruco.launch # for aruco world
Control the drone:
roslaunch mpc_pkg mpc_acados_controller.launch # nmpc based on acados
roslaunch mpc_pkg mpc_acado_controller.launch # nmpc based on ACADO (worse results)
Note: given the desired velocity and the attitude estimated by VIO from IMU feedback, the NMPC problem is built and solved either with C++ code generated by ACADO or with the acados Python interface. The solver outputs a collective thrust and three body rates, which are sent to the low-level controller.
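The prediction model inside such an NMPC can be sketched as below. This is a simplified, hypothetical stand-in (Euler-angle attitude, illustrative mass and step size) for the actual ACADO/acados model in the repository; it only shows how thrust and body rates drive the predicted velocity:

```python
import numpy as np

G = 9.81  # gravitational acceleration [m/s^2]

def quad_dynamics(x, u, mass=1.5):
    """Simplified model: state x = [vx, vy, vz, roll, pitch, yaw],
    input u = [thrust, p, q, r] (collective thrust + body rates),
    matching the control interface described above. Mass is illustrative."""
    vx, vy, vz, phi, th, psi = x
    T, p, q, r = u
    # Body-z thrust rotated into the world frame (ZYX Euler angles).
    ax = (np.cos(phi)*np.sin(th)*np.cos(psi) + np.sin(phi)*np.sin(psi)) * T/mass
    ay = (np.cos(phi)*np.sin(th)*np.sin(psi) - np.sin(phi)*np.cos(psi)) * T/mass
    az = np.cos(phi)*np.cos(th) * T/mass - G
    # Euler-angle kinematics driven by the commanded body rates.
    phid = p + np.sin(phi)*np.tan(th)*q + np.cos(phi)*np.tan(th)*r
    thd = np.cos(phi)*q - np.sin(phi)*r
    psid = (np.sin(phi)*q + np.cos(phi)*r) / np.cos(th)
    return np.array([ax, ay, az, phid, thd, psid])

def rk4_step(x, u, dt=0.05):
    """One RK4 integration step; the NMPC stacks N such steps over its
    horizon to predict how well the desired velocity will be tracked."""
    k1 = quad_dynamics(x, u)
    k2 = quad_dynamics(x + 0.5*dt*k1, u)
    k3 = quad_dynamics(x + 0.5*dt*k2, u)
    k4 = quad_dynamics(x + dt*k3, u)
    return x + dt/6.0 * (k1 + 2*k2 + 2*k3 + k4)
```

At hover thrust (T = m·g) with zero attitude and zero body rates, the state stays at rest, which is a quick sanity check for such a model.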
Please use the following BibTeX entry if you find this repo helpful and would like to cite it:
@misc{ibvsnmpcpx4,
  author       = {Zhang, Yuanhang},
  title        = {Perception-constrained Visual Servoing Based NMPC for Quadrotor Flight},
  year         = {2023},
  publisher    = {GitHub},
  journal      = {GitHub repository},
  howpublished = {\url{https://github.com/hang0610/ibvs_nmpc_px4}},
}