KumarRobotics / kr_autonomous_flight

KR (KumarRobotics) autonomous flight system for GPS-denied quadrotors

how to replace lidar with realsense depth camera #184

Closed panchal-harsh closed 1 year ago

XuRobotics commented 1 year ago

Are you running on a real robot, or in Gazebo simulation? Either way, there are two things you will need to change:

1. Point cloud topic: change it to whatever topic is published by the RealSense.
   - For Gazebo, see here: https://github.com/KumarRobotics/kr_autonomous_flight/blob/9d9a87eb84ff47b77242cd457d367001d5b8ac49/autonomy_sim/gazebo_sim/gazebo_utils/launch/full_sim.launch#L12
   - For a real robot, see here: https://github.com/KumarRobotics/kr_autonomous_flight/blob/9d9a87eb84ff47b77242cd457d367001d5b8ac49/autonomy_real/real_experiment_launch/launch/full_autonomy.launch#L65

2. Rigid body transform from the point cloud frame (RealSense) to the robot body frame. In simulation this should work directly, since Gazebo publishes the static TF from the robot URDF you specify. For a real robot, see here: https://github.com/KumarRobotics/kr_autonomous_flight/blob/9d9a87eb84ff47b77242cd457d367001d5b8ac49/autonomy_real/real_experiment_launch/launch/publish_tf.launch#L14 (a sketch covering both changes follows this list).
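
For illustration, here is a rough roslaunch sketch of both changes. The argument name `cloud_topic`, the mapper package/node names, the frame names, and the mounting offsets are all placeholders rather than the actual names used in the linked launch files; check those files for the real topic argument and the existing static transform publisher.

```xml
<launch>
  <!-- Placeholder arg: set it to whatever topic your RealSense driver publishes,
       e.g. /camera/depth/color/points for realsense2_camera (verify on your setup). -->
  <arg name="cloud_topic" default="/camera/depth/color/points"/>

  <!-- Placeholder mapper node: the important part is remapping its point cloud
       input from the LIDAR topic to the RealSense topic. -->
  <node pkg="your_mapper_pkg" type="your_mapper_node" name="voxel_mapper" output="screen">
    <remap from="cloud" to="$(arg cloud_topic)"/>
  </node>

  <!-- Static transform from the robot body frame to the camera's point cloud frame,
       analogous to publish_tf.launch. Args: x y z yaw pitch roll parent child.
       The offsets below are examples; measure how the camera is mounted on your robot. -->
  <node pkg="tf2_ros" type="static_transform_publisher" name="body_to_camera_tf"
        args="0.10 0.0 0.05 0 0 0 body camera_depth_optical_frame"/>
</launch>
```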

XuRobotics commented 1 year ago

One thing to note is that our current planner assumes the sensor has a good 360-degree view of obstacles in the environment. This is easy to achieve with a LIDAR or omnidirectional cameras, but with a single depth camera you may want to (1) keep the sensor's facing direction aligned with the direction of flight at all times and (2) accumulate a longer history of point clouds. Our stack does allow you to achieve both goals.

For (1): make sure you set `align_yaw` to true (in the YAML file linked below), and possibly set `yaw_speed_magnitude` to a higher value so that the UAV aligns its heading with its direction of motion fast enough to see obstacles.

tracker_params_mp.yaml (https://github.com/KumarRobotics/kr_autonomous_flight/blob/master/autonomy_core/control/control_launch/config/tracker_params_mp.yaml).
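
A minimal sketch of the relevant entries in tracker_params_mp.yaml, assuming the two parameters sit at the level shown here; the exact nesting and the default value of `yaw_speed_magnitude` may differ in the actual file:

```yaml
# Only the two parameters discussed above are shown; the rest of
# tracker_params_mp.yaml is omitted. The yaw_speed_magnitude value is
# illustrative, not the stack's default.
align_yaw: true            # keep the UAV's heading aligned with its direction of flight
yaw_speed_magnitude: 1.5   # rad/s (example); increase so the heading catches up quickly
```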

For (2): increase the decay durations for the local / global voxel mapper, see the following:
- Global: https://github.com/KumarRobotics/kr_autonomous_flight/blob/9d9a87eb84ff47b77242cd457d367001d5b8ac49/autonomy_core/map_plan/map_plan_launch/config/mapper.yaml#L17
- Local: https://github.com/KumarRobotics/kr_autonomous_flight/blob/9d9a87eb84ff47b77242cd457d367001d5b8ac49/autonomy_core/map_plan/map_plan_launch/config/mapper.yaml#L27
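
As a sketch, the relevant part of mapper.yaml might look like the following. The key name `decay_times_to_empty` and the global/local layout are assumptions based on the linked lines, not confirmed names; check the actual file for the exact keys and units.

```yaml
# Assumed key names and layout; verify against the linked mapper.yaml lines.
# Larger decay durations keep accumulated depth-camera points in the map longer,
# which compensates for the camera's limited field of view.
global:
  decay_times_to_empty: 30.0   # seconds before a global-map voxel is cleared (example value)
local:
  decay_times_to_empty: 10.0   # seconds before a local-map voxel is cleared (example value)
```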

In the near future, our team will integrate a perception-aware planning algorithm into the stack to guarantee safety for sensors with a limited field of view.

panchal-harsh commented 1 year ago

Thank you for your response. I am trying to run on a real robot. In this setup an external IMU is used, but can I just use the Pixhawk IMU together with the Intel RealSense's IMU? Will the accuracy still be satisfactory, and where in the code would the changes need to be made? I just want to use an Intel RealSense, a Pixhawk 4, a Jetson Nano, and a companion computer to perform autonomous flight. Is that possible with this stack, and where must the changes be made for that?