matlabbe / rtabmap_drone_example

Example of using move_base with mavros/px4 and rtabmap visual SLAM
BSD 3-Clause "New" or "Revised" License

Real drone - d455 #16

Closed Murali25111993 closed 4 months ago

Murali25111993 commented 5 months ago

Hello,

I have tried the simulation and it is working fine. Now I want to try it on a real drone.

Setup: RealSense D455, Jetson Orin NX (Ubuntu 20.04, ROS Noetic), PX4 v1.12.3

I have tested the official offboard example with GPS and it is working fine. Now I want to try indoors. Since you were using the R200 color and depth images + /mavros/imu/data and px4flow, I want to try with the D455 RGB-D + mavros/imu, without px4flow.

What changes do I need to make?

matlabbe commented 5 months ago

You can adjust the camera topics here to the ones from realsense2: https://github.com/matlabbe/rtabmap_drone_example/blob/eaeeb6f723cd0b0307575d3969f2cb5df42f928e/launch/slam.launch#L29-L31 and here: https://github.com/matlabbe/rtabmap_drone_example/blob/eaeeb6f723cd0b0307575d3969f2cb5df42f928e/launch/slam.launch#L41-L42
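
For example, with a D455 driven by realsense2_camera (launched with align_depth:=true), the values would typically point to topics like the ones below; the exact argument names depend on what slam.launch exposes, so treat this as a sketch:

 <arg name="rgb_topic"         value="/camera/color/image_raw" />
 <arg name="depth_topic"       value="/camera/aligned_depth_to_color/image_raw" />
 <arg name="camera_info_topic" value="/camera/color/camera_info" />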

Otherwise, the rest would be fairly the same.

What are you referring to by "px4flow"?

Murali25111993 commented 5 months ago

I mean, are you using OptiTrack in simulation (link)? I ask since there are references to mocap and px4flow in px4_config.yaml:

https://github.com/matlabbe/rtabmap_drone_example/blob/master/config/px4_config.yaml#L190

https://github.com/matlabbe/rtabmap_drone_example/blob/master/config/px4_config.yaml#L212

Anyhow, I will implement the changes you mentioned above and update you with the results.

Thanks

matlabbe commented 5 months ago

I took a basic px4_config.yaml and changed what I needed. The mocap and px4flow sections are not used; a lot of stuff in that config is actually not used. The file it is based on is this one: https://github.com/mavlink/mavros/blob/1.9.0/mavros/launch/px4_config.yaml. You may do a diff to see the differences (I think the main one is that vision_pose is in the map frame instead of odom).
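
For reference, the vision_pose section in that file looks roughly like this (a sketch from memory, not copied verbatim); in this repo's copy the frame_id is map instead of odom:

 # vision pose estimate plugin
 vision_pose:
   tf:
     listen: false                     # subscribe to the pose topic rather than TF
     frame_id: "map"                   # the stock mavros file uses "odom" here
     child_frame_id: "vision_estimate"
     rate_limit: 10.0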

Murali25111993 commented 5 months ago

Thanks for clarifying, I will keep you posted.

Murali25111993 commented 4 months ago

Hello,

I made the changes as you said and modified slam.launch.

1. Launched the RealSense driver
2. Launched spawn_drone.launch
3. Launched slam.launch
4. Published a static transform between base_link and camera_link with rosrun (see the sketch below)
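
For step 4, the static transform can be published with something like the following (the offsets here are placeholders; the real values depend on how the camera is mounted relative to base_link):

 rosrun tf2_ros static_transform_publisher 0.1 0 0 0 0 0 base_link camera_link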

I am getting a 3D map, and when I run rosrun rtabmap_drone_example offboard:

[Screenshot from 2024-07-02 13-06-39]

I am getting an error light on the Pixhawk. Upon checking mavros, it keeps printing this info message: [ INFO] [1719910658.472820145]: HP: requesting home position

[Screenshot from 2024-07-02 13-03-51]

[Screenshot of the TF frames tree]

Does any PX4 parameter have to be changed?

Thanks

matlabbe commented 4 months ago

It looks like it is repeatedly re-arming the vehicle. The TF tree looks fine. I think [ INFO] [1719910658.472820145]: HP: requesting home position is not an error in itself, but maybe an effect of the error happening.

I don't remember exactly, but there should be some configs (similar to this) to set up with QGroundControl on the actual drone. The main parameters to set are:

EKF2_AID_MASK 24
EKF2_EV_DELAY 200

EKF2_EV_DELAY would need to be tuned based on visual odometry delay.
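
If QGroundControl is not at hand, the same parameters can usually be set through mavros with its mavparam tool while the flight controller is connected, for example:

 rosrun mavros mavparam set EKF2_AID_MASK 24
 rosrun mavros mavparam set EKF2_EV_DELAY 200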

Murali25111993 commented 4 months ago

Yes, this worked. I changed the params EKF2_AID_MASK, EKF2_HGT_MODE, EKF2_EV_DELAY (link).

Now it switches to offboard mode, arms, and takes off.

[Screenshot from 2024-07-03 12-39-51] [Screenshot from 2024-07-03 12-40-21]

Thank you, this issue can be closed.

guptaPrabhav commented 4 months ago

> You can adjust the camera topics here to the ones from realsense2:
>
> https://github.com/matlabbe/rtabmap_drone_example/blob/eaeeb6f723cd0b0307575d3969f2cb5df42f928e/launch/slam.launch#L29-L31
>
> and here: https://github.com/matlabbe/rtabmap_drone_example/blob/eaeeb6f723cd0b0307575d3969f2cb5df42f928e/launch/slam.launch#L41-L42
>
> Otherwise, the rest would be fairly the same.
>
> What are you referring to by "px4flow"?

Hi @matlabbe,

I am also trying to get this to work on a real drone, but I do not have direct access to a depth image such as the one provided by a RealSense depth camera.

I do have access to a stereo pair, and I know that RTAB-Map supports stereo directly. Could you suggest the changes I need to make to the launch file so that the SLAM, costmap, and navigation work with a stereo camera pair directly?

I had another query along with this. The drone I am using (Qualcomm RB5 Flight) does not publish camera information natively via ROS. The workaround I figured was to find the camera information online and publish it with a separate publisher. Will this be okay? I haven't implemented/tested this yet. Kindly let me know if there is a workaround, or a better way to get the camera info for RTAB-Map.

Thanks!

matlabbe commented 4 months ago

In general, you cannot "just take any calibration found on the internet"; there should be a way to extract the factory calibration from your camera. Otherwise, you will need to calibrate it yourself.
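
If you do end up calibrating it yourself, the standard camera_calibration tool with a checkerboard is one option; a rough example for a stereo pair (the board size and topic names are just placeholders):

 rosrun camera_calibration cameracalibrator.py --size 8x6 --square 0.025 \
     left:=/stereo/left/image_raw right:=/stereo/right/image_raw \
     left_camera:=/stereo/left right_camera:=/stereo/right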

To convert to stereo, you would have to change this: https://github.com/matlabbe/rtabmap_drone_example/blob/3055aea3a69da6c8acc0177c4c9c386578df38d5/launch/slam.launch#L29-L34 to something like this:

 <arg name="stereo"                  value="true" />  
 <arg name="left_image_topic"        value="/stereo/left/image_rect" /> 
 <arg name="left_camera_info_topic"  value="/stereo/left/camera_info" /> 
 <arg name="right_image_topic"       value="/stereo/right/image_rect" /> 
 <arg name="right_camera_info_topic" value="/stereo/right/camera_info" /> 
 <arg name="imu_topic"         value="/mavros/imu/data"/> 
 <arg name="wait_imu_to_init"  value="true"/> 
 <arg name="approx_sync"       value="false"/> 

Then for the obstacle cloud, you can change this: https://github.com/matlabbe/rtabmap_drone_example/blob/3055aea3a69da6c8acc0177c4c9c386578df38d5/launch/slam.launch#L40-L48 to this:

 <node pkg="nodelet" type="nodelet" name="camera_points_xyz" args="standalone rtabmap_util/point_cloud_xyzrgb"> 
   <remap from="left/image"       to="/stereo/left/image_rect"/> 
   <remap from="left/camera_info" to="/stereo/left/camera_info"/> 
   <remap from="right/image"       to="/stereo/right/image_rect"/> 
   <remap from="right/camera_info" to="/stereo/right/camera_info"/> 
   <remap from="cloud"             to="camera_cloud" /> 

   <param name="decimation"  type="double" value="4"/> 
   <param name="voxel_size"  type="double" value="0.0"/> 
   <param name="approx_sync" type="bool"   value="false"/> 
 </node>  

If you have to rectify your images, you can use stereo_image_proc. If you use it, you could use the disparity image generated by stereo_image_proc to create the point cloud for obstacles:

 <node pkg="nodelet" type="nodelet" name="camera_points_xyz" args="standalone rtabmap_util/point_cloud_xyz"> 
   <remap from="disparity/image"       to="/stereo/disparity"/> 
   <remap from="disparity/camera_info" to="/stereo/camera_info"/> 
   <remap from="cloud"             to="camera_cloud" /> 

   <param name="decimation"  type="double" value="4"/> 
   <param name="voxel_size"  type="double" value="0.0"/> 
   <param name="approx_sync" type="bool"   value="true"/> 
 </node>  
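
A minimal stereo_image_proc entry could look like this (assuming the raw images and camera_info are published under the /stereo namespace as left/image_raw, right/image_raw, left/camera_info and right/camera_info):

 <node pkg="stereo_image_proc" type="stereo_image_proc" name="stereo_image_proc" ns="/stereo">
   <param name="approximate_sync" value="true"/>
 </node>
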
guptaPrabhav commented 3 months ago

@matlabbe Thank you for the response!

I did find the factory calibration for the cameras and will be using that to publish the camera_info topic for the left and right cameras.

I do have an additional question. The line <remap from="cloud" to="camera_cloud" /> mentioned above suggests that RTAB-Map is also expecting a topic publishing point cloud data. The sensors I am using do not provide a point cloud by default. What should I do in this scenario? Kindly correct me if my interpretation is wrong.

Thanks!

matlabbe commented 3 months ago

> <remap from="cloud"             to="camera_cloud" />

It is the output of the point_cloud_xyz node; the cloud is created from the disparity image or the stereo images, depending on which approach you used.
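
Once the nodelet is running, a quick way to confirm the cloud is actually being produced (assuming the nodelet runs in the root namespace) is:

 rostopic hz /camera_cloud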