professorfabioandrade / georef


Using the Direct UAS Image Registration Algorithm for DJI Matrice 300 RTK with DJI H20T Camera #1

Open HectorJBastidasG opened 2 years ago

HectorJBastidasG commented 2 years ago

Hello Professor Fabio,

First, I would like to thank you for publishing this important work. I am sure it will be very useful for the UAS developer and user community.

I am trying to use the Python code of the Direct UAS Image Registration Algorithm for images taken with a DJI Matrice 300 RTK and the DJI H20T camera. However, I am having trouble determining the values of these variables for my case:

I understand that I need detailed layouts, including the center of gravity, for the drone and camera together, but DJI does not provide this type of information in its manuals. Given that, I would appreciate your impressions and recommendations about the values I should use.

I attach an Excel with the metadata of the image and links to DJI Manuals.

metadata.xlsx

https://www.dji.com/matrice-300/downloads
https://www.dji.com/downloads/products/zenmuse-h20-series

I look forward to your comments.

Hector Bastidas

fabiots50 commented 2 years ago

Hi Hector,

We are glad that you reached out to us. The intention of the paper was precisely to shed some light on this topic for the UAS community. I will answer your questions by topic:

  1. Translation vector from the camera to the gimbal frame (T_C→G)

Regarding the translation vector from the camera frame to the gimbal frame: if that information is not in the manuals, I suggest that you measure it manually. This translation vector is the distance from the camera lens to the center of the gimbal. This paper illustrates it, and the authors used a DJI drone and gimbal: https://www.researchgate.net/publication/338779727_Accuracy_assessment_of_real-time_kinematics_RTK_measurements_on_unmanned_aerial_vehicles_UAV_for_direct_geo-referencing
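For illustration only, here is a minimal sketch of how such a manually measured lever arm could be represented and applied. All numbers, the axis alignment `R_cam_to_gimbal`, and the sign convention are assumptions; measure your own values and check them against the paper's conventions.

```python
import numpy as np

# Hypothetical, manually measured lever arm from the camera lens to the
# gimbal center, expressed in the camera frame (meters). Replace with your
# own Matrice 300 / H20T measurements and mind the sign convention.
t_cam_to_gimbal = np.array([0.0, 0.03, 0.05])

# A point expressed in the camera frame is moved into the gimbal frame by
# rotating it (if the axes differ) and adding the lever arm:
R_cam_to_gimbal = np.eye(3)                  # assumed axis alignment between the two frames
p_cam = np.array([0.0, 0.0, 1.0])            # example point 1 m along the optical axis
p_gimbal = R_cam_to_gimbal @ p_cam + t_cam_to_gimbal
```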

  2. DCM from the gimbal to the UAS

I see in the metadata that you have the Euler angles (roll, yaw, pitch) of the gimbal and of the "flight", which I imagine is the UAS. What you need to find out is whether the gimbal attitude is given with respect to the world or to the drone. My guess is that it is given with respect to the world. In that case, you can use the gimbal angles to get the DCM and do the process in the following order (see the sketch below):

1. transformation from pixel to camera coordinates using the K matrix;
2. rotation from camera to gimbal;
3. rotation from gimbal to NED (the center of the NED frame is then the center of the gimbal);
4. translation from the gimbal NED to the drone NED (the distance from the gimbal to the RTK reference on the drone);
5. rotation to ENU;
6. translation to ENU (using the RTK coordinates of the drone).
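Here is a hedged sketch of that chain in Python/NumPy. It assumes the ZYX (yaw-pitch-roll) Euler convention, an identity camera-to-gimbal alignment, and placeholder values for the intrinsics `K`, the lever arm, the angles, and the RTK position; every one of those must be replaced with your own metadata and measurements, and the convention must be checked against the paper.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def dcm_from_rpy(roll, pitch, yaw):
    # ZYX (yaw-pitch-roll) convention: maps vectors from the rotated frame
    # into the reference frame. Verify against the convention used in the paper.
    return Rotation.from_euler("ZYX", [yaw, pitch, roll]).as_matrix()

# --- Hypothetical placeholders: replace all of them with your own H20T ---
# --- intrinsics, measured lever arms, and the values from your metadata ---
K = np.array([[3000.0,    0.0, 2000.0],
              [   0.0, 3000.0, 1500.0],
              [   0.0,    0.0,    1.0]])            # camera intrinsic matrix (pixels)
pixel = np.array([2000.0, 1500.0, 1.0])             # homogeneous pixel coordinates
R_cam_to_gimbal = np.eye(3)                         # assumed camera/gimbal axis alignment
gimbal_rpy = np.radians([0.0, -90.0, 45.0])         # gimbal roll, pitch, yaw (w.r.t. world/NED)
t_rtk_to_gimbal_ned = np.array([0.0, 0.0, 0.20])    # RTK antenna -> gimbal center, in NED (m)
drone_enu = np.array([100.0, 200.0, 50.0])          # drone RTK position in a local ENU frame (m)

# 1. pixel -> camera: a ray direction (its scale stays unknown until the ground is intersected)
ray_cam = np.linalg.inv(K) @ pixel

# 2. camera -> gimbal
ray_gimbal = R_cam_to_gimbal @ ray_cam

# 3. gimbal -> NED (gimbal angles assumed to be given with respect to the world)
ray_ned = dcm_from_rpy(*gimbal_rpy) @ ray_gimbal

# 4. gimbal NED -> drone NED: the lever arm shifts the ray origin (the gimbal
#    center) so that it is expressed relative to the drone's RTK reference
cam_pos_ned = t_rtk_to_gimbal_ned

# 5. and 6. NED -> ENU axis swap, then translation with the drone RTK coordinates
R_ned_to_enu = np.array([[0.0, 1.0, 0.0],
                         [1.0, 0.0, 0.0],
                         [0.0, 0.0, -1.0]])
ray_enu = R_ned_to_enu @ ray_ned
cam_pos_enu = R_ned_to_enu @ cam_pos_ned + drone_enu

# The geo-referenced ground point is then found by scaling ray_enu from
# cam_pos_enu until it meets the terrain (flat-ground assumption or a DEM).
```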

However, if the gimbal angles are given with respect to the drone, you just follow all the steps as described in the paper (sketched below): first the rotation from gimbal to UAS (using the gimbal roll, yaw, and pitch in the metadata), then the translation from gimbal to UAS, and then the rotation from UAS to NED (using the flight roll, yaw, and pitch in the metadata).
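A correspondingly hedged sketch of just that middle portion, again assuming the ZYX convention and made-up angles; only the rotation steps are shown, and the rest of the chain proceeds as in the previous sketch.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def dcm_from_rpy(roll, pitch, yaw):
    # ZYX (yaw-pitch-roll): rotated frame -> reference frame (same caveat as above)
    return Rotation.from_euler("ZYX", [yaw, pitch, roll]).as_matrix()

gimbal_rpy = np.radians([0.0, -90.0, 0.0])   # hypothetical gimbal angles, w.r.t. the drone body
flight_rpy = np.radians([1.0, 2.0, 45.0])    # hypothetical UAS ("flight") roll, pitch, yaw
ray_gimbal = np.array([0.0, 0.0, 1.0])       # example ray already expressed in the gimbal frame

# rotation from gimbal to the UAS body frame (gimbal angles from the metadata)
ray_uas = dcm_from_rpy(*gimbal_rpy) @ ray_gimbal

# the gimbal-to-UAS translation applies to positions (the lever arm), not to a ray direction

# rotation from UAS to NED (flight angles from the metadata)
ray_ned = dcm_from_rpy(*flight_rpy) @ ray_uas
# ...then NED -> ENU and the RTK translation proceed exactly as before.
```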

Note: remember that, normally, the angles are given with respect to the "outer" frame. For example, the gimbal angles are used to get the DCM from the UAS (or world) to the gimbal. To get the DCM from the gimbal to the UAS (or world), you just take the transpose of that matrix. This is also described in the paper.
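A one-line illustration of that last point; the identity matrix is only a placeholder standing in for whichever DCM you actually build from the metadata angles.

```python
import numpy as np

R_uas_to_gimbal = np.eye(3)              # placeholder: DCM built from the gimbal angles
R_gimbal_to_uas = R_uas_to_gimbal.T      # inverse mapping: the transpose, since DCMs are orthogonal
```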

Regards, Fabio