thinclab / uga_tum_ardrone

Repository for a fork of the tum_ardrone ROS package, implementing autonomous flight with PTAM-based visual navigation for the Parrot AR.Drone.
http://wiki.ros.org/tum_ardrone
GNU General Public License v3.0

Drone Coordinate System #3

Closed mprannoy closed 5 years ago

mprannoy commented 8 years ago

In HelperFunctions.h, the drone coordinate system is given as follows:

```
// the drone coordinate system is:
// x-axis: to the left
// y-axis: up
// z-axis: forward
// roll: rhs-system correct
// pitch: rhs-system -1
// yaw: rhs-system -1
```

Is this correct? Shouldn't it be:

```
// x-axis: forward
// y-axis: to the left
// z-axis: up
```
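For reference, both frames are right-handed, so moving a vector between the convention in the comment (x: left, y: up, z: forward) and the more common body convention (x: forward, y: left, z: up) is a pure axis permutation. A minimal sketch (the helper name is illustrative, not from the package):

```cpp
#include <array>

// Hypothetical helper: permute a body-frame vector
// (forward, left, up) into the comment's frame (left, up, forward).
// Both frames are right-handed, so no sign flips are needed.
std::array<double, 3> bodyToCommentFrame(const std::array<double, 3>& v) {
    return { v[1], v[2], v[0] };  // left = body-y, up = body-z, forward = body-x
}
```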

Also, could you please explain where you're getting the state from?

kbogert commented 8 years ago

That appears to be a description of the internal coordinate system used by tum_ardrone; it might just be an inaccurate comment.

See the papers at the beginning of the README.md file: the state is estimated using PTAM, which gives scale-free positioning based on observed features, and a Kalman filter, which fuses the various sensor data to estimate the scale of those features. Combining the two gives a 6D pose.
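A simplified illustration of the scale-fusion idea (not the package's actual estimator): PTAM reports distances in an arbitrary unit while the ultrasound altimeter reports metres, so a least-squares fit of metric = scale * ptam over matched measurement pairs recovers the missing scale factor.

```cpp
#include <vector>
#include <cstddef>

// Illustrative least-squares scale estimate: minimizes
// sum_i (metric[i] - scale * ptam[i])^2 over matched pairs of
// scale-free PTAM distances and metric sensor distances.
double estimateScale(const std::vector<double>& ptam,
                     const std::vector<double>& metric) {
    double num = 0.0, den = 0.0;
    for (std::size_t i = 0; i < ptam.size(); ++i) {
        num += ptam[i] * metric[i];
        den += ptam[i] * ptam[i];
    }
    return num / den;  // assumes at least one nonzero PTAM distance
}
```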

mprannoy commented 8 years ago

OK, but could you explain the transformation below (from DroneController.cpp)?

// rotate error to drone CS, invert pitch
double yawRad = yaw * 2 * 3.141592 / 360;
double pitchRad = pitch * 2 * 3.141592 / 360;
double rollRad = roll * 2 * 3.141592 / 360;
vel_term[0] = cos(yawRad)*new_velocity[0] - sin(yawRad)*new_velocity[1];
vel_term[1] = - sin(yawRad)*new_velocity[0] - cos(yawRad)*new_velocity[1];

p_term[0] = cos(yawRad)*new_err[0] - sin(yawRad)*new_err[1];
p_term[1] = - sin(yawRad)*new_err[0] - cos(yawRad)*new_err[1];
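Written out as a self-contained function, the matrix applied above is a rotation by +yaw followed by a sign flip of the second component, which matches the "invert pitch" note in the comment. This is a sketch for inspection (the function name is illustrative, not from the package):

```cpp
#include <cmath>

// Reproduces the transform from DroneController.cpp: rotate a
// world-frame XY vector by the drone's yaw, then negate the second
// component ("invert pitch"). Input yaw is in degrees, as in the
// original code.
void worldToDroneXY(double yawDeg, const double in[2], double out[2]) {
    const double kPi = 3.14159265358979323846;
    const double yawRad = yawDeg * 2.0 * kPi / 360.0;  // same deg->rad conversion
    out[0] =  std::cos(yawRad) * in[0] - std::sin(yawRad) * in[1];
    out[1] = -std::sin(yawRad) * in[0] - std::cos(yawRad) * in[1];
}
```

At yaw = 0 the first component passes through unchanged and the second is simply negated, which makes the pitch inversion easy to see in isolation.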

Is it possible to get the state solely from the imu/odometry data, without using the PTAM?

kbogert commented 8 years ago

These take the current state and error terms and transform them into the drone's local coordinate system (as far as the spring model is concerned) so that commands can be calculated.

Yes, the system treats the other sources of data as observations that update the Kalman filter. Just don't initialize PTAM and you'll see the drone state still being updated (give it a start command instead of autoinit). Without some form of absolute reference like the one PTAM provides, however, the state will drift over time. State estimation is handled in the stateestimation portion of this code base.

The code you're looking at is the control system. It can be thought of as a specialized PID controller based on the motion of a damped spring (https://en.wikipedia.org/wiki/Damping): its output determines the ideal acceleration the drone should apply to simulate the motion of a spring, and the pitch and yaw are then controlled to achieve it.
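The damped-spring analogy can be sketched as a PD law: the position error plays the role of the spring extension and the velocity plays the role of the damper, so the commanded acceleration is a = Kp * error - Kd * velocity. The gains and function name here are illustrative, not the package's actual parameters:

```cpp
// Illustrative damped-spring control law: spring term pulls the
// drone toward the target, damper term opposes its current velocity.
// With kd = 2 * sqrt(kp) the response is critically damped (no
// overshoot), which is the behavior the spring model aims for.
double springCommand(double error, double velocity, double kp, double kd) {
    return kp * error - kd * velocity;
}
```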