thinclab / uga_tum_ardrone

Repository for a fork of the tum_ardrone ROS package, implementing autonomous flight with PTAM-based visual navigation for the Parrot AR.Drone.
http://wiki.ros.org/tum_ardrone
GNU General Public License v3.0

Package uga_tum_ardrone

This package is a fork of the popular tum_ardrone package. Currently, the changes are:

This package contains the implementation corresponding to the following publications:

You can find a video on YouTube. This package builds on the well-known monocular SLAM framework PTAM, presented by Klein & Murray in their paper at ISMAR 2007. Please study the original PTAM website and the corresponding paper for more information on this part of the software. Also, be aware of the license that comes with it.

The code works for both the AR.Drone 1.0 and 2.0; however, the default parameters are currently optimized for the AR.Drone 2.0.

Installation

with catkin

cd catkin_ws/src
git clone https://github.com/thinclab/uga_tum_ardrone.git -b indigo-devel
cd ..
rosdep install uga_tum_ardrone
catkin_make

Quick start

Launch the nodes

roslaunch uga_tum_ardrone ardrone_driver.launch
roslaunch uga_tum_ardrone uga_tum_ardrone.launch

Check status

On the GUI, under Drone Communication Status, you should see:

Keyboard control

Joystick control

Autopilot

Nodes

drone_stateestimation

Estimates the drone's position based on incoming navdata, the control commands sent, and PTAM.

IMPORTANT: requires messages to be sent on both /ardrone/navdata (>100 Hz) and /ardrone/image_raw (>10 Hz), i.e., a connected drone with a running ardrone_autonomy node, or a .bag replay of at least those two channels. ardrone_autonomy should be started with:

rosrun ardrone_autonomy ardrone_driver _navdata_demo:=0 _loop_rate:=500
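The rate requirements above can be checked before flying. The helper below is purely illustrative (it is not part of the package): given message arrival times, it computes the average publish rate and compares it against the thresholds drone_stateestimation needs.

```python
# Hypothetical helper (not part of the package): verify publish rates
# against drone_stateestimation's requirements.

def publish_rate_hz(timestamps):
    """Average rate in Hz from a sorted list of arrival times (seconds)."""
    if len(timestamps) < 2:
        return 0.0
    span = timestamps[-1] - timestamps[0]
    return (len(timestamps) - 1) / span if span > 0 else float("inf")

def rates_sufficient(navdata_ts, image_ts):
    # drone_stateestimation needs /ardrone/navdata at >100 Hz
    # and /ardrone/image_raw at >10 Hz.
    return publish_rate_hz(navdata_ts) > 100 and publish_rate_hz(image_ts) > 10

# Example: navdata at 200 Hz, images at 15 Hz, over one second each.
navdata = [i / 200.0 for i in range(201)]
images = [i / 15.0 for i in range(16)]
print(rates_sufficient(navdata, images))  # True
```

In practice, `rostopic hz /ardrone/navdata` gives the same information directly.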

Subscribed topics

Published topics

Services

None

Parameters

Required tf transforms

TODO

Provided tf transforms

TODO

Using it

To properly estimate PTAM's scale, it is best to fly up and down a little (e.g., 1 m up and 1 m down) immediately after initialization. There are two windows: one shows the video and PTAM's map points, the other shows the map. To issue key commands, focus the respective window and hit a key. This generates a command on /uga_tum_ardrone/com, which is then parsed and executed.
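The messages on /uga_tum_ardrone/com are plain strings whose one-letter prefix identifies the receiving component, as the key tables in this README show ("p …" for PTAM, "m …" for the map viewer, "f …" for the filter, "c …" for the autopilot). The dispatcher below is an illustrative sketch of that routing idea, not the package's actual parser:

```python
# Illustrative sketch (not the package's actual parser): route
# /uga_tum_ardrone/com strings by their one-letter prefix.

HANDLERS = {
    "p": "PTAM (stateestimation)",   # e.g. "p reset", "p toggleUI"
    "m": "map viewer",               # e.g. "m toggleUI", "m resetView"
    "f": "filter/EKF",               # e.g. "f reset"
    "c": "autopilot command queue",  # e.g. "c clearCommands"
}

def route(msg):
    prefix, _, rest = msg.partition(" ")
    if prefix in HANDLERS and rest:
        return (HANDLERS[prefix], rest)
    return ("unprefixed", msg)       # e.g. "toggleLog"

print(route("p reset"))        # ('PTAM (stateestimation)', 'reset')
print(route("toggleLog"))      # ('unprefixed', 'toggleLog')
```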

Video Window

Video window

Key     Message on /uga_tum_ardrone/com   Action
r       "p reset"                         resets PTAM
u       "p toggleUI"                      changes view
space   "p space"                         takes first / second keyframe for PTAM's initialization
k       "p keyframe"                      forces PTAM to take a keyframe
l       "toggleLog"                       starts / stops extensive logging of all kinds of values to a file
m       "p toggleLockMap"                 locks map, equivalent to parameter PTAMMapLock
n       "p toggleLockSync"                locks sync, equivalent to parameter PTAMSyncLock

Clicking on the video window will generate waypoints, which are sent to drone_autopilot (if running):

Map Window

Map window

Key     Message on /uga_tum_ardrone/com   Action
r       "f reset"                         resets EKF and PTAM
u       "m toggleUI"                      changes view
v       "m resetView"                     resets viewpoint of viewer
l       "toggleLog"                       starts / stops extensive logging of all kinds of values to a file
v       "m clearTrail"                    clears green drone-trail

drone_autopilot

A critically damped spring-based controller for the drone (see Wikipedia: Damping). It also includes basic waypoint following and automatic initialization. Requires drone_stateestimation to be running. The target is set via the /uga_tum_ardrone/com topic.

Two dimensions (yaw and z) can be controlled directly by the drone's motors. The other two (x and y) are controlled by the drone "leaning" in that direction, which complicates control slightly. The damped springs provide a target force that the drone tries to generate with its motors and leaning, while still maintaining its altitude.
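The key property of a critically damped spring is that it approaches the target as fast as possible without overshooting. A minimal one-axis numeric sketch of this behavior follows; the spring constant, timestep, and simulation length are illustrative values, not the package's actual parameters:

```python
# Critically damped spring toward a target position (one axis).
# For a spring constant k, critical damping is d = 2*sqrt(k).
import math

def simulate(target, k=4.0, dt=0.01, steps=2000):
    d = 2.0 * math.sqrt(k)          # critical damping coefficient
    x, v = 0.0, 0.0                 # position, velocity
    overshoot = 0.0
    for _ in range(steps):
        force = k * (target - x) - d * v   # spring pull minus damping
        v += force * dt
        x += v * dt                         # semi-implicit Euler step
        overshoot = max(overshoot, x - target)
    return x, overshoot

x, overshoot = simulate(1.0)
print(round(x, 3))         # 1.0 -- converges to the target
print(overshoot < 1e-3)    # True -- essentially no overshoot
```

Less damping would reach the target sooner but oscillate around it; more damping would avoid overshoot but converge needlessly slowly.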

Subscribed topics

Published topics

Services

None

Parameters

Required tf transforms

TODO

Provided tf transforms

TODO

Using it

The behavior of the autopilot is set by sending commands on /uga_tum_ardrone/com of the form "c COMMAND". A queue of commands is kept; as soon as one command is finished (for example, a waypoint is reached), the next command is popped. The queue can be cleared by sending "c clearCommands". Commands can be sent using the drone_gui node. Some example scripts can be found in /flightPlans/*.txt. Possible commands are:
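The queueing behavior described above can be sketched in a few lines. This is an illustrative model of the semantics (commands accumulate, finish in order, and "c clearCommands" empties the queue), not the package's C++ implementation:

```python
# Illustrative sketch (not the package's implementation) of the
# autopilot's command queue fed from /uga_tum_ardrone/com.
from collections import deque

class AutopilotQueue:
    def __init__(self):
        self.queue = deque()
        self.current = None

    def on_com_message(self, msg):
        if not msg.startswith("c "):
            return                      # not an autopilot command
        command = msg[2:]
        if command == "clearCommands":
            self.queue.clear()
            self.current = None
        else:
            self.queue.append(command)

    def on_command_finished(self):
        # Called e.g. when a waypoint is reached; pop the next command.
        self.current = self.queue.popleft() if self.queue else None
        return self.current

ap = AutopilotQueue()
ap.on_com_message("c takeoff")
ap.on_com_message("c goto 0 0 1 0")     # hypothetical example commands
print(ap.on_command_finished())          # takeoff
print(ap.on_command_finished())          # goto 0 0 1 0
```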

drone_gui

This node offers a simple Qt GUI to control the drone_autopilot node and the drone_stateestimation node, and to fly the drone manually via keyboard or joystick.

Subscribed topics

Published topics

Services

Parameters

Required tf transforms

None

Provided tf transforms

None

Using it

Drone GUI

Monitor Drone, Autopilot and Stateestimation Nodes (top-right part).

On the top-right, the current publish-frequency of important topics is displayed:

Manual or joystick control of the drone.

The current control source has to be set (i.e., joystick or keyboard). The autopilot is only allowed to send control commands if this is set to autopilot.
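This control-source rule acts as a simple gate: commands are forwarded only when they come from the currently selected source. A minimal sketch of that idea (illustrative, not the GUI's actual code):

```python
# Illustrative sketch of the control-source gate described above:
# only the active source's commands are forwarded to the drone.

class ControlGate:
    SOURCES = ("keyboard", "joystick", "autopilot")

    def __init__(self, source="keyboard"):
        assert source in self.SOURCES
        self.source = source

    def forward(self, sender, command):
        """Return the command if the sender is the active source, else None."""
        return command if sender == self.source else None

gate = ControlGate(source="joystick")
print(gate.forward("autopilot", "goto 1 0 1 0"))  # None -- autopilot blocked
gate.source = "autopilot"
print(gate.forward("autopilot", "goto 1 0 1 0"))  # forwarded
```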

Autopilot Control

Troubleshooting

Known Bugs & Issues

Tips and Tricks

Camera calibration

Calibrate with ethzasl_ptam. To work with colored images, in src/CameraCalibrator.cc change:

Parameters: c1 to c8

can be estimated easily by

Parameters: PID control

approximate in "simulation" based on c1 to c8:

License

The major part of this software package (everything except PTAM) is licensed under the GNU General Public License Version 3 (GPLv3); see http://www.gnu.org/licenses/gpl.html. PTAM (comprising all files in /src/stateestimation/PTAM) has its own license; see http://www.robots.ox.ac.uk/~gk/PTAM/download.html. That license in particular prohibits commercial use of the software.