sensoarltd / Dart


Dart

DART (Detect and Avoid Artificial Reality Testing) is an open community benchmarking system to evaluate visual detect-and-avoid (DAA) algorithms for drones and other aircraft. Our work as of v1.0 has reached TRL 4 (proof of concept). We welcome the community to join us in building on these foundations, with the vision of a standardised system for evaluating DAA systems. An overview video of the project is available here. The 2021 work was funded by the DfT D-Trig programme. Our work is released under the Apache License 2.0.

Consultancy

If you would like private consultancy on how to set up and use DART in your organisation, please contact Sky Tech directly: david.redpath@skytech.limited

Scope

DART v1.0 is designed around the scenario of a single aircraft on a collision course in good lighting. The input to the system is a single forward-facing video camera. Our initial study looked at the criteria affecting this type of DAA system and at typical scenarios such as normal flight (no false alarms) and single and multiple collision scenarios. The combinatorial nature of the problem makes exhaustive testing of solutions impractical. We were limited to 8 weeks of development, so a fair number of assumptions needed to be made to complete on time. We chose to drive simulations exclusively via AirSim and to avoid locking development into one particular setup. Therefore, with minimal changes, DART will work on Windows/Linux and with Unreal Engine or Unity.

Overview

DART is built on Microsoft AirSim using Unreal Engine 4. Python, along with the AirSim plugin APIs, is used to script different collisions. A Flask server streams the virtual cameras from the simulation. The simulator code starts/stops each DAA algorithm under test. Each of these is a Python script which reads the virtual camera from the Flask server and feeds consecutive frames into OpenCV functions to assess the collision threat and issue an avoid manoeuvre. True, missed and false detections are recorded along with the detection distance. An overview video of the system working is available here.

Contribute

Please look at the open issues if you are looking for areas to contribute to. Our aim is to set up a working group to steer the future of the project, so please join the discussion here.

Workstation Setup

Running a Simulation

Start Unreal Engine on Linux and open a terminal. In our case all files were installed under the folder /Dart in the user home directory. (NOTE: you need to have first created Simulation.uproject as described below.)

Airsim settings.json

The settings in this file are sensitive, and formatting mistakes can cause AirSim/UE4 to freeze. It's recommended to change and confirm one setting at a time and to keep backups of your working settings. We tried loading several aircraft onto the airfield but experienced inconsistent results; we therefore recommend using one aircraft at a time and setting it here. Two default files, for SITL and HITL, are provided in the repo under /Setup/Airsim.
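For orientation, a minimal single-vehicle settings.json has the following shape. This is a generic AirSim example, not a copy of the defaults shipped in /Setup/Airsim (the vehicle name is arbitrary); prefer the provided files as your starting point.

```json
{
  "SettingsVersion": 1.2,
  "SimMode": "Multirotor",
  "Vehicles": {
    "Drone1": {
      "VehicleType": "SimpleFlight",
      "X": 0, "Y": 0, "Z": 0
    }
  }
}
```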

Setting Up Your Own Simulation

Due to asset licensing and frequent Unreal Engine updates, it's necessary to create and populate your own project and import your own models. Follow this tutorial on the AirSim pages here. There is also a video here. You will also need to import your own aircraft for testing; this can be done by cloning the blueprint of the default AirSim drone and changing the 3D model. Be careful with naming, as the wrong format will crash Unreal Engine. One known issue is that the new aircraft will still have a drone kinematic model and so will look awkward in flight. This is a future area of improvement.

HITL Raspberry Pi

If you wish to run DAA algorithms on real hardware, you can. We used a Raspberry Pi, but any Linux-based PC should be fine (Jetson, NUC, etc.). We also used the Pi camera module so the Pi could act as an actual DAA system, but we have not yet tested it on a physical drone in flight. Future work will add augmented-reality collisions to this project, but we found that doing so isn't trivial without months of development work. Android/iOS make this easy, but support is only available for off-the-shelf mobile devices. Supporting the Pi on Unreal Engine would be a long journey, and a stripped-down 3D sim would be more suitable. Our setup is designed to allow the mission controller (Pi) to issue avoid manoeuvres to the flight controller (PX4 in Offboard mode) via MAVLink.
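The Pi-to-PX4 link can be sketched with pymavlink as below. Everything here is an assumption rather than DART's actual code: the connection string, the trivial bearing-based escape rule, and the function names are all our own invention for illustration.

```python
def avoid_setpoint(threat_bearing_deg):
    """Map a threat's bearing (degrees, positive = right of centre)
    to a lateral escape velocity: break away from the threat's side.
    Returns (vx, vy, vz) in m/s, body NED frame."""
    vy = -2.0 if threat_bearing_deg >= 0.0 else 2.0
    return (0.0, vy, 0.0)

def send_avoid(master, setpoint):
    """Send a velocity setpoint to PX4 in Offboard mode over MAVLink.
    `master` is a pymavlink connection, e.g.
    mavutil.mavlink_connection("udpout:192.168.1.10:14550")."""
    # Deferred import so the pure logic above is testable off-drone.
    from pymavlink import mavutil  # pip install pymavlink
    vx, vy, vz = setpoint
    master.mav.set_position_target_local_ned_send(
        0, master.target_system, master.target_component,
        mavutil.mavlink.MAV_FRAME_BODY_NED,
        0b0000111111000111,  # type_mask: use only the velocity fields
        0, 0, 0,             # x, y, z position (ignored)
        vx, vy, vz,          # velocity setpoint
        0, 0, 0,             # acceleration (ignored)
        0, 0)                # yaw, yaw_rate (ignored)
```

Note that PX4 requires setpoints to be streamed continuously (it falls out of Offboard mode if they stop), so `send_avoid` would be called inside the detection loop, not once.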

HITL Benchmarking

We attempted to reconcile logs between the sim and the hardware to measure DAA accuracy, but we found the lag from video streaming and slow hardware performance made this too inaccurate. We welcome discussion on how to do this better, such as timestamping the stream or using sim recordings when no hardware interaction is present. Realistically, the whole system needs to be reviewed to improve speed and better understand the bottlenecks. In the meantime, SITL assessment is still very reliable!
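If the stream were timestamped, reconciliation could reduce to matching each hardware detection to the nearest sim log entry by time. A minimal sketch of that matching step, with names of our own invention:

```python
import bisect

def match_nearest(sim_times, hw_time):
    """Return the sim log timestamp closest to a hardware detection
    time. sim_times must be sorted in ascending order."""
    i = bisect.bisect_left(sim_times, hw_time)
    # The nearest value is either the entry just before or just after
    # the insertion point; compare the (at most two) candidates.
    candidates = sim_times[max(0, i - 1):i + 1]
    return min(candidates, key=lambda t: abs(t - hw_time))
```

The harder problem, as noted above, is that streaming lag makes the hardware-side timestamps themselves unreliable; this only helps once that offset is bounded or measured.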

Future Work

An original aim of DART was to include augmented reality to test DAA on a real drone in flight. We found this would be months or years of work to customise a plugin and firmware for Raspberry Pi/Jetson. The tools exist for Android/iOS devices, and it's possible to stream from a mobile device into the mission computer, but in the end such steps were deemed a novelty compared with conducting solid research. We found the two DAA algorithms we specified to be sensitive to global movement and unsuitable for practical application, so working on new, robust DAA algorithms is essential. Rejecting false alarms is particularly important, and it's expected DART can be used as a starting point in your own CONOPS digital twin. Supporting different kinematic models, such as aeroplanes, helicopters and rockets, is also an area for future improvement.