Vision-Based Guidance for Tracking Dynamic Objects

Overview

In recent years, there has been an increase in the number of applications using unmanned aircraft systems (UASs). Additionally, researchers have progressively moved towards using vision as a primary source of perception, mainly because cameras have become cheaper, smaller, lighter, and capable of higher image resolutions. An important UAS application is the tracking of objects using visual information.

Animation: occlusion handling demo (occlusion_handling.gif)

This repository consists of the code base and documentation for a vision-based object tracking system based on our 2021 ICUAS paper titled "Vision-Based Guidance for Tracking Dynamic Objects." Specifically, we implement experiments for diagnosing and analyzing visual tracking techniques under occlusions, along with UAS guidance based on a rendezvous cone approach. Our system contains computer vision algorithms that may be used in various combinations, in pipelines, or standalone, depending upon the complexity and/or requirements of the task.
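
As a rough illustration of the kind of low-level building block such a tracking pipeline relies on, the following minimal sketch estimates target motion between two synthetic frames with OpenCV's pyramidal Lucas-Kanade optical flow. It is illustrative only and is not the implementation used in this repository.

# Illustrative sketch only: estimate target motion between two synthetic frames
# with pyramidal Lucas-Kanade optical flow. Not the repository's tracking code.
import cv2
import numpy as np

# Two synthetic grayscale frames containing a bright square shifted by (dx, dy) = (5, 3).
frame1 = np.zeros((240, 320), dtype=np.uint8)
frame2 = np.zeros((240, 320), dtype=np.uint8)
frame1[100:140, 100:140] = 255
frame2[103:143, 105:145] = 255

# Detect corner features on the first frame.
pts1 = cv2.goodFeaturesToTrack(frame1, maxCorners=20, qualityLevel=0.3, minDistance=5)

# Track the detected features into the second frame.
pts2, status, _ = cv2.calcOpticalFlowPyrLK(frame1, frame2, pts1, None)

# The mean displacement of successfully tracked features approximates the target motion.
good = status.ravel() == 1
flow = (pts2[good] - pts1[good]).reshape(-1, 2)
print("estimated displacement (dx, dy):", flow.mean(axis=0))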

Citation

If you find this project useful, then please consider citing our work.

@inproceedings{karmokar2021vision,
  title={Vision-Based Guidance for Tracking Dynamic Objects},
  author={Karmokar, Pritam and Dhal, Kashish and Beksi, William J and Chakravarthy, Animesh},
  booktitle={Proceedings of the International Conference on Unmanned Aircraft Systems (ICUAS)},
  pages={1106--1115},
  year={2021}
}

Installation

To run the experiments in this repository, OpenCV, NumPy, and Pygame need to be installed along with their dependencies. The requirements.txt file (generated by pip freeze) may be used as follows. Navigate into the downloaded source folder where requirements.txt is located, then run the following:

pip install -r requirements.txt
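
Optionally, to verify that the core packages were installed correctly, a quick import check can be run (this check is not part of the repository's instructions; the printed versions will vary):

python -c "import cv2, numpy, pygame; print(cv2.__version__, numpy.__version__, pygame.__version__)"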

Usage

From the source folder, navigate into the experiments folder:

cd .\vbot\experiments
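
Note that the command above uses a Windows-style path separator. On Linux or macOS, the equivalent (assuming the same folder layout) is

cd vbot/experiments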

To run the occlusion handling experiment, run the following:

python -m exp_occ

To run the lane changing experiment, run the following:

python -m exp_lc

To run the squircle following experiment, run the following:

python -m exp_sf

Running Experiments

The process of running an experiment involves the following steps (a minimal sketch of the spacebar start/stop interaction is shown after this list).

  1. The simulator window pops up.
  2. The user draws a bounding box around the car (with some extra room).
  3. The user hits the spacebar to start the experiment.
  4. The tracker window appears, displaying the tracking results.
  5. To stop the experiment, the user selects the simulator window, hits the spacebar, and closes the window.
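
For reference, the spacebar start/stop interaction in steps 3 and 5 follows a standard pygame event-handling pattern. The sketch below is illustrative only and is not the simulator code shipped in this repository.

# Illustrative sketch only: a minimal pygame event loop that toggles an
# "experiment running" flag with the spacebar. Not the repository's simulator.
import pygame

pygame.init()
screen = pygame.display.set_mode((320, 240))
pygame.display.set_caption("simulator (sketch)")
clock = pygame.time.Clock()

running_experiment = False
alive = True
while alive:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            # Closing the window ends the session (step 5).
            alive = False
        elif event.type == pygame.KEYDOWN and event.key == pygame.K_SPACE:
            # First press starts the experiment (step 3), the next press stops it (step 5).
            running_experiment = not running_experiment
            print("experiment running:", running_experiment)

    # Dark background while idle, green while the experiment is running.
    screen.fill((0, 90, 0) if running_experiment else (30, 30, 30))
    pygame.display.flip()
    clock.tick(60)

pygame.quit()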

License

This project is licensed under the MIT License.