Event-based cameras are popular for tracking fast-moving objects due to their high temporal resolution, low latency, and high dynamic range. In this paper, we propose a novel algorithm for tracking event blobs using raw events asynchronously in real time. We introduce the concept of an event blob as a spatio-temporal likelihood of event occurrence where the conditional spatial likelihood is blob-like. Many real-world objects, such as car headlights or any quickly moving foreground object, generate event blob data. The proposed algorithm uses a nearest-neighbour classifier with a dynamic threshold criterion for data association, coupled with an extended Kalman filter to track the event blob state. Our algorithm achieves highly accurate blob tracking, velocity estimation, and shape estimation even under challenging lighting conditions and high-speed motions (> 11000 pixels/s). The microsecond time resolution achieved means that the filter output can be used to derive secondary information such as time-to-contact or range estimates, enabling applications to real-world problems such as collision avoidance in autonomous driving.
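The core loop described above — predict the blob state to each event's timestamp, gate the event by distance to the predicted blob centre, and correct the filter only for associated events — can be sketched as follows. This is an illustrative minimal sketch, not the authors' implementation: it tracks only position and velocity (the paper also estimates blob shape), uses a fixed rather than dynamic association gate, and all class names, parameters, and noise values are assumptions.

```python
import numpy as np

class EventBlobEKF:
    """Minimal sketch of an asynchronous per-event blob tracker.

    State: [x, y, vx, vy] under a constant-velocity model. Shape
    estimation and the dynamic gating threshold from the paper are
    omitted; all parameter values here are illustrative defaults.
    """

    def __init__(self, x0, y0, gate=5.0, sigma_meas=1.0, sigma_acc=1e3):
        self.x = np.array([x0, y0, 0.0, 0.0])  # state estimate
        self.P = np.eye(4) * 10.0              # state covariance
        self.t = 0.0                           # time of last update (s)
        self.gate = gate                       # association gate (px)
        self.R = np.eye(2) * sigma_meas**2     # measurement noise
        self.sigma_acc = sigma_acc             # process noise intensity

    def update(self, ex, ey, t):
        """Process one event (pixel coordinates, timestamp in seconds).

        Returns True if the event was associated with the blob."""
        dt = t - self.t
        # Predict the state forward to the event timestamp.
        F = np.eye(4)
        F[0, 2] = F[1, 3] = dt
        Q = np.zeros((4, 4))
        Q[2, 2] = Q[3, 3] = self.sigma_acc**2 * dt
        x_pred = F @ self.x
        P_pred = F @ self.P @ F.T + Q

        # Nearest-neighbour data association: accept the event only
        # if it lies within the gate of the predicted blob centre.
        z = np.array([ex, ey])
        innov = z - x_pred[:2]
        if np.linalg.norm(innov) > self.gate:
            return False  # unassociated event: filter is untouched

        # Kalman correction (the measurement model is linear here).
        H = np.zeros((2, 4))
        H[0, 0] = H[1, 1] = 1.0
        S = H @ P_pred @ H.T + self.R
        K = P_pred @ H.T @ np.linalg.inv(S)
        self.x = x_pred + K @ innov
        self.P = (np.eye(4) - K @ H) @ P_pred
        self.t = t
        return True
```

Because each event is processed individually at its own timestamp, the filter output is available at the sensor's native temporal resolution, which is what makes derived quantities such as velocity or time-to-contact meaningful between frames.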
This paper was published in IEEE Transactions on Robotics (TRO) in 2024.
Ziwei Wang, Timothy Molloy, Pieter van Goor and Robert Mahony
[PDF] [IEEE Xplore]
We are currently cleaning up the codebase to make it more accessible, which will take some time.
In the meantime, please find the early release at link and follow the README file to run the code.
If you use or discuss our event blob tracking method, please cite our paper as follows:
@Article{2024_Wang_AEB_Tracker_TRO,
  author  = {Ziwei Wang and Timothy Molloy and Pieter {van Goor} and Robert Mahony},
  journal = {IEEE Transactions on Robotics},
  title   = {Asynchronous Blob Tracker for Event Cameras},
  year    = {2024},
  volume  = {40},
  pages   = {4750-4767},
  issn    = {1552-3098},
  doi     = {10.1109/TRO.2024.3454410},
}
Should you have any questions or suggestions, please don't hesitate to get in touch at ziwei.wang1@anu.edu.au.