Computer Vision Tracking of sUAS from a Pan/Tilt Platform

Abstract

The ability to quickly, accurately, and autonomously identify and track objects in digital images in real time has been an active area of investigation for decades. Research in this area falls under the broader category of computer vision. Only recently, with advances in computing power and commercial optical hardware, has this capability become practical. There are many different methods of identifying and tracking objects of interest, and best practices are still being developed, varying by application. This thesis examines background subtraction methods as they apply to the tracking of small unmanned aerial systems (sUAS). A system combining commercial off-the-shelf (COTS) cameras, a pan-tilt unit (PTU), and custom software is developed for the purpose of continuously pointing at and tracking the motion of an sUAS in flight. Mixtures of Gaussians Background Modeling (MOGBM) is used to track the motion of the sUAS in frame and determine when to command the PTU. When the camera is moving, background subtraction methods are unusable, so additional methods are explored to fill this performance gap. The stereo vision capabilities of the system, enabled by the use of two cameras simultaneously, allow estimation of the three-dimensional position and trajectory of the sUAS. This system can serve as a supplement to, or replacement for, traditional tracking methods such as GPS and RADAR as part of a larger unmanned aerial systems traffic control (UTC) infrastructure.
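The two core ideas above can be sketched compactly. The thesis uses the full mixture-of-Gaussians background model; the single-Gaussian-per-pixel version below is a simplified stand-in (the mixture model keeps several such modes per pixel), and the stereo step is the standard disparity-to-depth relation Z = fB/d. All function names and numeric values here are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

def update_background(mean, var, frame, alpha=0.05, k=2.5):
    """One step of a simplified running-Gaussian background model.

    Each pixel keeps a running mean and variance; pixels deviating by
    more than k standard deviations are flagged as foreground (moving).
    This is a single-mode stand-in for the full mixture-of-Gaussians
    model used in the thesis.  Mutates mean/var in place.
    """
    diff = frame - mean
    fg = np.abs(diff) > k * np.sqrt(var)
    # Update the statistics only where the pixel matched the background,
    # so a moving object does not get absorbed into the model.
    bg = ~fg
    mean[bg] += alpha * diff[bg]
    var[bg] += alpha * (diff[bg] ** 2 - var[bg])
    return fg

def stereo_depth(disparity_px, focal_px, baseline_m):
    """Depth from stereo disparity: Z = f * B / d (pinhole camera pair)."""
    return focal_px * baseline_m / disparity_px

# Demo on synthetic 20x20 grayscale frames (hypothetical values):
mean = np.full((20, 20), 50.0)
var = np.full((20, 20), 25.0)
for _ in range(10):                      # warm up on a static background
    update_background(mean, var, np.full((20, 20), 50.0))

scene = np.full((20, 20), 50.0)
scene[5:8, 5:8] = 200.0                  # bright blob (the sUAS) enters frame
fg = update_background(mean, var, scene)  # blob pixels flagged as foreground

# Hypothetical stereo parameters: 800 px focal length, 0.3 m baseline,
# 40 px disparity gives a depth of 6.0 m.
depth_m = stereo_depth(40.0, focal_px=800.0, baseline_m=0.3)
```

In a real pipeline, the centroid of the foreground mask would drive the PTU pointing commands, and matching the blob across both cameras would supply the disparity for the 3-D position estimate.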

ResearchGate
DOI
Project Code & Docs

Findings

Jeremy Ogorzalek