A Novel Illumination Robust Hand Gesture Recognition System With Event Based Neuromorphic Vision Sensor

Abstract

Despite the marvels brought by conventional frame‐based cameras, they suffer from significant drawbacks: data redundancy and temporal latency. This causes problems in applications where low‐latency transmission and high‐speed processing are mandatory. Proceeding along this line of thought, the neurobiological principles of the biological retina have been adapted to accomplish data sparsity and high dynamic range at the pixel level. These bio‐inspired neuromorphic vision sensors alleviate the serious bottleneck of data redundancy by responding to changes in illumination rather than to illumination itself. This paper briefly reviews one such representative neuromorphic sensor, the activity‐driven event‐based vision sensor, which mimics the human eye. Spatio‐temporal encoding of event data permits incorporating temporal correlation in addition to spatial correlation in vision processing, which enables greater robustness. Consequently, conventional vision algorithms have to be reformulated to adapt to this new generation of vision sensor data, which involves designing algorithms for sparse, asynchronous, and accurately timed information. Theories and new research have recently begun to emerge in the domain of event‐based vision, and compiling the vision research carried out in this sensor domain has become increasingly essential. Towards this, this paper reviews the state‐of‐the‐art event‐based vision algorithms by categorizing them into three major vision applications: object detection/recognition, object tracking, and localization and mapping.

This article is categorized under: Technologies > Machine Learning
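
As a minimal illustration of the spatio‐temporal event encoding mentioned above, the following Python sketch shows how a sparse, asynchronous stream of events (x, y, t, p) can be accumulated into an exponentially decayed "time surface," a common event representation that captures both spatial and temporal correlation. The function name, the decay constant tau, and the synthetic events are illustrative assumptions and are not taken from the reviewed paper.

import numpy as np

def time_surface(events, width, height, t_ref, tau=50e-3):
    """Build a time surface from events (x, y, t, p).

    events : iterable of (x, y, t, p); (x, y) pixel address, t timestamp in
             seconds with t <= t_ref, p polarity (+1 brightness increase,
             -1 decrease; unused here)
    t_ref  : reference time at which the surface is evaluated
    tau    : exponential decay constant in seconds (assumed value)
    """
    surface = np.zeros((height, width), dtype=np.float32)
    last_t = np.full((height, width), -np.inf, dtype=np.float64)

    # Keep only the most recent event timestamp at each pixel.
    for x, y, t, p in events:
        last_t[y, x] = t

    # Recent events contribute values near 1, old events decay toward 0.
    valid = np.isfinite(last_t)
    surface[valid] = np.exp(-(t_ref - last_t[valid]) / tau)
    return surface

# Example usage with a few synthetic events.
events = [(10, 12, 0.010, +1), (11, 12, 0.012, -1), (10, 13, 0.018, +1)]
ts = time_surface(events, width=64, height=48, t_ref=0.020)
print(ts[12, 10], ts[12, 11], ts[13, 10])

Because each pixel reacts only to illumination changes, most of the surface stays empty; downstream recognition or tracking algorithms therefore operate on this sparse, accurately timed representation rather than on dense frames.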