In this paper, we introduce a framework for dynamic gesture recognition with background suppression, operating on the output of a moving event-based camera. The system is designed to run in real time using only the computational capabilities of a mobile phone. It introduces a new development around the concept of time-surfaces. It also presents a novel event-based methodology for dynamic background removal that exploits the high temporal resolution of event-based cameras. To our knowledge, this is the first Android event-based framework for vision-based recognition of dynamic gestures running on a smartphone without off-board processing.
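To fix intuition for readers unfamiliar with time-surfaces, the sketch below shows the standard exponential-decay formulation from the event-based vision literature (e.g. Lagorce et al., 2017); it is not the specific development proposed in this paper, and the parameter names (radius, tau) and the per-pixel timestamp map last_ts are illustrative assumptions.

```python
import numpy as np

def time_surface(last_ts, x, y, t, radius=4, tau=50e3):
    """Minimal sketch of a classical time-surface (assumed formulation).

    last_ts : 2D array holding, for each pixel, the timestamp (in microseconds)
              of the most recent event; -inf where no event has occurred yet.
    Returns a (2*radius+1) x (2*radius+1) patch of exp(-(t - t_last)/tau)
    centred on (x, y). Image-boundary handling is omitted for brevity.
    """
    patch = last_ts[y - radius:y + radius + 1, x - radius:x + radius + 1]
    return np.exp(-(t - patch) / tau)

# Per incoming event (x, y, t): update last_ts[y, x] = t, then compute the
# surface, which encodes the recent spatio-temporal activity around the event.
```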
We assess the performance of the framework in several scenarios, indoors and outdoors, under static and dynamic conditions and uncontrolled lighting. We also introduce a new event-based dataset for gesture recognition with static and dynamic backgrounds, which is made publicly available. The set of gestures was selected following a clinical trial to enable human-machine interaction for visually impaired and older adults. Finally, we report comparisons with prior work on event-based gesture recognition, achieving comparable results without advanced classification techniques or power-hungry hardware.