EchoTouch: Low-power Face-touching Behavior Recognition Using Active Acoustic Sensing on Glasses
Abstract
Accurately recognizing face-touching behavior anytime and anywhere can help prevent potential health risks and improve personal habits, yet effective methods that work in real-world scenarios are still lacking. In this paper, we propose EchoTouch, a low-power, unobtrusive active acoustic sensing system for monitoring face-touching behavior. EchoTouch captures features from both sides of the face by emitting and receiving orthogonal ultrasound signals through two speaker-microphone pairs mounted along the underside of the glasses frame. A lightweight multi-task deep learning framework then identifies the touched area and determines whether the behavior is intrusive, so that such actions can be discouraged. Finally, a two-stage irrelevant-action filtering mechanism handles various interferences. We evaluate EchoTouch on 20 participants across 11 face-touching areas. EchoTouch achieves an average recognition accuracy of 92.9% and 87.2% accuracy in determining whether the behavior is intrusive. In-the-wild evaluations further validate its robustness. We believe EchoTouch can serve as an unobtrusive and reliable way to monitor and prevent intrusive face-touching behavior.
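The abstract does not specify how the two transmitted signals are made orthogonal; one common approach in active acoustic sensing is frequency-division, giving each speaker an FMCW chirp in a non-overlapping ultrasound band so the microphones can separate the echoes by filtering. The sketch below illustrates that idea only; the sampling rate, band edges, and chirp duration are illustrative assumptions, not EchoTouch's actual parameters.

```python
import numpy as np

FS = 50_000        # assumed sampling rate (Hz), high enough for ~24 kHz ultrasound
CHIRP_DUR = 0.01   # assumed chirp duration (s)

def fmcw_chirp(f_start: float, f_end: float,
               fs: int = FS, duration: float = CHIRP_DUR) -> np.ndarray:
    """Linear FMCW chirp sweeping from f_start to f_end over `duration` seconds."""
    t = np.arange(int(fs * duration)) / fs
    k = (f_end - f_start) / duration            # sweep rate (Hz/s)
    # Instantaneous phase of a linear sweep: 2*pi*(f_start*t + 0.5*k*t^2)
    return np.cos(2 * np.pi * (f_start * t + 0.5 * k * t ** 2))

# Two chirps in disjoint bands (band edges are hypothetical), one per speaker.
left_tx = fmcw_chirp(18_000, 20_500)
right_tx = fmcw_chirp(21_000, 23_500)

# Sanity check: the normalized zero-lag correlation between the two bands is
# near zero, which is what lets each receiver isolate its own echo.
xcorr = np.dot(left_tx, right_tx) / (np.linalg.norm(left_tx) * np.linalg.norm(right_tx))
print(f"zero-lag correlation between bands: {xcorr:.4f}")
```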
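Likewise, the abstract names a "lightweight multi-task" framework without detailing its architecture. A minimal way to realize the two stated tasks is a shared encoder with two output heads, one for the 11-way touch-area classification and one for the binary intrusive/non-intrusive decision. The PyTorch sketch below assumes this shared-encoder layout; the layer sizes, input shape, and the `MultiTaskTouchNet` name are hypothetical, not the paper's model.

```python
import torch
import torch.nn as nn

class MultiTaskTouchNet(nn.Module):
    """Hypothetical shared-encoder, two-head model for the two tasks the
    abstract describes. All layer sizes are illustrative assumptions."""

    def __init__(self, in_channels: int = 2, num_areas: int = 11):
        super().__init__()
        # Shared 1-D CNN encoder over acoustic features (e.g., one channel per side).
        self.encoder = nn.Sequential(
            nn.Conv1d(in_channels, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
        )
        self.area_head = nn.Linear(32, num_areas)  # which of 11 areas was touched
        self.intrusive_head = nn.Linear(32, 2)     # intrusive vs. non-intrusive

    def forward(self, x: torch.Tensor):
        z = self.encoder(x)
        return self.area_head(z), self.intrusive_head(z)

# Usage with a dummy batch: 1 sample, 2 channels, 500 time steps (shape assumed).
model = MultiTaskTouchNet()
area_logits, intrusive_logits = model(torch.randn(1, 2, 500))
print(area_logits.shape, intrusive_logits.shape)  # torch.Size([1, 11]) torch.Size([1, 2])
```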
Publication
Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT)