SonicASL: An Acoustic-based Sign Language Gesture Recognizer Using Earphones
Jun 1, 2021
Yincheng Jin
Yang Gao
Yanjun Zhu
Wei Wang
Jiyang Li
Seokmin Choi
Zhangyu Li
Jagmohan Chauhan
Anind K. Dey
Zhanpeng Jin
Abstract
We propose SonicASL, a real-time gesture recognition system that can recognize sign language gestures on the fly, leveraging front-facing microphones and speakers added to commodity earphones worn by someone facing the person making the gestures. In a user study (N=8), we evaluate the recognition performance of various sign language gestures at both the word and sentence levels. Given 42 frequently used individual words and 30 meaningful sentences, SonicASL achieves an accuracy of 93.8% for word-level recognition and 90.6% for sentence-level recognition. The proposed system is tested in two real-world scenarios: indoor (apartment, office, and corridor) and outdoor (sidewalk) environments with pedestrians walking nearby. The results show that our system provides users with an effective gesture recognition tool that remains reliable under environmental factors such as ambient noise and nearby pedestrians.

CCS Concepts: • Human-centered computing → Human-computer interaction (HCI); Ubiquitous and mobile computing systems and tools.
Publication
Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies