TY - JOUR
T1 - TapStrapGest: Elicitation and Recognition for Ring-based Multi-Finger Gestures
AU - Cornella, Guillem
AU - Sangenis, Eudald
AU - Ousmer, Mehdi
AU - Vanderdonckt, Jean
AU - Villarreal Narvaez, Santiago
AU - Dumas, Bruno
AU - Chaffangeon Caillet, Adrien
PY - 2025/6
Y1 - 2025/6
N2 - We introduce TapStrapGest, a novel solution for customizable ring-based multi-finger gestures, encompassing the process from gesture elicitation to gesture recognition. Recognizing the growing demand for intuitive and customizable gesture interaction with fingers, TapStrapGest uses Tap Strap to enable users to perform simple and complex multi-finger gestures using smart rings. We conducted a gesture elicitation study, detailing the systematic process of soliciting and refining a custom set of user-defined ring-based finger gestures through participatory design and ergonomic considerations, including thinking time, goodness of fit, and memorization. Subsequently, we delve into the technical underpinnings of gesture recognition. We reduce the dimensionality of a dataset of 27 gesture classes from 21 to 15 features by filtering, then from 15 to 5 by Principal Component Analysis. We implement and compare four machine learning algorithms to show that a Quadratic Discriminant Analysis (precision=99.33%, recall=99.26%, and F1-score=99.26%) outperforms three other machine learning classifiers, i.e., a Linear Discriminant Analysis, a Support Vector Machine, and a Random Forest, as well as existing recognizers from the literature, to accurately recognize such gestures without resorting to Deep Learning. Through a performance analysis, we demonstrate that TapStrapGest is a versatile and admissible solution for ring-based multi-finger gesture interaction, opening avenues for "eyes-free" or "screen-free" human-computer interaction in various domains.
KW - Finger tapping
KW - Tap gestures
KW - Gesture elicitation study
KW - Support Vector Machines
KW - Gesture input
KW - Gesture recognition
M3 - Article
JO - Proceedings of the ACM on Human-Computer Interaction
JF - Proceedings of the ACM on Human-Computer Interaction
ER -