Abstract
We introduce TapStrapGest, a novel solution for customizable ring-based multi-finger gestures, covering the process from gesture elicitation to gesture recognition. Recognizing the growing demand for intuitive and customizable finger-gesture interaction, TapStrapGest builds on the Tap Strap, a set of smart rings, to enable users to perform simple and complex multi-finger gestures. We conducted a gesture elicitation study, detailing the systematic process of soliciting and refining a custom set of user-defined ring-based finger gestures through participatory design and ergonomic considerations, including thinking time, goodness of fit, and memorization. We then delve into the technical underpinnings of gesture recognition. On a dataset of 27 gesture classes, we reduce the dimensionality from 21 to 15 features by filtering, then from 15 to 5 by Principal Component Analysis. We implement and compare four machine learning algorithms and show that a Quadratic Discriminant Analysis (precision=99.33%, recall=99.26%, F1-score=99.26%) outperforms the three other classifiers, i.e., a Linear Discriminant Analysis, a Support Vector Machine, and a Random Forest, as well as existing recognizers from the literature, accurately recognizing such gestures without resorting to deep learning. Through a performance analysis, we demonstrate that TapStrapGest is a versatile and viable solution for ring-based multi-finger gesture interaction, opening avenues for "eyes-free" or "screen-free" human-computer interaction in various domains.
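The recognition pipeline summarized above (feature filtering, PCA projection, and a comparison of four classifiers) can be approximated with scikit-learn. The sketch below is illustrative only: the abstract does not specify the filtering criterion, the sensor features, or any hyperparameters, so the `SelectKBest` selector, the synthetic data, and all parameter values are assumptions; only the dimensions (21 → 15 → 5), the 27 classes, and the four classifier families come from the abstract.

```python
# Illustrative sketch: filter 21 features down to 15, project to 5
# principal components, then compare QDA, LDA, SVM, and Random Forest.
# Synthetic data stands in for the paper's ring-sensor dataset.
from sklearn.datasets import make_classification
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import (
    LinearDiscriminantAnalysis, QuadraticDiscriminantAnalysis)
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_validate

# Stand-in data: 27 gesture classes, 21 raw features per sample.
X, y = make_classification(n_samples=2700, n_features=21,
                           n_informative=15, n_classes=27,
                           n_clusters_per_class=1, random_state=0)

classifiers = {
    "QDA": QuadraticDiscriminantAnalysis(),
    "LDA": LinearDiscriminantAnalysis(),
    "SVM": SVC(kernel="rbf"),
    "RF":  RandomForestClassifier(n_estimators=100, random_state=0),
}

for name, clf in classifiers.items():
    pipe = make_pipeline(
        StandardScaler(),
        SelectKBest(f_classif, k=15),  # filtering step: 21 -> 15
        PCA(n_components=5),           # PCA step: 15 -> 5
        clf,
    )
    scores = cross_validate(
        pipe, X, y, cv=5,
        scoring=("precision_macro", "recall_macro", "f1_macro"))
    print(f"{name}: precision={scores['test_precision_macro'].mean():.4f} "
          f"recall={scores['test_recall_macro'].mean():.4f} "
          f"F1={scores['test_f1_macro'].mean():.4f}")
```

Fitting the selector and PCA inside the cross-validation pipeline, rather than on the full dataset, keeps the reported precision, recall, and F1 estimates free of feature-selection leakage; the numbers printed by this sketch are for the synthetic data and are not comparable to the paper's reported results.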
Original language | English |
---|---|
Number of pages | 26 |
Journal | Proceedings of the ACM on Human-Computer Interaction |
Publication status | Accepted/In press - Jun 2025 |
Keywords
- Finger tapping
- Tap gestures
- Gesture elicitation study
- Support Vector Machines
- Gesture input
- Gesture recognition