TY - JOUR
T1 - Exploring user-defined gestures for lingual and palatal interaction
AU - Villarreal Narvaez, Santiago
AU - Perez-Medina, Jorge Luis
AU - Vanderdonckt, Jean
N1 - Funding Information:
S. Villarreal-Narvaez thanks Oliver Villarreal-Loayza for collaborating in designing the gesture silhouettes. S. Villarreal-Narvaez is also thankful to the Symbiotik project for its support throughout the preparation and submission of this work. Additionally, S. Villarreal-Narvaez appreciates the OPTIMIS project’s support during the article’s review and resubmission process.
Publisher Copyright:
© 2023, The Author(s), under exclusive licence to Springer Nature Switzerland AG.
PY - 2023/9
Y1 - 2023/9
N2 - Individuals with motor disabilities can benefit from an alternative means of interacting with the world: using their tongue. The tongue is capable of precise movements within the mouth, allowing individuals to designate targets on the palate. This form of interaction, known as lingual interaction, enables users to perform basic functions by using their tongue to indicate positions. The purpose of this work is to identify the lingual and palatal gestures proposed by end-users. To achieve this goal, we first examined the relevant literature, including clinical studies on the motor capacity of the tongue, devices that detect tongue movement, and existing lingual interfaces (e.g., for driving a wheelchair). We then conducted a gesture elicitation study (GES) with 24 participants (N = 24), who proposed lingual and palatal gestures to perform 19 Internet of Things (IoT) referents, yielding a corpus of 456 gestures. These gestures were clustered into similarity classes (80 unique gestures) and analyzed by dimension, nature, complexity, thinking time, and goodness-of-fit. Using the Agreement Rate (Ar) methodology, we present a set of 16 gestures for a lingual and palatal interface, which serves as a basis for further comparison with gestures suggested by disabled people.
AB - Individuals with motor disabilities can benefit from an alternative means of interacting with the world: using their tongue. The tongue is capable of precise movements within the mouth, allowing individuals to designate targets on the palate. This form of interaction, known as lingual interaction, enables users to perform basic functions by using their tongue to indicate positions. The purpose of this work is to identify the lingual and palatal gestures proposed by end-users. To achieve this goal, we first examined the relevant literature, including clinical studies on the motor capacity of the tongue, devices that detect tongue movement, and existing lingual interfaces (e.g., for driving a wheelchair). We then conducted a gesture elicitation study (GES) with 24 participants (N = 24), who proposed lingual and palatal gestures to perform 19 Internet of Things (IoT) referents, yielding a corpus of 456 gestures. These gestures were clustered into similarity classes (80 unique gestures) and analyzed by dimension, nature, complexity, thinking time, and goodness-of-fit. Using the Agreement Rate (Ar) methodology, we present a set of 16 gestures for a lingual and palatal interface, which serves as a basis for further comparison with gestures suggested by disabled people.
KW - Gesture interaction
KW - Tongue interaction
KW - Internet of Things
KW - Gesture elicitation study
UR - http://www.scopus.com/inward/record.url?scp=85167504914&partnerID=8YFLogxK
U2 - 10.1007/s12193-023-00408-7
DO - 10.1007/s12193-023-00408-7
M3 - Article
SN - 1783-7677
VL - 17
SP - 167
EP - 185
JO - Journal on Multimodal User Interfaces
JF - Journal on Multimodal User Interfaces
IS - 3
ER -