Exploring user-defined gestures for lingual and palatal interaction

Santiago Villarreal Narvaez, Jorge Luis Perez-Medina, Jean Vanderdonckt

Research output: Contribution to journal › Article › Peer-reviewed

Abstract

Individuals with motor disabilities can benefit from an alternative means of interacting with the world: using their tongue. The tongue possesses precise movement capabilities within the mouth, allowing individuals to designate targets on the palate. This form of interaction, known as lingual interaction, enables users to perform basic functions by using their tongues to indicate positions. The purpose of this work is to identify the lingual and palatal gestures proposed by end-users. To achieve this goal, our initial step was to examine relevant literature on the subject, including clinical studies on the motor capacity of the tongue, devices detecting tongue movement, and current lingual interfaces (e.g., for driving a wheelchair). We then conducted a gesture elicitation study (GES) involving N = 24 participants, who proposed lingual and palatal gestures to perform 19 Internet of Things (IoT) referents, yielding a corpus of 456 gestures. These gestures were clustered into similarity classes (80 unique gestures) and analyzed by dimension, nature, complexity, thinking time, and goodness-of-fit. Using the Agreement Rate (Ar) methodology, we present a set of 16 gestures for a lingual and palatal interface, which serve as a basis for further comparison with gestures suggested by people with disabilities.

Original language: English
Pages (from-to): 167-185
Number of pages: 19
Journal: Journal on Multimodal User Interfaces
Volume: 17
Issue number: 3
Publication status: Published - Sept. 2023
