Fusion in multimodal interactive systems: An HMM-Based Algorithm for User-Induced Adaptation

Bruno Dumas, Beat Signer, Denis Lalanne

Research output: Contribution in Book/Catalog/Report/Conference proceeding › Conference contribution


Abstract

Multimodal interfaces have been shown to be ideal candidates for interactive systems that adapt to a user either automatically or based on user-defined rules. However, user-based adaptation calls for corresponding advanced software architectures and algorithms. We present a novel multimodal fusion algorithm for the development of adaptive interactive systems which is based on hidden Markov models (HMMs). In order to select relevant modalities at the semantic level, the algorithm is linked to temporal relationship properties. The presented algorithm has been evaluated in three use cases, from which we were able to identify the main challenges involved in developing adaptive multimodal interfaces.
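
The abstract does not detail the algorithm itself, but the sketch below illustrates the general idea of HMM-based multimodal fusion: each user command is modelled as a small HMM over input events, and an observed sequence of events from different modalities is scored against every command model with the forward algorithm. All class names, event labels, and probabilities are hypothetical and are not taken from the paper or from the HephaisTK toolkit.

    # Illustrative sketch only; not the paper's algorithm or the HephaisTK API.
    import numpy as np

    class CommandHMM:
        """Hypothetical HMM describing one multimodal command."""

        def __init__(self, name, start, trans, emit, vocab):
            self.name = name
            self.start = np.asarray(start)   # initial state probabilities
            self.trans = np.asarray(trans)   # state transition matrix
            self.emit = np.asarray(emit)     # per-state emission probabilities
            self.vocab = vocab               # maps input events to emission indices

        def likelihood(self, events):
            """Forward algorithm: P(observed event sequence | this command model)."""
            obs = [self.vocab[e] for e in events if e in self.vocab]
            if not obs:
                return 0.0
            alpha = self.start * self.emit[:, obs[0]]
            for o in obs[1:]:
                alpha = (alpha @ self.trans) * self.emit[:, o]
            return float(alpha.sum())

    def fuse(events, models):
        """Return the name of the command whose HMM best explains the event sequence."""
        return max(models, key=lambda m: m.likelihood(events)).name

    # Toy example: a speech event followed by a pointing gesture.
    vocab = {"speech:put_that": 0, "gesture:point": 1}
    put_that_there = CommandHMM(
        "put_that_there",
        start=[1.0, 0.0],
        trans=[[0.2, 0.8], [0.0, 1.0]],
        emit=[[0.9, 0.1], [0.1, 0.9]],
        vocab=vocab,
    )
    print(fuse(["speech:put_that", "gesture:point"], [put_that_there]))

In such a setup, user-induced adaptation could amount to adjusting a command model's transition and emission probabilities from observed usage, which is one reason HMMs lend themselves to adaptive fusion; the paper's actual mechanism may differ.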
Original language: English
Title of host publication: Proceedings of the 4th ACM SIGCHI symposium on Engineering interactive computing systems (EICS '12)
Publisher: ACM Press
Pages: 15-24
Number of pages: 10
ISBN (Print): 9781450311687
DOIs
Publication status: Published - 2012
Externally published: Yes

Keywords

  • user interface adaptation
  • multimodal fusion
  • HMM-based fusion
  • multimodal interaction
  • EICS2012
  • multimodal
  • HephaisTK
