Multimodal interfaces have been shown to be ideal candidates for interactive systems that adapt to a user either automatically or based on user-defined rules. However, user-based adaptation demands correspondingly advanced software architectures and algorithms. We present a novel multimodal fusion algorithm, based on hidden Markov models (HMMs), for the development of adaptive interactive systems. To select relevant modalities at the semantic level, the algorithm draws on temporal relationship properties. The presented algorithm has been evaluated in three use cases, from which we were able to identify the main challenges involved in developing adaptive multimodal interfaces.
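The abstract gives no implementation detail, but the core idea of HMM-based fusion can be illustrated with a minimal sketch: hidden states stand for fused semantic interpretations, observations are modality-level input events, and Viterbi decoding picks the most likely interpretation sequence. All state names, event names, and probabilities below are illustrative assumptions, not the paper's actual model.

```python
# Hypothetical sketch of HMM-based multimodal fusion (not the paper's
# implementation): hidden states are fused interpretations, observations
# are modality-level events (e.g. a recognized gesture or speech act).

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Return the most likely hidden-state sequence for `obs`."""
    # V[t][s] = (probability of best path ending in s at time t, that path)
    V = [{s: (start_p[s] * emit_p[s][obs[0]], [s]) for s in states}]
    for o in obs[1:]:
        V.append({})
        for s in states:
            prob, path = max(
                (V[-2][p][0] * trans_p[p][s] * emit_p[s][o], V[-2][p][1] + [s])
                for p in states
            )
            V[-1][s] = (prob, path)
    return max(V[-1].values())[1]

# Toy model parameters (illustrative assumptions only).
states = ["point_and_speak", "speak_only"]
start_p = {"point_and_speak": 0.5, "speak_only": 0.5}
trans_p = {
    "point_and_speak": {"point_and_speak": 0.7, "speak_only": 0.3},
    "speak_only": {"point_and_speak": 0.4, "speak_only": 0.6},
}
emit_p = {
    "point_and_speak": {"gesture": 0.6, "speech": 0.4},
    "speak_only": {"gesture": 0.1, "speech": 0.9},
}

# Fuse a short sequence of modality events into interpretations.
path = viterbi(["gesture", "speech", "speech"], states, start_p, trans_p, emit_p)
print(path)
```

In an adaptive system along the lines the abstract describes, the transition and emission probabilities could be re-estimated per user, and temporal relationships between modality events would further constrain which fused interpretations are plausible.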
Title of host publication: Proceedings of the 4th ACM SIGCHI symposium on Engineering interactive computing systems (EICS '12)
Number of pages: 10
Publication status: Published - 2012
- user interface adaptation
- multimodal fusion
- HMM-based fusion
- multimodal interaction