Adapting to Users through Interaction Metaphor Selection
Numerous recent projects, such as the DFKI project SiAM, have recognized the need for interfaces that adapt to their users: adaptation can increase efficiency and comfort and accommodate a user's specific requirements.
In MADMACS, we introduce the concept of multi-adaptivity as an extension of the situation adaptivity introduced in SiAM: a multi-adaptive dialogue management system that is adaptive in multiple ways, namely to various cyber-physical environments (CPE), to diverse modality combinations, to a variety of interaction metaphors, and to diverse task domains and user models.
The system must support various interaction metaphors for dialogue with a CPE: depending on the user type, the task, and the situation, it should be possible to choose between an anthropomorphic intermediary interface and a direct-manipulation interface in a dual-reality approach.
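To make this choice concrete, the following sketch shows how a metaphor selector could map user type, task, and situational constraints to one of the two interface styles. The rules, field names, and user categories are illustrative assumptions, not the actual MADMACS selection policy.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Metaphor(Enum):
    ANTHROPOMORPHIC_INTERMEDIARY = auto()  # e.g. a virtual agent speaking for the environment
    DIRECT_MANIPULATION = auto()           # e.g. pointing at devices in a dual-reality view

@dataclass
class InteractionContext:
    user_type: str    # hypothetical categories, e.g. "novice" or "expert"
    task: str         # e.g. "troubleshooting" or "routine_control"
    hands_busy: bool  # situational constraint, e.g. derived from sensors

def select_metaphor(ctx: InteractionContext) -> Metaphor:
    """Pick an interaction metaphor from user type, task, and situation.

    Placeholder rules: a speech-based intermediary suits hands-busy
    situations and novices; experts doing routine control get
    direct manipulation.
    """
    if ctx.hands_busy or ctx.user_type == "novice":
        return Metaphor.ANTHROPOMORPHIC_INTERMEDIARY
    return Metaphor.DIRECT_MANIPULATION

# Example: an expert with free hands doing routine control
choice = select_metaphor(InteractionContext("expert", "routine_control", hands_busy=False))
# → Metaphor.DIRECT_MANIPULATION
```

The point of such a selector is that the metaphor is a runtime decision of the dialogue manager, not a fixed property of the interface.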
We aim to create social and emotional dialogue management by employing social and emotional user models and by using real-time sensor values together with knowledge of cyber-physical systems (CPS) and CPE. As a result, the generated mixed-initiative, multi-conversational (meta-)dialogues are influenced by the estimated affective states of the users talking to the CPE. With these emotional and social models integrated, the dialogue manager can perform a sophisticated conversational analysis and take its results into account.
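The influence of estimated affective states on dialogue behaviour can be sketched as a mapping from an affect estimate to dialogue-management parameters. The valence/arousal representation, the thresholds, and the parameter names below are assumptions for illustration only; the actual MADMACS models may differ.

```python
from dataclasses import dataclass

@dataclass
class AffectEstimate:
    """Affect state fused from real-time sensor values (assumed representation)."""
    valence: float  # -1.0 (negative) .. +1.0 (positive)
    arousal: float  #  0.0 (calm)     ..  1.0 (agitated)

def choose_dialogue_strategy(affect: AffectEstimate) -> dict:
    """Map an estimated affect state to dialogue-management parameters.

    Thresholds and parameter names are illustrative placeholders.
    """
    if affect.valence < 0 and affect.arousal > 0.6:
        # Frustrated user: the system takes more initiative,
        # keeps utterances short, and confirms actions explicitly.
        return {"initiative": "system", "verbosity": "low", "confirmations": "explicit"}
    if affect.arousal < 0.3:
        # Calm user: mixed initiative, richer explanations are acceptable.
        return {"initiative": "mixed", "verbosity": "high", "confirmations": "implicit"}
    return {"initiative": "mixed", "verbosity": "medium", "confirmations": "implicit"}

# Example: a frustrated user triggers a more system-driven, terse strategy
strategy = choose_dialogue_strategy(AffectEstimate(valence=-0.5, arousal=0.8))
# → {"initiative": "system", "verbosity": "low", "confirmations": "explicit"}
```

The design choice illustrated here is that affect does not change *what* the dialogue is about, only *how* it is conducted: initiative, verbosity, and confirmation style.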