Overview

(Screenshots: Project Overview, Utterance Editor, Connections)

The Situation-Adaptive Dialogue Platform (SiAM-dp) was created as a new approach to supporting modern multimodal interaction concepts while incorporating as much situational knowledge as possible. Its development began as part of the SiAM project. The platform is a modular framework designed to support rapid application development with a low learning curve, tool support, and out-of-the-box integration of many devices and modalities. Further features include:

  • Integrated coverage of multimodality concepts (fusion and fission); a minimal fusion sketch follows this list
  • Intelligent dialogue system behavior through semantic interpretation of user input
  • Situation adaptivity through dynamic behavior depending on user and context
  • Consideration of user resources (e.g. cognitive load, time)
  • Support for offline evaluation of dialogue runs to gain early insights without a costly user study
  • Multi-party support, allowing passengers to be included in the dialogue discourse
  • Out-of-the-box support for several devices representing common modalities
  • Ability to dynamically connect to external output devices, e.g. electronic road signs and billboards
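
To make the fusion concept more concrete, here is a minimal, purely illustrative Java sketch: a spoken command containing a deictic reference ("delete that") is merged with a recent pointing gesture into a single semantic interpretation. All names in the sketch (FusionSketch, SpeechInput, PointingGesture, Interpretation) are hypothetical and do not reflect SiAM-dp's actual API.

```java
import java.time.Duration;
import java.time.Instant;
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.Optional;

/**
 * Illustrative fusion sketch: combine a spoken command that contains a
 * deictic reference ("delete that") with a recent pointing gesture into a
 * single semantic interpretation. All types and names are hypothetical
 * and are not taken from the SiAM-dp API.
 */
public class FusionSketch {

    /** A recognized speech input with a possibly unresolved deictic slot. */
    record SpeechInput(String intent, boolean hasDeicticReference, Instant time) {}

    /** A pointing gesture that references a concrete object. */
    record PointingGesture(String targetObjectId, Instant time) {}

    /** The fused, fully resolved interpretation handed to the dialogue manager. */
    record Interpretation(String intent, String targetObjectId) {}

    /** Gestures are kept for a short window so speech can be fused with them. */
    private static final Duration FUSION_WINDOW = Duration.ofSeconds(2);

    private final Deque<PointingGesture> recentGestures = new ArrayDeque<>();

    void onGesture(PointingGesture g) {
        recentGestures.addFirst(g);   // most recent gesture first
    }

    /** Fusion step: resolve the deictic reference with the most recent gesture, if any. */
    Optional<Interpretation> onSpeech(SpeechInput s) {
        if (!s.hasDeicticReference()) {
            return Optional.of(new Interpretation(s.intent(), null));
        }
        return recentGestures.stream()
                .filter(g -> Duration.between(g.time(), s.time()).abs().compareTo(FUSION_WINDOW) <= 0)
                .findFirst()
                .map(g -> new Interpretation(s.intent(), g.targetObjectId()));
    }

    public static void main(String[] args) {
        FusionSketch fusion = new FusionSketch();
        Instant now = Instant.now();
        fusion.onGesture(new PointingGesture("poi-42", now.minusMillis(500)));
        // "Delete that" plus the pointing gesture yields one resolved interpretation.
        fusion.onSpeech(new SpeechInput("delete", true, now))
              .ifPresent(i -> System.out.println(i.intent() + " -> " + i.targetObjectId()));
    }
}
```

Fission works in the opposite direction: a single output intention is distributed across the currently available output modalities.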

Due to its strong feature set and support for multimodality, SiAM-dp was chosen as the starting point for adding new dialogue management concepts in MADMACS. Our roadmap includes the following additions during the lifetime of MADMACS:

  • Integration of new input and output devices and modalities into the shared I/O model: smart glasses, smart watches, 3D audio, haptic devices, head trackers, VR/AR/holographic displays, etc.
  • Module for attention control (focus-of-attention detection and attention guidance)
  • Multi-party communication support, including a group detection module and turn-taking logic
  • Integrated metadialogue elements such as confirmation, clarification, and persuasion dialogues
  • Connection to the new device platform and integration of semantic actions into the dialogue, as well as dynamic “Plug & Interact” functionality (see the illustrative sketch after this list)
  • Inclusion of emotional behavior and language as part of the EmoSocial output generation component
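
As a rough illustration of the “Plug & Interact” idea from the roadmap above, the following Java sketch shows a hypothetical registry in which external devices announce the semantic actions they support when they connect and are removed again when they disconnect, so that dialogue output can be routed to whatever is currently available. All class and method names are invented for illustration and are not the actual device platform interfaces.

```java
import java.util.Map;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;

/**
 * Illustrative "Plug & Interact" sketch: external devices announce themselves
 * together with the semantic actions they support, and the dialogue platform
 * routes output to them while they are connected. All names are hypothetical
 * and not part of SiAM-dp or the device platform.
 */
public class PlugAndInteractSketch {

    /** A device announcement: identifier plus the semantic actions it offers. */
    record DeviceAnnouncement(String deviceId, Set<String> supportedActions) {}

    /** Registry of currently connected devices, keyed by device id. */
    private final Map<String, DeviceAnnouncement> connectedDevices = new ConcurrentHashMap<>();

    /** Called when a device appears ("plug"). */
    void onDeviceConnected(DeviceAnnouncement announcement) {
        connectedDevices.put(announcement.deviceId(), announcement);
    }

    /** Called when a device disappears again. */
    void onDeviceDisconnected(String deviceId) {
        connectedDevices.remove(deviceId);
    }

    /** "Interact": send an action to the first connected device that supports it. */
    boolean dispatch(String action, String payload) {
        return connectedDevices.values().stream()
                .filter(d -> d.supportedActions().contains(action))
                .findFirst()
                .map(d -> {
                    System.out.printf("-> %s: %s(%s)%n", d.deviceId(), action, payload);
                    return true;
                })
                .orElse(false);
    }

    public static void main(String[] args) {
        PlugAndInteractSketch platform = new PlugAndInteractSketch();
        // An electronic road sign comes into range and announces its capabilities.
        platform.onDeviceConnected(new DeviceAnnouncement("road-sign-7", Set.of("showText")));
        platform.dispatch("showText", "Congestion ahead");   // routed to the road sign
        platform.onDeviceDisconnected("road-sign-7");
        platform.dispatch("showText", "Congestion ahead");   // no suitable device connected
    }
}
```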

Multiple intermediate releases of SiAM-dp are planned during the runtime of the project.