• research
  • cyber-physical systems
  • dialogue
  • focus of attention
  • speech
  • user adaptation
  • cyber-physical environments
  • eyegaze
  • smart glasses
  • massively multimodal
  • interaction metaphors
  • attention guidance
  • multi-party
  • gestures
  • wearables
  • free choice modalities
  • multiadaptive
  • user studies
  • activity detection
  • plug & interact
  • head tracking
  • dialogue platform
  • emosocial
  • metadialogue
  • dialogue management
  • MADMACS

Welcome to MADMACS

[Video: CeBIT demo composition]

Cyber-physical systems, which bridge the cyber-world of computing and communication with the physical world, are enabling innovative applications with enormous societal and economic impact. Networked cyber-physical systems currently form the basis for intelligent environments in a variety of settings, such as smart factories, smart transportation systems, smart shops, and smart buildings. However, one of the remaining grand challenges for the new post-PC era of the Internet of Things is to transform the way humans interact with and control such cyber-physical environments (CPE). Today, user-interface methodologies and technologies are still largely dominated by the traditional paradigm of human-computer interaction, in which a single user interacts with a single stationary computing device. The main goal of the MADMACS project is therefore to lay the foundations for a new generation of user interfaces that are adequate for human-environment interaction in CPE.

The project aims to make the large number of sensors and actuators available at any time to mobile users, who can control them intuitively and in a multimodal fashion. As users move and change within a CPE, the environment should adapt in multiple ways to the user, the task, and the interaction method. Multimodal input and output should allow a free choice of modalities, both at close range and over a distance. In all situations, the system should be aware of the user's focus of attention and, if needed, be able to guide it using output devices on the user and in the environment.
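The free-choice-of-modalities idea above can be illustrated with a minimal sketch (all names here are hypothetical and not part of the actual MADMACS platform): input events from different modalities are normalized into a common, modality-independent intent representation, so that downstream dialogue management does not need to know which modality the user chose.

```python
from dataclasses import dataclass

@dataclass
class Intent:
    """Modality-independent representation of a user request."""
    action: str
    target: str
    modality: str  # retained only for adaptation and statistics

def from_speech(utterance: str) -> Intent:
    # Toy keyword parsing; a real system would use an ASR/NLU pipeline.
    action, _, target = utterance.partition(" ")
    return Intent(action=action.lower(), target=target.lower(), modality="speech")

def from_gesture(gesture: str, gazed_object: str) -> Intent:
    # A pointing gesture combined with the eye-gaze target resolves the referent.
    action = "select" if gesture == "point" else gesture
    return Intent(action=action, target=gazed_object, modality="gesture+gaze")

# Both modalities yield the same intent, so the dialogue manager
# can remain modality-agnostic:
a = from_speech("Select shelf3")
b = from_gesture("point", "shelf3")
assert a.action == b.action == "select"
assert a.target == b.target == "shelf3"
```

In this sketch the `modality` field is kept only so the system can adapt to the user's habits over time; the dialogue logic itself operates purely on `action` and `target`.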

The two main test sites for MADMACS are situated in the work and business domain. The first is located in the retail environment of the future, where employees perform tasks such as commissioning and intra-logistics more easily in a highly instrumented store with the help of MADMACS multiadaptive dialogue, commissioning carts, and wearable devices. The second scenario focuses on a car garage, where mechanics and engineers are supported in their daily work, jointly carrying out tasks guided by MADMACS multi-party interaction. In addition to these main test sites, the project works with partners to pursue dissemination opportunities in other domains as well, such as smart homes, automotive, and smart cities.

To learn more about these individual project challenges, please visit the sections under “Research Topics” on this website.

Contact

Feel free to contact us about research assignments, collaboration inquiries, student assistant (HiWi) positions, theses, teaching activities, etc.

Project Management:

Michael Feld
Campus D3 2
Stuhlsatzenhausweg 3
66123 Saarbrücken
Phone: +49 681 85775 5328
michael.feld (at) dfki . de

Tim Schwartz
Campus D3 2
Stuhlsatzenhausweg 3
66123 Saarbrücken
Phone: +49 681 85775 5306
tim.schwartz (at) dfki . de