This project addresses the modelling of multimodal communicative human behaviour during natural multiparty interactions, focusing on the conversational and feedback strategies that participants use to manage attention, engagement and conflict. The knowledge base for this modelling problem is a collection of human-human interaction recordings acquired through a concise experimental protocol and state-of-the-art equipment. Communicative behaviour is analysed across multimodal signals, including facial expressions, gaze and prosodic cues, which carry information about social interaction management and speakers’ attitudes. The project builds on the applicant’s strengths in multimodal communication and language technology, combining them with new training in intelligent systems and human-computer interaction to support a new research direction and to strengthen cross-European collaboration.

Through its novel perspective, the work carried out in the project has substantially covered the multiparty interaction setting and has integrated participants’ underlying emotional states and communicative intent into models of interactional behaviour. It has thus laid the foundations for intuitive, context-sensitive multimodal communication through interfaces that can automatically adapt to users and their state, taking their cognitive, auditory and visual cues into account. The integration of technologies emerging from the research output will influence the design of collaborative human-machine interfaces and, in the long run, improve the quality or efficiency of private and professional services and benefit user communities.

Learn more:

  • Start date: 1 September 2016
  • PI: Carl Vogel (Coordinator)
  • Acronym: MULTISIMO
  • Title: MULTImodal and MULTIparty Social Interactions Modelling
  • Website: htt
  • Grant ID: 701621
  • Overall budget: €175,866
Project Contact:
  • Carl Vogel