Emotions are essential to social interaction: they are multimodal (tone of voice, gesture, language), yet they are mostly studied unimodally. Emotionally expressive agents are trusted more, giving product designers the opportunity to explore deeper digital experiences with users.
Think of your own digital concierge. To understand how people perceive multimodality in artificial agents, we must build machines that users trust enough to engage with, so that this perception can be assessed.
ADAPT generated an animated avatar, deployed it on a Furhat robot, and built an associated application to assess users’ interactions with the avatar.
This research showed that perceiving the avatar as intelligent, knowledgeable and trustworthy increases behavioural trust. It has use cases in game development and entertainment, and in avatars for health and training.
Check out this video from Robin Wilton on Trust and Ethics in Research and Innovation.
@DCUFSH @DCUSNPCH @Joe_Quinn_Davra @Ceic_DCU @AdaptCentre @subhashishhh delighted to receive this award today on behalf of the team