Emotions are essential for social interactions: they are multimodal (tone of voice, gestures, language), yet they are mostly studied unimodally. Emotionally expressive agents are trusted more, giving product designers the opportunity to explore deeper digital experiences with users.
Think about your own digital concierge. To understand the role of multimodality in how artificial agents are perceived, it is necessary to build machines that people trust, and to engage potential users in assessing that perception.
ADAPT generated an animated avatar and deployed it within a Furhat robot, with an associated application to assess users’ interaction with the avatar.
This research highlighted that perceiving the avatar as intelligent, knowledgeable and trustworthy increases behavioural trust. It has use cases across game development and entertainment, as well as avatars for health and training.
ADAPT researchers discuss Ethics & Emotion in AI at @RCSI_Irl and @TrinityResInnov Innovation Showcase 2021: https://bit.ly/3I4czEP #ResponsibleAI