Most AI voice command systems are unable to hold a conversation. They respond to single commands, unlike humans, who know when to interrupt and when to restart a conversation. New research aims to inform the design of proactive digital agents and so help develop the digital assistants of the future. These will go beyond today’s Alexa and Siri by starting interactions with us in just the right way and at just the right moment. The research was conducted by Professor Ben Cowan and Dr Justin Edwards of the ADAPT Centre at UCD, along with Christian Janssen from Utrecht University and Sandy Gould from Cardiff University. The paper, titled “Eliciting Spoken Interruptions to Inform Proactive Speech Agent Design”, was presented recently at the 3rd Conversational User Interfaces (CUI) conference.
Sales of digital voice assistants such as Alexa and Siri have grown year on year, and voice technology has become a key component of the smart device industry. Developers are looking for ways to create a more natural conversational flow with these devices. Current speech agent interactions are typically user-initiated, which limits what the agents can deliver. Future functionality will require agents to be proactive, sometimes interrupting users. Little is known about how these spoken interruptions should be designed, especially in urgent contexts. This research explores the design of proactive agent interruptions by investigating how people interrupt others engaged in complex tasks.
Speaking about the work, Dr Justin Edwards said: “We hope to empower speech agent designers to quickly and easily gather data about how people interrupt those engaged in another task, as we see this as a critical question for the future of proactive speech agent development.”
The full research paper is available online, and a video of the presentation can be viewed on YouTube.