Nearly three-quarters of people aren’t likely to trust a conversational AI tool like Google Duplex to make simple calls for them correctly, but experts say trust will build as usage increases.
In May 2018, Google announced Duplex, a new feature of its Google Assistant. This AI-powered tool could call restaurants for smartphone users and book reservations, speaking seamlessly to real humans. So seamlessly, in fact, that some claimed they could not distinguish Duplex from a real person.
The voice assistant’s speech is peppered with “um’s” and pauses, as a real conversation would be, and the tool could seemingly handle accents and unexpected conversational turns, astounding those less familiar with AI’s evolution.
Duplex signals a new era for conversational AI in everyday life. While consumers are increasingly comfortable speaking commands to devices like Amazon’s Echo, how do they feel about letting an AI virtual assistant speak for them?
Ciklum partnered with Clutch, a B2B ratings and reviews firm in Washington, D.C., to survey more than 500 consumers about their opinions on Duplex and conversational AI overall.
The survey found that consumers are currently wary of letting conversational AI complete phone calls or emails for them. Experts say this distrust follows a similar pattern of past technologies, though, and that trust will build in the future.
Nearly Three-Quarters of People Unlikely to Trust a Tool Like Duplex
The survey found that 73% of people say they are somewhat or very unlikely to trust a tool like Google Duplex to make a simple call for them correctly.
Experts noted that Duplex requires users to give up more control in an interaction than they are used to: users can’t review how the AI performed until the call is complete.
“On the computer, you have a user interface. You can see what it’s doing. Over the phone, you really have no idea what it’s capable of doing. You have to just believe,” said Daniel Shapiro, chief technology officer and co-founder of Lemay.ai.
Given this, there is seemingly greater room for error, and the tool will need to demonstrate its effectiveness to consumers over time to earn their trust. There is a long history of consumers distrusting new technology, though, only to later accept it.
“We’re all vulnerable to different forms of this anxiety when it comes to adjusting to new ideas or technological advances,” said Ivan Kotiuchyi, research engineer at Ciklum.
For example, consider the early public reactions to ATMs.
“When ATMs first came to this world, people were scared. At that time, I was in India. We never went to the ATM because we didn’t trust it,” said Dj Das, founder and CEO of ThirdEye Data.
Nowadays, ATMs are commonplace. Conversational artificial intelligence may follow a similar pattern.
Consumers Must Be Wary of Security Risks
The technology behind Duplex does have the potential to fuel new security risks, though. The survey report explores how conversational AI can boost “vishing” scams.
Vishing, a portmanteau of “voice” and “phishing,” refers to scams in which callers seek to steal private information by pretending to represent a reputable source. While these scams are sometimes run as robocalls, other scammers conduct the calls themselves, using social engineering to fool people more effectively.
Conversational AI could potentially automate more personalized vishing calls, increasing their reach and effectiveness. This is especially potent when you consider that technology already exists that can replicate people’s voices.
“Imagine a call that comes from your boss. It sounds like your boss and talks with his or her cadence but is actually an AI assistant using a voice imitation algorithm. Such a call offers limitless malicious potential to bad actors,” said Corey Nachreiner, chief technology officer of WatchGuard Technologies.
Consumers should be wary of more sophisticated AI-powered robocalls and phone scams.
Overall, Ciklum and Clutch’s new report demonstrates both the hesitancy toward and the risks of conversational AI but points to a positive future for these tools.