Three seconds of audio is all it takes.
Cybercriminals have taken up newly forged artificial intelligence (AI) voice cloning tools and created a new breed of scam. With a small sample of audio, they can clone the voice of nearly anyone and send bogus messages by voicemail or voice messaging texts.
The aim, most often, is to trick people out of hundreds, if not thousands, of dollars.
Our recent global study found that out of 7,000 people surveyed, one in four said they had experienced an AI voice cloning scam or knew someone who had.
Further, our research team at McAfee Labs discovered just how easily cybercriminals can pull off these scams.
With a small sample of a person’s voice and a script cooked up by a cybercriminal, these voice clone messages sound convincing. In our worldwide survey, 70% of people said they weren’t confident they could tell the difference between a cloned voice and the real thing.
Cybercriminals create the kind of messages you might expect. Ones full of urgency and distress. They will use the cloning tool to impersonate a victim’s friend or family member with a voice message that says they’ve been in a car accident, or maybe that they’ve been robbed or injured. Either way, the bogus message often says they need money right away.
In all, the approach has proven quite effective so far. One in ten people surveyed in our study said they received a message from an AI voice clone, and 77% of those victims said they lost money as a result.
Of the people who reported losing money, 36% said they lost between $500 and $3,000, while 7% got taken for sums anywhere between $5,000 and $15,000.
Of course, a clone needs an original. Cybercriminals have no difficulty sourcing original voice files to create their clones. Our study found that 53% of adults said they share their voice data online or in recorded notes at least once a week, and 49% do so up to ten times a week. All this activity generates voice recordings that could be subject to hacking, theft, or sharing (whether accidental or malicious).
Consider that people post videos of themselves on YouTube, share reels on social media, and perhaps even participate in podcasts. Even by accessing relatively public sources, cybercriminals can stockpile their arsenals with powerful source material.
Nearly half (45%) of our survey respondents said they would reply to a voicemail or voice message purporting to be from a friend or loved one in need of money, particularly if they thought the request had come from their partner or spouse (40%), mother (24%), or child (20%).
Further, they reported they’d likely respond to one of these messages if the sender claimed to be in distress, much like the accident, robbery, and injury scenarios described above.
These messages are the latest examples of targeted “spear phishing” attacks, which target specific people with specific information that seems just credible enough to act on. Cybercriminals will often source this information from public social media profiles and other places online where people post about themselves, their families, their travels, and so on, and then attempt to cash in.
Payment methods vary, yet cybercriminals often ask for forms that are difficult to trace or recover, such as gift cards, wire transfers, reloadable debit cards, and even cryptocurrency. As always, requests for these kinds of payments raise a major red flag. It could very well be a scam.
In conjunction with this survey, researchers at McAfee Labs spent two weeks investigating the accessibility, ease of use, and efficacy of AI voice cloning tools. They readily found more than a dozen of them freely available on the internet.
These tools required only a basic level of experience and expertise to use. In one instance, just three seconds of audio was enough to produce a clone with an 85% voice match to the original (based on the benchmarking and assessment of McAfee security researchers). Further effort can increase the accuracy even more. By training the data models, McAfee researchers achieved a 95% voice match based on just a small number of audio files.
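As a rough illustration of what a “voice match” percentage can represent, the sketch below compares two recordings with the open-source Resemblyzer speaker-embedding library. The article doesn’t describe McAfee’s actual benchmarking method, so treat this as an assumption: the file names are hypothetical, and cosine similarity between speaker embeddings is just one common way to score how alike two voices sound.

```python
# A minimal sketch, NOT McAfee's methodology: scoring how closely a
# suspected clone matches an original voice using speaker embeddings.
import numpy as np
from resemblyzer import VoiceEncoder, preprocess_wav  # pip install resemblyzer

encoder = VoiceEncoder()

# Hypothetical file names, for illustration only.
original = encoder.embed_utterance(preprocess_wav("original_voice.wav"))
suspect = encoder.embed_utterance(preprocess_wav("cloned_voice.wav"))

# Resemblyzer embeddings are L2-normalized, so the dot product equals the
# cosine similarity between the two voices (1.0 means an identical speaker).
similarity = float(np.dot(original, suspect))
print(f"Voice match: {similarity:.0%}")
```

A score near 1.0 suggests the two clips came from the same speaker, or from a very convincing clone of one.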
McAfee’s researchers also discovered that they could easily replicate accents from around the world, whether they were from the US, UK, India, or Australia. However, more distinctive voices were more challenging to copy, such as those of people who speak with an unusual pace, rhythm, or style. (Think of actor Christopher Walken.) Such voices require more effort to clone accurately, and people who have them are less likely to get cloned, at least where the AI technology currently stands, and comedic impersonations aside.
The research team stated that this is yet one more way that AI has lowered the barrier to entry for cybercriminals. Whether it’s used to create malware, write deceptive messages for romance scams, or now mount spear phishing attacks with voice cloning technology, it has never been easier to commit sophisticated-looking, and sounding, cybercrime.
The study also found that the rise of deepfakes and other disinformation created with AI tools has made people more skeptical of what they see online. Now, 32% of adults said their trust in social media is lower than ever before.
A lot can come from a three-second audio clip.
With the advent of AI-driven voice cloning tools, cybercriminals have created a new form of scam. With arguably stunning accuracy, these tools let cybercriminals clone the voice of nearly anyone. All they need is a short audio clip to kick off the cloning process.
Yet as with all scams, you have ways to protect yourself. A sharp sense of what seems right and wrong, along with a few straightforward security steps, can help keep you and your loved ones from falling for these AI voice clone scams.
For a closer look at the survey data, along with a nation-by-nation breakdown, download a copy of our report here.
The survey was conducted between January 27th and February 1st, 2023 by the market research company MSI-ACI, with people aged 18 years and older invited to complete an online questionnaire. In total, 7,000 people completed the survey across nine countries: the United States, United Kingdom, France, Germany, Australia, India, Japan, Brazil, and Mexico.