July 19, 2023
Until recently, cloning your own voice or someone else’s required a particular set of tech skills. With the tsunami-like advent of AI, however, that’s no longer the case. Now anyone can do it using sophisticated software that’s widely available. But while services like Speechify and Podcastle tout the numerous benefits of voice cloning tech (speeding up audio content creation, for instance), there’s also a dark side: impersonation scams.
Here’s how they work in a hypothetical nutshell, according to a Federal Trade Commission warning issued this past spring: “You get a call. There’s a panicked voice on the line. It’s your grandson. He says he’s in deep trouble — he wrecked the car and landed in jail. But you can help by sending money. You take a deep breath and think. You’ve heard about grandparent scams. But darn, it sounds just like him. How could it be a scam? Voice cloning, that’s how.”
All the scammer needs, the FTC noted, “is a short audio clip of your family member’s voice — which he could get from content posted online — and a voice-cloning program. When the scammer calls you, he’ll sound just like your loved one.”
And it’s happened over and over again. In one recent case, using audio and information scraped from social media, scammers cloned a teenage girl’s voice in an attempt to convince her mother that she’d been kidnapped during a ski trip. “The voice sounded just like Brie’s, the inflection, everything,” the mother, Jennifer DeStefano, told CNN recently. “Then, all of a sudden, I heard a man say, ‘Lay down, put your head back.’ I’m thinking she’s being gurnied off the mountain, which is common in skiing. So I started to panic.”
The report went on: “As the cries for help continued in the background, a deep male voice started firing off commands: ‘Listen here. I have your daughter. You call the police, you call anybody, I’m gonna pop her so full of drugs. I’m gonna have my way with her then drop her off in Mexico, and you’re never going to see her again.’”
Ultimately exposed as a hoax, the call is just one of countless cases of AI-assisted criminality that happen every day, part of a rapidly worsening plague that cost Americans $2.6 billion last year alone. “Technology is making it easier and cheaper for bad actors to mimic voices, convincing people, often the elderly, that their loved ones are in distress,” the Washington Post reported recently. “In 2022, impostor scams were the second most popular racket in America, with over 36,000 reports of people being swindled by those pretending to be friends and family, according to data from the Federal Trade Commission. Over 5,100 of those incidents happened over the phone, accounting for over $11 million in losses, FTC officials said.”
One AI voice synthesis tool from startup ElevenLabs, touted as “the most realistic and versatile AI speech software, ever” and a boon to storytelling, became something of a poster child for abuse in this realm. Anonymous users on the controversial forum 4chan used the software to generate voice clips laced with violence, transphobia, homophobia and racism. One reporter used ElevenLabs to break into his own bank account. The company publicly acknowledged the “misuse” and reportedly took steps to address it.
Beyond individual victims, businesses of all types and sizes are ripe targets for voice scams. That’s because, as an article from TheStreet notes, when a request appears to come from a familiar voice, “people are more likely to open an email, click a link or download an attachment,” especially if there’s urgency attached.
“This will make spear phishing attacks on corporate entities much more effective, especially when it comes to wire fraud schemes,” Chris Pierson, CEO of BlackCloak, told the site. “When you combine this with the ease with which phone numbers can be spoofed and scripts that can be created by ChatGPT, it can create a perfect storm.”
As Hany Farid, a computer science professor at the University of California, Berkeley, and a member of the Berkeley Artificial Intelligence Lab, told CNN: “A reasonably good clone can be created with under a minute of audio and some are claiming that even a few seconds may be enough. The trend over the past few years has been that less and less data is needed to make a compelling fake.”
So how can people avoid being sitting ducks as these AI-enabled scams proliferate? Here are some solid tips courtesy of CNN and FBI special agent Siobhan Johnson:
- Don’t post information about upcoming trips on social media – it gives the scammers a window to target your family.
- Create a family password. If someone calls and says they’ve kidnapped your child, you can tell them to ask the child for the password.
- If you get such a call, buy yourself extra time to make a plan and alert law enforcement.
- If you’re in the middle of a virtual kidnapping and there’s someone else in the house, ask them to call 911 and urge the dispatcher to notify the FBI.
- Be wary of providing financial information to strangers over the phone. Virtual kidnappers often demand a ransom via wire transfer service, cryptocurrency or gift cards.
- Don’t trust the voice you hear on the call. If you can’t reach a loved one, have a family member, friend or someone else in the room try to contact them for you.
“The worst part of this is, every week we’re getting more calls, seeing more victims,” IdentityIQ founder Scott Hermann told Fox Business about the rise in voice impersonation scams. “And the reason we’re seeing more victims is because it’s effective.”
About The Expert
Mishaal Khan, Mindsight’s Security Solutions Architect, has been breaking and – thankfully – rebuilding computers for as long as he can remember. As a Certified Ethical Hacker (CEH), CCIE R&S, Security Practitioner, and Certified Social Engineer Pentester, Khan offers insight into the often murky world of cybersecurity. Khan brings a multinational perspective to the business security posture, and he has consulted with SMBs, schools, government institutions, and global enterprises, seeking to spread awareness in security, privacy, and open source intelligence.
Mindsight is industry recognized for delivering secure IT solutions and thought leadership that address your infrastructure and communications needs. Our engineers are expert level only – and they’re known as the most respected and valued engineering team based in Chicago, serving emerging to enterprise organizations around the globe. That’s why clients trust Mindsight as an extension of their IT team.
Visit us at http://www.gomindsight.com.