Scammers can now use voice-cloning AI to impersonate us or others and steal money
Just as you're ready to mingle and jingle, it's time for a warning about how a holiday-themed TikTok or Facebook reel that you post now could end up being used by scammers with AI-cloning tools to steal money from Grandma.
Even scarier, the same could be said about that friendly message you're leaving on your voicemail. Yep, we're now being told that it's wise to ditch the "Hi, this is Sally, can't come to the phone right now" custom message and go with the boring, pre-recorded default greeting offered on your cell phone that uses a voice that isn't yours.
It's not exactly the cheery kind of stuff we want to hear as the calendar moves into 2025. But it's not the kind of message we can afford to ignore, either.
Artificial intelligence tools can replicate our voices
Cyber criminals have a few new tools that experts say will open the door to even more fraud in the next few years: AI-powered voice and video cloning techniques.
Scammers want our voices and videos so that they can do a more convincing job of impersonating us when they're out to steal money. Such cloning can be wrongly used when crooks make a call pretending to be a grandson who claims to need money to get out of jail, a boss who wants you to pay some mysterious invoice, a romantic interest met on social media and a host of others.
The FBI is warning that artificial intelligence tools pose an escalating threat to consumers and businesses as cyber criminals use AI to conduct sophisticated phishing and social engineering attacks.
Michigan Attorney General Dana Nessel in early December warned residents that rapid advancements in AI are being misused to create "deepfake audio and video scams so realistic that they can even fool those who know us best."
We're not hearing from local law enforcement about a ton of such voice-impersonation scams taking place yet. But experts say people need to be prepared for an onslaught of activity and take precautions.
Those operating sophisticated fraud rings need only about three seconds of your voice to duplicate it, replicating your pitch, your tone and the pace at which you speak, when they use readily available, low-cost AI tools, according to Greg Bohl, chief data officer for Transaction Network Services. The company provides services to the telecommunications industry, including cell phone companies. Bohl's work focuses on developing AI technologies that can be used to combat fraud.
Many times, Bohl said, criminals will clone a voice using information that's already readily available on social media or elsewhere, such as your cell phone's voicemail greeting.
"The longer the greeting, the more accurate they can be with that voice replication," Bohl told me via a video conference call.
He called a 30-second snippet on a voicemail or a social media post a "gold mine for bad actors."
Many scams already spoof a legitimate phone number to make it appear like the call is coming from a well-known business or government agency. Often, real names are even used to make it seem like you're really hearing from someone who works at that agency or business.
But this new AI-cloning development will take scams to an entirely new level, making it harder for consumers to spot fraudulent robocalls and texts.
The Federal Communications Commission warns that AI can be used to "make it sound like celebrities, elected officials, or even your own friends and family are calling." The FCC has been working, along with state attorneys general, to shut down illegal AI-generated calls and texts.
Cyber crooks do their research to sound real
People unknowingly make the problem worse with social media posts by identifying family members — say your son Leo or your daughter Kate — in videos or photos.
The crooks, of course, need to know who cares about you enough to try to help you in an emergency. So, the scammers first must identify who they might target among your real friends and family before staging a crisis call to ask for money.
During the holidays, Bohl said, anything you do on social media to connect with family and friends can carry some risk and leave you more open to fraud.
His top two tips:
No. 1: Switch to the automated default voicemail greeting.
No. 2: Create a family "safe word."
Scam calls will sound even more real when they use replicated voices of those we know, experts say. So we will want to be able to calmly figure out whether we're talking to a crook. You want a safe word or security question in place long before any of these calls start.
Questions can help, such as: What five tricks can the dog do in the morning? What was your favorite memory as a child? What was the worst golf score you ever posted? You want something that a scammer won't be able to easily guess — or quickly look up online. (And if you don't have a dog or play golf, well, you might have a good trick question there.)
"We can expect a significant uptick in AI-powered fraudulent activities by 2025," said Katalin Parti, an associate professor of sociology and a cybercrime expert at Virginia Tech.
The combination of social media and generative AI will create more sophisticated and dangerous attacks, she said.
As part of the fraud, she said, scammers also can create robocalls to collect voice samples from potential victims. It's best not to engage with these calls at all, even by responding with a simple "hello."
Parti gives more tips: Don't contact any telephone number received via pop-up, text or email. Do not answer cold calls, even if you see a local area code. If you do not recognize the caller but you decide to answer the call anyhow, let the caller talk first.
AI voice-cloning is a significant threat as part of financial scams targeting older adults, as well as for spreading misinformation in political campaigns, according to Siwei Lyu, professor of computer science and engineering at the University at Buffalo and director of the UB Media Forensic Lab.
What's troubling, he said, is that AI-generated voices can be extremely difficult to detect, especially when they are played over the phone and when the message elicits an emotional reaction, such as when you think a close family member is hurt.
Take time to step back and double-check whether the call is real, Lyu said, and listen carefully for other clues to detect an AI-generated voice.
"Pay attention to abnormal characteristics, such as overly quiet background, lack of emotional tone in the voice or even the lack of breathing in between utterances," he said.
New tools can make a scam phone call more convincing
But remember, new technology is evolving. Today, more types of phishing emails and texts look legitimate, thanks to AI.
The old saw that suggests you just need to look for bad grammar or spelling mistakes to spot a fake email or text, for example, could prove useless one day, as AI tools assist foreign criminals in translating the phrases they're using to target U.S. businesses and consumers.
Among other things, the FBI warned that cyber crooks could:
- Generate short audio clips containing a loved one's voice to impersonate a grandchild or other relative who was arrested, hurt in a car accident or facing some other crisis. When the voice sounds like someone you know, you might be more likely to panic and give in to a request for bail money or even a demand for a ransom. And you might be more willing to take swift action when a call from your "boss" demands that you buy Best Buy gift cards to pay a particular invoice. Be skeptical.
- Use AI-generated audio clips of individuals to impersonate them and gain access to bank accounts.
- Use realistic AI-generated videos in private communications to "prove" that the online contact is a "real person."
Many times, we cannot even imagine how cyber criminals thousands of miles away could know how our voices sound. But much is out there already — more than even a simple voicemail message.
School events are streamed. Business conferences are available online. Sometimes, our jobs require that we post information online to market the brand.
And "there's growing concern that bad guys can hack into voicemail systems or even phone companies to steal voicemail messages that might be left with a doctor's office or financial advisor," said Teresa Murray, who directs the Consumer Watchdog office for U.S. PIRG, a nonprofit advocacy group.
Such threats become more real, she said, in light of incidents such as the massive data breach suffered by National Public Data, which aggregates data to provide background checks. The breach was announced in August.
Yep, it's downright sickening.
Murray said the proliferation of scams makes it essential to have conversations with our loved ones to make sure everyone understands that computers can impersonate voices of people we know.
Talk, too, about how you cannot trust caller ID to show that a legitimate government agency is calling you.
Don't be afraid to just hang up
Michigan Attorney General Nessel's alert about potential holiday scams using artificial intelligence recommended the following:
- Agree on a "code word" or key phrase that only your family would know to confirm an identity during a suspicious call.
- Be ready to hang up if something feels off.
- Call someone to verify their identity. Use a phone number that you know is real.
- Do not hand over money easily. Scammers often demand that you pay them with cryptocurrency, gift cards or money transfers. But once you send that money, it's hard to trace or reverse.
Contact personal finance columnist Susan Tompor: [email protected]. Follow her on X (Twitter) @tompor.