

Phone scams

Scammers can now use voice-cloning AI to impersonate us or others and steal money

Susan Tompor
Detroit Free Press

Just as you’re ready to mingle and jingle, it’s time for a warning about how a holiday-themed TikTok or Facebook reel that you post now could end up being used by scammers with AI-cloning tools to steal money from Grandma.

Even more scary, the same could be said about that friendly message you’re leaving on your voicemail. Yep, we’re now being told that it’s smart to ditch the “Hi, this is Sally, can’t come to the phone right now” custom message and go with the dull, pre-recorded default greeting offered on your cell phone that uses a voice that isn’t yours.

It’s not exactly the cheery kind of stuff we want to hear as the calendar moves closer to 2025. But it’s not exactly the kind of message we can afford to ignore, either.

Artificial intelligence tools can replicate our voices

Cyber criminals have a few new tools that experts say will open up the door for even more fraud in the next few years: AI-powered voice and video cloning techniques.


Financial exploitation of seniors is growing. Cyber security experts warn that AI will cause more problems. Rick Nease, Detroit Free Press

Scammers want our voices and videos so that they can do a more convincing job of impersonating us when they’re out to steal money. Such cloning can be misused when crooks make a call pretending to be a grandson who claims to need money to get out of jail, a boss who wants you to pay some mysterious invoice, a romantic interest met on social media and a host of others.

The FBI is warning that artificial intelligence tools pose an escalating threat to consumers and businesses as cyber criminals use AI to conduct sophisticated phishing and social engineering attacks.

Michigan Attorney General Dana Nessel in early December warned residents that rapid advancements in AI are being misused to create “deepfake audio and video scams so realistic that they can even fool those who know us best.”

We’re not hearing from local law enforcement about a ton of such voice-impersonation scams taking place yet. But experts say people need to be prepared for an onslaught of activity and take precautions.

Those operating sophisticated fraud rings need only roughly three seconds of your voice to duplicate who you are, replicating the pitch of your voice, your tone and the pace at which you talk, when the crooks use some readily available, low-cost AI tools, according to Greg Bohl, chief data officer for Transaction Network Services. The company provides services to the telecommunications industry, including cell phone companies. Bohl’s work focuses on developing AI technologies that can be used to combat fraud.

Many times, Bohl said, criminals will take information that’s already readily available on social media or elsewhere, such as your cell phone’s voicemail greeting, to clone a voice.

“The longer the greeting, the more accurate they can be with that voice replication,” Bohl told me via a video conference call.

He called a 30-second snippet on a voicemail or a social media post a “gold mine for bad actors.”

Many scams already spoof a legitimate phone number to make it appear like the call is coming from a well-known business or government agency. Often, real names are even used to make it seem like you’re really hearing from someone who works at that agency or business.

But this new AI-cloning advancement will take scams to an entirely new level, making it harder for consumers to spot fraudulent robocalls and texts.

The Federal Communications Commission warns that AI can be used to “make it sound like celebrities, elected officials, or even your own friends and family are calling.” The FCC has been working, along with state attorneys general, to shut down illegal AI voices and texts.

Cyber crooks do their research to sound real

People unknowingly make the problem worse with social media posts by identifying family members (say, your son Leo or your daughter Kate) in videos or photos.

The crooks, of course, need to know who cares about you enough to try to help you in an emergency. So, the scammers first must identify who they might target among your real friends and family before staging a crisis call to ask for money.

During the holidays, Bohl said, anything you do on social media to connect with families and friends can create some risk and make you more open to fraud.

His top two tips:

No. 1: Switch to automated voicemail.

No. 2: Create a family “safe word.”

Scam calls will sound even more real using replicated voices of those we know, experts say. So, we will want to be able to calmly figure out if we’re talking to a crook. You want a safe word or safety question in place long before any of these calls start.

Questions can help, such as: What five tricks can the dog do in the morning? What was your favorite memory as a kid? What was the worst golf score you ever posted? You want something that a scammer won’t be able to easily guess or quickly look up online. (And if you don’t have a dog or play golf, well, you might have a good trick question there.)

“We can expect a significant uptick in AI-powered fraudulent activities by 2025,” said Katalin Parti, an associate professor of sociology and a cybercrime specialist at Virginia Tech.

The combination of social media and generative AI will create more sophisticated and dangerous attacks, she said.

As part of the fraud, she said, scammers also can make robocalls to collect voice samples from potential victims. It’s best not to engage with these types of calls, even by answering with a simple “hello.”

Parti gives more tips: Don’t call any telephone number received via pop-up, text or email. Do not answer cold calls, even if you see a local area code. If you do not know the caller but you decide to answer the call anyway, let the caller speak first.

AI voice-cloning is a significant threat as part of financial scams targeting older adults, as well as for misinformation in political campaigns, according to Siwei Lyu, professor of computer science and engineering at the University at Buffalo and director of the UB Media Forensic Lab.

What’s troubling, he said, is that AI-generated voices can be extremely hard to detect, especially when they are played over the phone and when the message can elicit emotional reactions, such as when you think a close family member is hurt.

Take time to step back and double-check whether the call is real, Lyu said, and listen carefully for other clues to detect an AI-generated voice.

“Pay attention to abnormal characteristics, such as an overly quiet background, lack of emotional tone in the voice or even the lack of breathing in between utterances,” he said.

New tools can make a scam phone call more convincing

But remember, this new technology is evolving. Today, more types of phishing emails and texts look legitimate, thanks to AI.

The old saw, for example, that you just need to look for bad grammar or spelling mistakes to spot a fake email or text could prove useless one day, as AI tools help foreign criminals translate the phrases they’re using to target U.S. businesses and consumers.

Among other things, the FBI warned that cyber crooks could:

  • Generate short audio clips containing a loved one’s voice to impersonate a grandchild or other relative who was arrested, hurt in a car accident or facing some other crisis. When the voice sounds like someone you know, you might be more likely to panic and give in to a request for bail money or even a demand for a ransom. And you might be more willing to take quick action when a call from your “boss” demands that you buy gift cards for Best Buy to pay a particular invoice. Be skeptical.
  • Crooks could use AI-generated audio clips of individuals and impersonate them to gain access to bank accounts.
  • Scammers can be expected to use realistic videos for private communications to “prove” the online contact is a “real person.”

Many times, we cannot even imagine how cyber criminals thousands of miles away could know how our voices sound. But much is out there already, more than even a simple voicemail message.

More: ‘Selling fast’ or just a sales tactic? The truth about online alerts fueling impulse buys

More: Consumers could see a $5 overdraft fee in 2025 under a final rule, or maybe not

School events are streamed. Business conferences are available online. Sometimes, our jobs require that we post information online to market the brand.

And “there’s growing concern that bad guys can hack into voicemail systems or even phone companies to steal voicemail messages that might be left with a doctor’s office or financial advisor,” said Teresa Murray, who directs the Consumer Watchdog office for U.S. PIRG, a nonprofit advocacy group.

Such threats become more real, she said, in light of incidents such as the massive data breach suffered by National Public Data, which aggregates data to provide background checks. The breach was announced in August.

Yep, it’s downright sickening.

Murray said the proliferation of scams makes it essential to have conversations with our loved ones to make sure everyone understands that computers can impersonate the voices of people we know.

Talk about how you cannot trust Caller ID to show that a legitimate government agency is calling you, too.

Don’t be afraid to just hang up

Michigan Attorney General Nessel’s alert about potential holiday scams using artificial intelligence recommended the following:

  • Families should agree on a “code word” or key phrase that only your family would know to confirm an identity during a suspicious call.
  • Be ready to hang up. If something feels off, just hang up.
  • Call someone to verify their identity. Use a phone number that you know is real.
  • Do not hand over money easily. Scammers often demand that you pay them with cryptocurrency, gift cards or money transfers. But once you send that money, it’s hard to trace or reverse.

Contact money management columnist Susan Tompor: [email protected]. Follow her on X (Twitter) @tompor.
