Scammers Are Using Voice Clips to Create AI Clones. Here's How to Stay Safe

2025-02-28 22:00:00

By Geoff Williams, Contributor

Like something out of a science fiction movie, criminals are using AI to create voice clones of your family, friends and coworkers, all to scam you. 

In a 2023 survey, the antivirus software company McAfee found that a quarter of adults across seven countries have experienced some form of AI voice scam. The survey also found that 77% of victims lost money due to the interaction.

The rise of AI-enabled voice clones has taken identity theft to an entirely new level. Here's how you can protect yourself and your money.

How do fraudsters use AI to create clones?

It works like this: The con artist finds an audio clip of the person's voice, often on social media. Scammers need as little as 3 seconds to make a decent simulation of a voice, according to the McAfee study.

"The longer the sample, the more accurate the fake," said Neal O'Farrell, founder of Think Security First! and the Identity Theft Council, a nonprofit that educates the public about identity theft.

Once the criminal has a voice sample, turning the clip into a believable voice depends on how sophisticated the criminal and their equipment are.
"If there isn't something already available, like a video on social media, a quick phone call might suffice," said O'Farrell, who's also a CNET Expert Review Board member. "A target responding with something like, 'No, I'm sorry, there's no one here by the name, and I've lived here for at least 10 years' should be enough."

The fraudster then runs the audio clip through an AI program that creates the voice clone. The thief may tell the AI what to say in the beginning, but depending on its level of sophistication, the AI program could end up doing most of the work.

O'Farrell said it's possible to pair well-trained large language models -- AI systems trained to understand and generate language -- with text-to-speech software to hold a conversation with a victim.

But because criminals know you're more likely to get suspicious if you talk to a voice clone for too long, they'll typically use no more than a few quick sentences when impersonating a loved one.

The laws surrounding using AI to fake someone's voice are murky, given the relative newness of the technology. But if it's done to make money, it's usually illegal. 

Watch out -- these people could be clones

Several professions -- and even the people closest to you -- make ideal targets for bad actors to clone.

Loved ones

You've probably heard of the grandparent scam, where a young-sounding fraudster calls an elderly person and pretends to be their grandchild in jail, needing bail money. The grandparent then wires the money to help their grandchild. 

The Federal Bureau of Investigation released a public-service announcement in December 2024 warning that advances in AI are making scams harder to discern, including criminals creating audio clips to impersonate a close relative.

Your boss

If your boss called you and told you to transfer money from one corporate account to another, you might do it. 

While it might seem unlikely that a criminal would know who your boss is, they could get that information from social media sites like LinkedIn.

Real estate agents

Scammers are creating AI voices that pose as real estate agents, according to the National Association of Realtors. Since buying a house typically involves moving large amounts of money from your accounts to lenders, a scammer could potentially pose as your agent and convince you to wire a large sum into an account they control. 

It sounds like an unlikely scenario. Most criminals won't know who your real estate agent or banker is, but they could find targets via online reviews or social media. For example, criminals could scrape names from the reviews on an agent's Google page and then target those individuals.

Lawyers

Lawyers have been fretting over the idea that criminals could start using AI to impersonate them. Similar to the real estate agent example, a criminal may mimic your attorney, leaving a voicemail asking you to wire them money. 

Your accountant or financial advisor

If you get an urgent call from your financial professional telling you to send money right away -- especially a money order, cryptocurrency or anything else different from your typical wire interactions -- it should raise an alarm. 

Take a breath and consider if your financial advisor or accountant would typically advocate for this.

How to spot a voice scam

The good news is that it shouldn't be too difficult to spot a voice scam if you find yourself in this situation. Some telltale signs include:

Your conversation is brief

In AI voice clone scams, you'll often hear a short, urgent message like, "Mom, this is Denise. I'm in jail. I need bail money. I'm going to let Officer Duncan explain everything."

"Officer Duncan" will then tell you where you should send the bail money. In this scenario, your familiarity with the person should be the first red flag, and the "officer" who pressures you to send money quickly should be another. You should hang up -- regardless if they warn you not to -- and call the person directly to verify.

Your trusted family member or friend doesn't seem like themselves

A criminal may be able to train AI to impersonate a person you know, mimicking their political beliefs and personality traits. But if you feel like the conversation is off and you're about to part with a lot of money, trust your intuition and hang up. Call your friend or family member directly.

They don't know the code word

One way to avoid this scam is to prepare in advance. Come up with a code word for your family, friends and anyone who handles your money. If you're ever in this situation, you can say, "What's our code word?"

You don't recognize the number they're calling from

You likely wouldn't recognize the number if you think your grandson is calling from jail. But if you don't recognize the area code for a call from your local financial advisor or real estate agent, or your boss has never called you from the number used, that's a red flag.

The money needs to be paid via gift card or cryptocurrency

If someone is demanding payment in gift cards or crypto, you're probably being scammed. 

What to do if you're the victim of a voice scam

It's important to act quickly if you're the victim of a voice scam. Take these steps: report the scam to the police and the Federal Trade Commission, notify your bank and the credit agencies, and alert anyone else who could be targeted that the voice may have been cloned.

Summary

Criminals are using AI to create voice clones of individuals to carry out scams, with McAfee's 2023 survey revealing that 25% of adults across seven countries have experienced such attacks, and 77% of victims lost money. Scammers can simulate a person's voice using as little as three seconds of audio, making identity theft more sophisticated. To protect oneself, it’s crucial to spot brief, urgent messages from familiar contacts, verify through direct calls, use prearranged code words, and be wary of unfamiliar numbers or demands for payment via gift cards or cryptocurrency. Victims should report scams to the police, FTC, their bank, and credit agencies, and alert potential targets that their voices may have been cloned.
