Show Notes

Follow on LinkedIn: www.linkedin.com/company/idtheftcenter/
Follow on X: twitter.com/IDTheftCenter

Show Transcript

Welcome to the Identity Theft Resource Center’s (ITRC) Weekly Breach Breakdown for March 15, 2024. Thanks to Sentilink for their support of the podcast and the ITRC. Each week, we look at the most recent events and trends related to data security and privacy. This week, we discuss one of the most talked-about topics of recent months: AI voice scams. The rise of these deceptive scams gives new meaning to the song “Voices Inside My Head,” which debuted in the 1980s.

The Latest Variations of AI Voice Scams

Voice cloning scams have been around for years. However, criminals continue to adapt as AI improves and becomes more accessible. In one recent high-profile AI voice scam, New Hampshire residents received AI-generated robocalls mimicking U.S. President Joe Biden’s voice and urging voters to “save your vote for the November election.” An extreme example involves a business in Asia, where cybercriminals used real-time voice cloning and deepfake video technology to stage a Zoom call in which deepfaked executives instructed a team member to wire $25 million.

Should You Be Concerned About AI Voice Scams?

Now is a good time to address how concerned we should be about becoming a victim of a voice cloning scam. The short answer: not very, at least not as an average person. Businesses, like the company in Asia, along with high-net-worth and high-profile individuals, are far more likely to be targeted.

Identity criminals do not want to take the time to attack victims one at a time. Creating a voice clone is very labor-intensive, and it can only be used once. For a criminal, there has to be a huge pot of gold at the end of a scam to justify targeting a specific person with this technology. It is far more likely we will see more attacks in which company employees are lured into paying invoices or transferring company funds by a voice clone of an executive or supervisor.

How Criminals Use AI Tools to Clone People’s Voices

It is important to know how criminals use AI tools to clone the voices of people they target and then place calls to those people’s family, friends or co-workers. AI tools need as little as three seconds of recorded audio to create a realistic clone. Scammers can also spoof a phone number so the call appears to come from a known caller. They can add emotional cues like laughter and fear to the cloned voice, along with background sound effects like a subway station, an airport or a car crash. The technology is so advanced that scammers can even adjust the voice’s accent and apparent age.

Tech Brew Reporter Demonstrates How Easy It Is to Clone Someone’s Voice

Tech Brew reporter Kelcee Griffis spoke with call platform TNS about the anatomy of a voice clone. To show how easy it is, TNS cloned Griffis’s voice from several sources, beginning with a small voice sample and ending with a recording of a panel Griffis moderated. Each time, TNS generated a realistic clone of Griffis’s voice. Read Griffis’s story for Tech Brew to hear all of the audio samples.

How to Avoid AI Voice Scams

Like just about all scams, AI voice scams rely on creating a sense of panic or urgency, such as claiming a loved one is in danger or that an important vendor must be paid RIGHT NOW! The most important thing is not to panic. Criminals want to scare you into doing something you would typically not do. If you have any doubt about who is calling you and what they are asking you to do, hang up, collect your thoughts and contact the person directly to verify the situation. If you cannot reach them, check with other family members, friends or co-workers. Two other tips to avoid voice cloning scams:

  1. Be vigilant on all phone calls, even if you recognize the voice on the other end of the line. If anything said on the call seems off, ask questions only the real person would know the answers to.
  2. Avoid personalized voicemail greetings, which give bad actors easy access to a sample of your voice. Use the automated greetings offered on phone systems instead.

Contact the ITRC

If you want to know more about how to protect your business or personal information, have questions about AI voice scams, or think you have been the victim of an identity crime, you can speak with an expert ITRC advisor on the phone, chat live on the web or exchange emails during our normal business hours (Monday-Friday, 6 a.m.-5 p.m. PST). Just visit www.idtheftcenter.org to get started.

Thanks again to Sentilink for their support of the ITRC and this podcast. Be sure to check out our sister podcast, the Fraudian Slip, where, as part of National Consumer Protection Week, we sit down with the Federal Trade Commission to discuss fraud risks consumers and businesses face. We will return next week with another episode of the Weekly Breach Breakdown.