© 2024 Boise State Public Radio

Preparing for the age of AI scams

A Wehead, an AI companion that can use ChatGPT, is seen during Pepcom's Digital Experience at The Mirage resort during the Consumer Electronics Show (CES) in Las Vegas, Nevada.

Imagine a loved one calls you in a panic asking for help. Maybe they were just arrested, or kidnapped, and need money immediately. What would you do?

Here’s the thing: the voice on the other end of the line might not be them. It could be AI.

Artificial intelligence is now making it possible to clone someone’s voice and use it to trick family or friends. Scammers are taking advantage of the technology to con panicked loved ones out of hundreds, and sometimes thousands, of dollars. AI is also being used to devise more realistic romance scams and AI-generated videos, also known as deepfakes. Recently, a Taylor Swift deepfake was used in a video to shill pots and pans to unwitting fans.

Washington has been watching. A bipartisan group of House lawmakers introduced the No AI Fraud Act this month. The bill would protect Americans’ likenesses and voices against AI-generated fakes. Earlier this month, the FTC created a competition with an award of $25,000 for the best ideas to protect consumers from these scams. And in November, the Senate Special Committee on Aging held a hearing on this kind of fraud and how to address it.

We learn more about these scams and what people can do to protect themselves from falling victim.

Some tips from our guests:

  • If you suspect a voice clone scam, interrupt the caller and ask a question only that person would know
  • Establish a password with family and friends
  • Don’t send money through untraceable means like gift cards or cryptocurrency
  • Report all instances of fraud here: ReportFraud.ftc.gov

Copyright 2024 WAMU 88.5

Michelle Harven
