Sunday, January 28, 2024

Preparing for the age of AI scams

by Michelle Harven 

A Wehead, an AI companion that can use ChatGPT, is seen during Pepcom’s Digital Experience at the The Mirage resort during the Consumer Electronics Show (CES) in Las Vegas, Nevada.
BRENDAN SMIALOWSKI/AFP via Getty Images

Imagine a loved one calling you in a panic, asking for help: maybe they just got arrested or kidnapped and need money immediately. What would you do?

Here’s the thing: the voice on the other end of the line might not be them. It could be AI.

Artificial intelligence is now making it possible to clone someone’s voice and use it to trick family or friends. Scammers are taking advantage of the technology to con panicked loved ones out of hundreds, and sometimes thousands, of dollars. AI is also being used to devise more realistic romance scams and AI-generated videos, also known as deepfakes. Recently, a Taylor Swift deepfake was used in a video to shill pots and pans to unwitting fans.

Washington has been watching. A bipartisan group of House lawmakers introduced the No AI Fraud Act this month. The bill would protect Americans’ likenesses and voices against AI-generated fakes. Earlier this month, the FTC launched a competition with a $25,000 award for the best ideas to protect consumers from these scams. And in November, the Senate Special Committee on Aging held a hearing on this kind of fraud and how to address it.

We learn more about these scams and what people can do to protect themselves from falling victim.

Some tips from our guests:

  • If you suspect a voice clone scam, try to interrupt the caller and ask a question only that person would know
  • Establish a password with family and friends
  • Don’t send money through untraceable means like gift cards or cryptocurrency
  • Report all instances of fraud here: ReportFraud.ftc.gov

