Can your voice really be cloned to scam loved ones? Yes – and AI makes it easy

by Cristina Flores

AI technology makes life easier in many ways, doing jobs or tasks that humans would normally do. It helps some people drive their cars and helps doctors diagnose illnesses, but it also helps scammers swindle unsuspecting victims.

The Federal Trade Commission (FTC) has issued a warning that scammers are now cloning people's voices to make phone scams sound more convincing.

Victims often give up money because they believe the scammer is someone else, typically a family member who needs urgent help.

In “family emergency scams,” a victim gets a call from someone who claims the victim’s loved one is in trouble (in jail or in another country without a passport) and needs money. Senior citizens are often targets of these scams.

AI technology makes it very easy for fraudsters to clone a person’s voice and use it during the scam phone call.

“You don’t need a computer science degree to create these voice clones anymore. You need $5 and basic computer know-how,” said Brandon Amacher, adjunct professor at the UVU Center for National Security Studies and director of its Emerging Tech Policy Lab.

Amacher said that with AI technology advancing so quickly, it takes just a small sample of someone’s voice to create a voice clone.

Scammers can find samples of people’s voices all over the internet, especially on social media, and use them to create a clone that says whatever they want.

Amacher said figuring out what’s fake and what’s real is a serious issue that our society will have to grapple with. He said we will have to be more careful about what information we give away and what we post on social media.

“If we are posting long videos with our voice and our image, then that could be repurposed to imitate us in the future,” he said.

Katie Hart, director of the Utah Division of Consumer Protection, said there are currently no local laws on the books to protect consumers who get ripped off in AI scams.

For now, she said, the best protection for consumers is to be aware of these scams and question any phone calls that sound suspicious.

She said if someone calls you urgently demanding money, hang up, then call the person yourself using the number where you’d normally reach them. Even if the caller sounds like someone you know, verify before giving out any information or money.

If you can’t get hold of that person, call someone else you trust who can help verify they are in fact okay and not in need of money.

Utah lawmakers passed legislation in 2022 to form a cybersecurity commission to “gather information and share best practices on cyber security.”

The recently formed commission has not yet addressed AI and scams.

Maggie Hutchens, a recent graduate of UVU’s cybersecurity studies program, said laws have not caught up with AI scammers.

She said it’s urgent for lawmakers to address the issue before more people are hurt, and that just as scammers use data to victimize people, lawmakers can use data to craft policy that protects them.

“I don’t think we need to wait for hundreds of thousands of dollars to be taken from people before we address the issue,” she said.
