
Convincing 'deepfake' scams mimic voices using AI to try and get money

Posted at 6:35 AM, May 23, 2023, and last updated at 8:35 AM, May 23, 2023

SALT LAKE CITY — Scammers are getting smarter about tricking everyday people, and the FBI warns that the newest scam clones voices in order to demand money.

It's a trick that is all too familiar, but with modern technology, it can be harder to determine what's fake and what's real.

"I've had individuals [call and say', 'We've got your son down here in Mexico and would you pay so much money to get him out,'" explained Robert Brundige of Salt Lake City.

Fraudsters will fake a dire situation, such as a medical emergency or intense crisis, then call a loved one, asking for money.

Recently, Utahns have reported getting calls where there are sound effects playing in the background, making a dire situation more believable.

Melissa Goodman was targeted by the scammers and said the fake call was believable.

"There's a person in the background, screaming, crying, hyperventilating. He thought it was me," Goodman explained. "They knew our names, they were calling us by my name, by his name."

Officials with the Taylorsville Police Department have heard about these types of scams from other local departments, where even real recordings are part of the scheme.

"The people did think that their child was with the kidnappers cause they had some kind of audio from a video or something like that," Lt. Aaron Cheshire explained.

The FBI warns that we're entering a new era where AI is used for scams, and the results can be shockingly convincing.

"There's no turning back the clock on this," said Jason Wiese, Assistant Professor at the University of Utah for the School of Computing. "Now that the technology is out there, even if you pass laws, it's not very easy to actually reign this in."

Already in 2023, the world has seen a boom in AI, and the technology is so convincing that it's stumping even internet pros.

An image or recording that's "convincingly altered and manipulated to misrepresent someone as doing or saying something that was not actually done or said" is called a "deepfake."

The only "tell" for these deepfake images and recording is that they can look and sound a little robotic but as technology improves, so will the scams.

"Even if there's a 'tell' today, it may not be something that exists tomorrow as the technology continues to improve," Wiese explained.

Salt Lake County Aging and Adult Services constantly educates older Utahns on what to watch out for.

"The best thing to do is hang up the phone and then call that loved one back from your cellphone on their saved contact information," explained Afton January with the department.

So, if you post videos on public social media accounts, be aware that someone could use that audio to trick your loved ones.

"They are often the targets of these types of scams," January said. "And it's really important that we protect them."

But by being aware of these schemes and what to watch out for, Utahns of all ages can be prepared for any scam that comes their way.