
AI clones teen girl’s voice in $1M kidnapping scam: ‘I’ve got your daughter’

It was a dead “ringer” for her daughter.

Artificial intelligence has taken phone scams to a frightening new level.

An Arizona mom claims that scammers used AI to clone her daughter’s voice so they could demand a $1 million ransom from her as part of a terrifying new voice scheme.

“I never doubted for one second it was her,” distraught mother Jennifer DeStefano told WKYT while recalling the bone-chilling incident. “That’s the freaky part that really got me to my core.”

This bombshell comes amid a rise in fake-kidnapping phone schemes, often aided by caller-ID spoofing, in which scammers claim they've taken the recipient's relative hostage and will harm them if they aren't paid a specified amount of money.

The Scottsdale, Ariz., resident recounted how she received a call from an unfamiliar phone number, which she almost let go to voicemail.

Then DeStefano remembered that her 15-year-old daughter, Brie, was on a ski trip, so she answered the call to make sure nothing was amiss.

That simple decision would turn her entire life upside down: “I pick up the phone, and I hear my daughter’s voice, and it says, ‘Mom!’ and she’s sobbing,” the petrified parent described. “I said, ‘What happened?’ And she said, ‘Mom, I messed up,’ and she’s sobbing and crying.”


Her confusion quickly turned to terror after she heard a “man’s voice” tell “Brie” to put her “head back” and “lie down.”

“This man gets on the phone, and he’s like, ‘Listen here. I’ve got your daughter,’ ” DeStefano explained, adding that the man described exactly how things would “go down.”

“You call the police, you call anybody, I’m going to pop her so full of drugs,” the mysterious caller threatened, per DeStefano, who was “shaking” at the time. “I’m going to have my way with her, and I’m going to drop her off in Mexico.”


All the while, she could hear her daughter in the background pleading, “‘Help me, Mom. Please help me. Help me,’ and bawling.”

That’s when Brie’s faux kidnapper demanded the ransom.

He initially asked for $1 million, but then lowered the figure to $50,000 after DeStefano said she didn’t “have the money.”

The nightmare finally ended after the terrified parent, who was at her other daughter’s studio at the time, received help from one of her fellow moms.

After calls to 911 and to DeStefano’s husband, the family confirmed that Brie was safe and sound on her skiing excursion.

However, for the entire call, she was convinced that her daughter was in peril. “It was completely her voice,” the Arizonan described. “It was her inflection. It was the way she would have cried.”

As it turned out, her daughter never said any of it; the voice was an AI simulation, a case of long-distance ventriloquism.


The identity of the cybernetic catfish is unknown at this time, but computer science experts say that voice-cloning tech has evolved to the point that someone’s tone and manner of speaking can be re-created from the briefest of soundbites.

“In the beginning, it would require a larger amount of samples,” explained Subbarao Kambhampati, a computer science professor and AI authority at Arizona State University. “Now there are ways in which you can do this with just three seconds of your voice. Three seconds. And with the three seconds, it can come close to how exactly you sound.”

With a large enough sample, the AI can mimic a person’s “inflection” as well as their “emotion,” per the professor.


Think how Robert Patrick’s sinister T-1000 robot from the sci-fi classic “Terminator 2: Judgment Day” parrots the voice of John Connor’s mom to try to lure him home.

DeStefano found the voice simulation particularly unsettling given that “Brie does NOT have any public social media accounts that has her voice and barely has any,” per a post on the mom’s Facebook account.

“She has a few public interviews for sports/school that have a large sampling of her voice,” described Brie’s mom. “However, this is something to be extra concerned with kids who do have public accounts.”

Indeed, FBI experts warn that fraudsters often find their targets on social media.


“If you have it [your info] public, you’re allowing yourself to be scammed by people like this,” said Dan Mayo, the assistant special agent in charge of the FBI’s Phoenix office. “They’re going to be looking for public profiles that have as much information as possible on you, and when they get ahold of that, they’re going to dig into you.”

To avoid being hornswoggled, he advises asking the caller questions about the “abductee” that a scammer wouldn’t know the answers to.

Mayo also suggested looking out for red flags, such as calls from an unfamiliar area code or an international number.

Meanwhile, DeStefano warned people on Facebook to alert authorities if the scam she described happened to them or anyone they knew.


“The only way to stop this is with public awareness!” she said. “Also, have a family emergency word or question that only you know so you can validate you are not being scammed with AI! Stay safe!”

Her public service announcement is particularly timely given the recent spate of fake-kidnapping scams.

Last month, TikToker Beth Royce allegedly received a call from a mysterious man who demanded that she pay him $1,000 or he’d kill her sister. All the while, a woman could be heard sobbing in the background.

Meanwhile, in December, social media user Chelsie Gates received a similar call from a man threatening to kill her mom — whom she also heard weeping in the background — if she didn’t shell out the same amount.

In both instances, the victims forked over the ransom, terrified that the caller would harm their family members.