‘Everything was so real’: Virtual kidnapping scams made more realistic with AI

Jennifer DeStefano (right) believes she nearly fell victim to a scammer who cloned her daughter's voice, possibly using AI software. PHOTO: JENNIFER DESTEFANO/FACEBOOK

In recent months, the increasingly controversial use of artificial intelligence (AI) has sparked debate worldwide, with voice deepfakes creeping into the music industry.

AI-powered voice filters have convincingly mimicked musicians like Drake, The Weeknd and Jay-Z but, more alarmingly, some scammers are believed to have used the same voice-cloning technology to stage more realistic kidnapping scams.

Ms Jennifer DeStefano told American media that she nearly fell victim to a virtual kidnapping scam even though she thought “a mother knows her child”.

On Jan 20, the Arizona resident was about to pick up her daughter Aubrey from a dance studio when she received a call from an unknown number.

She considered rejecting the call, but as her older daughter Briana was training for a ski race with her father at a resort more than 170km away in northern Arizona, she worried the call could be about a medical emergency.

When she picked up the call, she heard yelling and sobbing and “the voice sounded just like Brie’s (Briana’s), the inflection, everything,” Ms DeStefano told CNN.

“Then, all of a sudden, I heard a man say, ‘Lay down, put your head back.’ I’m thinking she’s being gurnied off the mountain, which is common in skiing, so I started to panic.”

Next, she heard a man telling her that he had her daughter. He warned: “You call the police, you call anybody, I’m gonna pop her so full of drugs. I’m gonna have my way with her, then drop her off in Mexico, and you’re never going to see her again.”

The man also asked for a US$1 million (S$1.34 million) ransom.

Ms DeStefano ran into the dance studio screaming for help. A woman helped her to call the authorities.

Thankfully, Ms DeStefano managed to reach her husband, who confirmed that Briana was with him and unharmed.

She also spoke to Briana, who said she was in bed and had no idea what was going on.

The man on the call still insisted he had her daughter and continued demanding money, but by then Ms DeStefano knew that she should just hang up and call the police.

She said she fell for the scam because the girl’s voice sounded exactly like her daughter’s.

“A mother knows her child,” she said. “You can hear your child cry across the building, and you know it’s yours.”

According to CNN, law enforcement officers have not verified whether AI was used in Ms DeStefano’s case, but she believes scammers cloned her daughter’s voice.

Voice cloning from social media posts

United States authorities have warned that scammers can get audio clips from potential victims’ social media posts.

Ms DeStefano’s daughter Briana has a private TikTok account and a public Instagram account with photos and videos from her ski racing events, though her followers are mostly close friends and family members.

US officials said it is easy for scammers to upload audio clips of someone’s voice to an online programme to replicate it, and such technology is becoming cheaper and easier to use.

“The threat is not hypothetical – we are seeing scammers weaponise these tools,” said Dr Hany Farid, a computer science professor at the University of California, Berkeley.

Dr Farid, also a member of the Berkeley Artificial Intelligence Lab, told CNN that a “reasonably good” clone can be created with under a minute of audio, with some claiming that a few seconds may be sufficient.

“The trend over the past few years has been that less and less data is needed to make a compelling fake,” he said, adding that voice cloning can be done for as little as US$5 a month with the help of AI software.

As far as he knows, current AI software cannot clone voices with a full emotional range, but he could not entirely rule out its ability to generate screaming or sobbing.

In the US, families lose an average of US$11,000 in each fake kidnapping scam, FBI special agent and spokeswoman Siobhan Johnson told CNN.

Data from the Federal Trade Commission shows that Americans lost US$2.6 billion in 2022 in impostor scams.

“We don’t want people to panic. We want people to be prepared,” said Ms Johnson. “This is an easy crime to thwart if you know what to do ahead of time.”