Chilling AI scam cloned ‘kidnapped’ girl’s voice in call to mum to demand ransom

Scammers are using a terrifying new weapon to steal money from loving mums and dads.

Sophisticated AI software enables criminals to “clone” a person’s voice and fake a kidnapping – potentially tricking a parent into parting with a hefty ransom.

Jennifer DeStefano was one intended victim. She told how she received a phone call in which her 15-year-old daughter Brie apparently told her she had been kidnapped and would be taken to Mexico to be “pumped full of drugs” and raped.


Brie doesn’t have any social media accounts, but a few snippets of her voice from sports reports on her school website were enough for the crooks to create a convincing “deepfake” replica of her speaking voice.

While Brie was away from home on a ski trip, her mum received a spine-chilling call from an unrecognised number.

“I pick up the phone and I hear my daughter’s voice, and it says, ‘Mom!’ and she’s sobbing,” Jennifer told local news station WKYT. “I said, ‘What happened?’ And she said, ‘Mom, I messed up,’ and she’s sobbing and crying.”

Jennifer then heard a man’s voice ordering Brie to lie down.


“This man gets on the phone and he’s like, ‘Listen here. I’ve got your daughter. This is how it’s going to go down. You call the police, you call anybody, I’m going to pop her so full of drugs. I’m going to have my way with her and I’m going to drop her off in Mexico.’

“And at that moment,” Jennifer recalled, “I just started shaking.

“In the background she’s going, ‘Help me, Mom. Please help me. Help me,’ and bawling…”


The “kidnapper” demanded a million-dollar ransom, halving his demand when Jennifer said she couldn’t afford the original sum. The panic-stricken mum then kept him on the line while friends who were with her contacted the police.

Within minutes, a relieved Jennifer was told that her daughter was still safely on her ski trip and had never been in any danger.

Experts have warned for years that AI would be able to duplicate a person’s voice – creating the risk of anything from simple fraud to provoking a world war.

“It was completely her voice,” Jennifer said. “It was her inflection. It was the way she would have cried. I never doubted for one second it was her. That’s the freaky part that really got me to my core.”
