Posted on 10/11/2024 6:09:33 AM PDT by DFG
Phone scams have been around for a while, but recent advancements in artificial intelligence technology are making it easier for bad actors to convince people they have kidnapped their loved ones.
Scammers are using AI to replicate the voices of people’s family members in fake kidnapping schemes that convince victims to send money to the scammers in exchange for their loved ones’ safety.
The scheme recently targeted two victims in a Washington state school district.
Highline Public Schools in Burien, Washington, issued a Sept. 25 notice alerting community members that the two individuals were targeted by “scammers falsely claiming they kidnapped a family member.”
“The scammers played [an AI-generated] audio recording of the family member, then demanded a ransom,” the school district wrote. “The Federal Bureau of Investigation (FBI) has noted a nationwide increase in these scams, with a particular focus on families who speak a language other than English.”
In June, Arizona mother Jennifer DeStefano testified before Congress about how scammers used AI to make her believe her daughter had been kidnapped in a $1 million extortion plot. She began by explaining her decision to answer a call from an unknown number on a Friday afternoon.
“I answered the phone ‘Hello.’ On the other end was our daughter Briana sobbing and crying saying, ‘Mom,’” DeStefano wrote in her congressional testimony. “At first, I thought nothing of it. … I casually asked her what happened as I had her on speaker walking through the parking lot to meet her sister. Briana continued with, ‘Mom, I messed up,’ with more crying and sobbing.”
(Excerpt) Read more at nypost.com ...
My mom: “How much do you want to keep him?”
“Scammers are using AI to replicate the voices of people’s family members in fake kidnapping schemes that convince victims to send money to the scammers in exchange for their loved ones’ safety.”
Hell, someone tried to pull this on us before AI was even around, with a kid posing as our grandson saying he had been in a car wreck, was in police custody, and needed money. A “police officer” came on and said they would call back with instructions on where to send it. We made a quick call to his folks and found out he was OK. When they called back, we told them we didn’t like him anyway, so just keep him in jail.
Would it be a good idea, instead of treating this as fraud, to put it and similar actions under the existing kidnapping statutes?
Kidnapping is a federal crime if it involves using facilities of interstate or foreign commerce. This often includes telecommunications.
OK, so the trick is to ask a question that the AI won’t know the answer to, e.g. “What kind of cake do you have on your birthday?” Or you could send up a fake question, and when an answer is given, you know it’s fake, e.g. “Did you talk to cousin Susan before your mistake?” Or you could set a “safe” word like “trapeze” or “avocado.”
oh no kidding!!!!
Them- We kidnapped your oldest daughter
Me- Well good luck with that.
Indeed, but she was never that bad; she just doesn’t put up with anybody’s crap, much like her mother.