Posted on 08/20/2025 4:04:02 PM PDT by nickcarraway
AI-related fraud attempts surged by nearly 200 per cent in 2024, according to a cybersecurity firm. CNA takes a closer look at deepfake voice phishing, an increasingly common method used by scammers.
“Hi, how have you been? So I heard about this restaurant that just recently opened up. Want to go check it out the next time we meet?” These were the innocuous sentences used in a simple experiment to find out if people in my social circle could distinguish my real voice from an AI-cloned version.
The result was some confusion - but more importantly, the ease with which the imitation was generated suggests that more attention should be paid to the phenomenon of deepfake voice phishing, or vishing. Millions of dollars have been lost to scammers using cheap yet increasingly sophisticated artificial intelligence tools to impersonate the voices of real people.
In Asia-Pacific, the trend of AI- and deepfake-enabled fraud is accelerating even faster than in the rest of the world, according to cybersecurity firm Group-IB. AI-related fraud attempts surged by 194 per cent in 2024 compared to 2023, with deepfake vishing emerging as one of the most commonly used methods, said the company's senior fraud analyst for the region, Yuan Huang.
(Excerpt) Read more at channelnewsasia.com ...
“Hi, how have you been? So I heard about this restaurant that just recently opened up. Want to go check it out the next time we meet?”
Not identifying him/her/itself? Not even going to wait for me to answer the first question? Just launch into yapping about some generic eatery? No name or description? Why would I want to check out anything without more? None of my known acquaintances talks like this.
Just ask “Who IS this, Uncle Leo?” The AI won’t know how to reply.
Maybe someone should start a company that takes recordings of long-lost loved ones and generates conversations in their voices? You could call it Loved Ones Forever, or something like that.
“At [financial institution] my voice is my password.”
They already have computer generated actors because the real ones are dead.
I like the AI’s on YouTube who call the years: “Year one-thousand nine hundred and twenty-eight.”
The YouTube voices are cool and even, but a long way from being convincing when they pronounce the same word or name three different ways. They will get there whether we want them to or not; porn, as always, will lead the way to totally believable people.
Joe Biden was the prototype.
Should have used his age to their advantage. They made Sean Connery his dad. They should have made a son for Indiana Jones who picked up the role.