‘I’ve got your daughter’: Scottsdale mom warns of close call with AI voice cloning scam
Artificial intelligence, or AI, technology has advanced to the point that it can imitate not only the human voice in general but the voices of specific people, including the inflections, nuances and other characteristics of their speech patterns. The technology is being used to scam people, including in a recent incident in which an Arizona woman was convinced her teenage daughter had been kidnapped. Subbarao Kambhampati, a professor of computer science in the School of Computing and Augmented Intelligence, part of the Fulton Schools, says AI-enabled voice cloning can now replicate the sound of a particular person's voice using no more than a few seconds of that person's actual speech.
The news report was also broadcast or posted by these news programs and outlets throughout the U.S. and abroad: WMTV (NBC15.com) in Madison, Wisconsin; Fox 5 News KVVU in Las Vegas; WEVV 44 News in Indiana; Smith Mountain Lake news in Virginia; the Daily Mail in England; WRDW 12 CBS News in Augusta, Georgia; and KGUN 9 News in Tucson.
See also: A mom thought her daughter had been kidnapped — it was just AI mimicking her voice, Popular Science, April 14
Criminals using AI to clone voice in a bid for ransom, Fox News, April 18
Kambhampati was also recently interviewed about artificial intelligence technology and its potential impacts on society. (The interview was conducted in Telugu.)