While the ongoing AI boom is bringing accelerated advancements across practically every industry, it also has a dark side. Rapid advances in technology mean more tools in the hands of law-abiding citizens, but also in the hands of those with nefarious intent. The Washington Post has reported that thousands of people have already been targeted by a scam that uses generative AI to mimic the voices of loved ones, leaving victims under the impression that their family members or friends are in a dire situation and need money immediately.
New technologies developed by several different companies share the same goal: to replicate a voice, not only in its tonality, but also in the breadth of emotion and the small details that make each person's speaking voice unique. While still new, the results have been incredibly impressive, so much so that they have already duped unsuspecting victims into believing they were being contacted by family, costing them significant sums of money.
See also: Samsung’s Bixby will soon be able to answer calls with your voice
The Federal Trade Commission has reported that in 2022 alone, over 5,000 people in the United States fell victim to "imposter scams," losing more than $11 million through phone trickery. These numbers don't even include those who were targeted but managed to avoid any financial losses. The assistant director of the FTC's division of marketing practices told The Washington Post that the current best defense against these scams is raising awareness about them, and trying to authenticate the caller's identity outside of a simple voice call.
Since generative AI voice mimicking requires training on existing clips of a person speaking, it's worth remembering that those whose voice recordings are readily available online are the most susceptible to being imitated.