From Scams to Music, AI Voice Cloning Is on the Rise

An Arizona family was terrified a few months ago when what they thought was a kidnapping and ransom call turned out to be a scam generated with artificial intelligence. As reports grow of scam calls that sound identical to loved ones, many fear that AI could be weaponized against them with technology that is easy to access and requires only a small fee, a few minutes, and a stable internet connection.

Jennifer DeStefano received an anonymous call one January afternoon while her 15-year-old daughter was out of town for a ski race. DeStefano heard what sounded like her daughter answer the phone, panicking and screaming, quickly followed by a man’s voice threatening to drug and kidnap the girl unless he was sent $1 million, CNN reported.

DeStefano reached her daughter a few minutes later; she was fine and puzzled about what had happened, because she hadn’t been abducted and wasn’t involved in the ransom call. Emergency responders helped the family identify the call as an AI-generated hoax.

“It was obviously the sound of her voice,” DeStefano told CNN, “the inflection, everything.”

Although data on the prevalence of AI-powered scam calls is limited, stories of similar incidents have kept surfacing this year on TikTok and other social platforms, stoking fears about AI’s potential for harm.

AI voice cloning

AI scam calls are set up through voice cloning. Once a scammer finds an audio clip of someone’s voice online, they can easily upload it to an online program that replicates the voice. Such applications emerged a few years ago, but amid the generative-AI boom they have improved, become more accessible, and grown relatively cheap to use.
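To illustrate how low the technical bar has become, here is a minimal sketch of the general technique, assuming the open-source Coqui TTS library and its XTTS v2 voice-cloning model rather than any of the commercial services discussed in this article, whose systems are proprietary; the file names and spoken text below are placeholders:

    # Minimal voice-cloning sketch, assuming the open-source Coqui TTS
    # library (pip install TTS) and its XTTS v2 model. File names and
    # text are placeholders for illustration only.
    from TTS.api import TTS

    # Download and load a multilingual model that supports voice cloning.
    tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

    # Clone the voice from a short reference clip and make it say new text.
    tts.tts_to_file(
        text="Any sentence the person never actually said.",
        speaker_wav="reference_clip.wav",  # a few seconds of the target voice
        language="en",
        file_path="cloned_output.wav",
    )

A few seconds of clean audio can be enough for a recognizable imitation, which is why a short clip posted publicly online may be all a scammer needs.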

Murf, Resemble, and Speechify are among the popular companies offering these services. Most providers offer free trial periods, with monthly subscription fees ranging from under $15 for basic plans to over $100 for premium options.

The Federal Trade Commission recommends that if you get a concerning call from a loved one in trouble, you call the person who supposedly contacted you back at their regular number and verify the story. If the caller asks for money through questionable channels that are hard to trace, such as wire transfers, cryptocurrency, or gift cards, that could be a sign of a scam. Security experts also recommend establishing a safe word with loved ones that can be used to confirm a real emergency and expose a scam.

AI voice cloning in the music industry

AI voice cloning has also spread to the music realm, where people use the technology to create songs with vocals that sound identical to popular artists. A song mimicking the voices of Drake and the Weeknd went viral online this month, even though neither artist had any involvement in creating it. The management company that represents both artists was able to get the song removed from streaming services, though only because it contained an illegally sampled piece of audio, not because of the AI-cloned vocals. Drake commented, “this is the final straw AI,” after an AI-generated track of him rapping Ice Spice’s “Munch” also went viral this month.

Other artists, like the Canadian musician Grimes, are looking toward a future in which such technology could continue to grow and change the way the music industry operates. “I’ll split 50% royalties on any successful AI generated song that uses my voice,” Grimes tweeted last week. “Feel free to use my voice without penalty.”

People can write songs themselves but record them with famous singers’ voices to attract attention. So far, there are no legal penalties for music deepfakes, but the New York Times reports that they risk harming artists’ reputations, depriving vocalists of profit, and culturally appropriating BIPOC artists.
