In the last week, highly anticipated songs by Drake and Taylor Swift appeared to leak online, sparking enormous reactions. Massive Reddit threads sprang up, dissecting musical choices. Meme videos were created simulating other rappers’ reactions to being dissed by Drake. The rapper Rick Ross even responded to the song’s bars about him with a diss track of his own.
But there was one big problem: neither Swift nor Drake confirmed that the songs were real. In fact, loud contingents on social media claimed that the songs were AI-generated hoaxes, and begged fellow fans not to listen to them. Fervent fans were soon plunged into frenzied hunts for clues and debates aimed at decoding the songs’ authenticity.
These types of arguments have recently intensified and will only continue ballooning, as AI vocal clones keep improving and become increasingly accessible to everyday people. These days, even an artist’s biggest fans have trouble telling the difference between their heroes and AI creations. They will continue to be stymied in the coming months, as the music industry and lawmakers slowly work to determine how best to protect human creators from artificial imposters.
The Advent of AI Deepfakes
AI first shook the pop music world last year, when a song that seemed to be by Drake and the Weeknd called “Heart on My Sleeve” went viral, with millions of plays across TikTok, Spotify, and YouTube. But the song was soon revealed to have been created by an anonymous musician named ghostwriter977, who used an AI-powered filter to transform their own voice into imitations of both pop stars.
Many fans of both artists loved the song anyway, and it was later submitted for Grammys consideration. And some artists embraced new deepfake technology, including Grimes, who has long experimented with technological advancements and who developed a clone of her voice and then encouraged musicians to create songs using it.
But soundalikes soon began roiling the fanbases of other artists. Many top stars, like Frank Ocean and Beyoncé, have turned to intense policies of secrecy around their output (Ocean has carried around physical hard drives of his music to prevent leaking), resulting in desperate fans going to extreme lengths to try to obtain new songs. This has opened the door for scammers: Last year, a scammer sold AI-created songs to Frank Ocean superfans for thousands of dollars. A few months later, snippets that purported to be taken from new Harry Styles and One Direction songs surfaced across the web, with fans also paying for those. But many fans argued vociferously that they were hoaxes. Not even AI-analysis companies could determine whether they were real, 404 Media reported.
Drake and Taylor… Or Not?
This week, AI shook up the fanbases of two of the biggest pop stars in the world: Taylor Swift and Drake. First came a snippet of Drake’s “Push Ups,” a track that seemingly responded to Kendrick Lamar’s taunts of him in the song “Like That.” (“Pipsqueak, pipe down,” went one line from “Push Ups.”) The track, which also took aim at Rick Ross, The Weeknd, and Metro Boomin, quickly went viral, and Ross fired back a diss track of his own.
But the internet was divided as to whether the clip was actually made by Drake. The original leak was low quality; Drake’s vocals sounded grainy and monotone. Even the rapper Joe Budden, who hosts the prominent hip-hop podcast The Joe Budden Podcast, said that he was “on the fence” for a while about whether or not it was AI.
A higher-quality version of the song was subsequently released, leading many news outlets and social media posters to treat “Push Ups” as a genuine Drake song. Strangely enough, Drake has toyed with this ambiguity: He has yet to claim the song as his own, but posted an Instagram story showing people dancing to parts of it. Whether or not he made it, the song has become an unmistakable entry in a sprawling rap beef that has taken the hip-hop world by storm.
“Push Ups” also contains a reference to Taylor Swift: It accuses Lamar of being so controlled by his label that it commanded him to record a “verse for the Swifties” on the 2015 remix of her song “Bad Blood.” On Wednesday, Swifties went into a frenzy when a leaked version of her highly anticipated new album, The Tortured Poets Department, began making the rounds online two days before its release date. Purported leaks have been popping up for months, including some that were eventually debunked as AI-generated. Given all of the false trails across the web, many Swift fans dismissed these new leaks as fake as well. But many fans on Reddit are treating the songs as real, already announcing their favorite tracks and moments a day before the album’s official release.
Can the music industry fight back?
Some of these vocal deepfakes are not much more than a nuisance to major artists, because they are low-quality and easy to detect. AI tools will often get the timbre of a distinctive voice slightly wrong, and can glitch when artists use melisma—sliding up and down on a single syllable—or suddenly jump registers. Some pronunciations of lyrics also come out garbled, or with a slightly wrong accent.
But AI tools are constantly improving and getting closer to the real thing. OpenAI recently shared a preview of Voice Engine, its latest tool for generating natural-sounding speech that mimics specific speakers. Researchers and AI companies are racing to create voice-clone detection software, but their success rates have been uneven.
So some musicians and music labels are fighting back with the avenues available to them. Three major music publishers—Universal Music Publishing Group, Concord Music Group, and ABKCO—sued the AI company Anthropic, alleging that the company infringed on copyrighted song lyrics. More than 200 musicians, including Billie Eilish, Stevie Wonder, and Nicki Minaj, recently signed a letter decrying the “predatory use of AI to steal professional artists’ voices and likenesses.” And BPI, a U.K. music industry group, threatened legal action against the vocal cloning service Jammable.
The music industry has growing support from lawmakers. Last month, Tennessee governor Bill Lee signed into law the ELVIS Act, which prohibits people from using AI to mimic an artist's voice without their permission. And U.S. senators announced a similar bill called the NO FAKES Act. “We must put in place rules of the road to protect people from having their voice and likeness replicated through AI without their permission,” Minnesota Senator Amy Klobuchar wrote in a statement.
It will likely take a long time for this bill or similar ones to wind their way through the halls of Congress. Even if one of them passes, it will be exceedingly hard to enforce, given the anonymity of many of these online posters and the tendency of deleted songs to resurface as unlicensed copies. So it’s all but assured that deepfaked songs will continue to excite, confuse, and anger music fans in the months and years to come.