Two years ago, “BBL Drizzy” was the AI music shot heard around the world: a song that sounded like Drake burst out of nowhere and sparked a battle over artistry, equity, and, of course, copyright. The three major labels – Universal Music Group (UMG), Sony Music Entertainment, and Warner Records – filed joint lawsuits against the AI companies Udio and Suno for copyright infringement; UMG had a public dispute with TikTok over issues including AI content on the platform; and the labels began developing AI detection tools to keep an eye on how their music was being spun.
Now the music industry and AI startups appear to be headed down a (monetizable) path toward reconciliation at scale – and artists already seem caught up in the system.
“KLAY is not a prompt-based meme generation engine designed to replace human artists,” the press release states. “Rather, it is an entirely new subscription product that will uplift great artists and celebrate their art. Within KLAY’s system, fans can tailor their music journey in new ways, while ensuring that participating artists and songwriters are appropriately recognized and rewarded.”
According to a Financial Times report in October, labels were advocating for a compensation framework similar to the way traditional music streaming works: micro-payments based on plays. Everyone from independent artists to Taylor Swift has complained that the streaming-era payment system actually squeezes out the people making music, with profits flowing to labels instead. The specifics of KLAY’s deals were not immediately clear, but one can imagine that determining the value of earnings for an AI-generated remix could be far more complex than for streams of the original song. For example, when a user asks for a shoegaze-style remix of a Sabrina Carpenter song, who gets paid? And say a user-created shoegaze Sabrina Carpenter track goes viral on TikTok, garnering millions of views – then what?
The ecosystem for AI-generated music is a mess. Spotify said in September that it had pulled 75 million “spammy” tracks in the past 12 months alone. One track removed by the streamer in recent weeks is “I Run” by the unknown artist Heaven, which was propelled to virality through TikTok. Some users mistakenly attributed the vocals to R&B artist Jorja Smith, and the track had racked up 13 million streams before Spotify removed it. In September, Spotify added a new policy against artist voice impersonation. (Songs that are original compositions but sound like an actual artist open a whole new can of worms around a person’s right of publicity.)
The track’s creator told Billboard that he wrote and produced the song but processed the vocals using Suno, which lets users generate songs from text prompts. Eventually, Heaven re-uploaded the track, this time using human vocals instead of the Suno-processed Smith soundalike. Some listeners clearly preferred the AI version.
All of this makes the future of music listening potentially very strange. AI-generated tracks will continue to be misattributed to human artists without any licensing agreements, and labels will continue to pursue them. But if KLAY and the three major labels really launch a remix platform, officially licensed AI tracks will mingle with black-market AI tracks across the internet. Songs will be uploaded, pulled, re-uploaded, and modified, creating a tangle of questions about ownership and compensation. With these deals, music labels are trying to walk a line that could become even harder to hold: AI music based on our artists is fine, as long as we get paid.