DAVE LEE: If Spotify keeps pushing deceptive AI music on its subscribers, they should walk away

I don’t mean to be overly rude about country music — I’ve acquired a strong taste for it since living in the US — but it probably surprises few that the genre often derided for its formulaic nature was the first to have an AI “artist” top a Billboard chart.
Walk My Walk, by a computer-generated artist called Breaking Rust, holds that dubious honour: It’s currently No. 1 in the Country Digital Song Sales ranking.
Everybody knows the clichéd components of a passable country song, and so apparently does AI. “I keep moving forward, never looking back/With a worn-out hat and a six-string strap.” Inspired stuff, cowboy.
What the achievement underlines is the unfortunate fact that AI-generated music is encroaching into the places where we expect to find human talent.
Some people won’t mind this, but I am not one of them, and I’d bet that most people aren’t either. In fact, I think if Spotify were to add a button in its app that filtered out all 100 per cent AI-generated music, a huge portion of its 713 million monthly users would press it without hesitation.
Which is why, as the biggest global music streaming platform, that’s something it should offer.
Spotify’s policy on AI does not allow tracks that impersonate another artist. But music created using AI is permitted, be it an entirely AI-generated song, as appears to be the case with Walk My Walk, or a track that uses AI-enhanced production.
In these instances, Spotify has said it is working on the creation of a shared industry standard that would “clearly indicate where and how AI played a role in the creation of a track”.
But that pledge was made in September, and there’s little sign anything has changed on the platform, leaving listeners to determine for themselves whether the music on Spotify is made by humans or computers.
This is getting much harder, and if I sound a little sore, it’s because I am. I was (briefly) duped by an artist named Sterlin Knox.
A friend had added one of his songs to a shared playlist without realising it was created by AI. I listened to it a few times without much thought.
It was only when I played it loud in my headphones that I picked up on the clues. The vocal seemed low quality, like an MP3 file that had gone through too many rounds of compression.
To be sure, I sent a copy to the people at Reality Defender, a specialist in AI detection, and they said they were highly confident it was artificially generated in its entirety.
There is no indication in the Spotify app that Sterlin Knox is the product of AI. The artist page features a bright blue “verified” tick.
The most popular track has been listened to 300,000 times. Each song comes with an accompanying video promo — almost certainly generated by AI also — featuring a young man walking through various scenes.

A producer credit is listed for “Maurizio S,” and there’s even a label, FreshCross. I could find no record of either anywhere, other than in listings of the same AI-generated songs placed on other platforms such as Apple Music.
On YouTube, I saw that Sterlin Knox tracks had been distributed using DistroKid, a service that makes it easier for independent artists to place work across multiple streaming platforms.
When I contacted DistroKid, it reached out on my behalf to the account holder for FreshCross. The person, whoever they are, declined to speak with me.
When I asked DistroKid if it had any issue with its service being used for this kind of deceptive practice, the spokesman stopped returning my emails. (I couldn’t find any clearly stated position shared by the company online.)
Clearly, Sterlin Knox and Breaking Rust are the tip of the iceberg when it comes to the spread of AI music.
Alternative streaming service Deezer said it had detected 50,000 AI-generated tracks being uploaded to its service every day — about 34 per cent of all songs submitted.
Its survey of 9000 music fans found 97 per cent of them failed to realise when a song was made with AI; half felt “disappointed” to learn that was the case.
Eight in 10 respondents wanted clear labelling for AI; 40 per cent said they would use it to filter AI out of their listening.
On Spotify, deceptive AI music isn’t just being allowed; the app actively pushes it: Breaking Rust tracks are featured in several of Spotify’s Viral 50 playlists, reaching millions of users — although a spokesman argued its virality had likely been driven by the media coverage of the song.
Either way, country music journalist Aaron Ryan told NPR, the moment is bringing matters to a head: “I think it’s a big deal because it’s going to force country music and kind of the music world as a whole to decide what’s acceptable”.
But music fans don’t need to wait for any industry reckoning. My carefully crafted policy on whether I want to hear AI-generated music is this: I don’t.
In Spotify’s settings today, I can choose to filter out any song with “explicit” lyrics. Those using the ad-supported tier can choose to filter certain promotions for weight-loss products or alcohol.
When R. Kelly was accused of sex crimes for which he was later convicted, Spotify introduced a policy that kept his music on the platform but omitted it from Spotify-curated playlists or any recommendations generated by algorithm.
In other words, Spotify knows when it should offer users control, and it has tools at its disposal to amplify or restrict specific kinds of content.
Of course, AI is a more nuanced question than offensive material. Is a song “made with AI” if an indie musician uses it to fill in a drum track?
How about if a Beatle uses it to resurrect a lost song? (Instagram certainly ran into trouble trying to define it for photography.)
These are judgments that should be put in the hands of the individual music fan.
Once Spotify has helped create the industry standard for honestly disclosing AI use in music, it should use its influence and reach to strictly enforce compliance, with repercussions for bad actors.
Users can then make choices in accordance with their principles.
Not everyone will share the same principles, of course. It is the right of all of us to possess awful taste. Breaking Rust has 2.4 million monthly listeners on Spotify, though it’s not clear how many of those know “he” is not real: The Times of London reported last week that some Breaking Rust fans had enquired about tour dates. If they like what they’re hearing, so be it.
I’ve been a Spotify subscriber since before the company even had a mobile app. In the 16 years since, not once have I questioned that expense, which has long felt more like a utility bill than a discretionary subscription.
But discovering you’ve been listening to an AI artist is a deeply violating experience that a good streaming platform should help me, a valued customer, avoid. If Spotify refuses, a streaming app that obliges will soon get my business, and that of many other music fans too, I suspect.
Dave Lee is Bloomberg Opinion’s US technology columnist. He was previously a correspondent for the Financial Times and BBC News.
