Putting the “Ban” in Bandcamp: The Implications of the AI-Ban

Rising R&B artist Sienna Rose has racked up over 3 million monthly listeners on Spotify in less than 6 months. Headlining countless viral playlists, Rose is also suspiciously prolific, having released four full-length albums since October of 2025. Her latest record, Date Night, features risqué track titles like “Deep Intensive Kissing” and “Loud Moaning.” There seems to be a pattern to Rose’s cover art as well: 50s-style dimly lit portraits modeled after the likes of Sarah Vaughan and Billie Holiday. The singer caresses a retro microphone or gazes out the window, flashing a self-conscious smirk at the camera. The music itself is formulaic modern R&B; Rose’s vocal performance is reminiscent both of early-aughts artists like Alicia Keys and Norah Jones, and neo-soul singers like Cleo Sol and Jorja Smith. Keen readers might get a sense of where we’re headed. All signs point to the likelihood that Sienna Rose is, gasp, not a humble starlet who won the streaming lottery at a time when prospects for musical careers are exceedingly grim, but in fact an imposter: an AI-generated artist whose digital footprint is less than one year old. 

On January 13, Bandcamp made headlines by introducing an unprecedented blanket ban on music made using generative AI (GenAI), under the banner “Keeping Bandcamp Human.” Their policy reads:

  • Music and audio that is generated wholly or in substantial part by AI is not permitted on Bandcamp. 
  • Any use of AI tools to impersonate other artists or styles is strictly prohibited in accordance with our existing policies prohibiting impersonation and intellectual property infringement.

The language around the enforcement of these guidelines is murky at best. Bandcamp mentions “reporting tools” that listeners will be encouraged to use to flag suspected AI music. This raises the question: who will be reviewing these claims? Much AI-detecting (or “spam-filtering”) software is itself reliant on AI tools. The potential irony is glaring.

Bandcamp’s policy suggests that mere suspicion of GenAI use may be enough for the platform to remove a song, an album, or even an artist’s entire catalog from the site. The question then becomes: how do we train ourselves as listeners to identify AI-generated music? 

Sienna Rose is not the only AI artist to gain success on streaming platforms. You might remember The Velvet Sundown, a supposed psychedelic band whose monthly listeners numbered over a million before the group was exposed as “synthetic.” There was “Walk My Walk,” a single from Road Boyyz, an AI country group, which reached No. 1 on Billboard’s Country Digital Song Sales chart, having won the hearts (or maybe just ears) of millions with lyrics like, “You can kick rocks if you don’t like how I talk.” There’s the case of Xania Monet, an uncanny hybrid of Christina Aguilera and Beyoncé, who signed a multimillion-dollar record deal with Hallwood Media after a heated bidding war. 31-year-old poet Telisha Nikki Jones designed Monet with the help of Suno, one of the leading GenAI-based music production platforms. Monet’s artificiality was not a problem for the labels vying to represent her; in fact, an AI-generated artist might be the ideal act for predatory music companies: a musician unburdened by the baggage of human rights and fair working standards. That is, of course, if the contract is under Monet’s name rather than the name of her “creator,” Jones, who appears to be a real human being.

AI-generated image of AI-generated band Velvet Sundown playing AI-generated music. Courtesy of Velvet Sundown.

From a business perspective, the hope in creating an AI artist is that the act might be added to an editorial playlist on Spotify or Apple Music, thereby generating revenue for the human actors behind the slop. Bandcamp’s business model and ethos differ greatly from those of its competitors, in that it has always boasted an “artist-first” model, where artists receive relatively high payouts and the ability to directly engage with fans. Bandcamp does offer streaming, but you can only stream an album so many times before being prompted to “open thy heart/wallet” and purchase the music. The platform recently introduced a playlist function, but fans can only add purchased music to playlists, and unlike Spotify and Apple Music, there are no Bandcamp editorial playlists for various genres and moods. 

Considering Bandcamp’s ostensibly values-based business model, one wonders how prevalent GenAI really is on the service in comparison with its streaming competitors. Undoubtedly there is music on the platform that is entirely or substantially generated by AI, given the ubiquity of the technology. But an AI musician has far less chance of success on an “artist-first” platform like Bandcamp than on Spotify or Apple Music, where an AI artist can reach relative fame, or at least overnight virality, by being added to editorial playlists and algorithmically curated radio stations. More likely, Bandcamp’s issue is spam: mass uploads, duplicates, SEO hacks, etc. With news of the ban having just become public, it remains unclear what tactics the service will take to identify and rectify these issues. 

There is an obvious PR aspect to this move from Bandcamp, which has been subject to criticism since the company was sold to Songtradr, a B2B music licensing service, by its short-term parent company, Epic Games, back in 2023. Soon after the acquisition, Songtradr attracted further ire by announcing that 50% of Bandcamp employees had “accepted offers to join” the company, a roundabout way of admitting to sizable layoffs.

The previous acquisition of Bandcamp by Epic Games in 2022 was a worrisome sign for users of the platform, who wondered why the software developer behind Fortnite would be interested in buying a hitherto independent music retailer like Bandcamp. For many, the Songtradr deal only amplified alarm bells that had been sounding since the Epic Games takeover.

It remains to be seen whether the negative reactions to Bandcamp’s recent ownership changes and business practices will be tempered by the platform’s strong stance on AI. The policy has already attracted supporters and detractors alike, among them musicians who use AI technologies and fear the blanket ban will cause their music to be removed from the platform. 

Holly Herndon, an experimental artist and composer (with a Ph.D. from Stanford University’s Center for Computer Research in Music and Acoustics), expressed skepticism over the potential impact of the ban. Herndon’s work has long incorporated AI tools and Web 3.0 technologies such as blockchain. Regarding the Bandcamp ban, she posted to X: “I understand why Bandcamp is taking this measure but it’s a tourniquet,” going on to say, “We live with infinite media now… I encourage platforms to be more curated, but enforcing a hard human / AI binary is not the right way to address this long term.” 

Lee Gamble, an electronic musician signed to Kode9’s illustrious Hyperdub label, used a neural network to create the choir of artificial voices that haunt his 2023 project, Models. Gamble isn’t trying to fool anyone; the artificially generated voices on Models are manipulated much as samples historically have been in electronic music, to create dense, borderline-hallucinatory sound environments. In an interview with Passion of the Weiss, Gamble spoke candidly about his relationship to AI, stating that, “I’m aware that it’s problematic…but it’s also such a tantalizing technology for an artist to not use. I’m still in a quandary with it.” For Gamble, the artificial voices are interesting precisely because they are inhuman and strange: 

“It’s about the idea that these aren’t real people, but it’s not suggesting the replacement of people. It’s about, ‘How much mood can I get out of this? How human can I make the technology?’ How human does a singing voice feel when it’s disembodied? How emotional can a digital simulation feel? This is without ignoring what they are: simulations of people.”

Both Herndon and Gamble use Bandcamp to share their music, and likely make more money from the platform than from all other streaming services combined. But, as artists on reputable labels who have spoken extensively about their creative and nuanced relationships with AI tools, they are unlikely to be targeted by Bandcamp’s ban. 

It’s worth noting that Spotify removed roughly 75 million songs in a major crackdown on what it euphemistically called “spam” last September. Using a music spam filter, likely powered by AI, Spotify claimed to identify uploaders of spam, tag them, and prevent the tracks from being prioritized by its algorithm. The language was especially strong regarding “voice clones,” which Spotify classifies as impersonations. However, artists whose music passes the spam filter are not required to label it as entirely or partially AI-generated. All of the aforementioned AI artists that gained fame on the platform are still hosted there, as well as on Apple Music, which has all but embraced AI, albeit primarily as a personalized playlist-creation tool.

One of the more startling aspects of the proliferation and rapid advancement in quality of AI-generated music (not to mention images) is the way it makes us, as listeners, feel. For many a principled music lover (i.e., Bandcamp’s target demographic), listening to an AI-generated voice masquerading as a human one is unsettling. More disturbing is the possibility that this artificial music could produce an emotional response in the listener. At the same time, many listeners have no problem with AI-generated music, with some even finding it novel and exciting, a reminder that AI skepticism is still relatively fringe despite many of us living in left-leaning, conscientious bubbles that suggest otherwise. 

A recent study on the “perceived humanness” of AI music found that, when presented with pairs of songs and asked which of them had been generated by AI, specifically via Suno, participants chose correctly only 53% of the time. Accuracy rose to 66% when the human and AI songs in a pair were stylistically similar. But because AI-generation models update frequently, by the time the study was released, a more advanced Suno model was already available. Recall now Bandcamp’s proposal to review all music flagged as potentially AI-generated by its users. With such a high margin of error, there are bound to be false reports, as well as instances of AI-generated music that go undetected.

It’s not just the uncanny valley that troubles critics of AI; many point to its vast, energy-guzzling infrastructure, with some large data centers consuming over a million gallons of water daily. Most troubling for artists is the idea of AI-generated acts superseding flesh-and-blood musicians in the industry rat race, a novel iteration of the age-old fear that humans will lose their jobs to automatons. It will be instructive to see the material consequences of this ban in the coming months, and whether other streaming platforms follow suit in issuing harsher AI policies.