Folk musician Murphy Campbell found AI-cloned versions of her songs uploaded to Spotify under her name, exposing gaping holes in identity verification across music distribution platforms.
Folk artist Murphy Campbell discovered in January that someone had used AI to clone her voice from YouTube recordings and uploaded fake versions of her songs to Spotify under her name. AI detection tools confirmed the tracks were likely AI-generated. The distributor Vydia then sent DMCA takedown notices against Campbell's own authentic recordings, weaponizing copyright law against the real artist. Despite months of effort, at least one fake track remains on Spotify under a duplicate artist profile, and Campbell now competes with multiple impersonator accounts bearing her name.
The attack vector here is trivially simple: rip audio from YouTube, run it through a voice cloning model, and upload via a distributor with zero verification. Spotify's proposed manual approval system treats the symptom after the fact; the real fix is cryptographic artist identity verification at the distribution API layer. Any developer building in the music or creator economy space should understand that current distributor APIs (TuneCore, DistroKid, Vydia) have no artist authentication beyond an email address.
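What verification at the API layer could look like: the distributor issues a signing key during a one-time identity check, and every upload must carry a valid signature over the audio. Below is a minimal sketch in Python using stdlib HMAC; all names (`register_artist`, `sign_upload`, `verify_upload`) are hypothetical, not any real distributor's API.

```python
import hashlib
import hmac
import secrets

# Hypothetical distributor-side registry: verified artist ID -> signing key.
# In this sketch the key is issued during a one-time onboarding identity
# check (government ID, PRO registration, etc. -- out of scope here).
ARTIST_KEYS: dict[str, bytes] = {}

def register_artist(artist_id: str) -> bytes:
    """One-time onboarding: issue a signing key to a verified artist."""
    key = secrets.token_bytes(32)
    ARTIST_KEYS[artist_id] = key
    return key

def sign_upload(key: bytes, audio: bytes) -> str:
    """Artist side: sign the SHA-256 hash of the audio before upload."""
    digest = hashlib.sha256(audio).digest()
    return hmac.new(key, digest, hashlib.sha256).hexdigest()

def verify_upload(artist_id: str, audio: bytes, signature: str) -> bool:
    """Distributor side: reject any upload whose signature doesn't match
    the key on file for the claimed artist."""
    key = ARTIST_KEYS.get(artist_id)
    if key is None:
        return False  # unknown artist: no key on file, no upload
    digest = hashlib.sha256(audio).digest()
    expected = hmac.new(key, digest, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

# An impersonator with ripped audio but no key can't forge a signature.
key = register_artist("murphy-campbell")
track = b"...audio bytes..."
assert verify_upload("murphy-campbell", track, sign_upload(key, track))
assert not verify_upload("murphy-campbell", track, "forged-signature")
```

A production system would use asymmetric signatures (e.g. Ed25519) so the distributor stores only a public key and cannot forge uploads itself; HMAC is used here only to keep the sketch stdlib-only.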
If you're building a music or creator tool, test this week how easy it is to upload audio to a distributor under a fake identity. Document the failure points and use them to spec an identity verification feature before a competitor does.