Who Made This Song? Platforms Start to Label AI Music


AI-Generated Music and the Push for Transparency

The rise of artificial intelligence (AI) in music creation is challenging established norms around authorship, authenticity, and even the very definition of a musical artist. Last fall, an AI-generated country song quietly reached No. 1 on Billboard’s Country Digital Song Sales chart, raising questions about the future of music production and distribution. Now, industry players are grappling with how to address the influx of synthetic content and ensure transparency for listeners.

The Breaking Rust Phenomenon

The song, “Walk My Walk,” was released under the name Breaking Rust, a fictional artist with an AI-generated persona. Breaking Rust gained traction on streaming platforms, attracting over 2 million monthly listeners on Spotify despite lacking a traditional artist biography. Several of its songs were played over a million times, and one single exceeded 4.5 million streams. The creator of Breaking Rust remained anonymous.

The success of “Walk My Walk” highlighted a key characteristic of the Country Digital Song Sales chart: because far fewer people purchase digital downloads than stream music, a relatively small number of purchases can propel a song to the top. This dynamic raised concerns that the chart could be manipulated.

The Scale of AI-Generated Music

The proliferation of AI-generated music is rapidly increasing. Deezer reported receiving over 60,000 fully AI-generated tracks daily in January, a significant jump from the 10,000 it received when it first deployed its detection tool in early 2025. According to Music Business Worldwide, synthetic content now accounts for roughly 39% of all music delivered to the platform daily.

Deezer found that up to 85% of streams on AI-generated music were fraudulent in 2025, suggesting that many streams are artificially inflated to generate royalty payouts. This underscores the connection between the transparency problem and the issue of fraudulent streaming activity.

Platforms Respond with Disclosure Frameworks

In response to these challenges, platforms are beginning to explore disclosure frameworks. Apple announced “Transparency Tags” this month, a metadata framework covering track, composition, artwork, and music video content categories. Labels and distributors can apply these tags immediately, although they are currently optional, and there is no independent verification or enforcement mechanism.

Spotify’s approach is similarly focused on voluntary disclosure. Co-CEO Gustav Söderström stated that Spotify should not dictate the tools artists use, but acknowledged listener demand for clarity regarding the creation process. The platform is working with the industry to allow creators and labels to include metadata indicating how music was created, which Spotify intends to surface to users.

Broader Implications for Content Creation

The music industry’s debate over disclosure is part of a larger challenge facing platforms across various content types. While music benefits from streaming data that can quantify the impact of synthetic content, detecting AI-generated video and images is harder, and the stakes for trust are arguably higher.

Meta, for example, launched Vibes in September, a dedicated feed for short-form AI-generated video, effectively segregating synthetic content from the main content environment. This contrasts with Apple’s labeling approach, which aims to integrate disclosure within a shared content space.

Looking Ahead

Both disclosure and segregation strategies are experimental, and their effectiveness at scale remains to be seen. As AI continues to reshape the creative landscape, the industry will need to navigate the complex interplay between innovation, transparency, and the protection of artists and listeners.
