Stability AI Partners with Warner Music Group to Build Responsible Generative-AI Tools for Musicians
The partnership aims to develop professional-grade audio tools that enable artists, songwriters, and producers to experiment and compose using ethically trained AI models.
Warner Music Group (WMG) and generative-AI leader Stability AI today announced a strategic collaboration to create the next generation of responsible AI tools for music creation.
Last month, Stability AI entered a similar agreement with Universal Music Group (UMG) to co-develop next-generation professional AI tools for music creation.
“This collaboration represents an important step toward developing responsible, artist-friendly AI tools that expand creative possibilities while safeguarding the rights and integrity of music creators,” said Carletta Higginson, EVP and Chief Digital Officer at WMG.
The two companies will work directly with artists to understand how emerging technologies fit into their creative workflows. This feedback will help shape AI tools that expand creative possibilities, preserve artistic control, and protect creators’ rights.
“WMG is a leader in the global music and entertainment landscape, and we’re proud to partner with a company that shares our Artist First ethos. At Stability AI we put artists at the center and build tools that support their creative process. This partnership deepens that focus and will open new creative possibilities for artists through generative AI,” Prem Akkaraju, CEO of Stability AI, added.
Stability AI will contribute its Stable Audio family of models, built for professional use and trained exclusively on licensed data to ensure commercially safe, high-quality audio generation.
In September, the AI startup launched Stable Audio 2.5, the first audio generation model purpose-built for enterprise sound production at scale.