
AI Vocal Cloning App Voicify Facing Legal Action From UK Music Industry Over Its Catalog of Artist Voice Deepfakes

Updated: May 6




As reported by Music Business Worldwide, Jammable, formerly known as Voicify.ai, has been a popular vocal cloning service for some time. The company allows users to clone the voices of famous artists without permission, making it possible to create their own musical deepfakes. Jammable’s website currently has over 3,000 unlicensed AI-generated voice models available on its service, including those of Adele, Justin Bieber, Phil Collins, Eminem, Ariana Grande, Michael Jackson, Bruno Mars, George Michael, Elvis Presley, Prince, Tupac Shakur, Ed Sheeran, Taylor Swift, and Amy Winehouse.



The voice models are available for creating “covers” by stripping out the vocal line of a piece of recorded music and replacing it with the AI-generated vocals of another artist. Jammable has become significant enough to be the only voice-cloning service the Recording Industry Association of America (RIAA) named in its submission last fall to the US Trade Representative’s annual “notorious markets” report.


“The service stream-rips the YouTube video selected by the user, copies the acapella from the track, modifies the acapella using the AI vocal model, and then provides to the user unauthorized copies of the modified acapella stem, the underlying instrumental bed, and the modified remixed recording,” the RIAA said in that submission.


On its website, Jammable describes itself as the “#1 platform for making high quality AI covers in seconds.”


In February 2023, the UK’s recorded music industry group, BPI, sent a letter through its solicitors to Voicify, threatening legal action unless the vocal cloning site stopped its copyright-infringing activity. In response, Voicify changed its name to Jammable and altered some of its functionalities, but continues to offer users access to cloned voice models.




“Music is precious to us all, and the human artistry that creates it must be valued, protected and rewarded. But increasingly it is being threatened by deepfake AI companies who are taking copyright works without permission, building big businesses that enrich their founders and shareholders, while ripping off artists’ talent and hard work,” said Kiaron Whitehead, general counsel of the BPI.





“The music industry has long embraced new technology to innovate and grow, but Voicify (now known as Jammable), and a growing number of others like them, are misusing AI technology by taking other people’s creativity without permission and making fake content. In so doing, they are endangering the future success of British musicians and their music.


“We, like all true music fans, believe that human artists must be supported, and we reserve our right to take action against any operation that infringes artists’ rights and damages their creative talent and prospects in this way.”


Jammable charges its users a subscription fee ranging from $1.99 a month for a “starter” account to $89.99 a month for a “power user” account. The BPI has garnered support from various music industry groups, including the UK Musicians’ Union, the Ivors Academy, the Music Publishers Association, UK Music, PPL, and the Association of Independent Music (AIM) in confronting Jammable/Voicify. Experts in the music industry have expressed concerns that platforms like Jammable are misusing AI technology and endangering the future success of British musicians.


Nick Eziefula, a copyright and AI lawyer with law firm Simkins LLP, stated in an email to media, “Today’s news proves that the ‘Wild West’ era of unlicensed AI music generation may not last much longer if music rights-holders have any say in the matter.” 




“It is impossible to see how an AI platform that flagrantly and deliberately mimics artists’ voices could be built without using recordings of those voices as training data. Using copyright recordings in this way is unlawful in England unless permission has been granted or any of the relevant exceptions or defenses to copyright infringement apply.”




Eziefula added that, while an artist’s voice may not itself be a copyrighted work, “it may amount to personal data, in which case unauthorized use may be a breach of data privacy laws. There may also be a misappropriation of the artist’s brand, so false endorsement principles could well apply, and even claims for reputational damage if AI-generated works appear to be attributed falsely to the original artists.”


He went on to state, “These are risks day-to-day consumers of the service are unlikely to be aware of – and Voicify seems to hide behind its small print here, expressly asserting that it is not responsible for its customers’ use of generated content.”


The music industry is increasingly asserting its rights in the face of a massive proliferation of generative AI tools that have made it easier than ever to clone vocals, music or other forms of intellectual property. Rightsholders in the US have filed numerous lawsuits against AI developers, asserting that they infringed on their rights by using copyrighted materials without permission in the training of AI models. As the alarm over deepfakes grows louder, lawmakers are beginning to react. The No AI FRAUD Act, currently before the US House of Representatives, would enshrine a right of publicity into federal law for the first time, giving individuals or rights holders the ability to sue when a person’s voice or likeness has been used in AI-generated content without permission.


The European Union’s parliament recently passed the AI Act, the Western world’s first comprehensive set of laws regulating the development and use of AI. Among its regulations is a requirement that developers of “general-purpose AI models” obtain authorization to use copyrighted materials in developing their AI models. However, some doubts remain as to whether the law is sufficient to capture all the potential acts of infringement being seen today.



DISCLAIMER

Capital Culture seeks to provide accurate and up-to-date content.

With that in mind, we still advise readers to verify the facts of our articles and to consult a professional before making any decision based upon information obtained.

© 2023 Capital Culture
