
A Fresh Take

Insights on US legal developments

Reposted from Freshfields Technology Quotient

Permission to Deep Fake – Who Owns the Rights to AI-Generated Content?

“Feel free to use my voice without penalty,” tweeted the Canadian musician Grimes after the song “Heart on My Sleeve” went viral for its use of AI-generated vocals made to sound like Drake and The Weeknd. Although Grimes’ tweet seemingly constituted an offer to “split 50% royalties on any successful AI generated song that uses [her] voice,” the same split she normally offers artists for collaborations, is the use of generative AI in the music industry truly that simple? In an era of uncertainty regarding intellectual property rights in AI-generated content, how might the growing use of AI across industries shape the strategies that creators and rights-holders employ to protect and enforce their rights?

In the US, at least, a work must have a human author to benefit from copyright protection. In March 2023, the US Copyright Office (USCO) issued guidance addressing AI authorship, clarifying that authors may claim copyright protection only for human contributions. The guidance does not, however, create a bright-line rule on what constitutes an author’s contribution to a work that combines both human- and AI-created elements. Individuals who create works with generative AI tools should keep in mind that while their human contributions may qualify for copyright protection, the portions created by AI would not.

Generative AI, from its development to its deployment for content creation, presents additional legal hurdles beyond the question of authorship. The successful copyright takedown of the AI-generated song “Heart on My Sleeve” rested on the inclusion of copyrighted material (producer Metro Boomin’s actual audio tag). Creators should be aware that any work created using AI tools may contain copyrighted elements, which could expose them to claims of copyright infringement.

But even without the copyrighted tag, could the copyright owners have relied on other legal mechanisms to effect the takedown? To reach the point where generative AI can produce output that sounds like real, or even AI-generated, artists, the underlying model must be trained on existing works, further raising the risk that infringement claims will be asserted. Although courts have yet to opine on whether training AI on copyrighted works of any kind violates the copyright holder’s rights, any infringement claim resting on the use of copyrighted materials for training would likely involve a fact-specific inquiry.

In addition to IP infringement concerns, use of an artist’s voice for AI-generated works may raise other legal considerations. Artists who are bound by exclusivity deals with their record labels may need the labels’ consent even if the artists themselves approve of AI-generated sound-alikes. While Grimes tweeted that she has “no label and no legal bindings,” that may not be the case for many artists. Further, unauthorized use of a person’s name and voice for an AI-generated song may also violate that individual’s right of publicity in their own name, image, and likeness. For example, Bette Midler and Tom Waits have successfully used right of publicity claims to challenge human sound-alikes. Paul McCartney recently announced he will use AI to replicate John Lennon’s voice in a song scheduled for release later this year, and we are likely to see a continued rise in AI look- and sound-alikes as the technology improves. Because use of a person’s image and voice can be particularly sensitive and emotional, we may, absent express permission, see a corresponding increase in right of publicity claims as a means of objecting to AI use.

Of course, these potential pitfalls are not confined to the entertainment industry; they apply equally to AI use in industries like technology or law, where companies are constantly seeking ways to improve efficiency in coding, debugging, research, drafting, and other time-consuming tasks. Certain strategies can help players across industries strike a balance, capturing potential efficiency gains while protecting users of generative AI tools against challenges in this evolving legal landscape. Keeping records of human involvement in the creation of a work, and of how much original content was used to train the generative AI tool, could help overcome objections to copyrightability. Users should also consider tracking and tagging code and other content developed using these tools, so that such code or content can be isolated if a legal challenge arises. With appropriate checks and balances in place, companies and individuals feeling pressure to adopt generative AI tools to keep up with industry trends should be able to mitigate some of these risks.
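By way of illustration only, and not as legal advice or a prescribed tool, the following is a minimal sketch in Python of how a team might log provenance metadata for AI-assisted content so that AI-generated portions can be identified and isolated later. Every file name, field, and function here is hypothetical.

```python
import json
from datetime import datetime, timezone
from pathlib import Path

# Hypothetical provenance log: one JSON record per contribution, noting whether
# the content was human-authored or produced with a generative AI tool.
LOG_PATH = Path("ai_provenance_log.json")

def record_contribution(file_path, lines, origin, tool=None, author=None):
    """Append a provenance record for a range of lines in a file.

    origin: "human" or "ai-assisted"; tool and author are free-text labels.
    """
    entry = {
        "file": str(file_path),
        "lines": list(lines),          # e.g. [start, end], 1-indexed
        "origin": origin,
        "tool": tool,                  # e.g. the code-generation tool used
        "author": author,              # the human author or reviewer
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    }
    records = json.loads(LOG_PATH.read_text()) if LOG_PATH.exists() else []
    records.append(entry)
    LOG_PATH.write_text(json.dumps(records, indent=2))

def ai_assisted_files():
    """Return the set of files containing any AI-assisted content,
    so they can be isolated or reviewed if a legal challenge arises."""
    if not LOG_PATH.exists():
        return set()
    records = json.loads(LOG_PATH.read_text())
    return {r["file"] for r in records if r["origin"] == "ai-assisted"}

if __name__ == "__main__":
    record_contribution("src/parser.py", [10, 42], "ai-assisted",
                        tool="code-assistant", author="j.doe")
    record_contribution("src/parser.py", [43, 60], "human", author="j.doe")
    print(ai_assisted_files())
```

Maintaining records like these alongside ordinary version-control history would also document the extent of human involvement in a work, which, as noted above, may help support arguments for copyrightability.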

*With special thanks to Olivia Luongo for co-authoring this article.