Although the Screen Actors Guild – American Federation of Television and Radio Artists continues to strike, the Writers’ Guild of America (WGA) and the Alliance of Motion Picture and Television Producers (AMPTP), the trade association responsible for bargaining on behalf of Hollywood production companies, finally signed a deal after a historic 148-day-long strike.
As discussed here, using artificial intelligence (AI) was a key issue throughout the negotiations. As the details of the agreement have become public, it appears that the WGA was successful in its quest to regulate AI (at least to some degree). This agreement not only marks the end of the second-longest WGA strike in Hollywood history; it also provides insight into how the industry plans to approach the use of AI.
Although the law has not been able to keep up with the speed at which AI has been developing, the film and television industry appears to be making strides.
The parties agreed that generative AI (GAI) will not be considered a “writer.”
Further, any materials that it produces will not constitute literary materials – defined as stories, adaptations and screenplays, among other types of works – for use in the production of TV and film projects. Any material that GAI produces will have to be edited by a human author before it will be considered literary material. This approach will reassure writers that they will not be competing with GAI for credits or Oscar nominations (for now).
By requiring human authors to revise and adapt any GAI-generated work, the AMPTP may be strategically finding a way to use GAI in the script writing process while maintaining copyright in the work. Whether copyright subsists in GAI-generated works has been a very hot topic – but it has not reached the courts, and the answer may depend on the circumstances and differ between countries.
In most countries, including Canada, a work must have a human author to benefit from copyright protection. If GAI contributed to the work, whether copyright is available depends on the extent of the human involvement in getting the GAI to “write” the work.
The WGA was also successful in preventing studios from using its materials to train generative AI models without its permission, as set out in the Summary of the 2023 WGA MBA published by the WGA.
Interestingly, it remains unclear whether the “law” prevents the unauthorized use of copyrighted materials to train GAI. Whether this type of use amounts to copyright infringement is a question that is currently before the courts in multiple jurisdictions.
These lawsuits may succeed but the outcome cannot be predicted (at least not yet). Even if the United States courts reach a position on infringement, the outcome could be different elsewhere. For example, in the UK and Canada, the “fair dealing” defence – a statutory defence that could potentially be used to allow AI to be trained on copyright works without permission – is much narrower than the US fair use defence.
Depending on how these cases shake out, studios may be permitted by law to use copyrighted works to train GAI for the purposes of script writing, but prohibited from doing so by the contract (provided that the copyrighted work was written by a member of the WGA).
However, until the courts weigh in on this issue, using copyrighted works to train GAI for the purposes of script writing will present risks. For example, producers may be exposed to liability in their agreements with third parties, such as distributors and broadcasters, to whom the producers would have represented and warranted that the film and its underlying elements, including the script, are original and do not infringe third party rights.
Further, the producers’ errors and omissions insurance policies could become vulnerable, having been based on representations from the producer that the chain of title in the production is sound.
Although the contract provides insight into the industry’s approach to using GAI in the script writing process, other uncertainties and areas of concern exist. Aside from copyright issues, the use of GAI raises concerns surrounding privacy, defamation and personality rights, and so further clarity is still required.
AI has the potential to radically alter working life, the economy, cultural norms, our security and even what it means to be human. As a result, we expect to see AI regulation take form in the near future.
However, it has been difficult to predict what this regulation will look like. Because the technology moves so quickly, its impacts are deeply uncertain. Outside of some very specialized areas (such as autonomous vehicles), regulation of “AI” in general will probably take the form of legal obligations to be cautious and retain documents.
The WGA-AMPTP deal helps to inform best practices in the entertainment industry, but we continue to look forward to new regulations and court decisions. Until then, anyone who is considering putting AI to work in the film and television industry should consider legal advice before proceeding.
Read the original article on GowlingWLG.com