Legal Issues With Generative AI

Nancy Wolff and Scott Sholder, co-chairs of the litigation group at Cowan DeBaets Abrahams & Sheppard LLP, discuss the intellectual property issues raised by generative AI.

Published on 15 May 2023

What Is AI and What Copyright Issues Does It Raise?

Generative AI is an umbrella term used to describe any form of artificial intelligence that creates content.

“The US Copyright Office has taken a pretty clear stance; they believe there has to be a significant amount of human authorship in order for you to have ownership over creative work.”

Three key intellectual property issues are raised by AI:

  • the creation of training sets, and whether they need to be licensed or whether the inclusion of copyrighted material is fair use;
  • who will be held responsible if infringement is found: the person who entered the prompt or the company that created the AI platform; and
  • who, if anyone, can claim authorship of AI-generated works.

AI Training and How the Process Affects Artists and Authors

Creators in every field are likely to take an interest in AI and, even as it concerns them, to see it as a tool that can help them.

“There is a dark side to all this: AI could be used to create false news and propaganda and to put a lot of people out of work.”

AI will only be as good as the data on which it is trained, but if creative-industry jobs are lost, the new works needed to supplement those datasets will be lost with them.

Recent and Pending Litigation

One of the first lawsuits in this area involves Getty Images, which has one of the world’s largest image and video libraries. Getty brought claims against Stability AI, the developer of Stable Diffusion, in the UK and in Delaware, involving not just copyright in the roughly 12 million images alleged to have been scraped, but also the data that accompanies those images. Getty also brought claims for trademark infringement because the stock photo industry relies on watermarks, and distorted Getty watermarks were often visible in content created by Stable Diffusion.

Another case is a class action brought by three artists against AI art generators, purporting to represent the interests of thousands of creators.

Recommended Best Practices for Both AI Companies and Users

Guardrails are needed. Platforms should offer a “safe mode” built on licensed datasets, let users avoid generating works in the style of a living artist or author, and prevent the digital recreation of real places so that it is harder to generate “fake news.”

Tools now exist that allow creators to detect whether their content has been used by some platforms.

For AI companies, given likely future regulation, licensing is the way to go. Where an AI platform has scraped the internet indiscriminately, acquiring news, photos and content involving real people who have never given permission for their likenesses to be used for commercial purposes, there is a clear risk of liability.

