A trio of artists has filed what will undoubtedly be a closely watched lawsuit over the use of their artwork to train artificial intelligence (“AI”) image generators to create “new” art in response to user prompts. According to the complaint that they filed in a California federal court on January 13, Sarah Andersen, Kelly McKernan, and Karla Ortiz claim that DeviantArt, Midjourney, and Stability AI (collectively, the “defendants”) – by way of their respective AI image generators, all of which are built on a software library called Stable Diffusion – are engaging in “blatant and enormous infringement of their rights” by using the artists’ works – without authorization – to create what are being characterized as “new” images that amount to infringing derivative works.
In the newly filed proposed class action lawsuit, Andersen, McKernan, and Ortiz (collectively, the “plaintiffs”) assert that to create the AI-centric Stable Diffusion library, Stability “scraped, and thereby copied over five billion images from websites,” and did so without seeking consent from the creators of the artworks at issue or the websites that hosted them. These “training images” are then used to produce “seemingly new images through a mathematical software process” based on prompts from the AI image generators’ users. “These ‘new’ images are based entirely on the training images,” the plaintiffs assert, and thereby amount to “derivative works of the particular images Stable Diffusion draws from when assembling a given output.”
(The exclusive right to create a derivative work – i.e., a work based on or derived from one or more already existing works – is held by the copyright holder of the underlying work.)
With the foregoing in mind, the defendants, “by and through the use of their AI image [generators], benefit commercially and profit richly from the use of copyrighted images,” the plaintiffs allege. Pointing to 2-year-old Midjourney, for instance, the plaintiffs assert that the company “skipped the expensive part of complying with copyright and compensating artists, instead helping themselves to millions of copyrighted works for free.” And the “harm to artists is not hypothetical,” they contend, as “works generated by AI image [generators] ‘in the style’ of a particular artist are already sold on the internet, siphoning commissions from the artists themselves.”
The plaintiffs set out claims of direct and vicarious copyright infringement, violation of the Digital Millennium Copyright Act, violation of their statutory and common law rights of publicity, breach of contract, and unfair competition. Among other things, they argue that Stability AI and co. “are using copies of the training images interconnected with their AI image [generators] to generate digital images and other output that are derived exclusively from the training images, and that add nothing new.” As such, the defendants are infringing the plaintiffs’ copyrights by “reproduc[ing] one or more of their works” and/or “prepar[ing] derivative works based upon one or more of the works” to create “digital images and other output that act as market substitutes for the underlying training images, thereby competing with the plaintiffs and members of the class.”
On the right of publicity front, the plaintiffs allege that defendants “used [their] names and advertised the AI’s ability to copy or generate work in the artistic style that [they] popularized in order to sell [the AI] products and services.” The defendants’ use of their names was “not incidental,” Andersen, McKernan, and Ortiz argue, claiming that the defendants “specifically and knowingly used [their] names because these names were uniquely related to specific artistic styles, and the defendants generated valuable business from their ability to sell artworks ‘in the style’ that the plaintiffs popularized.”
Because the defendants “advertise the ability of their systems to generate artwork ‘in the style’ of their work – and explicitly used [their] work to train the AI algorithms,” Andersen, McKernan, and Ortiz maintain that the art generated by the defendants’ AI products is “not transformative,” and the defendants’ “misappropriation merely capitalizes on [their] theft of the plaintiffs’ artistic work and the associated value of the plaintiffs’ names.” (In the same vein, the plaintiffs assert that among the common questions shared by the potential class members is “[w]hether Defendants violated Plaintiffs’ and the Class’s rights of publicity when they designed their AI Image Products to respond to prompts requesting output images ‘in the style’ of specific individuals, namely Plaintiffs and the Class.”)
As a result of the defendants’ “scheme,” Andersen, McKernan, and Ortiz are seeking injunctive relief, an award of statutory and other damages, and certification of their proposed class action.
The case is Sarah Andersen, et al. v. Stability AI Ltd., et al., 3:23-cv-00201 (N.D. Cal.).