Shortly after its January release, the data-poisoning tool Nightshade exceeded the expectations of its developers at the University of Chicago's computer science department, racking up 250,000 downloads. With Nightshade, artists can deter AI models from training on their artwork without permission.
The Bureau of Labor Statistics reports that more than 2.67 million artists work in the United States, but social media responses indicate that downloads have come from across the globe. According to one of the developers, cloud mirror links were set up to keep the University of Chicago's web servers from being overloaded.
The project's leader, Ben Zhao, a computer science professor at the University of Chicago, told VentureBeat that "the response is simply beyond anything we imagined."
"Nightshade seeks to 'poison' generative AI image models by altering artworks posted to the web, or 'shading' them on a pixel level, so that they appear to a machine learning algorithm to contain entirely different content — a purse instead of a cow," the researchers explained. After training on multiple "shaded" photos taken from the web, the goal is for AI models to generate erroneous images based on human input.
Zhao, along with colleagues Shawn Shan, Wenxin Ding, Josephine Passananti, and Heather Zheng, "developed and released the tool to 'increase the cost of training on unlicensed data, such that licensing images from their creators becomes a viable alternative,'" VentureBeat reports, citing the Nightshade project page.
AI companies themselves reportedly offer opt-out requests that purport to stop unauthorized scraping; however, TechCrunch notes that "those motivated by profit over privacy can easily disregard such measures."
Zhao and his colleagues do not intend to dismantle Big AI, but they do want to ensure that tech giants pay for licensed work, as any business operating in the open must, or risk legal repercussions. According to Zhao, the fact that AI companies deploy web-crawling spiders that collect data algorithmically, and often undetectably, has essentially become a permit to steal.
Nightshade shows that these models are vulnerable and that there are ways to attack them, Zhao said. What that means, he added, is that content creators have ways to push back harder than writing to Congress or complaining via email or social media.
Glaze, another of the team's apps that guards against AI infringement, has reportedly been downloaded 2.2 million times since its April 2023 release, according to VentureBeat. By altering pixels, Glaze makes it more difficult for AI models to "learn" an artist's distinctive style.