AI & RoboticsNews

AI poisoning tool Nightshade received 250,000 downloads in 5 days: ‘beyond anything we imagined’

Nightshade, a free downloadable tool created by computer science researchers at the University of Chicago that lets artists disrupt AI models that scrape and train on their artwork without consent, was downloaded 250,000 times in the first five days after its release.

“Nightshade hit 250K downloads in 5 days since release,” wrote the leader of the project, Ben Zhao, a professor of computer science, in an email to VentureBeat, later adding, “I expected it to be extremely high enthusiasm. But I still underestimated it…The response is simply beyond anything we imagined.”

It’s a strong start for the free tool, and it shows a robust appetite among artists to protect their work from being used to train AI without consent. According to the Bureau of Labor Statistics, there are more than 2.67 million artists in the U.S. alone, but Zhao told VentureBeat that Nightshade’s user base likely extends well beyond them.

“We have not done geolocation lookups for these downloads,” Zhao wrote. “Based on reactions on social media, the downloads come from all over the globe.”

Nightshade seeks to “poison” generative AI image models by altering artworks posted to the web, or “shading” them at the pixel level, so that they appear to a machine learning algorithm to contain entirely different content — a purse instead of a cow, say. Trained on enough “shaded” images scraped from the web, an AI model can begin to generate erroneous imagery in response to user prompts.
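Nightshade’s actual optimization targets diffusion models and is far more sophisticated than anything shown here, but the core idea — a small, bounded pixel perturbation that drags an image’s features toward an unrelated “decoy” concept — can be sketched in a toy setting. Everything below is hypothetical: the linear “feature extractor,” the random stand-in images, and the parameter values are all made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a model's feature extractor.
# (Real models are deep networks; a linear map keeps the math visible.)
W = rng.normal(size=(16, 64))            # 64 "pixels" -> 16 features

def features(x):
    return W @ x

cow_img = rng.uniform(0, 1, 64)          # the artwork, flattened to pixels
purse_feats = rng.normal(size=16)        # features of the decoy concept

eps, lr = 0.03, 0.01                     # max per-pixel change, step size
x = cow_img.copy()
for _ in range(200):
    # Gradient of ||features(x) - purse_feats||^2 with respect to pixels
    grad = 2 * W.T @ (features(x) - purse_feats)
    x = x - lr * grad
    # Keep the edit visually subtle: clamp the perturbation to +/- eps,
    # then keep pixel values in the valid range
    x = np.clip(x, cow_img - eps, cow_img + eps)
    x = np.clip(x, 0.0, 1.0)

# The pixels barely moved, but the features shifted toward "purse"
print("max pixel change:", np.max(np.abs(x - cow_img)))
print("feature distance before:", np.linalg.norm(features(cow_img) - purse_feats))
print("feature distance after: ", np.linalg.norm(features(x) - purse_feats))
```

A model trained on many such images would associate the perturbed features with the wrong label — which is why a handful of shaded images can skew what a generator produces for a whole concept.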

On the Nightshade project page, Zhao and his colleagues — Shawn Shan, Wenxin Ding, Josephine Passananti, and Heather Zheng — stated they developed and released the tool to “increase the cost of training on unlicensed data, such that licensing images from their creators becomes a viable alternative.”

Shortly after Nightshade’s release on January 18, 2024, the flood of concurrent downloads so overwhelmed the University of Chicago’s web servers that the creators had to add mirror links where people could download copies from another location in the cloud.

The demand for nightshade has been off the charts. Global Requests have actually saturated our campus network link. (No dos attack, just legit downloads). We are adding two fast mirror links for nightshade binaries.

Win: https://t.co/rodLkW0ivK
Mac: https://t.co/mEzciopGAE

Meanwhile, the team’s earlier tool — Glaze, which works to prevent AI models from learning an artist’s signature “style” by subtly altering pixels so they appear to be something else to machine learning algorithms — has received 2.2 million downloads since it was released in April 2023, according to Zhao.

What’s next for the Glaze/Nightshade team?

Operating under their umbrella name, The Glaze Project, Zhao and his fellow researchers had already stated their intention to release a tool combining Glaze (defensive) and Nightshade (offensive).

As for when the combined tool is coming, it is at least a month away.

“We simply have a lot of to dos on our list right now,” Zhao wrote. “The combined version must be carefully tested otherwise ensure we don’t have surprises later. So I imagine at least a month, maybe more, for us to get comprehensive tests done.”

In the meantime, The Glaze Project researchers recommend that artists use Glaze first, then Nightshade, to protect their style while also disrupting AI model training — and they have been heartened to see artists doing just that, even though running two separate programs is a little more cumbersome.

“We warned people that we have not done full tests to understand how it works together with Glaze and that folks should wait before releasing any images with only Nightshade,” Zhao explained. “The artist community’s response was to say, ‘we will just Nightshade and Glaze in two steps, even though it takes more time and has more visible impact on the art.’”

An open-source version of Nightshade may be in the cards as well. “We will likely do an open source version at some point,” Zhao stated. “Just more time required to put out different versions.”

The project leader noted that he and his colleagues had not heard directly from, nor did they expect to hear from, the makers of AI image-generating models such as OpenAI (DALL-E 3), Midjourney, and Stability AI (Stable Diffusion). VentureBeat uses some of these tools and others to create article imagery and other content.



Author: Carl Franzen
Source: VentureBeat
Reviewed By: Editorial Team
