Nightshade is a free tool created by computer scientists at the University of Chicago that poisons digital artwork against scraping by AI models. By making invisible changes to an image's pixels, it renders the image data useless for training: models that ingest poisoned images learn to reproduce them unpredictably. The tool is gaining traction as a copyright-protection measure and will likely affect generative AI products such as Midjourney, DALL-E, and Stable Diffusion.
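To give a sense of what "invisible changes to pixels" means, here is a minimal, hypothetical sketch of adding a small, bounded perturbation to an image so it looks unchanged to a human viewer. This is not Nightshade's actual algorithm (which computes targeted adversarial perturbations rather than random noise); the function name and the `epsilon` parameter are illustrative assumptions only.

```python
import numpy as np

def perturb_image(image: np.ndarray, epsilon: float = 2.0, seed: int = 0) -> np.ndarray:
    """Return a copy of `image` with per-pixel noise bounded by `epsilon`.

    `image` is an HxWxC uint8 array; `epsilon` caps the per-channel
    change so the result is visually indistinguishable from the input.
    Illustrative only -- Nightshade itself optimizes targeted
    perturbations, not random noise.
    """
    rng = np.random.default_rng(seed)
    noise = rng.uniform(-epsilon, epsilon, size=image.shape)
    # Clip so values stay in the valid 0-255 range before casting back.
    poisoned = np.clip(image.astype(np.float64) + noise, 0, 255)
    return poisoned.astype(np.uint8)

# Example: a synthetic 64x64 RGB image of uniform gray.
original = np.full((64, 64, 3), 128, dtype=np.uint8)
poisoned = perturb_image(original)
max_diff = np.abs(poisoned.astype(int) - original.astype(int)).max()
```

The point of the sketch is the constraint, not the noise: each channel moves by at most `epsilon` levels out of 255, which is below the threshold of human perception, yet the pixel values a model trains on are no longer the original ones.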
Nightshade was downloaded more than 250,000 times in its first week after release, and artists are adopting it to protect their digital artwork from AI scraping operations. The development marks a key move in the growing conflict between content creators and artificial intelligence companies that scrape art from the web to train AI models without permission.
Artists Take Up Nightshade Data-Poisoning Tool