Nightshade, the free tool that ‘poisons’ AI models, is now available for artists to use

Nightshade, a new tool developed by the Glaze Project at the University of Chicago, enables artists to "poison" AI models that train on their artwork. It subtly distorts images so that an AI model "sees" something different from what is actually depicted, serving as a proactive countermeasure against models copying an artist's style without consent. Nightshade is designed to be used alongside Glaze, an earlier tool from the same team that takes a defensive approach. Available for Mac and Windows, Nightshade aims to make unauthorized training on scraped data more expensive, pushing AI developers toward licensing agreements with creators. Some have characterized it as a cyberattack on AI, but its creators argue it exists to protect artists' rights and deter data scraping.