Artists have long sought ways to protect their works from unauthorized use, particularly by artificial intelligence models that train on vast swaths of web data, often without permission.
Enter Nightshade v1.0: a cutting-edge tool released by computer scientists at the University of Chicago that gives artists a digital shield to protect their works from unwanted artificial intelligence consumption, VentureBeat reported.
Nightshade is the “offensive” counterpart to its predecessor, Glaze, a tool designed to obfuscate an artist’s style from artificial intelligence.
The changes Glaze makes to a work of art are “like UV light” – invisible to the naked eye. “The models have mathematical functions that allow them to perceive images very, very differently from how the human eye does,” said Shawn Shan, a graduate student researcher at the University of Chicago.
Similarly, Nightshade embeds pixel-level changes into graphics that are imperceptible to the human eye, but its tweaks effectively act as hallucinogenic “poison” for the AI, causing it to completely misinterpret the content, according to VentureBeat. Photos of pastoral scenes can suddenly be recognized by artificial intelligence as fashion accessories – for example, a cow becomes a leather handbag.
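The core idea – a change that is numerically present but visually invisible – can be illustrated with a toy sketch. This is not Nightshade’s actual algorithm (which computes targeted, optimized perturbations); the function name and parameters below are hypothetical, chosen only to show how a per-pixel change can stay below the threshold of human perception while still altering the numbers a model “sees”:

```python
# Illustrative sketch only -- NOT Nightshade's real method, which uses
# optimized, targeted perturbations rather than random noise.
import numpy as np

def add_imperceptible_noise(image: np.ndarray, strength: float = 2.0) -> np.ndarray:
    """Add a bounded per-pixel perturbation to an 8-bit RGB image array.

    `strength` caps the change per channel (out of 255), keeping it
    far below what a human viewer would notice.
    """
    rng = np.random.default_rng(0)
    noise = rng.uniform(-strength, strength, size=image.shape)
    return np.clip(image.astype(float) + noise, 0, 255).astype(np.uint8)

# A flat gray 4x4 test image: the perturbed copy differs by at most 2
# intensity levels per channel -- invisible to the eye, yet a measurable
# change in the pixel values a model actually processes.
img = np.full((4, 4, 3), 128, dtype=np.uint8)
poisoned = add_imperceptible_noise(img)
max_diff = int(np.abs(poisoned.astype(int) - img.astype(int)).max())
print(max_diff)  # at most 2
```

The gap between human and machine perception is exactly what the paragraph above describes: a difference of one or two intensity levels is invisible to a viewer, but a model’s mathematical functions register it precisely.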
The tool is available for Macs with Apple M1, M2, or M3 chips and for PCs running Windows 10 or 11.
Many artists, including Kelly McKernan – a plaintiff in a highly publicized copyright infringement class action lawsuit against artificial intelligence firms including Midjourney and DeviantArt – have welcomed Nightshade with open arms, according to the outlet. However, critics are condemning the tool as a veiled attack on AI models and firms, with one going so far as to call it an “illegal” hack.
Ahahah, this way of thinking is insane.
Dude is actually objecting to people Glazing their photos because it’s “illegal” in his eyes.
He compared it to having his computer hacked because it “disrupts its operation.” I’m delighted pic.twitter.com/BhMP73BkUb
— Jade (@jadel4w) January 19, 2024
The development team behind Nightshade stands by its work, arguing that its intention is not to wreak havoc on AI models, but to tip the economic scales – making it less financially profitable to ignore artists’ copyrights, and more attractive to enter into legal licensing agreements.