I don’t know how I feel about Nightshade. Nightshade is a newly released tool that allows artists to “poison” datasets that use images of their artwork without permission. Before sharing an image online, an artist can booby-trap it so that if it is integrated into an AI training dataset without their consent, it contaminates the data with misinformation: flawed data may cause the AI to read a dog as a cat (to use the common example) or a radioactive cheeto as a religious aura (to use mine).
Obviously the intention here is to empower artists and send a message about the importance of respecting artists’ copyrights and intellectual property, which I get.
But I can’t help but think there’s something gnarly about this approach. Normally, artists come up with something way better when they are raging against the machine.
Just one example of this higher-quality raging is a project that I absolutely love, the Algorithmic Resistance Research Group, the endeavour of a group of artists including Eryk Salvaggio, Caroline Sinders, and Steph Maj Swanson aka Supercomposite.
The group engages in “the creative misuse of Generative AI, Machine Learning, and other automated data analysis systems”, developing strategies for creative resistance against algorithmic systems by understanding how to identify and subvert technologies that are harmful to human rights or society.
Here, artists are intervening in systems toward outcomes that bring visibility to the exploitation that is built into some of the algorithmic systems we encounter every day. Salvaggio brings together some great examples, such as Simon Weckert, who tricked Google Maps into registering a traffic jam by walking over a bridge with a wagon full of iPhones, or James Bridle, who trapped an autonomous car by painting a circle around it, and connects this work to the legacy of the Dadaists and Situationists, where artists hijacked the expectations of audiences and society by creating “non-sensical or deliberately counter-productive” data.
Another example: after discovering that artists were being ripped off by a company that automatically created and sold a T-shirt of any image on Twitter that someone had tagged with something like “That would be a great T-shirt”, artists started writing “I would buy that shirt” on bizarre or transgressive images, even images designed to pick a fight between the shirt thieves and corporate copyright lawyers. In the words of AV Club’s coverage of the phenomenon, “Where there’s an algorithm, there’s a way to fuck with that algorithm for our own collective amusement.”
Anyway, maybe it’s the idea of poisoning the data that rubs me the wrong way, but in contrast to the long history of artists whose raison d’être lay in finding ways to dissect, defy, dispute, and dissent, Nightshade seems a little like dropping a turd in the punch bowl, or going home with the football when the game is going badly. Although maybe it comes down to the examples: if they go ahead with an open-source version of Nightshade that allows artists to customize their data interventions, maybe my point is moot and we’re back in the zone of Dada, hijacking systems to teach AI that “hats are cakes” and “handbags are toasters”, and I’d be kind of into that.