I don’t know how I feel about Nightshade. Nightshade is a newly released tool that lets artists “poison” datasets that scrape images of their artwork without permission. Before sharing an image online, an artist can booby-trap it so that if it is swept into an AI training dataset without their consent, it contaminates the data with misinformation: the flawed data may cause the AI to read a dog as a cat (to use the common example) or a radioactive Cheeto as a religious aura (to use mine).
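For the curious, here’s a minimal sketch of the general idea behind this kind of attack. To be clear, this is not Nightshade’s actual code; it assumes a generic image encoder, and the toy network, images, and parameters below are all made up for illustration. The trick is to nudge an image with tiny pixel changes so that a model’s feature extractor “sees” it as a different concept, while a human still sees the original.

```python
import torch
import torch.nn as nn

# Toy "feature extractor" standing in for a real model's image encoder.
# (Hypothetical stand-in: Nightshade targets actual text-to-image models.)
encoder = nn.Sequential(
    nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
    nn.Flatten(), nn.Linear(32 * 8 * 8, 64),
)
encoder.eval()

dog_image = torch.rand(1, 3, 32, 32)   # the artwork being protected
cat_anchor = torch.rand(1, 3, 32, 32)  # an image of the target concept

with torch.no_grad():
    target_features = encoder(cat_anchor)  # where we want the dog to "land"

# Learn a small perturbation that drags the dog's features toward "cat".
delta = torch.zeros_like(dog_image, requires_grad=True)
opt = torch.optim.Adam([delta], lr=0.01)
eps = 0.05  # bound on per-pixel change, so the edit is hard to see

for step in range(200):
    opt.zero_grad()
    poisoned = (dog_image + delta).clamp(0, 1)
    loss = nn.functional.mse_loss(encoder(poisoned), target_features)
    loss.backward()
    opt.step()
    with torch.no_grad():
        delta.clamp_(-eps, eps)  # keep the distortion nearly invisible

# `poisoned` still looks like a dog to a person, but the encoder now
# maps it near the "cat" concept -- a mislabeled training example.
```

As I understand it, Nightshade’s real optimization targets the encoders used by text-to-image diffusion models and is far more sophisticated than this, but the shape of the trick is the same: small, nearly invisible pixel changes that push an image across a concept boundary inside the model.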