Nightshade tool can “poison” images to thwart AI training and help protect artists [TechSpot]

MIT Technology Review highlights the new tool, called Nightshade, created by researchers at the University of Chicago. It works by making very small changes to an image's pixels, invisible to the naked eye, before the image is uploaded. These perturbations poison the training data used by the likes of…
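
The article doesn't describe Nightshade's actual algorithm, but the general idea it gestures at, altering pixels within a budget so small that humans can't see the change, can be sketched as below. This is a minimal illustration only: the function name, the `epsilon` budget, and the random noise are assumptions standing in for Nightshade's real model-targeted optimization.

```python
# A minimal sketch of pixel-level "poisoning": perturb an image within a
# small per-channel budget (epsilon) so the change is imperceptible to a
# human viewer but still alters the raw values a model trains on.
# NOTE: this is NOT Nightshade's method; the tool reportedly optimizes
# perturbations against specific text-to-image models, whereas the noise
# here is random and purely illustrative.

import numpy as np
from PIL import Image

def perturb_image(path_in: str, path_out: str, epsilon: int = 4) -> None:
    """Apply a small, visually negligible perturbation to each pixel.

    epsilon is the maximum per-channel change in 8-bit intensity units;
    a shift of 4 out of 255 is well below what the eye notices.
    """
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.int16)

    # Placeholder perturbation: uniform noise in [-epsilon, +epsilon].
    # A real poisoning attack would compute this via optimization against
    # a target model rather than sampling it at random.
    noise = np.random.randint(-epsilon, epsilon + 1, size=img.shape)

    poisoned = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(poisoned).save(path_out)

# Hypothetical usage: the file names are examples, not from the article.
perturb_image("artwork.png", "artwork_shaded.png")
```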
