Salamendacious@lemmy.world to News@lemmy.world · 1 year ago
Meet Nightshade, the new tool allowing artists to 'poison' AI models with corrupted training data (venturebeat.com)
Cross-posted to: technology@hexbear.net, art@hexbear.net, technews@radiation.party
Asifall@lemmy.world · 1 year ago
I don't think the idea is to protect specific images; it's to create enough of these poisoned images that training your model on random free images you pull off the internet becomes risky.
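The risk described above can be sketched with a toy simulation. This is not Nightshade's actual technique (which reportedly adds near-imperceptible pixel perturbations); it is a hypothetical label-corruption example showing how even a modest fraction of poisoned samples, scattered through scraped data, drags a model's learned statistics off target:

```python
import random

rng = random.Random(0)  # fixed seed for reproducibility

def make_data(n, poison_frac=0.0):
    """1-D samples: class 0 clusters near -1, class 1 near +1.
    A poisoned sample carries an extreme class-1-looking feature (~+5)
    but the wrong label (0), corrupting the feature-label link."""
    data = []
    for _ in range(n):
        if rng.random() < poison_frac:
            data.append((rng.gauss(5.0, 0.5), 0))  # poisoned sample
        else:
            label = rng.randint(0, 1)
            data.append((rng.gauss(-1.0 if label == 0 else 1.0, 0.5), label))
    return data

def train_centroids(data):
    # Nearest-centroid "model": the mean feature value per label.
    sums, counts = {0: 0.0, 1: 0.0}, {0: 0, 1: 0}
    for x, y in data:
        sums[y] += x
        counts[y] += 1
    return {y: sums[y] / max(counts[y], 1) for y in (0, 1)}

def accuracy(model, data):
    hits = sum(1 for x, y in data
               if min((0, 1), key=lambda c: abs(x - model[c])) == y)
    return hits / len(data)

clean_test = make_data(2000)  # untainted evaluation set
acc_clean = accuracy(train_centroids(make_data(2000)), clean_test)
acc_poisoned = accuracy(
    train_centroids(make_data(2000, poison_frac=0.15)), clean_test)

print(f"clean training accuracy:    {acc_clean:.2f}")
print(f"poisoned training accuracy: {acc_poisoned:.2f}")
```

With 15% of training samples poisoned, the class-0 centroid is pulled far off its true position and test accuracy drops noticeably, even though the trainer never sees which samples were tampered with. That is the scale argument in the comment: no single image is protected, but indiscriminate scraping becomes a gamble.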