Salamendacious@lemmy.world to News@lemmy.world · 1 year ago
Meet Nightshade, the new tool allowing artists to ‘poison’ AI models with corrupted training data (venturebeat.com)
Cross-posted to: technology@hexbear.net, art@hexbear.net, technews@radiation.party
Stuka@lemmy.ml · 1 year ago
So artists can’t make certain art because some company’s AI might get confused. Right then.
SCB@lemmy.world · 1 year ago
… If an artist doesn’t want their art used, we already have a system in place for that. If that system needs expanding or change, then that is the discussion that should be had. Laws are better than random acts of destruction.