Salamendacious@lemmy.world to News@lemmy.world · 1 year ago
Meet Nightshade, the new tool allowing artists to ‘poison’ AI models with corrupted training data (venturebeat.com)
cross-posted to: technology@hexbear.net, art@hexbear.net, technews@radiation.party
aliteral@lemmy.world · 1 year ago
I understand where you are coming from, but most AI models are trained without the consent of those whose work is being used. Same with GitHub Copilot: its training violated the licensing terms of various software licenses.