Is it fairly easy? Seems useful for a public site like Lemmy and the fediverse
https://nightshade.cs.uchicago.edu/whatis.html
https://decrypt.co/203153/ai-prompt-data-poisoning-nightshared
This doesn’t have anything to do with tracking. It’s meant to sabotage free and open image generators (i.e. Stable Diffusion) by poisoning their training data. It’s unlikely to do anything, though.
Hard to say what the makers want to achieve with this. Even if it did work, it would help artists about as much as better DRM would help programmers. On its face, this is just about enforcing an ultra-capitalist ideology that wants information to be owned.
I see it as trying to combat the dystopia where not only is our data scraped, but every single thing we write, draw, or film is fed into an AI that will ultimately be used to create huge amounts of wealth for very few, essentially monetizing our very existence online in a way that’s entirely unavoidable and without consent.

In addition, it’s entirely one-way: Google and others can grab as much of our data as they want, while most of us would have an extremely hard time even getting a freedom of information request about ourselves granted, let alone obtaining a similar amount of data about those same corporations.
But that is exactly what these poisoning attacks are fighting for. They are attacking open image generators that can be used by anyone, for fun or for business, without having to pay rent to some owner who isn’t lifting a finger. What do you think will happen if you knock that out?