- cross-posted to:
- technology@lemmy.zip
Nucleo’s investigation identified accounts with thousands of followers engaging in illegal behavior that Meta’s security systems failed to detect; after being contacted, the company acknowledged the problem and removed the accounts.
A rebuttal to this that I’ve read is that the easy access may encourage people to dig into it and eventually want “the real thing”… but regardless, with it being FOSS, there’s no easy way to stop it anyway. It’s a Pandora’s box that we can never close.
And I could rebut that by saying that if someone is interested enough to seek it out with AI, they were likely to seek it out anyway without AI. Maybe it would take longer and be harder to find, but they’d be the intended audience that is now redirected elsewhere.
To quote myself:
We could rebut again and again and get nowhere, because either option is hard to discuss when it’s simply impossible to produce proper data to prove anything. Worse, defending the use of AI for this can get you accused of enabling it in the first place. And that’s not even counting how many people still believe that AI needs real sample images to produce this material (whether or not the algorithm was trained on CSAM is irrelevant on this particular point, as such material is not needed for generation).