A.I. Is Making the Sexual Exploitation of Girls Even Worse: Parents, schools and our laws need to catch up to technology, fast.

    • ominouslemon@lemm.ee · 10 months ago

      You could do everything before, that’s true, but you needed knowledge/time/effort, so the phenomenon was very limited. Now that it’s easy, the number of victims (if we can call them that) is huge. And that changes things. It’s always been wrong. Now it’s also a problem

      • BringMeTheDiscoKing@lemmy.ca · 10 months ago

        This is right. To do it before, you had to be a bit smart and motivated. That’s a smaller cross-section of people. Now any nasty fuck with an app on their phone can bully and harass their classmates.

        • ominouslemon@lemm.ee · edited · 10 months ago

          > The time/effort here is very similar; both methods have their own quirks that make them better or worse than the other, but both are very fast and very easy to do.

          Either you’re lying to yourself and you know it, or you’re making false assumptions. Let’s go through this step by step.

          Now with a “nudify” app:

          • install a free app
          • snap a picture
          • click a button
          • you have a fake nude

          Before:

          • snap a picture
          • go to a PC
          • buy Photoshop for $30/month (sure), or search for a pirated version, download a crack, install it and pray that it works
          • find a nude picture that fits the person you’ve photographed
          • read a guide online
          • try to do it
          • you have (maybe) a bad fake nude

          That’s my first point. Second:

          > the result should just be ignored as far as personal feelings go

          Tell that to the girl who killed herself because everyone thought her leaked “nudes” were real. People do not work the way you think they do.

          > You don’t need special laws to file for harassment or even possible blackmail. This whole thing is just overblown fake hysteria and media panic because “AI” is such a hot topic at the moment.

          True, you probably don’t need new laws. But the emergence of generative AI warrants a public discussion about its consequences. There IS a lot of hype around AI, but generative AI is here, and it is already having, and will continue to have, a tangible impact. You can be an AI skeptic and still recognise that some things are actually happening.

          > In a few years this will all go away again because no one really cares that much, and real leaked nudes will possibly even be declared deepfakes to confuse people.

          For this to happen, things will have to get WAY worse before they get better. And that means people will suffer and possibly kill themselves, as has already happened. Are we ready to let that happen?

          Also, we’re talking only about fake nudes here. If you consider that GenAI is going to spread through every aspect of our world, your point becomes even more absurd.