‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity

It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

    • cosmicrookie@lemmy.world · 1 year ago

      But it’s not. That is not legal.

      I don’t know if it is where you live, but here (a Scandinavian country) and in many other places around the world, it is illegal to create fake nudes of people without their permission.

      • Daxtron2@startrek.website · 1 year ago

        Ah, I didn’t know that. AFAIK it’s protected artistic speech in the US. Not to say that’s right, but that’s probably why it’s still a thing.

        • barsoap@lemm.ee · 1 year ago

          In principle that’s the case in Germany, too, but only if the person is of public interest (otherwise you’re not supposed to publish any pictures of them where they are the focus of the image), and, secondly, it has to serve actually discernible satire, commentary, etc. Merely saying “I’m an artist and that’s art” doesn’t fly; hire a model. It’s similar to how you can dish out a hell of a lot of insults when you’re delivering a pointed critique, but if the critique is missing and it’s only abuse, that doesn’t fly.

          Ha. Idea: An AfD politician as a garden gnome peeing into the Bundestag.

      • TotallynotJessica@lemmy.world · 1 year ago

        Appreciate how good you have it. In America, child sex abuse material is only illegal when children were abused in making it, or if it’s considered obscene by a community. If someone edits adult actors to look like children as they perform sex acts, it’s not illegal under federal law. If someone generates child nudity using AI models trained on nude adults and only clothed kids, it’s not illegal at the national level.

        Fake porn of real people could be banned for being obscene, usually at a local level, but almost any porn could be banned by lawmakers this way. Harmless stuff like gay or trans porn could be banned by bigoted lawmakers, because obscenity is a fairly subjective mechanism. However, because of our near-absolute freedom of speech, obscenity is basically all we have to regulate malicious porn.

        • cosmicrookie@lemmy.world · 1 year ago (edited)

          The way I believe it works here is that it is illegal to distribute porn or nudes without consent, be it real or fake. I don’t know how it is with AI-generated material of purely imaginary people; I don’t think that is illegal. But if it is made to look like someone in particular, then you can get sued.

        • CaptainEffort@sh.itjust.works · 11 months ago

          child sex abuse material is only illegal when children were abused in making it

          This is literally why it’s illegal, though: because children are abused, permanently traumatized, or even killed in its making. Not because it disgusts us.

          There are loads of things that make me want to be sick, but unless they actively hurt someone, they shouldn’t be illegal.