‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity::It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

  • jivandabeast@lemmy.browntown.dev · 1 year ago

    No, I disagree, because before you could tell a fake from a mile away, but deepfakes bring it to a whole new level of creepy because they can be EXTREMELY convincing.

      • Delta_V · 1 year ago

        Or maybe an accessibility improvement. You don’t need to practice creating your own works of art over many years anymore, or have enough money to commission a master artist. The AI artists are good enough and work for cheap.

      • jivandabeast@lemmy.browntown.dev · 1 year ago

        I’m not saying that it’s a shift in nature? All I’ve been saying is:

        A) tools to create realistic nudes have been publicly available ever since deepfakes became a thing

        B) deepfakes are worse than traditional photoshopped nudes because (as you put it, a quality improvement) they're more convincing and can therefore have more detrimental effects

    • lolcatnip@reddthat.com · 1 year ago

      There was a brief period, between the invention of photography and now, when that was true. For thousands of years before that, it was possible to create a visual representation of anything you imagined without any hint that it wasn't something real. Makes me wonder if there were similar controversies about drawings or paintings.