I’m so glad I’m not growing up in this age of smartphones, social media, and bullshit generators. Life was hell enough in the 90s without all that noise.

  • BurningnnTree@lemmy.one · 3 months ago

    This is a misleading headline. The survey question asked if you know any friends OR CLASSMATES who have generated AI nudes. If just one kid in a large school generates AI nudes and a lot of other kids find out about it, then those kids will answer yes on this survey. So this statistic doesn’t do a good job of indicating exactly how widespread this is.

    • Artyom@lemm.ee · 3 months ago

      The whole premise of the survey is absurd. Almost no one would actually admit to using it, so we’re multiplying a very small rate of admission by a very large rate of acquaintance and assuming the two cancel out, which is almost certainly not true.

      It’s all speculation; the only way to know how many people are using these things is through website visit analytics, which these companies would never give away (for free).

    • ocassionallyaduck@lemmy.world · 3 months ago

      Still, even assuming rumor-mill inflation drives this down by a factor of 10, there are probably a handful of kids doing this, which isn’t surprising, but is extremely upsetting.

  • Flying Squid@lemmy.world · 3 months ago

    “I’m so glad I’m not growing up in this age of smartphones, social media, and bullshit generators. Life was hell enough in the 90s without all that noise.”

    Turned 18 in 1995 here. I can’t say with absolute certainty that I wouldn’t have done the same thing as a horny teenager. Although I doubt I would have told anyone about it.

  • brucethemoose@lemmy.world · 3 months ago

    TBH kids need a new culture/attitude towards digital media, where they basically assume anything they consume on their phones is likely bogus. Like a whole new layer of critical thinking.

    I think it’s already shifting this direction with the popularity of private chats at the expense of “public” social media.

    • Pennomi@lemmy.world · 3 months ago

      Yep, at this point your real nudes could leak and you can just plausibly claim they’re faked.

      Something will change, but I’m not sure where society will decide to land on this topic.

          • leftzero@lemmynsfw.com · 3 months ago

            Birthmarks don’t seem like the kind of thing AI would generate (unless asked), though…

            (And, as model collapse sets in and generated images become more and more generic and average, things like birthmarks will become more and more unlikely…)

            • xmunk@sh.itjust.works · 3 months ago

              AI is quite unpredictable… it’s sort of only useful because of how random it is. But my point is that the knowledge is either public or private - there’s no situation where you can’t either deny it or attribute it to public knowledge.

    • kibiz0r · 3 months ago

      I kinda doubt anyone is getting “fooled” by these at this point, though that is a whole other layer of horrible hell in store for us…

      Right now, we’re dealing with the most basic questions:

      • Is it immoral (and/or should it be illegal) for people to be trading pornographic approximations of you?
      • Is it immoral (and/or should it be illegal) for people to privately make pornographic approximations of you?
      • Is it immoral (and/or should it be illegal) to distribute software which allows people to make pornographic approximations of others?
      • yeather@lemmy.ca · 3 months ago
        1. Illegal
        2. Immoral
        3. If built specifically for it? Illegal. If it’s a tool with proper safeguards that people find a way around to make them anyway? No.
  • 21Cabbage@lemmynsfw.com · 3 months ago

    Having myself been a horny little perv at that age that surprises me little. Hell, I bet those are low numbers.

    • cybermass@lemmy.ca · 3 months ago

      Definitely not. I know my 14-year-old, 4chan-dwelling ass would have known about this before 99.99% of the population.

      Don’t overestimate teenage innocence; this problem is very much in the public sphere. I doubt many teens don’t know about this technology already.

      • 21Cabbage@lemmynsfw.com · 3 months ago

        See, but us preteen 4channers make up a minority of the population. My little clan of nerds in a midwestern high school was easily less than 10% of the students.

        • yeather@lemmy.ca · 3 months ago

          It’s how info flows, though. Your small group of nerds knows, but one of those nerds is good buddies with a football player and tells him. Now they know, they create AI porn of a cheerleader, and now everyone knows.

    • fine_sandy_bottom@lemmy.federate.cc · 3 months ago

      It’s true that there may be some kids who hadn’t thought of it, but I don’t think that’s a good reason not to ask.

      I think 14 is a great age to be talking to kids about these issues. At the very least, they need to understand that nudity can be a big deal where minors are involved. There are charities that give talks about it in schools here. “Don’t share nudes of your friends because it’s a serious crime”.

      Kids absolutely need to know about this as soon as they’re old enough to have a few minutes alone with any electronic device.

      • Corvidae@lemmy.world · 3 months ago

        Puberty begins as early as 10 in girls and 11 in boys, so 14 may be a little late, at least for some kids.

        • fine_sandy_bottom@lemmy.federate.cc · 3 months ago

          Yeah, I just meant 14 is not too young rather than 14 is the correct age.

          I’m sure it will be very difficult but my intention with my kids is just to be as open and honest about these sorts of issues as reasonably possible, at any age. I don’t really see why sex and nudity and privacy and respect have to be awkward subjects.

          There was recently a “book burning”-type rally in my city where some residents were concerned that a book at the library made sex-related topics available to children. I just don’t get it.

  • Cocodapuf@lemmy.world · 3 months ago

    Who here has kids?

    Because the statement: “1 in 10 minors say their friends X” is true for any value of X.

    Besides, I had this friend at school whose stepsister is a model, she lives in Canada. Anyway, you’ll never believe what she did with my friend!

    • Doxin@pawb.social · 3 months ago

      But also, this is entirely something kids would do if given the chance. I’d not be surprised if it was a lot more than 1 in 10 either.

      Nonsense survey in any case.

    • Cocodapuf@lemmy.world · 3 months ago

      Right, so some kids are generating fakes, but the real things are already floating around.

      It seems to me that the discussion should really be with the kids about what kind of material or messages are appropriate.

      And no, I don’t expect that to be 100% successful, but I think it’s still the right approach.

    • MIDItheKID@lemmy.world · 3 months ago

      Pornography was doing a fine job of that already. We should not be desensitized to naked deepfakes of actual people who are actual minors.

  • HarriPotero@lemmy.world · 3 months ago

    When I was little we’d play this game called auts.

    The levels were basically just bitmaps with one colour designated as alpha.

    I scanned a picture of my math teacher just so we could fly around it and blow it up. Good times.

  • 11111one11111@lemmy.world · 3 months ago

    “I’m so glad I’m not growing up in this age of smartphones, social media, and bullshit generators.”

    Lol I’m so glad I’m not the journalist, or the journalist’s intern, who had to poll kids with this question lol. I know I’m being stupid but all I can picture is that Steve Buscemi meme with the skateboard, like, “Hey duuuudes, any of us kids making your Commodore 64s draw boobies for you?!”

    Back in 2005 when I was in school, if we saw any teacher talking to students about anything not school-related, there were rumors that teacher was fucking every student down that hall. In our defense tho, 3 out of the 9 science department teachers were relocated or fired for fucking girls from our class. Not in our defense tho, no one knew while we were still in school. We were just assholes.

  • schizo@forum.uncomfortable.business · 3 months ago

    Yeah, I too am glad that the worst I had to deal with were crudely drawn stick figure pictures implying things.

    Too bad AI is turning 10% (or more, I guess, but the article wasn’t entirely clear) of our kids into sex-offending felons.