The ESRB has added:

“To be perfectly clear: Any images and data used for this process are never stored, used for AI training, used for marketing, or shared with anyone; the only piece of information that is communicated to the company requesting VPC is a “Yes” or “No” determination as to whether the person is over the age of 25.”

Sure, ok…

I don’t know what else to say about this; it will obviously turn into something else.

      • SatanicNotMessianic@lemmy.ml · 1 year ago

        From the description, it sounds like you upload a picture, then show a face to a video camera. It’s not like they’re going through FaceID, which has anti-spoofing hardware and software. If they’re supporting normal webcams, they can’t check for things like 3D markers.

        Based on applications that have rolled out for use cases like police identifying suspects, I would hazard a guess that

        1. It’s not going to work as well as they imply
        2. It’s going to perform comically badly in a multi-ethnic real world scenario with unfortunate headlines following
        3. It will be spoofable.

        I’m betting this will turn out to be a massive waste of resources, but that never stopped something from being adopted. Even the police had to be banned from using facial recognition by several municipalities, because they liked being able to identify and catch suspects even when it was likely to be the wrong person. In one scenario I read about, researchers had to demonstrate that the software the PD was using identified several prominent local politicians as robbery and murder suspects.
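
        A minimal sketch of the naive flow the comment above describes (compare an uploaded photo against a single webcam frame), assuming a generic face-matching library (face_recognition) and made-up file names. This is not the actual ESRB pipeline, just an illustration of why point 3 is plausible:

        ```python
        # Naive photo-vs-webcam match. Library choice and file names are
        # illustrative assumptions, not the real implementation.
        import cv2
        import face_recognition

        reference = face_recognition.load_image_file("uploaded_photo.jpg")
        ref_encoding = face_recognition.face_encodings(reference)[0]

        cap = cv2.VideoCapture(0)                     # any ordinary webcam
        ok, frame = cap.read()
        cap.release()

        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)  # OpenCV captures BGR
        live = face_recognition.face_encodings(rgb)

        # With no depth sensor or motion check, a flat printout of the same
        # face yields essentially the same encoding, so a "match" proves
        # similarity, not liveness.
        match = bool(live) and face_recognition.compare_faces([ref_encoding], live[0])[0]
        print("verified" if match else "not verified")
        ```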

      • Doug [he/him] · 1 year ago

        Except all the times it has.

        When there’s only one camera, that’s often been the way it has worked.

          • Doug [he/him] · 1 year ago

            So it does, in at least some cases, work like that?

            It’s ok to admit being wrong

            • Shikadi@lemmy.sdf.org · 1 year ago

              Yes, it does work like that in some cases. My comment is technically wrong, the best kind of wrong.

              As another commenter pointed out, the way they intend to do it absolutely sounds like they are going to do it the old way, which surprises me, because the hold-up-a-photo trick has been a solved problem for a while.
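
              For reference, the usual low-tech counter to the hold-up-a-photo trick is a liveness challenge, e.g. requiring at least one blink over a few seconds of video. A rough sketch using the eye aspect ratio of detected facial landmarks; the library choice and thresholds are assumptions, not a description of what this system actually does:

              ```python
              # Rough blink-based liveness sketch: a printed photo never blinks.
              # Eye aspect ratio (EAR) drops sharply when the eyelid closes.
              import cv2
              import face_recognition
              from scipy.spatial import distance

              def ear(eye):  # eye = the 6 landmark points of one eye
                  a = distance.euclidean(eye[1], eye[5])
                  b = distance.euclidean(eye[2], eye[4])
                  c = distance.euclidean(eye[0], eye[3])
                  return (a + b) / (2.0 * c)

              cap, blinks, closed = cv2.VideoCapture(0), 0, 0
              for _ in range(150):                     # a few seconds of frames
                  ok, frame = cap.read()
                  if not ok:
                      break
                  marks = face_recognition.face_landmarks(
                      cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
                  if not marks:
                      continue
                  e = (ear(marks[0]["left_eye"]) + ear(marks[0]["right_eye"])) / 2
                  if e < 0.21:                         # assumed "eye closed" threshold
                      closed += 1
                  else:
                      if closed >= 2:                  # closed then reopened = blink
                          blinks += 1
                      closed = 0
              cap.release()
              print("live" if blinks else "possible photo")
              ```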

      • Umbrias@beehaw.org · 1 year ago

        Plaster sculpt, then add fake skin to it, add a small linear actuator for breathing simulation and small twitch motors under the skin, and run it under some alternating LEDs to simulate blood flow coloration. Should defeat almost all facial recognition software. Might need some fake eyes.

        Or just wear makeup to an insane degree. Or return to the forests and live a much happier life.

      • Radium@sh.itjust.works · 1 year ago

        I would bet every dollar I’ve ever made that you know absolutely nothing about how it works. You seem like someone who is barely technically proficient and likes to pretend that means they know how things work.

        I’m a software engineer and can confirm that you are absolutely fucking wrong on this one.

        • Shikadi@lemmy.sdf.org · 1 year ago

          I’m a software engineer and I work in machine vision hardware. I may have been lazy with my response, but I do know what I’m talking about. On some level I’m probably in a bubble, because I work close enough to the cutting edge that I wouldn’t expect any modern company to be applying such basic algorithms to a solved problem.

          I’m a software engineer and I can confirm that you are absolutely fucking rude on this one.

  • Jeena@jemmy.jeena.net · 1 year ago

    I don’t think your face looks much different on the day before your 18th/25th birthday than it does on the birthday itself.
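
    That is presumably why the cutoff in the ESRB statement is 25 rather than 18: an estimator that can be off by a few years needs a safety margin above the age that actually matters. A toy sketch of that kind of buffered decision; the required age and margin here are assumptions, not ESRB parameters:

    ```python
    # Hypothetical buffered age gate: only estimates comfortably above the
    # real requirement get an automatic "Yes". All numbers are assumptions.
    REQUIRED_AGE = 18    # assumed age the consent actually requires
    SAFETY_MARGIN = 7    # assumed worst-case estimation error, in years

    def vpc_decision(estimated_age: float) -> str:
        """Return only the Yes/No that the ESRB statement says gets shared."""
        if estimated_age >= REQUIRED_AGE + SAFETY_MARGIN:  # i.e. the 25+ cutoff
            return "Yes"
        return "No"  # in practice this would fall back to another method

    print(vpc_decision(31.0))  # Yes
    print(vpc_decision(18.1))  # No: faces near the real threshold all land here
    ```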

  • mindbleach@lemmy.world · 1 year ago

    Drink verification can.

    Any images and data used for this process are never stored

    Anyone who believes this deserves it.

    • Jamie@jamie.moe · 1 year ago

      Since it determines if you’re over the age of 25, maybe instead they could get a more accurate measure by having you drink a verification beer.

  • PlatypusXray@feddit.de · 1 year ago

    Can anybody actually remember voting for totalitarian control freaks who seem to be scared of people who are not under constant surveillance?

      • exohuman@kbin.social · 1 year ago

        While true, unfortunately the latest government spy bill is bipartisan. It will make end-to-end encryption for texts and chat illegal, using drug enforcement as the excuse.

    • reversebananimals@lemmy.world · 1 year ago

      Nearly a quarter of Americans say that a strong leader who doesn’t have to bother with Congress or elections would be “fairly” or “very good” and 18 percent say that “army rule” would be “fairly” or “very good.” More than a quarter of respondents show at least some support for either a “strong leader” or “army rule.”

      https://www.voterstudygroup.org/publication/follow-the-leader

      A disturbing minority of human beings unironically prefer being under a boot.

  • RagingNerdoholic@lemmy.ca · 1 year ago

    we won’t ever ever keep your pictures and stuff for the juiciest possible marketing fodder, we super duper pinky swear

  • Mozami@kbin.social · 1 year ago

    To be perfectly clear: Any images and data used for this process are never stored, used for AI training, used for marketing, or shared with anyone; the only piece of information that is communicated to the company requesting VPC is a “Yes” or “No” determination as to whether the person is over the age of 25.

    I’d have a hard time coming up with a better lie than this.

  • rickrolled767@ttrpg.network · 1 year ago

    Can people stop trying to throw tech at things where it clearly doesn’t belong? Seems like every time I turn around, people are trying to use AI for things with the expectation that it’s some flawless innovation that can do no wrong.

    And that’s not even getting into the privacy nightmare that comes with things like this

  • liara@lemm.ee · 1 year ago

    Because this strategy worked so well for determining individuals’ assigned sex at birth. What could possibly go wrong?

    • Dojan@lemmy.world · 1 year ago

      It already has gone wrong.

      There’s a story about a gay couple here in Sweden. One of the men lived with his mother.

      One morning, around 3-4AM I think, a group of masked men went into his apartment and woke him up violently. They physically abused him, before they took him away.

      Eventually he was taken to an interrogation room where he was questioned about a child he had supposedly sexually assaulted.

      At some point they showed him pictures of him and this purported child, only said child was his very much adult, twink-ass boyfriend.

      He and his boyfriend had shared the images with one another over a chat service, like Kik or something, which some American organisation had gotten their hands on, and then forwarded to Swedish police.

      Swedish police then swatted him, and when they stood there with egg on their face, the investigation was dropped. No repercussions for the police. None of the people who brutally assaulted the man got any sort of punishment, because he wasn’t able to identify any of them, since they were masked and he shockingly didn’t have X-ray vision, and the police had magically lost all records of who they sent out to bring him in.

      Thinking back on this still fills me with rage. I’ve always thought our police were fairly chill and approachable, nothing like the gun toting cowards in the US, but no. It seems like ACAB holds true everywhere.

    • TiredSpider@slrpnk.net · 1 year ago

      Saw an app try exactly this. It was run by TERFs, and they wanted to lock out anyone who wasn’t a cis woman. Instead it labelled almost every black woman a man, and many trans women got through the filter anyway.

  • SokathHisEyesOpen@lemmy.ml · 1 year ago

    To be perfectly clear: Any images and data used for this process are never stored, used for AI training, used for marketing, or shared with anyone

    Does anyone have some bridges for sale? I suddenly feel an urge to buy a bridge.

  • coffeeguy@lemmy.world · 1 year ago

    Pay: ESRB facial recognition + Denuvo system monitor + custom launcher with system privileges + game

    Pirate: game

    This type of stuff only punishes paying customers.