- cross-posted to:
- gaming@beehaw.org
- ps5@lemmy.world
I see no possible way this could be abused.
Upload pictures of your kids. We pinky promise to delete them.
They now can verify everyone who’s slept with my mom
Doesn’t that mean your mom is a pedophile?
Can’t be. She’s dead.
She’s a dracula!?
I suggest everyone file an official comment with the FTC. This is beyond ridiculous. If anything is going to change, the ESRB needs to actually market the ratings properly, because all the problems come from parents who just buy games without any research and then complain, having never read what the M rating on the game means.
I feel like this could be bypassed by just photographing a parent and submitting their image, not to mention how fucking horrifying the whole “database of minors’ faces” thing is, even with the claim that they will delete all the images.
We saw how well deleting pictures worked with the TSA and their body scanners…
deleted by creator
How about parents parent their kids instead.
That would mean reading the M on the game
M stands for … My baby boy
That is becoming less and less popular by the year. Parenting the kids is someone else’s responsibility, but god help them if they have a different opinion than the parents.
This is profoundly stupid
30-year-old Asians will get fucked over by this, I’ll bet.
30-year-old Asian here.
Yeah…
Drink verification can to continue.
Oh Lordy, can you imagine if they made a combo system of having your console do a facial scan whilst drinking a verification can?
Apple had to go through a tremendous amount of engineering effort, custom hardware development, and testing to get FaceID to work reliably. This sounds like a software-based system that uses a user’s camera. I’m also not exactly certain what these steps mean without making a lot of assumptions.
> The user takes a photo of themselves
Okay? MS developed an age estimator a decade or so ago. They put up a guess-my-age website that you could upload photographs to. It was relatively accurate (at least in my testing) in that it scored consistently and was usually within a few years of the right answer. I’m sure they’re better now.
> The system then checks if there’s a live human face in the frame
So it’s looking at a live video? What’s the picture for, then? Does it confirm the picture is the same person as in the video, in which case why would someone have to upload a picture?
> The image is then uploaded to Yoti’s backend server for estimation
Again, fine. Privacy concerns aside, that’s what we’d expect.
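For what it’s worth, those three quoted steps only describe a pretty thin client-side flow. Here’s a minimal sketch of what that flow might look like, purely as an illustration: the endpoint URL, the response field name (`estimated_age`), and the camera/liveness stubs are all my own assumptions, not anything published by Yoti or the ESRB proposal.

```python
# Hypothetical sketch only: the real API, endpoint, and response fields are
# assumptions, not Yoti's actual interface.
import requests  # pip install requests

# Placeholder endpoint; the proposal doesn't name the real backend URL.
ESTIMATION_ENDPOINT = "https://example.invalid/v1/age/estimate"


def capture_selfie() -> bytes:
    """Step 1: 'the user takes a photo of themselves'.
    Stubbed out because it depends entirely on the device's camera API."""
    raise NotImplementedError("camera capture depends on the device")


def has_live_face(image_jpeg: bytes) -> bool:
    """Step 2: 'checks if there's a live human face in the frame'.
    The proposal doesn't say how liveness is determined or where this
    check runs, so it's a stub here too."""
    raise NotImplementedError("liveness detection is unspecified")


def estimate_age(image_jpeg: bytes) -> float:
    """Step 3: upload the image to the estimation backend and read back
    an estimated age. The response field name is an assumption."""
    resp = requests.post(
        ESTIMATION_ENDPOINT,
        files={"image": ("selfie.jpg", image_jpeg, "image/jpeg")},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["estimated_age"]


def passes_age_gate(minimum_age: int = 17) -> bool:
    """Glue the three quoted steps together into a single check."""
    selfie = capture_selfie()
    if not has_live_face(selfie):
        return False
    return estimate_age(selfie) >= minimum_age
```

Which is sort of the point: everything that actually matters, like how liveness is checked, what the estimator does with the image, and whether it really gets deleted, happens behind that one upload, completely out of the user’s sight.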
But out of gaming devices, how many have a video camera? Obviously phones and tablets do, as do most laptops, but neither my Switch nor my Steam Deck has one, nor does my recently retired gaming PC. Am I supposed to use my phone even if I’m playing on another device? Am I going to have to periodically re-authenticate?
I’m not even talking about spoofing here, which, given the ubiquity of filters for phone cameras, would be trivial.
This strikes me as someone’s project that was sold to management as a good idea and which now needs to find an application.
I’m also going to make the very safe assumption that, despite their claims, real-world performance across ethnicities is not going to match the confident statements in their application. That’s been a pretty constant issue with this sort of system; the same claims get made about police facial recognition databases, despite being repeatedly proven wrong.
The proposal also said, “To the extent that there is any risk, it is easily outweighed by the benefits to consumers and businesses of using this [Facial Age Estimation] method.”
I really, really want to know what the actual harm being prevented is supposed to be, such that it outweighs any other concerns. I don’t mean that some ten-year-old might play Cyberpunk. I mean actual research showing a quantified harm associated with it, along with the harm reduction realized by implementing a parental-permission-based age verification system.
Time to put these cameras in movie theaters. Can’t be watching R-rated films if under 17. Time for cameras on e-readers, can’t be reading inappropriate material without regulatory consent. Time to take a photo of yourself, the new Travis Scott album just dropped, but headphones only there, buddy. Can’t have anyone else listening to it unless they authenticate.
The “harm”? Litigation $$$ paid to parents of 10-year-olds playing Cyberpunk; that’s all it is, really. ESRB covering their butts, because they know the rating system is, and always has been, as useless as the parental advisory stickers on CD cases. Parents don’t know it exists, retailers don’t care as long as they get their money, and devs/publishers only care about it as something to point to in order to avoid censorship, all while blatantly marketing games like Cyberpunk to 10-year-olds. They know parents will buy the game if the kid nags them enough, or the kids will buy it themselves from some teenager working at GameStop who’s getting paid too little to care, or they’ll just lie about their age on Steam and use their parents’ credit card. This is just about the worst way to cover their butts, though.
Because we need more ways to exclude young people…
No.
I can’t speak for this specifically, but these face recognition systems have a history of bias in favor of white people.
Do you want masks to stay around forever?? Cuz that’s how you get masks to stay around forever