- cross-posted to:
- technology@lemmy.world
Retailers increasingly are using facial recognition software to patrol their stores for shoplifters and other unwanted customers. But the technology’s accuracy is highly dependent on technical factors — the cameras’ video quality, a store’s lighting, the size of its face database — and a mismatch can lead to dangerous results.
Wild that you can base a whole case on what a photo AI thinks it is seeing. At the very least these programs should work like DNA or fingerprint matching and report a confidence percentage, not just pull up some kinda-close image from their database that everyone rolls with. And there should be some other piece of evidence to back it up; it should never be the "best" part of a prosecutor's case.
To add onto this, here’s a story about someone whose car was stolen (and they could prove it) who still lost their initial objection to a red light camera charge.
The charge only disappeared once the news got involved. https://abc7chicago.com/chicago-red-light-ticket-camera-illinois-car-stolen-theft/11677595/
With most digital forensic tools, that's exactly what they do: there's a specific threshold that yields a match probability. It's designed to point investigators in a direction, not to confirm identity.
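A minimal sketch of what that threshold-based matching looks like, assuming faces have already been reduced to fixed-length embedding vectors by some upstream model; the function names and the 0.6 cutoff here are illustrative, not any specific vendor's API:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity in [-1, 1]; higher means the embeddings are more alike."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def candidate_matches(probe: np.ndarray, database: dict,
                      threshold: float = 0.6) -> list:
    """Return every enrolled identity whose similarity clears the threshold,
    with its score, rather than a single yes/no 'match'."""
    scored = [(name, cosine_similarity(probe, emb)) for name, emb in database.items()]
    hits = [(name, score) for name, score in scored if score >= threshold]
    return sorted(hits, key=lambda pair: pair[1], reverse=True)

# Example: the output is a ranked list of leads, not a confirmed identity.
rng = np.random.default_rng(0)
db = {f"person_{i}": rng.normal(size=128) for i in range(5)}
probe = db["person_2"] + rng.normal(scale=0.1, size=128)  # noisy camera capture
print(candidate_matches(probe, db))
```

The point of the score-plus-threshold design is exactly what the comment above describes: it narrows the search to candidates, and a human still has to corroborate before anyone is named.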
I can totally see cops using this as probable cause, but it should get laughed out of a courtroom.
Should, not would. Get a backwards ass judge and it’ll fly. Your life is already fucked by the time you appeal it.