The error was in the store submitting her picture to a watchlist by mistake. It wasn’t because the software misidentified her.
Automatic_Goal_5491 on
Having seen the detections provided by these systems it doesn’t surprise me. We would get data saying the same person was hundreds of miles apart within minutes. Either a lot of people have the same powers of flight as Superman, or detection from bad angles in poor light is not as good as these systems claim.
Fluffy-Sand-9470 on
You know, it might be controversial, but I’m actually starting to think this whole “the UK government will lead the way in implementing AI across all our systems” idea might not be the golden goose we’ve been told it is. In fact, I’m starting to think the whole thing might be closer to the emperor’s new clothes.
TheHess on
Any personal data a store holds on you that is incorrect and isn’t corrected when prompted is a breach of GDPR and can (and should) lead to substantial fines. I hope the ICO comes down hard in this situation.
plenihan on
Isn’t this a Black Mirror episode? It’s just a social credit system with a different name.
ohnondinmypants on
Wait till they find out how many times security guards stop people by mistake, or the police stop someone thinking they’re someone else who is wanted, or how many times CCTV operators confidently say they have a wanted person on camera when it turns out they don’t. Being asked to account for something isn’t the same as being dragged to the cells. Police, public, and, it turns out, facial recognition aren’t perfect. Most people just deal with it; a few will kick off and cause problems out of sheer “how dare they accuse me”.
I tried to fill my car up with petrol and a voice came over the forecourt speaker asking me to speak with staff. My car reg had pinged for making off without payment; they showed me my car on the footage but couldn’t tell me how much I had supposedly stolen. I genuinely wondered whether I was guilty and had accidentally driven off. I provided all my details and it turned out to be a mistake. I didn’t kick off or go crying to the press.
Emotional-Ebb8321 on
The only time a person’s face should be added to this kind of screening is either if they are actually convicted, or if they sign a statement agreeing to be added to the list (possibly in exchange for charges being dropped). Anything else is creating a presumption of guilt without a fair trial.
JackDaniels0049 on
I wish people would stop posting misleading headlines.
Fluffy-Sand-9470 on
Has anyone seen the AI Intellilink South Park episode where Mr. Mackey just has to buy the bronze package to make it work properly, and when it still has bugs the silver package upgrade will apparently solve it, then the gold package, then the platinum package, etc., eventually having to swallow his pride after losing the school millions?
Rare_Walk_4845 on
If I were in charge I’d tear all this shit down, as if anyone feels any safer with this shit on or off.
Flipmode45 on
Without being disparaging, I suspect that relying on shop staff paid minimum wage to decide whether someone in the store was shoplifting, and making them responsible for reporting that person’s image to FaceWatch, is a recipe for disaster. Seems ripe for abuse.