- The University of Waterloo is expected to remove smart vending machines from its campus.
- A student discovered an error code that suggested the machines used facial-recognition technology.
- Vending Services said the technology didn’t take or store customers’ photos.
You gotta read through the carefully worded, line-toeing bullshit though; fingerprint readers on timeclocks no longer store fingerprint images, but they can sure as fuck still identify your finger. If this operates similarly, it likely stores your face as a unique value and can track your individual purchases. Importantly, that value can be shared with other machines to identify you anywhere, and if the same technology is used in other machines, well shoot, any purchase you make at a vending machine is tracked to your face.
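To make the "it's not a photo, it's a template" distinction concrete, here's a purely hypothetical sketch of how that could work; none of these names, thresholds, or functions come from the vendor, and `embed_face` is just a stub. The point is that the system keeps only a numeric vector and an opaque ID, never an image, and yet every purchase still links back to the same ID across machines.

```python
# Hypothetical sketch: store a face *template* (a numeric vector), not an
# image, and match new faces against saved templates. Nothing here is the
# vendor's actual code; embed_face() is a stand-in for a real model.

import numpy as np
from dataclasses import dataclass, field
from datetime import datetime, timezone

MATCH_THRESHOLD = 0.92  # "same person" cutoff for cosine similarity (made-up value)


def embed_face(frame: np.ndarray) -> np.ndarray:
    """Stand-in for a real face-embedding model. Returns a unit-length
    feature vector; the raw camera frame is discarded afterwards."""
    vec = np.resize(frame.astype(np.float64).ravel(), 128)
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec


@dataclass
class TemplateStore:
    """Keeps only vectors plus opaque IDs -- technically 'no photos stored'."""
    templates: list = field(default_factory=list)
    ids: list = field(default_factory=list)
    next_id: int = 1

    def match_or_enroll(self, embedding: np.ndarray) -> int:
        for tid, tmpl in zip(self.ids, self.templates):
            if float(np.dot(embedding, tmpl)) >= MATCH_THRESHOLD:
                return tid                      # seen this face before
        self.templates.append(embedding)        # new face -> new opaque ID
        self.ids.append(self.next_id)
        self.next_id += 1
        return self.ids[-1]


purchase_log: list = []


def record_purchase(store: TemplateStore, machine_id: str,
                    frame: np.ndarray, item: str) -> None:
    customer = store.match_or_enroll(embed_face(frame))
    purchase_log.append({
        "machine": machine_id,
        "customer": customer,                   # opaque ID, yet fully linkable
        "item": item,
        "ts": datetime.now(timezone.utc).isoformat(),
    })


if __name__ == "__main__":
    store = TemplateStore()                     # share this store across machines...
    frame = np.random.rand(64, 64)
    record_purchase(store, "SNACK-01", frame, "chips")
    record_purchase(store, "SNACK-07", frame, "cola")
    print(purchase_log)                         # ...and the same "customer" shows up at both
```

If the template store is ever shared between operators or handed to a broker, that "anonymous" ID is all you need to join purchase histories together.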
The statement says individual purchases cannot be tracked to the individual, but that could be wordsmithing that implies anonymity while in actuality you're tracked by a value. Also, if that were true, why was this done so surreptitiously? They'll claim in court that an individual can't be identified, but of course once the technology is commonplace it would be only too easy for corps to join a data broker orgy and find out who's who.
I don't know what's in the GDPR related to the above, but cameras and storing pictures are old hat. Could be the carefully worded shit is because it's not technically a picture of you, and also because they'll delete it upon request.
So facial recognition in this case means it can recognize that a face exists? No particular details, just that there's a face? That's a lot less egregious than I assumed from the headline. With all the AI stuff going on these days, I assumed it was some kind of data mining operation.
But then, why? Is there a problem with birds and dogs making purchases at these machines that they need to identify a face?
It does also estimate age and gender, so there’s some potential for data mining. But not much.
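For what it's worth, "not much" data mining would probably look like aggregate demographic tallies rather than per-person profiles; a hypothetical sketch (the estimator here is a stub, not anything the vendor has published):

```python
# Hypothetical sketch of limited demographic data mining: no identities,
# just aggregate counts of estimated age bracket and gender per machine.

from collections import Counter
import random


def estimate_age_gender(frame) -> tuple:
    """Stand-in for an on-device age/gender estimator."""
    return random.choice(["18-25", "26-40", "41+"]), random.choice(["M", "F", "?"])


tally = Counter()
for _ in range(1000):                       # simulate a month of purchases
    age, gender = estimate_age_gender(frame=None)
    tally[("SNACK-01", age, gender)] += 1   # counts only, no per-person record

print(tally.most_common(5))                 # e.g. which demographics buy the most
```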
Yeah, but I wonder how far they'd push their next-generation versions…