• Uriel238 [all pronouns]
    1 year ago

    Ever since we let law enforcement use facial recognition technology, they’ve been arresting people on false positives, sometimes detaining them for long periods of time.

    It’s not just camera problems and models being poorly trained on non-white faces; people actually do look too much alike, especially when the tech is run on blurry, low-res security footage.

    • Echo Dot
      6
      1 year ago

      I used to work in security camera monitoring, and I never understood why insurers would touch some of these companies even with an electrified cattle prod.

      They’d be pretty high-value companies with valuable assets on the premises that could be stolen: construction equipment, medical equipment, guns, cars, steel, copper, lead, etc. Yet without fail their security cameras would max out at 720p, have a giant spider web over the lens, and invariably sit on some wobbly pole that blew around in the wind, causing 300 false positives a minute. We literally used to switch those cameras off.

      Why don’t insurers insist on equipment that costs more than $4.50 from Walmart?

      The only cameras we worked with that were actually any good were the number plate recognition cameras, but they were specialist kit and absolutely useless for anything other than number plate recognition. But boy, did they get you that number plate.