I don’t think you understand how it works.
So the process is simple: a crime happens, the cops are called, and they collect whatever surveillance video they can get (or maybe it was a cell phone camera, or something). They upload it to the AI software, which then runs it against government databases (probably driver’s license photos) and maybe images from social media.
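To be concrete, that matching step usually boils down to “turn faces into number vectors, then rank the database by similarity.” Here’s a toy sketch in Python — every name, embedding, and number is made up for illustration, and this is not any vendor’s actual API:

```python
import math

def cosine_similarity(a, b):
    # Standard cosine similarity between two embedding vectors
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def top_matches(probe, gallery, k=3):
    # gallery: {record_id: embedding}; returns the best-scoring records first
    scored = [(cosine_similarity(probe, emb), rid) for rid, emb in gallery.items()]
    scored.sort(reverse=True)
    return scored[:k]

# Hypothetical "driver's license" gallery and a probe from uploaded footage
gallery = {
    "GA-license-1234": [0.9, 0.1, 0.2],
    "LA-license-5678": [0.2, 0.8, 0.1],
}
probe = [0.85, 0.15, 0.25]
print(top_matches(probe, gallery))
```

The catch is that this always returns *something* — the top of a ranked list, not a confirmed identity — which is exactly where the rest of this thread’s problems come in.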
It was the initial identification that was off, and it should have been entirely avoidable here: the guy lived in Georgia, and the warrant issued in Louisiana. Or, you know, three states away. It disproportionately impacts Black people because the training data was mostly white faces.
Checking to see that they match the provided identification doesn’t work because they never had a reliable ID to begin with. Chances are, any video or photos uploaded to get an ID are going to be dogshit. Security camera DVRs are usually set up to hold about 30 days of footage, which requires either a shitload of storage or a whole lot of compression. And that assumes the cameras are even capable of 4K, or 1080p, or 720p. And then you’ve got problems like dust and dirt obscuring the camera (especially if it’s outside), bad lighting, or the whole thing happening too far away.
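The storage-vs-compression tradeoff is easy to put numbers on. A quick back-of-envelope in Python — bitrates and camera counts are illustrative assumptions, not specs for any particular DVR:

```python
def storage_gb(bitrate_mbps, days, cameras):
    # Total footage size: bitrate (megabits/sec) * seconds * camera count
    seconds = days * 24 * 3600
    bits = bitrate_mbps * 1_000_000 * seconds * cameras
    return bits / 8 / 1e9  # convert bits -> gigabytes

# 8 cameras held for 30 days:
print(round(storage_gb(4.0, 30, 8)))  # decent 1080p bitrate -> ~10 TB of disk
print(round(storage_gb(0.5, 30, 8)))  # heavily compressed -> ~1.3 TB, footage looks like mud
```

So a small business either buys a 10 TB-class recorder or crushes the stream down until faces stop being recognizable — which is why “just check the footage” rarely saves you.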
Most times, you’re really only going to get a general description off security cameras: “white male with brown hair, scruffy beard, about five-six.” And that’s on a good system. Shitty systems set up in the 80s? That’s going to be more like… “it was a person.”
I don’t understand why an AI match can’t at least be vetted by an actual person more or less on the spot!
Like
Hey, the AI matched you as this criminal individual. Are you this person?
gives ID
Oh, you’re not that person. Sorry, our bad.
It’s still not great but much better than this.
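The “vet it on the spot” idea above can be sketched as a trivial workflow rule: treat an AI hit as a lead, never as probable cause, and require a human to corroborate it against the ID the person actually presents. This is a purely hypothetical sketch of that policy, not how any agency operates:

```python
def vet_match(ai_match_id, presented_id):
    # Human-in-the-loop check: an AI hit alone never justifies an arrest;
    # it has to be corroborated against the ID the person presents.
    if ai_match_id == presented_id:
        return "hold: AI match corroborated, investigate further"
    return "release: AI match not corroborated by presented ID"

# The scenario from this thread: warrant subject in Louisiana,
# the person stopped is a Georgia resident with Georgia ID.
print(vet_match("LA-warrant-subject", "GA-resident"))
```

Even this toy version would have caught the Georgia/Louisiana mismatch — which is the whole point of the comment above.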
Have you seen cops? They’ll use any excuse to lock someone up, they’re not going to ‘take the risk of letting a suspect go’.
Yeah I am assuming a world where cops don’t arrest black people because they feel like it lol
Because the entire point of AI is to remove human labor from the equation.