Detroit woman sues city after being falsely arrested while pregnant due to facial recognition technology::A Detroit woman is suing the city and a police detective after she was falsely arrested because of facial recognition technology while she was eight months pregnant, according to court documents.

  • @SatanicNotMessianic@lemmy.ml
    121 points · 1 year ago

    According to a recent review, 100% of the people falsely arrested via facial recognition matches have been Black.

    The technology needs to be legally banned from law enforcement applications, because law enforcement is not making a good faith effort to use the technology.

    • @rockSlayer@lemmy.world
      44 points · 1 year ago

      We should ban patrol automation software too. They utilize historical arrest data to help automatically create patrol routes. Guess which neighborhoods have a history of disproportionate policing.
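      The feedback loop described here is easy to demonstrate. A toy simulation (all numbers hypothetical, not any vendor's actual algorithm): two neighborhoods with identical true offense rates, but patrols allocated in proportion to historical arrests, so the over-policed neighborhood keeps generating the arrest data that justifies patrolling it.

```python
import random

random.seed(0)

# Toy model: two neighborhoods with the SAME true offense rate,
# but neighborhood A starts with twice the historical arrests.
TRUE_OFFENSE_RATE = 0.05
PATROLS_PER_DAY = 100
arrests = {"A": 120, "B": 60}  # hypothetical historical arrest counts

for day in range(365):
    total = sum(arrests.values())
    for hood in arrests:
        # Patrols are allocated in proportion to past arrests.
        patrols = round(PATROLS_PER_DAY * arrests[hood] / total)
        # More patrols -> more offenses observed -> more arrests recorded.
        arrests[hood] += sum(
            1 for _ in range(patrols) if random.random() < TRUE_OFFENSE_RATE
        )

share_a = arrests["A"] / sum(arrests.values())
print(f"Neighborhood A's share of arrests after a year: {share_a:.0%}")
```

      The initial 2:1 skew never corrects itself: A keeps roughly its two-thirds share of arrests all year, despite identical underlying offense rates, because each allocation round reinforces the last.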

      • @SatanicNotMessianic@lemmy.ml
        17 points · 1 year ago

        The approaches that tend to get used are so flawed that they should be a cause of absolute outrage. They're ones that should get anyone proposing them laughed off of any college campus.

        The problem is that they lend a semblance of scientific justification to confirm the biases of both police departments and many voters. Politicians look to statisticians and scientists to tell them why they’re right, not why they’re wrong.

        That’s why it’s so important for these kinds of issues to make the front pages.

        • @brygphilomena@lemmy.world
          3 points · 1 year ago

          It’s great how statistics can be used to support basically anything the author wants them to. Identifying initial biases in the data is super important, as is independently verifying the statistics.
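          A classic way the same data "supports" opposite conclusions is Simpson's paradox: a trend that holds in every subgroup reverses in the aggregate. A toy illustration with made-up counts:

```python
# Made-up counts in the classic "kidney stone" pattern: (successes, trials).
data = {
    "group1": {"A": (81, 87), "B": (234, 270)},
    "group2": {"A": (192, 263), "B": (55, 80)},
}

def rate(successes, trials):
    return successes / trials

# Within each group, treatment A has the higher success rate...
for group, treatments in data.items():
    print(f"{group}: A {rate(*treatments['A']):.0%} vs B {rate(*treatments['B']):.0%}")

# ...but pooled across groups, B comes out ahead (Simpson's paradox).
overall = {
    t: rate(sum(data[g][t][0] for g in data), sum(data[g][t][1] for g in data))
    for t in ("A", "B")
}
print(f"overall: A {overall['A']:.0%} vs B {overall['B']:.0%}")
```

          Whether "A is better" or "B is better" is true depends entirely on whether you aggregate, which is exactly why checking how the groups were formed matters before trusting a headline statistic.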

        • phillaholic
          25 points · 1 year ago

          This is systemic bias; in this case, systemic racism.

          The outcome is that a product or service disproportionately targets Black people. It wasn’t designed to do that, so it’s not overt racism; it just worked out that way.

          Camera systems inherently have a harder time with dark skin. That’s a fact. However, it’s been found time and time again that these systems are predominantly created by, and tested on, light-skinned individuals. So the bias is built into the flawed creation. You can see this in Hollywood, where lighting has only recently been set up to properly highlight dark skin, in shows with majority-Black casts and showrunners like Atlanta and Insecure.
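          The "tested mostly on light-skinned individuals" failure mode is exactly what a disaggregated evaluation catches. A minimal sketch with hypothetical evaluation records: compute the false match rate separately per subgroup instead of one aggregate number.

```python
from collections import defaultdict

# Hypothetical evaluation records: (subgroup, predicted_match, true_match).
records = [
    ("light", True,  True),  ("light", False, False), ("light", True,  True),
    ("light", False, False), ("light", True,  True),  ("light", False, False),
    ("dark",  True,  False), ("dark",  True,  True),  ("dark",  True,  False),
    ("dark",  False, False), ("dark",  True,  False), ("dark",  True,  True),
]

# False match rate = false positives / all true non-matches, per subgroup.
fp = defaultdict(int)
negatives = defaultdict(int)
for group, predicted, actual in records:
    if not actual:
        negatives[group] += 1
        if predicted:
            fp[group] += 1

for group in sorted(negatives):
    print(f"{group}: false match rate {fp[group] / negatives[group]:.0%}")
```

          An aggregate accuracy figure over all twelve records would look respectable while completely hiding that every false match lands on one subgroup.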

                • phillaholic
                  4 points · 1 year ago

                  The outcome of the bad technology and policing is disproportionately affecting dark-skinned people. That’s where it becomes systemic racism. No one decided to design a system to arrest more Black people; the outcome of various factors just ended up that way. Sometimes it’s a consequence of nature, but most of the time there are clear reasons, like the lack of representation in design and testing that would have caught the problems earlier.

    • @fabian_drinks_milk@lemmy.fmhy.net
      5 points · 1 year ago

      A similar thing happened here in the Netherlands. Algorithms used to detect fraud had a discriminatory bias and falsely accused thousands of parents of child benefits fraud. Those parents ran into huge financial problems because they had to pay back the allowances; many even had their children taken away and to this day haven’t gotten them back.

      The Third Rutte Cabinet did resign over this scandal, but many of those politicians returned in other positions, including prime minister Rutte, because that’s somehow allowed.

      Wikipedia (English): https://en.m.wikipedia.org/wiki/Dutch_childcare_benefits_scandal

    • @frankblack@lemmy.world
      -9 points · 1 year ago

      Well, it does have its place. DoD and DHS have a human verify each match after the system certifies it. With that human check in place, mistakes like this do not occur. What needs to happen is for agencies to follow what DoD and CBP have created to verify so-called matches, reducing the impact on Black people.

      Source: I’m the former Identity Operations Manager for a major agency.

  • AutoTL;DR [bot]
    24 points · 1 year ago

    This is the best summary I could come up with:


    A Detroit woman is suing the city and a police detective after she was falsely arrested because of facial recognition technology while she was eight months pregnant, according to court documents.

    Porcha Woodruff, 32, was getting her two children ready for school on the morning of Feb. 16 when six police officers showed up at her doorstep and presented her with an arrest warrant alleging robbery and carjacking.

    “Ms. Woodruff later discovered that she was implicated as a suspect through a photo lineup shown to the victim of the robbery and carjacking, following an unreliable facial recognition match,” court documents say.

    When Oliver learned that a woman had returned the victim’s phone to the gas station, she ran facial technology on the video, which identified her as Woodruff, the lawsuit alleges.

    On the day Woodruff was arrested, she and her fiancé urged officers to check the warrant to confirm whether the woman who committed the crime was pregnant, which they refused to do, the lawsuit alleges.

    The office confirmed that facial recognition prompted police to include the plaintiff’s photo in a six-pack, or array of images of potential suspects in the warrant package.


    I’m a bot and I’m open source!

  • @Alexstarfire@lemmy.world
    22 points · 1 year ago

    I’m going to buck the trend here and say this is less about the facial recognition software. The police used an 8-year-old photo even though they had something more recent available. Then the victim identified the woman. The only thing the software did was put her in the lineup.

    I’m very much against facial recognition, even if it’s 100% accurate. It’s because it will get abused. Just like any other tech that reduces privacy.

    • pitninja
      14 points · 1 year ago

      Eyewitnesses are notoriously unreliable at picking people out of a lineup as well. But I can kind of understand how if two unreliable systems point to the same person, that could be seen as enough for an arrest. It shouldn’t have taken nearly as long for her to be cleared of any charges, however.
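      The intuition that "two unreliable systems agreeing" is strong evidence can be checked with Bayes’ theorem. A rough sketch, all numbers hypothetical: even if both the face match and the eyewitness pick are individually far better than chance, the posterior probability of guilt stays low when the pool of possible suspects is large — and that’s while generously treating the two signals as independent, even though the eyewitness picked from a lineup the software built.

```python
# All numbers are hypothetical, chosen only to illustrate the base-rate effect.
prior = 1 / 10_000        # chance a given local resident is the actual culprit
p_match_guilty = 0.99     # face match rate if she really were the culprit
p_match_innocent = 0.01   # false match rate against an innocent person
p_id_guilty = 0.8         # eyewitness picks the culprit from the lineup
p_id_innocent = 0.3       # eyewitness picks an innocent lineup member anyway

# Bayes' rule, optimistically treating the two signals as independent.
num = prior * p_match_guilty * p_id_guilty
den = num + (1 - prior) * p_match_innocent * p_id_innocent
posterior = num / den
print(f"P(guilty | face match and eyewitness ID) ≈ {posterior:.1%}")
```

      Even with these generous numbers the posterior lands in the low single digits (~2.6%); agreement between correlated signals adds far less certainty than it feels like it should.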

    • phillaholic
      8 points · 1 year ago

      It’s sort of the “guns don’t kill people, people kill people” argument. It just gives a shitty cop cover to keep being shitty. The tools should be improved to eliminate that cover unless it’s far more accurate.