• JoBo@feddit.uk

    “Should the Cruise car have not started moving if there was a person still on the crosswalk? This whole sad affair raises many questions.”

    There are some questions, but “should cars start moving while a person is still on the crosswalk?” is surely not one of them.

    • medgremlin@lemmy.sdf.org

      A different question I have is whether these cars have transponders or other communication devices to automatically call emergency services in case of an accident. I’m assuming not, because they would probably generate a lot of junk calls, and I doubt the company spent the time to create an algorithm for when to call 911 if they didn’t create one for what to do when there’s a pedestrian in a crosswalk.

      That’s one of the big downsides of these driverless cars: if a human driver accidentally ran over the victim, they could get out of the car to assess the situation, call 911, and offer aid. An empty car can only sit there with its hazard lights on and maybe call for emergency services.

    • halcyoncmdr@lemmy.world

      A person lying on the ground in a crosswalk is likely something the team never considered including in their training data. Those outlier situations are exactly why real-world data is needed, and the only way to properly train for most of them is to drive in the real world. The real world isn’t perfect conditions and nice lines on fresh asphalt, so while base training in ideal situations is useful, the system will still miss that same situation in a real-world environment with crappy infrastructure.

      I’m not sure what data Cruise collects in real time or how it’s used, but I can see the camera pipeline categorizing a person lying in a crosswalk as something like damaged paint lines or small debris that can be ignored. Other sensors like radar and lidar might have categorized their returns as echoes or false positives that could be ignored, again because a person lying in a crosswalk is extremely unlikely. False returns happen all the time with radar and lidar; millions of data points get discarded as noise or outliers, and sometimes that categorization is wrong.
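
      As a rough illustration of that failure mode (the class names, threshold, and logic here are entirely hypothetical, not anything from Cruise’s actual stack), a naive confidence-threshold filter can silently drop exactly the detection that matters:

      ```python
      # Purely illustrative sketch of a naive confidence-threshold filter.
      # Class names, the threshold, and the labels are hypothetical; this is not Cruise's pipeline.
      from dataclasses import dataclass

      @dataclass
      class Detection:
          label: str         # e.g. "lane_marking", "road_debris", "pedestrian_lying_down"
          confidence: float  # classifier confidence in [0, 1]

      CONFIDENCE_THRESHOLD = 0.6  # assumed cutoff; anything below is treated as noise

      def filter_detections(detections: list[Detection]) -> list[Detection]:
          """Keep only detections the classifier is confident about."""
          return [d for d in detections if d.confidence >= CONFIDENCE_THRESHOLD]

      # A person lying flat is an unusual silhouette, so a classifier might score it low,
      # and this kind of filter would then discard it along with genuine noise.
      frame = [
          Detection("lane_marking", 0.95),
          Detection("pedestrian_lying_down", 0.40),  # below the threshold: dropped
      ]
      print(filter_detections(frame))  # only the lane marking survives
      ```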

        • Duranie@lemmy.film

          Not only that, but regardless of whether it can identify a person as a person, cars shouldn’t be driving over objects that are child-sized or larger.

      • JoBo@feddit.uk

        “A person lying on the ground in a crosswalk is likely something the team never considered including in their training data”

        I didn’t bother reading any further than this. The person was on the crosswalk when both cars started moving. Neither car should have been moving while anyone was still on the crosswalk.

        • MagicShel@programming.dev

          That was the exact moment I called bullshit as well. You’d damn well better plan for people tripping and falling. It happens all the time, and it’s generally pretty minor unless it’s made worse by being run over. This is like saying they didn’t train it on people holding canes or in wheelchairs.

          • JoBo@feddit.uk

            It’s not about the ability to recognise someone lying in the road (although they obviously do need to be able to recognise something like that).

            She was still walking, upright, on the crosswalk when both cars started moving. No car, driverless or otherwise, should be moving forward just because the lights changed.

      • Nurchu@programming.dev

        I actually work at one of these AV companies. We definitely have training data on adults and children lying down. I’d be very, very surprised if Cruise doesn’t, given all the people lying down on the sidewalks in SF. In addition, the lidar/camera data on objects in the road is very clear: you can see the dips and potholes in the road as well as the raised profile of the painted lines. There’s no way they weren’t tracking the person.

        I could see the prediction on the pedestrian saying the coast is clear. But once the initial crash happens, there likely isn’t enough room to stop in time, even with maximum braking.
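
        As a back-of-the-envelope check (the speed, reaction latency, and deceleration below are assumed illustrative values, not Cruise specifications), even a modest urban speed leaves a stopping distance far longer than the gap left when someone lands right in front of the bumper:

        ```python
        # Rough stopping-distance estimate; every number here is assumed, not a Cruise spec.
        speed_mph = 20.0              # assumed urban speed
        speed = speed_mph * 0.44704   # convert to metres per second (~8.9 m/s)
        reaction_s = 0.5              # assumed detection-to-brake latency in seconds
        max_decel = 8.0               # m/s^2, roughly a hard emergency stop on dry asphalt

        stopping_distance = speed * reaction_s + speed**2 / (2 * max_decel)
        print(f"{stopping_distance:.1f} m")  # ~9.5 m from the moment braking is triggered
        ```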

  • SatanicNotMessianic@lemmy.ml

    Oh, this company is not doing itself any favors.

    There needs to be full transparency on these fleets rather than having governments bend over backwards in the name of trade secrets. We’ve gone absolutely too far in that direction with everything from vehicles on our streets to fracking chemicals in our groundwater.

  • just_change_it@lemmy.world

    We desperately need footage of this to draw any conclusions.

    “The human driver hit her first, and knocked her into the neighboring lane…”

    So she was hit and flung into another lane…

    “…directly in front of a Cruise autonomous vehicle (AV) that was driving around by itself with no-one on board. The self-driving car then ran her over and came to a stop on top of her body, turning on its hazard lights. Her leg was pinned down by the back tire.”

    So it stopped. Like others have mentioned, driving over someone who is under a car can cause more injury than not moving. Was she screaming to move forward or back? Was she flung in a way that a human driver could have stopped in time?

    If this had been just a hit and run, with no footage like what the AV provided, the victim or her family could have been 100% on the hook for the medical bills. It will be interesting to see if the perpetrator is found and the footage surfaces with details, so we can get some answers.

    edit: From another article

    “The initial impact was severe and launched the pedestrian directly in front of the AV.”

    • Salamendacious@lemmy.world

      I’m hoping the car has multiple cameras recording, so maybe the hit-and-run driver will be caught and prosecuted. That said, unless I misunderstood the article, I don’t think the driverless car (DC) did a horrible job here. It sounded like the victim was struck and flung in front of the DC, and it stopped (unfortunately on top of her). I don’t know if I could have reacted better. The article wasn’t clear, but it read like the car contacted the police and the police instructed it to remain where it was, which is what I would have done if I were driving. The DC was then lifted off the woman by emergency personnel.

      We can’t expect DCs to be magically perfect, just like we don’t expect people to be perfect. A DC is only as good as its programming. Hopefully this incident will be studied, and if a better solution is found, it can be integrated into DC operations. I really feel bad for the woman here. I don’t know, but even if she shouldn’t have been walking there, no one deserves that. Let’s hope the hit-and-run driver is caught.

    • FuglyDuck@lemmy.world

      “I didn’t cause those injuries, it was the driverless car” might actually work here…

      maybe. eh. the car company probably has better lawyers.

      • cmbabul@lemmy.world

        Yeah, but the focus will be on the driverless car. I’m not saying they’re blameless, just that the general heat and attention probably won’t be on them now.

        • FuglyDuck@lemmy.world

          I’m not sure that is a problem.

          Autonomous vehicles are still mostly half-baked, and the question of liability, of who gets the blame, hasn’t even preheated the oven.

          The reality is that companies like Waymo are using their cars in SF precisely to harvest training data, because they can’t finish the job without real-world data. The physical act of driving a car is easy; interacting with humans is not.

    • nocturne213@lemm.ee

      One of my son’s coworkers was just killed in a similar incident. A woman hit a pedestrian, freaked out, and called her boyfriend instead of emergency services; the boyfriend arrived and ran over the injured pedestrian, ensuring he was dead.

      They are unsure which vehicle actually killed him.

    • JoBo@feddit.uk

      Why? Hit and run is a serious offence and the driverless car has it all on camera.

      • FireTower@lemmy.world

        Perhaps he’s implying that the second car may obscure whether the first driver’s actions would have been lethal or would only have left the person injured. He’d probably rather be tried for a hit and run resulting in injury than one resulting in a death.

  • Infernal@lemmy.world

    “The driver of the other vehicle fled the scene, and at the request of the police, the AV was kept in place.”

    Hopefully not while still on top of her!

    • Baggins [he/him]@lemmy.ca

      If you get stuck under a car, the fire department is going to come lift it off of you. They aren’t going to try to drive it off; that would almost surely cause further injuries.

  • RalphFurley@lemmy.world

    Reminds me of someone I knew who kept a pint of Jack in his trunk. He drove drunk constantly, and the bottle was there so that if he ever got into an accident, he could jump out of the car, pop the trunk, and pound it in front of all the witnesses.

    Can’t prove he was drunk at the time of the accident.

    • WarmSoda@lemm.ee

      This kind of thing has been repeated and handed down for like a century, but I’ve never ever heard of anyone actually doing it, much less having it work.

  • gibmiser@lemmy.world

    It was just protecting her like a dog would.

    Who’s a good car!? That’s right, you’re a good car!