A judge has found “reasonable evidence” that Elon Musk and other executives at Tesla knew that the company’s self-driving technology was defective but still allowed the cars to be driven in an unsafe manner, according to a recent ruling issued in Florida.

Palm Beach County circuit court judge Reid Scott said he had found evidence that Tesla “engaged in a marketing strategy that painted the products as autonomous” and that Musk’s public statements about the technology “had a significant effect on the belief about the capabilities of the products”.

The ruling, reported by Reuters on Wednesday, clears the way for a lawsuit over a fatal crash in 2019 north of Miami involving a Tesla Model 3. The vehicle crashed into an 18-wheeler truck that had turned on to the road into the path of driver Stephen Banner, shearing off the Tesla’s roof and killing Banner.

  • bedrooms

    The concept of autonomous cars might be game over.

    As always, advocates forgot about corporate greed. Do you trust your manufacturer to not lie to you? So much you risk killing yourself, your family and people on the road?

    • Yeah, the scary part of this is that as much as I absolutely would never go near this shit with a ten-foot pole when it’s clearly still woefully inadequate and overhyped… they very frequently drive within ten feet of me, because for some reason it’s legal to put this shit on roads with unwilling participants.

      • @RedditRefugee69@lemmy.world

        I get it, there’s an inevitable conflict of interest here, but we can’t really tell other people not to do things we don’t like in a free country

        Edit: this is clearly being misinterpreted. I am NOT talking about the Tesla. I’m saying a hypothetical, well-regulated self-driving car can be fielded without the permission of every other motorist that thinks they’re icky.

        • @jopepa@lemmy.world

          Yes, we can tell people they can’t do things. Welcome to society: we’ve all been talking and decided on a bunch of things people can’t do in a free country. These are public roads; it’s entirely reasonable to have restrictions on self-driving cars, just like you can’t ride a tandem bicycle in the HOV lane.

        • @mateomaui@reddthat.com

          People get fined for having unsafe vehicles on public roads all the time. All that’s needed here is a regulatory body to decide self-driving cars are unsafe enough to revoke approval.

          • @RedditRefugee69@lemmy.world

            Oh hell yeah if it’s unsafe. I’m making the finer point that saying “you don’t have the right to drive that car next to me cuz it makes me feel weird” is overstepping

            • @mateomaui@reddthat.com

              I’m pretty sure the actual concern has less to do with “feeling weird” and more with “because it and/or its inattentive driver may suddenly kill me”, because of a dysfunctional self-driving system whose capabilities have been fraudulently marketed and which has, in reality, repeatedly killed people.

          • @RedditRefugee69@lemmy.world

            I am referring to America, which prides itself on freedom (and not enough on equality and collectivism). I’m just saying it makes legal sense that you don’t need the consent of every other motorist to operate a self-driving car (if it passes safety regulations, and assuming no regulatory capture). Neither of those assumptions applies here.

        • @Fedizen@lemmy.world

          Truth-in-advertising laws exist for a reason.

          Also, the people who frequently talk about a “free country” are often the same ones who want more police so they can do Taliban-style gender policing, so the expression seems deeply inauthentic at this point.

          • @RedditRefugee69@lemmy.world

            True about “free country” being used to justify a society controlled by extreme wealth. And I’m talking about another person’s right to “drive” a self-driving car next to me, not about these guys objectively being criminally ass-hole-y

    • @Fedizen@lemmy.world

      Autonomous vehicles ten years ago: Human drivers are slow and prone to lapses in judgement

      Autonomous vehicles today: Elon Musk, a guy who famously destroyed a rare vehicle like a dumbass, will be training the AI that drives you around. It won’t know how to respond to an event not encountered in the training data, and it will occasionally run into an ambulance

      • @jopepa@lemmy.world

        And his employees hate him so much I wouldn’t be surprised if there’s a patch released that makes one sustained fart noise when airbags deploy.

    • @mateomaui@reddthat.com

      Maybe, at least until there’s a better comprehensive infrastructure of external sensors on the road, at intersections, etc., to control and limit vehicle movement. But that will probably be a long while coming, considering normal routine road and bridge maintenance is far behind as it is.

    • @stolid_agnostic@lemmy.ml

      I think you need protected ways where no people or non-autonomous vehicles may enter. Shy of that, I think you’re right.

    • @CmdrShepard@lemmy.one

      Hell yeah, let these drivers behind the wheel plow into more semi trucks. They deserve it after all. /s

    • @RushingSquirrel@lemm.ee

      To me, autonomous vehicles are like AI (in Tesla’s case it actually is AI): the public perception is that they’re way better than they really are, because they’re really good in 80% of cases. But getting to 90–95% will take many, many years still. That doesn’t mean we shouldn’t use them, nor that we should abandon them. To progress, we have to keep using them with caution: learn the limits and work within them. Don’t start firing people to replace them with AI, because in a few months and years you’ll realize that the 20% left to improve hurts more than you thought. In the same way, you shouldn’t remove drivers just yet.

      • @IphtashuFitz@lemmy.world

        But it’s not true AI. In my decades of experience driving cars I’ve encountered numerous edge cases that I never explicitly learned about during my driver’s-ed days. One recent case in point: I pulled up to a red light at a fairly busy intersection and stopped. While the light was still red, a police officer on the corner at a construction site walked out and tried to wave me through the intersection. I was watching the red light, so I didn’t even see him until he yelled at me.

        How would an autonomous AI car handle that situation if it’s not explicitly trained to recognize it? It would need to recognize the police officer as an authority that legitimately overrides the red light.

        Same intersection a few years earlier I saw a car engulfed in flames right in the middle of it. I saw & heard the fire trucks rapidly approaching as I got to the intersection. I, and others, realized we needed to get out of the way quickly. Would a Tesla AI(or any other) recognize the car is on fire and safely move away, or would it just recognize the shape of the car and patiently wait for it to move out of the intersection before proceeding?

        The point is that it’s virtually impossible to predict for, and program an AI to handle, every single situation it might ever encounter. A true AI would be trained on a lot of these sorts of scenarios but would need to be capable of recognizing edge cases it hasn’t encountered before as well. It would then need to react as safely as possible to those edge cases in a manner similar to how a human would.
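
        One way systems can hedge against exactly this problem is to treat low model confidence as a signal to fall back to a conservative action rather than commit to a guess. Here’s a toy sketch of that idea (the scores, action names, and threshold are all invented for illustration; this is not how Tesla or any real driving stack works):

        ```python
        import math

        def softmax(scores):
            """Turn raw model scores into probabilities."""
            exps = [math.exp(s) for s in scores]
            total = sum(exps)
            return [e / total for e in exps]

        def choose_action(scores, actions, threshold=0.9):
            """Pick the most likely action, but fall back to a safe
            default when the model is not confident enough -- i.e.
            when the scene looks unlike anything in the training data."""
            probs = softmax(scores)
            best = max(range(len(probs)), key=probs.__getitem__)
            if probs[best] < threshold:
                return "slow_down_and_hand_over"  # safe fallback
            return actions[best]

        actions = ["proceed", "stop", "yield"]

        # Familiar scene: one action clearly dominates, so it is taken.
        print(choose_action([6.0, 0.5, 0.2], actions))

        # Novel scene (say, a cop waving you through a red light):
        # the scores are ambiguous, so the safe fallback wins.
        print(choose_action([1.2, 1.0, 0.9], actions))
        ```

        The hard part in practice is that neural networks are often confidently wrong on novel inputs, which is exactly the failure mode being described above.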

        Edit: Downvotes must be from Tesla fanbois who can’t face reality. If they had legitimate arguments they would have replied…

        • @RushingSquirrel@lemm.ee

          This is why AI is a solution, not coding everything. How does one learn how to react in these situations? Either you’ve learned from watching your parents, by taking lessons, by reading the rules, or by simply following others. The goal of an AI is to be able to do just that. Coding every single use case is way too complex.

          I know Tesla has worked on improving emergency vehicles situations, but I don’t know how and what’s the current state.

          Why are you being downvoted?

    • @Death_Equity@lemmy.world

      The Wright brothers’ first flight covered less than the wingspan of a Boeing 747, an aircraft with a range of over 8,000 miles. The Internet was once called a fad.

      Autonomous cars will be the future, and people will die before they become the de facto method of personal transport. The unwilling sacrifices of a public alpha test of the technology are the losses we must endure to achieve the unparalleled safety of ubiquitous autonomous vehicles that mitigate traffic congestion, pedestrian deaths, unwieldy public transit, and the shortcomings of urban sprawl.

      The deaths caused by early adoption benefit the greater good, and we should be willing to accept them as a necessary evil.

      Not that I would ever trust a computer to drive my car. I will drive my own car until it kills me, financially or literally, but I can see what good an imperfect system struggling with growing pains will create.

  • @CanadianCorhen@lemmy.ca

    It’s definitely insane how early Tesla started selling their “self-driving” cars. The fact that there are people who paid for self-driving and then never got more than a Level 3 system is absurd.

    • @polygon6121@lemmy.world

      Even the beta is still considered a Level 2 system. Level 3 would require the system to conditionally take over in certain situations; you would quickly win a Darwin Award if you consistently trusted FSD in any given situation.

  • bedrooms

    Maybe they should start a new motor racing series where autonomous cars race for 24 hours at 80 km/h with random people walking on the circuit. Then we can trust autonomous cars.

  • @Nobody@lemmy.world

    “We decided to bring the issue to Mr. Musk after the 5000th child died in the simulations. He asked if the children were going to be white.”

  • AI employee: “we can’t release the cars, senior muskrat! we just don’t have enough test data to guarantee the algorithm works!”

    Senior Muskrat: “test data, you say…”

    galaxy brain intensifies

  • @dan1101@lemm.ee

    “Full Self Driving” was such bullshit, call it Tesla Driving Assistant or something.

  • LittleHermiT

    The real elephant in the room with AI is that when it works, the network has been over-fitted to the training data. And when something completely novel is fed into it, it spits out nonsense that runs over your dog because he looked like a shadow on the asphalt. Poor Fido.
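
    The over-fitting point can be shown with a toy example (a deliberately contrived sketch, nothing to do with any real driving model): a model that fits its training data perfectly can still be confidently, wildly wrong on an input just outside what it has seen.

    ```python
    import numpy as np

    # Four observations of an alternating 0, 1, 0, 1 pattern,
    # fit exactly by a degree-3 polynomial (over-fitting).
    x_train = np.array([0.0, 1.0, 2.0, 3.0])
    y_train = np.array([0.0, 1.0, 0.0, 1.0])
    coeffs = np.polyfit(x_train, y_train, deg=3)

    # The fit is essentially perfect on the training data...
    train_error = float(np.max(np.abs(np.polyval(coeffs, x_train) - y_train)))

    # ...but at the novel input x=4 the alternating pattern says 0,
    # while the over-fitted polynomial confidently predicts 8.
    novel_prediction = float(np.polyval(coeffs, 4.0))

    print(f"max training error: {train_error:.2e}")
    print(f"prediction at x=4: {novel_prediction:.2f}")
    ```

    Zero training error, yet the first unseen point is already nonsense: that is the shadow-on-the-asphalt failure mode in miniature.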