• Rikudou_SageA
    20 · 1 year ago

    Fine by me, as long as the companies making the cars take all responsibility for accidents. Which, you know, is what human drivers do.

    But the car companies want to sell you their shitty autonomous driving software and still make you responsible.

    If they don’t trust it enough, why should I?

    • @Thorny_Thicket@sopuli.xyz
      2 · 1 year ago

      Well, you shouldn’t trust it, and the car company tells you this. It’s not foolproof, nor something to be blindly relied on. It’s a system that assists driving but doesn’t replace the driver. Not in its current form, at least, though they may be getting close.

      • Rikudou_SageA
        11 · 1 year ago

        Then what’s the discussion even about? I don’t want autonomous cars on the street because even their creators don’t trust them to make it.

        • @Thorny_Thicket@sopuli.xyz
          2 · 1 year ago

          Most people consider cruise control to be quite a useful feature, though it still requires you to pay attention so that you stay in your lane and don’t run into a slower vehicle in front of you. You can then keep adding features such as radar for adaptive cruise control and lane assist, which further reduces what you need to pay attention to, but you still need to sit there behind the wheel watching the road. These self-driving systems in their current form are no different. They’re just further along the spectrum towards self-driving. Some day we will reach the point where you sitting in the driver’s seat just introduces noise to the system, so you might as well go take a nap in the back seat. We’re not there yet, however. This is still just super sophisticated cruise control.

          It’s kind of like with chess engines. First humans are better at it than computers. Then a computer plus a human is better than just the computer, and then at some point the human is no longer needed and the computer will from then on always be better.

          • Rikudou_SageA
            7 · 1 year ago

            I don’t feel like this is what we were talking about; at least I was talking about cars that drive on their own.

            • @Thorny_Thicket@sopuli.xyz
              2 · 1 year ago

              Well, Cruise is offering a fully self-driving taxi service where they don’t require you, as a passenger, to pay attention to the traffic or take control if needed, so it’s not fair to say “they don’t trust it, so why should you?”

              With Tesla, however, that is the case, but despite their rather aggressive marketing they still make it very clear that this is not finished yet: you are allowed to use it, but you’re still the driver and its safe use is your responsibility. That’s the case with the beta version of any software; you get it early, which is what early adopters like, but you’re expected to encounter bugs, and that’s the trade-off you have to accept.

              • Count Zero
                3 · 1 year ago

                Is the company legally liable for the actions of the self-driving car? If not, then they don’t trust the vehicles.

                What charges would apply to a human driver who delayed an emergency vehicle and caused someone to die?

                  • Count Zero
                    1 · 1 year ago

                    That’s a moved goalpost, and you know it.

                    If liability is forced on them, that is a huge difference from them voluntarily accepting responsibility, which is what would indicate that they trusted the service they provide.

      • @Kornblumenratte@feddit.de
        4 · 1 year ago

        The incident under discussion doesn’t involve driver-assist systems; driverless autonomous taxis are already on the streets:

        A number of Cruise driverless vehicles were stopped in the middle of the streets of the Sunset District after Outside Lands in Golden Gate Park on Aug. 11, 2023.

    • TheHalc
      2 · 1 year ago

      take responsibility [… like] human drivers do.

      But do they really? If so, why’s there the saying “if you want to murder someone, do it in a car”?

      I do think self-driving cars should be held to a higher standard than humans, but I believe the fundamental disagreement is in precisely how much higher.

      While zero incidents is naturally what they should be aiming for, it’s more of a goal for continuous improvement, like it is for air travel.

      What liability can/should we place on companies that provide autonomous drivers that will ultimately lead to safer travel for everyone?

      • Rikudou_SageA
        3 · 1 year ago

        Well, the laws for sure aren’t perfect, but people are responsible for the accidents they cause. Obviously there are plenty of exceptions, like rich people, but if we’re talking about the ideal real-life scenario, there are consequences for causing an accident. Whether those consequences are appropriate or not is for another discussion.

      • @abhibeckert@beehaw.org
        2 · 1 year ago

        While zero incidents is naturally what they should be aiming for, it’s more of a goal for continuous improvement, like it is for air travel.

        As far as I know, proper self-driving AVs (not “autopilot”) are pretty close to zero incidents if you only count crashes where they are at fault.

        When another car runs a red light and smashes into the side of an autonomous vehicle at 40 mph… it wasn’t the AV’s fault. Those crashes shouldn’t be counted against the AV, but as far as I know they currently are in most stats.

        What liability can/should we place on companies that provide autonomous drivers that will ultimately lead to safer travel for everyone?

        I’m fine with exactly the same liability as human drivers have. Unlike humans, who are motivated to drive dangerously for fun, to get home when they’re high on drugs, or to keep driving through the night without sleep to avoid paying for a hotel, autonomous vehicles have zero motivation to take risks.

        In the absence of that motivation, the simple fact that insurance against accidents is expensive is more than enough to encourage these companies to continue to invest in making their cars safer. Because the safer the cars, the lower their insurance premiums will be.

        Globally, insurance against car accidents is approaching half a trillion dollars per year and increasing over time. With money like that on the line, why not spend a lazy hundred billion dollars or so on better safety? It won’t actually cost anything; it will save money.

        • @jarfil@beehaw.org
          1 · 1 year ago

          the safer the cars, the lower their insurance premiums will be.

          Globally insurance against car accidents is approaching half a trillion dollars per year

          That… almost makes it sound like the main opposition to autonomous cars would be insurance companies: they can’t earn more by raising premiums if there are no accidents and a competing insurance company can offer much cheaper coverage.