Air Canada must pay damages after chatbot lies to grieving passenger about discount | Airline tried arguing virtual assistant was solely responsible for its own actions

    • @tiramichu@lemm.ee
      28
      9 months ago

      Hundreds in this case, but millions in the long term.

      I can see why Air Canada wanted to fight it, because if they accept liability it sets a precedent that they should also accept liability for similar cases in future.

      And they SHOULD accept liability, so I’m glad Air Canada lost and was forced to!

      • @brsrklf@jlai.lu
        11
        9 months ago

        The solution would be easy, just stop having an LLM chatbot.

        But I suspect they don’t want to because someone sold them on how good and cheap and human-resource-free it was, and now they think they’re too invested.

          • @rottingleaf@lemmy.zip
            2
            9 months ago

            Plus just the general sentiment that you’re not businessing right if you don’t something something AI.

            My blood boils at the very thought of people choosing to use something buzzwordy like blockchain or “AI”, likely with no competent person advising them to employ it, AND then trying to wash their hands of responsibility when it misfires.

            That’s as if a car crash caused by drunk driving were blamed on the air, because “having fun is not a crime”.

            Only with computing do these people unironically think that nobody should be responsible, because everyone they respect is as clueless as they are, so “nobody knows how it works, it’s a frontier, see”.