• veee
    9 months ago

    According to Air Canada, Moffatt never should have trusted the chatbot, and the airline should not be liable for the chatbot’s misleading information, because Air Canada essentially argued that “the chatbot is a separate legal entity that is responsible for its own actions,” a court order said.

    “Air Canada argues it cannot be held liable for information provided by one of its agents, servants, or representatives—including a chatbot,” Rivers wrote.

    • @RegalPotoo@lemmy.world
      9 months ago

      The thing is, none of that is even slightly true: even if the chatbot were its own legal entity, it would still effectively be an employee, and Air Canada is liable for bad advice given by its representatives.

      • JohnEdwa
        9 months ago

        And Air Canada is free to fire the legal-entity chatbot and sue it for damages all they like, after paying the customer their refund.
        Though they might find out that AI chatbots don’t have a lot of money, seeing as they aren’t actually employees and don’t get paid anything.

        • @RegalPotoo@lemmy.world
          9 months ago

          You’d be really hard-pressed to make a case for civil liability against an employee, even one who did not perform their duties in accordance with their training. Unless they have actively broken the law, the most recourse you have is to fire them.

      • veee
        9 months ago

        Totally agree. With that statement, they’re treating both employees and bots as scapegoats.

      • Lemminary
        9 months ago

        I wonder if advertising laws apply? With the whole “misleading their customers” being a thing.