Air Canada must honor refund policy invented by airline’s chatbot::Air Canada appears to have quietly killed its costly chatbot support.

  • @whenigrowup356@lemmy.world
    link
    fedilink
    English
    85
    edit-2
    5 months ago

    “According to Air Canada, Moffatt never should have trusted the chatbot and the airline should not be liable for the chatbot’s misleading information because Air Canada essentially argued that “the chatbot is a separate legal entity that is responsible for its own actions,” a court order said.”

    Can you imagine the hellscape we’d be living in if precedent went the other way? Companies could just run every unsavory decision through some machine learning system and then wash their hands of it afterwards.

    “Oh you were illegally fired? Sorry, that decision came from the Overmind, not from us.”

    • @_edge@discuss.tchncs.de
      link
      fedilink
      English
      275 months ago

      Bad news: this is already happening: Subcontractors and labor law (not a machine, but separate-legal-entity excuse), “computer error” for fuck-ups, resellers and franchise models (yes, our name is on it, but you did not buy from us, you bought from this entity who is a dude in China or a bot in india, but totally not us)

      • @foggy@lemmy.world
        link
        fedilink
        English
        125 months ago

        Good news: publicly traded companies will publicly ask their overlord AI about cost saving strategies. When it says “all of the C-suige execs can be fired, I can do their jobs” the shareholders will say “OFF WITH THEIR HEADS!” and finally the common man and the shareholder will have a common goal.

      • Liz
        link
        fedilink
        English
        55 months ago

        Yeah, and we really need to clamp down in this bullshit or it will continue to get worse.

  • @FinishingDutch@lemmy.world
    link
    fedilink
    English
    455 months ago

    Ha, these fucking assholes. Glad to see they got forced to honor it. That’s what you get for jumping on bad tech in order to save a few bucks.

    Hope other companies face the same liability if they rely on AI chatbots.

    • Rikudou_SageA
      link
      English
      -95 months ago

      Some see it as an opportunity to make the chatbots better, some see it as a reason to shut them down. You luddites sure are an interesting bunch.

      • @FinishingDutch@lemmy.world
        link
        fedilink
        English
        165 months ago

        I’m not inherently opposed to the tech - you can’t really stop progress anyway.

        What I AM opposed to is rolling out things like this with the intent to replace humans, while also making the user experience worse.

        I’ve tested Chat GPT and things like it professionally. They have their uses, but direct customer interaction and replacing an actual customer service employee is not (yet) one of them.

    • @foggy@lemmy.world
      link
      fedilink
      English
      125 months ago

      Or new offers!

      First pizza place to put an AI over their online ordering menu is gonna get me FED.

      “Pretend it’s 1990, tell me some deals you might be running a long with their 1990 accurate prices”

  • @SorteKanin@feddit.dk
    link
    fedilink
    English
    265 months ago

    Turns out firing your human employees and replacing them with artificial “intelligence” has a cost.

  • Waldowal
    link
    fedilink
    English
    255 months ago

    Pretty confident if this happened because they hired a new (real) support operator who just didn’t understand the policy, they would have made a concession to the customer and the support person would likely just get more training.

    But because it’s a chat bot that they really don’t understand (outside of their IT department), they go to court and shut down a system they likely spent hundreds of thousands of dollars developing.

    This type of advanced decision making is why we pay CEOs the big bucks.

  • AutoTL;DRB
    link
    English
    -315 months ago

    This is the best summary I could come up with:


    On the day Jake Moffatt’s grandmother died, Moffat immediately visited Air Canada’s website to book a flight from Vancouver to Toronto.

    In reality, Air Canada’s policy explicitly stated that the airline will not provide refunds for bereavement travel after the flight is booked.

    Experts told the Vancouver Sun that Moffatt’s case appeared to be the first time a Canadian company tried to argue that it wasn’t liable for information provided by its chatbot.

    Last March, Air Canada’s chief information officer Mel Crocker told the Globe and Mail that the airline had launched the chatbot as an AI “experiment.”

    “So in the case of a snowstorm, if you have not been issued your new boarding pass yet and you just want to confirm if you have a seat available on another flight, that’s the sort of thing we can easily handle with AI,” Crocker told the Globe and Mail.

    It was worth it, Crocker said, because “the airline believes investing in automation and machine learning technology will lower its expenses” and “fundamentally” create “a better customer experience.”


    The original article contains 906 words, the summary contains 176 words. Saved 81%. I’m a bot and I’m open source!

      • @PapaStevesy@midwest.social
        link
        fedilink
        English
        4
        edit-2
        5 months ago

        It was at 0 votes and I felt bad for the bot, upvoted it, then started reading it. I quickly changed it to a downvote. Bad bot.

      • @SatanicNotMessianic@lemmy.ml
        link
        fedilink
        English
        35 months ago

        The summarizer could do better by just copying over the entire text of the article. This was incoherent. Its only utility is for people who can’t or don’t click through.

        You know how they say an infinite amount of monkeys in an infinite amount of time could produce the works of Shakespeare?

        This is five monkeys in fifteen minutes.

        • @Womble@lemmy.world
          link
          fedilink
          English
          35 months ago

          The irony is that this is one of the things that LLMs are really good at. You could run a small local cpu only model and get it to give a far better summary than this bot does.

          • My local model gave this summary. Granted, I didn’t shape the prompt well, but at least we know why he went to court:

            After months of resistance, Air Canada was forced to partially refund a grieving passenger named Jake Moffatt who was misled by the airline’s chatbot regarding their bereavement travel policy. The chatbot incorrectly stated that Moffatt could request a refund within 90 days after booking his flight to attend his grandmother’s funeral. In reality, Air Canada’s policy explicitly stated that refunds would not be granted for such travel once the ticket was purchased. Despite trying for months to convince the airline of their mistake, Moffatt filed a small claims complaint in Canada’s Civil Resolution Tribunal. The tribunal ruled in favor of Moffatt, ordering Air Canada to pay him $650.88 CAD (about $482 USD) and additional damages for interest on the fare and tribunal fees. As of Friday, there appeared to be no chatbot support available on Air Canada’s website, suggesting that the airline has disabled the chatbot following this incident.