Company claimed its chatbot ‘was responsible for its own actions’ after giving wrong information about bereavement fares

Canada’s largest airline has been ordered to pay compensation after its chatbot gave a customer inaccurate information, misleading him into buying a full-price ticket.

Air Canada came under further criticism for later attempting to distance itself from the error by claiming that the bot was “responsible for its own actions”.

Amid a broader push by companies to automate services, the case – the first of its kind in Canada – raises questions about the level of oversight companies have over the chat tools.

  • @Gork@lemm.ee

    “separate legal entity”

    One that the airline completely controls, and whose programming it has since updated.