- cross-posted to:
- hackernews@lemmy.smeargle.fans
Company claimed its chatbot ‘was responsible for its own actions’ when giving wrong information about bereavement fare
Canada’s largest airline has been ordered to pay compensation after its chatbot gave a customer inaccurate information, misleading him into buying a full-price ticket.
Air Canada came under further criticism for later attempting to distance itself from the error by claiming that the bot was “responsible for its own actions”.
Amid a broader push by companies to automate customer service, the case – the first of its kind in Canada – raises questions about how much oversight companies exercise over such chat tools.
According to its LinkedIn profile, the chatbot no longer works for the airline.