
Air Canada must compensate a British Columbia passenger after its chatbot misled him about airfares.

The man relied on the chatbot's information about airfares when booking a flight to his grandmother's funeral, CTV News reports.

Specifically, he wanted to know about the special bereavement fares offered for travel due to a death in the family. The chatbot told him he could claim the discount retroactively, within 90 days of the flight.

This information was incorrect, and the airline refused to grant the bereavement discount retroactively.


Air Canada argued that it could not be held responsible for the information provided by the chatbot, claiming that the chatbot was a separate legal entity responsible for its own actions.

However, the Canadian civil court ruled in favor of the man and awarded him $650.88 in damages for negligent misrepresentation. The amount represents the difference between the fare he paid and the discounted bereavement fare, plus interest and fees.

A chatbot on a website is part of the website

The court concluded that Air Canada failed to ensure the accuracy of its chatbot. The chatbot is part of the website and, like the website, must provide accurate information.

Air Canada is responsible for all information on its website, whether it comes from a static page or a chatbot.

The court also rejected the argument that the man could have found the correct information on another page of the website, noting that there is no reason why customers should have to double-check the chatbot's information against other parts of the site.


The incident occurred in November 2022, when Air Canada was likely still using a scripted, rule-based chatbot rather than one based on a large language model. LLM chatbots can be even less reliable.

For example, a customer talked an LLM chatbot at a US car dealership into agreeing to sell a new car for one US dollar. Here, too, a court might conclude that the bot is part of the website and that the negotiated price, however absurd, is binding.

Delivery company DPD also damaged its image with a chatbot: the LLM-based bot mocked its own company, calling it the worst delivery company in the world.

Summary
  • Air Canada has been ordered to refund $650.88 to a customer. The airline's chatbot provided inaccurate information about bereavement fares, which the court found to be a case of negligent misrepresentation.
  • The customer relied on the chatbot's information that bereavement fares could be claimed retroactively. However, the airline refused to reimburse the customer, who filed a lawsuit.
  • The court ruled that Air Canada was responsible for the information on its website, including the information provided by the chatbot, and rejected the airline's argument that the customer should have checked other areas of the website to ensure the information was accurate.
Online journalist Matthias is the co-founder and publisher of THE DECODER. He believes that artificial intelligence will fundamentally change the relationship between humans and computers.