AI in practice

Air Canada held responsible for chatbot's misleading advice on airfares

Matthias Bastian
Image: A chatbot character riding on the wing of a plane mid-flight, rendered in a vibrant glitch style (16:9).

DALL-E 3 prompted by THE DECODER

Air Canada must compensate a British Columbia passenger after its chatbot misled him about airfares.

The man relied on the chatbot's information about airfares when booking a flight to his grandmother's funeral, CTV News reports.

Specifically, he wanted to know about special bereavement fares offered for flights booked due to a death in the family. The chatbot told him he could apply for the bereavement discount retroactively, within 90 days of the flight.

This information was incorrect, and the airline refused to grant the bereavement discount retroactively.

Air Canada argued that it could not be held responsible for the information provided by the chatbot, claiming the chatbot was a separate legal entity responsible for its own actions.

However, the British Columbia Civil Resolution Tribunal ruled in favor of the man and awarded him CA$650.88 in damages for negligent misrepresentation. The amount represents the difference between the fare he paid and the discounted bereavement fare, plus interest and fees.

A chatbot on a website is part of the website

The tribunal concluded that Air Canada failed to take reasonable care to ensure its chatbot was accurate. The chatbot is part of the website and, like the rest of the website, must provide accurate information.

Air Canada is responsible for all information on its website, whether it comes from a static page or a chatbot.

The tribunal also rejected the argument that the man could have found the correct information on another page of the website, noting that customers should have no reason to double-check the chatbot's answers against information published elsewhere on the same site.

The incident occurred in November 2022, when Air Canada was likely still using a scripted, rule-based chatbot rather than one powered by a large language model, which can be even less reliable.

For example, an LLM chatbot at a US car dealership let a customer negotiate the price of a new car down to one US dollar. Here, too, a court might conclude that the bot is part of the website and that the negotiated price, however absurd, is binding.

Delivery company DPD also damaged its image with a chatbot: the LLM-based bot mocked its own company, calling it the worst delivery company in the world.
