
Large language models (the technology behind AI chatbots) are having a moment. After ChatGPT amazed most folks with its human-like texting abilities, these programs have popped up in all walks of life. We've even grown used to laughing off "misunderstandings," such as a GM dealership chatbot agreeing to sell a new Tahoe for $1. But in Canada, at least one tribunal has begun to hold businesses accountable for chatbot mistakes.

Jake Moffatt lives in Vancouver. When his grandmother in Toronto died, he logged on to Air Canada's website and found that the airline offers a bereavement discount. A helpful chatbot on the site told him he could book his ticket, fly, and then have 90 days to request a partial refund after the flight.

Jet airplane | Sangga Rima Roman Selia via Unsplash

After Moffatt got home from the funeral, he sent his $1,200 ticket stub to Air Canada and was told that bereavement discounts must be approved before travel. And when he took the airline to British Columbia's Civil Resolution Tribunal, Air Canada argued that the chatbot is a "separate legal entity" from Air Canada itself.

Air Canada probably thought it was in the clear. In the case of the GM chatbot promising a customer a $1 Tahoe, and in other incidents, corporations have not had to honor promises made by their AI employees. But the tribunal was not having it.

“While a chatbot has an interactive component, it is still just a part of Air Canada’s website. It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot…I find Air Canada did not take reasonable care to ensure its chatbot was accurate…While Air Canada argues Mr. Moffatt could find the correct information on another part of its website, it does not explain why the webpage titled ‘Bereavement travel’ was inherently more trustworthy than its chatbot. It also does not explain why customers should have to double-check information found in one part of its website on another part of its website.”

Christopher Rivers, Tribunal Member, British Columbia Civil Resolution Tribunal

The tribunal ruled that Air Canada had to pay Moffatt the partial refund, which came out to $483 plus some nominal fees. Last time I checked, the chatbot had been removed from Air Canada's website.

Honestly, I think this is a great thing. I'm not cheering for Air Canada to lose money or for chatbots to fail. But as soon as chatbots could reliably imitate humans, corporations got dollar signs in their eyes. Air Canada added a chatbot to its site in an attempt to replace flesh-and-blood customer service representatives and save money. But while a human representative can make judgment calls, such as deciding when a bereavement discount applies, a large language model does just one thing: it imitates human language.

If corporations are going to swap chatbots for humans to save money, they should be responsible for any added costs the chatbots run up.

Next, learn how ChatGPT can help you research your next road trip, or see the news coverage of Air Canada's chatbot mishap in the video below:
