
Air Canada responsible for errors by chatbot, civil resolution tribunal rules

A B.C. man paid full fare for an Air Canada flight to Toronto for his grandmother’s funeral after the website’s chatbot incorrectly said he could get a retroactive discount
Air Canada had argued that it couldn’t be held liable for information provided by its chatbot. THE CANADIAN PRESS/Paul Chiasson

VANCOUVER — An Air Canada passenger from B.C. has won his fight after the airline refused him a retroactive discount, claiming it wasn’t responsible for the promised refund because the promise was made in error by the airline’s online chatbot.

Artificial-intelligence law experts say it’s a sign of disputes to come if companies don’t ensure accuracy as they increasingly rely on artificial intelligence to deal with customers.

Jake Moffatt booked a flight to Toronto with Air Canada in 2022 to attend his grandmother’s funeral after consulting the website’s chatbot, which advised him he could pay full fare and apply for a bereavement fare later, according to the decision by B.C.’s Civil Resolution Tribunal.

But an Air Canada employee later told him that he couldn’t apply for the discount after the flight.

“Air Canada says it cannot be held liable for the information provided by the chatbot,” said tribunal member Christopher Rivers in his written reasons for decision posted online.

It “suggests the chatbot is a separate legal entity that is responsible for its own actions,” said Rivers. “This is a remarkable submission.”

When Moffatt asked Air Canada’s automated response system about reduced fares for those travelling because of a death in the immediate family, the chatbot answered he should submit his claim within 90 days to get a refund.

His total fare for the return trip was $1,640, and he was told the bereavement fare would be about $760 in total, an $880 difference, he told the tribunal.

He later submitted a request for the partial refund and included a screenshot of the chatbot conversation, the tribunal said.

Air Canada responded by saying “the chatbot had provided ‘misleading words’ ” and refused a refund.

In ruling in Moffatt’s favour, Rivers said Moffatt was alleging “negligent misrepresentation” and he found Air Canada did owe Moffatt a duty to be accurate.

“The applicable standard of care requires a company to take reasonable care to ensure their representations” are not misleading, he wrote.

The airline argued it could not be held liable for information provided by one of its agents, servants or representatives, including a chatbot, Rivers said, adding that it did not explain why it believed that.

He said the chatbot is “still just a part of Air Canada’s website. It should be obvious to Air Canada that it is responsible for all the information on its website.”

Rivers also said the airline didn’t explain why customers should have to double-check information found on one part of its website against another, referring to the section called “bereavement travel,” which contained the correct information.

“There is no reason why Mr. Moffatt should know that one section of Air Canada’s webpage is accurate and another is not,” he said.

Moffatt said he wouldn’t have booked the flight at full fare, and Rivers found he was entitled to damages.

Rivers arrived at damages of $650 by calculating the extra fees and taxes Moffatt would have paid in addition to the base rate.

Air Canada said in a statement that it will comply with the ruling and had no further comment.

The case is a reminder to companies to be cautious when relying on artificial intelligence, said Ira Parghi, a lawyer with expertise in information and AI law.

As AI-powered systems become capable of answering increasingly complex questions, companies have to decide whether those systems are worth the risk.

“If an area is too thorny or complicated, or it’s not rule-based enough, or it relies too much on individual discretion, then maybe bots need to stay away,” said Parghi.

“That’s the first time that I’ve seen that argument” that a company isn’t liable for its own chatbot, said Brent Arnold, a partner at Gowling WLG.

To avoid liability for errors, a company would have to warn customers it didn’t take responsibility for its chatbots, which would make the service of questionable use to consumers, he said.

Companies will need to disclose what is AI-powered as part of new AI laws, and they’ll have to test high-impact systems before rolling them out to the public, he said.

As rules evolve, companies will have to be careful about both civil liability and regulatory liability, said Arnold.

“It will be interesting to see what a Superior Court does with a similar circumstance, where there’s a large amount of money at stake,” he said.

“It’s a cutting-edge ruling when it comes to technology,” said Gabor Lukacs, president of the Air Passenger Rights consumer advocacy group. “It’s a great ruling. I’m really pleased.”

— With files from The Canadian Press