Posted on 02/16/2024 8:31:38 PM PST by DoodleBob
A Canadian tribunal has ruled that Air Canada must pay damages to one of its passengers for misleading advice given by its customer service chatbot, which resulted in the passenger paying nearly double for their plane tickets.
The case centered on the experience of Jake Moffatt, who flew round-trip from Vancouver to Toronto after his grandmother died in 2022. At the time, Moffatt visited Air Canada’s website to book a flight using the company’s bereavement rates. According to tribunal documents, Moffatt specifically asked Air Canada’s support chatbot about bereavement rates and received the following reply:
“Air Canada offers reduced bereavement fares if you need to travel because of an imminent death or a death in your immediate family,” the chatbot stated, including an underlined hyperlink to the airline’s policy. “If you need to travel immediately or have already traveled and would like to submit your ticket for a reduced bereavement rate, kindly do so within 90 days of the date your ticket was issued by completing our Ticket Refund Application form.”
Moffatt didn’t visit the policy page the chatbot linked to, which stated, contrary to the chatbot’s advice, that customers couldn’t apply for bereavement rates after they had completed their travel.
The same day he spoke to the chatbot, Moffatt called Air Canada to get more information on the possible amount of the flight discount. He claims a human customer service representative told him that he would receive a discount of about 440 Canadian dollars ($326 U.S. dollars) per flight but wasn’t told the discount couldn’t be applied retroactively. Based on the information from the chatbot and the human customer service representative, Moffatt booked his flights.
A few days later, Moffatt submitted his application for a partial refund of what he had paid for his flights, which totaled 1,630 Canadian dollars ($1,210 U.S. dollars). After debating with the airline for weeks, Moffatt sent Air Canada a screenshot of the chatbot’s response in February 2023. In response, the human customer service representative told him the chatbot’s advice had been “misleading” and said they would take note of the issue so Air Canada could update the chatbot.
Moffatt’s back-and-forth with Air Canada continued and eventually ended up in the Civil Resolution Tribunal, also known as the CRT, a quasi-judicial tribunal in the British Columbia public justice system that deals with civil law disputes like small claims. Moffatt represented himself in the case, while Air Canada was represented by an employee.
In its defense, Air Canada denied all of Moffatt’s claims and said it couldn’t be held liable for information provided by its servants, agents, representatives, or chatbots—an argument that baffled Tribunal member Christopher C. Rivers. In a decision published this week, Rivers said that Air Canada’s suggestion that its chatbot was a “separate legal entity responsible for its own actions” didn’t make any sense.
“While a chatbot has an interactive component, it is still just a part of Air Canada’s website,” Rivers wrote. “It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot.”
Rivers added that Air Canada didn’t take reasonable care to ensure that its chatbot was accurate. It also didn’t explain why customers should have to double-check information found in one part of its website, the chatbot, with another part of its website. In the end, Rivers ordered Air Canada to pay Moffatt the refund he had spent nearly a year and a half fighting for.
All in all, the tale goes to show big companies that excuses like “my chatbot did it, not me” won’t fly in court.
Companies spend a fortune on those things and on phone robots. Not one of them has ever given me anything the least bit useful.
Yeah, about a week or so ago I was directed by a company’s website to use their online customer help. It went almost like that. Asked if I could talk to a human and was told that it could help me, why did I want to talk to a human? Almost like I had hurt its electronic feelings. Ended up calling their help line the next day and, after 97 minutes in the queue, got a human who solved my problem in under a minute.
No punitive damages?
It's difficult enough to get the legal department to sign off on customer-facing websites; adding in risky programming is a big no-no.
Well it’s rare that I agree with Canada much these days but on the face of it I have to agree. If a company uses an AI chatbot for customer service, they need to train it to deliver the correct responses and not blame some other AI bot or AI company.
I dread calling my cable TV company and having to get past the useless answering computer. But it seems most companies and all medical providers have this useless service.
It’s a very useful service to the companies. If they can get 10% of the complaints to hang up and go away, that could be 10% saved toward the bottom line.
EC
Indian tech support has the same effect.
Really horrible wait music also has that effect.
We have Comcast for internet. It is almost impossible to get a person. Same with our mortgage company. They keep sending us paper even though we’ve selected “go paperless” 20 times.
So, I'm not required to listen to the stewardess when he/she/it tells me to return to my seat for landing?
They paid interest and legal fees in addition to the refund portion.