In February 2024, the British Columbia Civil Resolution Tribunal (a body similar to the Ontario Small Claims Court) concluded that Air Canada was responsible for a chatbot’s remarks on its website. What happened? The plaintiff had booked a flight following the death of his grandmother, after asking the chatbot on the company website about bereavement fares—the chatbot said that he could apply for bereavement fares retroactively. Later, however, an Air Canada employee told him that Air Canada did not allow retroactive applications. Ultimately, the adjudicator decided that Air Canada had negligently misrepresented the situation because it did not take reasonable care to ensure its chatbot was accurate. As a result, Air Canada had to reimburse the plaintiff for the difference in fares (plus pre- and post-judgment interest as well as application fees).
The plaintiff had to fly from British Columbia to Ontario following the death of his grandmother. While researching flights on the Air Canada website, he was assured by a chatbot that he could apply for bereavement fares retroactively—more specifically, that he would have 90 days from the date of the ticket to apply for the reduced rate. As a result, he booked the flight.
Later, when applying for the bereavement fare, the plaintiff learned from an Air Canada employee that the chatbot was wrong. In fact, other parts of the website stated that it was not possible to apply for bereavement consideration after travel had been completed.
Consequently, the plaintiff applied to the Tribunal for damages.
The Tribunal swiftly found that the plaintiff was able to prove the tort of negligent misrepresentation:
1. Air Canada owed him a duty of care: given the commercial relationship between service provider and consumer, Air Canada had a duty to take reasonable care to ensure that its representations were accurate and not misleading. The Tribunal rejected Air Canada’s argument that the chatbot was a separate legal entity; the company was responsible for all information on its website, whether it came from a static page or a chatbot.
2. Air Canada’s representation was untrue, inaccurate, or misleading: Air Canada did not take reasonable care to ensure that its chatbot was accurate, and what the chatbot said was clearly inconsistent with what other parts of the website stated.
3. The plaintiff relied on Air Canada’s representations: this reliance was reasonable in the circumstances—there was no reason the plaintiff should have known about the inconsistencies on the Air Canada website. It also made sense that the plaintiff would not have flown immediately had he known that he would have to pay full price for the ticket.
Therefore, the plaintiff was entitled to damages against Air Canada. More precisely, he was entitled to be put back in the position he would have been in had the misrepresentation not been made. The amount of damages was the difference between the price paid and the actual market value at the time of sale. Additionally, the plaintiff was entitled to pre- and post-judgment interest and Tribunal application fees.
What does this mean?
One thing we can take from this case is that adjudicators will not side with companies whose AI chatbots give customers wrong information on which those customers then rely. Companies will have to ensure that the information provided on their websites is accurate—regardless of whether it comes from a static page or a chatbot. Going forward, when customers rely on inaccurate chatbot answers, companies will likely not be able to escape the consequences of their carelessness.
In addition, an ethical question was operating in the background: do companies really wish to financially penalize customers who are grieving? The circumstances of this case suggest that such a position would be very difficult to justify, especially since the plaintiff checked to make sure that he had 90 days from the date of the ticket to obtain the reduced rate. Along the same lines, was it fair that the plaintiff had to go all the way to the Tribunal to resolve the matter? If companies want to use chatbots on their sites, they should make an effort to ensure that the information those bots provide is correct. And when a customer in the plaintiff’s position needs to resolve a matter retroactively, it is worth considering how it can be resolved without forcing the customer to go to the Tribunal.