Tuesday, February 20, 2024

Remarkable case of Moffatt v. Air Canada: a company that provides information via a chatbot or another automated system may be liable for any misleading information it provides

In the remarkable case of Moffatt v. Air Canada, a Canadian tribunal ruled that if a company provides information via a chatbot or another automated system, that company may be liable for any misleading information it provides.

Background

The dispute is about a refund for a bereavement fare. In November 2022, Jake Moffatt booked a flight with Air Canada following the death of their grandmother. While researching flights, Mr. Moffatt used a chatbot on Air Canada’s website, which suggested Mr. Moffatt could apply for bereavement fares retroactively. Mr. Moffatt later learned from Air Canada employees that Air Canada did not permit retroactive applications. Mr. Moffatt asked for a partial refund of the ticket price, as they had relied on the chatbot’s advice, claiming $880 for what they said was the difference in price between the regular fare and the alleged bereavement fare. Air Canada said Mr. Moffatt did not follow the proper procedure to request bereavement fares and could not claim them retroactively.

Plaintiff's Argument

The plaintiff, Jake Moffatt, argued that Air Canada provided misleading information via its chatbot and that they relied upon this information when booking their flight. Mr. Moffatt claimed they should be given a partial refund of the ticket price because they relied upon the chatbot’s advice and Air Canada was negligent in providing them with inaccurate information. They claimed $880, the amount they said was the difference in price between the regular fare and the alleged bereavement fare.

In support of their claim, Mr. Moffatt also noted that they had followed up with Air Canada and provided evidence of the chatbot's misleading information. They also pointed out that they would not have flown last-minute had they known they would have to pay the full fare.

The CRT ultimately agreed with Mr. Moffatt’s arguments and found that Air Canada provided unreliable information via its chatbot and that Mr. Moffatt relied on it. As a result, the CRT awarded damages to Mr. Moffatt.

Respondent's Argument

The respondent, Air Canada, argued that it was not liable for the information provided by the chatbot and that Mr. Moffatt did not follow the proper procedure to request bereavement fares. Air Canada also relied on certain contractual terms from its Domestic Tariff and asked the CRT to dismiss Mr. Moffatt’s claim.

However, the CRT found that Air Canada did not take reasonable care to ensure that its chatbot was accurate and that Mr. Moffatt relied upon the chatbot to provide accurate information. As a result, the CRT allowed Mr. Moffatt’s claim of negligent misrepresentation and awarded damages. The CRT also took into account that Air Canada did not provide a copy of the relevant portion of the Domestic Tariff, and it dismissed that defense for that reason.

Considerations the CRT Relied On:

The Civil Resolution Tribunal (CRT) decided the case on the evidence and submissions before it; both parties made their arguments and presented their evidence. In resolving disputes, the CRT must apply principles of law and fairness. The CRT noted that it has jurisdiction over small claims brought under section 118 of the Civil Resolution Tribunal Act (CRTA), and that its mandate is to provide dispute resolution services accessibly, quickly, economically, informally, and flexibly. The CRT also noted that the Court Order Interest Act applies to the CRT, so Mr. Moffatt is entitled to pre-judgment interest on the damages from November 17, 2022, the date of their first email requesting the bereavement fare refund, to the date of the decision.

Takeaway:

The key takeaway from this dispute is that if a company, such as Air Canada, provides information via a chatbot or another automated system, that company may be liable for any misleading information the system provides. Here, the CRT found that Mr. Moffatt made out their claim of negligent misrepresentation against Air Canada based on its chatbot's misleading information and awarded damages accordingly. The dispute also shows the importance of providing all relevant evidence in a legal proceeding and of following the proper procedures when claiming particular fares or benefits.

                                                                                                                             (Rishiraj Chandan)
