Do you use chatbots to provide services to your customers or employees? Or are you thinking about implementing one? If so, you have come to the right place. In this multi-part series, we explore the key legal considerations of setting up and maintaining a chatbot. Up first: our commentary on a recent decision in British Columbia and what it means for businesses in Québec.
***
"Hi! I will be assisting you today. How can I help?" asks the chatbot.
Frustrated customers answer the prompt, vigorously detailing their problems (possibly with expletives) as if the chatbot were a real employee of the business they are dealing with. However, chatbots are not human; they are computer programs that simulate human conversation – often using generative AI.1 Businesses use chatbots to engage with a virtually unlimited number of customers in a personal way and resolve their issues at a fraction of the cost of actual human interaction. But what happens when the chatbot provides misleading or false information? Worse, what if the customer relies on this information? Are businesses liable for such misrepresentations?
On February 14, 2024, the British Columbia Civil Resolution Tribunal (analogous to Québec's Small Claims Court) answered these questions in Moffatt v. Air Canada.2 The Tribunal held that a company could be liable for its chatbot’s negligent misrepresentations made on its website. Although the monetary claim was limited to approximately $880, the decision has garnered international media attention3 and is an important development in the law of obligations and consumer protection law as businesses increasingly rely on AI-related solutions to assist their customers.
In Part I, we focus on the impact of the Moffatt decision on Québec businesses that use or are contemplating using AI-related solutions to streamline their operations, particularly their interactions with customers.
Facts
Following the death of their4 grandmother, the plaintiff, Mr. Moffatt, conversed with a chatbot on Air Canada’s website while researching flights. The chatbot suggested that Mr. Moffatt could apply for reduced bereavement fares after the ticket had been purchased if the request for reimbursement was submitted within 90 days of the issuance of the ticket.5 The chatbot also provided a link to another page on Air Canada’s website explaining the bereavement policy; that page contradicted the chatbot, as it stated that the bereavement policy could not be applied retroactively.6
Relying on the chatbot’s representations, Mr. Moffatt submitted their request for the bereavement fare after travel had been completed but within the 90-day window.7 Over the following months, Mr. Moffatt communicated with Air Canada, attempting to obtain a partial refund of their fares. Mr. Moffatt emailed Air Canada a screenshot of their conversation with the chatbot.8 An Air Canada representative admitted that the chatbot had provided “misleading words” to Mr. Moffatt but noted the reference to the page with the accurate information on the bereavement policy.9 As Air Canada and Mr. Moffatt were unable to resolve their dispute, Mr. Moffatt sought compensation before the Tribunal for the difference between the regular and the bereavement fares.
The decision
The Tribunal found Air Canada liable under the tort of negligent misrepresentation because (a) Air Canada failed in its duty as a service provider to take reasonable care to ensure that its chatbot’s representations were accurate and not misleading, and (b) Mr. Moffatt reasonably relied on those representations.
The Tribunal rejected Air Canada’s argument that it could not be held liable for information provided by its chatbot because the chatbot is a “separate legal entity that is responsible for its own actions”.10 According to the Tribunal, since the chatbot is part of Air Canada’s website, it should have been “obvious” to Air Canada that it is responsible for all the information on its website.11
The Tribunal also rejected Air Canada’s argument that Mr. Moffatt could have found the correct information elsewhere on its website. Air Canada did not explain why the other page was inherently more trustworthy than its chatbot, nor did it explain why customers should have to double-check information found on its website.12
Air Canada also argued that it should not be liable due to the terms and conditions of its tariff, which Mr. Moffatt had accepted. For reasons that are not explained in the decision, Air Canada failed to provide the relevant portion of the tariff as part of its submissions.13 The Tribunal concluded that if Air Canada wanted to raise a contractual defence, it needed to provide the relevant portions of the contract.14
Furthermore, the Tribunal noted that “Air Canada did not provide any information about the nature of its chatbot.”15
Analysis and comments
Québec courts have not yet issued a judgment on the liability of companies for their chatbots’ representations. However, if the Moffatt decision and its notoriety are any indication, it is only a matter of time before Québec courts are called upon to do so, as businesses increasingly rely on AI-related solutions.
For starters, it is possible that the scope of the evidence adduced at trial and the resources allocated to litigating the legal issues were limited in proportion to the modest value of the claim. We also note that, before the Tribunal, the parties submit their pleadings in writing,16 unlike in Québec’s Small Claims Court, where oral arguments are generally required.
The closest counterpart in Québec civil law to the tort of negligent misrepresentation is the doctrine of fraud (“le dol”) under Article 1401 of the Civil Code of Québec (“CCQ”).17 Unlike its common law counterpart, however, a successful claim in dol requires an intention to deceive.18 It is highly unlikely that a customer could establish that they were deliberately deceived by a chatbot or its developers. Therefore, a Québec action based on le dol for the representations of a chatbot seems to have little chance of success, as the burden of proof is quite high.
Instead, consumers are more likely to invoke the protections provided by the Québec Consumer Protection Act (“CPA”) against false or misleading representations.19 Section 219 CPA provides that no merchant, manufacturer or advertiser may make false or misleading representations to a consumer by any means whatever.20 To determine whether a representation is false or misleading, Section 218 CPA provides that the “literal meaning” of the words used in the representation and the “general impression” given by the representation must be assessed without considering the personal attributes of the consumer.21 Unlike dol, a finding of false or misleading representations under the CPA does not require an intent to deceive. The application of the following provisions of the Competition Act22 also merits further consideration: Section 52, which prohibits false or misleading representations that are knowingly or recklessly made, and Sections 74.01 and 74.011(2), which prohibit false or misleading representations, particularly in electronic messages.
As for the argument that a business should not be liable for the misrepresentations of its chatbot because the chatbot is a distinct legal entity, it is unlikely to succeed in Québec. If the Moffatt decision is any indication, Québec courts could hold that a business retains control over its chatbot, like any other part of its website.
Lastly, businesses should consider their obligations under the civil and contractual liability regimes codified by the CCQ with regard to their chatbots. Notably, businesses should consider the possible implications of the doctrine of the “act of a thing” in Québec. According to Article 1465 CCQ, the custodian of a thing is bound to make reparation for injury resulting from the autonomous act of the thing. However, in such a case, a business can avoid liability by proving that it took reasonable steps to prevent the injury.23 Further, businesses could be vulnerable to an action under Article 1457 or 1458 CCQ, depending on whether the alleged fault occurred following the formation of a contractual relationship between the business and the customer.
In light of the conclusion in Moffatt, we suspect that, in future Québec cases, businesses defending against an action in damages for their chatbot’s misrepresentations will need to enlighten the court on how the chatbot was created, trained, and tested. Assuming that a business used a chatbot relying on generative AI algorithms and machine learning, it would have needed to train the chatbot on existing data so that it could understand future customers’ motives.24 In such a case, the business could point to the absence of any intent to deceive throughout the creation, training and testing processes. The business could also show that it took reasonable care to maximize the likelihood of accurate responses and, in doing so, mitigate the risk of injury to its customers. However, a business could be required to adduce further evidence, beyond its processes for creating, training and testing its chatbot, to successfully defend against an action in damages.
In any case, the Moffatt decision highlights the need for businesses to ensure the accuracy of all information shared with their customers across all interfaces, whether through human employees, automated systems, or static web pages. It also serves as a warning of the potential legal liability that may arise if false or misleading information is provided. In Part II, we will explore how businesses can mitigate the risks arising from chatbot interactions with customers.
__________
1 Chatbots are software programs that use artificial intelligence to simulate natural language conversation. They are not human agents who can adapt based on the context, emotions, and intentions of the people they talk to. Chatbots rely on predefined rules and data to generate responses. They can make mistakes, misunderstand queries, or provide outdated or inaccurate information.
2 2024 BCCRT 149.
3 See eg Leyland Cecco, “Air Canada ordered to pay customer who was misled by airline’s chatbot”, The Guardian (16 February 2024), online: https://www.theguardian.com/world/2024/feb/16/air-canada-chatbot-lawsuit; Maria Yagoda, “Airline held liable for its chatbot giving passenger bad advice – what this means for travellers”, BBC (23 February 2024), online: https://www.bbc.com/travel/article/20240222-air-canada-chatbot-misinformation-what-travellers-should-know; “Air Canada jugée responsable des mauvais conseils prodigués par son robot conversationnel”, Radio-Canada (17 February 2024), online: https://ici.radio-canada.ca/nouvelle/2049941/robot-conversationnel-air-canada-plainte.
4 When referring to Mr. Moffatt, we use the pronouns applied by the Tribunal.
5 Moffatt, supra note 2 at para 15.
6 Moffatt, supra note 2 at paras 16‑17.
7 Moffatt, supra note 2 at para 20.
8 Moffatt, supra note 2 at para 21.
9 Moffatt, supra note 2 at para 22.
10 Moffatt, supra note 2 at para 27.
11 Moffatt, supra note 2 at para 27.
12 Moffatt, supra note 2 at para 28.
13 Moffatt, supra note 2 at para 31.
14 Moffatt, supra note 2 at para 31.
15 Moffatt, supra note 2 at para 14.
16 Civil Resolution Tribunal, “What are arguments?”, online: https://civilresolutionbc.ca/help/what-are-arguments/ (accessed on 19 March 2024).
17 c CCQ-1991.
18 9147-7356 Québec inc c 9289-6331 Québec inc, 2022 QCCS 32 at para 68. See also Canada Life Assurance Company c Protection VAG inc, 2021 QCCS 3725 at paras 315 and 316.
19 c P-40.1.
20 Sections 220 to 251 CPA supplement Section 219 by providing further prohibitions for specific types of representations.
21 Richard v Time Inc, 2012 SCC 8 at paras 45‑51.
22 RSC, 1985, c C-34.
23 Promutuel Bois-Francs, société mutuelle d'assurance générale c Goudreault, 2022 QCCS 1549 at paras 35‑38.
24 We discuss this point in greater detail in Part II.