
It wasn’t us, the chatbot did it

In a matter with Seinfeldian undertones, the Civil Resolution Tribunal in British Columbia, Canada, has rejected an argument that a chatbot powered by artificial intelligence is a separate legal entity responsible for its own actions.

In Moffatt v. Air Canada, 2024 BCCRT 149 (CanLII), Mr Moffatt sought a refund of part of the airfare for the flight he took to his grandmother’s funeral. The refund was available under the airline’s bereavement policy, and when booking his ticket online, Mr Moffatt was told by the airline’s support chatbot that the refund could be claimed retrospectively.

That information contradicted the airline’s policy, and when Mr Moffatt sought to claim the refund, he was refused. Attempts to resolve the matter via email were unsuccessful, and it proceeded to the Tribunal.

Mr Moffatt’s case was reasonably simple: the information provided by the chatbot was misleading. Air Canada denied that it owed the refund, on the basis that:

  • it could not be held liable for information provided by one of its agents, servants, or representatives; and
  • the chatbot was a separate legal entity responsible for its own actions.

The second of these propositions was – understandably – described by the Tribunal as a ‘remarkable’ submission.

The Tribunal found that Air Canada was responsible for the information on its website, including information provided by the chatbot, notwithstanding the chatbot’s interactive nature. Unsurprisingly, the Tribunal was not persuaded that the chatbot was a separate legal entity.


Ultimately, the Tribunal held that Mr Moffatt was entitled to his refund. While this is simply a decision of a civil tribunal in Canada, there are lessons for QLS members here:

  • Yet again, the fallibility of chatbots is on display, with this one unable to extract accurate information from its own website. Practitioners and their clients should keep that in mind when using chatbots; and
  • Those using chatbots will likely be held responsible for the information the chatbot provides, regardless of its level of autonomy.

While the damages in this case were small, it is easy to see how the same issue could arise with much higher stakes. Any firm using a chatbot, especially in client onboarding, should test it extensively before going live and monitor its performance carefully.

More generally, care should be taken with chatbot use, and it would be prudent to keep a record of any chat that is relied upon.
