A recent decision of the Federal Circuit and Family Court of Australia – Handa & Mallick [2024] FedCFamC2F 957 – illustrates the consequences legal practitioners may face from overreliance on artificial intelligence products, including disciplinary action and personal costs orders.
As with any technology used in legal service delivery, practitioners must carefully consider whether their use of AI is consistent with their professional obligations to their clients and to the court.
The practitioner was an agent acting in an enforcement application concerning final property orders between a husband and wife. While the matter was stood down for negotiation, the parties were called upon to provide any authorities that they sought to rely on for Justice Humphreys to read in the interim.
The practitioner tendered a single-page list of authorities. Upon returning to chambers, neither Justice Humphreys nor her Honour’s associates were able to locate the cases identified in the list. The practitioner was asked to provide copies of the authorities referred to, but did not do so. When the matter returned to court, Justice Humphreys asked the practitioner whether the list of authorities had been prepared using AI. The practitioner confirmed the list was generated using artificial intelligence software.
Justice Humphreys expressed concerns about the veracity of the information provided in the list of authorities, as well as concerns about the competence and ethics of the practitioner.1
The practitioner was ordered to make submissions identifying any reasons why he ought not to be referred to the Victorian Legal Services Board and Commissioner. A personal costs order against the practitioner was also foreshadowed.
This decision underscores the importance of integrating proper checks and reviewing all AI-generated information. This applies whether you are using a general-purpose large language model such as ChatGPT or a specialised legal tool with AI features. While the latter is likely to have a lower error rate, at the current stage of technological development all content must be verified.
The person undertaking such a check must have a sufficient understanding of the relevant law and the facts of the case to detect errors.
Practitioners are referred to the QLS Guidance Statement No. 37 Artificial Intelligence in Legal Practice and the AI Companion Guide for further information.
Footnotes
1 Handa & Mallick [2024] FedCFamC2F 957, [8].