Two barristers and a solicitor have been referred to their respective professional regulators over their conduct around AI use in error-riddled filings in the Federal Circuit and Family Court of Australia.
The matter came to light when amended versions of a Summary of Argument and List of Authorities were filed to correct errors in the originals for an appeal due to be heard in the Court last month.
The amendments included the removal of several authorities and footnotes, with a letter accompanying the amended version advising that there “were significant errors in the citations” in the original and apologising for the “oversight”.
The specific changes were not outlined in the amended document; however, the Court noted it was “obvious the amendments were made to rectify the inclusion in the original of non-existent, inaccurate and misleading authorities.”
Suspecting AI had been used, the Court invited counsel to respond before referring the matter.
In their submissions, both barristers acknowledged that they had settled the original documents, which had been prepared by the solicitor. Both confirmed the material had been prepared using AI but said they had not used AI themselves.
In her own submission, the solicitor also denied using AI herself, saying that a paralegal had used it without her knowledge.
While accepting “full responsibility” for the use of AI by her paralegal, the solicitor noted that she had terminated the paralegal’s services.
The solicitor also filed a second set of amended documents, settled by new counsel, replacing the list of footnotes included in the first amendment.
In its decision the Court expressed concern as to whether the further errors also arose from the use of AI, but noted the solicitor’s “submissions filed the same day implied to the contrary”.
While each of the lawyers acknowledged that AI was used in the preparation of the originally filed documents, the judgment said that the “extent and in what way it was used remains opaque, notwithstanding the written submissions of the solicitor responsible.”
The decision reviewed the growing body of case law involving the risks posed by AI, noting particularly the judgment in Dayal that identified:
“the significant risks which attend the use of AI tools in legal practice and identified the relevant duties of legal practitioners, including the duty not to mislead the Court, to deliver legal services competently and diligently and not to engage in conduct which is likely to diminish public confidence in the administration of justice or bring the legal profession into disrepute. Those obligations extend, of course, not only to the use of AI but also to addressing difficulties when they are identified (as they were here).”
The Court noted that the practice directions and guidelines being developed in various jurisdictions make it clear that “if AI is used to identify authorities for the purposes of any Court document, then it is incumbent on the author and those accepting responsibility for the document to verify that those authorities are both accurate and relevant to the proceedings.”
“Each of the relevant legal practitioners conceded that they had not done so in respect of the original Summary of Argument and List of Authorities.”
Noting the public interest in ensuring “that those regulating the legal profession are aware of examples of difficulties which have arisen from the use of AI in the preparation of Court documents”, Justices Aldridge, Carew and Behrens concluded it was appropriate to refer the matter to the relevant professional disciplinary bodies for consideration of the conduct of the three.
The solicitor was, by consent, ordered to pay the respondent $10,000 as “costs thrown away correcting the errors generated by AI.”
The Court also awarded costs of $36,955 against the appellant, noting the considerable expense the respondent had been put to preparing for the appeal, which was discontinued two days prior to the listing date.
The case underscores the message that long-standing professional obligations do not change when using new technologies.
“Decisions like these are a reminder that lawyers are responsible for the documents they tender, and need to check them carefully,” Queensland Law Society’s ethics special counsel Shane Budden said.
“This is especially important in light of the growing use of AI within the profession – and this is probably just the tip of the iceberg.
“Hallucinations and the like are picked up quickly in litigation matters, but there are likely errors in transactional matters that will take a long time to surface – and mistakes in things like AI-generated wills could take decades to be noticed.
Budden said the lesson is simple.
“Under-pressure junior staff are likely to turn to AI, and AI makes everything up,” he said.
“Law firms will need robust policies to guide the use of these tools, and must be especially careful checking work that may have been created with AI assistance – anything created using an AI-based tool needs to be checked as thoroughly as you would check the work of a first-year graduate.”