An adjudicator in a body corporate dispute has awarded maximum costs in a matter where AI was suspected of having been used to prepare submissions containing “numerous and persistent misrepresentations of the relevant law and evidence”.
While not making a formal finding about the use of AI, Adjudicator Ingrid Rosemann wrote in her judgment that “if the applicant used AI or other sources in preparing the material they submitted, I am not satisfied they checked the accuracy of the information obtained”.
“If the applicant has not relied on AI for the contents of the submitted material, then they do not even have that excuse for the innumerable errors and misrepresentations,” she stated.
The adjudicator said that the way the applicant “pursued this application, including the continual reliance on non-existent or irrelevant case law and legislative provisions, and incorrect or unsubstantiated claims about the submitted evidence, has impeded the determination of this application”.
“The applicant cited incorrect or irrelevant provisions of the legislation on numerous occasions,” she said.
“The applicant cited cases as precedent for various propositions, but those cases either do not exist or have little or no relevance to this dispute.”
The adjudication through the Office of the Commissioner for Body Corporate and Community Management follows recent cases in other jurisdictions that have sparked concerns about increasing use of generative AI in courts.
A recent US case ended with a ‘terminating sanction’ after the court shared the defendant’s lawyers’ concerns about the use of deepfakes and AI-generated evidence.
The Supreme Court of Queensland also recently issued an AI Practice Direction emphasising individual responsibility for oversight.
In her findings, the adjudicator referenced these new guidelines for the use of generative AI by non-lawyers.
“It explains that the information produced by generative AI can be inaccurate, incomplete, or out of date,” she said.
“It says a user is responsible for ensuring all information they rely on or provide to a court or tribunal is accurate. As such, a party must check the accuracy of any information obtained from a generative AI chatbot before using it.”
The adjudicator noted that when she asked the applicant if they used AI, they responded that “portions of the application and reply were prepared using tools for assistance with grammar and document structure only”.
“They said they independently verified all information generated or formatted using any digital assistance. They said miscited or incomplete references to case law were unintentional,” she wrote.
She said that while a misapprehension of the law would not itself warrant the award of costs, the nature of the application went beyond that.
In her decision to dismiss the application and award full costs to the respondent, the adjudicator stated “the scale of the deficiencies in the submitted material amounts to an abuse of process”.
“The resources of this office have also been wasted,” she said.
The Queensland Law Society has resources and a guidance statement available on the use of AI in legal practice.