
Fake case prompts chatbot warning

The Queensland Civil and Administrative Tribunal (QCAT) has issued a warning about the use of generative AI after a fake case was included in a submission.

The non-existent case was presented in a matter decided last month, in which occupational therapist “Ms LJY” sought a stay of conditions the Occupational Therapy Board of Australia imposed on her registration in December last year, pending determination of her application to review the board’s decision.

Among Ms LJY’s contentions was that the public listing of the conditions had “caused immediate and irreparable reputational, professional and financial harm as they effectively operate as a de facto suspension without legal justification”.

In her final submission, the sole director and sole practising therapist in a Far North Queensland clinic argued “that denying a stay would establish a dangerous precedent where practitioners in high-demand service regions are forced to cease or restrict practice based on contested regulatory decisions before their appeal rights have been fully exercised”.

She referred to a case cited as Crime and Misconduct Commission v Chapman [2007] QCA 283.

In her decision delivered on 26 March, QCAT Deputy President Dann said that because the tribunal could inform itself in any way it considered appropriate, she decided to check the case using ChatGPT.

Judge Dann said the chatbot told her details including:

  • where the case could purportedly be found, although it did not in fact exist in any database of Australian or Queensland cases;
  • that the case was a significant decision of the Queensland Court of Appeal concerning the ability of a solicitor (Chapman) to practise law, and whether to grant a stay pending appeal;
  • that the Commission had decided to suspend Chapman based on alleged misconduct; and
  • that Chapman had sought a stay citing significant harm to him professionally and personally.

“This information is wrong: the case does not exist,” she said.

Judge Dann pointed to the Queensland Courts’ publication The Use of Generative Artificial Intelligence (AI): Guidelines for Responsible Use by Non-Lawyers, which applied in QCAT.

She said the guidelines set out, among other things, that “generative AI chatbots can make up fake cases, citations and quotes, or refer to legislation, articles or legal texts that do not exist, can provide incorrect or misleading information about the law or how it may apply in a particular case or get facts wrong”.

She said litigants before the tribunal were responsible for checking the accuracy of all information relied on or provided to the tribunal, including information from a generative AI chatbot.

“It is important that Ms LJY, and other litigants before the tribunal, understand that including non-existent information in submissions or other material filed in the tribunal weakens their arguments,” she said.

“It raises issues about whether their submission can be considered as accurate and reliable.

“It may cause the tribunal to be less trusting of other submissions which they make.

“It wastes the time of tribunal members in checking and addressing these hallucinations.

“It causes a significant waste of public resources.”

Judge Dann said it was generally recognised as appropriate to protect information about the health of patients and their treatment under section 66(2) of the Queensland Civil and Administrative Tribunal Act 2009 (Qld).

“Because the tribunal has made observations about the use of generative AI in submissions before the tribunal which may attract wider attention, to ensure the patient’s identity is protected, in this case it will extend the operation of that non-publication order to the practitioner, at least until further order,” she said.

Ultimately, Judge Dann decided it was appropriate to grant a stay of the conditions until the review application was determined.
