On September 15, the Supreme Court of Queensland released its guidelines on the use of generative AI for practitioners.
As a legal-specific AI platform (marytechnology.com), we naturally keep a close eye on these updates from the various jurisdictions we operate in.
From the vendor perspective, Queensland's guidelines are a timely and measured response to the challenges practitioners face amid the massive influx of available AI tools.
While the guidelines focus on AI chatbots and research tools (the most common AI legal applications), our view is that the seven principles can be applied across the broad spectrum of AI use cases.
The seven principles from the guidelines are:
- Understand AI and its limitations.
- Uphold confidentiality, suppression, and privacy.
- Ensure accountability and accuracy.
- Be aware of ethical issues.
- Maintain security.
- Take responsibility.
- Be aware that court users may have used AI tools.
Queensland has taken a more informational approach in comparison to New South Wales and Victoria, which opted to include stricter rules on certain AI risks.
For example, Victoria has encouraged disclosing the use of AI in the preparation of documents to the court and the other side, while New South Wales has outright banned the use of AI in the drafting of affidavits and witness statements, though it permits AI for grammar checking and for preparing chronologies from source documents.
As a legal-specific tech vendor, we support the measures taken by these jurisdictions. They act not just as a guide for litigators but also set appropriate boundaries for vendors while encouraging innovation.
In particular, ensuring accountability and accuracy is a design principle that is lacking in many of the available tools. Lawyers should be wary of accepting quick answers without the opportunity to inspect the underlying sources, or without understanding what alternative answers were available.
This risk is most apparent when reviewing large sets of evidence in litigation matters. Generalised tools are built to answer questions quickly from clean data sets. In litigation, source documents are varied, messy and unstructured. Skipping the task of reviewing source documents with a practitioner’s own eyes risks violating this crucial principle.
At Mary, we prioritise this principle by organising the documents (splitting, naming, categorising), extracting all available facts (chronologically and tagged) and highlighting the information lawyers care about faster, while maintaining the ability to immediately view originals alongside rationale for any AI analysis taken.
Mary is not a platform for glossing over review, but one that actively encourages it and gives lawyers confidence in meeting their own accountability requirements.
As the legal industry is still early in its adoption of AI, regulatory requirements will certainly evolve over time. However, Queensland's informative approach has laid down a durable set of principles that practitioners should become familiar with.
We would encourage all practitioners to consider these principles when evaluating AI tools, and to challenge vendors to ensure they not only comply with the guidelines but actively encourage responsible use.
Rowan McNamee is a former lawyer and co-founder of Mary Technology – a fact management platform that assists litigators by reducing the time taken to review and analyse evidence by 50-90 per cent.
Book a demo at www.marytechnology.com