Artificial intelligence (AI) is increasingly used in accounting, valuation and forensic engagements supporting legal matters. While AI offers efficiencies, its use raises questions regarding confidentiality, independence, transparency and compliance with professional and judicial standards.
This article examines the implications of AI for accountants acting as expert witnesses, drawing on Accounting Professional and Ethical Standards (APES), International Valuation Standards (IVS), international commentary and the NSW Supreme Court’s Practice Note SC GEN 23.
A practical framework for responsible use is proposed.
1. Introduction
AI is now common in accounting workflows, from document review to data processing.
For accountants acting as expert witnesses in legal matters, however, AI use must be considered against the expert’s overriding duty to the court. Expert evidence must be independent, transparent and reliable.
Courts and regulators are responding to the risks associated with AI, and expert accountants must understand where the boundaries lie.
2. The Professional Standards Context: APES as the Foundation
Members of CA ANZ, CPA Australia and the Institute of Public Accountants must comply with APES Board (APESB) pronouncements, including APES 110 Code of Ethics, APES 215 Forensic Accounting Services and APES 225 Valuation Services.
These standards impose strict requirements regarding independence, objectivity, professional competence, due care, transparency and control over work performed under the member’s name.[1][2] The service-specific standards require disclosure of the methodology and sources of information.
The APESB considers such disclosure would generally include the use of AI tools and AI-generated information.[3]
AI does not dilute the professional obligations of expert accountants. Instead, it raises new questions about data security, control of work and whether an expert can demonstrate their reasoning.
In 2025, APESB acknowledged these concerns and commenced a review of its pronouncements to reflect the increasing use of AI.
3. Lessons from International Thought Leadership
3.1 Confidentiality Risks Identified by NACVA (US)
The National Association of Certified Valuators and Analysts (NACVA) warns that many public AI tools retain or reuse user inputs, exposing confidential or privileged material if uploaded to uncontrolled platforms.[4]
3.2 Ethical Use and Professional Judgment: ICAEW (UK)
The Institute of Chartered Accountants in England and Wales (ICAEW) highlights that AI can support work but cannot replace human judgment. Accountants must understand model limitations, maintain scepticism and ensure outputs are accurate and appropriate.[5][6]
3.3 International Valuation Standards (IVS)
The IVS Council requires valuers to maintain full control over valuation work and apply independent professional judgment. AI cannot produce an IVS-compliant valuation without human oversight.[7][8]
4. Judicial Developments: Implications of NSW SC GEN 23 for Accountants
The NSW Supreme Court’s Practice Note SC GEN 23 (2025) introduces strict rules for AI use in litigation.
4.1 Key Restrictions
Unless the Court grants leave:
- AI must not draft any part of an expert report.[9]
- AI must not alter, rephrase or influence an expert’s evidence.[10]
- Confidential or restricted material must not be input into AI systems unless confidentiality and training safeguards exist.[11]
4.2 Disclosure Requirements
If leave is granted, experts must disclose:
- Which report sections were produced using AI.[12]
- The prompts, parameters and outputs relevant to those sections.[13]
- Any applicable professional guidelines or policies.[14]
4.3 Permitted Uses
AI may still be used for administrative tasks such as chronologies, document indices and summaries,[15] provided confidentiality and accuracy are preserved.
5. A Practical Framework for Accounting Experts
Drawing on APES, IVS, international guidance and SC GEN 23, a practical approach is as follows:
5.1 Appropriate Uses
(Provided the Court grants permission, confidentiality is protected and human review occurs.)
- Summaries, indices, chronologies.
- Preliminary research.
- Data extraction, cleaning and transformation.
- Grammar and formatting checks.
- Administrative workflow automation.
5.2 Inappropriate Uses
(These undermine independence, confidentiality or admissibility.)
- Drafting substantive reasoning or conclusions.
- Using non-explainable models for analytical work.
- Uploading confidential or privileged material to public AI tools.
- Allowing AI to substitute for expert judgment.
- Relying on unverifiable or hallucinated outputs.[16]
6. Recommendations for Best Practice in AI-Enabled Expert Work
Expert accountants should adopt:
- Clear disclosure of AI use in engagement letters.
- Strict confidentiality controls regarding litigation material.
- Full retention of prompts, outputs and settings in work papers.
- Validation and testing of any AI-assisted analytical tools.
- Documentation of human judgment applied when reviewing AI material.
- Auditability consistent with APES and IVS.
7. Conclusion
AI is becoming integral to accounting practice, but its use in expert engagements requires caution. APES, IVS and SC GEN 23 all require transparency, control and independent judgment. AI may assist, but it cannot replace the expert. Responsibility for accuracy, reliability and integrity ultimately rests with the human practitioner.
1. APESB, APESB at a Glance (2024).
2. CA ANZ, Conflict of Interest Guide (2021), p. 9.
3. APESB, Technical Alert (Oct 2025).
4. NACVA, AI and Machine Learning Advisory Brief (2024), p. 2.
5. ICAEW, Artificial Intelligence and the Future of Accountancy (2018), pp. 1–3.
6. ICAEW, ibid, pp. 3–9.
7. IVSC, Perspectives Paper (2025), p. 5.
8. IVSC (2025), pp. 4 and 10.
9. NSW Supreme Court, Practice Note SC GEN 23 (2025), [20].
10. Ibid, [10–14].
11. Ibid, [9A].
12. Ibid, [22(a)].
13. Ibid, [22(b)].
14. Ibid, [22(c)].
15. Ibid, [9B].
16. Matthew Lee, AI Hallucination Case Tracker.