‘Advo-cat’ for the GPT Age: US lawyer faces sanction for filing a submission by ChatGPT

Who can forget ‘Cat Lawyer’, a middle-aged practitioner struggling with a Zoom filter, pleading with the court to believe that he was not a cat while his opponent watched with admirably deadpan satisfaction?1 

Technology in law practices can make fools of us all unless we learn how to use it. AI is no different; if anything, more so, given the ‘black box’ nature of the machine’s reasoning. If you wield it as a magic wand, there is no guarantee which end will be turning someone into a frog. 

An unfortunate New York lawyer has entered the Early Adopters’ Hall of Fame as the first lawyer2 (although probably not the last) facing sanction for mistaking Chatty’s authoritative air for actual knowledge of the law [Mata v Avianca Inc 22-cv-1461].  

He used ChatGPT to draft a submission, complete with case references and quotes from supporting authority. Sadly, six of the cases cited so confidently do not exist, and the judges quoted therein would be surprised at the opinions they were credited with expressing.  

Even if he used the time saved for a pleasant lunch, the aftermath will no doubt lead to digestive upset – the practitioner filing an affidavit expressing that [he] “greatly regrets having utilized generative artificial intelligence to supplement the legal research performed herein and will never do so in the future without absolute verification of its authenticity”. 

Some lessons for us all. First, as every student quickly learns, don’t cite cases you haven’t read. Secondly, don’t expect a generalist tool to reliably perform a specialist job. As a large language model, GPT-4 is far better at polishing an argument than researching one, as of early 2023 in any event.

Does this mean that lawyers should avoid using generative AI? Absolutely not. In fact, those who can’t or won’t will likely find themselves at a serious disadvantage in the near future. Even in their infancy, AI tools are useful for limited tasks. The real power of GPT and its competitors will not be unlocked until queries are passed on to specialist tools designed to undertake legal drafting and analysis. These are not far away, and once the wrinkles are ironed out they will be a significant productivity enhancer. 

But even then, we can only delegate some of the work and none of the responsibility. Our job is to thoroughly scrutinise all output, and exercise sufficient professionalism to ensure that if we don’t know the area of law we find someone who does.  

As a practical tip, GPT-4 (the paid version) is far more reliable than GPT-3.5, so invest the US$20 a month to get access to the latest tool. Also, if you specify in your prompt that you DO NOT want the response to include fictional citations, it is less likely to include them. This illustrates the tragedy of AI: you can’t assume it knows something a human would take for granted. 

To avoid professional consequences, every firm should consider how and when staff may use AI and make it clear that independent professional judgement can never be delegated. For a draft AI usage policy, see here: QLS Innovation Insights: template AI policy.

Thanks to the eagle-eyed Glen Carpenter at Spire Law for the tip on this unfolding legal story. We extend the invitation to all Proctor readers to forward your observations and any interesting developments you see. 

David Bowles is a Queensland Law Society ethics solicitor.

Footnotes
1 youtube.com/watch?v=lGOofzZOyl8.
2 Does anyone know of an earlier case? If so, please let me know. The lawyer who filed the motion did not actually draft it. However, as the signatory and advocate of record he is in the firing line, at least for now.
