
Do you speak robot?

A few years ago I had the privilege of going back to my alma mater, Ipswich Grammar School, to speak to the legal studies classes there.

It was a very enjoyable experience, something I would happily do again; I recommend that if anyone gets the chance to do the same with their old school, they take it.

The students were highly engaged and interested in the law, and some had even taken time during their holidays to go into Brisbane to watch the courts in action. The only slightly unnerving thing about it was that the entire class had laptops, and quickly googled every case I spoke about to check if I was getting it right.

Back then, of course, I only had to stay ahead of how quickly they could type, and simply be sure I knew the cases I spoke about well enough to cope with what the students could glean from a quick read of a casenote. None of them were going to be able to read a long case in such a short space of time.

If I do it again, things will be a little hairier, as I will be competing with AI, probably the (in?)famous ChatGPT system. The students would be able to have the app listening in real time, and the answers it pops out would be far better researched than anything they could produce on their own. They could even put their questions to the app before class, and come along armed with a well-researched, footnoted and professional-sounding answer in an effort to trip me up.

In my case, of course, the only consequence of being caught out by AI would be being snickered at by a bunch of teenage boys, most of whom probably won’t go on to do law anyway, and a slightly bruised ego. I am pretty sure I would survive it.


For practising solicitors, however, there is a much greater downside. As anyone who has ever had a family law client being advised on the side by an experienced divorcee friend can tell you, having your advice second-guessed by a half-smart party is debilitating.

The client often gives far too much weight to the advice of their ‘learned’ friend, develops unrealistic expectations and engages in costly, time-wasting argument over the simplest things.

However, a client armed with ChatGPT and running their questions (and your answers) through it will be an entirely different kettle of fish, likely one which is far more troublesome. This is because this AI is capable of producing fairly sophisticated advices – at least to a layperson’s eyes – and presents them in a very confident and credible way, even when it is lying.

Yes, this sort of AI happily lies when it cannot get the answer it needs, and dislodging its conclusions from a client’s mind – say, for example, it inflates a client’s potential damages by a factor of five by looking up average payouts in a different jurisdiction – will likely prove very difficult. This will be even more problematic if you do miss something that the AI doesn’t, because it will interfere with the relationship of trust and confidence that is essential to the solicitor-client relationship.

So, what is to be done? First, get your head around this technology, its advantages and limitations, and be ready to discuss both with your client. This article by my colleague, David Bowles, is a good place to start, and many legal tech-focused websites have enough information to give you a general idea of what ChatGPT and its ilk can (and cannot) do. Just as doctors warn patients of the dangers of ‘Dr Google’, we will need to warn clients of the limitations of AI legal advice.

It is also worth experimenting with the product, which can be downloaded for free (although clear it with your firm's IT and security processes first), but bear in mind that the subscription version is far more capable; the free version gives only a taste of what the full product can do.


Finally, whatever you do, don't ignore this technology. Much of the commentary around it so far has followed the standard line on legal tech: that it will simply be a force multiplier for lawyers, not a replacement. That is most definitely NOT the case with ChatGPT.

Yes, it has its faults – it lies well, it makes mistakes and it is only as good as the data on which it is trained – but there are some legal tasks it will learn to do quite well. Solicitors need to be aware that it will impact on our work, and we need to be ready for it.

Shane Budden is a Special Counsel, Ethics, with the Queensland Law Society Ethics and Practice Centre.
