While the legal fraternity has used AI to work more efficiently, the technology is also prone to making up fictitious case law.
There’s a reason lawyers get paid as much as they do, and why good lawyers get paid even more.
Knowing decades of precedent, mastering the intricacies of each case and creatively weaving together those that support your argument — that takes years of graft. As in all industries, even though there are thousands of lawyers out there, there’s that special handful who have truly mastered the craft.
Then you get the rest of us who aspire to be at that level. Some work toward it through personal development. Others have discovered artificial intelligence.
As with any tool, you need to understand how it works for it to deliver the results you want. How to read a book is easy enough to grasp. How to search through an index, equally so. But it’s not the grandest of ideas to hand a lawyer a child’s picture book titled “How Birds Fly” and expect them to be able to land a 747. You’d be within your rights to expect some sort of pilot training first.
Similarly, you don’t just throw tech at a lawyer and say hey, your job, which you get paid above minimum wage to do, just got so easy that you can sit back and relax while Jarvis takes a break from building the next Iron Man to help your client settle their messy divorce.
It shouldn’t take a rocket scientist to realise that typing “write me an argument to win my case” and “I have a client who caught their partner cheating with the pool guy and wants to divorce and take the three kids. What are similar cases in my favour?” will yield different results.
It’s not that much of a leap to understand that the quality of your AI “research” is only as good as the information you give it. And if you don’t understand how AI works or, worse, understand that each AI is different and based on differently prioritised datasets, you’re not going to know how to ask it for what you want.
To its credit, the legal fraternity is very welcoming of AI as a tool and globally, it’s been adopted to create much more efficiency, especially in the mundane parts of the job. Is it prone to making up fictitious case law? Absolutely. And we’ve seen this in jurisdictions around the world.
You’ve got to admire the ballsiness of a lawyer who would approach a judge with a made-up case, but if we’re honest, it’s unlikely that they even knew what they were doing. Sounds right, looks right, everybody is claiming how awesome this tech is… let’s just submit what it gives me.
And yet, it is not right when it comes to case law. That’s not entirely surprising. It’s one thing to teach a bot to string a couple of emotional sentences together using a global database. It’s quite another to have it formulate a cohesive legal argument using jurisdiction-dependent and often elusive legal sources.
So if your lawyer gets busted by a judge for using AI to build your case unchecked, then of course you should be upset. But if your lawyer uses AI to gather a bunch of cases, then checks them and puts the argument together themselves, that’s just good sense.
Nobody expects a lawyer to build a time machine, go back to every case that they reference and witness the full proceedings first-hand. We have a tool to avoid doing that: the written record. Similarly, it’s not expected that you restudy an LLB every time you do legal research, and if AI is going to get you the result faster, it may even save you some fees.
All you need to demand of your AI-using lawyer is that they compile and check the final product, because it would be pretty upsetting if you’re sitting in court and the judge asks why you pleaded, “click here for more information”.