Fake cases, judges’ headaches and new limits: Australian courts grapple with lawyers using AI


The legal professional had a back injury and was pressed for time. Submissions were due for an immigration appeal hearing in which they were acting for a client contesting a decision by the immigration minister. Having heard that artificial intelligence could assist with legal work, they turned to ChatGPT, entering some relevant terms to generate a summary of similar cases. The summary sounded convincing, so they incorporated it into their submissions as written, without checking the details.

About two weeks later, the immigration minister's outline of claims identified 17 cases cited in the applicant's documents that did not exist.

The lawyer was deeply embarrassed and apologetic when the matter was raised in the federal court last year. But the minister took the strong view, reflected in the subsequent federal circuit and family court decision, that the use of generative AI was a matter of significant public concern, and that it was essential to set a precedent and "nip in the bud" lawyers relying on generative AI without checking their work.

The lawyer was referred to the New South Wales legal services commissioner for investigation.


As the use of artificial intelligence grows across many sectors, the full extent of its use in the legal profession remains unknown.

Thomson Reuters, a media company that sells its own AI software for lawyers, surveyed 869 private practice professionals in Australia last year. It found that 40% of them worked at law firms that were trialling AI, albeit cautiously.


The survey also found nearly one in ten lawyers were using AI regularly in their daily work, and about a third were interested in adopting a generative AI legal assistant.

‘Correct and ethical use’

A Melbourne lawyer was referred to the Victorian legal complaints body after admitting he had used artificial intelligence software in a family court case that generated false case citations, causing a hearing to be adjourned.

The solicitor, representing one party in a dispute between a married couple, submitted to the court a list of prior cases that the presiding judge had requested.

Neither the judge nor his associates could locate the cases on the list. When the matter returned to court, the lawyer confirmed the list had been prepared using the legal software Leap, which includes an artificial intelligence component.

Leap offers a facility for lawyers to verify the output produced by its AI, but in this case that step was not taken, and the technology "hallucinated" the citations.

Christian Beck, Leap's chief executive, says lawyers around the world are beginning to adopt AI technology.

“At Leap, we promote the right and fair use of our integrated AI products, and have introduced a range of measures to protect against misuse along with education and professional development,” Beck says.


“LawY offers users a complimentary verification service based on the expertise of experienced local lawyers. Confidentiality is protected by technology that is not connected to the training of large language models, so there is no risk of sensitive information being exposed.”

It is not only lawyers whose use of AI has attracted criticism. People presenting affidavits to the courts, or representing themselves, have also worried judges.

In one case last year, an offender provided a personal reference from his brother, which the Supreme Court of the Australian Capital Territory stated "strongly suggested" was written by a large-language model like ChatGPT.

The brother's affidavit stated that he had known his brother "both personally and professionally for an extended period", which the court said cast doubt on whether the brother had prepared the affidavit himself.

And in a Victorian supreme court of appeal decision concerning a student's expulsion from a university, it was noted that some documents lodged by the applicant "contained some case citations to cases that don't exist".

"I've left out those citations so they don't become a part of the problem of LLM AI creating unapproved case references," Justice Kristen Walker said.

An expert engaged by the Minnesota attorney general to address the topic of misinformation was criticised after it emerged he had relied on AI-fabricated article citations to support the state's case.

‘Still early days’

Up to this point, there've only been a few dozen complaints made to the various state bodies that regulate lawyers in Australia.


“It is still early days, and we anticipate more complaints may arise as generative AI becomes more widely used,” a spokesperson for the NSW legal services commissioner said.

The Queensland legal services commissioner, Megan Mahon, points out that technology is having a significant impact on how the public interacts with lawyers.

“Self-help might be one thing, but ensuring AI does not enable people who are not qualified, and not permitted to give legal advice, to keep doing so is a big worry,” Mahon says.

“It is really important that anyone engaging with a lawyer makes sure they are dealing with someone qualified and registered to give legal advice.”

On Monday, a practice note issued by the NSW Supreme Court limited the use of generative AI by lawyers, stating it cannot be used to generate affidavits, witness statements, character references or other matters presented in evidence or cross-examination.

Jeannie Paterson, a professor of law and director of the Centre for AI and Digital Ethics at the University of Melbourne, says errors may come from lawyers with fewer resources or less experience, which underscores the need for lawyers to be trained in the use of AI.

“I reckon this is an AI awareness issue as much as a lazy lawyer issue,” she says.

“Train people in where it's useful, because once you start using it in a conscious way, you realise it's actually not good at those sorts of things.”

“Our legal system is headed for a crash if we lack that level of literacy.”

The Victorian legal services board has flagged lawyers' misuse of AI as a major concern.

“It is crucial for lawyers to remember that it is their responsibility, not that of an AI system, to deliver accurate legal information,” a spokesperson said.

“One thing that sets a qualified legal professional apart from AI is the ability to exercise sound judgment, and to provide services with confidentiality and ethics in mind.”
