ACT Chief Justice Lucy McCallum's Thoughts on AI in the Legal System

The Canberra Times


Details

Date Published
9 Feb 2025

Description

The ACT is set to adopt guidelines on AI use issued by the NSW Supreme Court after consultation with local judges and practitioners.

Summary

Chief Justice Lucy McCallum of the Australian Capital Territory has expressed strong concerns about the integration of artificial intelligence (AI) in the legal system. Her remarks highlight the potential pitfalls of using generative AI, including the risk of misinformation and the inadequacies of AI in performing human-centric legal tasks. The ACT is preparing to implement guidelines on AI similar to those of the NSW Supreme Court, which advise caution due to the limitations of AI, such as 'hallucinations' and bias. This initiative underscores the importance of maintaining human judgment and empathy in legal proceedings, while allowing AI to handle only minor, supportive roles. The ongoing dialogue within the Australian legal framework emphasizes the critical balance between embracing technology and safeguarding justice.

Body

The territory's top judicial officer has offered a warning about the pitfalls of generative artificial intelligence and its lack of humanity in dealing with complexities in the legal system. "To me, the prospect of receiving legal submissions generated by AI, let alone court itself becoming a virtual space, is chilling," Chief Justice Lucy McCallum said. "I cannot imagine a chatbot taking the place of advocates in this city, who are redoubtable, irrepressible and irreplaceable."

Much of the judge's January 28 speech to mark the commencement of the territory legal year revolved around AI and its looming shadow. "To resist the siren call of generative AI as the future of the legal profession, we must tie ourselves to the mast and stay a steady course, working harder and smarter to ensure that finite resources are stretched to make justice as accessible as it can be," she said.

Many believe AI-powered programs have already proven useful and efficient by, for example, handling menial tasks in law firms, summarising information, writing draft documents, and deciphering contract jargon. But it appears some kinks are still being ironed out inside courtrooms.

In 2024, a Melbourne lawyer was referred to the Victorian legal complaints body for submitting false case citations generated by AI software. More recently, another lawyer was referred to a comparable body in NSW after he used the popular AI platform ChatGPT to similarly create false case citations. Both apologetic practitioners admitted they had not double-checked the citations, which the software had hallucinated. In the ACT, Justice David Mossop called out what he believed was the "clearly inappropriate" use of AI to write a character reference in support of a man being sentenced for fraud. It was the role of counsel to inform the court if any tendered document had been written or re-written with the assistance of a large language model, the territory judge said in early 2024.
At the commencement of the latest law term, Chief Justice McCallum told a room filled with judicial officers, practitioners and politicians the territory would adopt guidelines for AI use recently published by the NSW Supreme Court.

Those guidelines warn of the "limits, risks and shortcomings" of AI programs, including the scope for hallucinations, misinformation, incomplete data, bias and inadequate safeguards around privacy. They bar practitioners from using AI to generate affidavits, witness statements or character references. Lawyers also must verify that any AI-generated references and citations in written submissions exist, are accurate and are relevant to the proceedings.

NSW Chief Justice Andrew Bell recently shared his own concerns about AI and its potential to be manipulated by companies producing software. "The task of judging in our society is a human one," he told the ABC.

A guiding practice direction for the ACT is set to be published after appropriate consultation with local judges and practitioners. But Chief Justice McCallum made clear her views on the limitations of AI and its inability to fulfill essential human roles in and outside the courtroom. "Generative AI will not sit with the parents of an offender who has been sent to jail, or explain to a litigant or victim of a crime why an appeal was allowed, or tactfully praise a cross-examination, or laugh at a counsel's jokes," she said. AI, she said, would not stay late in the court registry to make sure an accused person could sign their bail undertaking to be released from custody. "Or, like our wonderful sheriffs, somehow manage to make all who enter the court feel both protected and nurtured at the same time."

At the same ceremonial sitting, ACT Law Society president Rob Reis said: "AI is here to stay and we will all embrace this new world. We need not fear or be in dread of the advancements."
In December, the ACT Law Society told The Canberra Times it was working with other Australian jurisdictions to develop guidance around the use of AI ahead of an education conference in March. "The ACT Law Society would support the introduction of guidance on the use of generative AI technology in ACT court proceedings," it said.

In a recent mock trial held at SXSW Sydney, a lawyer was ruled victorious in a battle against AI to determine a traffic offence matter. Based on that experiment, it appears the role of human counsel may be safe for now. But the future is a guessing game, and the limits of AI use inside Australian courtrooms are unknown.