Australian Lawyer Penalized for Using AI in Law Case

Updated: September 4, 2025

Reading Time: 3 minutes

AI is transforming nearly every industry. However, a recent case shows that when it comes to the law, misusing AI can have real consequences. 

For the first time in Australia, a lawyer has been officially sanctioned for presenting AI-generated court documents filled with fake legal citations.

The Victorian Case

Image: Family law court, Australia (Source: The Guardian)

In July 2024, during a routine hearing in Victoria, a solicitor represented a husband in a family dispute. 

As part of court proceedings, he submitted a list of legal cases requested by Justice Amanda Humphreys. On the surface, the list looked legitimate. 

But upon review, the court discovered a major problem: none of the cases could be found in official legal records.

When pressed, the lawyer admitted that the list was prepared using legal research software powered by AI. 

He had not bothered to verify the accuracy of the citations before handing them over to the court. This oversight turned into a serious professional misstep.

The solicitor quickly apologized, telling the court he had not fully understood how the AI tool worked. 

He acknowledged that he should have double-checked the results and promised to “take the lessons learned to heart.” 

To cover the unnecessary costs caused by the error, he paid the opposing party’s legal fees for the wasted hearing.

Disciplinary Action

Despite the apology, Justice Humphreys referred the matter to the Victorian Legal Services Board (VLSB). 

The referral was made in the interest of maintaining public confidence in the legal system. After a months-long review, the VLSB issued its decision on August 19, 2025.

The solicitor’s practicing certificate was varied. This stripped him of his right to act as a principal lawyer. 

This means he can no longer run his own practice, manage trust accounts, or operate independently. 

Instead, he must now work as an employee solicitor under supervision for at least two years.

Both he and his supervisor will have to file quarterly reports with the board to ensure compliance.

A spokesperson for the board explained: “Our regulatory action in this matter demonstrates our commitment to ensuring legal practitioners who choose to use AI in their legal practice do so responsibly and consistently with their obligations.”

A Growing Trend

Since that hearing in 2024, more than 20 similar incidents have been reported in Australian courts.

They all follow the same familiar pattern: AI-generated documents containing fake case law or inaccurate information.

For example, lawyers in Western Australia and New South Wales have also been referred to their state regulators after similar errors came to light. 

In one unusual case, a litigant claimed that a document was created using ChatGPT. However, the court discovered it was dated before the tool was even released.

High Stakes 

Legal proceedings depend heavily on accuracy and precedent. A fake citation can mislead judges, waste valuable court time, and potentially harm clients’ interests. 

Unlike in other industries, errors in court are a matter of public record and can damage reputations permanently.

Imagine hiring a lawyer to represent you, only to learn that your case was weakened because of fabricated research. 

For many people, that could mean losing custody, property, or financial security. That’s why regulators and judges are treating these issues so seriously.

AI Responsibility

The Law Council of Australia acknowledges that AI will play a role in the future of law, although its leaders stress that it must be used with extreme care.

Juliana Warner, the Council’s president, explained that while fake citations are a “serious concern,” an outright ban on AI would be impractical.

“Generative AI is now widely used,” she said. “Banning it outright risks hindering innovation and limiting access to justice. 

But lawyers must keep front of mind their professional and ethical obligations to the court and to their clients.”

In other words, AI may assist with research, drafting, or streamlining tasks, but it cannot replace the professional judgment of a lawyer. Tools are only as reliable as the people who use them.

Lolade

Contributor & AI Expert