Federal Court GPN-AI: What It Means for Australian Legal Practice

The Federal Court of Australia published its Generative AI Practice Note (GPN-AI) on 16 April 2026. Signed by Chief Justice Mortimer, the document is the first binding framework governing AI use in Australian federal proceedings, and its practical implications extend well beyond practitioners who appear before the Federal Court.

What the Practice Note requires

The GPN-AI builds its framework around three core obligations.

First, anyone using AI in connection with proceedings must have a basic understanding of its capabilities and limitations, including that AI systems can produce outputs that are "not accurate, entirely fictitious or plainly wrong."

Second, AI use must not adversely affect the administration of justice. That obligation isn't new: it flows from existing professional duties. What the practice note adds is specificity about where AI creates heightened risk.

Third, practitioners must be able to disclose AI use when the Court requires it — what tool was used, how, and for what purpose.

The verification obligations

The practice note's most operationally significant provisions concern verification. Where AI has been used to prepare documents filed with the Court, the responsible lawyer must personally confirm that:

  • cited legal authorities exist and support the stated proposition
  • evidence referenced in submissions is in the materials before the Court and reasonably likely to be admissible
  • facts stated in pleadings are based on what the party reasonably considers can be proved
  • chronologies are accurate

The signature on the document carries that weight regardless of how the draft was produced. Filing a hallucinated, AI-generated citation in the Federal Court is now a breach of a specific, binding requirement.

The evidence provisions

For affidavits, witness statements, and expert reports, the requirements are more demanding. AI can assist with structure and drafting, but it cannot substitute for the deponent's own recollection, knowledge, or experience. Expert reports must contain the expert's own opinion and process of reasoning.

Where AI was used to summarise or analyse the information underlying a statement of fact or opinion, disclosure is required in the body of the document itself, at the start of the relevant section, specifying where and how AI was used.

Confidentiality: the underappreciated risk

The provision that deserves particular attention concerns confidential, suppressed, or privileged information. Entering such information into a standard AI tool, even inadvertently, may breach the implied undertaking that governs documents produced in litigation. The consequences apply even where sharing was not intended.

The practice note distinguishes between open AI tools, where information may become accessible to third parties, and closed or ringfenced systems. The risk is lower for genuinely closed environments. But the Court expects parties using closed AI tools to preserve the confidentiality of compulsory process documents with rigour, and to be conscious that using outputs from a closed tool for different purposes may itself breach the implied undertaking.

What this means in practice

The GPN-AI is part of a consistent pattern across Australian jurisdictions. NSW, Queensland, Victoria, South Australia, and the Fair Work Commission have all issued AI guidance in the past 18 months, landing in broadly the same place: verify output before it reaches the court, accept professional responsibility for everything filed, and be able to account for AI involvement if asked.

The Federal Court's practice note makes those obligations specific, binding, and enforceable in federal proceedings.

For law firms, the practical question is whether their AI workflows are auditable at the matter level. The Court assumes you can tell it what tool you used, how, and for what purpose. Practices that have built verification steps into their workflow as a default are positioned to answer that question. Those that haven't are now exposed in a way that didn't exist before 16 April.

A symposium is planned for later in the year, at which the standards are likely to be refined further as courts gather data on how AI is actually being used.

Habeas is a platform built from the ground up for Australian lawyers doing deep legal research. The platform grounds every output in verified Australian legal sources, with citations traceable to the source document. Book a demo.
