The Certification Trap: When Courts Mistake Technology Anxiety for Professional Reform

There is a quiet but unmistakable trend emerging across jurisdictions: courts are increasingly requiring attorneys to certify whether artificial intelligence was used in drafting a brief or filing. These rules are typically well-intentioned. They arise from legitimate concerns about fabricated citations and the lawyer’s non-delegable duty of candor. No serious practitioner disputes that obligation.

But the certification requirement itself reflects a deeper problem — one that the profession should confront with care. It risks elevating form over substance, codifying what is already governed by existing ethical rules, and signaling a misunderstanding of how modern lawyers actually work.

This is not an argument against responsibility. It is an argument against redundancy masquerading as reform.

The Duty Already Exists — And Always Has

Every lawyer admitted to practice understands that a signature on a pleading is not ceremonial. It is a professional representation to the court.

Federal Rule of Civil Procedure 11 and its state analogues already require that factual and legal contentions have evidentiary support and be warranted by existing law or a good-faith argument for its modification. Professional conduct rules independently impose duties of competence, diligence, and candor toward the tribunal.

If a lawyer files a brief containing nonexistent cases, the problem is not technological — it is professional. The misconduct occurs regardless of whether the source was a junior associate, a treatise, a misread case, or a software tool.

Courts have never required attorneys to certify that they personally verified every Westlaw headnote, Shepardized every citation without assistance, or double-checked every quotation produced by a research platform. The obligation has always been outcome-focused: the filing must be accurate.

Artificial intelligence does not alter that equation.

Technology Has Always Changed Legal Practice

The legal profession has navigated transformative technological shifts before. Electronic research displaced library stacks. Email replaced courier deliveries. E-filing eliminated paper dockets. Few would argue today that those changes diluted professional responsibility.

On the contrary, they enhanced efficiency while leaving the lawyer’s core duties untouched.

Artificial intelligence belongs in that same lineage. At its most practical level, it functions as an advanced research assistant — a tool capable of accelerating synthesis, improving clarity, and reducing administrative friction. Used properly, it allows lawyers to devote more time to judgment, strategy, and client counseling — the aspects of practice that cannot be automated.

The signature still means what it has always meant: the lawyer stands behind the work.

Requiring lawyers to certify whether AI was used implicitly suggests that the technology itself is suspect. That framing may unintentionally discourage transparency and thoughtful adoption, while doing little to prevent the very misconduct the rule seeks to address.

Bad lawyering is not a software problem.

The Expanding Culture of Procedural Formalism

The AI certification requirement also fits within a broader procedural trend familiar to most litigators. We certify service in an era when electronic filing generates instantaneous confirmation. We certify conferral before filing many motions, often attaching detailed statements describing efforts that rarely affect the court’s analysis. We navigate increasingly exacting formatting directives — font specifications, editable order requirements, and technical submission rules that can result in rejection despite full substantive compliance.

None of these requirements is indefensible in isolation. Administrative order matters. Courts must manage extraordinary caseloads with limited resources.

Yet collectively, they raise a legitimate question: at what point does procedural layering begin to obscure the judiciary’s central function — the fair and efficient resolution of disputes?

When a proposed order is returned because it was not submitted in precisely the preferred format, even though it is editable and accurate, the message received by practitioners is unmistakable. Process has begun to compete with substance.

The AI certification risks becoming another entry on that growing list.

Regulation Should Follow Risk — Not Anxiety

Courts are right to be attentive to emerging technologies. The legal system depends on trust, and any tool capable of producing convincing but incorrect information deserves scrutiny.

But durable regulation typically responds to demonstrated systemic risk, not isolated headline-grabbing incidents.

It is worth remembering that citation errors predate artificial intelligence by decades. The case reporters are filled with decisions correcting miscited authority, misquoted language, or misunderstood holdings. The profession addressed those failures through enforcement of existing rules, not by requiring lawyers to disclose whether they used a particular book or database.

If a filing is accurate, the method of research is largely irrelevant.
If it is inaccurate, existing sanctions are already available.

The focus should remain where it has always belonged: on the reliability of the work product.

The Generational Dynamic — and the Need for Dialogue

There is also a quieter dimension to this conversation, one that should be approached with respect rather than caricature. Many judges and rule-makers built their careers in an era defined by physical libraries and manual research. Their caution toward rapidly evolving tools is understandable.

But technological skepticism and technological fluency need not be adversaries.

The profession benefits when experience and innovation inform each other. Lawyers who came of age in digital environments should welcome reasonable guardrails; institutional leaders should remain open to the possibility that new tools can enhance — rather than threaten — professional standards.

The goal should not be resistance or blind adoption, but informed integration.

Competence, after all, is not static. It evolves with the tools necessary to practice effectively.

What Would a More Productive Approach Look Like?

If the concern is that some attorneys may rely on AI without verification, the answer is straightforward: emphasize verification.

Judicial education programs, bar guidance, and continuing legal education already provide mechanisms to reinforce best practices. Clarifying that lawyers must independently confirm citations and authorities — regardless of how they are generated — would strengthen competence without imposing another procedural hurdle.

Courts might also consider whether rules should remain technology-neutral whenever possible. Regulations tied too closely to a specific tool risk obsolescence as innovation continues.

The profession has long regulated conduct rather than instruments. That principle has served it well.

A Call for Measured Restraint

None of this suggests that courts should ignore technological change. Prudence is a judicial virtue.

But so is restraint.

The legal system functions best when it regulates what matters most: honesty, preparation, and professional judgment. When rules multiply faster than the risks they address, the danger is not merely inconvenience. It is the gradual normalization of compliance for compliance’s sake.

Lawyers do not improve their work because they check an additional certification box. They improve it because their name and their reputation appear on the signature line.

Artificial intelligence will continue to evolve. So will the practice of law. The challenge for the judiciary and the bar is not to halt that evolution, but to guide it without losing sight of first principles.

The lawyer is responsible. The signature is the certification. It always has been.

Before adding new layers of procedural obligation, the profession would do well to ask a simple question: does this requirement enhance the quality of advocacy, or merely document our discomfort with change?

The answer should shape the rules that follow.