A Dallas lawyer has been sanctioned for secretly using generative AI to help write a sworn court filing — a move that caught the attention of Judge Ed Kinkeade of the U.S. District Court for the Northern District of Texas.

According to Bloomberg Law, the attorney represented Dr. Joy Wilson in a case against KIPP Texas Inc. and failed to disclose that ChatGPT had been used to draft part of her declaration.

The judge ruled that the lawyer violated both court disclosure rules and the duty of honesty under federal law.

The punishment wasn’t light — the attorney was ordered to pay KIPP Texas’s legal fees related to the incident and to complete two hours of continuing education on the ethical use of AI.

The court also noted that the AI-generated text included fabricated quotes, calling it a “serious breach of trust.”

What’s fascinating is how fast this kind of thing is becoming a pattern. Only a year ago, two New York lawyers were fined for citing fake cases created by ChatGPT.

Now, judges are preemptively issuing local rules requiring lawyers to disclose any AI assistance in filings. The problem isn’t the use of AI — it’s the secrecy. If you’re going to use it, be transparent.

The American Bar Association has already weighed in, advising that lawyers who rely on generative AI must ensure accuracy, confidentiality, and informed client consent.

Their recent ethics guidance emphasizes that attorneys can’t simply trust what the machine produces.

Across the pond, the same concern is growing; a UK judge recently warned that citing AI-generated, non-existent precedents could lead to contempt of court.

The irony is that most lawyers using AI aren’t trying to cheat — they’re just trying to save time.

But as The Verge reported, the pressure to work faster sometimes outweighs the duty to verify. It’s the digital version of cutting corners, except now the corner cuts back.

In my opinion, this Dallas ruling isn’t anti-AI at all. It’s pro-truth. The message is clear: AI can help, but hiding it can hurt your career.

Courts aren’t banning the use of technology — they’re demanding transparency. And that seems fair. After all, if justice depends on facts, we can’t let hallucinations write them.
