The weekly constitutional

The use of artificial intelligence in courts: a warning

A judge deployed an onerous sanction when lawyers seemed to use fake, AI-generated case law

May 08, 2025
Photo by Alex Segre / Alamy Stock Photo

Welcome to this week’s Weekly Constitutional, where a judgment or other formal document is used as the basis of a discussion about law and policy. This week’s legal texts are section 51(6) of the Senior Courts Act 1981, which gives civil courts the power to make wasted costs orders personally against the lawyers in a case, and the High Court case of R (Ayinde) v London Borough of Haringey.

Civil litigation—for example, where a person sues another person or challenges a public body—is largely a game of costs. Cases settle or parties capitulate not directly because of the merits of the case, but because of the costs risks for the parties of continuing. 

In England and Wales the general rule is that the winning party in a case obtains their legal costs from the losing party. This is called “costs following the event”. These costs are usually awarded on the so-called standard basis, where the winner gets about 70 per cent of their costs.

So when a different costs award is made, something significant has happened in the case. A judge may award no or reduced costs, or they may award indemnity (near 100 per cent) costs depending on the unwelcome behaviour of the parties in the litigation. The case reports when such alternative awards are given are often worth a look to see what went wrong. 

Then there are the cases where something very wrong has happened and the judge uses their “wasted costs jurisdiction”. This is when lawyers—not the parties—have done something so unwelcome that an award of costs is made against them personally. Such awards of wasted costs are rare, and the cases where they happen are exceptional.

And here we come to the unhappy case of R (Ayinde) v London Borough of Haringey, which would otherwise be a not unusual judicial review by a homeless claimant against an unhelpful local authority. 

Here the defendant council was hopeless, at least in the litigation. Indeed, Haringey was so useless—not even putting in its grounds of defence—the council was formally disbarred from defending the claim. The judge described the “wholesale breach of court orders” by the council.

Fortunately, as the case progressed the homeless claimant was given accommodation and so was no longer, as the judge described it, “street homeless”. For him, at least, the case had a satisfactory ending. And the merits of the claim appear to have been on his side, for as the judge said, “The submission was a good one. The medical evidence was strong. The ground was potentially good.” Justice was done. 

But what takes this matter from the common run of such cases to exceptional status as a wasted costs case is what the claimant’s lawyers—the solicitors and the barrister—did with their detailed legal submissions. For what they did was remarkable. 

In essence, the lawyers rested their legal submissions, in part, on five fake legal cases—including a Court of Appeal case. These cases had mundane names and proper-looking citations and looked very much like cases that would exist. But they did not exist. There are no law reports. The citations were false. The cases were invented.

And what makes this all the more extraordinary is that it seems there was no need to fabricate these cases. It would appear from the judgment that the legal points being made were so straightforward that other—existing—cases or other authorities could have been used. As the judge said in respect of one of the examples, “[t]he problem with that paragraph was not the submission that was made, which seems to me to be wholly logical, reasonable and fair in law, it was that the case of Ibrahim does not exist, it was a fake”.

Why invent fictional cases when real-life cases existed which would support the same point? This was not an example of a desperate advocate inventing a leading case on which an otherwise losing case would turn. This was instead an example where the invented cases served the same purpose as a genuine case.

We do not know for certain why this was done. The judge did not need to determine the motives, only to see the outputs, before making a wasted costs order. 

The local council, which mischievously tried to use the wasted costs order to offset all the costs awarded against it for its own litigation failings, suggested that the culpable lawyers may have relied on artificial intelligence. In other words, that the fake cases were not contrived by the lawyers themselves but were generated by ChatGPT or some other large language model (LLM) in response to research queries.  

The judge said he did not need to decide whether this was what happened, but it is the most plausible explanation given the facts, especially how the fake cases were used in the submissions. Indeed, it may also be the most charitable explanation.

Of course, AI should not be used for such serious legal research—that is, the legal research on which others will rely. The judge in this case said it would be negligent for a lawyer to use AI and not double-check the results, but it is better for AI not to be used at all.  

This is because the legal research for any case or advice—work of identifying rules and their exceptions, of applying the relevant laws, of relying on or distinguishing a precedent, and so on—is the very stuffing of law. It is not some task to be delegated by someone doing the job of lawyer: it is the job of the lawyer. Such a mental exercise is how one comes to properly understand and work with the law, and to know how it applies or does not apply in concrete instances.

But proper legal research is time-consuming and often expensive. Online legal information services are exorbitant in their fees or, if free, incomplete in their coverage. Proper law libraries are a luxury. Law firms and sole practitioners cannot afford to spend days in a law library. Clients are unwilling to pay for legal research because they understandably, if unreasonably, expect lawyers to know all of the law anyway. And using an LLM can feel much like using a legitimate search engine.

One can therefore appreciate the temptation for lawyers of using AI to do legal research, even if that temptation should be resisted absolutely. And now the English courts—in a case where no injustice was caused on the facts and the invented cases made no real difference to the outcome—have used one of the most powerful weapons in their judicial armoury as a sanction for lawyers against such use. Using AI in legal research is not only a waste of time, it can also now lead to wasted personal costs for the lawyers.