Case update 17 February 2026

Fake law: Barrister suing Dan Neidle relied on cases from ChatGPT


Demand for £8m supported by fake legal cases generated with AI tools – but that’s not the worst part of this shocking lawsuit

Last week, we were at the High Court with Dan Neidle, pushing back against a barrister who tried to silence him. And we also revealed that this barrister has been relying on fake legal cases.

Setu Kamal sued Dan for £8m in August 2025, after Dan said a tax avoidance scheme with Kamal named as “legal partner” was “nonsense” that would put anyone who used it in trouble with HMRC.

Kamal claimed that he had never advised on the scheme, and suggested that Dan’s defence was “untenable” (PDF) because of three legal cases. But all three of those cases were wrong.


“Aaronson v Channel 4” sounds real, but it is completely made up. Koutsogiannis v Random House is a real case, but Kamal’s summary of it was wrong. And the citation number – the reference courts use to identify a case – that Kamal gave for a third real case, Riley v Murray, actually pointed to an entirely different case: GC v AS, about a mother seeking to be reunited with her children in Libya.

When Dan spotted these problems, he wondered if Kamal had relied on ChatGPT to support his case. And when Dan asked ChatGPT about those cases, it made exactly the same mistakes.

In his skeleton argument, Kamal admitted that he “accepts responsibility” for “errors in that letter”, and insisted that “the letter was not intended to be used in court proceedings”.

According to the barrister Matthew Lee, there have been at least 43 other cases where lawyers have relied on bad law made up by AI.

Lawyers who rely on AI tools to support their arguments are putting justice at risk. Large language models are designed to generate text that sounds plausible, like “Aaronson v Channel 4” – not text that is true. What if Dan hadn’t spotted Kamal’s fake cases? What if the judge had relied on them to make a ruling?

And the courts have taken a dim view of lawyers who put their faith in ChatGPT. They have already asked the Bar Standards Board to investigate two barristers who used fake cases in court.

Kamal went on to suggest that our concerns over his fake cases “do not go toward the core issues arising” in his lawsuit. But his use of fake law is just one example of the oppressive tactics Kamal has used which we argue are intended to silence vital reporting.

Last September, HMRC named Kamal as the promoter of other tax avoidance schemes that they say “do not work” – schemes that put people at risk of “tax bills, interest and potential penalties” – and identified him as a risk to the public.

Dan’s article reflects his honest opinions, which are protected by law. We’re standing by him, and by every journalist who finds themselves threatened by an unreasonable lawsuit, because it’s essential that our courts defend our right to speak truth to power.