Local Government Lawyer

The Upper Tribunal (Immigration and Asylum Chamber) has referred a barrister to the Bar Standards Board for presenting a bogus court decision generated using the ChatGPT artificial intelligence tool.

It did not, however, find sufficient grounds for contempt proceedings or police involvement.

In their decision, Mr Justice Dove and Tribunal Judge Fiona Lindsley found barrister Muhammad Mujeebur Rahman “directly attempted to mislead the tribunal through reliance on Y (China)”, a non-existent case.

The judges said the proceedings resulted from a false citation placed by Mr Rahman in the grounds argued in an immigration appeal.

This contained the following: “The Tribunal or decision maker placed undue weight on delay in isolation, contrary to Y (China) [2010] EWCA Civ 116, which requires consideration of personal circumstances, mental health, and overall context.”

The judges noted that Y (China) does not exist but that permission to appeal had been granted on limited grounds, including that undue weight had been placed on delay.

An error of law hearing then took place before the Upper Tribunal, at which a judge asked Mr Rahman to take him to the relevant paragraph of Y (China). It was noted that the citation given was in fact that of YH (Iraq), a Court of Appeal judgment that does not concern delay.

Mr Rahman then said he instead meant Beatson J’s judgment in R (WJ) v SSHD [2010] EWHC 776 (Admin), “although he was again unable to take the panel to anything in that case which bore on credibility assessments or section 8 of the 2004 Act”.

He then said he should have cited Bensaid v UK [2001] ECHR 82, “although he accepted that the ECtHR was unlikely to have said anything about a 2004 UK statutory provision in a decision which was made in 2001”, the tribunal noted.

After a lunch break, Mr Rahman said he had undertaken research using ChatGPT and that the citation for Y (China) was correct.

The tribunal directed Mr Rahman to provide a copy of Y (China) but he provided nine stapled pages “which were not a judgment of the Court of Appeal but an internet print out with misleading statements including references to the fictitious Y (China) case with the citation for YH (Iraq)”, the tribunal said. 

Judges said they had “concerns that Mr Rahman is conducting litigation in this matter when he has no licence from the BSB to do so [and] that he is holding himself out to be from a set of chambers, namely Lexminders Chambers when in fact he is simply a self-employed barrister”.

They noted Mr Rahman had been referred to the Bar Standards Board in January 2025 over concerns that he was conducting litigation without being authorised in another appeal and that he lacked basic professional competence.

Mr Rahman admitted that when he used ChatGPT he did not check with any reputable source of legal information that Y (China) was genuine.

When the tribunal queried the authenticity of Y (China), Mr Rahman “did not immediately admit to his unprofessional use of ChatGPT but astonishingly maintained that the case was genuine because of it having been evidenced by this AI large language model”.

Judges added: “Mr Rahman inconsistently and dishonestly pretended that he had intended to rely upon the genuine case of YH (Iraq) which he now accepts was not the case, and indeed that the passage he identified in this previous letter was to do with anxious scrutiny and not delay.”

They concluded that Mr Rahman did not know AI large language models, and ChatGPT in particular, were capable of producing false authorities, and so neither a police investigation nor contempt proceedings were appropriate.

But they said: “We do however conclude that this is a case where referral to a regulator, in this instance the BSB, is most definitely appropriate”, as Mr Rahman “did not ensure the accuracy of what was placed before the Upper Tribunal”.

Mark Smulian