Ayinde brought together two cases in which lawyers used generative AI (genAI) to produce written legal arguments or witness statements that went unchecked, with the result that false information was put before the court.
Clearly, lawyers, whether barristers or solicitors, need to keep their existing duties in mind. Under the Solicitors Regulation Authority’s (SRA) Code of Conduct, solicitors are under a duty not to mislead the court or others, including by omission (Rule 1.4). They are under a duty only to make assertions or put forward statements, representations, or submissions to the court or others which are properly arguable (Rule 2.4). Further relevant rules include the duty not to waste the court’s time (Rule 2.6) and the duty to draw the court’s attention to relevant cases … which are likely to have a material effect on the outcome (Rule 2.7). Most importantly, a solicitor remains accountable for the work (Rule 3.5).
The court has a range of sanctions available if a lawyer breaches the rules: public admonition of the lawyer, the imposition of a costs order or a wasted costs order, striking out a case, referral to a regulator, the initiation of contempt proceedings, and referral to the police where the court considers that warranted.
In the case of Ayinde, it was submitted that the threshold for contempt proceedings was not met, because counsel did not know that the citations were false.
Background of Ayinde
The case originated in a judicial review claim brought by Mr Ayinde, represented by Haringey Law Centre. Mr Victor Amadigwe, a solicitor, was the Chief Executive of Haringey Law Centre; Ms Sunnelah Hussain was a paralegal working under his supervision; and Ms Sarah Forey was the barrister instructed. Ms Forey settled and signed the grounds for judicial review, apparently prepared with the assistance of AI: the legal submissions misstated the statutory provisions of the Housing Act 1996 and cited five fictitious cases. The defendant’s legal team requested copies of the cases they could not find. In a wasted costs hearing, Mr Justice Ritchie said:
“I do not consider that it was fair or reasonable to say that the erroneous citations could easily be explained and then to refuse to explain them.”
Ritchie J then found that the behaviour of Ms Forey and Haringey Law Centre had been improper, unreasonable, and negligent. Before the Administrative Court, Ms Forey denied using AI tools to assist her with legal research and submitted that she was aware that AI is not a reliable source. She accepted that she had acted negligently and apologised to the court.
Ms Hussain and Mr Amadigwe also apologised to the court. Mr Amadigwe explained that it was not their practice to check what counsel produced.
Administrative Court findings
The Court said of Ms Forey’s explanations:
“Ms Forey could have checked the cases she cited by searching the National Archives’ caselaw website or by going to the law library of her Inn of Court. We regret to say that she has not provided to the court a coherent explanation for what happened.”
While the Court found the threshold for contempt was met, it determined that, given counsel’s junior standing and the fact that she had already been publicly admonished and reported to the Bar Standards Board, no further sanction was required. Mr Amadigwe was referred to the SRA, and Ms Hussain, as a paralegal acting under supervision, faced no punishment.
The lessons of Ayinde apply in the trade marks registry
The risks of relying on genAI for legal research were also demonstrated in a trade mark opposition appeal to the Appointed Person against a decision of the Registrar in the Intellectual Property Office (IPO). The grounds of appeal and skeleton argument of the appellant – for whom Dr Soufian appeared as a litigant in person – and the skeleton argument of the respondent, represented by Mr Caddy, a trade mark attorney, both raised questions about the use of AI.
The Appointed Person noted that the grounds of appeal referred to a number of authorities and included ‘quotes’ from each one; the cases were genuine, but the quotes cited in the grounds of appeal did not exist in those decisions. Dr Soufian’s skeleton argument similarly listed the cases relied upon, two of which had ‘complex (but incorrect) references’. These were accompanied by short summaries of the propositions for which each case stood; for three of them, the summary was held to have been a substantial misrepresentation of the case.
Upon questioning, Dr Soufian confirmed that ChatGPT had been used to assist with the grounds of appeal and the skeleton argument, and an unreserved apology was given for the noted inaccuracies. The Appointed Person observed that the arguments ChatGPT generated were ‘largely not relevant to the issues’ and were consequently unhelpful to the appellant’s position.
Turning to the respondent’s skeleton argument, three cases relied upon were genuine and were correctly cited. However, it was unclear that those cases stood for the propositions for which they were cited. The Appointed Person probed this during the hearing, and Mr Caddy was not able to point to the parts of the cited judgments that supported those propositions, despite being given additional time to do so.
In considering how to address the conduct on both sides, which fell clearly below the standard expected of litigants, the Appointed Person reviewed the findings in Ayinde and the decision in Olsen v Finansiel Stabilitet A/S [2025] EWHC 42 (KB), which considered the duties owed to the court by litigants in person, concluding:
‘[I]t is clear that litigants-in-person (however inexperienced) have a duty not to mislead the registrar or the Appointed Person by providing fabricated authorities.’
Litigants-in-person are given much greater latitude in the conduct of their case than those with professional legal representation. Honest mistakes and misunderstandings as to the authority for which a case may stand ought not to give rise to punishment. Fabricating citations, in contrast, may occasion sanctions, and ‘it does not matter whether fabrication was arrived at with or without the aid of generative artificial intelligence.’
Sanctions available for misconduct
The Appointed Person concluded that misconduct before the Appointed Person or the registrar of trade marks is unlikely to fall within the law of contempt.
The Appointed Person then considered sanction by way of a costs order. Neither the Appointed Person nor the registrar has the power to make a wasted costs order, nor to order costs against a representative of a party. Both the registrar (per rule 67 of the Trade Marks Rules 2008) and the Appointed Person (per rule 73(4)) may, however, ‘award any party such costs … and direct how and by what parties they are to be paid’.
Whilst the usual rule on costs before the registrar is that they are awarded on the published scale in force at the time, ‘off scale’ costs can be awarded where a party acts unreasonably. The Appointed Person observed that ‘[i]t is difficult to see a situation where the conduct of a party who has tried to rely on fabricated citations could be seen as anything but unreasonable.’ Accordingly, off scale costs should be the ‘starting point’ in such instances. In the appeal at hand, the Appointed Person awarded no costs to the respondent, despite the appeal being dismissed, by reason of Mr Caddy’s conduct.
Referral of a professional representative to a regulator, and admonition either publicly or in a decision, were also considered. Referring to Ayinde, the Appointed Person noted that trade mark attorneys owe duties similar to those which apply to lawyers. Considering the central principles of the Core Regulatory Framework adopted by IPReg in July 2023, he noted that ‘one or more of these duties will clearly be breached by a trade mark attorney who puts fabricated case citations before the registrar or the Appointed Person’.
The Appointed Person noted that the registrar has inherent jurisdiction to strike out or stay all or part of a case, concluding, however, that ‘the nature of proceedings before the registry and before the Appointed Person means that it is usually not cost-effective for a party to apply for a strike out in advance of the final hearing. Where a Hearing Officer or the Appointed Person is aware material is fabricated, it will be disregarded in any event whether or not it is formally struck out.’
The Appointed Person considered that the registrar ought to adopt a practice of warning parties of the risks of relying on genAI, observing that ‘a very clear warning needs to be given to make even the most nervous litigant aware of the risks they are taking’.
The future is clear: AI will be part of the administration of justice. What is equally clear is that there is proper concern about its use. There will likely need to be procedural requirements for disclosing its use, and users must take responsibility for the outputs they rely on. AI can be used safely in our justice system only with human oversight and accountability.
If you have questions or concerns about the use of AI in legal research, please contact James Tumbridge, Robert Peake and Ryan Abbott.