Senior judges fire warning over misuse of AI before courts and tell those in profession with leadership responsibilities to take practical measures to prevent it happening

Practical and effective measures must be taken by those within the legal profession with individual leadership responsibilities and by those with the responsibility for regulating the provision of legal services to ensure that artificial intelligence is not misused in litigation, a Divisional Court has warned.

The comments of the President of the King’s Bench Division, Dame Victoria Sharp, and Mr Justice Johnson came after the referral of two cases – Ayinde v London Borough of Haringey and Al-Haroun v (1) Qatar National Bank QPSC and (2) QNB Capital LLC – which arose "out of the actual or suspected use by lawyers of generative artificial intelligence tools to produce written legal arguments or witness statements which are not then checked, so that false information (typically a fake citation or quotation) is put before the court".

The Ayinde case arose from a judicial review claim brought by a man who had applied to Haringey Council for accommodation as a homeless person. The council applied for a wasted costs order because of the inclusion of five fake cases in the claimant's pleadings and the law centre's failure to produce copies of these authorities when asked.

The Divisional Court said: “The facts of these cases raise concerns about the competence and conduct of the individual lawyers who have been referred to this court. They raise broader areas of concern however as to the adequacy of the training, supervision and regulation of those who practice before the courts, and as to the practical steps taken by those with responsibilities in those areas to ensure that lawyers who conduct litigation understand and comply with their professional and ethical responsibilities and their duties to the court.”

The court added: “Artificial intelligence is a powerful technology. It can be a useful tool in litigation, both civil and criminal. It is used for example to assist in the management of large disclosure exercises in the Business and Property Courts. A recent report into disclosure in cases of fraud before the criminal courts has recommended the creation of a cross-agency protocol covering the ethical and appropriate use of artificial intelligence in the analysis and disclosure of investigative material. Artificial intelligence is likely to have a continuing and important role in the conduct of litigation in the future.”

This comes with an important proviso, however, the judges said.

“Artificial intelligence is a tool that carries with it risks as well as opportunities. Its use must take place therefore with an appropriate degree of oversight, and within a regulatory framework that ensures compliance with well-established professional and ethical standards if public confidence in the administration of justice is to be maintained. As Dias J said when referring the case of Al-Haroun to this court, the administration of justice depends upon the court being able to rely without question on the integrity of those who appear before it and on their professionalism in only making submissions which can properly be supported.”

The Divisional Court suggested that in the context of legal research, the risks of using artificial intelligence are now well known.

“Freely available generative artificial intelligence tools, trained on a large language model, such as ChatGPT, are not capable of conducting reliable legal research. Such tools can produce apparently coherent and plausible responses to prompts, but those coherent and plausible responses may turn out to be entirely incorrect. The responses may make confident assertions that are simply untrue. They may cite sources that do not exist. They may purport to quote passages from a genuine source that do not appear in that source,” it said.

“Those who use artificial intelligence to conduct legal research notwithstanding these risks have a professional duty therefore to check the accuracy of such research by reference to authoritative sources, before using it in the course of their professional work (to advise clients or before a court, for example). Authoritative sources include the Government’s database of legislation, the National Archives database of court judgments, the official Law Reports published by the Incorporated Council of Law Reporting for England and Wales and the databases of reputable legal publishers.”

This duty rests on lawyers who use AI to conduct research themselves or rely on the work of others who have done so, the court said.

This was no different from the responsibility of a lawyer who relies on the work of a trainee solicitor or a pupil barrister, for example, or on information obtained from an internet search.

“We would go further however,” the judges said. “There are serious implications for the administration of justice and public confidence in the justice system if artificial intelligence is misused. In those circumstances, practical and effective measures must now be taken by those within the legal profession with individual leadership responsibilities (such as heads of chambers and managing partners) and by those with the responsibility for regulating the provision of legal services.

“Those measures must ensure that every individual currently providing legal services within this jurisdiction (whenever and wherever they were qualified to do so) understands and complies with their professional and ethical obligations and their duties to the court if using artificial intelligence. For the future, in Hamid hearings such as these, the profession can expect the court to inquire whether those leadership responsibilities have been fulfilled.”

The Divisional Court said the two cases showed that promulgating guidance on its own was insufficient to address the misuse of artificial intelligence.

“More needs to be done to ensure that the guidance is followed and lawyers comply with their duties to the court,” it suggested, and said a copy of the judgment would be sent to the Bar Council and the Law Society, and to the Council of the Inns of Court.

“We invite them to consider as a matter of urgency what further steps they should now take in the light of this judgment.”