Nath Solicitors offers a breakdown of the updated 2025 Judicial Guidance on AI from the Judiciary of England and Wales and the potential hazards of using AI.

Law firms are now using AI to assist with routine tasks, but its use can pose significant risks. Recent cases such as Ayinde v London Borough of Haringey and Al-Haroun v Qatar National Bank, coupled with the updated 2025 Judicial Guidance on AI from the Judiciary of England and Wales, underscore the risks associated with relying on AI without proper oversight.

The Updated Judicial Guidance

The updated judicial guidance warns that any use of AI must protect “the integrity of the administration of justice.” It highlights that AI output can contain fabricated cases, inaccurate quotations, or references to non-existent legislation.

Judges, and by extension solicitors, are reminded to verify accuracy and to avoid entering confidential information into public AI chatbots. The guidance also warns that bias in AI output may make these tools unsuitable for legal research or drafting.

Case Studies

Ayinde and Al-Haroun demonstrate the consequences of ignoring these warnings. In Ayinde, a barrister prepared grounds for judicial review citing five invented cases, allegedly generated by AI, and the instructing solicitor failed to check the authorities relied upon.

The High Court imposed a wasted costs order and referred the barrister to the Bar Standards Board and the solicitor to the Solicitors Regulation Authority. Likewise, in Al-Haroun, fabricated and mis-cited cases in a witness statement attracted judicial criticism and a referral to the Solicitors Regulation Authority.

Impact on Law Firms

Using AI in legal proceedings carries real risk if not handled correctly. Where an AI output presents inaccurate or biased information, a solicitor who relies on it risks complicity in misleading the court, which could result in sanctions such as a referral to the regulator or a costs order. Entering client information into AI tools could also compromise client confidentiality.

Conclusion

The refreshed 2025 Judicial Guidance reminds us to use AI with extreme care, even though it can be a valuable tool for law firms. The case studies above clearly illustrate the serious professional repercussions of relying on AI-generated material without sufficient oversight.

Law firms need to verify all AI outputs for accuracy, keep confidential data out of public AI systems, and ensure human oversight remains central to the legal process. Used responsibly and transparently, AI need not expose a law firm to these risks.

Why Choose Nath Solicitors

At Nath Solicitors, we have significant experience in a wide range of legal contexts. If you need advice or assistance, please contact us on 020 3983 8278 or get in touch with the firm online.

Contact Us

Get in touch with us by email or telephone and one of our team will respond to you promptly.

enquiries@nathsolicitors.co.uk

020 3983 8278

Opening Hours

Mon – Fri 9am-5pm