AI governance, ethics, and compliance solutions for businesses, governments, and innovators.

☞ Tackle Bias & Transparency Issues – Ensure fair, responsible, and accountable AI development.
☞ Future-Proof Your AI – Build AI systems that are ethical, compliant, and scalable.
Three expert-designed courses to help you master AI ethics, legal challenges, and policy innovations at your own pace.




AI Ethics & Governance Consulting
AI Risk Assessment & Bias Audits
Regulatory Compliance & AI Law Advisory
AI Governance Strategy Development
Government & Public Policy Advisory
Whether you're facing a legal challenge or planning ahead, our experts are here to help. Get personalized advice tailored to your needs: no pressure, just clarity.
Book a Free Consultation Today!
Let’s turn your questions into confident decisions.
Develop expertise in AI governance, compliance, and responsible AI practices. Our certification programs are designed for professionals, compliance officers, and businesses looking to lead in ethical AI.

As artificial intelligence continues to reshape the legal profession, one of the most important questions facing law firms today is where to draw the line between AI assistance and the formal “conduct of litigation.”
The 2025 High Court decision in Mazur & Stuart v Charles Russell Speechlys LLP (EWHC (KB), Sheldon J, 16 September 2025) serves as a timely reminder that even in an AI-augmented workplace, legal authorisation cannot be automated.
In this case, the High Court clarified a fundamental principle under the Legal Services Act 2007 (LSA):
being an employee of an SRA-authorised firm—or working under the supervision of an authorised solicitor—does not, by itself, entitle a person to conduct litigation.
The facts were straightforward. A debt claim was issued on behalf of the firm Charles Russell Speechlys LLP, but significant steps in the litigation were taken by a non-authorised employee who did not hold a practising certificate. The lower court initially accepted the arrangement and ordered costs against the defendants. However, on appeal, Sheldon J reversed this view, holding that:
Section 21(3) of the LSA defines who is regulated, not who is authorised to conduct reserved activities.
Only authorised or exempt persons (under sections 18 and 19/Schedule 3) may conduct litigation.
Supervision is not a substitute for authorisation.
The costs order (£10,653) was quashed, reaffirming the strict boundaries of fixed-costs rules in the Civil Procedure Rules (CPR Part 45).
For law firms embracing automation, this case carries a clear warning:
AI systems are not “authorised persons” under the LSA.
That means any AI process that files, serves, or signs pleadings without an authorised solicitor’s approval may expose the firm to liability under sections 14 and 16 of the LSA, which criminalise unauthorised practice and employer complicity.
Simply having “AI supervised by a solicitor” is not enough. The authorised solicitor must personally approve, sign, and take accountability for every formal litigation step. AI may assist in drafting, research, or workflow management—but cannot replace human legal responsibility.
To remain compliant while benefiting from AI, firms should embed strong governance mechanisms into their operations. A simple but effective governance checklist might include:
Publish a clear internal policy stating that AI tools and non-authorised staff may assist with, but may not conduct, litigation.
Define what counts as support (e.g., research, drafting, bundling) and what counts as conduct (e.g., issuing proceedings, signing statements of truth).
Every litigation task should have an Authorised Solicitor marked as Responsible and Accountable.
AI systems and support staff should only hold Supporting roles.
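One lightweight way to encode this role assignment is a task record that cannot be created without a named authorised solicitor. This is an illustrative sketch only: the class and field names are hypothetical, not features of any case-management product.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class LitigationTask:
    """A litigation step with RACI-style role assignments.

    Hypothetical schema for illustration: an authorised solicitor is
    Responsible/Accountable; AI tools and staff hold Supporting roles only.
    """
    description: str
    responsible_solicitor: str              # must name an authorised solicitor
    supporting: tuple = field(default_factory=tuple)  # AI tools / support staff

    def __post_init__(self):
        # Enforce the governance rule: no formal litigation task
        # exists without a named accountable solicitor.
        if not self.responsible_solicitor.strip():
            raise ValueError("Task requires a named authorised solicitor")

# Example: the AI assistant and paralegals support; the solicitor owns the step.
task = LitigationTask(
    description="Issue proceedings in debt claim",
    responsible_solicitor="J. Smith (SRA no. 123456)",
    supporting=("drafting-assistant-ai", "paralegal-team"),
)
```

Making the record immutable (`frozen=True`) means accountability cannot be silently reassigned after the task is created.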
Restrict e-filing and document submission rights to authorised solicitors.
Require digital sign-off prompts confirming authorisation before any formal submission.
Prevent AI from directly triggering submission or signature endpoints.
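The three controls above amount to putting a human-approval gate in front of every submission endpoint. A minimal sketch, assuming a hypothetical `file_with_court()` integration (real e-filing APIs will differ):

```python
from typing import Optional

def file_with_court(document: str) -> str:
    """Hypothetical e-filing endpoint (placeholder for a real integration)."""
    return f"submitted:{document}"

def submit_document(document: str, approved_by: Optional[str]) -> str:
    """Gatekeeper: refuses to file unless an authorised solicitor has
    explicitly signed off. AI pipelines call this wrapper and never
    reach file_with_court() directly."""
    if not approved_by:
        raise PermissionError("No authorised solicitor sign-off recorded")
    return file_with_court(document)

# An AI drafting pipeline may prepare the document...
draft = "Particulars of Claim (draft)"

# ...but submission fails without human authorisation:
try:
    submit_document(draft, approved_by=None)
except PermissionError:
    pass  # blocked, as the policy requires

# Only with explicit solicitor sign-off does filing proceed:
receipt = submit_document(draft, approved_by="J. Smith (SRA no. 123456)")
```

In a production system the same pattern would be enforced at the permissions layer (e-filing credentials issued only to authorised solicitors), not merely in application code.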
Maintain detailed logs of:
Drafts, approvals, and edits,
Who signed off and when,
Submission IDs for traceability.
Such evidence may be reviewed by COLP/COFA if questions of compliance arise.
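An audit record along these lines can be made tamper-evident by hashing each entry's contents, so later edits are detectable if compliance questions arise. The field names below are illustrative assumptions, to be adapted to the firm's own case-management system:

```python
import hashlib
import json
from datetime import datetime, timezone

def log_entry(action: str, actor: str, approved_by: str, submission_id: str) -> dict:
    """Build one audit record. The entry carries a SHA-256 digest of its
    own contents, so any subsequent modification breaks verification."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "action": action,            # e.g. "draft", "approve", "submit"
        "actor": actor,              # AI tool or support staff member
        "approved_by": approved_by,  # authorised solicitor who signed off
        "submission_id": submission_id,
    }
    entry["digest"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    return entry

record = log_entry(
    action="submit",
    actor="drafting-assistant-ai",
    approved_by="J. Smith (SRA no. 123456)",
    submission_id="EF-2025-00123",
)
```

Storing such entries append-only (or chaining each digest into the next entry) gives the COLP/COFA a verifiable trail of who drafted, who approved, and when.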
Train all team members on the “green” and “red” boundaries:
Green (Permitted): AI drafting, summarising disclosure, preparing bundles, scheduling, or research.
Red (Prohibited): Filing or serving claims, signing statements of truth, submitting formal applications, or making concessions.
All client-facing AI chat interfaces should clearly state:
“AI Assistant – not a solicitor. Legal advice is provided only by authorised professionals.”
This decision underscores the human-in-the-loop principle of responsible AI adoption.
It demands that firms build trust through accountability, not delegation to algorithms.
When integrated properly, AI can increase efficiency, improve accuracy, and free solicitors from repetitive tasks. But the legal authority to act in court or before tribunals will always rest with qualified human practitioners—not automated systems.
The Mazur & Stuart ruling reaffirms that technology cannot replace legal authorisation.
AI can streamline, support, and enhance workflows—but it cannot conduct litigation, file documents, or make legal decisions without a human lawyer’s signature and oversight.
Law firms that design governance around these principles—balancing innovation with compliance—will be best positioned to harness AI’s power responsibly and lawfully.
Legal Services Act 2007, ss.12–19, 21; Schedule 2 & 3.
R v AUH [2023] EWCA Crim 6; Baxter v Doble [2023] EWHC 486 (KB).
Civil Procedure Rules, Part 45, Sections VII–IX.
Sheldon J, Mazur & Stuart v Charles Russell Speechlys LLP (EWHC (KB), 16 Sept 2025).

Copyright 2025. Nexterlaw. All Rights Reserved.