
[Image: Golden scales of justice balanced over a digital interface with the letters "AI," symbolizing the intersection of artificial intelligence and legal compliance.]

AI, Law, and the Conduct of Litigation: Lessons from Mazur & Stuart v Charles Russell Speechlys LLP

October 24, 2025 · 4 min read


As artificial intelligence continues to reshape the legal profession, one of the most important questions facing law firms today is where to draw the line between AI assistance and the formal "conduct of litigation." The 2025 High Court decision in Mazur & Stuart v Charles Russell Speechlys LLP (EWHC (KB), Sheldon J, 16 September 2025) serves as a timely reminder that even in an AI-augmented workplace, legal authorisation cannot be automated.


1. The Case in Context

In this case, the High Court clarified a fundamental principle under the Legal Services Act 2007 (LSA): being an employee of an SRA-authorised firm, or working under the supervision of an authorised solicitor, does not, by itself, entitle a person to conduct litigation.

The facts were straightforward. A debt claim was issued on behalf of the firm Charles Russell Speechlys LLP, but significant steps in the litigation were taken by a non-authorised employee without a practising certificate. The lower court initially accepted the arrangement and ordered costs against the defendants. However, on appeal, Sheldon J reversed this view, holding that:

  • Section 21(3) of the LSA defines who is regulated, not who is authorised to conduct reserved activities.

  • Only authorised or exempt persons (under sections 18 and 19/Schedule 3) may conduct litigation.

  • Supervision is not a substitute for authorisation.

  • The costs order (£10,653) was quashed, reaffirming the strict boundaries of fixed-costs rules in the Civil Procedure Rules (CPR Part 45).


2. Why This Matters for AI-Driven Law Firms

For law firms embracing automation, this case carries a clear warning: AI systems are not "authorised persons" under the LSA.

That means any AI process that files, serves, or signs pleadings without an authorised solicitor’s approval may expose the firm to liability under sections 14 and 16 of the LSA, which criminalise unauthorised practice and employer complicity.

Simply having “AI supervised by a solicitor” is not enough. The authorised solicitor must personally approve, sign, and take accountability for every formal litigation step. AI may assist in drafting, research, or workflow management—but cannot replace human legal responsibility.


3. Embedding Governance into AI Workflows

To remain compliant while benefiting from AI, firms should embed strong governance mechanisms into their operations. A simple but effective governance checklist might include:

A. Policy Design

Publish a clear internal policy that:

  • States that AI tools and non-authorised staff may assist with, but may not conduct, litigation.

  • Defines what counts as support (e.g., research, drafting, bundling) and what counts as conduct (e.g., issuing proceedings, signing statements of truth).

B. Role Assignment (RACI Model)

Every litigation task should have an Authorised Solicitor marked as Responsible and Accountable. AI systems and support staff should only hold Supporting roles.
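As a sketch, this role check could be enforced in code before any litigation task is actioned. The function, role names, and identifiers below are illustrative assumptions, not drawn from any real case-management system:

```python
# Illustrative sketch only: a minimal RACI check for litigation tasks.
# All names and structures here are hypothetical.

def validate_assignment(task, assignments, authorised_solicitors):
    """Ensure the Responsible and Accountable roles for a litigation task
    are held by an authorised solicitor; AI tools and support staff may
    only hold Supporting (or Consulted/Informed) roles."""
    for person, role in assignments.items():
        if role in {"responsible", "accountable"} and person not in authorised_solicitors:
            raise PermissionError(
                f"{task}: {person} cannot be '{role}' without authorisation"
            )
    return True

# Example: an AI tool may assist, but only the solicitor is accountable.
authorised = {"j.smith"}
validate_assignment(
    "issue_proceedings",
    {"j.smith": "accountable", "ai_drafting_tool": "supporting"},
    authorised,
)
```

The point of the check is that it fails loudly: assigning an AI tool as Responsible raises an error rather than silently proceeding.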

C. Process Controls

  • Restrict e-filing and document submission rights to authorised solicitors.

  • Require digital sign-off prompts confirming authorisation before any formal submission.

  • Prevent AI from directly triggering submission or signature endpoints.
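A minimal sketch of such a pre-submission gate, assuming a simple in-memory record of digital sign-offs keyed by document ID (the names and data structures are illustrative, not a real e-filing API):

```python
# Hypothetical pre-submission gate: e-filing is allowed only if an
# authorised solicitor has personally signed off on this document.

AUTHORISED_SOLICITORS = {"j.smith", "a.jones"}

def can_submit(document_id, sign_offs):
    """Return True only if the recorded sign-off for this document
    came from an authorised solicitor; anything else is blocked."""
    signer = sign_offs.get(document_id)
    return signer in AUTHORISED_SOLICITORS

sign_offs = {"claim-001": "j.smith"}           # solicitor approved this draft
assert can_submit("claim-001", sign_offs)      # permitted
assert not can_submit("claim-002", sign_offs)  # no sign-off recorded: blocked
```

Because the AI pipeline never holds the credentials in `AUTHORISED_SOLICITORS`, it cannot trigger the submission endpoint on its own, which is exactly the separation the case demands.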

D. Audit and Evidence

Maintain detailed logs of:

  • Drafts, approvals, and edits,

  • Who signed off and when,

  • Submission IDs for traceability.

Such evidence may be reviewed by COLP/COFA if questions of compliance arise.
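One illustrative way to structure such a log entry, recording who approved each formal step, when, and under which submission reference (the field names here are assumptions, not a prescribed format):

```python
# Illustrative audit-trail sketch: each formal step records who approved
# it, when, and the submission reference, so a COLP/COFA can trace it later.
import json
from datetime import datetime, timezone

def log_event(log, document_id, action, approved_by, submission_id=None):
    """Append a timestamped, attributable entry to the audit log."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "document_id": document_id,
        "action": action,            # e.g. "draft", "approval", "submission"
        "approved_by": approved_by,  # who signed off
        "submission_id": submission_id,  # court/e-filing reference, if any
    }
    log.append(entry)
    return entry

audit_log = []
log_event(audit_log, "claim-001", "approval", "j.smith")
log_event(audit_log, "claim-001", "submission", "j.smith",
          submission_id="SUB-84512")
print(json.dumps(audit_log, indent=2))
```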

E. Training and Communication

Train all team members on the “green” and “red” boundaries:

  • Green (Permitted): AI drafting, summarising disclosure, preparing bundles, scheduling, or research.

  • Red (Prohibited): Filing or serving claims, signing statements of truth, submitting formal applications, or making concessions.

All client-facing AI chat interfaces should clearly state:

“AI Assistant – not a solicitor. Legal advice is provided only by authorised professionals.”


4. The Broader Implications

This decision underscores the human-in-the-loop principle of responsible AI adoption. It demands that firms build trust through accountability, not delegation to algorithms.

When integrated properly, AI can increase efficiency, improve accuracy, and free solicitors from repetitive tasks. But the legal authority to act in court or before tribunals will always rest with qualified human practitioners—not automated systems.


Conclusion

The Mazur & Stuart ruling reaffirms that technology cannot replace legal authorisation.
AI can streamline, support, and enhance workflows—but it cannot conduct litigation, file documents, or make legal decisions without a human lawyer’s signature and oversight.

Law firms that design governance around these principles—balancing innovation with compliance—will be best positioned to harness AI’s power responsibly and lawfully.


References

  1. Legal Services Act 2007, ss.12–19, 21; Schedules 2 and 3.

  2. R v AUH [2023] EWCA Crim 6; Baxter v Doble [2023] EWHC 486 (KB).

  3. Civil Procedure Rules, Part 45, Sections VII–IX.

  4. Sheldon J, Mazur & Stuart v Charles Russell Speechlys LLP (EWHC (KB), 16 Sept 2025).

AI in law, Legal Services Act 2007, Mazur v Charles Russell Speechlys, conduct of litigation, AI compliance, legal tech, SRA regulation, law firm automation, AI governance, authorised solicitor, AI in litigation, legal ethics, UK legal practice, AI policy design, responsible AI adoption

Dr. Siamak Goudarzi

Dr. Siamak Goudarzi is a globally recognized lawyer, AI consultant, and visionary leader in technology law. With a career spanning over 30 years, Dr. Goudarzi has continuously redefined the intersections of law, business, and technology. Holding a PhD in International Law from the University of Portsmouth, he has become a driving force in adapting legal frameworks to the rapid advancements in artificial intelligence.
