Shaping the Future of Responsible AI

AI governance, ethics, and compliance solutions for businesses, governments, and innovators.

AI is powerful, but without governance, it's risky

As AI integrates into decision-making, businesses and governments face increasing regulatory scrutiny. Ethical AI is no longer optional; it is essential for compliance, trust, and innovation.

AI Regulations Are Here – Stay compliant with global AI laws (EU AI Act, US AI Bill of Rights).

Bias & Transparency Issues – Ensure fair, responsible, and accountable AI development.

Future-Proof Your AI – Build AI systems that are ethical, compliant, and scalable.

Master the Legal Future of AI

Three expert-designed courses to help you master AI ethics, legal challenges, and policy innovations at your own pace.

AI & Law Online Course

$27

Responsible AI Online Course

$27

Who Owns Intelligence Online Course

$27

WHAT WE OFFER

Comprehensive AI Governance Solutions

AI Ethics & Governance Consulting

Help your organisation design and implement ethical AI frameworks, from governance principles to board-level oversight. We assess your current AI practices, identify gaps against recognised standards, and deliver practical roadmaps for responsible development and deployment.

AI Risk Assessment & Bias Audits

Identify and mitigate risks in your AI systems through structured risk assessments and bias audits. We evaluate models and datasets for fairness, transparency, and accountability, and provide clear, prioritised remediation recommendations.

Regulatory Compliance & AI Law Advisory

Navigate evolving AI regulation with confidence. We advise on compliance with frameworks such as the EU AI Act and the US AI Bill of Rights, helping you interpret your legal obligations and embed them into development and deployment processes.

AI Governance Strategy Development

Develop a tailored AI governance strategy for your organisation, covering policies, roles, escalation paths, and oversight mechanisms. We provide templates, checklists, and implementation support so that governance scales alongside your AI adoption.

Government & Public Policy Advisory

Support policymakers and public institutions in shaping effective AI regulation. We advise on policy design, stakeholder consultation, and the practical implications of proposed rules for industry and the public.

Facing Legal Questions?

Let’s Find Clear Answers Together

Whether you're facing a legal challenge or planning ahead, our experts are here to help. Get personalized advice tailored to your needs: no pressure, just clarity.

Book a Free Consultation Today!

Let’s turn your questions into confident decisions.

Stay Ahead of AI Regulation Changes

Live Regulation Updates

EU AI Act Updates

New compliance deadlines for AI developers.

March 2025

US AI Bill of Rights

White House releases new AI accountability guidelines.

Feb 2025

China’s AI Regulation

Stricter guidelines for deepfake technology.

Jan 2025

Become a Certified AI Ethics Professional

Develop expertise in AI governance, compliance, and responsible AI practices. Our certification programs are designed for professionals, compliance officers, and businesses looking to lead in ethical AI.

AI Ethics in Action: Lessons from Real-World Cases


AI, Law, and the Conduct of Litigation: Lessons from Mazur & Stuart v Charles Russell Speechlys LLP

October 24, 2025 · 4 min read


As artificial intelligence continues to reshape the legal profession, one of the most important questions facing law firms today is where to draw the line between AI assistance and the formal “conduct of litigation.”
The 2025 High Court decision in Mazur & Stuart v Charles Russell Speechlys LLP (EWHC (KB), Sheldon J, 16 September 2025) serves as a timely reminder that, even in an AI-augmented workplace, legal authorisation cannot be automated.


1. The Case in Context

In this case, the High Court clarified a fundamental principle under the Legal Services Act 2007 (LSA): being an employee of an SRA-authorised firm, or working under the supervision of an authorised solicitor, does not by itself entitle a person to conduct litigation.

The facts were straightforward. A debt claim was issued on behalf of the firm Charles Russell Speechlys LLP, but significant steps in the litigation were taken by a non-authorised employee without a practising certificate. The lower court initially accepted the arrangement and ordered costs against the defendants. However, on appeal, Sheldon J reversed this view, holding that:

  • Section 21(3) of the LSA defines who is regulated, not who is authorised to conduct reserved activities.

  • Only authorised or exempt persons (under sections 18 and 19/Schedule 3) may conduct litigation.

  • Supervision is not a substitute for authorisation.

  • The costs order (£10,653) was quashed, reaffirming the strict boundaries of fixed-costs rules in the Civil Procedure Rules (CPR Part 45).


2. Why This Matters for AI-Driven Law Firms

For law firms embracing automation, this case carries a clear warning: AI systems are not "authorised persons" under the LSA.

That means any AI process that files, serves, or signs pleadings without an authorised solicitor’s approval may expose the firm to liability under sections 14 and 16 of the LSA, which criminalise unauthorised practice and employer complicity.

Simply having “AI supervised by a solicitor” is not enough. The authorised solicitor must personally approve, sign, and take accountability for every formal litigation step. AI may assist in drafting, research, or workflow management—but cannot replace human legal responsibility.


3. Embedding Governance into AI Workflows

To remain compliant while benefiting from AI, firms should embed strong governance mechanisms into their operations. A simple but effective governance checklist might include:

A. Policy Design

Publish a clear internal policy that:

  • States that AI tools and non-authorised staff may assist but may not conduct litigation.

  • Defines what counts as support (e.g., research, drafting, bundling) and what counts as conduct (e.g., issuing proceedings, signing statements of truth).

B. Role Assignment (RACI Model)

Every litigation task should have an Authorised Solicitor marked as Responsible and Accountable. AI systems and support staff should only hold Supporting roles.
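The RACI rule above can be enforced mechanically: reject any task matrix whose Accountable role is not an authorised solicitor. This is an illustrative sketch, not part of the original checklist; all task and role names are hypothetical:

```python
# Hedged sketch: a RACI matrix for litigation tasks, with a validation
# check that the Accountable role is always an authorised solicitor.
# Task names and role labels are invented for illustration.
RACI = {
    "issue_proceedings": {
        "responsible": "authorised_solicitor",
        "accountable": "authorised_solicitor",
        "consulted": ["counsel"],
        "informed": ["client"],
    },
    "prepare_bundle": {
        "responsible": "ai_assistant",       # AI may do the work...
        "accountable": "authorised_solicitor",  # ...but a solicitor answers for it
        "consulted": [],
        "informed": ["paralegal"],
    },
}

def validate_raci(matrix: dict) -> None:
    """Fail fast if any task's Accountable role is not an authorised solicitor."""
    for task, roles in matrix.items():
        if roles["accountable"] != "authorised_solicitor":
            raise ValueError(
                f"Task '{task}' must have an authorised solicitor as Accountable"
            )

validate_raci(RACI)  # raises nothing: both tasks are correctly assigned
```

A check like this can run whenever the matrix is edited, so a misconfigured workflow is caught before any litigation step is taken.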

C. Process Controls

  • Restrict e-filing and document submission rights to authorised solicitors.

  • Require digital sign-off prompts confirming authorisation before any formal submission.

  • Prevent AI from directly triggering submission or signature endpoints.
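The process controls above amount to a gate: no formal submission proceeds without an authorised solicitor's recorded sign-off. A minimal sketch of such a gate follows; the types and function names (`Staff`, `LitigationStep`, `submit_to_court`) are assumptions for illustration, not a real e-filing API:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Staff:
    name: str
    is_authorised_solicitor: bool  # i.e. holds a current practising certificate

@dataclass
class LitigationStep:
    description: str
    approved_by: Optional[Staff] = None  # set only by an explicit sign-off action

def submit_to_court(step: LitigationStep) -> str:
    """Block any formal submission that lacks an authorised solicitor's sign-off."""
    if step.approved_by is None or not step.approved_by.is_authorised_solicitor:
        raise PermissionError(
            f"Blocked: '{step.description}' lacks sign-off from an authorised solicitor"
        )
    return f"Submitted: {step.description} (approved by {step.approved_by.name})"

# An AI-drafted filing cannot be submitted until a solicitor approves it.
draft = LitigationStep("Particulars of claim (AI-assisted draft)")
draft.approved_by = Staff("J. Doe", is_authorised_solicitor=True)
print(submit_to_court(draft))
```

The key design choice is that the gate sits at the submission endpoint itself, so neither an AI system nor a non-authorised staff member can route around it.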

D. Audit and Evidence

Maintain detailed logs of:

  • Drafts, approvals, and edits,

  • Who signed off and when,

  • Submission IDs for traceability.

Such evidence may be reviewed by COLP/COFA if questions of compliance arise.
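One way to make such logs tamper-evident is to chain each entry to the hash of the one before it, so any later alteration is detectable. A minimal sketch using an in-memory list; the field names and actor labels are illustrative assumptions:

```python
import hashlib
import json
from datetime import datetime, timezone

def log_litigation_event(log: list, action: str, actor: str,
                         role: str, document: str) -> dict:
    """Append a tamper-evident audit entry: each record hashes the previous one."""
    prev_hash = log[-1]["hash"] if log else "0" * 64  # genesis entry
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "action": action,      # e.g. "draft_created", "signed_off", "submitted"
        "actor": actor,
        "role": role,          # e.g. "ai_assistant", "authorised_solicitor"
        "document": document,
        "prev_hash": prev_hash,
    }
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    log.append(entry)
    return entry

audit_log: list = []
log_litigation_event(audit_log, "draft_created", "ai_assistant_v2",
                     "ai_assistant", "claim_001.docx")
log_litigation_event(audit_log, "signed_off", "J. Doe",
                     "authorised_solicitor", "claim_001.docx")
```

In practice such records would be persisted to append-only storage; the sketch only shows the chaining idea that lets a COLP/COFA review verify the log has not been rewritten.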

E. Training and Communication

Train all team members on the “green” and “red” boundaries:

  • Green (Permitted): AI drafting, summarising disclosure, preparing bundles, scheduling, or research.

  • Red (Prohibited): Filing or serving claims, signing statements of truth, submitting formal applications, or making concessions.
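The green/red boundary can be encoded as an explicit allow/deny lookup so that an AI task router fails closed on anything unclassified. A hedged sketch; the task identifiers are invented for illustration:

```python
# Permitted support tasks (the "green" list from the training guidance).
GREEN_TASKS = {
    "drafting", "summarising_disclosure", "preparing_bundles",
    "scheduling", "research",
}

# Reserved litigation conduct (the "red" list): never delegable to AI.
RED_TASKS = {
    "filing_claim", "serving_claim", "signing_statement_of_truth",
    "submitting_application", "making_concessions",
}

def may_ai_perform(task: str) -> bool:
    """Return True only for explicitly permitted tasks; fail closed otherwise."""
    if task in RED_TASKS:
        return False
    if task in GREEN_TASKS:
        return True
    # Anything unclassified is escalated rather than silently allowed.
    raise ValueError(f"Unclassified task '{task}': escalate for compliance review")
```

Failing closed on unknown tasks mirrors the lesson of the case: when authorisation is in doubt, the answer is a human review, not a default "yes".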

All client-facing AI chat interfaces should clearly state:

“AI Assistant – not a solicitor. Legal advice is provided only by authorised professionals.”


4. The Broader Implications

This decision underscores the human-in-the-loop principle of responsible AI adoption. It demands that firms build trust through accountability, not delegation to algorithms.

When integrated properly, AI can increase efficiency, improve accuracy, and free solicitors from repetitive tasks. But the legal authority to act in court or before tribunals will always rest with qualified human practitioners—not automated systems.


Conclusion

The Mazur & Stuart ruling reaffirms that technology cannot replace legal authorisation. AI can streamline, support, and enhance workflows, but it cannot conduct litigation, file documents, or make legal decisions without a human lawyer's signature and oversight.

Law firms that design governance around these principles—balancing innovation with compliance—will be best positioned to harness AI’s power responsibly and lawfully.


References

  1. Legal Services Act 2007, ss.12–19, 21; Schedule 2 & 3.

  2. R v AUH [2023] EWCA Crim 6; Baxter v Doble [2023] EWHC 486 (KB).

  3. Civil Procedure Rules, Part 45, Sections VII–IX.

  4. Sheldon J, Mazur & Stuart v Charles Russell Speechlys LLP (EWHC (KB), 16 Sept 2025).

Tags: AI in law, Legal Services Act 2007, Mazur v Charles Russell Speechlys, conduct of litigation, AI compliance, legal tech, SRA regulation, law firm automation, AI governance, authorised solicitor, AI in litigation, legal ethics, UK legal practice, AI policy design, responsible AI adoption

Dr. Siamak Goudarzi

Dr. Siamak Goudarzi is a globally recognized lawyer, AI consultant, and visionary leader in technology law. With a career spanning over 30 years, Dr. Goudarzi has continuously redefined the intersections of law, business, and technology. Holding a PhD in International Law from the University of Portsmouth, he has become a driving force in adapting legal frameworks to the rapid advancements in artificial intelligence.


Stay Ahead with NexterLaw

Be the first to know about legal trends, expert tips, and our latest services. We deliver real value – no spam, just smart insights and useful updates. Subscribe to our newsletter and stay informed, protected, and empowered.

We respect your inbox. Unsubscribe anytime.

Copyright 2025 NexterLaw. All Rights Reserved.