AI vs human judgment: why technology still needs a human hand

AI should take care of repetitive, automatable tasks so that lawyers can focus on strategy, empathy and judgment — the things machines can’t replicate, writes Marcus Dacombe, propositions marketing director at Access Legal

Marcus Dacombe, propositions marketing director at Access Legal

When I sat down with Francis Anderson, CEO at Net Law Media, for our recent webinar, I knew we were stepping into one of the most important conversations in the legal sector today: the role of AI. With 96% of UK law firms already using AI in some capacity, the question isn’t whether AI is here to stay — it’s how we use it responsibly.

As propositions marketing director at Access Legal, I’ve seen firsthand the excitement and the anxiety AI brings. Some see it as a silver bullet; others fear it’s the beginning of the end for human judgment. My goal in this discussion was simple: cut through the hype and explore where AI truly adds value, and where human oversight remains irreplaceable.

AI as a partner, not a replacement

One of the first points Francis and I agreed on is that AI isn’t here to replace us. History tells us that every technological leap, from typewriters to the internet, has sparked fear. But what happens? We adapt. AI is no different.

For me, the phrase that sums it up is this: ritualise the mundane, so we can humanise the exceptional. AI should take care of repetitive, automatable tasks so that lawyers can focus on strategy, empathy and judgment (the things machines can’t replicate).

Why human oversight matters more than ever

AI can process thousands of documents in seconds, but speed doesn’t equal accuracy. Bias, hallucinations and misinterpretation are real risks. We’ve all seen the headlines about fabricated case law being presented in court. That’s why I keep coming back to this principle: let the machine read, but let the lawyer reason.

Clients don’t pay for automation; they pay for judgment. If you can’t explain the reasoning behind a document, you’ve lost trust — which is everything in law.

Is using AI cheating?

This is a question I hear a lot. My view? No, if it’s used responsibly. Leveraging AI to review hundreds of pages of conveyancing documents isn’t cheating; it’s smart. But outsourcing your entire intellectual process to a machine? That’s a problem.

Francis made a great point during our conversation: if students write dissertations by simply prompting AI, they’re not learning. The same applies to law — AI should accelerate thinking, not replace it.

The real barriers to adoption

AI’s potential is huge, but adoption isn’t without challenges:

  • Skills gap: We need to upskill our people, but universities are only just starting to embed AI literacy.
  • Ethical concerns: Transparency is critical. Clients must know when AI is used.
  • Legacy tech: Some firms are still running systems from the 1990s. Integrating AI into that environment is a nightmare.

These aren’t reasons to avoid AI — they’re reasons to plan carefully and invest in governance.

Culture change: giving people permission to experiment

One thing I’ve learnt is that technology adoption is as much about culture as it is about code. Some firms encourage curiosity, while others cling to ‘the way we’ve always done it’. My advice? Create safe spaces for experimentation. Set guardrails, provide mentorship and make it fun. Fear thrives in the dark — shine a light on what AI can do.

What good governance looks like

If I had to boil it down to three principles, they’d be:

  1. Never copy-paste verbatim: Always review and contextualise.
  2. Train from the start: Embed ethical frameworks early.
  3. Be transparent: Clients deserve to know when AI is part of the process.

Governance isn’t about slowing innovation; it’s about protecting trust.

The human advantage

Here’s the truth: machines can process, predict and optimise — but they can’t be curious. They can’t challenge assumptions or make the creative leaps that humans do. That’s why clients buy into people, not platforms. AI can streamline delivery, but judgment, empathy and creativity remain uniquely human.

My key takeaways

  • Embrace change, don’t fear it.
  • Own the output — accountability is not negotiable.
  • Use AI as a tool, not a crutch.
  • Invest in skills and governance.
  • Remember: clients trust you, not the system.

As I said during the webinar, technology is there to interact with, not replace, your nuance. Clients trust the person, not the machine.

For more insights, watch the full webinar on demand or explore our AI for Law Firms content hub for practical guidance on using AI responsibly in legal practice.
