What AI actually looks like inside a 10 to 50 person law firm

TL;DR

The 10 to 49 attorney band moved fastest on AI in the past year, up 36 percent year on year (ABA 2024 Legal Technology Survey). The SRA's February 2026 guidance frames AI use under existing solicitor obligations: three use cases (contract review with risk flagging, generative legal research with citation verification, conflict-of-interest screening at scale) are deployable inside that framework today. The one binding red line is uploading confidential client data to public AI tools, which most malpractice policies will not cover.

Key takeaways

- The ABA 2024 Legal Technology Survey shows 30 percent of all firms now using AI tools (up from 11 percent in 2023). Firms with 10 to 49 attorneys increased adoption by 36 percent year on year, the largest absolute growth in any firm-size band tracked.
- UK firms are picking generic AI (ChatGPT, Gemini, Claude) over legal-specific platforms by a 2:1 ratio in the smaller-firm market (Clio 2025). The cost difference is small. The risk profile is not.
- The SRA's February 2026 guidance is permissive but binding on three things: written governance frameworks, explicit COLP responsibility for regulatory compliance when new technology is introduced, and client confidentiality safeguards on any tool that processes client data.
- Three use cases work at 10 to 50 attorney scale today: contract review with risk flagging (LawGeex 94 percent vs 85 percent lawyer accuracy on NDA risk spotting), generative legal research with mandatory citation verification, and conflict-of-interest screening at scale.
- The malpractice insurance gap is real. Most policies exclude breaches of confidentiality that aren't the result of negligent acts. Deliberately uploading client data to public ChatGPT may not be covered, because the insurer will argue it sits outside the policy.

A senior solicitor at a 25-person commercial firm has the SRA’s February 2026 compliance guidance on artificial intelligence open in one browser tab. In another, the pricing page for a contract review platform her insurer has flagged as compliant. On her desk, a sample licence agreement she’s been asked to review, the kind of routine task that fills the gaps between bigger matters.

The questions she’s working through are about three things at once. What is now permitted by the regulator. What is now expected of solicitors who choose to deploy these tools. What her firm’s malpractice insurer will and won’t cover if something goes wrong. The middle question is the one most firms in the 10 to 50 attorney band are working out for themselves right now, and the data says she is not alone.

What does the data say about firms your size?

The 10 to 49 attorney band is the part of the legal market that has moved fastest on AI in the past year. The American Bar Association’s 2024 Legal Technology Survey, published March 2025, shows firms in that band increased AI adoption by 36 percent year on year. Overall firm-level use sits at 30 percent, up from 11 percent in 2023.

The 100-plus attorney firms are at 46 percent, but their year-on-year change was smaller because they were already further along. Solo practitioners grew 55.5 percent, but from a much lower base. The 10 to 49 band is where the meaningful absolute growth happened.

The UK pattern looks slightly different. Clio’s 2025 Legal Trends Report shows 79 percent of legal professionals using AI generally, but only 40 percent using legal-specific solutions, down from 58 percent in 2024. Smaller UK firms are picking generic ChatGPT, Gemini, and Claude over specialist platforms like Kira, Luminance, and Thomson Reuters CoCounsel, mostly because the free tier is easy to start with. The numbers say the people in your firm are probably already using AI on at least some work. The question moving through the partnership now is which tools, on which work, with what oversight.

What does the SRA’s February 2026 guidance actually require?

The February 2026 SRA guidance frames AI use under existing solicitor obligations rather than as a new regulatory regime. The single line that matters most is “your client best interests must remain at the centre of your decisions about the use of technology.” Everything else flows from that: mandatory governance frameworks, leadership oversight, risk and impact assessments, training, monitoring.

The Compliance Officer for Legal Practice (the COLP) carries explicit responsibility for regulatory compliance whenever new technology is introduced. That includes AI. So if your firm is using contract review platforms, generative legal research tools, or conflict-of-interest screening software, your COLP needs the firm’s policy to record, in writing, who is using what, on which workflows, with what audit trail.

The guidance is permissive. It names what’s expected, not what’s forbidden. The harder edges sit in two places. Client confidentiality is binding, not soft. Solicitors must not feed sensitive client information into public cloud-based generative AI tools without explicit safeguards. And accountability for AI output stays with the solicitor; if the tool gets it wrong, the solicitor wears it. The accountability principle is what makes specific use cases either safely deployable or quietly exposed.

Three use cases that work at 10 to 50 attorney scale

The use cases that show measurable returns at this firm size are narrow and process-specific. Three are deployable inside the SRA framework today, because each operates on structured input data, integrates with existing workflows, and keeps the solicitor’s accountability intact: contract review with risk flagging, generative legal research with mandatory citation verification, and conflict-of-interest screening at scale.

Contract review is the most-cited example. A LawGeex study referenced in JOLT Richmond found AI reaching 94 percent accuracy on NDA risk spotting compared to 85 percent for experienced lawyers. The category-leading platforms (Kira Systems for clause extraction, Luminance for cross-language documents, Thomson Reuters CoCounsel for chained reasoning on complex review) are mature enough now that mid-sized firms can pilot them on a single workflow without a firm-wide rollout.

Generative legal research saves the non-billable time junior attorneys spend hunting for case law and statutes. Lexis+ AI, CoCounsel, and Relativity aiR retrieve relevant material and synthesise summaries in minutes. The mandatory step is that the lawyer validates every cited authority against primary sources, because generative AI can and does hallucinate citations. That isn’t a soft “best practice”; it’s the line between covered work and uncovered work.

Conflict-of-interest screening at firm scale is the third use case. As a firm’s client base grows, conflict checks become administratively heavy. AI tools that combine name matching, entity disambiguation, and graph databases against the firm’s matter history can flag likely conflicts faster, and reduce intake delays. At a 25-person firm, the time saved on intake also recovers time on billable matters.
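The name-matching core of that screening step can be sketched in a few lines of Python. This is an illustrative toy, not any vendor's implementation: the client names, the normalisation rule, and the 0.85 threshold are all assumptions for demonstration, and production platforms layer entity disambiguation and graph analysis on top of matching like this.

```python
from difflib import SequenceMatcher

def name_similarity(a: str, b: str) -> float:
    """Crude fuzzy match on normalised names; real platforms add entity disambiguation."""
    # Illustrative normalisation: lowercase, expand a common abbreviation, collapse whitespace
    norm = lambda s: " ".join(s.lower().replace("ltd", "limited").split())
    return SequenceMatcher(None, norm(a), norm(b)).ratio()

def flag_conflicts(new_client: str, matter_history: list[str],
                   threshold: float = 0.85) -> list[str]:
    """Return past parties whose names are close enough to warrant human review."""
    return [p for p in matter_history if name_similarity(new_client, p) >= threshold]

# Hypothetical matter history for demonstration only
history = ["Acme Holdings Limited", "Brightwell Legal Services", "Acme Holding Ltd"]
print(flag_conflicts("Acme Holdings Ltd", history))
# → ['Acme Holdings Limited', 'Acme Holding Ltd']
```

The point of the sketch is the shape of the problem: near-duplicate names slip past exact matching, so the flagged list goes to a human for the actual conflict judgment rather than being treated as an answer.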

Why might using public ChatGPT void your malpractice cover?

This is the specific compliance gate most firms underestimate. Most legal professional liability policies have exclusions for breaches of confidentiality that aren’t the result of negligent acts. Deliberately uploading a client matter document to public ChatGPT may not be covered, because the insurer will argue it wasn’t negligence, it was a deliberate act outside the policy’s scope.

The Utah Bar’s guidance on insurance coverage for lawyers using generative AI lays this out clearly. If a lawyer blindly accepts AI output without validation, an insurer can argue that no professional service was rendered, voiding coverage on that basis. If a lawyer uploads confidential client information to a public tool that retains it for training, the insurer can argue the breach was deliberate, voiding coverage on that basis.

The fix is to use platforms that come with proper data segregation, no training on client data, and a usable audit trail. Most specialist legal platforms have these features by design. Generic tools generally do not, unless the firm has specifically configured an enterprise contract that excludes training. The cost difference between a £500-a-month specialist platform and a free public tool is often less than a single hour of an associate’s billable time. The cost of a malpractice claim that the insurer declines to cover sits in a different category entirely.

What does the maths look like at typical billing rates?

The ROI window at this firm scale is 4 to 8 weeks for contract review at typical billing rates. A mid-level associate billing £300 an hour who spends 10 hours a week on routine contract review represents roughly £3,000 of weekly time the firm can either bill or redirect. A specialist platform at £500 to £2,000 a month pays back within the first month if even a modest share of those hours is actually recovered.

Conflict-of-interest screening produces a different shape of return. Faster matter intake means more billable matters per quarter. A firm that opens one or two additional matters per month because its conflict check now runs in minutes instead of days has paid for the platform several times over. The harder-to-quantify return is reputational. A firm that doesn’t miss conflicts because the system is exhaustive is also a firm that doesn’t face professional ethics inquiries.

The caveat sits in two places. First, the time savings only convert to revenue if the firm has elastic demand. A firm at full utilisation on partner time benefits more than a firm with bench capacity. Second, audit-trail rework, the time spent validating AI output and maintaining records of the AI’s decisions, consumes 20 to 30 percent of nominal time savings. Plan for that. The platform vendors’ headline numbers assume the rework is invisible. It is not.
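That rework discount is easy to fold into the payback arithmetic. A minimal sketch, using the figures above (£300 an hour, 10 hours a week, £500 to £2,000 a month) and taking 25 percent as an assumed midpoint of the 20 to 30 percent rework range:

```python
def net_weekly_gain(rate: float, hours_saved: float,
                    monthly_cost: float, rework_fraction: float = 0.25) -> float:
    """Weekly billable value recovered, net of audit-trail rework and platform cost."""
    effective_hours = hours_saved * (1 - rework_fraction)  # rework eats 20-30% of nominal savings
    weekly_cost = monthly_cost * 12 / 52                   # spread the subscription across the year
    return rate * effective_hours - weekly_cost

# £300/hr associate saving a nominal 10 hours a week, top-end £2,000/month platform:
print(round(net_weekly_gain(300, 10, 2000), 2))  # → 1788.46, still comfortably positive
```

Even at the top of the pricing band and the middle of the rework range, the weekly net stays well above the platform cost, which is why the 4 to 8 week window holds; the calculation only turns marginal if the recovered hours can't be billed or redirected at all.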

What is the actual next move for your firm?

The actual next move is to pick the highest-volume routine document workflow in the firm, run a 60-day pilot on a specialist platform with proper data segregation, and use that pilot’s audit trail as the basis for the firm’s wider AI policy. Concrete pilot, real numbers, written policy.

For a 25-person commercial firm, that is usually contract review. For a litigation-heavy firm, it might be document discovery or precedent research. The point is to deploy AI on one workflow with one platform, document the time saved against the time spent on rework, and let the COLP write the firm’s policy from real internal evidence rather than from the regulatory guidance alone.

The senior solicitor reading the SRA guidance with the contract platform pricing in another tab is doing the right work. The 36 percent year-on-year shift in the 10 to 49 attorney band tells her the question is no longer whether to deploy AI, but where, with what oversight, on what audit trail. Most of her peer firms are asking the same three questions she is.

If you would like to walk through this for your firm specifically, book a conversation.

Sources

  • ABA 2024 Legal Technology Survey, conducted October to December 2024, published March 2025: 30 percent of law firms overall use AI tools (up from 11 percent in 2023). Firms with 100+ attorneys at 46 percent. Firms with 10 to 49 attorneys increased AI adoption by 36 percent year on year. https://www.msba.org/site/site/content/News-and-Publications/News/General-News/ABAs_2024_Legal_Technology_Survey_Report_Trends_in_Online_Research.aspx
  • Clio 2025 Legal Trends Report: 79 percent of legal professionals use AI generally, only 40 percent use legal-specific solutions (down from 58 percent in 2024). 71 percent of solo practitioners; 87 percent of large law firms. https://www.2civility.org/2025-clio-legal-trends-report/
  • SRA February 2026 guidance on AI compliance for solicitors: governance frameworks, COLP responsibility, client confidentiality. https://www.sra.org.uk/solicitors/resources/innovate/compliance-tips-for-solicitors/
  • Utah Bar guidance on insurance coverage for lawyers using generative AI: malpractice insurance exclusions for non-negligent confidentiality breaches. https://www.utahbar.org/insurance-coverage-issues-for-lawyers-in-the-era-of-generative-ai/
  • JOLT Richmond on AI in contract drafting (2024): LawGeex study showing AI 94 percent vs experienced lawyer 85 percent on NDA risk spotting; Luminance 80+ language capability; Kira Systems clause extraction. https://jolt.richmond.edu/2024/10/22/ai-in-contract-drafting-transforming-legal-practice/
  • SmartDev on AI in professional services: Thomson Reuters CoCounsel chained reasoning on complex review tasks. https://smartdev.com/ai-use-cases-in-professional-services/

Frequently asked questions

What does the SRA's February 2026 guidance actually require of my firm?

It frames AI use under your existing obligations as a solicitor, not as a new regulatory regime. The mandatory pieces are written governance frameworks (leadership oversight, policies, training, monitoring), explicit COLP responsibility for regulatory compliance when new technology is introduced, and client confidentiality safeguards on any tool that processes client data.

Can I just use ChatGPT or Claude on client work to start?

Not on confidential client information without explicit safeguards. Most malpractice insurance policies have exclusions for breaches of confidentiality that aren't the result of negligent acts, and uploading client data to a public AI tool may sit outside coverage. Specialist legal platforms typically include the data segregation and audit trail you need for covered use.

Which use cases pay back fastest at a 10 to 50 attorney firm?

Contract review with risk flagging tends to pay back in 4 to 8 weeks at typical billing rates. A mid-level associate billing £300 an hour saving 10 hours a week justifies a £500 to £2,000 monthly platform from the first month if usage hits threshold. Conflict-of-interest screening pays back through faster matter intake.

Should we pilot one platform or evaluate the whole market?

Pilot one. Pick the highest-volume routine document workflow in the firm, run a 60-day pilot on a specialist platform with proper data segregation, and use that audit trail as the basis for the firm's wider AI policy. Concrete pilot, real numbers, written policy. Market-wide evaluation comes later.

This post is general information and education only, not legal, regulatory, financial, or other professional advice. Regulations evolve, fee benchmarks shift, and every situation is different, so please take qualified professional advice before acting on anything you read here. See the Terms of Use for the full position.

Ready to talk it through?

Book a free 30 minute conversation. No pitch, no pressure, just a useful chat about where AI fits in your business.

Book a conversation
