You are a managing partner at a 30-person law firm. The proposal in front of you is for an AI document review pilot. The fee is £18,000. There is no line about SRA guidance, no mention of DPIA work, no reference to data residency or audit trail. You ask the consultant about EU AI Act exposure. There is a pause.
The pause is the point. The proposal looks complete because the consultant has scoped what they want to do, not what your sector requires the engagement to include. In regulated sectors, that gap is real cost waiting to surface, and it surfaces whether the consultant priced for it or not.
Why is regulatory work real cost rather than a checkbox?
Regulatory compliance for AI in regulated sectors is not optional and not free. Each framework requires specific deliverables: a Data Protection Impact Assessment, an AI risk classification, audit trail design, model documentation, human oversight protocols, and ongoing monitoring. Each deliverable takes hours to produce and review, and the hours have to come from somewhere. If the consultant has not priced them, they will either come from your team’s time, your legal counsel’s bill, or a frantic round of remediation work after the regulator visits.
The EU Commission’s own impact assessment for the EU AI Act puts regulatory overhead at 17% of total AI spending. That is a real number, derived from compliance modelling, and it applies to any UK firm serving EU clients.
The five frameworks that touch UK SME AI engagements
Five regulatory frameworks routinely affect UK SME AI engagements. EU AI Act. UK GDPR via the ICO. SRA for legal firms. FCA for financial services. CQC for healthcare. Most regulated SMEs intersect at least two of these in any given AI engagement, and each one carries a cost line that should appear somewhere in the proposal or the pre-engagement scoping conversation.
The EU AI Act is the largest in absolute terms. Any UK firm serving EU clients falls under its extra-territorial reach. High-risk AI systems require a Quality Management System, conformity assessment, technical documentation, and post-market monitoring. The CEPS analysis derived from the EU Commission impact assessment puts QMS setup at €193,000 to €330,000 one-time plus €71,400 annual maintenance for high-risk systems. Most SME AI use cases fall short of the high-risk classification, but transparency obligations, prohibited-use checks, and AI literacy requirements still apply.
UK GDPR via the ICO is the second framework. Any AI processing of personal data triggers Data Protection Impact Assessment requirements, lawful basis review, and bias and fairness audits. ICO penalties under the higher tier reach £20 million or 4% of worldwide turnover. The DPIA work itself runs £2,500 to £10,000 depending on system complexity, and it has to be done before the AI is deployed, not after.
The SRA is the third for legal firms. The February 2026 SRA guidance covers bias in AI-assisted legal advice, conflicts of interest in AI use, disclosure to clients, and supervision of AI-generated work. The cost is in policy work, training, and updated client engagement letters, typically £5,000 to £15,000 across an SME firm.
The FCA is the fourth for financial services. AI in algorithmic trading, automated decision-making in consumer finance, and model risk management all fall under existing FCA guidance, with AI-specific updates issued between 2024 and 2026. Compliance work for an SME financial services firm typically runs £10,000 to £30,000 depending on use case.
The fifth, for healthcare, is the CQC together with NHS information governance. CQC GP Mythbuster 109 covers AI in GP services. NHS information governance guidance covers DPIA and lawful basis specifically for AI. Costs run £5,000 to £20,000 for an SME clinic, and the standards continue to tighten.
What “priced in” actually looks like
A proposal that has priced in regulatory work names the relevant frameworks, scopes specific deliverables, and either includes the work in the fee or flags it as a separate workstream with an estimated cost band. It does not say “we follow industry best practice.” It says specifically what the engagement will produce and what it will not.
The deliverables are concrete. A DPIA document. An AI risk classification with reasoning. Audit trail design for the deployed AI system. Model documentation that meets EU AI Act technical-documentation standards. A staff AI literacy plan. A handover to a named internal compliance owner who will run the ongoing monitoring after the consultant leaves.
A consultant who has done this before will name these deliverables on the second call without prompting. A consultant who has not done this before will either skip the regulatory section entirely or use vague language that signals they do not know what to deliver.
How to surface a missing regulatory line at proposal stage
You can surface a missing regulatory line without making the conversation adversarial. Three questions tell you what you need to know, and they apply across every regulated sector.
Ask whether the proposal includes a DPIA for the AI processing involved. The DPIA is the simplest test, because UK GDPR requires one for AI systems that process personal data, and a consultant who does not name it has not engaged with the regulatory side at all.
Ask which of your sector regulators’ AI guidance documents the engagement is built against. The SRA, FCA, CQC, and NHS each have specific AI guidance, and a consultant working in a regulated sector should be able to name the document and the specific clauses they are designing for. A consultant who cannot name the guidance is either operating from generic frameworks or has not read the sector-specific work.
Ask what the proposal assumes about EU client exposure. Most UK SMEs in professional services have at least some EU clients, which brings the engagement under the EU AI Act’s extra-territorial reach. A consultant who has thought about this will give you a clear answer about whether the engagement is scoped for EU compliance or not. A consultant who has not thought about it will say “we can look at that later” or “it probably doesn’t apply.” Either answer means the cost will surface later, when it is more expensive to fix.
The 17% rule
The EU Commission’s impact assessment puts regulatory overhead at 17% of total AI spending. As of 2026 the UK has no published survey of AI consulting compliance costs, so the EU figure is the one to use in budget planning until UK-specific data emerges. For a £25,000 AI engagement, 17% is £4,250 of compliance work that should be priced somewhere. For a £60,000 engagement, it is £10,200.
The number serves as a planning anchor rather than a definitive answer. Some sectors run higher (financial services and healthcare typically 20 to 25%), some run lower (lighter-touch professional services 10 to 15%). The point is that the line exists, the line has cost, and the buyer who has not priced it is going to pay it later, usually under more pressure.
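The planning arithmetic above can be sketched as a small calculation. This is an illustrative budgeting aid only: the 17% anchor comes from the EU Commission impact assessment cited above, and the sector rates are assumed midpoints of the bands mentioned in the text, not regulatory figures.

```python
def compliance_line(engagement_fee: float, overhead_rate: float = 0.17) -> float:
    """Estimate the compliance cost that should be priced into an AI engagement.

    Defaults to the EU Commission impact assessment's 17% overhead figure.
    Sector-specific rates are a planning anchor, not a definitive answer.
    """
    return round(engagement_fee * overhead_rate, 2)

# Assumed midpoints of the sector bands discussed in the text (illustrative).
SECTOR_RATES = {
    "financial_services": 0.225,      # 20-25% band
    "healthcare": 0.225,              # 20-25% band
    "professional_services": 0.125,   # lighter-touch 10-15% band
}

print(compliance_line(25_000))   # 4250.0  -- the £4,250 figure from the text
print(compliance_line(60_000))   # 10200.0 -- the £10,200 figure from the text
print(compliance_line(25_000, SECTOR_RATES["financial_services"]))  # 5625.0
```

The point of running the number per sector is that the same £25,000 engagement carries a materially different compliance line in financial services than in lighter-touch professional services, which is worth knowing before the proposal conversation, not after.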
If you would like to talk about how regulatory cost should be priced into an AI engagement in your sector, book a conversation.