What AI governance actually means when you do not have a compliance team

[Image: A managing director and an operations lead at a meeting room table with a printed document and a laptop showing a spreadsheet, mid-conversation]
TL;DR

AI governance at SME scale is five business functions, not a department: policy, risk identification, day-to-day controls, monitoring and external accountability. In a 25-person firm those five are usually held by two people, the owner and the operations lead, with an external solicitor reviewing the policy and a fractional DPO if the data work is heavy. The artefacts are a two-page policy, a one-page register, a quarterly review note and an incident log.

Key takeaways

- AI governance is a set of disciplines, not a department. The five disciplines are policy, risk identification, operational controls, monitoring and external accountability.
- At 5 to 50 staff, two people usually cover all five. The owner sets policy and signs it. The operations lead runs the register, the controls and the quarterly review.
- The artefacts are minimal. A two-page policy, a one-page risk register in a spreadsheet, a quarterly review note and an incident log. Total writing time is a long afternoon a quarter.
- External help is bought in narrow slices. A specialist solicitor reviews the policy and flags UK GDPR exposure. A fractional DPO covers data protection if processing is heavy. An ICO-aligned consultant translates regulator guidance. None of them write the policy for you.
- Doing this is now a commercial requirement, not only a regulatory one. NHS Digital Technology Assessment Criteria, Microsoft supplier assurance and a growing share of enterprise RFPs in 2026 ask the same governance questions before a contract is signed.

The owner of a 22-person services firm has just come off a call with her largest client. He has asked her to send across the firm’s “AI governance framework” by the end of next week. She has not got one. She has seen the term in three contract amendments this year, her investor mentioned it on the last quarterly call, and her industry body has just put on a webinar about it. She closes the laptop. She is not sure whether what is being asked for is a policy document, a process, a software product, or all three.

This is where many owner-led SMEs land in 2026. The published material on AI governance was written for organisations with a Chief Information Security Officer, a Data Protection Officer, an internal audit function and a board risk committee, none of which a 22-person business has. What gets lost in translation is that the disciplines underneath enterprise governance still apply at smaller scale. They simply get compressed into the owner’s actual week with the help of one or two named people on the team.

What does AI governance mean in plain English?

AI governance is the set of disciplines a business uses to make sure its use of AI does not harm customers, staff, the firm, or the regulator’s view of the firm. At enterprise scale it produces committees, officers and policy stacks. At SME scale it produces a short document, a spreadsheet and a quarterly meeting. The disciplines are the same in both cases. The compression is what changes.

The point of governance is to be able to answer five questions in front of a customer or regulator without flinching. What is our stance on AI use. What could go wrong. What do we actually do day-to-day. Is any of it working. Can we prove it. A 22-person firm needs answers to those five just as much as a 22,000-person firm does. It does not need the same paperwork to produce them.

What are the five disciplines every SME using AI needs to cover?

There are five, and any framework worth reading is built on them. Policy and standards, the firm’s stance on what AI is acceptable. Risk identification, what could go wrong with the specific tools in use. Operational controls, the human review and access rules that turn policy into behaviour. Monitoring, the quarterly check that any of it is working. Accountability, the file you hand a customer or regulator that proves the other four exist.

NIST calls these Govern, Map, Measure and Manage. ISO/IEC 42001 wraps them in management system language. The ICO frames them through the lens of accountability under UK GDPR. The labels differ. The underlying functions do not.

What changes at SME scale is not which functions matter. Every one of the five is still needed. What changes is the size of the artefact each function produces and the number of people who hold it. Enterprises spread the five across departments. A 20-person firm holds them with two named people and a shared folder.

Who owns each one in a 20-person firm?

Two people, usually. Take a 25-person digital agency as a worked example. The owner sets and signs the policy, owns standards (what tools are allowed, what is prohibited, what disclosure looks like) and owns the governance file when a customer asks to see it. That is roughly four hours a year of her time once the policy is drafted.

The operations lead runs everything else. She maintains the risk register as a one-tab spreadsheet, keeps the incident log, runs the quarterly review and writes the one-page note that comes out of it. She is the person any staff member talks to when an AI output looks wrong or a tool starts behaving oddly. That is roughly fifteen to twenty hours a year once the rhythm is set.

A third person, almost always an external IT contractor or a technical advisor on retainer, is on call for the questions neither of them can answer. Does this new tool meet our standards, has the vendor changed its data handling, does our integration break anything we care about. The advisor sits outside governance ownership and acts as the firm’s specialist on speed-dial.

What artefacts does the function actually produce?

Four documents, kept in a single folder the owner can open in front of anyone who asks. A two-page AI use policy. A one-page risk register. A quarterly review note. An incident log. Total writing time across the year is a long afternoon a quarter once they exist. The point is having something to hand someone who asks, and something to read in the room when a tool starts behaving oddly.

The two-page policy covers acceptable use, prohibited use, required disclosure to clients, the firm’s UK GDPR posture and what happens when something breaks. The one-page register has a row per AI use case with columns for risk, likelihood, impact, existing control and named owner. The quarterly review note summarises what was reviewed, any incidents, any external changes and decisions taken. The incident log captures date, tool, what happened, what the firm did and what changed.
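For a firm that wants to start the register and incident log from a blank sheet, the column structure above can be sketched as two CSV files. This is a minimal illustration, not a template from any of the frameworks cited; the example register row is hypothetical, and the file names are placeholders.

```python
import csv

# Column headings taken from the article's description of the two artefacts.
REGISTER_COLUMNS = ["Use case", "Risk", "Likelihood", "Impact", "Existing control", "Owner"]
INCIDENT_COLUMNS = ["Date", "Tool", "What happened", "What we did", "What changed"]

# One hypothetical row to show the shape; a real register has a row per AI use case.
register_rows = [
    ["Client copy drafting", "Confidential data pasted into a public tool",
     "Medium", "High", "Approved-tools list; no client data in prompts", "Ops lead"],
]

with open("risk_register.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(REGISTER_COLUMNS)
    writer.writerows(register_rows)

# The incident log starts empty; rows are added only when something breaks.
with open("incident_log.csv", "w", newline="") as f:
    csv.writer(f).writerow(INCIDENT_COLUMNS)
```

In practice most firms will hold both as tabs in one shared spreadsheet rather than as files; the point is only that the column set is small and fixed.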

The four together are what the ICO calls accountability documentation, what an NHS Digital Technology Assessment Criteria review wants to see, and what a Microsoft supplier assurance questionnaire asks the firm to describe. They are also what an enterprise client RFP in 2026 increasingly demands before signing.

When should you bring in external help and what should you ask for?

At three points. First, when you draft the policy. A specialist UK solicitor in data protection and AI law should review it. The brief is not “write our policy”; the brief is “review this policy against UK GDPR and our sector regulator, flag what we need to change, flag what we need to document”. Two to four hours of solicitor time, usually £500 to £2,000.

Second, when the firm processes a significant volume of personal data through AI. A fractional Data Protection Officer on retainer (an ICO-aligned consultant on a few hours a month) is often cheaper than the executive time it would otherwise consume. The ICO publishes guidance on whether the firm legally needs a DPO at all, and the answer for many SMEs is no, but having a person to ring is still useful.

Third, when the firm is in a regulated sector or selling into one. NHS, FCA, SRA and ICAEW supplier expectations have all tightened. An advisor who has seen what those auditors actually look for is the difference between a smooth procurement conversation and three months of back-and-forth on a single contract.

What not to outsource. The policy itself, because it has to reflect the firm. The quarterly review, because it is the meeting where the owner and operations lead notice problems early. The risk register, because it has to be lived in to be useful. External advisors review what the firm has already done. Replacing the firm’s own thinking with a consultant’s deliverable defeats the point of governance.

Sources

- ICO. Guidance on AI and data protection. Sets out the UK accountability framework that any SME using AI for personal data must work within. https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/artificial-intelligence/
- NIST. AI Risk Management Framework 1.0 (2023). Source for the Govern, Map, Measure and Manage functions that the SME version compresses. https://www.nist.gov/itl/ai-risk-management-framework
- ISO/IEC 42001:2023 Information technology, Artificial intelligence, Management system. The international management system standard whose principles SMEs can align to without paying for certification. https://www.iso.org/standard/81230.html
- National Cyber Security Centre. Guidelines for secure AI system development. Sets out the AI security risks (data poisoning, model extraction, adversarial inputs) an SME should hold on its register. https://www.ncsc.gov.uk/collection/guidelines-secure-ai-system-development
- European Commission. EU AI Act consolidated text. Defines provider, deployer and high-risk system categories that UK SMEs serving EU customers must work to. https://artificialintelligenceact.eu/
- UK Government. A pro-innovation approach to AI regulation (2023, updated 2024). The UK's sectoral regulation framework that shapes how ICO, FCA, CMA and NCSC interact on AI. https://www.gov.uk/government/publications/ai-regulation-a-pro-innovation-approach
- ICO. The Data Protection Officer, guidance on when an organisation needs one. Source for the fractional DPO route open to many SMEs. https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/accountability-and-governance/data-protection-officers/
- NHS England. Digital Technology Assessment Criteria (DTAC). The supplier framework that bakes AI governance questions into NHS procurement. https://transform.england.nhs.uk/key-tools-and-info/digital-technology-assessment-criteria-dtac/
- Microsoft. Supplier Security and Privacy Assurance Program. The vendor assurance regime now asking SME suppliers to describe their AI governance. https://www.microsoft.com/en-us/procurement/sspa
- Solicitors Regulation Authority. Risk Outlook on the use of artificial intelligence in legal practice. Sector-regulator guidance UK legal services SMEs are now held to. https://www.sra.org.uk/risk/risk-resources/use-artificial-intelligence-legal-practice/

Frequently asked questions

Do I need an AI governance framework if I am a 20-person firm?

Yes, but not the version your enterprise client has. UK GDPR applies regardless of size, and so do sector regulators like the FCA, SRA and ICAEW. The shape that fits 5 to 50 staff is a two-page policy, a one-page register, a quarterly review and an incident log, owned by the owner and one operations lead. Total time investment is roughly 20 to 30 hours a year once it is set up.

Can I just buy a governance software platform and be done?

No. A platform is only as good as the discipline of the people using it, and at SME scale a shared spreadsheet with a quarterly review usually outperforms an unloved subscription. Software has a role once the firm crosses 50 to 100 staff and the register stops fitting on one page. Below that, the spend is rarely justified.

Who in a small firm should own AI governance?

The owner owns policy and accountability and signs the document. The operations lead owns the register, the controls and the quarterly review. A third person, usually a technical contractor or external IT advisor, is on call for tool-specific questions. That is the working unit. No committee, no AI ethics board, no dedicated officer.

This post is general information and education only, not legal, regulatory, financial, or other professional advice. Regulations evolve, fee benchmarks shift, and every situation is different, so please take qualified professional advice before acting on anything you read here. See the Terms of Use for the full position.

Ready to talk it through?

Book a free 30 minute conversation. No pitch, no pressure, just a useful chat about where AI fits in your business.

Book a conversation
