The MD of a 16-person professional services firm is six months into the rollout of an AI-assisted client management tool that ingests inbound emails, schedules calls, and drafts client follow-ups. The firm’s compliance adviser, brought in for an unrelated audit, asks: “Did you run a DPIA before deploying this?” The MD blinks. He has heard the term. He is uncertain whether he was supposed to do one. The compliance adviser explains: yes, almost certainly, and the firm has been operating without one for six months.
This is one of the most common AI governance gaps among SMEs. The Data Protection Impact Assessment is a legal requirement under UK GDPR Article 35 for many AI deployments, and most SME owners do not realise their deployment triggered it. The good news is that the SME-scale version of the document runs 4-6 pages and uses a free ICO template. Running it now is much cheaper than discovering the gap during an incident.
What is the legal anchor?
UK GDPR Article 35 requires a DPIA when processing is “likely to result in a high risk to the rights and freedoms of natural persons”. The ICO has published guidance and a list of processing types presumed to be high-risk; systematic AI processing of personal data appears in those categories. The Article 35 obligation has no size threshold; a 12-person firm and a 12,000-person firm face the same requirement when processing meets the criteria.
The ICO position has been stable since the AI guidance updates in 2023-2024. Where AI processing is systematic and involves personal data, expect a DPIA to be needed. Where the processing involves special category data under Article 9, the DPIA threshold is essentially automatic.
What does a DPIA actually contain?
Four sections. A description of the processing (what AI tool, what data, what purpose, who is involved). An assessment of necessity and proportionality (why the AI is the right means). An identification of risks to individuals (what could go wrong, how severe). A list of mitigation measures (what controls reduce each risk, who owns each control). The ICO template covers all four; the SME version fills 4-6 pages.
The MD signs the document. It lives alongside the AI policy and the risk register on the firm’s shared drive. Reviewed annually, or sooner if the deployment changes materially. Filed permanently as part of the firm’s UK GDPR audit trail.
What is the decision rule for whether you need one?
Three questions. Does the AI system process personal data of any kind? Is the processing systematic, meaning routine or repeated rather than one-off? Does the processing pose high risk to individuals, where high risk includes large-scale profiling, special category data, automated decisions with legal effects, or processing that affects vulnerable groups?
Two yeses and a probable third trigger Article 35. When in doubt, run one anyway. A 4-6 page document costs the firm a few hours of work; a regulatory finding that the firm should have done a DPIA and did not is a far more expensive position.
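The three-question test above can be sketched as a simple screening function. This is an illustrative sketch only: the parameter names and the threshold logic are this sketch's own framing of the rule, not ICO-published criteria, and a borderline result should default to "run one anyway".

```python
def dpia_likely_required(
    processes_personal_data: bool,
    is_systematic: bool,         # routine or repeated rather than one-off
    high_risk_indicators: int,   # count of: large-scale profiling, special
                                 # category data, automated decisions with
                                 # legal effects, vulnerable groups affected
) -> bool:
    """Screening sketch for the Article 35 trigger test (not legal advice)."""
    if not processes_personal_data:
        # The personal-data trigger is missing entirely, so the formal
        # Article 35 obligation is not engaged.
        return False
    # "Two yeses and a probable third": systematic personal-data processing
    # with at least one high-risk indicator points at a DPIA.
    return is_systematic and high_risk_indicators >= 1

# A law-firm-style deployment: personal data, systematic, profiling and
# confidentiality risk present.
print(dpia_likely_required(True, True, high_risk_indicators=2))   # True

# An internal-documentation tool with no personal data in scope.
print(dpia_likely_required(False, True, high_risk_indicators=0))  # False
```

The function deliberately errs toward `True`: one high-risk indicator is enough, mirroring the advice that a few hours spent on a 4-6 page document is cheaper than a regulatory finding.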
Worked example one: a law firm using AI to process client matter information
The firm has rolled out a contract-review AI that ingests client documents and extracts key terms. The data is personal (clients, third parties named in files) and carries professional confidentiality. Processing is systematic. The risk profile includes confidentiality breach, AI hallucination affecting client advice, and Article 22 exposure if AI output influences case strategy. DPIA required.
The completed DPIA documents the lawful basis, the SRA-aligned mitigations (mandatory human review, partner sign-off on advice), and the data privacy controls (DPA with vendor, training disabled, encryption at rest and in transit). Reviewed annually, with an interim review triggered if the firm extends the tool to new matter types.
Worked example two: a healthcare practice using AI for clinical documentation
The practice uses AI to draft clinical notes from clinician dictation. Patient health data is Article 9 special category. Processing is systematic. The Article 9 element pushes the risk threshold up automatically. DPIA required, and the bar is higher than for ordinary personal data.
The document covers patient consent processes, the lawful basis (typically explicit consent under Article 9), technical safeguards, and clinical-quality controls (clinician review of AI-generated notes before they enter the medical record).
Worked example three: e-commerce personalisation at scale
An e-commerce business uses AI to personalise product recommendations from customer purchase history and browsing behaviour. Personal data, systematic, large-scale profiling. Not Article 9 unless the products are health-related, but the profiling element typically meets the high-risk threshold under ICO guidance. DPIA likely required. The document covers the lawful basis (typically consent or legitimate interest), the privacy notice updates, the customer rights process (opt-out, access, rectification), and the AI vendor’s data protection controls.
Worked example four: manufacturing summarising internal process documentation
A manufacturing firm uses an AI tool to summarise internal process documentation and operating procedures. The documents do not contain personal data. The processing is systematic but the personal-data trigger is missing. DPIA not required. The firm should still maintain a tool inventory entry and consider the OWASP LLM risk categories, but the formal Article 35 obligation is not engaged.
Worked example five: brainstorming with no personal data input
A small business uses free ChatGPT to brainstorm marketing ideas. No personal data is input. The processing is occasional rather than routine. DPIA not required. The business should still have a policy and a data classification rule that keeps personal data out of free public tools, but Article 35 is not triggered for this use.
How does this fit with the rest of the SME governance stack?
The DPIA is one of three documents the SME maintains. The AI policy (2-3 pages) is the firm-wide rule set. The risk register (one page) is the active management tool. The DPIA (4-6 pages per material deployment) is the per-deployment legal requirement under Article 35. Together they cover the firm’s UK GDPR position on AI without requiring enterprise governance overhead.
If the firm you run has rolled out an AI tool that touches customer or client data, and you are uncertain whether a DPIA was required or whether one was completed, book a conversation.



