Public, Internal, Confidential, Restricted: the four-tier rule that drives your AI policy

TL;DR

A four-tier data classification (Public, Internal, Confidential, Restricted) is the operational core of an SME AI policy. It maps types of data to types of tools and gives every employee a quick decision rule. Public goes anywhere. Internal goes to paid commercial tools with DPAs. Confidential needs explicit third-party consent or specialised vendor tooling. Restricted (UK GDPR Article 9 special category) typically does not leave the firm. One single-page table runs the firm's day-to-day AI decisions without escalation.

Key takeaways

- The four tiers: Public (intended for public release), Internal (used inside the firm, not for distribution), Confidential (provided by clients or third parties under contractual confidentiality), Restricted (UK GDPR Article 9 special category personal data).
- Tier-by-tier rule: Public goes anywhere including free public LLMs. Internal goes only to paid commercial tools with DPAs. Confidential needs third-party consent and a DPA. Restricted typically does not leave the firm at all.
- The Article 9 overlay covers health, biometric, racial/ethnic, religious, political, sexual orientation, criminal records. Stricter lawful-basis rules apply.
- The single-page tool-mapping table is the operational deliverable: tool names down the side, four data tiers across the top, yes/no/conditional in each cell.
- Worked examples per sector: legal firms with client matter notes, accountancy with payroll data, healthcare with patient records, e-commerce with customer addresses, marketing agencies with client briefs.
- The classification is human-applied at SME scale: no metadata tagging, no DLP tooling. The single-page reference is the tool.

The operations lead of a 28-person marketing agency is at her standing desk on a Tuesday morning. A senior strategist messages her. “Quick one, can I use ChatGPT to summarise the 60-page market research deck the new client just sent? It would save me three hours.” The deck is the client’s, marked confidential in the engagement letter, and contains supplier names and pricing that are not public. The operations lead types: “no”. The strategist replies: “OK. Why?” The operations lead realises she can give a one-line answer or a 10-minute answer, and the firm has no shared rule. Today she has to invent one.

That conversation is the canonical SME AI governance moment. An employee with a document open and a tool tab open. A live work problem. A 30-second decision needed. Without a shared rule, the answer becomes a personal judgment call by whoever happens to be asked, which is the slowest and least defensible way to run AI governance. The fix is a four-tier classification scheme that gives the strategist a one-line answer next time, before he has to ask.

What are the four tiers?

Public data has no confidentiality or privacy restrictions: published marketing content, public research, customer-facing descriptions. Internal data is used inside the firm but not for external distribution: internal emails, meeting notes, strategic plans, financial forecasts. Confidential data is provided by clients or third parties under contractual confidentiality: client matter notes, customer personal data, supplier contracts. Restricted data is special category personal data under UK GDPR Article 9.

These four tiers cover the full range of data an SME handles, from public marketing copy at one end to patient health records or criminal-conviction data at the other.

Which tier of tool can process which tier of data?

The mapping is the rule. Public data goes anywhere, including free public LLMs. Internal data goes only to paid commercial tools with Data Processing Agreements and training-disabled options. Confidential data requires the third party’s explicit consent and a DPA, or stays on specialised sector tooling or on-premise systems. Restricted data typically does not leave the firm at all.

A single-page reference table operationalises this. Tool names down the side, the four data tiers across the top, a yes-or-no-or-conditional in each cell. The table sits in the firm’s shared drive alongside the AI policy. An example:

| Tool | Public | Internal | Confidential | Restricted |
| --- | --- | --- | --- | --- |
| Free ChatGPT or Gemini | Yes | No | No | No |
| ChatGPT Plus or Claude Pro (paid) | Yes | Yes | No | No |
| ChatGPT Enterprise or Claude for Work | Yes | Yes | Conditional | No |
| On-premise / locally-run model | Yes | Yes | Conditional | Conditional |
| Sector vendor with DPA + client consent | Yes | Yes | Yes | No |
| Healthcare AI with HIPAA + patient consent | Yes | Yes | Yes | Conditional |

Conditional means the firm has done the specific work to make the cell a yes for that data class with that tool: client consent obtained, DPA signed, technical safeguards verified, and the use case approved by the MD.
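For firms that want the reference table in a machine-checkable form, the same mapping can be sketched as a small lookup, here in Python. The tool names and verdicts simply mirror the table above; this is an illustrative sketch, not a vendor assessment, and the function name `may_use` is a hypothetical choice.

```python
from enum import Enum

class Tier(Enum):
    PUBLIC = "Public"
    INTERNAL = "Internal"
    CONFIDENTIAL = "Confidential"
    RESTRICTED = "Restricted"

# One verdict per (tool, tier) cell, copied from the single-page table.
# "conditional" means the firm has done the extra work: consent, DPA,
# verified safeguards, MD approval.
TOOL_MAP = {
    "Free ChatGPT or Gemini": {
        Tier.PUBLIC: "yes", Tier.INTERNAL: "no",
        Tier.CONFIDENTIAL: "no", Tier.RESTRICTED: "no",
    },
    "ChatGPT Plus or Claude Pro": {
        Tier.PUBLIC: "yes", Tier.INTERNAL: "yes",
        Tier.CONFIDENTIAL: "no", Tier.RESTRICTED: "no",
    },
    "ChatGPT Enterprise or Claude for Work": {
        Tier.PUBLIC: "yes", Tier.INTERNAL: "yes",
        Tier.CONFIDENTIAL: "conditional", Tier.RESTRICTED: "no",
    },
    "On-premise / locally-run model": {
        Tier.PUBLIC: "yes", Tier.INTERNAL: "yes",
        Tier.CONFIDENTIAL: "conditional", Tier.RESTRICTED: "conditional",
    },
}

def may_use(tool: str, tier: Tier) -> str:
    """Return the table's verdict; an unlisted tool defaults to 'no'."""
    return TOOL_MAP.get(tool, {}).get(tier, "no")

print(may_use("ChatGPT Plus or Claude Pro", Tier.CONFIDENTIAL))  # no
```

Defaulting unknown tools to "no" reflects the policy's shape: anything not on the approved list needs escalation before use, not after.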

Why does the Article 9 overlay matter?

UK GDPR Article 9 identifies special category personal data: health, genetic, biometric, racial or ethnic origin, political opinions, religious or philosophical beliefs, sex life or sexual orientation, and trade union membership. Criminal conviction and offence data sits under the closely related Article 10 and is treated with the same strictness in practice. Article 9 requires a specific condition for processing this data, on top of the general lawful basis under Article 6. The threshold is materially higher than for ordinary personal data.

For SME purposes, the rule is simple. Article 9 data does not go into external AI tools unless there is a specific cleared use case with explicit consent and exceptional safeguards. Healthcare practices that handle patient health data are the most common SME context where Article 9 governs every day. HR functions handling employee health records and diversity data are the second most common. Both contexts deserve their own paragraph in the firm’s AI policy.

How does this work for a legal firm?

A solicitor receives a client matter involving a complex contract dispute. The engagement letter permits the firm to use technology tools for efficiency. The matter includes hundreds of pages of emails, contracts, and meeting notes. All of it is Confidential data under the four-tier scheme. The solicitor wants to use AI to extract key contract terms.

The reference table tells the solicitor: free ChatGPT is forbidden. ChatGPT Plus is forbidden. ChatGPT Enterprise might be permitted if the firm has obtained explicit client consent or the engagement letter covers AI use, and the firm has signed a DPA with OpenAI. A specialist legal-tech vendor with a DPA and built-in client-confidentiality features would be the cleaner choice. The decision takes 30 seconds with the table, where it would have taken a phone call without it.

How does this work for a healthcare practice?

A GP practice is exploring an AI tool that drafts clinical notes from clinician dictation. Patient health data is Restricted under Article 9. The reference table tells the practice manager: external cloud AI tools are forbidden by default for this data class. The exceptions involve specialist healthcare AI products with the relevant compliance credentials (UK GDPR and NHS data-security requirements for a UK practice; HIPAA is the US analogue), explicit patient consent, strong technical safeguards, and clinical sign-off on the deployment.

The practice manager pulls together the four-question due-diligence pass for the candidate vendor: medical device classification, clinical safety evidence, patient consent process, contractual terms. The decision takes a week of homework. The reference table is what tells the practice the homework needs doing.
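The four-question pass can be written down as a simple checklist, which is about as much tooling as the process needs at SME scale. A minimal sketch, with hypothetical names; the four questions are the ones listed above.

```python
# The practice manager's due-diligence pass, as a checklist. Names here
# (DUE_DILIGENCE, vendor_cleared) are illustrative, not a standard.
DUE_DILIGENCE = [
    "Medical device classification confirmed?",
    "Clinical safety evidence reviewed?",
    "Patient consent process in place?",
    "Contractual terms (including DPA) signed off?",
]

def vendor_cleared(answers: dict) -> bool:
    """A vendor is cleared only when every question is answered yes;
    a missing answer counts as no."""
    return all(answers.get(q, False) for q in DUE_DILIGENCE)

print(vendor_cleared({q: True for q in DUE_DILIGENCE}))  # True
print(vendor_cleared({"Patient consent process in place?": True}))  # False
```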

How does this work for a marketing agency or e-commerce firm?

A marketing agency uses AI to generate social media graphics and copy for client campaigns. Client briefs and strategy documents are Confidential. The reference table tells the agency: paid commercial AI tools with DPAs are the floor. AI-generated graphics that go into client campaigns are labelled or disclosed if the campaign context requires it. Client confidential information stays out of free public tools.

An e-commerce business uses AI to personalise product recommendations from customer purchase history. Customer data is Confidential under the scheme but stays outside Article 9. The privacy notice has been updated to disclose AI processing, the lawful basis is documented (consent or legitimate interest), and the AI vendor has a DPA. The reference table is what tells the firm to complete these three steps before deployment rather than discovering the gap afterwards.

If your firm currently answers “can I put this into ChatGPT?” with a 10-minute conversation each time, and you want to get that down to a 30-second reference table, book a conversation.

Sources

  • UK GDPR Article 9, special category data.
  • UK GDPR Article 6, lawful basis for processing.
  • HIPAA Business Associate Agreement provisions.
  • Solicitors Regulation Authority on professional confidentiality and AI.
  • ICAEW guidance on AI in accounting practice.
  • National Institute of Standards and Technology (2023). AI Risk Management Framework (AI RMF 1.0). Establishes measurement rigour and uncertainty quantification as core governance practice.
  • National Association of Corporate Directors (2025). AI Friend and Foe: Director's Handbook on AI Oversight. Foundational governance principles for board-level AI oversight, transparency, risk frameworks and stakeholder communication.
  • Information Commissioner's Office. Guidance on AI and data protection under UK GDPR. The UK regulator's reference for data-protection obligations applied to AI systems.

Frequently asked questions

What are the four data tiers and what data goes in each?

Public: information the firm or a third party has chosen to make public. Internal: information used inside the firm but not intended for external distribution (internal emails, meeting notes, financial forecasts). Confidential: information provided by clients or third parties under contractual confidentiality (client matter notes, customer personal data, supplier contracts). Restricted: special category personal data under UK GDPR Article 9 (health, biometric, racial/ethnic, religious, political, sexual orientation), plus criminal-conviction data under the closely related Article 10.

Which AI tools are allowed for which data tier?

Public data goes anywhere including free public LLMs. Internal data goes only to paid commercial tools with DPAs and training-disabled options. Confidential data needs the third party's explicit consent and a DPA, or stays on specialised sector tooling or on-premise. Restricted data typically does not leave the firm at all; the rare exceptions involve specialised systems with explicit consent and exceptional safeguards.

What does the Article 9 overlay add?

Article 9 of UK GDPR identifies special category data: health, genetic, biometric, racial or ethnic origin, political opinions, religious or philosophical beliefs, sex life or sexual orientation, and trade union membership (criminal conviction data is covered by the closely related Article 10). Article 9 requires a specific condition for processing this data, beyond the general lawful basis under Article 6. The practical rule for SMEs is simple: special category data does not go into external AI tools without explicit consent and exceptional safeguards.

How is the classification operationalised at SME scale?

A single-page reference table. Tool names down the left side, four data tiers across the top, yes or no or conditional in each cell. It lives alongside the AI policy and the risk register on the firm's shared drive. Employees consult it for the day-to-day decision: can I put this into this tool? The classification is human-applied at SME scale, with no metadata tagging or DLP tooling required.

This post is general information and education only, not legal, regulatory, financial, or other professional advice. Regulations evolve, fee benchmarks shift, and every situation is different, so please take qualified professional advice before acting on anything you read here. See the Terms of Use for the full position.

Ready to talk it through?

Book a free 30 minute conversation. No pitch, no pressure, just a useful chat about where AI fits in your business.

Book a conversation
