The 30-minute monthly AI governance cadence for a 10-person business

An MD at a meeting table with an adviser opposite, a laptop showing a calendar with three highlighted dates, water glasses on the table
TL;DR

Governance at SME scale is a rhythm rather than an event. The MD does not need to read another framework; the MD needs three things in the calendar. A 15-30 minute monthly informal team check-in. A 30-45 minute quarterly formal review with the operations lead. A 1-2 hour annual audit. Total commitment: 12-24 hours per year for the operations lead, 4-8 hours for the MD. £800-2,000 in staff time. This is the whole shape of governance at 5-50 staff.

Key takeaways

- The three-tier cadence: monthly informal (15-30 minutes), quarterly formal (30-45 minutes), annual audit (1-2 hours).
- The monthly check-in surfaces shadow AI, new tool requests, incidents, and questions about policy. Three questions are enough: what tools are people using, any incidents this month, any questions about what's allowed.
- The quarterly formal review walks the risk register top to bottom, the tool inventory, the incident log, and any vendor or regulatory changes. MD plus operations lead, 30-45 minutes.
- Annual audit: comprehensive policy refresh, full tool inventory review, regulatory landscape scan, training effectiveness, ROI assessment.
- Total staff time: 16-32 hours per year. Total cost in staff time: £800-2,000. No committee, no board pack, no specialised governance software. Existing Google Workspace or Microsoft 365 covers the infrastructure.
- Three calendar invites the MD can set up this afternoon: monthly check-in slot, quarterly formal review, annual audit window.

Picture the MD of a 13-person consulting firm at her quarterly board-of-advisers meeting. One adviser asks: “What’s your AI governance cadence?” The MD does not have an answer. She has the policy, the risk register, the data classification table, all on the firm’s shared drive. She does not have a calendar invite. A month later, none of the documents has moved. The exchange exposes the gap most owner-led SMEs hit: the documents exist, the rhythm does not, and governance dies within 90 days of being set up.

The fix is small enough to do this afternoon. Three calendar invites. A monthly slot inside an existing team meeting. A quarterly formal review with the operations lead. An annual audit window. The whole governance practice for a 10-50 person firm fits inside those three recurring entries.

Why is rhythm the actual problem?

Most SMEs that have set up an AI policy or risk register do not maintain them. The documents get drafted, signed, filed on the shared drive, and quietly stop being referenced. The fade is faster than most owners expect: by month three the operations lead has stopped checking the register, by month six the policy is referencing tools the firm has retired, by month nine the documents describe a firm that no longer exists.

The remedy is the rhythm. Without a recurring touch point, governance documents drift away from operational reality. The monthly check-in is what keeps them current and alive between the formal sessions.

What does the monthly check-in actually involve?

Fifteen to thirty minutes inside an existing team meeting. Three questions are enough. What AI tools have we started using or considered this month. Any AI-related incidents or near-misses (a tool that produced a wrong output, a moment where someone wasn’t sure if a use was allowed, a vendor announcement that affected the firm). Any questions about what is or isn’t allowed under the policy.

The operations lead takes notes during the discussion and flags anything that needs follow-up. The discussion happens inside an existing team meeting (Monday morning standup, Friday team check-in, all-hands), not as a separate AI committee. A dedicated AI meeting signals that governance sits outside the team’s normal operating rhythm, which is the wrong signal at SME scale.

What gets reviewed at the quarterly formal session?

Thirty to forty-five minutes, MD plus operations lead. The risk register is walked top to bottom. Each row is checked: still relevant, mitigation still working, new evidence to consider. The AI tool inventory is reviewed: each tool still in use, still appropriate, still on the right tier. The incident log is read: patterns or recurring issues, mitigations to update.

Two more inputs come into the quarterly review. Regulatory or vendor changes that affect the firm: ICO publications, sector regulator guidance, vendor security announcements, EU AI Act enforcement updates. Policy updates needed in light of any of the above. The session ends with a short note filed alongside the policy, recording what was decided and what changed. The notes themselves form the audit trail of the firm’s governance practice over time.

What does the annual audit involve?

One to two hours, optionally split across two sessions. Five inputs into the audit: the policy itself, the full tool inventory, the regulatory landscape, training effectiveness, and ROI. The session produces a refreshed policy, a refreshed risk register, and a one-page summary the firm can show to a regulator, an insurance broker, or a serious customer asking about AI governance.

Each input gets a short, focused review. The policy: does it still match how the firm uses AI, and where does it need refreshing. The tool inventory: which tools to retire, which to add, which tier reassessments are needed. The regulatory landscape: new obligations from the ICO, FCA, EU AI Act enforcement updates, sector regulator publications. Training: have employees absorbed the policy, where are the gaps, where does the team need a brief. ROI: what value the firm has got from AI adoption, what risks have been avoided or mitigated, the year-on-year picture.

Who attends and what is the role split?

The MD attends all three cadences and makes final decisions. The operations lead attends all three and maintains the documents between sessions. If the firm has a designated compliance or IT lead, they attend the quarterly and annual sessions. For regulated firms (FCA, SRA, GMC), a fractional compliance adviser may attend the quarterly or annual session, particularly when the firm is approaching an inspection or has had a recent regulator engagement.

Governance participation does not require attendance by people outside the core leadership. A committee of 10 people for a 10-person firm is unworkable. The two-person model (MD plus operations lead) is the operational unit; everyone else contributes through the monthly check-in, where their use of AI tools surfaces.

What infrastructure does this require?

Minimal. A shared spreadsheet for the AI tool inventory and risk register. A shared document for the policy and incident log. A communication channel (Slack, Teams, or email distribution) for AI-related questions between the formal cadences. The total infrastructure cost is zero on top of the firm’s existing Google Workspace or Microsoft 365 subscription. No specialised AI governance software is required at this scale.
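If it helps to see what "a shared spreadsheet" means in practice, here is a minimal sketch of illustrative column headings for the tool inventory and risk register, generated as CSV ready to paste into Google Sheets or Excel. The column names and the green/amber/red tier labels are assumptions for illustration, not a prescribed standard.

```python
import csv
import io

# Illustrative column headings only; the real registers live in the firm's
# existing Google Sheets or Excel files. These names are assumptions.
TOOL_INVENTORY_COLUMNS = [
    "Tool", "Vendor", "Used by", "Data it touches",
    "Tier (green/amber/red)", "Date last reviewed",
]
RISK_REGISTER_COLUMNS = [
    "Risk", "Likelihood", "Impact", "Mitigation",
    "Owner", "Status", "Date last reviewed",
]

def blank_register(columns):
    """Return a CSV string containing just the header row for a new sheet."""
    buf = io.StringIO()
    csv.writer(buf).writerow(columns)
    return buf.getvalue()

print(blank_register(TOOL_INVENTORY_COLUMNS), end="")
print(blank_register(RISK_REGISTER_COLUMNS), end="")
```

The point of the sketch is the column list, not the code: seven columns or fewer per register is roughly the right level of detail for a 10-50 person firm.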

The simplicity matters. Specialised tools introduce the cost of learning, configuring, and maintaining a separate system, which is friction that pushes against the rhythm. A spreadsheet the MD already knows how to open is the right level of friction.

What does the time and cost actually look like?

Twelve to twenty-four hours per year for the operations lead. Four to eight hours per year for the MD. Combined: 16-32 hours of staff time per year. At typical SME labour rates, £800-2,000 in salary cost. Compare to the cost of a single data-leak incident, ICO investigation, or regulatory enforcement action; compare to the value of catching an AI system error before it reaches customers, or surfacing a vendor security issue early.
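The arithmetic behind those figures can be sanity-checked in a few lines. The £50-62.50 blended hourly rates are assumptions chosen to reproduce the article's £800-2,000 range, not figures from the text.

```python
# Sanity-check of the staff-time arithmetic above.
# The hourly rates are illustrative assumptions, not quoted figures.
ops_hours = (12, 24)     # operations lead, hours per year (low, high)
md_hours = (4, 8)        # MD, hours per year (low, high)
rate_gbp = (50.0, 62.5)  # assumed blended hourly staff cost (low, high)

total_hours = (ops_hours[0] + md_hours[0], ops_hours[1] + md_hours[1])
cost_gbp = (total_hours[0] * rate_gbp[0], total_hours[1] * rate_gbp[1])

print(f"Total hours: {total_hours[0]}-{total_hours[1]} per year")
print(f"Staff cost: £{cost_gbp[0]:,.0f}-£{cost_gbp[1]:,.0f} per year")
```

At a higher blended rate the cost range shifts upwards proportionally, but even doubling it leaves the annual spend well below the cost of a single data-leak incident.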

The ROI calculus is straightforward. Governance costs are modest. The risks avoided or mitigated are significant. The infrastructure already exists. The barrier is putting the three calendar invites in this afternoon and committing to the rhythm.

If you are running a 10-50 person firm with the documents in place but no recurring rhythm, and you would like to talk through what the three calendar entries should look like and what gets reviewed when, book a conversation.

Sources

  • ICO AI guidance hub.
  • National Institute of Standards and Technology (2023). AI Risk Management Framework (AI RMF 1.0). Establishes measurement rigour and uncertainty quantification as core governance practice.
  • ISO/IEC 42001 AI Management Systems.
  • National Association of Corporate Directors (2025). AI Friend and Foe: Director's Handbook on AI Oversight. Foundational governance principles for board-level AI oversight, transparency, risk frameworks and stakeholder communication.
  • Chartered Governance Institute UK (2024). Artificial Intelligence and the Governance Professional. UK governance perspective on lawful, ethical and responsible AI use embedded within risk management frameworks.
  • CEPS (2024). Clarifying the Costs for the EU's AI Act. EU policy-research analysis of compliance overhead, with regulatory cost at 17 per cent of total AI spending for affected systems.
  • AICPA and CIMA (2026). Executive Insights on AI Opportunities and Risks. Global survey of 1,735 executives identifying operational readiness, talent infrastructure and regulatory preparedness as the principal AI capability barriers.

Frequently asked questions

What does the monthly check-in actually involve?

Fifteen to thirty minutes inside an existing team meeting. Three questions: what AI tools have we started using or considered this month, any AI-related incidents or near-misses, any questions about what is or isn't allowed under the policy. The operations lead notes anything that needs follow-up. The check-in surfaces shadow AI without driving it underground.

What gets reviewed at the quarterly formal session?

The full risk register, top to bottom. The AI tool inventory: still in use, still appropriate. The incident log. Any regulatory or vendor changes that affect the firm. Decisions on policy updates. MD plus operations lead, 30-45 minutes. Notes filed alongside the policy. The cadence keeps the documents alive between annual reviews.

What goes into the annual audit?

Comprehensive policy review: does the policy still match how the firm actually uses AI. Full tool inventory: any tools to retire, any to add. Regulatory landscape: any new obligations from the ICO, FCA, EU AI Act enforcement updates, sector regulator publications. Training effectiveness: have employees absorbed the policy. ROI: what value the firm has got from AI adoption, what risks have been avoided.

How much does this actually cost in time?

12-24 hours per year for the operations lead, 4-8 hours per year for the MD. At typical SME labour rates, £800-2,000 per year in staff time. Compare to the cost of a single data breach, an ICO investigation, or a regulatory enforcement action. The ROI calculus is favourable: governance costs are modest, the risks avoided are significant.

This post is general information and education only, not legal, regulatory, financial, or other professional advice. Regulations evolve, fee benchmarks shift, and every situation is different, so please take qualified professional advice before acting on anything you read here. See the Terms of Use for the full position.

Ready to talk it through?

Book a free 30 minute conversation. No pitch, no pressure, just a useful chat about where AI fits in your business.

Book a conversation
