The head of delivery at a 50-person consulting firm has knowledge living in five places. Three thousand documents on Google Drive, half stale. A Confluence wiki nobody updates. Four years of Slack channels. Support history in Intercom. And the heads of three senior delivery leads who are also her bottleneck. New hires take 65 days to reach productive output. Her senior leads spend an estimated 5 to 10 hours a week answering questions they have already answered. The CFO has signed off £15,000 to £25,000 for a knowledge tool, and she is meeting Slite, Notion, and Microsoft Copilot for SharePoint next week.
Her question is the right one. Not whether AI knowledge belongs in the firm, but which two of the six deployable jobs to start with, which two failure modes she will design against, and what a 90-day rollout looks like before next year’s external audit reads the access logs. Knowledge management is the function where the 2026 SME ROI is clearest and the deployment pattern is most failure-prone. The useful framing is six jobs to deploy in order, five failure modes to architect against, and one platform pick driven by your existing stack rather than the brochure.
What jobs does AI do well in knowledge management today?
Six jobs have hit the maturity threshold at this revenue band. Connected internal search across the stack recovers 20 to 40 minutes per person per day. Handbook and SOP generation compresses what was a three-to-six-month documentation project into a four-to-six-week facilitated process. FAQ curation cuts repeat questions by 25 to 40 percent. Onboarding acceleration recovers 4 to 10 days of pre-productive ramp per hire.
Skills-gap mapping has moved from aspirational to operational. Meeting transcription and decision indexing makes “what did we decide about Q3?” answerable from actual transcript content. The numbers stack up. Atlassian’s 2025 State of AI report found 93 percent of those reporting employee-efficiency gains attributed them to faster information access. Vonage’s Intelligent Workspace cut new-agent ramp from 90 days to 56 days, a 38 percent improvement. Johnson & Johnson’s MIT Sloan-documented skills implementation identified 41 future-ready skills and surfaced regional heat maps. Contact centres deploying AI knowledge bases see up to 25 percent reduction in average handling time. The prior post on where to apply AI first helps you triage which two jobs match your firm’s bottleneck.
Where are UK SMEs actually using these tools?
The platform stack at SME scale has settled into a recognisable shape. Slite at £8 per member per month is the affordable floor, with document verification, a Knowledge Management Panel, and Super Search across Slack, Notion, Drive, and Confluence. Notion AI and Guru sit mid-range at £15 to £20 per user. Glean and Box AI sit at the premium end at £20 plus per user.
For Microsoft 365 organisations the path is different. Microsoft Copilot for SharePoint anchors knowledge to the M365 ecosystem, uses RAG architecture by default, and inherits permissions from source systems. For Slack-native firms, Slack AI turns the communication tool into an organisational memory layer with faster ROI than a separate platform. Atlassian Rovo does the equivalent across Confluence and Jira.
The decision matrix at this scale is three factors in order: existing stack first, cost sensitivity second, governance depth third. Notion is strong on flexible architecture, Guru on structured curation and audit trails. The brochure will not lead with that order, but it is the order that holds up.
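The “stack first, cost second, governance third” ordering can be made mechanical rather than debated. A minimal sketch in Python, with entirely hypothetical platform names and 0-to-5 scores (not vendor assessments); the tuple sort encodes the lexicographic priority, so a higher stack-fit score always wins before cost or governance are even consulted:

```python
# Rank hypothetical knowledge-platform candidates lexicographically:
# stack fit first, then cost fit, then governance depth.
# All names and scores are illustrative placeholders.
vendors = {
    "Platform A": {"stack_fit": 5, "cost_fit": 3, "governance": 4},
    "Platform B": {"stack_fit": 3, "cost_fit": 5, "governance": 3},
    "Platform C": {"stack_fit": 5, "cost_fit": 2, "governance": 5},
}

def rank(vendors):
    # Sort descending on the (stack_fit, cost_fit, governance) tuple:
    # ties on stack fit fall through to cost, then to governance.
    return sorted(
        vendors,
        key=lambda name: (
            vendors[name]["stack_fit"],
            vendors[name]["cost_fit"],
            vendors[name]["governance"],
        ),
        reverse=True,
    )

print(rank(vendors))  # → ['Platform A', 'Platform C', 'Platform B']
```

Note that Platform C’s superior governance score never gets the chance to beat Platform A’s better cost fit, which is exactly the point of the ordering.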
Where does AI still fall short in knowledge management?
Five failure modes still bite, regardless of platform. First, stale-content surfacing. AI ranks ageing documents highly unless ownership and verification windows are set, which surfaces the wrong policy and creates legal liability for regulated firms. Second, hallucinated answers. Frontier-model citation hallucination averages 12.4 percent in 2026, range 4.2 to 19.1 percent. Retrieval grounding cuts that 75 to 90 percent and is the single biggest accuracy lever you can buy.
The prior post on knowledge bases that go stale at six months covers the hygiene gate before any AI layer goes on top. Third, permission-boundary leakage. AI cannot reliably reason about complex permission logic in matrix organisations or cross-functional projects. Veza’s 2025 research found AI can identify over-permissioned accounts but cannot reliably predict whether a specific user should see a specific document under context-dependent rules. Glean, Box AI, and Microsoft 365 Copilot inherit permissions from source systems and refuse to synthesise across boundaries. Lighter-weight implementations leak. Fourth, tribal knowledge stays tacit. AI can pull from written sources but cannot extract a senior delivery lead’s instinct for a struggling client account. Structured interviews remain the only reliable capture method. Fifth, version drift across systems. Connected search unifies the index but does not reconcile conflicting versions, so single-source-of-truth governance is non-negotiable. The prior piece on prompt libraries sitting in Notion names the same pattern in the prompt layer.
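The stale-content fix (owners plus verification windows) reduces to a simple demotion rule at search time. A toy Python sketch, assuming a hypothetical document-record shape and an illustrative 0.2 demotion factor; platforms with verification workflows implement something richer, but the logic is the same:

```python
from datetime import date, timedelta

# Demote documents whose verification window has lapsed, or that have
# no owner, so stale content cannot outrank fresh, owned content.
# Field names and the 0.2 demotion factor are illustrative assumptions.
VERIFICATION_WINDOW = timedelta(days=180)  # the six-month hygiene gate

def effective_score(doc, relevance, today):
    if doc.get("owner") is None:
        return relevance * 0.2          # orphaned: demote hard
    if today - doc["last_verified"] > VERIFICATION_WINDOW:
        return relevance * 0.2          # verification lapsed: demote
    return relevance                    # fresh and owned: rank normally

docs = [
    {"title": "Expenses policy v3", "owner": "ops",
     "last_verified": date(2026, 1, 10)},
    {"title": "Expenses policy v1", "owner": None,
     "last_verified": date(2022, 3, 1)},
]
today = date(2026, 2, 1)
ranked = sorted(docs, key=lambda d: -effective_score(d, 1.0, today))
print([d["title"] for d in ranked])  # current policy outranks the orphan
```

The point is that demotion is a ranking decision, not a deletion: the stale version stays findable for audit, it just stops winning searches.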
What does a 90-day starter rollout look like?
Six phases, compressed from the 12-to-18-month enterprise pattern by running education, technical setup, and pilot in parallel. Weeks 1 to 2 are diagnostic: data audit across Drive, Slack, Confluence, and ticketing; baseline metrics on search time and onboarding duration; a cross-functional steering group across operations, HR, customer success, and delivery. AI literacy training runs in parallel at £800 to £2,000 a day externally.
Helium42’s research finds organisations investing in cultural change see 5.3 times higher pilot success rates than those skipping the education layer. Weeks 3 to 4 are configuration: connect primary data sources, configure permission inheritance, set ingestion schedules. Outsourced configuration runs £3,000 to £8,000 at this scale. Weeks 4 to 6 are pilot and hypercare: 10 to 25 users, two or three specific workflows (“support agent finds policy in 30 seconds”, “new hire searches handbook in 1 minute”), defined success metrics, daily check-ins. Helium42 reports more than 80 percent of well-designed pilots show measurable gains in 4 to 6 weeks. Weeks 7 to 8 and months 2 to 3 scale into production. Total Year 1 cost for a 50-person firm lands at £12,000 to £30,000, against documented Year 1 benefit of £20,000 to £40,000 and payback in six to twelve months. The operations-rollout sibling covers the back-office overlap if you are running both at once.
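The Year 1 ranges quoted reduce to straightforward payback arithmetic. A quick check using the midpoints of the cost and benefit ranges, assuming benefit accrues evenly across the year:

```python
# Payback check on the Year 1 ranges quoted: £12,000-£30,000 cost
# against £20,000-£40,000 benefit for a 50-person firm.
cost_low, cost_high = 12_000, 30_000
benefit_low, benefit_high = 20_000, 40_000

cost_mid = (cost_low + cost_high) / 2           # £21,000
benefit_mid = (benefit_low + benefit_high) / 2  # £30,000

monthly_benefit = benefit_mid / 12
payback_months = cost_mid / monthly_benefit
print(f"Midpoint payback: {payback_months:.1f} months")  # 8.4 months
```

The midpoint lands comfortably inside the six-to-twelve-month payback window; the worst case (high cost, low benefit) runs to 18 months, which is the scenario a disciplined pilot exists to rule out.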
What should you ask a knowledge-management vendor before signing?
Five procurement questions separate a serious vendor from a marketing pitch. First, is the answer architecture retrieval-grounded or pure-generative? Retrieval grounding cuts hallucination 75 to 90 percent and is the single biggest accuracy lever. If the vendor cannot describe how grounding works in their system, walk away. Second, how does the platform inherit permissions from source systems? Glean, Box AI, and Microsoft 365 Copilot all do this. Lighter-weight implementations leak.
Third, what is the content-verification and ageing-detection workflow? You need owner-assignment, expiry windows, and prominence demotion of stale content. Guru and Slite lead here. Fourth, where is data processed and what is your DPA position on UK GDPR plus the Data Use and Access Act 2025, in force February 2026? You need documented lawful basis, DPIA support, deletion workflows, and audit trails. Fifth, what is your IP-and-content-provenance position? Three 2025 US federal copyright decisions, Bartz v Anthropic, Kadrey v Meta, and Thomson Reuters v Ross Intelligence, established that training on copyrighted work without permission can be infringement. For SMEs ingesting customer or third-party content, default to organisation-owned content only and audit everything else.
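The first question, retrieval-grounded or pure-generative, has a concrete shape worth recognising in a demo. A toy Python sketch of the grounding pattern: retrieve first, answer only from a retrieved source with a citation, and refuse when nothing relevant is found. Keyword overlap stands in for a real embedding retriever, and the corpus, filenames, and threshold are all illustrative:

```python
# Toy retrieval-grounded answer flow: retrieve, answer with a citation
# from the retrieved source, refuse when retrieval comes up empty.
# Real systems use embedding search; word overlap stands in here.
CORPUS = {
    "holiday-policy.md": "Staff accrue 25 days of annual leave per year.",
    "expenses-policy.md": "Claims over 50 pounds require a receipt.",
}

def retrieve(question, corpus, threshold=1):
    q_words = set(question.lower().split())
    hits = []
    for doc_id, text in corpus.items():
        overlap = len(q_words & set(text.lower().split()))
        if overlap >= threshold:
            hits.append((overlap, doc_id, text))
    return sorted(hits, reverse=True)  # best-supported source first

def answer(question):
    hits = retrieve(question, CORPUS)
    if not hits:
        # The grounded system refuses rather than hallucinating.
        return "No verified source found; escalating to a human."
    _, doc_id, text = hits[0]
    return f"{text} [source: {doc_id}]"

print(answer("How many days of annual leave do staff get?"))
print(answer("What is the dress code?"))
```

A pure-generative system has no refusal path: it will produce a fluent answer to the dress-code question whether or not a source exists, which is precisely the 12.4 percent citation-hallucination problem the grounding question is probing for.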
If you would like a second pair of eyes on which two of the six jobs to start with, and on which two failure modes to architect against before signing, book a conversation.