What is MCP (Model Context Protocol)? Why it matters for your business

TL;DR

The Model Context Protocol is an open standard, introduced by Anthropic in 2024 and now supported by OpenAI, Microsoft, and Google, for connecting AI models to external tools and data. The point is portability: one MCP server for your CRM or knowledge base works with any compatible AI client. For an SME the question is whether your use case is mature enough to engineer for MCP now or wait for the ecosystem to settle.

Key takeaways

- MCP is an open protocol for connecting AI models to tools and data through a shared interface, with Anthropic, OpenAI, Microsoft, and Google all supporting it by mid-2026.
- The architectural saving is real. One MCP server for your CRM or document store works across Claude, ChatGPT, and other compatible clients, instead of one bespoke integration per AI tool.
- The governance question is also real. An MCP server is now a privileged surface that holds the keys to underlying systems, and needs scope-limited credentials, query logs, and a revocation path.
- The ecosystem is uneven. Vendor-built servers (Notion, GitHub, Slack, Stripe, Google Workspace) are well-maintained. Community-built servers vary widely and need a maintenance check before production use.
- Engineer for MCP now if you have multiple AI clients in production or expect to within twelve months. Single-tool single-client deployments can wait, but the optionality of MCP is cheap to preserve.

A 28-staff financial advisory firm I worked with last quarter wanted their advisors to draft client portfolio summaries with Claude, pulling live data from the firm’s CRM and document store. Twelve months earlier, the same idea would have meant a custom integration per AI tool. One Claude-to-CRM bridge, one ChatGPT-to-CRM bridge, both built and maintained separately. The operations director had quietly killed the project once before.

This time the technology lead came back with a different shape. One MCP server, exposing safe read access to the CRM and the document store, would connect to whichever AI client the advisor preferred. The architectural saving was real. The architectural concern was also real. The MCP server now held the keys to two systems, and the firm needed governance for what queries it accepted and from whom. The owner saw the trade-off cleanly. MCP makes the right integration cheaper, and makes the governance question more pointed.

What is MCP?

The Model Context Protocol is an open standard for connecting AI models to external tools and data. An MCP server exposes tools the AI can call (search this, fetch that, write here) and resources it can read, like documents or database views. An MCP client, which lives inside Claude, ChatGPT, or another compatible AI tool, connects to the server, learns what is available, and uses those tools during a conversation.
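Under the hood the exchange is JSON-RPC 2.0. A minimal sketch of the two messages that matter most, with a hypothetical CRM tool name (search_crm is illustrative, not from any real server):

```python
import json

# What a hypothetical CRM server might advertise when the client asks
# what is available (MCP's tools/list method). The tool name and schema
# here are illustrative.
tools_list_response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "tools": [
            {
                "name": "search_crm",
                "description": "Search client records by name or account number",
                "inputSchema": {
                    "type": "object",
                    "properties": {"query": {"type": "string"}},
                    "required": ["query"],
                },
            }
        ]
    },
}

# How the client then invokes that tool mid-conversation
# (MCP's tools/call method).
tools_call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "search_crm",
        "arguments": {"query": "Patel household portfolio"},
    },
}

print(json.dumps(tools_call_request, indent=2))
```

The point of the shared shape is that any compliant client can discover and call any compliant server without either side knowing the other in advance.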

Anthropic introduced the protocol in November 2024 and open-sourced the SDKs. The closest analogy is USB. Before USB, every peripheral had its own cable. After USB, one shape worked across devices. MCP does the same job for AI integrations.

Why does it matter for your business?

The point is portability. Before MCP, every AI integration was vendor-specific. OpenAI’s function calling, Anthropic’s tool use, and Google’s function declarations looked similar but were not compatible. Building a CRM integration meant three separate implementations. With MCP the integration is built once and works across any compatible AI client, which is why OpenAI, Microsoft, and Google have all adopted it despite competing with the vendor that created it.

For an SME, the practical implication is choice. If you have Claude in finance, ChatGPT in sales, and a copilot inside Microsoft 365, MCP lets the same CRM server serve all three. The other implication is governance. An MCP server with broad scope is now a single privileged surface, and the security model needs the same discipline as an API key you would not hand out twice.

Where will you actually meet it?

You will meet MCP in three concrete shapes. The first is vendor-built servers for the SaaS tools you already pay for. Notion, Slack, GitHub, Google Workspace, Microsoft 365, and Stripe all publish official MCP servers in 2026. These are the lowest-engineering, highest-value path for an SME, and the maintenance is somebody else’s problem. The setup is usually configuration, not code.

The second shape is custom MCP servers over your own systems. A bespoke server that exposes safe access to your internal database, your file store, or a line-of-business application is medium engineering and higher value, but it is also where the security model genuinely matters. Read-only first, write operations only after the audit trail is in place.

The third shape is hosted MCP gateways, like Zapier, Make, or specialist providers such as Merge, that bridge multiple SaaS tools through a single MCP endpoint. The lowest engineering of all, with a dependency on the gateway provider’s roadmap and pricing.

The ecosystem in mid-2026 is real but uneven. Vendor-built servers from major platforms are well-maintained. Community-built servers vary widely. Before connecting a community server to anything that touches customer data, check the maintenance cadence, the security disclosures, and whether the underlying platform’s API changes have been kept up with.

When to engineer for it now, when to wait

Engineer for MCP now if you have more than one AI client in production, or expect to within twelve months. The cost of building bespoke vendor-specific integrations does not amortise once a second AI tool enters the picture. Engineer for it now if you are connecting to a SaaS tool that already has a maintained vendor server. There is no good reason to build a custom integration when a working one ships in the box.

Wait if you have a single AI client serving a single workflow with no near-term plan to broaden. The optionality of MCP is cheap to preserve, but it is not urgent. Wait if your key business systems do not yet have a well-maintained server and you do not have the engineering capacity to build one. A poorly built MCP server is worse than none, because it gives you the false comfort that the integration question is solved.

The trap to flag sits at the edges. Some teams build custom MCP servers with excessive scope and treat them as developer convenience, forgetting they are now a privileged surface with no per-user authentication and no query logging. Mature deployments use scope-limited credentials, OAuth where the underlying platform supports it, server-side query logging, and a documented revocation path. Less mature ones discover the gap when an audit asks who ran what query through the AI last Tuesday.
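What that governance layer looks like in miniature: a per-scope allowlist of tools plus a server-side query log. This is a hypothetical sketch of the pattern, not any real server's implementation; the scope name, tool names, and log shape are all invented for illustration.

```python
import datetime

# In-memory stand-ins for a real audit store and credential scopes.
AUDIT_LOG = []
SCOPES = {
    "advisor-readonly": {"search_crm", "fetch_document"},  # no write tools
}

def call_tool(user, scope, tool_name, arguments, handlers):
    """Execute a tool only if the scope allows it, logging who ran what."""
    if tool_name not in SCOPES.get(scope, set()):
        raise PermissionError(f"{scope} may not call {tool_name}")
    AUDIT_LOG.append({
        "when": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "who": user,
        "tool": tool_name,
        "arguments": arguments,
    })
    return handlers[tool_name](**arguments)

# A read tool succeeds and is logged; a write tool would be refused
# before it ever reached the underlying system.
handlers = {"search_crm": lambda query: f"3 records match '{query}'"}
result = call_tool("j.smith", "advisor-readonly", "search_crm",
                   {"query": "Patel"}, handlers)
```

The log is what answers the auditor's question about who ran what query last Tuesday; the allowlist is what makes revocation a one-line change.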

Function calling, sometimes called tool calling, is the underlying mechanism that lets a language model trigger an external action during its reasoning. Without function calling, an AI cannot do anything beyond generate text. MCP is the standard interface that wraps function calling so the same tool definition works across vendors and clients.
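The mechanism in miniature: the model emits a structured request instead of prose, the host executes it, and the result is fed back as context. In this sketch the model's output is stubbed as a plain dict, and the tool name and handler are hypothetical:

```python
# Minimal function-calling loop. Real clients receive a structure like
# `stubbed` from the model API; here it is hard-coded for illustration.
def run_turn(model_output, tools):
    if model_output.get("type") == "tool_call":
        name = model_output["name"]
        result = tools[name](**model_output["arguments"])
        # The result goes back to the model as context for its next turn.
        return {"role": "tool", "name": name, "content": result}
    return {"role": "assistant", "content": model_output["text"]}

tools = {"get_balance": lambda account: "12,400 GBP"}
stubbed = {"type": "tool_call", "name": "get_balance",
           "arguments": {"account": "ACC-104"}}
reply = run_turn(stubbed, tools)
```

Each vendor wraps this loop in its own API shape; MCP standardises the tool definition and the call so the loop works unchanged across clients.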

An API is the general-purpose contract any two systems use to talk to each other. An MCP server typically sits in front of one or more APIs and exposes them in the shape an AI client expects.

OAuth is the authorisation pattern mature MCP servers commonly use. Instead of holding raw credentials, the server receives a time-limited token that lets it act on behalf of an authenticated user, which is more controllable than a static API key. Asynchronous events from MCP servers and the tools they wrap commonly arrive through webhooks, so the same signing-secret and replay-protection questions apply on that side.
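The signing-secret check on the webhook side is a short piece of code. A hedged sketch of the usual pattern, assuming an HMAC-SHA256 scheme; real providers each document their own header name and algorithm:

```python
import hmac
import hashlib

def verify_signature(secret: bytes, body: bytes, received_sig: str) -> bool:
    """Recompute the HMAC of the raw body and compare in constant time."""
    expected = hmac.new(secret, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, received_sig)

# Simulate a provider signing a payload with a shared secret.
secret = b"demo-signing-secret"
body = b'{"event":"record.updated","id":"rec_123"}'
sig = hmac.new(secret, body, hashlib.sha256).hexdigest()
ok = verify_signature(secret, body, sig)
```

Replay protection is the other half: providers typically include a timestamp in the signed material so an old, captured request cannot be resent.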

An AI agent is an autonomous system that calls tools to complete multi-step tasks. Many agentic deployments in 2026 use MCP as the integration layer for those tools, which is why the agent conversation and the MCP conversation have started to converge.

Vendor lock-in is the wider question MCP is partly a response to. The protocol does not eliminate lock-in entirely (your prompts and configurations still vary by client), but it removes the integration layer as a switching cost, which is the largest hidden cost in the typical AI deployment.

The honest version of an MCP pitch in 2026 names the security model and shows you the audit trail. The marketing version skips both.

Sources

Anthropic (2024). Introducing the Model Context Protocol. The original release announcement and specification overview. https://www.anthropic.com/news/model-context-protocol

Model Context Protocol (2025). Official specification and getting-started guide. The canonical documentation for the protocol, server registry, and architecture. https://modelcontextprotocol.io/docs/getting-started/intro

Cloud Wars (2025). OpenAI and Microsoft adopt MCP. Cited for the cross-vendor adoption timeline through 2025. https://cloudwars.com/ai/openai-and-microsoft-support-model-context-protocol-mcp-ushering-in-unprecedented-ai-agent-interoperability/

WorkOS (2026). Everything your team needs to know about MCP in 2026. Source for the 2026 ecosystem snapshot, including server count and quality variance between vendor-built and community servers. https://workos.com/blog/everything-your-team-needs-to-know-about-mcp-in-2026

GitHub (2025). Official GitHub MCP server. Vendor-built reference for repository, issue, and pull-request access. https://github.com/github/github-mcp-server

Slack (2025). Guide to the Slack MCP server. Vendor-built reference for message search, channel access, and the Slack permission model. https://slack.com/help/articles/48855576908307-Guide-to-the-Slack-MCP-server

Notion (2025). Notion MCP server overview. Vendor-built reference for read-write access to Notion workspaces. https://developers.notion.com/guides/mcp/overview

Google (2025). Configure MCP servers for Google Workspace. Vendor-built reference for Gmail, Drive, Calendar, and Chat servers using OAuth and existing account permissions. https://developers.google.com/workspace/guides/configure-mcp-servers

Model Context Protocol (2025). Authorisation and OAuth 2.1 specification. Cited for the authentication and token-handling model used by MCP servers. https://modelcontextprotocol.io/docs/tutorials/security/authorization

MCP Manager (2025). MCP permissions and audit logging. Source for the granular tool-level, data-level, and quantitative permission constraints in the security section. https://mcpmanager.ai/blog/mcp-permissions/

Frequently asked questions

Is MCP actually a standard or is it still an Anthropic project?

It is a standard in practice. Anthropic released the specification in November 2024 and open-sourced the SDKs. By mid-2026 OpenAI, Microsoft Copilot, and Google Gemini all support MCP clients, and the registry lists close to two thousand public servers. The specification is governed in the open and the major AI vendors have a commercial reason to keep it interoperable. It is the closest the industry has come to a common interface for AI-to-system integration.

Do I need to build my own MCP server, or can I use someone else's?

For many SMEs, use someone else's first. Vendor-built servers exist for Notion, Slack, GitHub, Google Workspace, Microsoft 365, Stripe, and dozens more, and a hosted gateway like Zapier or Merge can bridge several SaaS tools through a single endpoint. Build a custom server only when you are connecting to an internal database or a system that has no maintained server, and start it read-only.

What does an MCP deployment cost a 20-staff services firm?

Modest if you are using vendor-built servers, larger if you are building. A typical pilot connecting two or three SaaS tools to Claude or ChatGPT runs from a few thousand pounds in setup to a few hundred a month in subscriptions. A custom MCP server over an internal system is more variable, often a few weeks of engineering time plus hosting, and the security and audit work is the part many firms underestimate.

This post is general information and education only, not legal, regulatory, financial, or other professional advice. Regulations evolve, fee benchmarks shift, and every situation is different, so please take qualified professional advice before acting on anything you read here. See the Terms of Use for the full position.

Ready to talk it through?

Book a free 30 minute conversation. No pitch, no pressure, just a useful chat about where AI fits in your business.

Book a conversation
