What Is MCP (Model Context Protocol) and Why Does It Matter?
Veriti Team
28 October 2025 · Last updated: January 2026
Model Context Protocol (MCP) is an open standard developed by Anthropic that provides a universal way for AI models to connect to external tools, data sources, and systems. Think of MCP as the USB standard for AI: before USB, every device needed its own proprietary connector, and USB replaced them all with one universal port. MCP does the same thing for AI integrations — it creates one standard protocol so any AI model can talk to any tool or data source without custom-built connectors for every combination.
Why Does AI Need a Universal Protocol?
Here's the problem MCP solves. Say you want your AI assistant to check your calendar, search your company's knowledge base, and update your CRM. Without MCP, each of these integrations requires custom code: specific API calls, authentication handling, data formatting, and error management — all built separately for each AI model and each tool.
If you switch AI models (say, from GPT to Claude), you have to rebuild every integration. If the tool updates its API, you have to update every AI integration that uses it. At scale, this becomes an unmaintainable mess.
MCP solves this by creating a standard client-server architecture where:
- AI models (clients) speak one protocol
- Tools and data sources (servers) speak the same protocol
- Any client can work with any server, automatically
The result: you build the integration once, and it works with any MCP-compatible AI model. The tool vendor builds their MCP server once, and it works with any MCP-compatible AI client.
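That "any client, any server" property can be shown with a toy sketch. This is plain Python, not the real MCP SDK; the `CalendarServer`, `GenericClient`, and `get_meetings` names are invented for illustration. The point is that the client knows nothing about calendars, only the shared protocol methods.

```python
# Illustrative sketch (not the real MCP SDK): any client that speaks the
# shared protocol can drive any server that also speaks it.

class CalendarServer:
    """A hypothetical MCP-style server exposing one tool."""

    def list_tools(self):
        # Servers advertise their capabilities rather than clients hardcoding them.
        return [{"name": "get_meetings", "description": "List meetings for a date"}]

    def call_tool(self, name, arguments):
        if name == "get_meetings":
            return {"meetings": [f"Stand-up on {arguments['date']}"]}
        raise ValueError(f"Unknown tool: {name}")


class GenericClient:
    """Works with ANY server that implements list_tools/call_tool."""

    def __init__(self, server):
        self.server = server

    def run(self, tool, arguments):
        advertised = {t["name"] for t in self.server.list_tools()}
        if tool not in advertised:
            raise ValueError(f"{tool} not advertised by server")
        return self.server.call_tool(tool, arguments)


client = GenericClient(CalendarServer())
print(client.run("get_meetings", {"date": "2026-02-03"}))
```

Swap in any other server object with the same two methods and the client code does not change — that is the build-once property in miniature.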
How Does MCP Actually Work?
MCP uses a client-server architecture with three core components:
MCP Hosts (The AI Application)
The host is the AI application the user interacts with — Claude Desktop, an IDE with AI features, or your custom AI-powered business application. The host manages the connection to MCP servers and decides which tools and data sources the AI model can access.
MCP Clients (The Connection Layer)
Each host contains one or more MCP clients that maintain connections to individual MCP servers. The client handles the protocol communication: sending requests, receiving responses, and managing the session lifecycle.
MCP Servers (The Tool/Data Interface)
MCP servers expose specific tools, data sources, or capabilities to AI models. A server might provide access to a database, a file system, a CRM, a code repository, or any other system. Each server defines:
- Tools: Actions the AI can perform (e.g., "search_documents", "create_ticket", "send_email")
- Resources: Data the AI can read (e.g., files, database records, API responses)
- Prompts: Pre-built interaction templates for common workflows
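To make those three concepts concrete, here is roughly what the declarations look like on the wire. The field names (`name`, `description`, `inputSchema` for tools; `uri` and `mimeType` for resources) follow the MCP specification, but the specific tool, file, and prompt shown are invented examples.

```python
import json

# A tool declaration: a name, a human-readable description, and a JSON
# Schema describing the arguments the AI must supply.
search_tool = {
    "name": "search_documents",
    "description": "Full-text search over the company knowledge base",
    "inputSchema": {
        "type": "object",
        "properties": {
            "query": {"type": "string"},
            "limit": {"type": "integer", "default": 10},
        },
        "required": ["query"],
    },
}

# A resource is addressed by URI, so the AI can read it like a file.
report_resource = {
    "uri": "file:///reports/q3.pdf",
    "name": "Q3 report",
    "mimeType": "application/pdf",
}

# A prompt is a named, reusable interaction template.
triage_prompt = {
    "name": "triage_ticket",
    "description": "Summarise and route an incoming support ticket",
}

print(json.dumps(search_tool, indent=2))
```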
Here's a simplified view of how a request flows:
- User asks the AI: "What meetings do I have tomorrow?"
- The AI model recognises it needs calendar data and requests it via the MCP client
- The MCP client sends a standardised request to the calendar MCP server
- The calendar server authenticates, queries the calendar API, and returns structured data
- The AI model receives the data and formulates a natural language response
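Under the hood, steps 3 and 4 travel as JSON-RPC 2.0 messages — the wire format MCP uses. A sketch of the calendar exchange, with the `get_meetings` tool name and the reply contents invented for illustration:

```python
import json

# Step 3: the client's standardised request to the calendar server.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "get_meetings", "arguments": {"date": "tomorrow"}},
}

# Step 4: the server's reply. Tool results come back as a list of
# content items the model can read and turn into natural language.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "content": [
            {"type": "text", "text": "09:00 Stand-up, 14:00 Client review"}
        ]
    },
}

print(json.dumps(request))
```

Because every server answers `tools/call` in this same shape, the client needs no calendar-specific parsing code.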
The critical feature is tool discovery. When an MCP client connects to a server, the server advertises what it can do — its available tools, resources, and capabilities. The AI model can then dynamically decide which tools to use based on the user's request, without hardcoded logic for each integration.
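Discovery itself is just another protocol call. The `tools/list` method below is the real MCP method name; the two calendar tools in the reply are invented:

```python
# On connect, the client asks the server what it can do.
discovery_request = {"jsonrpc": "2.0", "id": 2, "method": "tools/list"}

# The server advertises its capabilities in a standard shape.
discovery_response = {
    "jsonrpc": "2.0",
    "id": 2,
    "result": {
        "tools": [
            {"name": "get_meetings", "description": "List meetings for a date"},
            {"name": "create_event", "description": "Add a calendar event"},
        ]
    },
}

# The model can now match the user's request against these descriptions
# at runtime; nothing about the calendar was hardcoded in the client.
names = [t["name"] for t in discovery_response["result"]["tools"]]
print(names)
```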
Why Does MCP Matter for Businesses?
If you're building AI solutions for your business — or evaluating AI tools — MCP changes the equation in several important ways:
1. Reduced Vendor Lock-in
Without MCP, your AI integrations are tightly coupled to a specific model provider. If you've built custom integrations between GPT-4 and your internal systems, switching to Claude or Gemini means rebuilding everything. With MCP, the integration layer is model-agnostic. Your MCP servers work with any MCP-compatible AI client.
This is significant for Australian businesses operating in a market where AI models are evolving rapidly. The best model today might not be the best model in 6 months. MCP means you can switch without starting from scratch.
2. Easier Integration
Building a custom integration between an AI model and a business tool typically costs $5,000–$20,000 in development time. With MCP, if an MCP server already exists for your tool (and the ecosystem is growing fast), the integration cost drops to near zero — it's configuration, not development.
As of early 2026, MCP servers exist for:
- Google Workspace (Drive, Calendar, Gmail)
- Slack, Microsoft Teams
- GitHub, GitLab, Jira, Linear
- PostgreSQL, MySQL, MongoDB databases
- Salesforce, HubSpot CRMs
- File systems, web browsers, and many more
3. Multi-Agent Workflows
This is where MCP gets genuinely exciting. AI agents — autonomous AI systems that can plan and execute multi-step tasks — need to interact with multiple tools and data sources. MCP provides the standard interface for this.
Instead of building custom tool-calling logic for each agent, MCP lets agents discover and use tools dynamically. An agent can connect to a dozen MCP servers and choose the right tools for each step of a complex workflow — without any of those integrations being hardcoded.
For businesses, this means you can build agentic workflows that span multiple systems (CRM, email, databases, documents) through a single protocol layer.
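A toy sketch of that routing, with plain dicts standing in for live MCP connections (all server and tool names invented): the agent merges every advertised tool into one catalogue, then dispatches each workflow step to whichever server offered that tool.

```python
# Each "server" here is a dict standing in for a live MCP connection;
# a real agent would issue tools/list over the protocol instead.
servers = {
    "crm": [{"name": "create_ticket"}, {"name": "update_contact"}],
    "mail": [{"name": "send_email"}],
    "docs": [{"name": "search_documents"}],
}

# Merge every advertised tool into one catalogue, remembering its origin.
catalogue = {
    tool["name"]: server_name
    for server_name, tools in servers.items()
    for tool in tools
}

# Route each step of a multi-system workflow to the right server,
# with no per-integration code.
plan = ["search_documents", "create_ticket", "send_email"]
routing = [(step, catalogue[step]) for step in plan]
print(routing)
```

Adding a fourth system to the workflow means connecting one more server, not writing another integration.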
4. Security and Access Control
MCP includes built-in mechanisms for authentication, authorisation, and access control. Each MCP server defines what permissions it requires, and the host application manages user consent. This means businesses can control exactly what data and actions their AI systems can access — critical for regulated industries and data-sensitive environments.
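What host-side enforcement might look like, in miniature. MCP leaves the consent policy to the host application, so the allowlist shape below is an invented illustration, not part of the protocol:

```python
# Hedged sketch: the host keeps a per-server allowlist of tools the user
# has approved and refuses everything else before it reaches the server.
approved = {
    "calendar": {"get_meetings"},
    "crm": {"search_contacts"},
}

def authorise(server: str, tool: str) -> bool:
    """Return True only if the user has consented to this server/tool pair."""
    return tool in approved.get(server, set())

assert authorise("calendar", "get_meetings")
assert not authorise("crm", "delete_contact")  # write action never approved
print("policy checks passed")
```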
How Does MCP Compare to Alternatives?
| Approach | How It Works | Limitations |
|---|---|---|
| Custom API integrations | Build specific connections between each AI model and each tool | Doesn't scale — N models × M tools = N×M integrations |
| OpenAI Function Calling | Define functions the AI can call, specific to OpenAI's API | Vendor-specific, no standardised tool discovery, limited to OpenAI models |
| LangChain/LlamaIndex tools | Framework-level tool abstractions for building AI applications | Framework-specific, not a universal protocol, requires code changes per framework |
| MCP | Universal open protocol — any client, any server, automatic tool discovery | Still maturing, not all tools have MCP servers yet |
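The scaling argument in the first row is worth making concrete. With bespoke integrations the work grows multiplicatively; with a shared protocol each model needs one client and each tool one server, so it grows additively:

```python
# Illustrative counts only; pick any N and M you like.
n_models, m_tools = 4, 25

custom_integrations = n_models * m_tools  # one bespoke build per pair
mcp_implementations = n_models + m_tools  # one client per model + one server per tool

print(custom_integrations, mcp_implementations)  # 100 vs 29
```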
The key difference is that MCP is a protocol, not a product. It's open-source, model-agnostic, and designed to be implemented by anyone. This is what gives it the potential to become a genuine standard rather than another proprietary solution.
Who Supports MCP Today?
Adoption is accelerating. As of early 2026:
- Anthropic (Claude): Native MCP support in Claude Desktop and the Claude API. Anthropic created MCP and actively maintains the specification.
- OpenAI: Added MCP support in early 2025, integrating it into ChatGPT Desktop and the Assistants API.
- Google: MCP support in Gemini via the AI Studio and Vertex AI platforms.
- Development tools: Cursor, Windsurf, Zed, and other AI-powered IDEs support MCP natively.
- Enterprise platforms: Salesforce, Block, and others have released official MCP servers for their platforms.
When OpenAI — Anthropic's primary competitor — adopted MCP, it was a clear signal that the protocol had crossed from interesting experiment to industry standard. It's rare for competitors to adopt each other's specifications; it only happens when the standard is genuinely useful and the alternative (fragmentation) is worse for everyone.
What Should Australian Businesses Do About MCP?
You don't need to do anything with MCP today unless you're actively building AI integrations. But here's why you should care:
- If you're evaluating AI tools: Ask whether they support MCP. Tools that do are future-proofed against model changes and will integrate more easily with your broader AI stack.
- If you're building custom AI solutions: Build on MCP from the start. The upfront effort is minimal, and it gives you model flexibility from day one. Any AI model comparison should include MCP support as a factor.
- If you're planning AI agent workflows: MCP is essentially a prerequisite for sophisticated multi-agent systems. It's the protocol layer that lets agents interact with your business tools dynamically.
- If you're concerned about vendor lock-in: MCP is your insurance policy. Build your integrations as MCP servers, and you can swap AI models without rebuilding your tool connections.
MCP is still early, but it's the first serious attempt at a universal standard for AI-to-tool communication. For businesses investing in AI, building on MCP is a bet on interoperability — and historically, interoperability standards win.
Frequently Asked Questions
What is MCP in simple terms?
MCP (Model Context Protocol) is a universal standard for connecting AI models to tools and data sources. It works like USB for AI — instead of building a custom connection between each AI model and each tool, MCP provides one standard protocol that lets any compatible AI model work with any compatible tool automatically.
Who created MCP and is it open source?
MCP was created by Anthropic (the company behind Claude) and is fully open source. Despite being created by one company, it has been adopted by competitors including OpenAI and Google, which signals it is becoming a genuine industry standard rather than a proprietary solution.
Does MCP work with ChatGPT and other AI models?
Yes. As of early 2026, MCP is supported by Claude (Anthropic), ChatGPT (OpenAI), Gemini (Google), and numerous AI-powered development tools. This broad adoption means MCP integrations you build today will work across multiple AI providers.
How does MCP help with AI vendor lock-in?
Without MCP, switching AI models means rebuilding every integration. With MCP, your tool integrations are model-agnostic — they work with any MCP-compatible AI client. This means you can switch from Claude to GPT to Gemini without rebuilding your integration layer, protecting your investment as models evolve.
Do I need MCP for my business AI projects?
If you are building custom AI solutions or AI agent workflows that interact with multiple business tools, building on MCP from the start is strongly recommended. If you are using off-the-shelf AI tools, look for MCP support as a feature — it indicates future-proofing and better interoperability. For simple use cases, MCP is not required but is good insurance against future needs.
See how document intelligence could work for your business
Take our free 2-minute readiness assessment and discover where the biggest time savings are — no sales pitch, no commitment.
Take the Free Assessment