Model Context Protocol: How AI connects to everything

The Model Context Protocol (MCP) is basically the USB-C for AI apps. It lets AI models talk to external tools, databases, and services without custom code for each connection.

Anthropic released MCP in November 2024. By May 2025, more than 5,000 public MCP servers existed, and OpenAI, Microsoft, Google, and hundreds of other companies had adopted it.

The problem MCP solves

Before MCP, connecting AI to external tools was a mess:

Old way (M×N integrations):
┌─────────┐    ┌─────────────┐
│   AI    │────│   Tool A    │
│ Model 1 │────│   Tool B    │
│         │────│   Tool C    │
└─────────┘    └─────────────┘
┌─────────┐    ┌─────────────┐
│   AI    │────│   Tool A    │
│ Model 2 │────│   Tool B    │
│         │────│   Tool C    │
└─────────┘    └─────────────┘

Every AI model needed custom code for every tool. That's 6 integrations just for 2 models and 3 tools.

New way with MCP (M+N integrations):
┌─────────┐    ┌─────┐    ┌─────────────┐
│   AI    │────│     │────│   Tool A    │
│ Model 1 │    │ MCP │────│   Tool B    │
└─────────┘    │     │────│   Tool C    │
┌─────────┐    │     │    └─────────────┘
│   AI    │────│     │
│ Model 2 │    │     │
└─────────┘    └─────┘

Now it's just 2 + 3 = 5 integrations. Way simpler.
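The scaling difference is easy to check with a quick calculation (a toy sketch, not MCP code):

```python
def integration_count(models: int, tools: int) -> tuple[int, int]:
    """Compare point-to-point wiring (M*N) with a shared protocol (M+N)."""
    return models * tools, models + tools

# The article's example: 2 models, 3 tools
direct, via_mcp = integration_count(2, 3)
print(direct, via_mcp)  # 6 direct integrations vs. 5 MCP adapters

# The gap widens fast at realistic scale
print(integration_count(20, 100))  # (2000, 120)
```

At two models and three tools the savings look small; at twenty models and a hundred tools, a shared protocol replaces 2,000 bespoke integrations with 120 adapters.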

How MCP works

MCP uses a clean three-part setup:

┌─────────────────┐     ┌─────────────────┐     ┌─────────────────┐
│   MCP Host      │────►│   MCP Client    │────►│   MCP Server    │
│                 │     │                 │     │                 │
│ (Claude Desktop,│     │ (Handles the    │     │ (Exposes tools, │
│  VS Code, etc.) │     │  protocol)      │     │  data sources)  │
└─────────────────┘     └─────────────────┘     └─────────────────┘

MCP Hosts are the apps you use - Claude Desktop, VS Code, Cursor. They coordinate everything and manage the AI conversations.

MCP Clients sit inside hosts. They maintain connections with servers and handle the protocol details.

MCP Servers are lightweight programs that expose specific capabilities. They bridge MCP to APIs, databases, or file systems.
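In practice, a host discovers servers through a local config file. For example, Claude Desktop reads a `claude_desktop_config.json`; a minimal entry launching a local Python server over stdio might look like this (the server name and script path are illustrative):

```json
{
  "mcpServers": {
    "weather": {
      "command": "python",
      "args": ["weather_server.py"]
    }
  }
}
```

The host spawns the listed command, and its embedded MCP client talks to the server's stdin/stdout.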

Three types of capabilities

MCP defines three ways AI can interact with external systems:

Resources (Read-only data)
├── Documents
├── Database records  
└── Configuration files

Tools (Executable functions)
├── API calls
├── File operations
└── System commands

Prompts (Templates & instructions)
├── Pre-defined workflows
├── User guidance
└── Best practices

Resources are like GET requests - they provide data without changing anything.

Tools are like POST requests - they perform actions and can have side effects.

Prompts help users and AI work together better by providing templates and guidance.
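On the wire, each capability maps to its own JSON-RPC 2.0 method family. A sketch of the three request shapes (the method names come from the MCP spec; the ids and arguments here are made up):

```python
# MCP messages are JSON-RPC 2.0; each capability type has its own method.
read_resource = {              # Resources: read-only, like GET
    "jsonrpc": "2.0", "id": 1,
    "method": "resources/read",
    "params": {"uri": "greeting://Alice"},
}
call_tool = {                  # Tools: executable, like POST
    "jsonrpc": "2.0", "id": 2,
    "method": "tools/call",
    "params": {"name": "get_weather", "arguments": {"city": "Tokyo"}},
}
get_prompt = {                 # Prompts: reusable templates
    "jsonrpc": "2.0", "id": 3,
    "method": "prompts/get",
    "params": {"name": "code-review", "arguments": {"language": "python"}},
}

for msg in (read_resource, call_tool, get_prompt):
    print(msg["method"])
```

Servers also expose listing methods (`resources/list`, `tools/list`, `prompts/list`) so clients can discover what is available at runtime.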

Building MCP servers

The official SDKs make this pretty straightforward. Here's Python using FastMCP:

import httpx
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Weather Server")

@mcp.tool()
async def get_weather(city: str) -> str:
    """Get current weather for a city"""
    async with httpx.AsyncClient() as client:
        # Placeholder endpoint - substitute a real weather API
        response = await client.get(f"https://api.weather.com/{city}")
        return response.text

@mcp.resource("greeting://{name}")
def get_greeting(name: str) -> str:
    """Get a personalized greeting"""
    return f"Hello, {name}!"

if __name__ == "__main__":
    mcp.run()

TypeScript is just as clean:

import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({
  name: "Database Server",
  version: "1.0.0"
});

server.tool("query-database",
  { sql: z.string() },
  async ({ sql }) => {
    const results = await database.query(sql); // assumes a pre-configured DB client
    return {
      content: [{
        type: "text",
        text: JSON.stringify(results, null, 2)
      }]
    };
  }
);

const transport = new StdioServerTransport();
await server.connect(transport);

Real companies using MCP

Block deployed MCP across the whole company: by early 2025, about 4,000 of Block's 10,000 employees (spanning 15 different job roles) were actively using its MCP-powered AI agent, "Goose".[^1] Engineering, design, security, compliance, customer support, and sales teams all use it. The payoff: less mechanical busywork and better cross-team collaboration, with even non-technical teams like marketing adopting it.

Stripe released an official MCP server for payment processing in February 2025, described as a "fintech bombshell" that enables AI agents to issue invoices, manage subscriptions, and process refunds with a single command.[^2] PayPal followed suit in April 2025 with their PayPal Agent Toolkit, providing public MCP access to their payments and invoicing APIs.[^3]

In manufacturing, companies use MCP for real-time machinery monitoring. One factory implementing MCP for predictive maintenance achieved a 25% reduction in machine downtime.[^4] An automotive manufacturer reported an even larger improvement with a 40% reduction in unexpected equipment downtime after deploying MCP-driven monitoring across production lines.[^5]

Healthcare systems connect AI diagnostics with medical imaging through MCP. A healthcare provider deployed an MCP-integrated AI diagnostics tool that reduced patient waiting times by about 30% by pulling real-time data from medical records and imaging to help doctors make faster decisions.[^6]

An e-commerce company using MCP for personalized recommendations reported an 18% lift in sales conversions in a single quarter, with a comparable increase in revenue.[^7]

MongoDB released a public MCP server in May 2025, allowing developers to manage MongoDB databases through natural language via AI assistants.[^8] The MongoDB MCP Server supports both MongoDB Atlas and Enterprise deployments.

By spring 2025, over 5,000 active MCP servers were listed in public directories like Glama,[^9] with major companies including Block, Apollo, Replit, Codeium, and Sourcegraph adding MCP support to their platforms.[^10]

MCP vs the competition

Most alternatives are either proprietary or solve different problems.

Integration Approaches Comparison:

Traditional APIs:
Custom code for each connection
No AI-specific features
Maintenance nightmare

OpenAI Function Calling:
Locked to OpenAI only
Platform-specific
Limited ecosystem

LangChain/Agent Frameworks:
Different abstraction level
Works WITH MCP now
Developer tools, not protocol

MCP:
Universal protocol
Works with any AI model
Growing ecosystem
Open standard

MCP wins because it's open, comprehensive, and first to market with real traction.

Why choose MCP

You write less code. Instead of custom integrations for every tool, you implement MCP once and connect to everything.

No vendor lock-in. MCP servers work with Claude, ChatGPT, and open-source models. Switch AI providers without rebuilding.

AI gets smarter. Models can discover new tools automatically. Your AI system improves as you add servers.

Better security. Built-in authentication, granular permissions, and clear security boundaries.

It scales. Block's company-wide deployment proves it works at enterprise scale.

You save money. Less custom development, less maintenance, more efficiency.

Current issues to watch

Security needs attention. The original spec had authentication gaps. OAuth 2.1 support came in April 2025, but token management is still tricky.

Prompt injection risks. Malicious tool descriptions can manipulate AI behavior. Vet your servers carefully.

Implementation complexity. Minimal servers are short, but production-grade servers (auth, error handling, logging) still take real work. Debugging tools are limited.

Performance concerns. Multiple servers can bloat context windows. Monitor AI model performance.

No central discovery. You configure servers manually. Quality control for community servers varies.

What's coming next

Better authentication with comprehensive OAuth 2.1 framework.

Improved transport replacing legacy HTTP+SSE with Streamable HTTP.

Performance boosts through JSON-RPC batching.

Marketplaces for finding and distributing servers. Mintlify's mcpt and Smithery are early examples.

Major platform adoption. Microsoft is integrating MCP into Windows 11. Google confirmed Gemini support in April 2025.

Advanced workflows with agent graphs and multi-modal support.

MCP in action: Shinkai's integration

Shinkai shows how MCP works in practice. It's a free, open-source AI app that lets you create and manage AI agents without coding.

Here's how Shinkai uses MCP:

Shinkai Agent Ecosystem:
┌─────────────────┐     ┌─────────────────┐     ┌─────────────────┐
│   Shinkai App   │────►│   MCP Bridge    │────►│ External Tools  │
│                 │     │                 │     │                 │
│ • Agent Builder │     │ • Protocol      │     │ • Web Scraping  │
│ • Chat Interface│     │   Translation   │     │ • YouTube API   │
│ • Memory System │     │ • Tool Registry │     │ • Gmail Access  │
│ • Crypto Wallet │     │                 │     │ • Twitter API   │
└─────────────────┘     └─────────────────┘     └─────────────────┘

Two-way MCP support:

  1. Consumer mode - Shinkai agents can use existing MCP servers. Install web scraping, YouTube transcript downloading, Gmail reading, and other tools through MCP.
  2. Server mode - Expose your Shinkai agents as MCP servers. Other apps like Claude Desktop, Cursor, and VS Code can then use your custom agents.

This means you can:

  • Build an agent in Shinkai's no-code interface
  • Expose it as an MCP server
  • Use it from any MCP-compatible app

Why this matters:

You get the best of both worlds. Shinkai's visual agent builder for creation, plus MCP's universal compatibility for deployment.

Your agents aren't locked into Shinkai. They become part of the broader MCP ecosystem.

Crypto integration:

Shinkai adds crypto capabilities through the x402 payment protocol. Your agents can:

  • Pay for services autonomously
  • Receive payments for their work
  • Maintain their own crypto wallets
  • Work in decentralized environments

This creates new possibilities for AI agents that can operate independently in crypto-native workflows.

Bottom line

MCP is becoming the standard way AI connects to external systems. Early adopters get competitive advantages through reduced development costs and increased AI capabilities.

If you're building AI-powered applications, evaluate MCP now. Focus on security-first deployments and comprehensive governance.

The next phase of AI development will be defined by systems that seamlessly connect digital and physical worlds. MCP provides the foundation for this connected AI future.

Companies that adopt MCP early will be best positioned when AI agents become mainstream. It's not just a technical standard - it's the infrastructure enabling AI's next big leap from chatbots to active participants in real workflows.


References:

[^1]: Anthropic Case Study – "Block uses Claude (with MCP) in Databricks" (2025)
[^2]: ANOTHER SPACE fintech blog – "Stripe Launches MCP Server" (Feb. 28, 2025)
[^3]: InfoQ News – "PayPal's Agent Toolkit with MCP" (Apr. 28, 2025)
[^4]: Arsturn Blog – "MCP for Automation – Case Studies" (2025)
[^5]: BytePlus Blog – "MCP for Predictive Maintenance" (2025)
[^6]: Arsturn Blog – healthcare case study (2025)
[^7]: Arsturn Blog – "E-commerce Enhancements with MCP" (2025)
[^8]: MongoDB Blog – "Announcing the MongoDB MCP Server (Public Preview)" (May 1, 2025)
[^9]: Wikipedia (MCP) – community adoption and server count as of May 2025
[^10]: TechCrunch – "OpenAI adopts rival Anthropic's standard…" (Mar. 26, 2025)

Nico Arqueros

crypto builder (code, research and product) working on @shinkai_network by @dcspark_io