Concepts

Model Context Protocol (MCP)

Open protocol created by Anthropic that standardizes how AI applications connect with external tools, data, and services through a universal interface.

growing · #mcp #protocol #ai-tools #anthropic #json-rpc #open-standard #interoperability

What it is

The Model Context Protocol (MCP) is an open standard launched by Anthropic in November 2024 that defines how AI applications connect with external tools and data sources. It plays the same role for the AI ecosystem that the Language Server Protocol (LSP) plays for code editors: a universal interface that eliminates the need for point-to-point integrations.

Before MCP, every AI tool needed its own custom integration with each external service. If you had 10 AI tools and 10 services, you needed 100 integrations. With MCP, each service implements an MCP server once, and any MCP client can connect to it.
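The scaling argument is simple arithmetic: point-to-point wiring grows multiplicatively, while a shared protocol grows additively. A quick sketch:

```python
# Point-to-point: every (tool, service) pair needs its own integration.
def point_to_point(tools: int, services: int) -> int:
    return tools * services

# With a shared protocol: each tool implements one client,
# each service implements one server.
def with_mcp(tools: int, services: int) -> int:
    return tools + services

print(point_to_point(10, 10))  # 100 integrations
print(with_mcp(10, 10))        # 20 implementations
```

At 10 tools and 10 services the gap is 100 vs 20; it widens as the ecosystem grows.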

Architecture

MCP uses JSON-RPC 2.0 messages and defines three roles:

  • Host: the AI application that initiates the connection (e.g., Claude Desktop, Kiro CLI, an IDE)
  • Client: the connector within the host that manages communication with a specific server
  • Server: the service that exposes tools, resources, and context to the model
graph LR
  subgraph "Host (IDE / CLI)"
    A[LLM] --> B[MCP Client]
  end
  B -->|JSON-RPC| C[MCP Server A<br/>Database]
  B -->|JSON-RPC| D[MCP Server B<br/>External API]
  B -->|JSON-RPC| E[MCP Server C<br/>File System]
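Concretely, a client opens a session by sending an `initialize` request. A minimal sketch of that first JSON-RPC 2.0 message, assuming the 2024-11-05 protocol revision; the client name is invented for illustration:

```python
import json

# First message a client sends to a server: the initialize request.
# "2024-11-05" is the protocol revision being negotiated.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},  # features this client supports
        "clientInfo": {"name": "example-host", "version": "0.1.0"},  # hypothetical client
    },
}

print(json.dumps(initialize_request, indent=2))
```

The server answers with its own capabilities, after which both sides know which tools, resources, and prompts are available.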

Core capabilities

Tools

Functions the model can invoke. Each tool has a name, description, and typed input schema. The model decides when and how to use them.

{
  "name": "create_todo",
  "description": "Create a new todo item",
  "inputSchema": {
    "type": "object",
    "properties": {
      "title": { "type": "string" }
    },
    "required": ["title"]
  }
}
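When the model decides to use this tool, the client sends a `tools/call` request whose `arguments` must satisfy the declared `inputSchema`. A sketch of that message (the todo title is made up):

```python
import json

# Invoking the create_todo tool declared above.
call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "create_todo",
        "arguments": {"title": "Write MCP notes"},  # must match inputSchema
    },
}

print(json.dumps(call_request))
```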

Resources

Data the server exposes to the model: files, database records, documentation. Unlike tools, resources are data the model reads but doesn't modify.
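Resources are addressed by URI and fetched read-only, for example with a `resources/read` request. The URI below is illustrative:

```python
import json

# Reading a resource by URI; the server returns its contents,
# it is never asked to modify anything.
read_request = {
    "jsonrpc": "2.0",
    "id": 3,
    "method": "resources/read",
    "params": {"uri": "file:///project/README.md"},  # illustrative URI
}

print(json.dumps(read_request))
```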

Prompts

Instruction templates the server can offer to the host to guide user interaction with the model.
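A host fetches one of these templates with a `prompts/get` request, filling in any arguments the template declares. The prompt name and argument below are invented for illustration:

```python
import json

prompt_request = {
    "jsonrpc": "2.0",
    "id": 4,
    "method": "prompts/get",
    "params": {
        "name": "code_review",                # hypothetical prompt name
        "arguments": {"language": "python"},  # template arguments
    },
}

print(json.dumps(prompt_request))
```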

Transports

MCP supports two transport mechanisms:

Transport   | Use                                                                  | Limitation
stdio       | Local development. The host runs the server as a child process.      | Single client per server.
HTTP + SSE  | Production. Remote connection with Server-Sent Events for streaming. | Requires authentication (OAuth 2.1 since the 2025 spec).
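Over stdio, messages are newline-delimited JSON exchanged on the child process's stdin and stdout. A toy dispatcher illustrating only the framing, not a real server; the empty-result reply is an assumption for demonstration:

```python
import json

def handle_line(line: str) -> str:
    """Parse one newline-delimited JSON-RPC message and build a reply."""
    request = json.loads(line)
    # A minimal JSON-RPC 2.0 response carrying the same id back.
    response = {"jsonrpc": "2.0", "id": request["id"], "result": {}}
    return json.dumps(response) + "\n"  # each message ends with a newline

# A real server would loop over sys.stdin and write to sys.stdout:
reply = handle_line('{"jsonrpc": "2.0", "id": 1, "method": "ping"}')
print(reply, end="")
```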

Specification evolution

  • 2024-11: initial launch. Tools, resources, prompts, stdio transport.
  • 2025-03: remote connections with SSE, tool discovery improvements.
  • 2025-06: OAuth 2.0, resource indicators, security improvements.
  • 2025-11: mandatory OAuth 2.1 with PKCE, async execution, client metadata, enterprise readiness.

Why it matters

MCP solves a fundamental problem: integration fragmentation in the AI ecosystem. Without a standard, every combination of AI tool + external service requires custom code. With MCP:

  • Tool developers implement a server once
  • AI application developers implement a client once
  • Any client can connect to any server
  • Tools are discoverable and self-documented

This is especially relevant for AI agents that need to access multiple tools dynamically.
