# MCP Dual Interface Demo
Demonstration of dual-interface architecture where the same business logic serves both a traditional web application and an MCP server for AI tools.
## The idea
Most teams build applications for humans and then, when they want to integrate AI, create a separate layer on top: wrappers, adapters, intermediary APIs. This creates coupling, duplication, and cascading failure points.
This experiment explores an alternative: building from day one with two interfaces that share the same business logic. One interface for humans (web app) and another for AI agents (MCP server), both accessing the same data layer directly.
## Why it matters
The Model Context Protocol (MCP) is an open standard that allows LLMs to interact with external tools in a structured way. Instead of an AI agent having to navigate a UI or parse HTML, the MCP server exposes operations as typed tools that the model can invoke directly.
The key question this experiment answers is: can you design an application where the human interface and the AI interface are both first-class citizens from day one, without one being a patch on top of the other?
## Architecture
Key architectural principles:
- Loose coupling: each service accesses DynamoDB directly, without depending on the other
- Shared code: business logic lives in a `shared/` module that both interfaces import
- Independent scaling: the REST API and MCP server can scale separately
- No cascading failures: if the REST API goes down, the MCP server keeps working, and vice versa
## Tech stack
| Component | Technology |
|---|---|
| REST API | FastAPI + Python 3.11 |
| MCP Server | Python MCP SDK (stdio transport) |
| Frontend | React 19 + TypeScript + Vite + Tailwind CSS |
| Database | DynamoDB Local |
| Shared Logic | Python module (shared/) |
| Orchestration | Docker Compose |
## How it works
The `shared/todo_service.py` module contains all business logic: create, list, update, and delete tasks. Both the REST API (`backend/main.py`) and the MCP server (`mcp-server/server.py`) instantiate `TodoService` with the same DynamoDB table.
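The shared service can be pictured as a small, interface-agnostic class. This is an illustrative sketch, not the repository's actual code: the method names follow the operations listed below, and a plain dict stands in for the boto3 DynamoDB table the real module would receive.

```python
import uuid


class TodoService:
    """Sketch of shared/todo_service.py: all business logic in one
    interface-independent class. The storage object is injected, so
    both the REST API and the MCP server can pass in the same
    DynamoDB table (here, a dict stands in for it)."""

    def __init__(self, table):
        self.table = table  # shared storage, no interface-specific code

    def create_todo(self, title):
        todo = {"id": str(uuid.uuid4()), "title": title, "done": False}
        self.table[todo["id"]] = todo
        return todo

    def list_todos(self):
        return list(self.table.values())

    def get_todo(self, todo_id):
        return self.table.get(todo_id)

    def update_todo(self, todo_id, **changes):
        todo = self.table[todo_id]
        todo.update(changes)
        return todo

    def delete_todo(self, todo_id):
        return self.table.pop(todo_id, None)
```

Because the constructor only takes a storage handle, each process builds its own `TodoService` independently, which is what keeps the two interfaces loosely coupled.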
The REST API exposes conventional HTTP endpoints:
- `POST /todos` → create task
- `GET /todos` → list tasks
- `GET /todos/{id}` → get task
- `PATCH /todos/{id}` → update task
- `DELETE /todos/{id}` → delete task
The MCP server exposes the same operations as typed tools that an LLM can invoke:
- `create_todo` → create task
- `list_todos` → list tasks
- `get_todo` → get task
- `update_todo` → update task
- `delete_todo` → delete task
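The MCP side stays equally thin. The sketch below shows the idea without the SDK: each tool name maps straight to a service method, the way an MCP server routes a `tools/call` request to its registered handler. The real `mcp-server/server.py` would register these with the Python MCP SDK instead of a dict; the in-memory service is again an illustrative stand-in for the shared module.

```python
import uuid


class TodoService:
    """In-memory stand-in for the shared TodoService module."""

    def __init__(self):
        self.table = {}

    def create_todo(self, title):
        todo = {"id": str(uuid.uuid4()), "title": title, "done": False}
        self.table[todo["id"]] = todo
        return todo

    def list_todos(self):
        return list(self.table.values())

    def get_todo(self, todo_id):
        return self.table.get(todo_id)

    def update_todo(self, todo_id, **changes):
        self.table[todo_id].update(changes)
        return self.table[todo_id]

    def delete_todo(self, todo_id):
        return self.table.pop(todo_id, None)


service = TodoService()

# tool name -> service method: the MCP layer adds no business logic
TOOLS = {
    "create_todo": service.create_todo,
    "list_todos": service.list_todos,
    "get_todo": service.get_todo,
    "update_todo": service.update_todo,
    "delete_todo": service.delete_todo,
}


def call_tool(name, **arguments):
    """Dispatch an invocation the way the server would handle a
    tools/call request from an LLM client."""
    return TOOLS[name](**arguments)
```

Because the tool table is just a mapping onto the same methods the REST handlers call, the two interfaces cannot drift apart.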
Both interfaces execute exactly the same logic. No business rule duplication.
## AWS mapping
The local architecture maps directly to AWS services for production:
| Local | AWS |
|---|---|
| DynamoDB Local | DynamoDB |
| FastAPI | Lambda + API Gateway or ECS/Fargate |
| MCP Server | Lambda or ECS Task |
| React Frontend | S3 + CloudFront or Amplify |
| Shared module | Lambda Layer or shared package |
## Lessons learned
- stdio transport is simple but limiting: it works well for local development with Kiro CLI, but production would need SSE or WebSocket transport to support multiple concurrent clients.
- The shared layer is the most valuable pattern: separating business logic into an interface-independent module is what makes the dual architecture possible without duplication.
- DynamoDB simplifies direct access: with no complex ORM or migrations required, both services can read and write the same table with minimal configuration.
## References
- GitHub Repository — Source code for the experiment.
- Model Context Protocol — Specification — Anthropic. Official protocol specification.
- Python MCP SDK — Anthropic. Reference implementation in Python.
- FastAPI — Tiangolo. Web framework used for the REST interface.