Introduction to Model Context Protocol
Your AI assistant can write a perfect database query, but it has no idea what tables exist in your production database. It can generate a pixel-perfect React component, but it has never seen your Figma designs. It can draft a commit message, but it cannot read your git history. This disconnect between raw intelligence and actual context is the single biggest bottleneck in AI-assisted development today.
The Model Context Protocol (MCP) eliminates that bottleneck. It is an open standard, originally created by Anthropic and now adopted across the industry, that provides a universal interface for AI assistants to connect with external tools, data sources, and services. Think of it as USB-C for AI: one standardized connection that works with everything.
What You’ll Walk Away With
- A clear mental model of how MCP clients, servers, and transports work together
- Setup instructions for connecting your first MCP server in Cursor, Claude Code, and Codex
- An understanding of the difference between local (STDIO) and remote (Streamable HTTP) servers
- A framework for deciding which MCP servers to add to your workflow first
How MCP Works
The architecture is straightforward. Three components interact in every MCP connection:
MCP Host — Your AI assistant (Cursor, Claude Code, or Codex). It is the application that needs to access external capabilities.
MCP Client — A lightweight connection layer embedded inside the host. Each client maintains a single connection to one MCP server.
MCP Server — An application that exposes tools, prompts, and resources over the protocol. A server might give the AI the ability to query a database, read Figma designs, or search documentation.
When the AI decides it needs external information, the client sends a structured request to the appropriate server. The server performs the action (querying a database, fetching a file, calling an API) and returns a structured response. The AI then uses that response as context for its next action.
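Under the hood, each exchange is a JSON-RPC 2.0 message over the transport. A tool invocation and its reply look roughly like the sketch below — the `query_database` tool name, its arguments, and the result text are hypothetical, but the envelope shape follows the MCP specification's `tools/call` method.

Request (client → server):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "query_database",
    "arguments": { "sql": "SELECT name FROM users LIMIT 3" }
  }
}
```

Response (server → client):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "content": [
      { "type": "text", "text": "alice\nbob\ncarol" }
    ]
  }
}
```

The AI never parses this envelope itself — the client unwraps the `content` array and feeds it back into the model's context.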
Local vs. Remote MCP Servers
MCP supports two transport modes:
- Local (STDIO) — The MCP server runs on your machine. The client communicates over standard input/output. This is how most filesystem, git, and database servers work. Fast, no network dependency, but limited to your local environment.
- Remote (Streamable HTTP / SSE) — The MCP server runs in the cloud. The client connects over HTTPS, typically with OAuth 2.1 for authentication. This is how Atlassian, Cloudflare, and other SaaS platforms expose their MCP servers. Works from anywhere, but requires network access and authentication.
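In configuration, the difference between the two modes is usually a single field: local servers get a `command` to launch, remote servers get a `url` to connect to. A sketch of both shapes in a Cursor-style mcp.json — the server names, package, and URL are placeholders, and exact field names vary by client, so check your client's MCP documentation:

```json
{
  "mcpServers": {
    "local-example": {
      "command": "npx",
      "args": ["-y", "some-local-mcp-server"]
    },
    "remote-example": {
      "url": "https://mcp.example.com/mcp"
    }
  }
}
```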
Setting Up Your First MCP Server
Let’s connect the Context7 documentation server — it gives your AI up-to-date docs for thousands of open-source libraries. This single server eliminates the most common failure mode: the AI generating code for deprecated APIs.
Create or edit .cursor/mcp.json in your project root:
```json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp@latest"]
    }
  }
}
```

Open Settings > MCP Servers and verify the green dot appears next to “context7”. If it shows red, restart Cursor.
Add to your project’s .claude/settings.json (or run claude mcp add):
```json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp@latest"]
    }
  }
}
```

Run /mcp in Claude Code to verify the connection. You should see “context7: connected”.
Add to your ~/.codex/config.toml:
```toml
[mcp.context7]
transport = "stdio"
command = "npx"
args = ["-y", "@upstash/context7-mcp@latest"]
```

Restart Codex to load the new server. The tools will appear in your available tool list.
The MCP Ecosystem at a Glance
Over 20,000 MCP servers exist today. They fall into predictable categories:
| Category | What It Does | Examples |
|---|---|---|
| Documentation | Up-to-date library docs, API references | Context7, Augments, Ref Tools |
| Version Control | Git operations, PR management, code search | GitHub MCP, GitLab MCP, Git MCP |
| Filesystem | File read/write, directory operations | Official Filesystem, Desktop Commander |
| Browser | Page interaction, screenshots, E2E testing | Playwright MCP, Puppeteer MCP |
| Database | Schema inspection, queries, migrations | Prisma, Supabase, MongoDB, SQLite |
| Project Management | Tickets, sprints, status updates | Atlassian Rovo, Linear, Jira |
| Design | Figma layer data, component libraries | Figma Dev Mode, shadcn/ui MCP |
| Cloud Platform | Resource management, deployments, logs | Cloudflare, Azure MCP, AWS MCP Suite |
| Observability | Error tracking, logs, metrics | Sentry, Datadog, Grafana |
How to Think About Adding MCP Servers
Do not install twenty servers on day one. Start with the tools that solve your biggest context gaps:
1. Start with documentation. Context7 gives every prompt access to current library docs. This single server eliminates an entire class of hallucination.
2. Add version control. The GitHub or GitLab MCP server lets your AI read PRs, search code across repos, and understand your project’s history without you pasting links into chat.
3. Connect your database. Whether it is Postgres, MongoDB, or SQLite, letting the AI inspect schemas and run read-only queries transforms how it writes data access code.
4. Layer in your workflow tools. Figma for design-to-code, Jira or Linear for ticket context, Playwright for browser testing. Each server you add closes another context gap.
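Put together, a project’s mcp.json after the first three steps might look like the sketch below. The github and postgres entries are illustrative — package names, arguments, and required environment variables vary by server, so check each server’s README before copying:

```json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp@latest"]
    },
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": { "GITHUB_PERSONAL_ACCESS_TOKEN": "<your-token>" }
    },
    "postgres": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-postgres", "postgresql://localhost/mydb"]
    }
  }
}
```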
When This Breaks
Server won’t connect. The most common cause is a missing runtime. MCP servers built with Node.js require npx or node on your PATH. Python-based servers need uv or pip. Check the server’s README for prerequisites.
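A quick way to check, assuming a POSIX shell — the tool list here is illustrative, so match it to the runtimes your servers actually need:

```shell
# Report which runtimes commonly needed by MCP servers are on PATH.
for tool in node npx uv pip; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: $(command -v "$tool")"
  else
    echo "$tool: missing"
  fi
done
```

Any tool reported as missing needs to be installed before servers that depend on it will start.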
Tools don’t appear after connection. Some clients cache the tool list. Restart your editor or run the MCP debug command (/mcp in Claude Code, check the MCP panel in Cursor) to force a refresh.
Remote servers hang on OAuth. If the OAuth flow opens a browser but never completes, check that your localhost callback URL isn’t blocked by a VPN or firewall. Atlassian and Cloudflare remote servers require the browser to redirect back to a local port.
AI uses tools incorrectly. MCP servers expose tool descriptions that the AI reads to decide when and how to call them. If the AI misuses a tool, the problem is usually in the tool’s description, not the AI. When building custom servers, invest in clear, specific tool descriptions.
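As an illustration, compare a vague tool definition with a specific one. The shape follows the MCP `tools/list` response format; the tool itself is hypothetical:

```json
{
  "name": "search_docs",
  "description": "Full-text search over this project's internal API documentation. Use when the user asks about endpoints, parameters, or auth flows. Returns the top 5 matching sections as markdown.",
  "inputSchema": {
    "type": "object",
    "properties": {
      "query": {
        "type": "string",
        "description": "Plain-language search phrase, e.g. 'rotate API keys'"
      }
    },
    "required": ["query"]
  }
}
```

A description like “Searches stuff” forces the model to guess when to call the tool and what to pass; the version above states scope, trigger conditions, input format, and return shape, which is what the model actually reasons over.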