Introduction to Model Context Protocol

Your AI assistant can write a perfect database query, but it has no idea what tables exist in your production database. It can generate a pixel-perfect React component, but it has never seen your Figma designs. It can draft a commit message, but it cannot read your git history. This disconnect between raw intelligence and actual context is the single biggest bottleneck in AI-assisted development today.

The Model Context Protocol (MCP) eliminates that bottleneck. It is an open standard, originally created by Anthropic and now adopted across the industry, that provides a universal interface for AI assistants to connect with external tools, data sources, and services. Think of it as USB-C for AI: one standardized connection that works with everything.

By the end of this introduction, you will have:

  • A clear mental model of how MCP clients, servers, and transports work together
  • Setup instructions for connecting your first MCP server in Cursor, Claude Code, and Codex
  • An understanding of the difference between local (STDIO) and remote (Streamable HTTP) servers
  • A framework for deciding which MCP servers to add to your workflow first

The architecture is straightforward. Three components interact in every MCP connection:

MCP Host — Your AI assistant (Cursor, Claude Code, or Codex). It is the application that needs to access external capabilities.

MCP Client — A lightweight connection layer embedded inside the host. Each client maintains a single connection to one MCP server.

MCP Server — An application that exposes tools, prompts, and resources over the protocol. A server might give the AI the ability to query a database, read Figma designs, or search documentation.

When the AI decides it needs external information, the client sends a structured request to the appropriate server. The server performs the action (querying a database, fetching a file, calling an API) and returns a structured response. The AI then uses that response as context for its next action.
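Under the hood, this exchange rides on JSON-RPC 2.0, with `tools/call` as the method for invoking a server's tool. A minimal sketch of the two messages, assuming a hypothetical `query_database` tool (the tool name and arguments are illustrative, not part of any real server):

```python
import json

# What the client sends when the AI decides to call a tool.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "query_database",  # hypothetical tool exposed by a server
        "arguments": {"sql": "SELECT * FROM users LIMIT 5"},
    },
}

# What the server returns: structured content the host feeds back
# to the model as context for its next step.
response = {
    "jsonrpc": "2.0",
    "id": 1,  # matches the request id
    "result": {
        "content": [{"type": "text", "text": "id | name\n1  | Ada"}],
    },
}

wire = json.dumps(request)  # the serialized message that crosses the transport
print(wire)
```

The `id` field is what lets the client pair each response with its originating request when several calls are in flight.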

MCP supports two transport modes:

  • Local (STDIO) — The MCP server runs on your machine. The client communicates over standard input/output. This is how most filesystem, git, and database servers work. Fast, no network dependency, but limited to your local environment.
  • Remote (Streamable HTTP / SSE) — The MCP server runs in the cloud. The client connects over HTTPS, typically with OAuth 2.1 for authentication. This is how Atlassian, Cloudflare, and other SaaS platforms expose their MCP servers. Works from anywhere, but requires network access and authentication.
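For the STDIO transport, the client launches the server as a child process and exchanges newline-delimited JSON-RPC messages over its stdin/stdout. A rough sketch of the framing, assuming the standard `initialize` handshake (the protocol version string is an example; real clients also negotiate capabilities):

```python
import json

def frame(message: dict) -> bytes:
    # STDIO transport: one JSON-RPC message per line, UTF-8 encoded.
    return (json.dumps(message) + "\n").encode("utf-8")

# The handshake every connection starts with: an "initialize" request
# announcing the client's identity and protocol version.
init = {
    "jsonrpc": "2.0",
    "id": 0,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",  # example version string
        "clientInfo": {"name": "example-client", "version": "0.1"},
        "capabilities": {},
    },
}

payload = frame(init)

# A real client would spawn the server and write the framed message
# to its stdin, e.g.:
#   proc = subprocess.Popen(["npx", "-y", "some-mcp-server"],
#                           stdin=subprocess.PIPE, stdout=subprocess.PIPE)
#   proc.stdin.write(payload)
```

This is why STDIO servers have no network dependency: the "connection" is just two pipes between parent and child process.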

Let’s connect the Context7 documentation server — it gives your AI up-to-date docs for thousands of open-source libraries. This single server eliminates the most common failure mode: the AI generating code for deprecated APIs.

Create or edit .cursor/mcp.json in your project root:

{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp@latest"]
    }
  }
}

Open Settings > MCP Servers and verify the green dot appears next to “context7”. If it shows red, restart Cursor.
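The same server works in the other two hosts. In Claude Code, servers can be registered from the terminal with the `claude mcp add` command; in Codex, they are declared in `~/.codex/config.toml`. Both snippets below follow the shapes those tools document, but verify against your installed version, as exact flags and keys may differ:

```shell
# Claude Code: everything after "--" is the server's launch command.
claude mcp add context7 -- npx -y @upstash/context7-mcp@latest
```

```toml
# Codex (~/.codex/config.toml)
[mcp_servers.context7]
command = "npx"
args = ["-y", "@upstash/context7-mcp@latest"]
```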

Over 20,000 MCP servers exist today. They fall into predictable categories:

| Category | What It Does | Examples |
| --- | --- | --- |
| Documentation | Up-to-date library docs, API references | Context7, Augments, Ref Tools |
| Version Control | Git operations, PR management, code search | GitHub MCP, GitLab MCP, Git MCP |
| Filesystem | File read/write, directory operations | Official Filesystem, Desktop Commander |
| Browser | Page interaction, screenshots, E2E testing | Playwright MCP, Puppeteer MCP |
| Database | Schema inspection, queries, migrations | Prisma, Supabase, MongoDB, SQLite |
| Project Management | Tickets, sprints, status updates | Atlassian Rovo, Linear, Jira |
| Design | Figma layer data, component libraries | Figma Dev Mode, shadcn/ui MCP |
| Cloud Platform | Resource management, deployments, logs | Cloudflare, Azure MCP, AWS MCP Suite |
| Observability | Error tracking, logs, metrics | Sentry, Datadog, Grafana |

Do not install twenty servers on day one. Start with the tools that solve your biggest context gaps:

  1. Start with documentation. Context7 gives every prompt access to current library docs. This single server eliminates an entire class of hallucination.

  2. Add version control. The GitHub or GitLab MCP server lets your AI read PRs, search code across repos, and understand your project’s history without you pasting links into chat.

  3. Connect your database. Whether it is Postgres, MongoDB, or SQLite, letting the AI inspect schemas and run read-only queries transforms how it writes data access code.

  4. Layer in your workflow tools. Figma for design-to-code, Jira or Linear for ticket context, Playwright for browser testing. Each server you add closes another context gap.

Server won’t connect. The most common cause is a missing runtime. MCP servers built with Node.js require npx or node on your PATH. Python-based servers need uv or pip. Check the server’s README for prerequisites.
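A quick way to rule out the missing-runtime cause is to check for the required executables on your PATH. A small sketch in Python (the command list is illustrative; substitute whatever the server's README names):

```python
import shutil

def missing_runtimes(required):
    """Return the subset of executables not found on PATH."""
    return [cmd for cmd in required if shutil.which(cmd) is None]

# Node-based MCP servers typically need npx; Python-based ones
# typically need uv or pip.
print(missing_runtimes(["npx", "uv"]))
```

An empty list means the runtimes are present and the problem lies elsewhere, such as the server's own arguments or environment variables.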

Tools don’t appear after connection. Some clients cache the tool list. Restart your editor or run the MCP debug command (/mcp in Claude Code, check the MCP panel in Cursor) to force a refresh.

Remote servers hang on OAuth. If the OAuth flow opens a browser but never completes, check that your localhost callback URL isn’t blocked by a VPN or firewall. Atlassian and Cloudflare remote servers require the browser to redirect back to a local port.

AI uses tools incorrectly. MCP servers expose tool descriptions that the AI reads to decide when and how to call them. If the AI misuses a tool, the problem is usually in the tool’s description, not the AI. When building custom servers, invest in clear, specific tool descriptions.
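Concretely, each tool a server advertises carries a name, a natural-language description, and a JSON Schema for its inputs, and that metadata is the only guidance the model gets. A sketch contrasting a vague description with a specific one (the tool itself is hypothetical):

```python
# A tool definition as an MCP server might advertise it. The model
# sees only this metadata when deciding whether and how to call it.
vague_tool = {
    "name": "query",
    "description": "Runs a query.",  # query what? with what syntax?
    "inputSchema": {
        "type": "object",
        "properties": {"q": {"type": "string"}},
    },
}

specific_tool = {
    "name": "query_orders_db",
    "description": (
        "Run a read-only SQL SELECT against the orders PostgreSQL "
        "database. Use for questions about order volume, status, or "
        "totals. Writes (INSERT/UPDATE/DELETE) will be rejected."
    ),
    "inputSchema": {
        "type": "object",
        "properties": {
            "sql": {
                "type": "string",
                "description": "A single SELECT statement.",
            }
        },
        "required": ["sql"],
    },
}
```

The second description tells the model when to reach for the tool, what input it expects, and what it must not attempt, which is exactly the information a misfiring AI was missing.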