Build Your First MCP Server
A working MCP server that exposes tools to Claude Desktop, Cursor, or your own agent — built from scratch in under an hour.
1. Understand what MCP is (in 60 seconds)
The Model Context Protocol (MCP) is an open standard for letting AI clients (Claude Desktop, Cursor, agents) talk to external tools and data without each app reinventing the wheel. An MCP server exposes Tools (functions the AI can call), Resources (data it can read), and Prompts (templates). One server works with every compatible client: write once, plug in anywhere.
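Under the hood it is JSON-RPC 2.0: the client sends requests like `tools/list` and `tools/call`, and the server answers. A sketch of the two core messages (the tool name `read_my_notes` and its arguments are this tutorial's running example, not part of the protocol):

```typescript
// Client asks the server what tools it offers.
const listRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/list",
};

// Client invokes one of those tools by name, with JSON arguments.
const callRequest = {
  jsonrpc: "2.0",
  id: 2,
  method: "tools/call",
  params: {
    name: "read_my_notes",
    arguments: { folder: "/notes" },
  },
};
```

Over the stdio transport, each of these travels as one newline-delimited JSON object; the SDK handles the framing for you.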
2. Scaffold the server
`npm install @modelcontextprotocol/sdk` (or `pip install mcp` for Python). Create `server.ts`. Initialize with `new Server({ name: "my-mcp-server", version: "1.0.0" })`. Register a transport and connect with `server.connect(transport)`; `StdioServerTransport` is the simplest (the server talks JSON-RPC over stdin/stdout, which is how desktop apps mount it).
3. Define your first Tool
Tools are typed function calls. Register `server.setRequestHandler(ListToolsRequestSchema, ...)` to advertise what you offer, and `server.setRequestHandler(CallToolRequestSchema, ...)` to handle invocations. A useful first tool: `read_my_notes(folder: string) → list of files`. Each tool's `inputSchema` is JSON Schema; the AI client uses it to know how to call you.
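A hedged sketch of the tool declaration and its handler. The handler is kept as a plain function (easy to test in isolation); in the real server you would call it from your `CallToolRequestSchema` handler. The description strings are placeholders:

```typescript
import { readdirSync } from "node:fs";

// Declaration returned from the ListTools handler; inputSchema is JSON Schema.
export const readMyNotesTool = {
  name: "read_my_notes",
  description: "List the files in a notes folder",
  inputSchema: {
    type: "object",
    properties: {
      folder: { type: "string", description: "Absolute path to the folder" },
    },
    required: ["folder"],
  },
};

// Logic run when the client sends tools/call for read_my_notes.
// Tool results are a content array; text is the simplest content type.
export function handleReadMyNotes(args: { folder: string }) {
  const files = readdirSync(args.folder);
  return {
    content: [{ type: "text", text: files.join("\n") }],
  };
}
```

Keeping schema and handler side by side makes it obvious when they drift apart, which is the most common MCP tool bug.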
4. Add a Resource
Resources are read-only data the AI can pull. Register a list handler with `ListResourcesRequestSchema` returning resource URIs (e.g. `file:///notes/my-readme.md`), and a read handler with `ReadResourceRequestSchema` returning the content. Now the AI can ask "give me the contents of my-readme.md" without you exposing a tool for it.
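The two handlers can be sketched as plain functions in the same style (wire them with `ListResourcesRequestSchema` and `ReadResourceRequestSchema`). The notes folder location and the `text/markdown` MIME type are assumptions for this example:

```typescript
import { readdirSync, readFileSync } from "node:fs";
import { join } from "node:path";

// Assumption: where your notes live; adjust to your machine.
const NOTES_DIR = "/notes";

// ListResources: advertise one file:// URI per note.
export function listResources(dir: string = NOTES_DIR) {
  return {
    resources: readdirSync(dir).map((name) => ({
      uri: `file://${join(dir, name)}`,
      name,
      mimeType: "text/markdown",
    })),
  };
}

// ReadResource: resolve the URI back to a path and return the contents.
export function readResource(uri: string) {
  const path = uri.replace(/^file:\/\//, "");
  return {
    contents: [{ uri, mimeType: "text/markdown", text: readFileSync(path, "utf8") }],
  };
}
```

Note the asymmetry with tools: the client chooses when to read a resource, so the server never sees "arguments", only the URI it previously advertised.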
5. Mount in Claude Desktop
Edit `~/Library/Application Support/Claude/claude_desktop_config.json` (on Windows, `%APPDATA%\Claude\claude_desktop_config.json`), add `"mcpServers": { "my-server": { "command": "node", "args": ["/path/to/server.js"] } }`, and restart Claude Desktop. You should now see your tools available; Claude can call them inside any conversation.
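Formatted, the config entry looks like this (the server key and the path are examples; point `args` at your compiled `server.js`):

```json
{
  "mcpServers": {
    "my-server": {
      "command": "node",
      "args": ["/path/to/server.js"]
    }
  }
}
```

`command` and `args` are exactly what Claude Desktop will spawn, so anything `node` needs (version, PATH, env) must be available to the desktop app, not just your terminal.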
6. Production-grade MCP
Real MCP servers need auth, sandboxing, structured error handling, observability, and remote transport (SSE, HTTP) for cloud deployment. The MCP Servers and Tool Ecosystems course on Local AI Master covers all of it — this 45-minute build is chapter 4.
Continue with the full MCP Servers and Tool Ecosystems course on Local AI Master.
This page is one chapter of a structured course covering everything from foundations to production. Try Pro free for 7 days — full access to all 264 chapters across 10 courses, no charge until day 8, cancel anytime.