MCP Servers Explained: The Protocol Powering AI Agents
MCP Quick Reference
Key Insight: MCP is like USB for AI: a universal standard that lets any AI connect to any tool without custom integration code.
What is Model Context Protocol (MCP)?
Model Context Protocol (MCP) is an open standard developed by Anthropic that defines how AI models interact with external systems. Released in late 2024 and rapidly adopted throughout 2025-2026, MCP provides a universal interface for AI to access files, databases, APIs, and tools.
Before MCP, every AI integration required custom code. Want Claude to read your files? Write custom code. Want it to query a database? More custom code. Want GPT to do the same things? Rewrite everything.
MCP changes this by providing a standardized protocol that any AI can speak. One integration works everywhere.
Why MCP Matters
| Before MCP | With MCP |
|---|---|
| Custom code per AI provider | One integration, all providers |
| Security handled ad-hoc | Built-in security primitives |
| No resource discovery | AI discovers available tools |
| Inconsistent interfaces | Standardized JSON-RPC protocol |
| Vendor lock-in | Open standard, MIT licensed |
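The "standardized JSON-RPC protocol" row is concrete: every MCP message is a JSON-RPC 2.0 request or response. Here is a sketch of a tool invocation on the wire (the get_weather tool and its arguments are hypothetical; the message shape follows the MCP spec):

Request (host to server):

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get_weather",
    "arguments": { "city": "Berlin" }
  }
}

Response (server to host):

{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "content": [{ "type": "text", "text": "Weather in Berlin: 18°C, cloudy" }]
  }
}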
MCP Architecture
MCP follows a client-server architecture:
Components
- MCP Hosts (Clients)
  - AI applications like Claude Desktop and Open WebUI
  - Connect to multiple MCP servers simultaneously
  - Manage user approvals and security
- MCP Servers
  - Expose specific capabilities (files, databases, APIs)
  - Run as separate processes for isolation
  - Can be local or remote
- Transport Layer
  - stdio: local servers via standard input/output
  - Streamable HTTP: remote servers over HTTP (supersedes the earlier SSE transport)
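To make the host side concrete, here is a minimal client sketch using the official TypeScript SDK: it spawns a local server over stdio and discovers its tools. The server command and /data path are illustrative:

// client.ts
import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { StdioClientTransport } from '@modelcontextprotocol/sdk/client/stdio.js';

// Spawn a local MCP server as a child process and talk to it over stdio
const transport = new StdioClientTransport({
  command: 'npx',
  args: ['-y', '@modelcontextprotocol/server-filesystem', '/data'],
});

const client = new Client(
  { name: 'example-host', version: '1.0.0' },
  { capabilities: {} }
);

await client.connect(transport);

// Discover what the server offers
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));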
Protocol Flow
User Request → AI Model → MCP Client → MCP Server → External System
                              ↑                          │
                              └──────── Response ────────┘
Top MCP Servers (2026)
Official Anthropic Servers
| Server | Purpose | Installation |
|---|---|---|
| filesystem | Read/write local files | npx @modelcontextprotocol/server-filesystem |
| github | Repos, issues, PRs | npx @modelcontextprotocol/server-github |
| postgres | PostgreSQL queries | npx @modelcontextprotocol/server-postgres |
| sqlite | SQLite database | uvx mcp-server-sqlite |
| memory | Persistent knowledge | npx @modelcontextprotocol/server-memory |
| brave-search | Web search | npx @modelcontextprotocol/server-brave-search |
| fetch | HTTP requests | uvx mcp-server-fetch |
Popular Community Servers
| Server | Purpose | Repository |
|---|---|---|
| slack | Slack messaging | modelcontextprotocol/servers |
| google-drive | Google Drive files | modelcontextprotocol/servers |
| notion | Notion databases | community |
| linear | Linear issues | community |
| aws | AWS services | community |
| docker | Container management | community |
| kubernetes | K8s cluster ops | community |
Setting Up MCP with Claude Desktop
Step 1: Locate Config File
macOS:
~/Library/Application Support/Claude/claude_desktop_config.json
Windows:
%APPDATA%\Claude\claude_desktop_config.json
Step 2: Add MCP Servers
Create or edit the config file:
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/Users/yourname/Documents",
        "/Users/yourname/Projects"
      ]
    },
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": {
        "GITHUB_PERSONAL_ACCESS_TOKEN": "ghp_your_token_here"
      }
    },
    "sqlite": {
      "command": "uvx",
      "args": ["mcp-server-sqlite", "--db-path", "/path/to/your/database.db"]
    }
  }
}
Step 3: Restart Claude Desktop
After saving the config, restart Claude Desktop. You'll see available tools in the interface.
Step 4: Test the Connection
Ask Claude:
- "What files are in my Documents folder?"
- "Show me my recent GitHub issues"
- "Query the users table in my database"
Building a Custom MCP Server
Here's a simple MCP server in TypeScript:
Install the SDK
npm init -y
npm install @modelcontextprotocol/sdk
Create the Server
// server.ts
import { Server } from '@modelcontextprotocol/sdk/server/index.js';
import { StdioServerTransport } from '@modelcontextprotocol/sdk/server/stdio.js';
import {
  CallToolRequestSchema,
  ListToolsRequestSchema,
} from '@modelcontextprotocol/sdk/types.js';

const server = new Server(
  { name: 'my-custom-server', version: '1.0.0' },
  { capabilities: { tools: {} } }
);

// Advertise the tools this server exposes
server.setRequestHandler(ListToolsRequestSchema, async () => ({
  tools: [{
    name: 'get_weather',
    description: 'Get current weather for a city',
    inputSchema: {
      type: 'object',
      properties: {
        city: { type: 'string', description: 'City name' }
      },
      required: ['city']
    }
  }]
}));

// Handle tool invocations
server.setRequestHandler(CallToolRequestSchema, async (request) => {
  if (request.params.name === 'get_weather') {
    const city = request.params.arguments?.city;
    // Your weather API logic here
    return {
      content: [{
        type: 'text',
        text: `Weather in ${city}: 72°F, Sunny`
      }]
    };
  }
  throw new Error(`Unknown tool: ${request.params.name}`);
});

// Connect over stdio so an MCP host can spawn this process
const transport = new StdioServerTransport();
await server.connect(transport);
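Before wiring the server into a client, you can exercise it interactively. One option is the MCP Inspector, which spawns your server command and gives you a UI to list and call its tools (assumes npx can fetch both packages):

npx @modelcontextprotocol/inspector npx tsx /path/to/server.ts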
Register in Claude Desktop
{
  "mcpServers": {
    "my-weather": {
      "command": "npx",
      "args": ["tsx", "/path/to/server.ts"]
    }
  }
}
MCP with Local AI Models
Using MCP with Ollama
While Ollama doesn't natively support MCP, you can use compatible frontends:
Open WebUI with MCP: Open WebUI consumes MCP servers through mcpo, a small proxy that exposes an MCP server as an OpenAPI tool endpoint. A sketch, assuming uvx and Docker are installed:

# Expose the filesystem MCP server as an OpenAPI endpoint on port 8000
uvx mcpo --port 8000 -- npx -y @modelcontextprotocol/server-filesystem /data

# Run Open WebUI, then add http://host.docker.internal:8000 as a tool server in its settings
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  ghcr.io/open-webui/open-webui:main
LangChain MCP Integration:
# A sketch using the langchain-mcp-adapters package with a local Ollama model
import asyncio

from langchain_mcp_adapters.client import MultiServerMCPClient
from langchain_ollama import ChatOllama

async def main():
    # Connect to MCP servers (the stdio transport spawns them locally)
    client = MultiServerMCPClient({
        "filesystem": {
            "command": "npx",
            "args": ["-y", "@modelcontextprotocol/server-filesystem", "/data"],
            "transport": "stdio",
        }
    })
    tools = await client.get_tools()

    # Bind the discovered MCP tools to a local Ollama model
    llm = ChatOllama(model="llama3.1:70b")
    llm_with_tools = llm.bind_tools(tools)

asyncio.run(main())
Real-World MCP Use Cases
1. Code Assistant with Repository Access
Configure GitHub MCP server to let AI browse repos, read code, and create PRs.
2. Database Query Assistant
SQLite or PostgreSQL MCP servers enable natural language database queries.
3. Document Analysis Pipeline
Filesystem MCP server reads documents while AI processes and summarizes.
4. DevOps Automation
Docker and Kubernetes MCP servers enable AI-driven infrastructure management.
5. Knowledge Management
Memory MCP server persists context across conversations for long-term projects.
Security Best Practices
Principle of Least Privilege
Only grant MCP servers access to the directories and resources they actually need. For example, scope the filesystem server to a specific project folder, not your entire home directory:

{
  "filesystem": {
    "command": "npx",
    "args": [
      "-y",
      "@modelcontextprotocol/server-filesystem",
      "/specific/project/folder"
    ]
  }
}
Environment Variable Security
Never commit tokens to config files. Where your MCP client supports it, reference tokens from the environment instead of hardcoding them:

{
  "github": {
    "command": "npx",
    "args": ["-y", "@modelcontextprotocol/server-github"],
    "env": {
      "GITHUB_PERSONAL_ACCESS_TOKEN": "${GITHUB_TOKEN}"
    }
  }
}
Audit Server Code
Review community MCP servers before use, especially those requiring sensitive access.
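A lightweight way to do this for npm-distributed servers: download the published tarball without executing anything and read the code before adding it to your config. A sketch (the extracted file layout varies by package):

# Download the tarball without running it, then inspect the contents
npm pack @modelcontextprotocol/server-filesystem
tar -xzf modelcontextprotocol-server-filesystem-*.tgz
less package/dist/index.js   # entry point path varies by package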
MCP vs Alternatives
| Feature | MCP | LangChain Tools | OpenAI Functions |
|---|---|---|---|
| Open Standard | Yes | No | No |
| Multi-Provider | Yes | Partial | No |
| Process Isolation | Yes | No | N/A |
| Resource Discovery | Yes | Manual | Manual |
| Community Servers | 100+ | Many | Limited |
| Local-First | Yes | Yes | No |
The Future of MCP
MCP adoption accelerated in 2025-2026:
- OpenAI announced MCP support for ChatGPT and API
- Google integrated MCP into Gemini ecosystem
- Enterprise adoption for secure AI tool integration
- Multimodal MCP for image and audio resources
- MCP Hub marketplace for discovering servers
The protocol is becoming the de facto standard for AI-tool integration.
Key Takeaways
- MCP standardizes AI-tool integration with a universal protocol
- Official servers cover common use cases (files, GitHub, databases)
- Easy to set up with the Claude Desktop config file
- Build custom servers in <100 lines of TypeScript or Python
- Works with local AI through compatible clients
- Security-first design with process isolation and approval flows
Next Steps
- Set up local AI agents using MCP tools
- Configure RAG pipelines with MCP filesystem access
- Explore DeepSeek R1 for complex reasoning with tools
- Browse AI models that work well with MCP integrations
MCP represents the future of AI integration: open, standardized, and secure. Whether you're building AI agents, automating workflows, or creating custom tools, MCP provides the foundation for connected AI experiences.