AI Infrastructure

MCP Servers Explained: The Protocol Powering AI Agents

February 4, 2026
18 min read
Local AI Master Research Team
๐ŸŽ 4 PDFs included
Newsletter

Before we dive deeper...

Get your free AI Starter Kit

Join 12,000+ developers. Instant download: Career Roadmap + Fundamentals Cheat Sheets.

No spam, everUnsubscribe anytime
12,000+ downloads

MCP Quick Reference

Core Concepts:

Servers
Expose tools and data
Clients
AI apps that connect
Resources
Data AI can read
Tools
Actions AI can take

Key Insight: MCP is like USB for AI: a universal standard that lets any AI connect to any tool without custom integration code.

What is Model Context Protocol (MCP)?

Model Context Protocol (MCP) is an open standard developed by Anthropic that standardizes how AI models interact with external systems. Released in late 2024 and rapidly adopted through 2025-2026, MCP gives AI a universal interface for accessing files, databases, APIs, and tools.

Before MCP, every AI integration required custom code. Want Claude to read your files? Write custom code. Want it to query a database? More custom code. Want GPT to do the same things? Rewrite everything.

MCP changes this by providing a standardized protocol that any AI can speak. One integration works everywhere.
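Under the hood, every MCP message is JSON-RPC 2.0. A sketch of what a tool invocation looks like on the wire (the get_weather tool and the response payload are illustrative, not from a real server):

```python
import json

# A JSON-RPC 2.0 request asking an MCP server to invoke a tool.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_weather",
        "arguments": {"city": "Berlin"},
    },
}

# Serialized form, as it would travel over stdio or HTTP.
wire = json.dumps(request)

# A typical success response: matching id, result with a content array.
response = json.loads(
    '{"jsonrpc": "2.0", "id": 1,'
    ' "result": {"content": [{"type": "text", "text": "18°C, cloudy"}]}}'
)
print(response["result"]["content"][0]["text"])
```

Because both sides speak this one envelope format, any client that understands JSON-RPC can talk to any server.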

Why MCP Matters

| Before MCP | With MCP |
|---|---|
| Custom code per AI provider | One integration, all providers |
| Security handled ad hoc | Built-in security primitives |
| No resource discovery | AI discovers available tools |
| Inconsistent interfaces | Standardized JSON-RPC protocol |
| Vendor lock-in | Open standard, MIT licensed |

MCP Architecture

MCP follows a client-server architecture:

Components

  1. MCP Hosts (Clients)

    • AI applications like Claude Desktop, Open WebUI
    • Connect to multiple MCP servers simultaneously
    • Manage user approvals and security
  2. MCP Servers

    • Expose specific capabilities (files, databases, APIs)
    • Run as separate processes for isolation
    • Can be local or remote
  3. Transport Layer

    • stdio: Local servers via standard input/output
    • SSE: Remote servers via HTTP Server-Sent Events
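For stdio, messages are newline-delimited JSON: one JSON-RPC message per line on the server's stdin/stdout. A minimal framing sketch (the initialize params are simplified; the real handshake also carries protocolVersion and capabilities):

```python
import json

def encode_message(msg: dict) -> bytes:
    # stdio transport: one JSON-RPC message per line, newline-delimited.
    return (json.dumps(msg) + "\n").encode("utf-8")

def decode_stream(data: bytes) -> list[dict]:
    # Split a raw stdout buffer back into individual messages.
    return [json.loads(line) for line in data.decode("utf-8").splitlines() if line.strip()]

init = {"jsonrpc": "2.0", "id": 0, "method": "initialize", "params": {}}
listing = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}
buffer = encode_message(init) + encode_message(listing)
messages = decode_stream(buffer)
print([m["method"] for m in messages])
```

The SSE transport carries the same JSON-RPC payloads, just framed as HTTP Server-Sent Events instead of lines on a pipe.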

Protocol Flow

User Request → AI Model → MCP Client → MCP Server → External System
                   ↑                          ↓
                   └───────── Response ───────┘

Top MCP Servers (2026)

Official Anthropic Servers

| Server | Purpose | Installation |
|---|---|---|
| filesystem | Read/write local files | npx -y @modelcontextprotocol/server-filesystem |
| github | Repos, issues, PRs | npx -y @modelcontextprotocol/server-github |
| postgres | PostgreSQL queries | npx -y @modelcontextprotocol/server-postgres |
| sqlite | SQLite database | uvx mcp-server-sqlite |
| memory | Persistent knowledge | npx -y @modelcontextprotocol/server-memory |
| brave-search | Web search | npx -y @modelcontextprotocol/server-brave-search |
| fetch | HTTP requests | uvx mcp-server-fetch |

Community Servers

| Server | Purpose | Repository |
|---|---|---|
| slack | Slack messaging | modelcontextprotocol/servers |
| google-drive | Google Drive files | modelcontextprotocol/servers |
| notion | Notion databases | community |
| linear | Linear issues | community |
| aws | AWS services | community |
| docker | Container management | community |
| kubernetes | K8s cluster ops | community |

Setting Up MCP with Claude Desktop

Step 1: Locate Config File

macOS:

~/Library/Application Support/Claude/claude_desktop_config.json

Windows:

%APPDATA%\Claude\claude_desktop_config.json

Step 2: Add MCP Servers

Create or edit the config file. Note that the sqlite server is a Python package run via uvx (from the uv toolchain); the others are npm packages run via npx:

{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/Users/yourname/Documents",
        "/Users/yourname/Projects"
      ]
    },
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": {
        "GITHUB_PERSONAL_ACCESS_TOKEN": "ghp_your_token_here"
      }
    },
    "sqlite": {
      "command": "uvx",
      "args": [
        "mcp-server-sqlite",
        "--db-path",
        "/path/to/your/database.db"
      ]
    }
  }
}
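Once the file grows to several servers, a quick sanity check before restarting Claude Desktop can save a debugging round-trip. A hypothetical stdlib-only checker (not an official tool; the rules it enforces are just the obvious structural ones):

```python
import json

def check_config(text: str) -> list[str]:
    """Return a list of problems found in a claude_desktop_config.json payload."""
    problems = []
    config = json.loads(text)  # raises ValueError on invalid JSON
    servers = config.get("mcpServers")
    if not isinstance(servers, dict):
        return ["missing top-level 'mcpServers' object"]
    for name, spec in servers.items():
        if "command" not in spec:
            problems.append(f"server '{name}' has no 'command'")
        if not isinstance(spec.get("args", []), list):
            problems.append(f"server '{name}': 'args' must be a list")
    return problems

sample = '{"mcpServers": {"filesystem": {"command": "npx", "args": ["-y"]},' \
         ' "broken": {"args": "oops"}}}'
print(check_config(sample))
```

A malformed entry here is by far the most common reason a server silently fails to appear in the Claude Desktop tool list.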

Step 3: Restart Claude Desktop

After saving the config, restart Claude Desktop. You'll see available tools in the interface.

Step 4: Test the Connection

Ask Claude:

  • "What files are in my Documents folder?"
  • "Show me my recent GitHub issues"
  • "Query the users table in my database"

Building a Custom MCP Server

Here's a simple MCP server in TypeScript:

Install the SDK

npm init -y
npm install @modelcontextprotocol/sdk

Create the Server

// server.ts
import { Server } from '@modelcontextprotocol/sdk/server/index.js';
import { StdioServerTransport } from '@modelcontextprotocol/sdk/server/stdio.js';
import {
  ListToolsRequestSchema,
  CallToolRequestSchema,
} from '@modelcontextprotocol/sdk/types.js';

const server = new Server(
  { name: 'my-custom-server', version: '1.0.0' },
  { capabilities: { tools: {} } }
);

// Advertise the tools this server exposes
server.setRequestHandler(ListToolsRequestSchema, async () => ({
  tools: [{
    name: 'get_weather',
    description: 'Get current weather for a city',
    inputSchema: {
      type: 'object',
      properties: {
        city: { type: 'string', description: 'City name' }
      },
      required: ['city']
    }
  }]
}));

// Handle tool invocations
server.setRequestHandler(CallToolRequestSchema, async (request) => {
  if (request.params.name === 'get_weather') {
    const city = request.params.arguments?.city;
    // Your weather API logic here
    return {
      content: [{
        type: 'text',
        text: `Weather in ${city}: 72°F, Sunny`
      }]
    };
  }
  throw new Error(`Unknown tool: ${request.params.name}`);
});

// Start the server over stdio
const transport = new StdioServerTransport();
await server.connect(transport);

Register in Claude Desktop

{
  "mcpServers": {
    "my-weather": {
      "command": "npx",
      "args": ["tsx", "/path/to/server.ts"]
    }
  }
}

MCP with Local AI Models

Using MCP with Ollama

While Ollama doesn't natively support MCP, you can use compatible frontends:

Open WebUI with MCP (via mcpo):

Open WebUI doesn't launch MCP servers directly; the usual route is mcpo, a small proxy that exposes an MCP server as an OpenAPI endpoint Open WebUI can use as a tool server:

# Expose a filesystem MCP server as an OpenAPI endpoint on the host
uvx mcpo --port 8000 -- npx -y @modelcontextprotocol/server-filesystem /data

# Run Open WebUI; inside the container, reach the proxy at host.docker.internal:8000
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  ghcr.io/open-webui/open-webui:main

Then add http://host.docker.internal:8000 as a tool server in Open WebUI's settings.

LangChain MCP Integration (using the langchain-mcp-adapters package):

import asyncio

from langchain_mcp_adapters.client import MultiServerMCPClient
from langchain_ollama import ChatOllama

async def main():
    # Connect to MCP servers over stdio
    client = MultiServerMCPClient({
        "filesystem": {
            "command": "npx",
            "args": ["-y", "@modelcontextprotocol/server-filesystem", "/data"],
            "transport": "stdio",
        }
    })
    tools = await client.get_tools()

    # Bind the MCP tools to a local Ollama model
    llm = ChatOllama(model="llama3.1:70b").bind_tools(tools)

asyncio.run(main())

Real-World MCP Use Cases

1. Code Assistant with Repository Access

Configure GitHub MCP server to let AI browse repos, read code, and create PRs.

2. Database Query Assistant

SQLite or PostgreSQL MCP servers enable natural language database queries.

3. Document Analysis Pipeline

Filesystem MCP server reads documents while AI processes and summarizes.

4. DevOps Automation

Docker and Kubernetes MCP servers enable AI-driven infrastructure management.

5. Knowledge Management

Memory MCP server persists context across conversations for long-term projects.

Security Best Practices

Principle of Least Privilege

Only grant MCP servers access to necessary directories and resources.

{
  "filesystem": {
    "command": "npx",
    "args": [
      "-y",
      "@modelcontextprotocol/server-filesystem",
      "/specific/project/folder"
    ]
  }
}

Grant a specific project folder, never your entire home directory.
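Internally, a filesystem server enforces this by resolving every requested path and checking it against its allowed roots. A rough sketch of that containment check (not the actual server's code; paths are illustrative):

```python
from pathlib import Path

def is_allowed(requested: str, roots: list[str]) -> bool:
    # Resolve symlinks and '..' before comparing, so a path like
    # '/project/../etc/passwd' cannot escape the sandbox.
    target = Path(requested).resolve()
    return any(target.is_relative_to(Path(root).resolve()) for root in roots)

roots = ["/specific/project/folder"]
print(is_allowed("/specific/project/folder/notes.md", roots))          # inside a root
print(is_allowed("/specific/project/folder/../../etc/passwd", roots))  # escapes via ..
```

The key detail is resolving before comparing; a naive string-prefix check would wave the `..` traversal straight through.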

Environment Variable Security

Never commit tokens to config files. Reference environment variables instead (note that not every client expands ${...} placeholders itself):

{
  "github": {
    "command": "npx",
    "args": ["-y", "@modelcontextprotocol/server-github"],
    "env": {
      "GITHUB_PERSONAL_ACCESS_TOKEN": "${GITHUB_TOKEN}"
    }
  }
}
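Since placeholder expansion varies by client, a common pattern is a small wrapper that substitutes environment variables into the config before it is written. A stdlib sketch (the GITHUB_TOKEN value is a dummy set only for illustration):

```python
import os
from string import Template

def expand_env(raw: str) -> str:
    # Substitute ${VAR} placeholders from the environment. substitute()
    # raises KeyError if a variable is unset, which is safer than silently
    # writing an empty token into the config.
    return Template(raw).substitute(os.environ)

os.environ["GITHUB_TOKEN"] = "ghp_example_only"  # dummy value for the demo
raw = '{"env": {"GITHUB_TOKEN": "${GITHUB_TOKEN}"}}'
print(expand_env(raw))
```

Run such a wrapper at deploy time so the file on disk holds the token only on the machine that needs it, never in version control.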

Audit Server Code

Review community MCP servers before use, especially those requiring sensitive access.

MCP vs Alternatives

| Feature | MCP | LangChain Tools | OpenAI Functions |
|---|---|---|---|
| Open Standard | Yes | No | No |
| Multi-Provider | Yes | Partial | No |
| Process Isolation | Yes | No | N/A |
| Resource Discovery | Yes | Manual | Manual |
| Community Servers | 100+ | Many | Limited |
| Local-First | Yes | Yes | No |

The Future of MCP

MCP adoption accelerated in 2025-2026:

  • OpenAI announced MCP support for ChatGPT and API
  • Google integrated MCP into Gemini ecosystem
  • Enterprise adoption for secure AI tool integration
  • Multimodal MCP for image and audio resources
  • MCP Hub marketplace for discovering servers

The protocol is becoming the de facto standard for AI-tool integration.

Key Takeaways

  1. MCP standardizes AI-tool integration with a universal protocol
  2. Official servers cover common use cases (files, GitHub, databases)
  3. Easy to set up via the Claude Desktop config file
  4. Build custom servers in <100 lines of TypeScript or Python
  5. Works with local AI through compatible clients
  6. Security-first design with process isolation and approval flows

Next Steps

  1. Set up local AI agents using MCP tools
  2. Configure RAG pipelines with MCP filesystem access
  3. Explore DeepSeek R1 for complex reasoning with tools
  4. Browse AI models that work well with MCP integrations

MCP represents the future of AI integration: open, standardized, and secure. Whether you're building AI agents, automating workflows, or creating custom tools, MCP provides the foundation for connected AI experiences.

Written by Pattanaik Ramswarup

AI Engineer & Dataset Architect | Creator of the 77,000 Training Dataset
