Model Context Protocol (MCP) — Complete Guide for Backend Engineers

 


Build Tools, Resources, and AI-Driven Services Using LangChain

Modern LLM-based applications are no longer just about generating text — they need to interact with real systems:

✅ Databases
✅ File systems
✅ Internal microservices
✅ Web APIs
✅ Analytics engines
✅ Cloud services

To support this, Anthropic introduced MCP (Model Context Protocol), an open standard that lets LLMs communicate with tools through a safe, structured API.

This guide gives you:

✅ Clear concepts
✅ Interview-focused explanations
✅ Step-by-step MCP server creation
✅ Examples using LangChain
✅ Text-based architecture diagrams



What Is MCP?

MCP (Model Context Protocol) is a unified protocol that allows AI models to access tools, resources, and files in a structured manner.

Think of it as an API gateway for LLMs.

Instead of relying only on prompts, LLMs can call tools like:

get_weather, search_files, query_database, run_sql, get_customer_orders

MCP provides:

✅ A standard interface
✅ Strong typing
✅ Clear request/response format
✅ Security boundaries
✅ Cross-language interoperability
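Under the hood, every tool call is a JSON-RPC 2.0 message. As a rough, simplified illustration (field names follow the spec's tools/call method, values are made up), a request and response for get_weather could look like this:

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get_weather",
    "arguments": { "city": "Bangalore" }
  }
}

{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "content": [
      { "type": "text", "text": "Bangalore: Sunny, +29°C" }
    ]
  }
}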


📐 High-Level Architecture (Text-Based Diagram)

            ┌────────────────────────┐
            │      LLM / Agent       │
            │   (GPT-4, LangChain,   │
            │    Anthropic, Groq)    │
            └────────────▲───────────┘
                         │ Structured Tool Calls (JSON-RPC)
                         │
            ┌────────────┴───────────┐
            │       MCP Server       │
            │   Tools / Resources    │
            │   Transport: stdio/ws  │
            └────────────▲───────────┘
                         │
        ┌────────────────┼────────────────┐
        │                │                │
┌───────┴──────┐ ┌───────┴──────┐ ┌───────┴──────┐
│     APIs     │ │  Databases   │ │  Filesystem  │
│ REST/GraphQL │ │  SQL/NoSQL   │ │  Logs/Docs   │
└──────────────┘ └──────────────┘ └──────────────┘

🔌 MCP Transport Protocols

MCP defines how an AI agent connects to your server:

✅ 1. stdio (local execution)

  • Uses stdin/stdout for message passing

  • Zero network overhead

  • Ideal for CLI tools, dev workflows

✅ 2. WebSocket (remote execution)

  • Offered by some MCP server frameworks for cloud microservices

  • Works with Kubernetes, ECS, GKE, etc.

  • Supports multiple LLM clients

✅ 3. Streamable HTTP / SSE (remote execution)

  • The spec's standard remote transport

  • Easy to front with Nginx/Envoy/API-gateway adapters

A minimal client-side sketch for the stdio case is shown right after this list.
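As a rough sketch (not production code), here is how a client can spawn and talk to a stdio MCP server using the official mcp Python SDK. The server script name server.py and the tool name get_weather are assumptions for illustration:

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Spawn the server as a subprocess and speak MCP over stdin/stdout
server_params = StdioServerParameters(command="python", args=["server.py"])

async def main():
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()                  # MCP handshake
            tools = await session.list_tools()          # discover available tools
            print([t.name for t in tools.tools])

            # Call a tool with structured arguments
            result = await session.call_tool("get_weather", {"city": "Bangalore"})
            print(result.content)

asyncio.run(main())

For WebSocket or HTTP transports, only the connection setup changes; the session API stays the same.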


🛠️ Building a Simple MCP Server (LangChain)

Below is a minimal MCP server using LangChain + FastAPI.


✅ Install dependencies

pip install langchain langchain-core fastapi uvicorn mcp-server-fastapi

✅ Step 1: Create Tools

from langchain.tools import tool
import requests, os

@tool
def get_weather(city: str) -> str:
    """Return temperature and weather for a given city."""
    return requests.get(f"https://wttr.in/{city}?format=3").text

@tool
def list_files(folder: str) -> list:
    """List files in a directory."""
    return os.listdir(folder)

✅ Step 2: Create the MCP Server

from mcp_server_fastapi import MCPServer
from fastapi import FastAPI

app = FastAPI()
server = MCPServer(app, title="Utility MCP Server")

server.add_tool(get_weather)
server.add_tool(list_files)

✅ Step 3: Run the MCP Server

uvicorn main:app --host 0.0.0.0 --port 8000

MCP endpoint available at:

ws://localhost:8000/mcp
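If you prefer not to depend on a FastAPI wrapper, a roughly equivalent server can be written with the official mcp Python SDK's FastMCP helper. This is a minimal sketch reusing the tool bodies from Step 1:

from mcp.server.fastmcp import FastMCP
import requests, os

mcp = FastMCP("Utility MCP Server")

@mcp.tool()
def get_weather(city: str) -> str:
    """Return temperature and weather for a given city."""
    return requests.get(f"https://wttr.in/{city}?format=3").text

@mcp.tool()
def list_files(folder: str) -> list:
    """List files in a directory."""
    return os.listdir(folder)

if __name__ == "__main__":
    mcp.run()  # stdio transport by default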

🧰 Exposing Resources

You can expose static or dynamic resources:

from mcp_server_fastapi import resource

@resource("config/app")
def config_resource():
    return {"version": "1.0.0", "env": "production"}

📂 Exposing File-System Resources (Read-Only)

server.mount_folder("/logs", "/var/log/myapp/")
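On the client side, resources are discovered and read by URI. Assuming the official mcp SDK's ClientSession from the stdio sketch above, and with illustrative URIs (the exact URIs depend on how the server registers its resources), reading them might look like this:

# Inside an active ClientSession (see the stdio client sketch above)
resources = await session.list_resources()                          # discover exposed resources
config = await session.read_resource("config://app")                # dynamic resource
logs = await session.read_resource("file:///var/log/myapp/app.log") # mounted file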

🤖 How Agents Call MCP Tools (LangChain)

from mcp_client import MCPClient
from langchain.agents import create_openai_tools_agent, AgentExecutor
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_openai import ChatOpenAI

# Discover the tools exposed by the MCP server
client = MCPClient("ws://localhost:8000/mcp")
tools = client.get_tools()

llm = ChatOpenAI(model="gpt-4.1")

# create_openai_tools_agent requires a prompt with an agent_scratchpad placeholder
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant."),
    ("human", "{input}"),
    MessagesPlaceholder("agent_scratchpad"),
])

agent = create_openai_tools_agent(llm, tools, prompt)
executor = AgentExecutor(agent=agent, tools=tools)

result = executor.invoke({"input": "What is the weather in Bangalore?"})
print(result["output"])

🎯 Tool Invocation Flow 

User Query → Agent → Selects Tool → MCP Tool Executes → Returns Structured JSON → Agent Summarizes Result

Detailed:

┌───────────────────────────┐
│  User Input: "Weather?"   │
└───────────────┬───────────┘
                │ Reasoning by Agent
                │
    ┌───────────▼───────────┐
    │    Tool Call Chosen   │
    │   get_weather("BLR")  │
    └───────────┬───────────┘
                │ JSON-RPC
                ▼
    ┌──────────────────────┐
    │      MCP Server      │
    │  Executes API calls  │
    └───────────┬──────────┘
                │ JSON Result
                ▼
   ┌──────────────────────────┐
   │ Agent Summarizes Output  │
   └──────────────────────────┘

💼 Where Backend Engineers Use MCP

✅ Integrating LLMs with microservices
✅ Allowing safe access to production data
✅ Creating API-driven agents
✅ Building internal developer tooling
✅ Simplifying multi-agent systems
✅ Enabling plug-and-play AI behavior


🎤 Interview-Ready Explanation

Q: What problem does MCP solve?
✅ Standardizes how AI models interact with external tools
✅ Makes tool usage safe, typed, predictable
✅ Enables multi-tool, multi-resource workflows

Q: How does an agent know which tool to call?
The LLM sees tool schemas + natural language description →
Uses reasoning + training → selects correct tool.
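Concretely, the agent never sees your Python code; it only sees a schema advertised by the server (via tools/list). A simplified illustration for get_weather:

{
  "name": "get_weather",
  "description": "Return temperature and weather for a given city.",
  "inputSchema": {
    "type": "object",
    "properties": {
      "city": { "type": "string" }
    },
    "required": ["city"]
  }
}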

Q: What’s the difference between stdio and websocket?

  • stdio: local execution; the client spawns the server as a subprocess and talks over stdin/stdout

  • websocket: remote execution; the server runs as a network service (cloud, Kubernetes) that multiple clients can reach

Q: What can MCP expose?
✅ tools
✅ resources
✅ file systems


📦 Full Project Structure 

mcp-weather-server/
│
├── main.py              # Main MCP server entry
├── tools/
│   ├── weather.py       # Weather tool
│   └── filesystem.py    # List-files tool
│
├── resources/
│   └── config.py        # Sample resource
│
├── requirements.txt
└── README.md

📊 Summary Table

Feature             | Description
Tools               | Functions the agent can execute
Resources           | Static/dynamic information exposed to the LLM
File System         | Safe, restricted directory access
Protocols           | stdio, Streamable HTTP/SSE, WebSocket (framework-specific)
Language Support    | Python, TypeScript/JS, Java, Kotlin, C#, Go (official SDKs)
Architecture Style  | JSON-RPC 2.0

✅ Final Thoughts

MCP is quickly becoming the standard protocol for LLM-to-system integration.
For backend engineers, knowing MCP gives you a huge advantage in:

✅ AI system design
✅ Multi-agent architectures
✅ Tooling integration
✅ LLM-powered microservices
