Connecting LLMs to External Tools with Anthropic’s Model Context Protocol (MCP)

This article explains the open‑source Model Context Protocol (MCP) created by Anthropic, describes its client‑server architecture for safely linking LLMs with external data sources and tools, and provides a complete step‑by‑step Python tutorial—including environment setup, server and client code—to demonstrate MCP in action.

What Is Model Context Protocol (MCP)?

Model Context Protocol (MCP) is an open‑source protocol developed by Anthropic, a company focused on building safe and explainable generative AI systems. It was created to solve a key limitation of LLM applications: their isolation from external data sources and tools.

Why MCP Matters

LLM applications need a reliable way to supply external data to the model at inference time, a problem previously tackled with Retrieval‑Augmented Generation (RAG) and fine‑tuning. MCP standardizes how LLMs connect to outside systems, acting like a USB‑C port for AI agents: one protocol that enables seamless, secure, and scalable data exchange between an agent and external resources.

Architecture Overview

MCP follows a client‑server model. The AI application (client) communicates with an MCP server that provides data or tool access. This modular approach lets developers build reusable connectors and leverage community‑driven server implementations.
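
Under the hood, MCP messages use JSON‑RPC 2.0, carried over a transport such as stdio or HTTP. As a conceptual sketch only (the SDKs used below construct and frame these messages for you, so you never write them by hand), the two requests the tutorial exercises look roughly like this:

# Conceptual sketch of MCP's JSON-RPC 2.0 messages; the SDKs handle this framing.

# Client -> server: discover which tools the server exposes
list_tools_request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

# Client -> server: invoke a tool by name with JSON arguments
call_tool_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {"name": "add", "arguments": {"a": 3, "b": 5}},
}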

Step‑by‑Step Guide

Prerequisites

Python 3

A terminal (the commands below use macOS syntax)

Create a Virtual Environment

python3 -m venv MCP_Demo
source MCP_Demo/bin/activate

After activation your prompt should show (MCP_Demo).

Install Dependencies

pip install mcp langchain-mcp-adapters langgraph langchain-openai

(The server below needs mcp; the client also uses langgraph and langchain-openai.)

Set your OpenAI API key:

export OPENAI_API_KEY=<your_api_key>

Create and Run the MCP Server

Create math_server.py with the following content:

# math_server.py
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Math")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers"""
    return a + b

@mcp.tool()
def multiply(a: int, b: int) -> int:
    """Multiply two numbers"""
    return a * b

if __name__ == "__main__":
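    # stdio transport: the server reads requests from stdin and writes
    # responses to stdout, so an MCP client can launch it as a subprocess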
    mcp.run(transport="stdio")

Run the server to confirm it starts cleanly:

python3 math_server.py

The server starts silently, waiting on stdin for a client. Note that because it uses the stdio transport, the client below launches math_server.py itself as a subprocess, so you can stop this test run (Ctrl+C) once it starts without errors.
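
Tools are not the only thing a server can expose. FastMCP also supports read‑only resources that clients fetch by URI; as an illustrative sketch (this greeting resource is not part of the tutorial), you could add to math_server.py:

# Optional: expose a resource the client can read by URI
@mcp.resource("greeting://{name}")
def get_greeting(name: str) -> str:
    """Return a personalized greeting"""
    return f"Hello, {name}!"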

Create and Run the MCP Client

Create client.py with the following content:

# client.py
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client
from langchain_mcp_adapters.tools import load_mcp_tools
from langgraph.prebuilt import create_react_agent
from langchain_openai import ChatOpenAI
import asyncio

model = ChatOpenAI(model="gpt-4o")

server_params = StdioServerParameters(
    command="python3",
    # Use the full path to math_server.py if it is not in the current directory
    args=["math_server.py"],
)

async def run_agent():
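    # stdio_client launches math_server.py as a subprocess (per server_params)
    # and connects to its stdin/stdout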
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await load_mcp_tools(session)
            agent = create_react_agent(model, tools)
            response = await agent.ainvoke({"messages": "what's (3 + 5) x 12?"})
            return response

if __name__ == "__main__":
    result = asyncio.run(run_agent())
    print(result)

Run the client:

python3 client.py

The printed result contains the full message history, including the tool calls (add and multiply) and the final answer. Simplified, it looks like:

{
  "messages": [
    {"content": "what's (3 + 5) x 12?", "id": "..."},
    {"tool_calls": [
      {"id": "...", "function": {"name": "add", "arguments": "{\"a\": 3, \"b\": 5}"}},
      {"id": "...", "function": {"name": "multiply", "arguments": "{\"a\": 8, \"b\": 12}"}}
    ]},
    {"content": "The result of ((3 + 5) times 12) is 96."}
  ]
}
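
The raw result is the agent's full message state. To print only the final answer, you can index the last message (a small convenience, assuming the LangGraph message-state shape shown above):

# Print only the model's final answer
print(result["messages"][-1].content)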

Conclusion

MCP provides a powerful, convenient way to integrate AI agents with external information and services, enriching their context and memory while preserving safety and modularity.
