How to Enable LLMs to Call MySQL via MCP: A Step‑by‑Step Guide

This tutorial shows how to let large language models autonomously invoke external services—specifically a MySQL database—by using the Model Context Protocol (MCP), covering environment setup, MCP service implementation, agent integration, and real‑world execution results.

Instant Consumer Technology Team

1. Environment Preparation

Install Python 3.10+ and the required packages:

pip install pymysql fastmcp langchain langchain-openai langgraph langchain-mcp-adapters

2. Implement the MySQL MCP Service (mysql_mcp.py)

import pymysql
from fastmcp import FastMCP
import json

app = FastMCP("MySQL MCP")

DB_CONFIG = {
    "host": "127.0.0.1",
    "user": "root",
    "port": 3308,
    "password": "123456",
    "database": "test_db",
    "charset": "utf8mb4",
    "cursorclass": pymysql.cursors.DictCursor
}

def run_query(sql: str):
    """Execute SQL and return result"""
    conn = pymysql.connect(**DB_CONFIG)
    try:
        with conn.cursor() as cursor:
            cursor.execute(sql)
            if sql.strip().lower().startswith("select"):
                rows = cursor.fetchall()
                return {"rows": json.loads(json.dumps(rows, default=str))}
            else:
                conn.commit()
                return {"status": "success", "rows_affected": cursor.rowcount}
    finally:
        conn.close()

@app.tool()
def query_mysql(sql: str) -> dict:
    """Execute a MySQL query (SELECT/INSERT/UPDATE/DELETE)"""
    try:
        return run_query(sql)
    except Exception as e:
        return {"error": str(e)}

if __name__ == "__main__":
    app.run(transport="stdio")

Ensure a local MySQL instance is running with a test_db database and a users table (the original post shows the table schema in a screenshot).
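Since the schema screenshot is not reproduced here, the sketch below uses an assumed minimal users table (the column names, types, and sample rows are guesses, not taken from the original image) that you can create with pymysql before running the demo:

```python
# Assumed schema for the users table -- adjust names and types to match
# whatever your actual test_db contains.
USERS_DDL = """
CREATE TABLE IF NOT EXISTS users (
    id   INT AUTO_INCREMENT PRIMARY KEY,
    name VARCHAR(64) NOT NULL,
    age  INT
)
"""

# Illustrative seed rows so SELECT queries have something to return.
SEED_ROWS = [
    ("alice", 30),
    ("bob", 22),
]

def init_users(conn) -> None:
    """Create the users table and insert the sample rows.

    Expects an open pymysql connection (same DB_CONFIG as mysql_mcp.py).
    """
    with conn.cursor() as cursor:
        cursor.execute(USERS_DDL)
        cursor.executemany(
            "INSERT INTO users (name, age) VALUES (%s, %s)", SEED_ROWS
        )
    conn.commit()
```

Call init_users(pymysql.connect(**DB_CONFIG)) once to seed the database.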

3. LLM Calls the MySQL MCP Service (test_mcp.py)

import asyncio
from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent
from langchain_openai import ChatOpenAI
import os

BASE_URL = "https://api.openai.com/v1/"
API_KEY = "sk-xxx-your-key"
MODEL_NAME = "gpt-4o"

async def main():
    try:
        client = MultiServerMCPClient({
            "MySQL-MCP": {
                "command": "python",
                "args": [os.path.abspath("mysql_mcp.py")],
                "transport": "stdio"
            }
        })
        tools = await client.get_tools()
        if not tools:
            raise ValueError("No tools retrieved")
        llm = ChatOpenAI(base_url=BASE_URL, api_key=API_KEY, model=MODEL_NAME, timeout=60.0, max_retries=2)
        agent = create_react_agent(llm, tools)
        while True:
            user_input = input("\nEnter your request (or type exit to quit):\n> ")
            if user_input.strip().lower() == "exit":
                break
            async for chunk in agent.astream({"messages": [{"role": "user", "content": user_input}]}):
                print(chunk)
    except Exception as e:
        print(f"Initialization failed: {e}")

if __name__ == "__main__":
    asyncio.run(main())

4. Code Walk‑through

MultiServerMCPClient initialization: Configures the MySQL MCP service, specifying the start command (e.g., python mysql_mcp.py) and the transport method (stdio). Multiple MCP services can be registered at once.
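For instance, a second service can sit alongside the MySQL one in the same configuration dict; the "Weather-MCP" entry below is a purely hypothetical placeholder to illustrate the shape:

```python
import os

# Two-service configuration sketch. "Weather-MCP" and weather_mcp.py are
# illustrative placeholders, not part of this tutorial's actual code.
MCP_SERVERS = {
    "MySQL-MCP": {
        "command": "python",
        "args": [os.path.abspath("mysql_mcp.py")],
        "transport": "stdio",
    },
    "Weather-MCP": {
        "command": "python",
        "args": [os.path.abspath("weather_mcp.py")],
        "transport": "stdio",
    },
}

# client = MultiServerMCPClient(MCP_SERVERS)
# get_tools() would then return the merged tool list from both services.
```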

get_tools: Retrieves the tool list from the MCP services, allowing the LLM to know which functions it can invoke.

ChatOpenAI setup: Wraps the LLM call with base URL, API key, model name, timeout, and retry settings.

ReAct agent creation: Uses create_react_agent so the LLM follows a Reason + Act cycle: decide what to do, call the appropriate MCP tool, then return the result.

5. Demonstration Results

Simple arithmetic query ("89+2"): the model answers without invoking MySQL.

Querying the users table triggers the MySQL MCP service.

Finding the youngest user also calls the service.

Creating a new book table and inserting data demonstrates write operations.

Generated book table data (shown as a screenshot in the original post).

Note: This MySQL MCP service is for learning only; production use must enforce SQL execution limits to avoid irreversible damage.
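One minimal safeguard is to reject anything other than a single SELECT statement before the SQL reaches run_query. The guard below is a sketch, not a complete defense (it errs on the side of rejection and is no substitute for a least-privilege database account):

```python
def is_safe_select(sql: str) -> bool:
    """Allow only a single SELECT statement.

    A minimal string-level guard, not a real SQL parser: it blocks stacked
    statements and non-SELECT verbs, but may also reject legitimate queries
    that contain a semicolon inside a string literal.
    """
    stripped = sql.strip().rstrip(";").strip()
    # Block stacked statements such as "SELECT 1; DROP TABLE users".
    if ";" in stripped:
        return False
    return stripped.lower().startswith("select")
```

Inside query_mysql, checking is_safe_select(sql) before calling run_query would turn the tool read-only.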

6. Summary

By combining MCP with a LangGraph ReAct agent, an LLM can not only answer questions but also execute real actions such as SQL queries. The core workflow is:

MCP provides tool capabilities (e.g., execute SQL).

The LLM selects the appropriate tool via the Reason + Act cycle.

The result is returned directly to the user.

With MCP, large models can "do" instead of merely "tell".

7. Knowledge Extension

The MultiServerMCPClient class supports multiple transport modes; besides stdio, streamable_http is recommended, while the older SSE mode is deprecated.
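As a sketch of the HTTP variant (the host, port, and /mcp path are assumptions based on common FastMCP defaults, so check the docs for your fastmcp version): the server would start with app.run(transport="streamable-http", ...) instead of stdio, and the client entry points at a URL rather than spawning a subprocess:

```python
# Server side (in mysql_mcp.py) would end with something like:
#     app.run(transport="streamable-http", host="127.0.0.1", port=8000)
#
# Client side replaces the stdio entry with a URL-based one. Note the
# naming mismatch: hyphen on the server, underscore on the client.
MCP_SERVERS = {
    "MySQL-MCP": {
        "url": "http://127.0.0.1:8000/mcp",
        "transport": "streamable_http",
    }
}
```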
