
How to Connect Grafana to Large Language Models with MCP (Model Context Protocol)

This guide shows how to use the Model Context Protocol (MCP) to build a lightweight server that links Grafana dashboards to large language models, covering MCP concepts, FastMCP setup, Python client implementation, environment preparation, and integration with Cherry Studio for seamless AI-driven data access.

Alibaba Cloud Observability

What is MCP?

MCP (Model Context Protocol) is an open standard that standardizes how large language models (LLMs) access external data sources, providing a secure, standardized way for developers to expose data and functions to LLM applications.

Standardization: defines a unified communication protocol, eliminating the need for custom integration code for each service.

Security: includes encryption, authentication, and permission controls to protect data.

Flexibility: supports various data sources such as databases, files, and APIs.

Cross‑platform: works with any system or language (Python, TypeScript, Go, etc.).

How MCP Works

MCP follows a client‑server model. The client is embedded in an AI application (e.g., Cherry Studio) and sends requests to an MCP server, which connects to specific data sources or tools and returns results for the LLM to use.

The server provides three core capabilities:

Tool : expose executable functions that the LLM can call (e.g., query a database or send a message).

Resource : read‑only data such as files, database records, images, or logs.

Prompt : reusable templates that guide LLM responses.
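On the wire, each of these capabilities maps to a family of JSON-RPC 2.0 methods exchanged between client and server. As a sketch, this is roughly the request a client would send to invoke the listFolder tool defined later in this guide (the id value is arbitrary):

```python
import json

# Shape of an MCP "call a tool" request as sent over the stdio transport.
# MCP messages are JSON-RPC 2.0; the method names come from the MCP spec:
#   Tool     -> tools/list, tools/call
#   Resource -> resources/list, resources/read
#   Prompt   -> prompts/list, prompts/get
call_tool_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "listFolder",   # tool exposed by the Grafana MCP server
        "arguments": {},        # listFolder takes no arguments
    },
}

# Serialize to the JSON text that actually crosses the transport
wire_message = json.dumps(call_tool_request)
print(wire_message)
```

The server replies with a matching JSON-RPC response whose result the client hands back to the LLM.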

Building a Grafana MCP Server

Installation

curl -LsSf https://astral.sh/uv/install.sh | sh

Project Initialization

# Create project folder
uv init grafana-mcp-example
cd grafana-mcp-example

# Install dependencies
uv add "mcp[cli]" requests

# Create server file
touch server.py

FastMCP Server Definition

from mcp.server.fastmcp import FastMCP

MCP_SERVER_NAME = "grafana-mcp-server"
mcp = FastMCP(MCP_SERVER_NAME)

if __name__ == "__main__":
    mcp.run(transport='stdio')

Adding Tools

from typing import Any

import grafana  # local module containing the GrafanaClient defined below

@mcp.tool()
def listFolder() -> list[Any] | str:
    """List all Grafana folder names."""
    return grafana.GrafanaClient().listFolder()

@mcp.tool()
def listDashboard(folderName: str) -> list[Any] | str:
    """List all dashboard names in a folder."""
    return grafana.GrafanaClient().listDashboard(folderName)

Grafana Client (Python)

import os

import requests

class GrafanaClient:
    grafanaURL = os.getenv("GRAFANA_URL")
    grafanaApiKey = os.getenv("GRAFANA_API_KEY")

    def grafanaHeader(self):
        return {"Authorization": f"Bearer {self.grafanaApiKey}",
                "Content-Type": "application/json"}

    def listFolder(self):
        resp = requests.get(f"{self.grafanaURL}/api/search?type=dash-folder",
                            headers=self.grafanaHeader())
        if resp.status_code != 200:
            return []
        return [{"uid": i["uid"], "name": i["title"]} for i in resp.json()]

    def listDashboard(self, folderName: str):
        # Look up the folder UID by name
        resp = requests.get(f"{self.grafanaURL}/api/search?type=dash-folder&query={folderName}",
                            headers=self.grafanaHeader())
        if resp.status_code != 200 or not resp.json():
            return []  # folder not found or request failed
        folderUid = resp.json()[0]["uid"]
        # List dashboards inside that folder
        resp = requests.get(f"{self.grafanaURL}/api/search?type=dash-db&folderUIDs={folderUid}",
                            headers=self.grafanaHeader())
        if resp.status_code != 200:
            return []
        return [{"uid": i["uid"], "name": i["title"]} for i in resp.json()]
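The mapping from Grafana's /api/search response to the compact {uid, name} shape returned to the LLM can be factored out and exercised without a live Grafana instance. A minimal sketch, where the sample payload is illustrative rather than real Grafana output:

```python
def to_uid_name_list(search_results):
    """Convert Grafana /api/search items to the {uid, name} shape handed to the LLM."""
    return [{"uid": item["uid"], "name": item["title"]} for item in search_results]

# Illustrative payload mimicking /api/search?type=dash-folder output
sample = [
    {"uid": "f-ops", "title": "Ops", "type": "dash-folder"},
    {"uid": "f-biz", "title": "Business", "type": "dash-folder"},
]

print(to_uid_name_list(sample))
# [{'uid': 'f-ops', 'name': 'Ops'}, {'uid': 'f-biz', 'name': 'Business'}]
```

Keeping this transformation separate from the HTTP calls makes the tool output easy to unit-test.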

Environment Preparation

Use Cherry Studio (v1.1.2) with the Alibaba Cloud Bailian LLM qwen‑plus, which supports MCP. Prepare a Grafana API key (Grafana 9.x) or a service account token (Grafana 10.x) and set the environment variables GRAFANA_URL and GRAFANA_API_KEY.
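Assuming a typical shell setup, the two variables can be exported in the environment that launches the MCP server. The values below are placeholders for your own instance URL and token:

```shell
# Placeholders: point these at your Grafana instance and token
export GRAFANA_URL="https://grafana.example.com"
export GRAFANA_API_KEY="glsa_xxxxxxxxxxxx"
```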

Integrating MCP Server in Cherry Studio

Add the LLM qwen‑plus, then configure a new MCP server pointing to the directory of the grafana-mcp-example project and run it with:

uv run --directory /Users/{user}/grafana-mcp-example server.py
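In Cherry Studio's MCP settings, that command typically translates to a stdio server entry of roughly the following shape. The field names are an assumption based on common MCP client configurations, not Cherry Studio's documented schema, and the env values are placeholders:

```json
{
  "mcpServers": {
    "grafana-mcp-server": {
      "command": "uv",
      "args": ["run", "--directory", "/Users/{user}/grafana-mcp-example", "server.py"],
      "env": {
        "GRAFANA_URL": "https://grafana.example.com",
        "GRAFANA_API_KEY": "glsa_xxxxxxxxxxxx"
      }
    }
  }
}
```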

After configuration, queries such as “Grafana folder list” or “dashboards in folder X” return real‑time data from Grafana, demonstrating successful integration.

MCP workflow diagram

Conclusion

By building a simple Grafana MCP server, the guide illustrates how MCP can extend LLM capabilities, enabling AI to retrieve live service data with minimal effort. The same approach can be applied to many other services, turning AI into a universal “super connector” for everyday tasks.

Republication Notice
Republication Notice

This article has been distilled and summarized from source material, then republished for learning and reference. If you believe it infringes your rights, please contact admin@besthub.dev and we will review it promptly.

Tags: cloud native, Python, MCP, AI integration, Grafana, FastMCP
Written by Alibaba Cloud Observability, driving continuous progress in observability technology.