How to Build AI Agents in PHP with the Model Context Protocol (MCP)

Learn how to connect PHP-based AI agents to the Model Context Protocol (MCP) using the open‑source Neuron AI framework, covering MCP fundamentals, server setup, tool integration, and example code for creating custom agents that can invoke external APIs, databases, and web content.

Open Source Tech Hub

If you want to build AI agents in PHP, the Model Context Protocol (MCP) provides a standardized way to expose tools and services to large language models (LLMs). MCP acts as a thin layer between an LLM and the external functions it can call, turning the LLM from a pure text generator into an orchestrator of real‑world actions.

Why MCP Matters

Before MCP, developers had to write custom glue code for every external API—email, web search, databases, cloud storage, etc.—making multi‑tool agents cumbersome and error‑prone. Each service has its own API, authentication, and data format, so integrating many tools quickly becomes a maintenance nightmare.

MCP solves this by defining a universal protocol that lets an LLM request a function, pass arguments, and receive structured results, similar to how REST APIs standardize web services.
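Under the hood, MCP messages are JSON-RPC 2.0. As a rough sketch (the tool name and arguments below are illustrative, not from a real server), here is what a tool-invocation request looks like when built in PHP:

```php
// JSON-RPC 2.0 envelope for an MCP "tools/call" request.
// The tool name and arguments are illustrative placeholders.
$request = [
    'jsonrpc' => '2.0',
    'id'      => 1,
    'method'  => 'tools/call',
    'params'  => [
        'name'      => 'get_article_content',
        'arguments' => ['article_id' => 42],
    ],
];

// MCP's stdio transport sends one JSON object per line.
$line = json_encode($request) . "\n";
```

The server replies with a result object keyed by the same `id`, which the client hands back to the LLM as the tool's output.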

Core Components of MCP

An MCP deployment consists of three parts:

Host: the machine running your PHP agent.

MCP Server: a lightweight process that exposes tool definitions over a stdio (stdin/stdout) interface.

MCP Client: a library in your PHP code that talks to the server, discovers available tools, and forwards calls.

During development the server runs on the same host as the agent; in production you can install it on any cloud instance.
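To make the stdio transport concrete, here is a minimal plain-PHP sketch that spawns a child process and exchanges one newline-delimited JSON-RPC message with it. `cat` stands in for a real MCP server (which you would launch with its own command, e.g. via npx), so it simply echoes the request back:

```php
// Send one newline-delimited JSON-RPC message to a child process over
// its stdin and read one line of response from its stdout.
function sendJsonRpc(array $pipes, array $message): array
{
    fwrite($pipes[0], json_encode($message) . "\n");
    return json_decode(fgets($pipes[1]), true);
}

$descriptors = [
    0 => ['pipe', 'r'],  // child's stdin (we write to it)
    1 => ['pipe', 'w'],  // child's stdout (we read from it)
];

// `cat` echoes stdin back to stdout, standing in for an MCP server.
$process = proc_open('cat', $descriptors, $pipes);

$reply = sendJsonRpc($pipes, [
    'jsonrpc' => '2.0',
    'id'      => 1,
    'method'  => 'tools/list',
]);

fclose($pipes[0]);
fclose($pipes[1]);
proc_close($process);
```

A real client also matches responses to requests by `id` and handles server-initiated notifications; this sketch only shows the framing.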

Installing the Neuron AI Framework

Neuron AI is an open‑source PHP library that bridges LLMs and MCP. Install it via Composer:

composer require inspector-apm/neuron-ai

Neuron provides a full toolkit: agents, RAG (retrieval‑augmented generation), vector stores, embeddings, and observability features.

Creating a Custom Agent

Extend NeuronAI\Agent and implement three methods: provider() (returns an LLM provider such as Anthropic, OpenAI, etc.), instructions() (system prompt for the LLM), and tools() (list of callable tools).

use NeuronAI\Agent;
use NeuronAI\Providers\AIProviderInterface;
use NeuronAI\Providers\Anthropic\Anthropic;
use NeuronAI\Tools\Tool;
use NeuronAI\Tools\ToolProperty;

class MyAgent extends Agent {
    public function provider(): AIProviderInterface {
        // Return an LLM provider (Anthropic, OpenAI, Mistral, etc.)
        return new Anthropic(
            key: 'ANTHROPIC_API_KEY',
            model: 'ANTHROPIC_MODEL',
        );
    }

    public function instructions(): string {
        return "LLM system instructions.";
    }

    public function tools(): array {
        return [
            Tool::make(
                "get_article_content",
                "Use the ID of the article to get its content."
            )
            ->addProperty(
                new ToolProperty(
                    name: 'article_id',
                    type: 'integer',
                    description: 'The ID of the article you want to analyze.',
                    required: true
                )
            )
            ->setCallable(function (int $article_id) use ($pdo) {
                // Example DB query ($pdo is assumed to be a PDO
                // connection available in the enclosing scope)
                $stm = $pdo->prepare("SELECT * FROM articles WHERE id=? LIMIT 1");
                $stm->execute([$article_id]);
                return json_encode($stm->fetch(PDO::FETCH_ASSOC));
            })
        ];
    }
}

The tools() method can return any number of Tool objects. Each tool defines a name, description, typed properties, and a PHP callable that performs the actual work (e.g., a database query, an HTTP request, or a cloud‑API call).
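As a second sketch, a tool that fetches web content for the LLM might look like the following (the tool name and fetch logic are illustrative; a production agent would use an HTTP client with timeouts and error handling):

```php
use NeuronAI\Tools\Tool;
use NeuronAI\Tools\ToolProperty;

// Hypothetical tool that downloads a web page for the LLM to analyze.
$fetchPage = Tool::make(
    'fetch_page',
    'Download the raw content of a public web page by URL.'
)
->addProperty(
    new ToolProperty(
        name: 'url',
        type: 'string',
        description: 'The absolute URL of the page to download.',
        required: true
    )
)
->setCallable(function (string $url) {
    // file_get_contents keeps the sketch dependency-free; swap in
    // Guzzle or curl for retries, timeouts, and error handling.
    return file_get_contents($url);
});
```

Returning such a Tool from tools() is all it takes; Neuron handles describing it to the LLM and dispatching calls to the callable.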

Loading Tools from an MCP Server

Neuron ships a McpConnector component that automatically discovers tools exposed by an MCP server and adds them to your agent:

use NeuronAI\MCP\McpConnector;

class MyAgent extends Agent {
    // ... provider() and instructions() as before ...

    public function tools(): array {
        return [
            // Load all tools from a remote MCP server
            ...McpConnector::make([
                'command' => 'npx',
                'args' => ['-y', '@modelcontextprotocol/server-everything'],
            ])->tools(),

            // Your custom tools (e.g., the get_article_content example above)
            // Tool::make(...)
        ];
    }
}

When the agent decides to run a tool, Neuron formats the request, sends it over stdio to the MCP server, receives the JSON result, and feeds it back to the LLM so the conversation can continue seamlessly.

Putting It All Together

1. Install the MCP server of your choice (e.g., the modelcontextprotocol/server-everything Docker image) on the same machine as your PHP code.
2. Start the server so it listens on stdio.
3. Write a PHP agent class as shown above, adding any custom tools you need.
4. Run the agent; the LLM can now request web pages, query databases, call Stripe‑like APIs, or execute any tool the MCP server exposes.
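Assuming the MyAgent class defined earlier, kicking off a conversation could look like this (the method names follow the Neuron AI documentation; treat the exact call shapes as a sketch):

```php
use NeuronAI\Chat\Messages\UserMessage;

// Instantiate the agent and send a user message; Neuron runs any
// tool calls the LLM requests before returning the final answer.
$response = MyAgent::make()->chat(
    new UserMessage('Summarize the article with ID 42.')
);

echo $response->getContent();
```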

This approach dramatically reduces the amount of boilerplate required to build multi‑tool AI assistants in PHP, turning a handful of lines of code into a powerful, extensible agent platform.

Tags: AI agents, LLM, MCP, Tool Integration, PHP, Neuron AI
Written by Open Source Tech Hub, sharing cutting-edge internet technologies and practical AI resources.