Boost Your PHP Development with Neuron: An AI‑Powered Coding Assistant

Neuron Coding Agent is a PHP-based, command-line AI assistant. It integrates multiple large-model providers (Anthropic, OpenAI, Gemini, Ollama, and others), installs locally via Composer, and brings context-aware code generation, review, and debugging directly to your terminal.


Overview

Neuron Coding Agent is a command‑line tool built with PHP and the Neuron AI framework. It runs locally and uses multiple AI providers to assist with coding, debugging, code review, and other software‑engineering tasks.

System Requirements

PHP >= 8.1

Composer

Installation

Global installation (recommended)

composer global require neuron-core/coding-agent

Add Composer’s global bin directory to your PATH, for example:

# ~/.bashrc or ~/.zshrc
export PATH="$HOME/.config/composer/vendor/bin:$PATH"
# Print the absolute path of Composer's global bin directory
# (it varies by OS and Composer version):
composer global config bin-dir --absolute
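Whether the bin directory is already on PATH can be checked with ordinary shell string matching. The helper below is a small sketch: path_contains is a hypothetical name, and the example path list is illustrative.

```shell
# Sketch: test whether a directory appears in a PATH-like string.
path_contains() {
  case ":$2:" in
    *":$1:"*) return 0 ;;   # directory found
    *)        return 1 ;;   # directory missing
  esac
}

# Illustrative PATH value containing Composer's default global bin dir:
EXAMPLE_PATH="/usr/bin:/usr/local/bin:$HOME/.config/composer/vendor/bin"
if path_contains "$HOME/.config/composer/vendor/bin" "$EXAMPLE_PATH"; then
  echo "composer bin dir found"
fi
```

If the directory is missing, append the export line above to your shell profile and reload it.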

Project‑local installation (recommended for a specific project)

composer require --dev neuron-core/coding-agent

Add a custom script to composer.json for easy invocation:

{
  "scripts": {
    "neuron": "vendor/bin/neuron"
  }
}
Note: On native Windows (outside WSL or Git Bash), run .\vendor\bin\neuron instead.
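Composer exposes entries under "scripts" as runnable commands, so with the entry above the agent can also be launched with composer run-script neuron. The snippet below is just a sketch showing the script entry being read back from a throwaway composer.json; python3 is used only as a convenient JSON reader.

```shell
# Sketch: write a throwaway composer.json in a temp dir and read the
# "neuron" script entry back, mirroring what `composer run-script neuron`
# would execute.
TMP=$(mktemp -d)
printf '{"scripts": {"neuron": "vendor/bin/neuron"}}\n' > "$TMP/composer.json"
SCRIPT=$(python3 -c "import json, sys; print(json.load(open(sys.argv[1]))['scripts']['neuron'])" "$TMP/composer.json")
echo "$SCRIPT"
rm -rf "$TMP"
```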

Configuration

Create a settings file in the project root:

mkdir -p .neuron && echo "{}" > .neuron/settings.json

Provider configuration examples

Anthropic:

{
  "provider": {
    "type": "anthropic",
    "api_key": "sk-ant-your-api-key-here",
    "model": "claude-sonnet-4-20250514",
    "max_tokens": 8192
  }
}

OpenAI:

{
  "provider": {
    "type": "openai",
    "api_key": "sk-your-openai-key-here",
    "model": "gpt-4",
    "max_tokens": 8192
  }
}

Google Gemini:

{
  "provider": {
    "type": "gemini",
    "api_key": "your-gemini-api-key",
    "model": "gemini-pro",
    "max_tokens": 8192
  }
}

Local Ollama:

{
  "provider": {
    "type": "ollama",
    "base_url": "http://localhost:11434",
    "model": "llama2"
  }
}

Other supported providers (Cohere, Mistral, Grok/xAI, Deepseek) follow the same JSON structure; only the type value and the corresponding API-key field change.
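For instance, a Mistral configuration might look like the following. The field names are assumed to mirror the examples above, and the model name is purely illustrative:

```json
{
  "provider": {
    "type": "mistral",
    "api_key": "your-mistral-api-key",
    "model": "mistral-large-latest",
    "max_tokens": 8192
  }
}
```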

Optional MCP server configuration

{
  "mcp_servers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/workspace"]
    },
    "brave-search": {
      "command": "uvx",
      "args": ["mcp-brave-search"]
    },
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": {
        "GITHUB_PERSONAL_ACCESS_TOKEN": "your-github-token"
      }
    }
  }
}
Note: .neuron/settings.json must reside in the current working directory when running the neuron command.

Usage

Interactive chat mode

neuron

Single‑question mode

neuron "How do I fix this PHP error?"

Project‑context mode (recommended)

cd /path/to/your/project
neuron

When executed inside a project directory, the assistant automatically reads its files to provide context-aware assistance.

Typical dialogue examples:

> What does this project do?
> Please review UserController.php for security issues.
> Auth.php throws "Class not found" error, how to fix?
> Refactor UserService to use dependency injection.

Key Features

Multi-model provider support: Anthropic Claude, OpenAI, Gemini, Cohere, Mistral, Ollama, Grok, Deepseek.

File-system integration: read, search, and analyze any project directory.

MCP support: connect to Model Context Protocol servers for extended capabilities.

Native CLI experience: built on Minicli for smooth terminal interaction.

Context awareness: understands overall project structure before suggesting changes.

Security first: code stays on the local machine; only necessary snippets are sent to the AI API.

Programming-focused prompts: system prompts optimized for software-engineering tasks.

Architecture

Neuron AI Framework: provides the agent architecture and tool integrations.

Settings module: loads multi-provider configurations from .neuron/settings.json.

Provider Factory: dynamically creates an instance of the selected AI provider based on the configuration.

Minicli: handles the command-line interface and routing.

The assistant’s file‑system tools include directory listing, file reading, pattern searching, glob‑based discovery, and document parsing (PDF, HTML, etc.).
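The Provider Factory's dispatch step can be pictured as reading the type field from .neuron/settings.json and mapping it to a backend. The shell sketch below only illustrates that flow; the agent's actual PHP implementation is not shown in this article, and python3 is used merely as a JSON reader.

```shell
# Hypothetical sketch of the factory's dispatch logic: read the provider
# "type" from settings.json and select the matching backend.
mkdir -p .neuron
echo '{"provider": {"type": "ollama", "model": "llama2"}}' > .neuron/settings.json
TYPE=$(python3 -c "import json; print(json.load(open('.neuron/settings.json'))['provider']['type'])")
case "$TYPE" in
  anthropic|openai|gemini|ollama|cohere|mistral|grok|deepseek)
    MSG="instantiating $TYPE provider" ;;
  *)
    MSG="unsupported provider type: $TYPE" ;;
esac
echo "$MSG"
```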

Security Considerations

Code is processed locally; only required fragments are transmitted to the chosen AI API.

No code is stored on external servers, beyond whatever request logs your chosen provider keeps.

File‑system tools are read‑only and never execute code.

API keys remain only in the local .neuron/settings.json file.

Written by Open Source Tech Hub, sharing cutting-edge internet technologies and practical AI resources.