Zero‑Change AI Integration: How LApiGateway Transforms RPC Services with MCP

This article explains how LApiGateway leverages the Model Context Protocol (MCP) to enable AI clients to access existing HTTP RPC services without code changes, detailing the protocol basics, current challenges, the gateway's architecture, implementation steps, and the resulting operational benefits.

Huolala Tech

Knowledge Background

MCP Server implements the Model Context Protocol, exposing tools, resources, prompts, and completions to AI clients via JSON‑RPC. The protocol defines three transport modes (stdio, HTTP SSE, Streamable HTTP) and three server types: Stateless Server, Stateless Streamable Server, and Stateful Server.

MCP Tool is a callable capability with the fields name, description, and inputSchema. When invoked, the server returns a CallToolResult containing either TextContent or structured data such as JSON.
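To make these shapes concrete, here is an illustrative tool descriptor and call result, written as Python dicts mirroring the JSON wire format. The tool name and schema are invented for illustration; they are not part of LApiGateway.

```python
# Illustrative MCP tool descriptor, as a server would advertise it via
# tools/list. The name and input schema here are hypothetical examples.
order_status_tool = {
    "name": "get_order_status",
    "description": "Query the status of an order by its id.",
    "inputSchema": {
        "type": "object",
        "properties": {"orderId": {"type": "string"}},
        "required": ["orderId"],
    },
}

# Illustrative CallToolResult for a successful tools/call invocation,
# carrying a single TextContent item (the text happens to be JSON).
call_tool_result = {
    "content": [
        {"type": "text", "text": '{"orderId": "A-1001", "status": "DELIVERED"}'}
    ],
    "isError": False,
}
```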

Current Situation

Industry Trend

AI adoption is accelerating, and more AI clients use MCP as a unified interface to external services, reducing integration complexity and improving observability.

Internal Challenges

Huolala has a large set of HTTP RPC services. Refactoring all services into native MCP servers faces two major obstacles: high migration cost and long, risky timelines.

High conversion cost: extensive refactoring, testing, and resource allocation.

Poor schedule-risk controllability: a months-long effort across multiple teams, with a high risk of compatibility issues.

Core Pain Points

Java version compatibility: existing services run on Java 8, while the MCP Server SDK requires Java 17+.

Traffic system adaptation conflict: the existing custom traffic management (gray releases, lane isolation) is not supported by a native MCP Server.

Protocol version iteration risk: future MCP updates may force re-engineering of already migrated services.

Solution

LApiGateway provides MCP conversion technology that resolves the “zero‑change AI integration” problem by handling Java version compatibility, traffic system conflicts, and protocol iteration risks.

3.1 LApiGateway Architecture Design

LApiGateway is Huolala’s internal microservice gateway offering traffic forwarding, authentication, rate limiting, parameter modification, and validation to improve developer efficiency.

LApi Control Plane includes service configuration (LApi management platform + Apollo), service discovery via Consul, and monitoring via Trace and HLL Monitor.

LApi Data Plane routes requests through load balancers (KONG, SLB) to LApi nodes, where plugins process the traffic before forwarding to downstream services.

Within the MCP Server proxy layer, MCP Tools are mapped to existing HTTP REST calls, enabling dynamic configuration and rapid AI capability activation while reusing gateway features such as authentication, rate limiting, retries, circuit breaking, routing, gray releases, and observability. The design is stateless, supporting horizontal scaling.
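As a sketch of what such a mapping might look like, each MCP Tool could be declared alongside the downstream HTTP call it proxies. The field names and the "{{arguments.*}}" template syntax below are hypothetical stand-ins, not LApiGateway's actual configuration schema:

```python
# Hypothetical gateway-side mapping from one MCP Tool to an existing HTTP
# REST endpoint. All field names and the template syntax are illustrative.
tool_route = {
    "tool": {
        "name": "get_order_status",
        "description": "Query the status of an order by its id.",
        "inputSchema": {
            "type": "object",
            "properties": {"orderId": {"type": "string"}},
            "required": ["orderId"],
        },
    },
    "upstream": {
        "method": "GET",
        # "{{arguments.orderId}}" stands in for the gateway's expression
        # engine substituting tool-call arguments into the request.
        "path": "/order-service/v1/orders/{{arguments.orderId}}/status",
        # Reused gateway plugins applied on the proxied call.
        "plugins": ["auth", "rate-limit", "gray-release"],
    },
}
```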

3.2 Technical Details

The gateway initially used HTTP SSE; after the MCP protocol upgrade to Streamable HTTP, it migrated to a Stateless Server model, solving load‑balancing and high‑availability concerns.

MCP Tool call processing flow:

Client sends a request to /{serviceAppid}/{mcpServerName}/mcp.

The gateway’s stateless MCP service translates the Tool call into a standard downstream HTTP request.

The request is routed to the backend RPC service, passing through configured plugins (rewrite, auth, rate limiting, load balancing).

If the downstream returns 2xx, the response text is wrapped as CallToolResult.TextContent; if 4xx/5xx, an MCP error is raised.

The gateway packages the result or error into a JSON‑RPC response for the AI client.
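The five steps above can be sketched as a single translation function. Everything here (helper names, the fake downstream call) is a simplified stand-in for the gateway's real plugin chain and HTTP forwarding, not its actual implementation:

```python
import json

def handle_tool_call(rpc_request, send_downstream):
    """Translate a JSON-RPC tools/call request into an HTTP call and wrap
    the outcome back into a JSON-RPC response.

    rpc_request     -- parsed JSON-RPC 2.0 request dict
    send_downstream -- callable(tool_name, arguments) -> (status, body);
                       stands in for plugin processing + HTTP forwarding
    """
    params = rpc_request["params"]
    status, body = send_downstream(params["name"], params.get("arguments", {}))

    if 200 <= status < 300:
        # 2xx: wrap the response body as CallToolResult.TextContent.
        result = {"content": [{"type": "text", "text": body}], "isError": False}
        return {"jsonrpc": "2.0", "id": rpc_request["id"], "result": result}

    # 4xx/5xx: surface a JSON-RPC error to the AI client.
    return {
        "jsonrpc": "2.0",
        "id": rpc_request["id"],
        "error": {"code": -32000, "message": f"upstream returned HTTP {status}"},
    }

# Example: a fake downstream service that always succeeds.
def fake_downstream(tool_name, arguments):
    return 200, json.dumps({"tool": tool_name, "ok": True})

request = {
    "jsonrpc": "2.0",
    "id": 7,
    "method": "tools/call",
    "params": {"name": "get_order_status", "arguments": {"orderId": "A-1001"}},
}
response = handle_tool_call(request, fake_downstream)
```

Because the function holds no session state between calls, any gateway node can serve any request, which is what makes the stateless design straightforward to load-balance.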

Key points:

Stateless transport enables easy horizontal scaling.

Tool HTTP generation uses LApi’s expression engine to map request data.

Only JSON‑RPC results are exposed to the client.

MCP Java SDK dependencies are downgraded so the AI tooling remains compatible with Java 8.
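How the expression-based request generation might work can be shown with a deliberately simplified template syntax; LApi's actual expression engine is not shown here, and the "{{arguments.*}}" placeholder form is an assumption made for illustration:

```python
import re

def render(template, arguments):
    """Substitute {{arguments.key}} placeholders with tool-call arguments.

    This mimics, in toy form, mapping MCP tool-call arguments onto an
    HTTP request path; a real expression engine would support richer
    expressions, escaping, and validation against the inputSchema.
    """
    def repl(match):
        return str(arguments[match.group(1)])
    return re.sub(r"\{\{arguments\.([A-Za-z0-9_]+)\}\}", repl, template)

path = render("/order-service/v1/orders/{{arguments.orderId}}/status",
              {"orderId": "A-1001"})
# path == "/order-service/v1/orders/A-1001/status"
```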

3.3 Platformization

The LApi management platform provides an MCP Server configuration module and an MCP service marketplace, allowing dynamic addition of Tools, one‑click publishing, and access to curated services with code quality, documentation, and monitoring.

Conclusion

LApiGateway’s MCP conversion practice offers a viable path for enterprises to connect legacy RPC services to AI tool ecosystems without code changes, enhancing observability, governance, and security while accelerating AI integration. The stateless MCP Server architecture and expression engine deliver rapid AI capability rollout, zero‑cost service modernization, and a shared tool platform for future digital transformation.
