
Why Game Servers Shun Microservices: Real‑Time Performance Challenges

Game servers often avoid microservice architectures: real‑time communication, low latency, and stateful processing demand tightly coupled, high‑performance designs. The added network overhead, stateless constraints, and operational complexity of microservices make them a poor fit for fast‑paced multiplayer games.

Java Backend Technology

1. Background

During a recent interview at a listed game company, the interviewer asked whether the company planned to adopt a microservice architecture. The interviewee explained the usual benefits of microservices—easier testing, maintenance, and upgrades, loose coupling, multi‑language support, and automatic scaling—but the interviewer, a game developer, countered that game servers require real‑time performance and microservices would introduce latency, so a modular monolith suffices.

2. High‑voted Answer by hongjic93

hongjic93 illustrated the point with MOBA games such as "Honor of Kings" or "League of Legends". While ancillary systems (account, rune, hero, skin, friend, messaging) could be split into microservices, the core of a MOBA is fast, multi‑way communication among a small group of players (typically 10). This real‑time requirement tolerates only a few milliseconds of delay.

Microservices increase network overhead by breaking a single process into multiple services, adding extra hops, service meshes, gateways, and sidecars, all of which threaten the low‑latency goal. Moreover, most microservices follow a request/response model and are designed to be stateless, which conflicts with the stateful streaming needed for real‑time game events.
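A back‑of‑the‑envelope sketch makes the overhead argument concrete. The numbers below are illustrative assumptions (not measurements from the article): an in‑process call costs roughly a microsecond, while each extra network hop through a gateway, mesh proxy, or sidecar can cost a large fraction of a millisecond, quickly consuming a per‑tick latency budget of a few milliseconds.

```java
// Illustrative latency arithmetic: per-hop costs and the tick budget
// are assumed numbers, chosen only to show how hops add up.
public class HopBudget {

    // Total latency for a chain of network hops, each costing perHopMs.
    public static double totalLatencyMs(double perHopMs, int hops) {
        return perHopMs * hops;
    }

    public static void main(String[] args) {
        double budgetMs = 5.0;                            // assumed per-tick budget
        double inProcess = totalLatencyMs(0.001, 1);      // one local method call
        double meshed = totalLatencyMs(0.5, 4);           // gateway + mesh + sidecars
        System.out.printf("in-process: %.3f ms, meshed: %.1f ms, budget: %.1f ms%n",
                inProcess, meshed, budgetMs);
    }
}
```

Even with generous assumptions, a few hops eat a large share of the tick budget before any game logic runs.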

In practice, a single server often handles all communication for a match, enabling local data exchange and maximum performance. This requires sticky routing so that a player’s connection remains bound to the same server; the stateless design of microservices, however, conflicts with sticky routing.
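Sticky routing can be as simple as deterministically hashing the match ID to a server instance, so all ten players in a match land on the same process. This is a minimal sketch; the class and server names are hypothetical, not from the original answer.

```java
import java.util.List;

// Hypothetical sketch: route every player in a match to the same
// game-server instance by hashing the match ID, so all match traffic
// stays on one process with no per-request load balancing.
public class MatchRouter {
    private final List<String> servers;

    public MatchRouter(List<String> servers) {
        this.servers = servers;
    }

    // Deterministic: the same matchId always maps to the same server,
    // which is exactly the "sticky" placement game servers need.
    public String serverFor(String matchId) {
        int idx = Math.floorMod(matchId.hashCode(), servers.size());
        return servers.get(idx);
    }

    public static void main(String[] args) {
        MatchRouter router = new MatchRouter(List.of("gs-1", "gs-2", "gs-3"));
        // All ten players in match-42 resolve to the same process.
        System.out.println(router.serverFor("match-42"));
    }
}
```

Note what a generic microservice load balancer would do instead: spread requests across replicas, which is precisely the behavior a match cannot tolerate.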

Game sessions are essentially isolated sandboxes, each maintaining long‑lived state (tower health, player kills, etc.) until the match ends. Although this state is not persisted to a database, it lives in memory for the duration of the game, making it unsuitable for a stateless microservice design. Offloading the state to Redis would still incur remote request latency.
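The sandbox idea can be sketched as a plain object that owns all match state in memory for the lifetime of the game. The class and field names below are illustrative assumptions; the point is that applying an event is a local mutation with no network hop or database round‑trip on the hot path.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of a match "sandbox": all mutable state lives in
// one object on one server until the match ends, and is never persisted
// to a database while the game is running.
public class MatchSession {
    private final Map<String, Integer> kills = new HashMap<>();
    private int towerHealth = 5000;
    private boolean finished = false;

    // A local, in-memory mutation: no serialization, no remote call.
    public void recordKill(String playerId) {
        kills.merge(playerId, 1, Integer::sum);
    }

    public void damageTower(int amount) {
        towerHealth = Math.max(0, towerHealth - amount);
        if (towerHealth == 0) finished = true;
    }

    public int killsOf(String playerId) { return kills.getOrDefault(playerId, 0); }
    public int towerHealth() { return towerHealth; }
    public boolean isFinished() { return finished; }

    public static void main(String[] args) {
        MatchSession match = new MatchSession();
        match.recordKill("player-7");
        match.recordKill("player-7");
        match.damageTower(5000);
        System.out.println(match.killsOf("player-7") + " kills, finished=" + match.isFinished());
        // prints: 2 kills, finished=true
    }
}
```

Moving `kills` and `towerHealth` into Redis would turn each of these nanosecond field updates into a network round‑trip, which is the latency cost the answer is warning about.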

Overall, microservices do not address the core challenges of real‑time, low‑latency, stateful game servers.

3. High‑voted Answer by brice

Brice shared experience from developing board‑game servers. He noted that microservices are intended for complex business logic, but many game services (e.g., lobby, login) are relatively simple and inherently stateful.

Game servers keep state in memory and only occasionally use Redis for transient data; relational databases are used solely for asynchronous persistence. Real‑time push requirements mean that TCP is preferred over HTTP/WebSocket gateways, and typical HTTP‑based RPC frameworks (Ribbon, Feign) are unsuitable due to message ordering issues.

Long‑lived TCP connections (often implemented with Netty) are the norm, and the thread model differs from standard Spring MVC, making frameworks like Spring Cloud less appropriate without significant adaptation. Auto‑scaling in games is handled by dedicated “open‑server” processes rather than generic microservice scaling mechanisms.
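The single‑threaded event‑loop model behind frameworks like Netty can be sketched with standard JDK primitives: one dedicated thread drains a per‑match queue, so events are applied strictly in arrival order without locks on game state. This is a simplified illustration of the thread model, not Netty itself, and the class name is hypothetical.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Hypothetical sketch of a per-match event loop: a single consumer
// thread guarantees ordered, lock-free access to match state, unlike
// the request-per-thread model of a typical Spring MVC service.
public class MatchLoop {
    private final BlockingQueue<Runnable> inbox = new LinkedBlockingQueue<>();
    private final Thread loop;

    public MatchLoop() {
        loop = new Thread(() -> {
            try {
                while (!Thread.currentThread().isInterrupted()) {
                    inbox.take().run();   // events run one at a time, in FIFO order
                }
            } catch (InterruptedException ignored) {
                // shutdown() interrupts the blocked take(); exit the loop
            }
        });
        loop.start();
    }

    // Producers (network threads) enqueue; only the loop thread mutates state.
    public void submit(Runnable event) { inbox.add(event); }

    public void shutdown() { loop.interrupt(); }
}
```

Because only the loop thread ever touches the match state, message ordering is preserved end to end, which is exactly what concurrent HTTP‑based RPC handlers cannot guarantee.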

While some peripheral services (e.g., payment) could benefit from microservice principles, the core game logic servers remain stateful, low‑latency, and tightly coupled, limiting the applicability of microservices.

4. Conclusion

Microservices are not a silver bullet for game server architecture. They add network overhead, conflict with the need for stateful streaming, and introduce complexity that outweighs their benefits for real‑time multiplayer games. A carefully designed monolithic or modular approach that preserves low latency and stateful processing remains the preferred solution for high‑performance game servers.

Republication Notice

This article has been distilled and summarized from source material, then republished for learning and reference. If you believe it infringes your rights, please contact admin@besthub.dev and we will review it promptly.

Tags: real-time, microservices, network latency, game-servers, stateful architecture
Written by

Java Backend Technology

Focus on Java-related technologies: SSM, Spring ecosystem, microservices, MySQL, MyCat, clustering, distributed systems, middleware, Linux, networking, multithreading. Occasionally cover DevOps tools like Jenkins, Nexus, Docker, and ELK. Also share technical insights from time to time, committed to Java full-stack development!
