How to Seamlessly Migrate and Validate Anti‑Cheat Services Across Environments
This article details the end‑to‑end process of migrating an anti‑cheat service to a new data center, verifying strategy effectiveness, building a real‑sample regression pipeline, and automating integration steps using GoAPI and traffic‑comparison platforms to ensure functional consistency and security.
Background and Objective
The anti‑cheat service was migrated to the Jiande data center by deploying a complete, independent cluster and conducting performance tests, fault drills, and functional regression before gradually switching traffic online. QA compared core interface responses across the two environments to verify stable data reporting and functional consistency.
Environment Migration
The migration involved an isolated deployment of the anti‑cheat cluster in Jiande, followed by comprehensive testing and phased traffic cut‑over to ensure service continuity.
Strategy Validation
Before launching a new intelligent risk‑control product, existing strategies were validated using real‑device samples collected by QA during functional testing. These samples, enriched with various cheat and environment risk data, were precisely labeled and replayed during regression to confirm that strategies met expected outcomes.
Daily Regression
Routine testing of the anti‑cheat service now leverages real player data—containing device and cheat information—to cover complex scenarios more effectively than scripted data. A continuous sample‑based regression workflow ensures comprehensive coverage of core scenarios.
Anti‑Cheat Integration Process
The integration consists of eight steps:
1. Create interface/scene information on the GoAPI platform.
2. Configure interface details on the traffic‑comparison platform.
3. Establish a dedicated anti‑cheat sample library.
4. Implement a customized sample collection mechanism.
5. Set up sample collection configurations (ES extraction, field mapping, encryption).
6. Maintain anti‑cheat scene information.
7. Automatically associate samples with scenes.
8. Create regression tasks on the traffic‑comparison platform.
1. GoAPI Platform – Interface/Scene Definition
Define the interfaces or scenes to be tested, including parameter keys and values that will be replaced by sample data.
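A scene definition of this kind can be sketched as a template whose parameter values are placeholders, substituted from a stored sample at replay time. The schema below is an illustrative assumption, not GoAPI's actual format:

```python
# Hypothetical sketch: a scene definition whose '{{key}}' placeholders are
# replaced with fields from a collected sample at replay time.
# All field names here are illustrative, not the platform's real schema.

def render_scene(scene: dict, sample: dict) -> dict:
    """Replace '{{key}}' placeholders in the scene's params with sample fields."""
    rendered = {}
    for key, value in scene["params"].items():
        if isinstance(value, str) and value.startswith("{{") and value.endswith("}}"):
            rendered[key] = sample[value[2:-2]]  # look up the named sample field
        else:
            rendered[key] = value  # fixed parameter, kept as-is
    return {"name": scene["name"], "params": rendered}

scene = {
    "name": "report_cheat_event",
    "params": {"device_id": "{{device_id}}", "cheat_type": "{{cheat_type}}", "version": "1.0"},
}
sample = {"device_id": "dev-001", "cheat_type": "speed_hack"}
print(render_scene(scene, sample))
```

Keeping fixed parameters (such as `version`) alongside placeholders lets one scene definition serve every sample in the library.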
2. Traffic‑Comparison Platform – Interface Configuration
Register the tested interfaces, set sub‑services and business types, configure domain names for the two environments, and link the scenes created on GoAPI.
3. Build Anti‑Cheat Sample Library
Create a sample repository on the traffic‑comparison platform, specifying sub‑service, business type, and sample limits for subsequent data ingestion.
4. Custom Sample Collection Mechanism
Samples are sourced from two channels: (a) real player data stored in ES, cleaned, mapped, and encrypted into anti‑cheat samples (10,000 samples collected); (b) QA‑collected real‑device data manually labeled (105 samples collected).
5. Sample Collection Configuration
Configure extraction rules, cleaning, filtering, and field selection for ES data within the traffic‑comparison platform.
6. Maintain Anti‑Cheat Scene Information
Manually add scene definitions and associate them with samples to monitor coverage during execution.
7. Automatic Sample‑Scene Association
After sample execution, key information is extracted from results to automatically create and link scene entries.
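The association step can be sketched as deriving a scene key from each execution result and grouping sample ids under it. The key format below is an illustrative assumption:

```python
# Hypothetical sketch of automatic sample-scene association: after a sample
# runs, key fields from its result determine which scene entry it belongs to.

def associate(result: dict, scene_index: dict) -> str:
    """Derive a scene key from the execution result and register the sample under it."""
    key = f"{result['platform']}/{result['cheat_type']}"
    scene_index.setdefault(key, []).append(result["sample_id"])  # create scene on first hit
    return key

scenes: dict[str, list] = {}
associate({"sample_id": "s1", "platform": "android", "cheat_type": "speed_hack"}, scenes)
associate({"sample_id": "s2", "platform": "android", "cheat_type": "speed_hack"}, scenes)
print(scenes)  # {'android/speed_hack': ['s1', 's2']}
```

An index like this is what makes the coverage monitoring in step 6 possible: scenes with empty sample lists are the uncovered ones.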
8. Create Regression Tasks
Schedule regression tasks on the traffic‑comparison platform, trigger GoAPI interfaces/scenes, poll task status, analyze results, and display reports.
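The trigger/poll/compare loop can be sketched as follows. The platform APIs are simulated in‑memory here, since the real GoAPI and traffic‑comparison interfaces are not public:

```python
import time

# Hypothetical sketch of a regression task: trigger it, poll its status until
# completion, then diff the responses from the two environments.

def run_regression(trigger, poll, compare, interval=0.01, timeout=1.0):
    task_id = trigger()
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = poll(task_id)
        if status["state"] == "done":
            # Compare the online response against the Jiande response.
            return compare(status["online"], status["jiande"])
        time.sleep(interval)  # back off before polling again
    raise TimeoutError(f"task {task_id} did not finish")

# Simulated platform calls: the task finishes on the third poll.
calls = {"n": 0}
def trigger(): return "task-1"
def poll(task_id):
    calls["n"] += 1
    if calls["n"] < 3:
        return {"state": "running"}
    return {"state": "done", "online": {"risk": "high"}, "jiande": {"risk": "high"}}
def compare(a, b): return a == b

print(run_regression(trigger, poll, compare))  # True: the environments agree
```

A timeout on the polling loop matters in practice, so a stuck comparison task fails the run instead of hanging the pipeline.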
Progress and Results
Anti‑Cheat Migration to Jiande: 10,000 online samples were collected; comparison between online and Jiande environments showed consistent results, confirming a smooth migration.
Intelligent Risk‑Control Strategy Comparison: Single‑sample comparisons using real‑device data matched expectations, enabling successful product launch.
Future Plans
Enhance sample diversity and extraction methods.
Implement multi‑task, multi‑scene coverage statistics.
Expand anti‑cheat scene coverage to include various platforms and integration methods.
Optimize asynchronous result comparison.
Improve platform features and normalize traffic‑comparison for daily testing and online verification.