Youku Playback Testing Platform: Unified Automation Framework, Services, and System Design
Youku’s unified playback testing platform consolidates a modular automation framework, a comprehensive service chain, and a layered platform ecosystem to standardize workflows, support multiple device types, and provide transparent, real‑time monitoring, thereby reducing development complexity and paving the way for intelligent case recommendation and dynamic verification.
Since last year, the Youku Technical Quality Team has been optimizing its playback testing solutions to improve efficiency, completeness, and effectiveness. A white‑box automation testing framework was built, along with a laboratory‑based verification system and an external‑network dynamic probing system. While these efforts have yielded early gains, new challenges have emerged:
Playback testing capabilities have become extensive and now require a standardized, process‑driven workflow.
Beyond mobile, testing coverage across multiple device types and scenarios remains low, demanding a multi‑platform approach.
The testing process lacks transparency for partners, leading to inconsistent understanding of standards and checkpoints.
To address the three demands of process‑orientation, multi‑platform support, and transparency, Youku has entered a “unified” phase that includes three unifications: a unified core automation testing framework, a unified core automation testing service, and a unified platform ecosystem.
Core Automation Testing Framework
The unified framework must solve several problems:
How to quickly and efficiently support the development and execution of automated cases for each topic.
How to extend the core testing capability to multiple scenarios such as live, OTT, and Iku.
How to make the framework extensible enough to support process‑driven and platform‑driven testing.
To achieve comprehensive coverage of all topics and basic quality tests, the team analyzed common test scenarios, personalized capabilities, and toolsets for each topic, extracted APIs in layers, and divided the new core framework into four modules: api, core, toolkit, and testcase. The overall structure is illustrated in the diagram below:
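To make the layering concrete, the sketch below shows how a testcase could compose the four modules: the api layer wraps low-level endpoints, the core layer provides shared processing, the toolkit layer bundles topic-specific helpers, and the testcase layer expresses the actual case. All class and method names here are illustrative assumptions, not Youku's actual identifiers.

```python
# Hypothetical sketch of the four-layer framework (api / core / toolkit /
# testcase); names are illustrative, not the real framework's identifiers.

class PlayerApi:                      # api layer: thin wrappers over player endpoints
    def start_playback(self, vid: str) -> dict:
        return {"vid": vid, "state": "playing"}

class LogProcessor:                   # core layer: shared capability built on the api layer
    def extract(self, raw: dict) -> dict:
        return {k: v for k, v in raw.items() if k in ("vid", "state")}

class StutterToolkit:                 # toolkit layer: topic-specific helpers
    def __init__(self, api: PlayerApi, logs: LogProcessor):
        self.api, self.logs = api, logs

    def play_and_collect(self, vid: str) -> dict:
        return self.logs.extract(self.api.start_playback(vid))

def test_basic_playback():            # testcase layer: an automated case
    kit = StutterToolkit(PlayerApi(), LogProcessor())
    result = kit.play_and_collect("demo_vid")
    assert result["state"] == "playing"
```

The point of the split is that a new topic only adds a toolkit and testcases; the api and core layers stay shared.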
Common & Personalized Test Schemes
Two representative topics are described:
PCDN topic: The goal is cost reduction and peak traffic mitigation while ensuring correct fragment download. Tests focus on coverage, share rate, waste rate, and duplication rate across various network conditions and user behaviors. Data are collected via VPM hooks and PCDN APIs.
Stutter topic: Focuses on verifying the effectiveness of stutter‑reduction strategies (smart scheduling, slow‑switch logic, network probing) and their impact on playback experience. Data are gathered from VPM hooks, strategy logs, and ODPS historical points, with combinatorial validation across network scenarios and user actions.
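As an illustration of the PCDN metrics named above, the function below computes share, waste, and duplication rates from a list of fragment-download records. The record schema and the exact formulas are assumptions for the sketch, not Youku's actual definitions.

```python
# Illustrative PCDN metric computation; schema and formulas are assumed.

def pcdn_metrics(records):
    """records: dicts with keys 'source' ('pcdn' or 'cdn'),
    'bytes', 'played' (bool), and 'frag_id'."""
    total = sum(r["bytes"] for r in records)
    pcdn_bytes = sum(r["bytes"] for r in records if r["source"] == "pcdn")
    wasted = sum(r["bytes"] for r in records if not r["played"])
    seen, dup_bytes = set(), 0
    for r in records:
        if r["frag_id"] in seen:           # same fragment fetched again
            dup_bytes += r["bytes"]
        seen.add(r["frag_id"])
    return {
        "share_rate": pcdn_bytes / total if total else 0.0,       # PCDN traffic share
        "waste_rate": wasted / total if total else 0.0,           # downloaded, never played
        "duplication_rate": dup_bytes / total if total else 0.0,  # repeat downloads
    }
```

A case under a weak-network profile would collect such records via the VPM hooks and PCDN APIs, then assert the rates stay within topic-specific thresholds.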
Additionally, the team abstracted three core modules—case_checker, log_processor, and network_controller—and incorporated tools such as an AFrame client for simulating user behavior, VPM hook injection, PCDN API retrieval, ODPS queries, hot‑show queries, and OSS uploads.
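The three abstracted modules can be thought of as interfaces that each topic implements or configures. The sketch below shows one plausible shape; the method names and the concrete checker are assumptions for illustration only.

```python
# Minimal interface sketch for the three abstracted core modules;
# method names and the sample implementation are assumptions.
from abc import ABC, abstractmethod

class CaseChecker(ABC):
    @abstractmethod
    def check(self, expected: dict, actual: dict) -> bool: ...

class LogProcessor(ABC):
    @abstractmethod
    def parse(self, raw_lines: list) -> dict: ...

class NetworkController(ABC):
    @abstractmethod
    def apply(self, profile: str) -> None: ...   # e.g. a weak-network profile

class KeyValueChecker(CaseChecker):
    """Sample checker: every expected key must match the actual value."""
    def check(self, expected, actual):
        return all(actual.get(k) == v for k, v in expected.items())
```

Keeping these as narrow interfaces is what lets the same case logic run against different devices and network setups.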
Playback Testing Service
The local core automation framework was evolved into a platform service (PKAT) that integrates with both the laboratory verification and external‑network dynamic probing systems, forming a complete kernel automation testing suite added to the playback testing platform pipeline.
The overall service chain consists of four parts:
Playback test scheduling service: integrates third‑party platforms, manages devices, distributes tasks, and handles results.
AFrameServer deployed in external networks: acts as a data relay, listening for connections and establishing channels.
CaseClient linked with the task scheduler: entry point for executing test cases.
Core automation testing service: provides verification for playback kernels across multiple platforms.
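The four parts above can be sketched as a toy dispatch chain: the scheduler picks a device and forwards the task through the AFrameServer relay, and a CaseClient consumes it to run the case. Queues stand in for the real network channels; the class names mirror the text but the logic is purely illustrative.

```python
# Toy model of the service chain: scheduler -> AFrameServer relay ->
# CaseClient. Queues replace real network channels; logic is illustrative.
import queue

class AFrameServer:                            # data relay between networks
    def __init__(self):
        self.channel = queue.Queue()
    def forward(self, task):
        self.channel.put(task)

class Scheduler:                               # manages devices, distributes tasks
    def __init__(self, relay):
        self.relay, self.devices = relay, ["device_a", "device_b"]
    def dispatch(self, case_id):
        device = self.devices[0]               # trivial device selection
        self.relay.forward({"case": case_id, "device": device})
        return device

class CaseClient:                              # entry point that executes cases
    def __init__(self, relay):
        self.relay = relay
    def run_next(self):
        task = self.relay.channel.get_nowait()
        return {"case": task["case"], "device": task["device"], "status": "passed"}

relay = AFrameServer()
Scheduler(relay).dispatch("pcdn_share_rate_case")
result = CaseClient(relay).run_next()
```

In the real chain the relay listens for external connections and establishes channels; the queue here only mimics that handoff.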
Platform Consolidation
The testing platform adopts a layered design to enhance horizontal scalability. The foundational service layer includes modules for device resource management, case management, case template management, test plan management, execution management, and report management.
Device Resource Management: synchronizes laboratory and external test device information for case execution.
Case Management: maintains case information and permissions, supporting both code‑based and visual component development.
Case Template Management: reduces development effort by providing templates per topic.
Test Plan Management: groups multiple cases to cover diverse testing scenarios.
Execution Management: creates a unique run instance for each case execution, enabling lifecycle monitoring.
Report Management: delivers detailed execution data and result analysis.
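A hypothetical data model for this foundational service layer might look like the following; the field names are illustrative, not the platform's actual schema.

```python
# Hypothetical data model for the foundational service layer;
# field names are assumptions for illustration.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Device:
    device_id: str
    location: str            # "lab" or "external"

@dataclass
class Case:
    case_id: str
    topic: str               # e.g. "pcdn", "stutter"
    template_id: str = ""    # set when the case is built from a template

@dataclass
class TestPlan:
    plan_id: str
    cases: List[Case] = field(default_factory=list)

@dataclass
class Execution:             # one unique run instance per case execution
    exec_id: str
    case_id: str
    status: str = "pending"  # pending -> running -> finished
```

Giving each execution its own instance record is what makes lifecycle monitoring and per-run reporting straightforward.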
Execution Flow
Users trigger a case from the case management module. The request goes to AFrameService, which communicates with the task scheduler, forwards the case to AFrameServer, and finally drives test execution.
AFrameServer streams real‑time execution status back to the report module; upon completion, detailed results are also returned.
Users can create and run test plans; the plan execution follows the same flow as individual case execution.
The report module allows querying of execution data and result analysis.
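The flow above can be condensed into a small sketch: a triggered case yields real-time status updates that the report module records, with a detailed result returned on completion. The statuses and function names are assumptions for illustration.

```python
# Condensed sketch of the execution flow: trigger a case, stream status
# updates to the report, return the final result. Names are assumed.

def run_case(case_id):
    """Generator standing in for AFrameServer's real-time status stream."""
    for status in ("queued", "dispatched", "running"):
        yield {"case": case_id, "status": status}
    yield {"case": case_id, "status": "finished", "result": "passed"}

def trigger(case_id, report):
    """Driver: consume the stream, record each update into the report."""
    for update in run_case(case_id):
        report.append(update)          # report module receives each update
    return report[-1]                  # detailed result on completion

report = []
final = trigger("stutter_weak_net_case", report)
```

A test plan run would simply iterate this per case, which is why the text notes that plan execution follows the same flow.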
Conclusion and Outlook
The Youku Playback Testing Platform has standardized the testing workflow, reduced development complexity, improved efficiency, and ensured transparent, real‑time monitoring and visualization of test results. Future work will focus on strategy‑quality insight analysis based on playback experience, dynamic verification capabilities, intelligent case recommendation, and case expansion.
Youku Technology