Mobile H5 Performance Testing: Challenges, Solutions, and Tool Comparisons
This article examines the difficulties of mobile H5 performance testing—including root‑required tcpdump, JavaScript injection, HTTPS pcap parsing, and white‑screen timing—provides background on mobile browsers, outlines performance metrics, compares four testing approaches, and describes a WebView monitoring workflow with data conversion and visualization.
In the previous article, “Mobile H5 Advertising Performance Automated Testing (Part 1),” several difficult issues were raised: the need for root privileges to run tcpdump, how to inject monitoring JavaScript, how to parse HTTPS traffic from pcap captures, and how to define white‑screen time.
This article presents the authors' solutions. It begins with background on mobile browsers and their rendering engines, covering the core components of a browser (the UI, browser engine, rendering engine, networking layer, etc.) and the evolution of browser kernels (Trident, Gecko, WebKit, Blink).
It then introduces performance metrics (Start Render, DOM Ready, Page Load) obtainable via the W3C Navigation Timing API, and discusses the controversy around first‑paint timing.
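The three metrics can be derived directly from the W3C Navigation Timing (Level 1) `performance.timing` snapshot. A minimal sketch, not from the article: the helper below takes a timing object (here a stand‑in with fabricated timestamps) and computes the deltas; note that “Start Render” has no single agreed‑upon field in Level 1, which is exactly the first‑paint controversy the article mentions, so `responseStart` is used only as a common lower‑bound proxy.

```javascript
// Derive page metrics from a W3C Navigation Timing snapshot.
// Field names follow the Navigation Timing Level 1 spec.
function pageMetrics(t) {
  return {
    // No standard "start render" field in Level 1; first byte
    // (responseStart) is a frequently used lower bound, not first paint.
    startRenderLowerBound: t.responseStart - t.navigationStart,
    domReady: t.domContentLoadedEventEnd - t.navigationStart,
    pageLoad: t.loadEventEnd - t.navigationStart,
  };
}

// Fabricated timestamps (ms since epoch) standing in for performance.timing:
const sample = {
  navigationStart: 1000,
  responseStart: 1180,
  domContentLoadedEventEnd: 1750,
  loadEventEnd: 2400,
};
console.log(pageMetrics(sample));
// → { startRenderLowerBound: 180, domReady: 750, pageLoad: 1400 }
```

In a real page the same function would be called with the live `performance.timing` object after the `load` event fires.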
Four testing approaches are compared, each with its own trade‑offs: Fiddler/Charles packet capture, PhantomJS HAR generation, Chrome remote debugging, and tcpdump + mitmproxy.
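To illustrate the Chrome remote debugging approach: Chrome's DevTools Protocol speaks JSON messages over a WebSocket (on Android, typically reached after something like `adb forward tcp:9222 localabstract:chrome_devtools_remote`). The sketch below, an illustration rather than the article's code, only builds the command frames a client would send to enable page and network events and trigger a navigation; it does not open the socket.

```javascript
// Build Chrome DevTools Protocol command frames (JSON-RPC-style messages).
// Each command needs a unique id so responses can be matched to requests.
let nextId = 0;
function cdpCommand(method, params = {}) {
  return JSON.stringify({ id: ++nextId, method, params });
}

// Typical sequence for timing a page load over remote debugging:
const frames = [
  cdpCommand('Page.enable'),    // subscribe to page lifecycle events
  cdpCommand('Network.enable'), // per-request timings for a waterfall
  cdpCommand('Page.navigate', { url: 'https://example.com' }),
];
console.log(frames[2]);
```

Once connected, the client would listen for events such as `Page.loadEventFired` and `Network.responseReceived` to reconstruct the same metrics and waterfall data.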
Finally, the article outlines a workflow for monitoring WebView performance on Android devices, including injecting monitoring JavaScript, collecting data via OpenSTF, converting pcap to HAR (using pcap2har or a custom Node.js tool), and visualizing results with a timeline waterfall chart.
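The pcap‑to‑HAR step is essentially a mapping from captured request/response pairs into the HAR 1.2 JSON structure that waterfall charts consume. A minimal Node.js sketch of that mapping, not the article's actual tool: the `records` input is hypothetical, already‑parsed data (a real converter such as pcap2har must first reassemble TCP streams and parse HTTP before this stage).

```javascript
// Map already-parsed capture records into a minimal HAR 1.2 document.
// Record fields (startMs, firstByteMs, endMs, ...) are illustrative names.
function toHar(records) {
  return {
    log: {
      version: '1.2',
      creator: { name: 'pcap2har-sketch', version: '0.1' }, // hypothetical
      entries: records.map((r) => ({
        startedDateTime: new Date(r.startMs).toISOString(),
        time: r.endMs - r.startMs, // total ms; drives the waterfall bar width
        request: { method: r.method, url: r.url, headers: [], headersSize: -1, bodySize: -1 },
        response: {
          status: r.status,
          statusText: '',
          headers: [],
          content: { size: r.bodySize, mimeType: r.mimeType },
          headersSize: -1,
          bodySize: r.bodySize,
        },
        timings: {
          send: 0,
          wait: r.firstByteMs - r.startMs,   // server think time
          receive: r.endMs - r.firstByteMs,  // download time
        },
      })),
    },
  };
}

const har = toHar([{
  startMs: 0, firstByteMs: 120, endMs: 300,
  method: 'GET', url: 'https://example.com/ad.js',
  status: 200, bodySize: 5120, mimeType: 'application/javascript',
}]);
console.log(har.log.entries[0].time); // 300
```

The resulting HAR file can be fed straight into standard waterfall viewers, which is what makes HAR a convenient intermediate format for the visualization step.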
The authors conclude with references and an invitation for readers to share their own white‑screen calculation methods.
360 Tech Engineering
The official technology channel of 360, building a professional technology‑sharing platform for the brand.