How to Evaluate Your Website’s Performance with WebPageTest and HTTPArchive Metrics
This article explains how to use WebPageTest and HTTPArchive data to measure key website performance indicators, interprets their meanings and relationships, and provides baseline values for assessing a site’s speed, backend efficiency, and third‑party impact.
How Do You Measure Your Website’s Performance?
First, use a widely recognized testing tool. We recommend WebPageTest, an open‑source tool associated with Google’s “Make the Web Faster” initiative.
WebPageTest originated as an internal AOL tool and was open‑sourced in 2008 under a BSD license. Website: http://www.webpagetest.org/
In this example we test a major site (e.g., Sohu) and capture the main metrics.
After a few minutes the results appear.
We extract the key indicators.
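Pulling these indicators by hand from the results page does not scale; WebPageTest also exposes each test as JSON. Below is a minimal Python sketch of extracting the headline metrics from such a result. The field names (`loadTime`, `TTFB`, `render`, `SpeedIndex`, `requestsFull`, `bytesIn`) follow the WebPageTest REST API’s median first‑view object as commonly documented, but treat them as assumptions and verify them against the JSON your WebPageTest instance actually returns.

```python
def extract_key_metrics(result: dict) -> dict:
    """Pull the headline metrics from a WebPageTest JSON test result.

    Assumes the median first-view run layout of the WebPageTest REST API;
    field names may differ between WPT versions, so verify before relying
    on them.
    """
    fv = result["data"]["median"]["firstView"]
    return {
        "load_time_ms": fv["loadTime"],      # onLoad time
        "ttfb_ms": fv["TTFB"],               # time to first byte
        "start_render_ms": fv["render"],     # start render
        "speed_index": fv["SpeedIndex"],     # speed index score
        "requests": fv["requestsFull"],      # total request count
        "bytes_in": fv["bytesIn"],           # total bytes downloaded
    }
```

In practice you would fetch the JSON from your WebPageTest server’s result URL and feed it to this function.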
These raise three questions:
What do the metrics mean?
How are the metrics related?
How can I assess my site’s level?
This article analyzes each metric, explains their relationships, and provides a baseline derived from the HTTPArchive database for a more scientific evaluation.
HTTPArchive Database An open‑source repository that records performance data and trends for many websites worldwide, useful for horizontal performance comparisons. Website: http://httparchive.org/
Data used are from the 2015‑03‑15 snapshot.
1. Load Time
Load Time (or onLoad Time) is the interval from navigation start to the window’s load event.
Analysis
This event is widely measured by third‑party tools and is strongly correlated with Visual Complete, Speed Index, and the total number of requests: more requests generally mean a slower site.
During the transition from HTTP/1.1 to HTTP/2, measuring Load Time is important because HTTP/2 aims to reuse TCP connections and reduce handshake overhead.
Purpose
Load Time remains a common benchmark for comparing performance across tools such as RUM and WebPageTest, but it is an older metric that does not fully reflect user‑perceived performance. Over time it should be complemented with newer, more relevant indicators.
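Both Load Time and TTFB reduce to simple timestamp arithmetic over the browser’s W3C Navigation Timing fields. A minimal sketch, assuming a dict of Navigation Timing values (epoch milliseconds) captured from the browser:

```python
def derive_metrics(timing: dict) -> dict:
    """Derive Load Time and TTFB from W3C Navigation Timing timestamps.

    Field names (navigationStart, responseStart, loadEventEnd) follow the
    Navigation Timing Level 1 spec; all values are epoch milliseconds.
    """
    return {
        # first byte of the base page after redirects
        "ttfb_ms": timing["responseStart"] - timing["navigationStart"],
        # interval from navigation start to the window load event
        "load_time_ms": timing["loadEventEnd"] - timing["navigationStart"],
    }
```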
Value Distribution
All values are in milliseconds.
HTTPArchive data
2. First Byte (TTFB)
Time To First Byte (TTFB) measures the interval from navigation start to the receipt of the first byte of the base page (after redirects).
Analysis
TTFB shows little correlation with other metrics, affecting mainly the start‑render time. It can reveal backend or CDN performance when measuring static resources.
Purpose
TTFB is valuable for assessing CDN efficiency or backend latency and should be part of performance planning.
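For quick backend or CDN checks outside a full WebPageTest run, TTFB can be approximated directly. A Python sketch using only the standard library; note it is a rough proxy, since unlike WebPageTest it ignores DNS lookup, TLS setup, and redirects:

```python
import http.client
import time

def measure_ttfb(host: str, port: int = 80, path: str = "/") -> float:
    """Seconds from sending a GET request until the response status line
    and headers arrive -- a rough proxy for TTFB that excludes DNS, TLS,
    and redirect time."""
    conn = http.client.HTTPConnection(host, port, timeout=10)
    try:
        start = time.perf_counter()
        conn.request("GET", path)
        conn.getresponse()  # blocks until the first response bytes are parsed
        return time.perf_counter() - start
    finally:
        conn.close()
```

Running this against a static resource on your origin versus the same resource on your CDN gives a quick read on how much latency the CDN removes.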
Value Distribution
All values are in milliseconds.
HTTPArchive data
3. Start Render
Start Render time is the interval from navigation start to the appearance of the first non‑blank content on the screen.
Details are omitted in this article.
4. Visual Complete
Visual Complete attempts to measure the time required to render the above‑the‑fold (ATF) content.
Analysis
Visual Complete is closely related to Fully Loaded, Load Time, and Speed Index. Unless a page contains many lazy‑loaded resources, Visual Complete aligns with Fully Loaded.
Purpose
Because its value correlates with Speed Index and onLoad, measuring Visual Complete alone adds little value, but it is useful when comparing performance before and after lazy loading.
Value Distribution
All values are in milliseconds.
HTTPArchive data
5. Speed Index
Speed Index is a calculated metric that measures how quickly the visible portion of a page is rendered (lower is better).
Analysis
Speed Index is tightly linked to Visual Complete, Start Render, and Load Time, but has low correlation with TTFB and PageSpeed scores.
Purpose
Speed Index reflects content rendering speed, especially for above‑the‑fold elements, and provides a numeric value that is easy to compare.
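Speed Index is computed by integrating the area above the page’s visual‑progress curve: the longer the page spends visually incomplete, the higher the score. A sketch of that calculation, assuming a list of (timestamp in ms, visual completeness in [0, 1]) frames such as those WebPageTest derives from video capture:

```python
def speed_index(frames):
    """Compute Speed Index from visual-progress frames.

    frames: list of (time_ms, completeness) pairs, sorted by time,
    starting at t=0 and ending when completeness reaches 1.0.
    The score is the area above the progress curve (lower is better).
    """
    si = 0.0
    for (t0, c0), (t1, _c1) in zip(frames, frames[1:]):
        # each interval contributes its duration weighted by how
        # incomplete the page still was at the start of the interval
        si += (t1 - t0) * (1.0 - c0)
    return si
```

For example, a page that is 80% visually complete at 1 s and fully complete at 2 s scores 1000 × 1.0 + 1000 × 0.2 = 1200.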
Value Distribution
For content‑heavy sites, an ideal target is around 1000.
Speed Index is a computed score derived from render progress over time, not a directly measured duration.
HTTPArchive data
6. Total Number of Requests
This metric counts how many requests a page makes to the server before loading completes.
Analysis
Request count correlates with third‑party domains but also shows strong relationships with Fully Loaded, Visual Complete, and onLoad.
Purpose
If a site relies heavily on third‑party tags, a high request count can indicate performance degradation, making this metric useful for monitoring third‑party impact.
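To quantify that third‑party impact, the request list can be split by host. A minimal sketch, assuming you have the request URLs (e.g., from a WebPageTest waterfall export) and a set of hostnames you consider first‑party; `third_party_share` is a hypothetical helper name:

```python
from urllib.parse import urlparse

def third_party_share(urls, first_party_hosts):
    """Fraction of requests served from hosts outside the first-party set."""
    third = sum(
        1 for u in urls if urlparse(u).hostname not in first_party_hosts
    )
    return third / len(urls)
```

Tracking this ratio over time makes it easy to show business owners when third‑party tags start dominating the request count.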
Value Distribution
Values are expressed in request units.
HTTPArchive data
7. PageSpeed Insights
Google PageSpeed measures network‑independent performance factors such as server configuration, HTML structure, and the use of images, JavaScript, and CSS.
It provides separate scores for mobile and desktop, ranging from 0 to 100; scores above 85 indicate good performance.
Analysis
PageSpeed has very low correlation with other metrics, highlighting its independence. It is negatively correlated with time‑based metrics: lower times yield higher scores.
Purpose
PageSpeed identifies structural issues (e.g., render‑blocking JavaScript or CSS) that other metrics may miss, making it an essential part of a performance‑planning toolkit.
Value Distribution
Values range from 0 to 100.
HTTPArchive data
8. Total Bytes
Total Bytes represent the sum of all downloaded objects from the first byte received until the page finishes loading.
Analysis
Total Bytes have a strong correlation with Fully Loaded, Visual Complete, and Load Time, though the relationship is non‑linear.
Purpose
This metric helps detect sudden size growth caused by large images or new JavaScript libraries, providing a useful overall health indicator.
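A simple way to operationalize this is a regression check against a recorded baseline. A sketch, with `bytes_regression` and the 10% threshold as illustrative assumptions rather than an established rule:

```python
def bytes_regression(current_bytes, baseline_bytes, threshold=0.10):
    """Flag a page-weight regression when total bytes grow past threshold.

    Returns (flagged, growth_ratio); the 10% default threshold is an
    arbitrary example, not a recommended value.
    """
    growth = (current_bytes - baseline_bytes) / baseline_bytes
    return growth > threshold, growth
```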
Value Distribution
Values are expressed in bytes.
HTTPArchive data
9. Number of Domains
This metric counts the distinct domains from which a page loads resources.
Analysis
More domains usually indicate a busier site and slightly higher Fully Loaded times, though the correlation is modest.
Purpose
Tracking domain count helps monitor third‑party fragmentation; limiting domains and deferring third‑party loads can reduce their impact on perceived performance.
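Counting distinct domains from a request list is straightforward. A minimal sketch using the standard library:

```python
from urllib.parse import urlparse

def count_domains(urls):
    """Number of distinct hostnames a page loads resources from."""
    return len({urlparse(u).hostname for u in urls})
```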
Value Distribution
Values are expressed as a count.
HTTPArchive data
Performance Metrics Summary
HTTPArchive’s extensive desktop‑site dataset shows that Speed Index, Load Time, and PageSpeed scores have clear relationships with perceived performance. Metrics such as domain count, request count, and DOM element count also correlate with Speed Index.
When measurement resources are limited, prioritize Speed Index, Load Time, and PageSpeed scores. If third‑party pressure is high, also track domain count and total request count to provide data‑driven guidance to business owners.
Testing Metric Correlations
We used the HTTPArchive database, extracted non‑null values, and calculated Pearson and Spearman correlation coefficients. Coefficients with absolute value above 0.7 were considered significant; those below 0.4 were deemed weak.
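For reference, both coefficients can be computed without external libraries: Pearson measures linear association, and Spearman is simply Pearson applied to ranks. A sketch (ties are not handled, which suffices for illustration; real analyses should use a statistics package):

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def _ranks(vs):
    """Rank each value 1..n by sorted order (ties not handled)."""
    order = sorted(range(len(vs)), key=lambda i: vs[i])
    ranks = [0.0] * len(vs)
    for rank, i in enumerate(order, 1):
        ranks[i] = float(rank)
    return ranks

def spearman(xs, ys):
    """Spearman correlation: Pearson correlation of the ranks."""
    return pearson(_ranks(xs), _ranks(ys))
```

A monotone but non‑linear relationship (e.g., bytes vs. load time) yields a Spearman coefficient of 1.0 while Pearson stays below it, which is why both were computed.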
Metrics that stood out for their richness and complementary perspectives include Speed Index, Load Time, Google PageSpeed, TTFB, and total domain count.
Efficient Ops
This public account is maintained by Xiaotianguo and friends and regularly publishes original technical articles. We focus on operations transformation and aim to accompany you throughout your operations career.