
Live Streaming Latency Measurement: FLV and WebRTC Solutions

To accurately benchmark and reduce end-to-end latency on large live-streaming platforms, this article details practical measurement frameworks for both HTTP-FLV (using custom SEI timestamps and cloud-synced clocks) and WebRTC (using RTCPeerConnection stats to break the delay into uplink, server, jitter-buffer, decoding, and render stages).

Tencent Cloud Developer

Latency is an essential factor in live streaming services. Implementing a latency‑statistics scheme provides a standard for measuring platform‑wide data, laying the foundation for subsequent optimization and benefit assessment. However, commonly used industry methods are insufficient for large‑scale platforms.

Live streaming latency originates from many stages: content capture → processing → encoding → packetization → push streaming → transmission → transcoding → distribution → decoding → playback. Each stage contributes to overall delay.

The industry often uses a simple “read‑second” approach to roughly estimate end‑to‑end latency by capturing the local push‑stream timestamp and comparing it with the playback timestamp. This method relies on a unified timer and synchronized source/playback time snapshots.
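As a sketch of why this estimate is fragile, the comparison reduces to subtracting two wall-clock snapshots, so any skew between the two machines' clocks lands directly in the result. The function and parameter names below are illustrative, not from the original article:

```typescript
// Minimal sketch of the "read-second" estimate: the push side displays
// its wall clock in the picture, and the viewer snapshots both clocks
// at the same instant. Any uncorrected skew between the two machines'
// clocks (clockSkewMs) becomes a direct error in the estimate.
function naiveLatencyMs(
  pushWallClockMs: number,   // timestamp shown on the push side
  viewerWallClockMs: number, // viewer's clock at the same instant
  clockSkewMs = 0,           // known offset between the two clocks
): number {
  return viewerWallClockMs - pushWallClockMs - clockSkewMs;
}
```

With a 500 ms clock skew, a raw difference of 2.5 s corresponds to only 2 s of real latency, which is why a single unsynchronized snapshot cannot be trusted at platform scale.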

Although straightforward, this approach depends on manual, one-off snapshots, making it ad hoc and unsuitable for aggregating latency statistics across a large platform.

1. FLV‑Based Latency Statistics

HTTP‑FLV remains a primary live‑streaming solution. The H.264 codec allows embedding custom SEI (Supplemental Enhancement Information) data, which can carry timestamps.

The front‑end playback flow for FLV is illustrated below:

During streaming, the encoder writes the current timestamp into the H.264 stream via custom SEI. The front‑end demuxes the FLV, extracts SEI data, and computes the offset between push‑stream time and playback time. Clock synchronization between the push side and client is ensured by a cloud‑function time‑sync service.
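To make the SEI step concrete, here is a hedged sketch of extracting a timestamp from an H.264 SEI payload after demuxing. It assumes a user_data_unregistered SEI message (payloadType 5, whose payload starts with a 16-byte UUID per the H.264 spec) carrying an 8-byte big-endian millisecond timestamp; the exact payload layout is an implementation choice, not something the article specifies:

```typescript
// Hypothetical parser for the RBSP bytes of an H.264 SEI NAL unit.
// Assumes the custom timestamp rides in a user_data_unregistered
// message (payloadType 5): 16-byte UUID followed by an 8-byte
// big-endian millisecond timestamp.
function parseSeiTimestamp(rbsp: Uint8Array): number | null {
  let i = 0;
  while (i < rbsp.length) {
    // payloadType and payloadSize use 0xFF run-length coding
    let payloadType = 0;
    while (rbsp[i] === 0xff) { payloadType += 255; i++; }
    payloadType += rbsp[i++];
    let payloadSize = 0;
    while (rbsp[i] === 0xff) { payloadSize += 255; i++; }
    payloadSize += rbsp[i++];
    if (payloadType === 5 && payloadSize >= 24) {
      // skip the 16-byte UUID, then read the 64-bit ms timestamp
      const view = new DataView(rbsp.buffer, rbsp.byteOffset + i + 16);
      return Number(view.getBigUint64(0));
    }
    i += payloadSize;
  }
  return null;
}
```

The returned push-side timestamp, corrected by the cloud-synced clock offset, is then compared against the playback clock to obtain the transport offset.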

Note that the computed offset does not represent the final end‑to‑end latency because browsers introduce additional playback buffering; the sum of both gives the true latency.
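The combination above can be sketched as a small helper. This is an illustrative formulation, assuming the buffer term is read from the media element's `buffered` ranges and `currentTime`; parameter names are hypothetical:

```typescript
// Sketch: combine the SEI-derived offset with the media the browser
// has buffered but not yet played to get the end-to-end latency.
function endToEndLatencyMs(
  seiOffsetMs: number,   // push time vs. playback time, clock-corrected
  bufferedEndS: number,  // video.buffered.end(video.buffered.length - 1)
  currentTimeS: number,  // video.currentTime
): number {
  const bufferMs = Math.max(0, (bufferedEndS - currentTimeS) * 1000);
  return seiOffsetMs + bufferMs;
}
```

For example, an 800 ms SEI offset plus 400 ms of unplayed buffer yields 1.2 s of true end-to-end latency.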

Practical tests show this scheme is simple, feasible, and can comprehensively measure the entire playback chain latency.

2. WebRTC‑Based Latency Statistics

Tencent’s Penguin E‑Sports has long explored using WebRTC to reduce live‑stream latency. While WebRTC offers lower latency than HTTP‑FLV, measuring the actual improvement is non‑trivial because browsers do not expose low‑level stream data.

The proposed WebRTC latency‑measurement framework divides the end‑to‑end delay into five stages:

Uplink latency: time from push to arrival at the WebRTC server (measured via SEI timestamps).

Server‑to‑browser latency: measured by round‑trip time (RTT).

Jitter‑buffer delay: calculated as jitterBufferDelay / jitterBufferEmittedCount.

Decoding buffer delay: calculated as (framesReceived - framesDecoded - framesDropped) / framesPerSecond.

Render buffer delay: negligible when the player consumes a MediaStream directly, since the video element's buffer length is effectively 0.

Summing these five stages yields the total link latency, with detailed metrics obtained via the RTCPeerConnection.getStats API.
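The two buffer-stage formulas above can be computed from an "inbound-rtp" video report returned by getStats. The stats field names below follow the WebRTC statistics spec; the helper functions and the interface are illustrative, not from the article:

```typescript
// Relevant fields of an RTCInboundRtpStreamStats video report,
// per the W3C WebRTC statistics identifiers.
interface InboundVideoStats {
  jitterBufferDelay: number;        // seconds, cumulative
  jitterBufferEmittedCount: number; // frames emitted from the buffer
  framesReceived: number;
  framesDecoded: number;
  framesDropped: number;
  framesPerSecond: number;
}

// Average time a frame waited in the jitter buffer.
function jitterBufferDelayMs(s: InboundVideoStats): number {
  return (s.jitterBufferDelay / s.jitterBufferEmittedCount) * 1000;
}

// Frames received but not yet decoded (nor dropped), expressed as
// playback time at the current frame rate.
function decodeBufferDelayMs(s: InboundVideoStats): number {
  const queued = s.framesReceived - s.framesDecoded - s.framesDropped;
  return (queued / s.framesPerSecond) * 1000;
}

// In the browser, the report would be pulled roughly like this:
// pc.getStats().then(report => report.forEach(r => {
//   if (r.type === "inbound-rtp" && r.kind === "video") {
//     console.log(jitterBufferDelayMs(r), decodeBufferDelayMs(r));
//   }
// }));
```

Because jitterBufferDelay and jitterBufferEmittedCount are cumulative counters, sampling two reports and dividing the deltas gives the recent average rather than the lifetime average.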

Conclusion

Both broadcasters and viewers demand ever‑lower latency. Implementing robust latency‑statistics solutions provides a benchmark for platform‑wide performance, enabling targeted optimizations and measurable ROI.

This article presented practical latency‑measurement schemes for FLV‑based and WebRTC‑based live streaming, hoping to inspire further discussion and development in the field.

Tags: Front-end, Performance, live streaming, WebRTC, FLV, latency measurement
Written by

Tencent Cloud Developer

Official Tencent Cloud community account that brings together developers, shares practical tech insights, and fosters an influential tech exchange community.