
How Kuaishou Built Its Immersive Panorama Video Engine: From Sphere Modeling to Projection Rendering

This article explains how Kuaishou's audio‑video team implemented panorama video technology in 2020, covering its historical background, user impact, data analysis of a viral video, and the technical pipeline that creates a spherical model, projects it, and renders immersive 360° playback on mobile devices.

Kuaishou Audio & Video Technology

01 Panorama Video Overview

Kuaishou's audio‑video team launched panorama video and live streaming in 2020, offering users immersive, 360° viewing experiences that inspire creative content across extreme sports and travel.

02 Panorama Video’s Rise on Kuaishou

A viral wing‑suit flight video exploded from 5,000 to 1.12 million views within hours, eventually reaching roughly 1.7 million cumulative views, and boosted the creator’s followers from 20,000 to 500,000, demonstrating higher interaction and quality of experience (QoE) than regular videos.

After the hit, the account’s overall QoE and interaction metrics rose sharply.

The share of the account’s videos with over 10,000 views rose from 9.5% to 17.9%.

Panorama videos outperform ordinary videos in engagement and topic reach.

03 Technical Foundations

Panorama (from the Greek pan, “all,” and horama, “view”) is a virtual 360° view created by stitching together images captured by omnidirectional cameras. Users can rotate their phones or use touch gestures to change the viewing angle, giving a sense of presence and interactivity.

Although Kuaishou implemented the technology in 2020, similar features had appeared on Facebook and YouTube in 2015 and were adopted by major news outlets in 2016.

04 Building the Spherical Model

The first step is to construct a sphere centered on the user, using a spherical coordinate system where any point P on the sphere is defined by (r, θ, φ). The conversion formulas are:

x = r sinθ cosφ
y = r sinθ sinφ
z = r cosθ
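The conversion above can be sketched as a small Python function. This is a minimal illustration of the formulas, not code from Kuaishou's engine; the function name is ours.

```python
import math

def spherical_to_cartesian(r, theta, phi):
    """Convert spherical coordinates (r, θ, φ) to Cartesian (x, y, z).

    θ is the polar angle measured from the +z axis and φ is the
    azimuth in the x-y plane, matching the formulas above.
    """
    x = r * math.sin(theta) * math.cos(phi)
    y = r * math.sin(theta) * math.sin(phi)
    z = r * math.cos(theta)
    return x, y, z
```

For example, θ = 0 lands on the north pole of the sphere regardless of φ, and θ = π/2 with φ = 0 lands on the +x axis.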

The mesh is generated by dividing the sphere into latitude rings (stacks) and longitude segments (slices) and computing a vertex position and texture coordinate at each intersection. Because the phone’s coordinate system differs from the model’s, each computed (x, y, z) is remapped to (−y, −z, x) before rendering.
