
How to Build an Ambient Mode for Seamless Video Playback on Android & iOS

This article explains the technical design of an ambient mode that fills video black bars with dynamically extracted dominant colors, covering video frame sampling, median‑cut color quantization, Android and iOS rendering pipelines, gradient animations, and performance optimizations.

Baidu App Technology

1. Introduction

When a video’s aspect ratio does not match the screen, black bars appear at the top/bottom or sides, breaking visual immersion. Major players like YouTube and Netflix use an "Ambient Mode" that extracts the video’s dominant colors and fills the black bars to create a harmonious background.

2. Overall Technical Solution

The ambient effect is added in the video player’s post‑processing chain via an AmbientFilter. The filter downloads a video frame from GPU to CPU, divides the frame into regions, extracts each region’s dominant color using a color‑quantization algorithm, and passes the colors to the platform layer for rendering.
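The region split described above can be sketched in a few lines. This is an illustrative Java sketch only; the `Region` type and `divide` method are hypothetical names, not the actual AmbientFilter API.

```java
// Illustrative sketch: split a frame of width x height pixels into the
// six sampling regions (2 rows x 3 columns) described in the article.
// Region and divide() are hypothetical names, not the real filter API.
public final class RegionDivider {

    public static final class Region {
        public final String name;
        public final int left, top, right, bottom; // pixel bounds; right/bottom exclusive

        Region(String name, int left, int top, int right, int bottom) {
            this.name = name;
            this.left = left; this.top = top;
            this.right = right; this.bottom = bottom;
        }
    }

    public static Region[] divide(int width, int height) {
        String[] cols = {"Left", "Center", "Right"};
        String[] rows = {"Top", "Bottom"};
        Region[] regions = new Region[6];
        int i = 0;
        for (int r = 0; r < 2; r++) {
            for (int c = 0; c < 3; c++) {
                regions[i++] = new Region(rows[r] + cols[c],
                        width * c / 3, height * r / 2,
                        width * (c + 1) / 3, height * (r + 1) / 2);
            }
        }
        return regions;
    }
}
```

Each region's pixels then feed the color-quantization step independently, so the six dominant colors line up spatially with the video edges they sit next to.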

2.1 Video Frame Sampling

To keep overhead low, the filter samples frames rather than processing every one: roughly one frame out of every 50, i.e. about one sample every 2 s at 25 FPS. Two optimizations reduce the cost of the GPU‑to‑CPU transfer:

FBO compression: render the frame into a low‑resolution framebuffer object (e.g., 108p) before downloading it.

PBO asynchronous transfer: use a Pixel Buffer Object to copy data without blocking the render thread.
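The sampling cadence itself is simple gating logic. A minimal Java sketch, with hypothetical names (`FrameSampler`, `shouldSample`) that are not the player's real filter API:

```java
// Illustrative sketch of the sampling cadence: analyze roughly one frame
// out of every SAMPLE_INTERVAL instead of every frame. Names are
// hypothetical, not the production filter API.
public final class FrameSampler {
    private static final int SAMPLE_INTERVAL = 50; // ~2 s at 25 FPS
    private long frameCount = 0;

    /** Returns true when the current frame should be downloaded and analyzed. */
    public boolean shouldSample() {
        return frameCount++ % SAMPLE_INTERVAL == 0;
    }
}
```

When `shouldSample()` returns true, the filter kicks off the FBO downscale and PBO readback; all other frames pass through the post‑processing chain untouched.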

2.2 Dominant Color Extraction

The frame is split into six regions (TopLeft, TopCenter, TopRight, BottomLeft, BottomCenter, BottomRight). For each region, a median‑cut algorithm is applied to build a color palette:

1. Initialize a color box containing all pixels.

2. Select the color component (R, G, or B) with the largest range as the split axis.

3. Split the box at the median value of that component.

4. Recursively repeat steps 2–3 until the desired number of boxes is reached.

5. Compute the average or median color of each final box to form the palette.

6. Map each pixel to the nearest palette color.

Other algorithms (K‑means, octree, popularity) were evaluated, but median‑cut offered the best trade‑off among speed, accuracy, and implementation complexity for this scenario.
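The steps above can be sketched as a compact Java implementation. This is a simplified illustration of the algorithm (it averages each final box rather than taking its median color, and skips the per‑pixel mapping step), not the production code:

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Simplified median-cut palette extraction over ARGB pixels.
// An illustrative sketch of the steps above, not the production code.
public final class MedianCut {

    public static int[] palette(int[] pixels, int paletteSize) {
        // Step 1: one box containing all pixels.
        List<List<Integer>> boxes = new ArrayList<>();
        List<Integer> all = new ArrayList<>();
        for (int p : pixels) all.add(p);
        boxes.add(all);

        // Steps 2-4: repeatedly split the box with the widest channel range
        // at the median of that channel, until we have paletteSize boxes.
        while (boxes.size() < paletteSize) {
            List<Integer> box = boxes.remove(widestBox(boxes));
            final int channel = widestChannel(box);
            box.sort(Comparator.comparingInt(p -> channelOf(p, channel)));
            int mid = box.size() / 2;
            boxes.add(new ArrayList<>(box.subList(0, mid)));
            boxes.add(new ArrayList<>(box.subList(mid, box.size())));
        }

        // Step 5: average each final box to get its palette color.
        int[] palette = new int[boxes.size()];
        for (int i = 0; i < boxes.size(); i++) palette[i] = average(boxes.get(i));
        return palette;
    }

    // c = 0 -> R, 1 -> G, 2 -> B
    private static int channelOf(int p, int c) {
        return (p >> (16 - 8 * c)) & 0xFF;
    }

    private static int range(List<Integer> box, int c) {
        int min = 255, max = 0;
        for (int p : box) {
            int v = channelOf(p, c);
            min = Math.min(min, v);
            max = Math.max(max, v);
        }
        return max - min;
    }

    private static int widestChannel(List<Integer> box) {
        int best = 0;
        for (int c = 1; c < 3; c++) {
            if (range(box, c) > range(box, best)) best = c;
        }
        return best;
    }

    private static int widestBox(List<List<Integer>> boxes) {
        int best = 0, bestRange = -1;
        for (int i = 0; i < boxes.size(); i++) {
            int r = range(boxes.get(i), widestChannel(boxes.get(i)));
            if (r > bestRange) { bestRange = r; best = i; }
        }
        return best;
    }

    private static int average(List<Integer> box) {
        long r = 0, g = 0, b = 0;
        for (int p : box) { r += (p >> 16) & 0xFF; g += (p >> 8) & 0xFF; b += p & 0xFF; }
        int n = Math.max(1, box.size());
        return 0xFF000000 | ((int) (r / n) << 16) | ((int) (g / n) << 8) | (int) (b / n);
    }
}
```

For the ambient use case only the palette itself is needed (one dominant color per region), so the final pixel-to-palette mapping step can be dropped entirely.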

3. Platform Rendering of the Ambient Effect

3.1 Android

A custom view AmbientView is placed on both sides of the video container. The layout (simplified) is:

<FrameLayout>
    <com.baidu.cyberplayer.sdk.AmbientView
        android:id="@+id/left_ambient"
        android:layout_width="xxxdp"
        android:layout_height="match_parent"/>
    <FrameLayout
        android:id="@+id/video_container"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"/>
    <com.baidu.cyberplayer.sdk.AmbientView
        android:id="@+id/right_ambient"
        android:layout_width="xxxdp"
        android:layout_height="match_parent"/>
</FrameLayout>

Key functions:

LinearGradient creates a vertical gradient between adjacent region colors.

private void updateGradient() {
    mLinearGradient = new LinearGradient(0, 0, 0, getHeight(),
        mColors, null, Shader.TileMode.CLAMP);
    mPaint.setShader(mLinearGradient);
    invalidate();
}

Color transition between frames uses a ValueAnimator with ArgbEvaluator for smooth RGB interpolation.

private void startColorAnimator() {
    // Cancel any in-flight transition so the new one starts from a clean state.
    if (mColorAnimator != null && mColorAnimator.isRunning()) {
        mColorAnimator.cancel();
    }
    final int[] lastColors = mLastColors.clone();
    mColorAnimator = ValueAnimator.ofFloat(0f, 1f);
    mColorAnimator.setDuration(1500);
    mColorAnimator.addUpdateListener(animation -> {
        float progress = (float) animation.getAnimatedValue();
        interpolateColors(progress, lastColors);
        updateGradient();
    });
    mColorAnimator.start();
}

private void interpolateColors(float progress, int[] lastColors) {
    if (mCurColors == null || mCurColors.length == 0) return;
    ArgbEvaluator evaluator = new ArgbEvaluator();
    for (int i = 0; i < mCurColors.length; i++) {
        mColors[i] = (int) evaluator.evaluate(progress, lastColors[i], mCurColors[i]);
    }
}

Finally, a dark mask gradient prevents the ambient area from being too distracting.

float[] mPositions = {0.0f, 1.0f};
int[] mMaskColors = {0x88000000, 0xff000000};
mMaskLinearGradient = new LinearGradient(0, 0, getWidth(), 0,
    mMaskColors, mPositions, Shader.TileMode.CLAMP);
mMaskPaint.setShader(mMaskLinearGradient);
canvas.drawRect(0, 0, getWidth(), getHeight(), mMaskPaint);

3.2 iOS

The iOS side also provides an AmbientView, built with two CAGradientLayer sublayers: a color gradient layer and a darkening mask layer.

- (void)setupSubLayers {
    _gradientLayer = [CAGradientLayer layer];
    _gradientLayer.frame = self.bounds;
    [self.layer addSublayer:_gradientLayer];

    _maskLayer = [CAGradientLayer layer];
    _maskLayer.frame = self.bounds;
    [self.layer addSublayer:_maskLayer];
}

Animation uses CADisplayLink to drive per‑frame color interpolation:

- (void)startAnimation {
    self.startTime = CACurrentMediaTime(); // updateColors measures progress from this timestamp
    self.displayLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(updateColors)];
    [self.displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
}

- (void)updateColors {
    CGFloat progress = MIN(1.0, (CACurrentMediaTime() - self.startTime) / self.animationDuration);
    NSMutableArray *interpolated = [NSMutableArray array];
    for (NSUInteger i = 0; i < self.endColors.count; i++) {
        UIColor *from = i < self.startColors.count ? self.startColors[i] : [UIColor clearColor];
        UIColor *to = self.endColors[i];
        [interpolated addObject:(__bridge id)[self interpolateFrom:from to:to progress:progress].CGColor];
    }
    _gradientLayer.colors = interpolated;
    if (progress >= 1.0) {
        // Transition complete: stop the display link so it no longer fires every frame.
        [self.displayLink invalidate];
        self.displayLink = nil;
    }
}

- (UIColor *)interpolateFrom:(UIColor *)from to:(UIColor *)to progress:(CGFloat)progress {
    CGFloat fr, fg, fb, fa, tr, tg, tb, ta;
    [from getRed:&fr green:&fg blue:&fb alpha:&fa];
    [to getRed:&tr green:&tg blue:&tb alpha:&ta];
    return [UIColor colorWithRed:fr + (tr - fr) * progress
                           green:fg + (tg - fg) * progress
                            blue:fb + (tb - fb) * progress
                           alpha:fa + (ta - fa) * progress];
}

A multi‑step mask gradient with an accelerated curve creates a natural edge transition.

- (void)makeMaskColorsAndLocations {
    const NSInteger steps = 6;
    // r/g/b/a are the base mask color components (e.g., black starting at half opacity).
    CGFloat r = 0, g = 0, b = 0, a = 0.5;
    _maskColors = [NSMutableArray array];
    _maskColorsLocations = [NSMutableArray array];
    for (NSInteger i = 0; i < steps; i++) {
        CGFloat t = (CGFloat)i / (steps - 1);
        CGFloat acceleratedT = t * t; // accelerated curve: darkening ramps up toward the outer edge
        CGFloat currentAlpha = a + (1.0 - a) * acceleratedT;
        UIColor *color = [UIColor colorWithRed:r green:g blue:b alpha:currentAlpha];
        [_maskColors addObject:(__bridge id)color.CGColor];
        [_maskColorsLocations addObject:@(t)];
    }
    _maskLayer.colors = _maskColors;
    _maskLayer.locations = _maskColorsLocations;
    _maskLayer.startPoint = CGPointMake(0, 0);
    _maskLayer.endPoint = CGPointMake(1, 0);
}

4. Effect Demonstration

Both Baidu App and Haokan App have launched the ambient mode. Metrics such as watch time, distribution, and completion rate improved, confirming a better immersive experience.

Figures from the original article:

YouTube vertical ambient mode

YouTube horizontal ambient mode

Overall solution diagram

Thread responsibilities

Video region division

Algorithm comparison

Median cut flow

Baidu App ambient mode

Haokan App ambient mode

5. Summary

The ambient mode solves the black‑bar problem by dynamically extending the video’s dominant colors to the surrounding area, delivering three main benefits:

Visual immersion: Soft background colors blend the video with the screen edges, reducing visual separation.

Comfortable viewing: Gradual color transitions lower eye strain during long sessions.

Enhanced perception: Matching background tones highlight the video content and create a harmonious visual experience.

Developers can follow the presented pipeline—frame sampling, median‑cut color extraction, and platform‑specific gradient rendering—to implement high‑quality ambient effects similar to those used by leading video platforms.
