
How to Add a Watermark to Android Video Using MediaCodec and OpenGL

This guide explains how to decode a video, render a watermark with OpenGL, re‑encode the frames, and mux them into a new file on Android, covering texture types, viewport positioning, blending, and the required MediaCodec and MediaMuxer code.


The workflow has three steps:

1. Decode video: configure MediaCodec as a decoder with an OutputSurface, decode video frames to the Surface, and use OpenGL to render each frame.

2. Encode video: configure MediaCodec as an encoder with an InputSurface and encode the frames rendered by OpenGL.

3. Compose video: use MediaMuxer to combine the encoded data into the final video file.
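The decoder and encoder wiring behind the first two steps can be sketched as follows. This is our illustration, not the article's Mp4Converter code; the width, height, bitrate, and frame-rate values are assumptions, and the key ordering constraint is that createInputSurface() must be called after configure() and before start():

```java
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.view.Surface;

public class PipelineSetup {
    static final int TIMEOUT_US = 10_000;  // dequeue timeout used by the drain loop

    MediaCodec encoder;
    Surface encoderInputSurface;  // wrap this in an EGL window surface for OpenGL rendering

    /** Encoder: frames arrive via the Surface returned by createInputSurface(). */
    void setUpEncoder(int width, int height) throws Exception {
        MediaFormat format = MediaFormat.createVideoFormat(
                MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 2_000_000);   // illustrative value
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
        encoder = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        encoderInputSurface = encoder.createInputSurface();  // must precede start()
        encoder.start();
    }

    /** Decoder: output goes to the Surface backed by CodecOutputSurface's SurfaceTexture. */
    MediaCodec setUpDecoder(MediaFormat trackFormat, Surface outputSurface) throws Exception {
        MediaCodec decoder = MediaCodec.createDecoderByType(
                trackFormat.getString(MediaFormat.KEY_MIME));
        decoder.configure(trackFormat, outputSurface, null, 0);
        decoder.start();
        return decoder;
    }
}
```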

On Android, decoded video frames and camera output are delivered through SurfaceTexture as the special texture target GL_TEXTURE_EXTERNAL_OES, while the watermark image uses a regular GL_TEXTURE_2D texture. The watermark's position and size within the frame are set with GLES20.glViewport before the watermark quad is drawn:

GLES20.glViewport(bitmap.mOffX, bitmap.mOffY, bitmap.mWidth, bitmap.mHeight);
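The offsets themselves are plain arithmetic. As an illustration (this helper and the margin parameter are ours, not from the original code), anchoring the watermark at the top-right of the frame could look like the following. Note that glViewport's origin is the bottom-left corner, so the y offset counts up from the bottom edge:

```java
/** Hypothetical helper: computes glViewport arguments (x, y, width, height)
 *  for a watermark anchored at the top-right of the video frame. */
public class WatermarkLayout {
    public static int[] topRight(int videoW, int videoH, int wmW, int wmH, int margin) {
        int offX = videoW - wmW - margin;   // distance from the left edge
        int offY = videoH - wmH - margin;   // glViewport y grows upward from the bottom
        return new int[] { offX, offY, wmW, wmH };
    }
}
```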

The fragment shader for 2D textures is:

private static final String FRAGMENT_2D_SHADER =
        "precision mediump float;\n" +
        "varying vec2 vTextureCoord;\n" +
        "uniform sampler2D sTexture;\n" +
        "void main() {\n" +
        "  gl_FragColor = texture2D(sTexture, vTextureCoord);\n" +
        "}\n";

Blending is enabled so that the watermark's transparent regions let the video show through. The GL_ONE source factor assumes premultiplied alpha, which matches Android Bitmap objects, whose pixels are premultiplied by default:

GLES20.glEnable(GLES20.GL_BLEND);
GLES20.glBlendFunc(GLES20.GL_ONE, GLES20.GL_ONE_MINUS_SRC_ALPHA);

The watermark bitmap is uploaded to the currently bound GL_TEXTURE_2D texture with GLUtils.texImage2D:

GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);

The complete method that draws the watermark is:

/**
 * Draws the external texture in SurfaceTexture onto the current EGL surface.
 */
public void drawFrame(SurfaceTexture st, boolean invert, Bitmap bitmap) {
    checkGlError("onDrawFrame start");
    st.getTransformMatrix(mSTMatrix);
    if (invert) {
        mSTMatrix[5] = -mSTMatrix[5];
        mSTMatrix[13] = 1.0f - mSTMatrix[13];
    }

    GLES20.glUseProgram(mProgram);
    checkGlError("glUseProgram");

    GLES20.glEnable(GLES20.GL_BLEND);
    checkGlError("glEnable");
    GLES20.glBlendFunc(GLES20.GL_ONE, GLES20.GL_ONE_MINUS_SRC_ALPHA);
    checkGlError("glBlendFunc");

    GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
    GLES20.glBindTexture(mTargetTextureType, mTextureID);

    mTriangleVertices.position(TRIANGLE_VERTICES_DATA_POS_OFFSET);
    GLES20.glVertexAttribPointer(maPositionHandle, 3, GLES20.GL_FLOAT, false,
            TRIANGLE_VERTICES_DATA_STRIDE_BYTES, mTriangleVertices);
    checkGlError("glVertexAttribPointer maPosition");
    GLES20.glEnableVertexAttribArray(maPositionHandle);
    checkGlError("glEnableVertexAttribArray maPositionHandle");

    mTriangleVertices.position(TRIANGLE_VERTICES_DATA_UV_OFFSET);
    GLES20.glVertexAttribPointer(maTextureHandle, 2, GLES20.GL_FLOAT, false,
            TRIANGLE_VERTICES_DATA_STRIDE_BYTES, mTriangleVertices);
    checkGlError("glVertexAttribPointer maTextureHandle");
    GLES20.glEnableVertexAttribArray(maTextureHandle);
    checkGlError("glEnableVertexAttribArray maTextureHandle");

    Matrix.setIdentityM(mMVPMatrix, 0);
    GLES20.glUniformMatrix4fv(muMVPMatrixHandle, 1, false, mMVPMatrix, 0);
    GLES20.glUniformMatrix4fv(muSTMatrixHandle, 1, false, mSTMatrix, 0);

    if (bitmap != null) {
        GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);
    }

    GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
    checkGlError("glDrawArrays");

    GLES20.glFlush();

    GLES20.glDisable(GLES20.GL_BLEND);
    GLES20.glDisableVertexAttribArray(maPositionHandle);
    GLES20.glDisableVertexAttribArray(maTextureHandle);

    GLES20.glBindTexture(mTargetTextureType, 0);
    GLES20.glUseProgram(0);
}

After both the video frame and the watermark have been drawn into the OpenGL surface, calling swapBuffers() on the encoder's InputSurface submits the rendered frame to the encoder. The encoder produces compressed output, which MediaMuxer writes into the final watermarked video file.
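The encoder side of that hand-off can be sketched as a drain loop. This is our illustration rather than the article's Mp4Converter code, and it assumes the muxer has already been started and the video track added via addTrack():

```java
import android.media.MediaCodec;
import android.media.MediaMuxer;
import java.nio.ByteBuffer;

public class EncoderDrain {
    static final long TIMEOUT_US = 10_000;

    /** Pulls any pending encoded frames out of the encoder and hands them to the muxer. */
    static void drain(MediaCodec encoder, MediaMuxer muxer, int videoTrack, boolean endOfStream) {
        if (endOfStream) {
            encoder.signalEndOfInputStream();  // valid only for Surface input
        }
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        while (true) {
            int index = encoder.dequeueOutputBuffer(info, TIMEOUT_US);
            if (index == MediaCodec.INFO_TRY_AGAIN_LATER) {
                if (!endOfStream) break;       // no output yet; come back after the next frame
            } else if (index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                // Track already added in this sketch; normally addTrack()/start() happen here.
            } else if (index >= 0) {
                ByteBuffer encoded = encoder.getOutputBuffer(index);
                // Skip codec-config data (SPS/PPS); the muxer takes it from the format instead.
                if ((info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) == 0 && info.size > 0) {
                    muxer.writeSampleData(videoTrack, encoded, info);
                }
                encoder.releaseOutputBuffer(index, false);
                if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) break;
            }
        }
    }
}
```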

Conversion Code

You can test the process with the following call:

new Mp4Converter().convert("/sdcard/test.mp4", "/sdcard/test_output.mp4");

Mp4Converter.java

Refer to the article “Android Video Processing – MediaCodec 5 – Generating MP4 Video”.

Drawing Code

CodecOutputSurface.java

The full implementation of CodecOutputSurface and its inner class STextureRender provides the EGL setup, surface creation, frame acquisition, and rendering logic required for the watermarking workflow.

package demo.mediacodec.xueting.com.mediacodecdemo;

import android.graphics.Bitmap;
import android.graphics.SurfaceTexture;
import android.opengl.EGL14;
import android.opengl.EGLConfig;
import android.opengl.EGLContext;
import android.opengl.EGLDisplay;
import android.opengl.EGLSurface;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import android.opengl.GLUtils;
import android.opengl.Matrix;
import android.util.Log;
import android.view.Surface;

import java.io.BufferedOutputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import java.util.ArrayList;
import java.util.List;

/**
 * Holds state associated with a Surface used for MediaCodec decoder output.
 */
public class CodecOutputSurface implements SurfaceTexture.OnFrameAvailableListener {
    private static final String TAG = CodecOutputSurface.class.getSimpleName();
    private static final boolean DEBUG = true;

    private STextureRender mTextureRender;
    private STextureRender mTextureRender2D;
    private SurfaceTexture mSurfaceTexture;
    private Surface mSurface;

    private EGLDisplay mEGLDisplay = EGL14.EGL_NO_DISPLAY;
    private EGLContext mEGLContext = EGL14.EGL_NO_CONTEXT;
    private EGLSurface mEGLSurface = EGL14.EGL_NO_SURFACE;
    int mWidth;
    int mHeight;

    private final Object mFrameSyncObject = new Object();
    private boolean mFrameAvailable;

    private ByteBuffer mPixelBuf;
    private List<Texture2DBitmap> mExternalImgs;

    // ... (rest of the class implementation as shown in the source) ...
}
Written by Qizhuo Club, a 360 Mobile tech channel sharing practical experience and original insights from 360 Mobile Security and other teams across Android, iOS, big data, AI, and more.