
OpenGL ES Live Whiteboard Demo: Multi‑Channel Rendering vs Multi‑Texture Techniques

This article introduces a live‑whiteboard demo built with OpenGL ES for iOS and Android, explains the implementation of multi‑channel rendering and multi‑texture approaches, compares their performance and flexibility, and provides complete GLSL and OpenGL code snippets for developers.

Sohu Tech Products

The author presents the third installment of the OpenGL ES laboratory series, focusing on a live‑whiteboard application used in the Sohu Video app, which mixes camera capture data with pre‑loaded whiteboard images for interactive teaching.

The demo includes two rendering methods: multi‑channel rendering (multiple draw calls) for a large whiteboard and a small overlay, and multi‑texture rendering (single draw call) that selects the appropriate texture in the fragment shader.

Multi‑Channel Rendering draws the whiteboard first, then the camera feed, adjusting the viewport for each layer. The essential OpenGL calls are:

// First pass: draw the whiteboard layer across the full frame
glViewport(0, 0, (int)_frameWidth, (int)_frameHeight);
glActiveTexture(GL_TEXTURE1);
glBindTexture(GL_TEXTURE_2D, _texId1);
glUniform1i(_filterInputTextureUniform, 1);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

// Second pass: draw the camera feed as a small overlay
// (the demo keeps the overlay square, so the width is used for both dimensions)
CGFloat d = 4;
glViewport(0, 0, (int)(_frameWidth / d), (int)(_frameWidth / d));
glActiveTexture(GL_TEXTURE2);
glBindTexture(GL_TEXTURE_2D, _texId2);
glUniform1i(_filterInputTextureUniform2, 2);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

// Restore the full-frame viewport for subsequent rendering
glViewport(0, 0, (int)_frameWidth, (int)_frameHeight);
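The overlay viewport arithmetic above is worth stating plainly. A minimal CPU-side sketch in C (helper and type names are illustrative, not from the demo; it mirrors the second `glViewport` call, where the overlay is a square scaled down from the frame width by the divisor `d`):

```c
#include <assert.h>

/* Hypothetical helper mirroring the overlay pass above: a square
 * viewport at the origin whose side is the frame width divided by d. */
typedef struct { int x, y, w, h; } Viewport;

static Viewport overlay_viewport(int frameWidth, int d) {
    Viewport v = { 0, 0, frameWidth / d, frameWidth / d };
    return v;
}
```

With a 1125-pixel-wide frame and `d = 4`, the overlay is a 281x281 square anchored at the lower-left corner (the OpenGL viewport origin is bottom-left, not top-left as on iOS/Android screens).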

The fragment shader outputs the sampled pixel, swizzling the BGRA source to RGBA:

NSString *const rgbFragmentShaderString = SHADER_STRING(
    varying highp vec2 v_texcoord;
    uniform sampler2D inputImageTexture;
    void main() {
        gl_FragColor = vec4(texture2D(inputImageTexture, v_texcoord).bgr, 1);
    }
);

Multi‑Texture Rendering binds two textures once and decides which one to sample based on the texture coordinate:

// Bind both textures up front, then issue a single draw call
glViewport(0, 0, (int)_frameWidth, (int)_frameHeight);
glActiveTexture(GL_TEXTURE1);
glBindTexture(GL_TEXTURE_2D, _texId1);
glUniform1i(_filterInputTextureUniform, 1);   // first sampler -> unit 1
glActiveTexture(GL_TEXTURE2);
glBindTexture(GL_TEXTURE_2D, _texId2);
glUniform1i(_filterInputTextureUniform2, 2);  // second sampler -> unit 2
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
The corresponding fragment shader then selects between the two samplers per fragment:

NSString *const rgbFragmentShaderString2 = SHADER_STRING(
    varying highp vec2 v_texcoord;
    uniform sampler2D inputImageTexture;
    uniform sampler2D inputImageTexture2;
    void main() {
        if (v_texcoord.y < 0.5) {
            gl_FragColor = vec4(texture2D(inputImageTexture2, v_texcoord).bgr, 1);
        } else {
            gl_FragColor = vec4(texture2D(inputImageTexture, v_texcoord).bgr, 1);
        }
    }
);

The article compares the two methods: multi‑channel rendering requires multiple glDrawArrays calls and allows viewport changes per layer, while multi‑texture rendering performs a single draw but needs more complex shader logic and cannot change the viewport per texture.
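The behavioral difference is easiest to see in the multi-texture shader's branch. A CPU-side sketch of that per-fragment selection (illustrative only; in the demo the decision runs in GLSL on the GPU):

```c
#include <assert.h>

/* Which sampler the multi-texture fragment shader reads for a given
 * texture coordinate: the lower half of the quad (y < 0.5) shows the
 * second texture, the upper half the first. */
enum TexSource { TEX_FIRST = 1, TEX_SECOND = 2 };

static enum TexSource sample_source(float texcoordY) {
    return (texcoordY < 0.5f) ? TEX_SECOND : TEX_FIRST;
}
```

This is also why the split line is fixed at `y = 0.5`: moving or resizing a layer means editing shader logic, whereas in multi-channel rendering it is just a different `glViewport` call.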

Texture caching is demonstrated by uploading the whiteboard image (pre-decoded from PNG to raw RGBA bytes) into a shared GL texture once, then binding it for rendering:

glGenTextures(1, &_texId2);
glBindTexture(GL_TEXTURE_2D, _texId2);
// Linear filtering and clamp-to-edge wrapping; ES 2.0 requires
// clamp-to-edge for non-power-of-two textures such as this one
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
// Load the raw RGBA whiteboard bytes and upload them once
unsigned char *pBGRAImageIn = NULL;
[QHUtil input:@"WhiteBoard_rgba" ofType:@"rgb" len:&pBGRAImageIn];
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (GLsizei)1125, (GLsizei)2436, 0, GL_RGBA, GL_UNSIGNED_BYTE, pBGRAImageIn);
glBindTexture(GL_TEXTURE_2D, 0);
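The `glTexImage2D` call above expects a tightly packed RGBA8 buffer. A quick sanity check of the size the loader must provide (4 bytes per pixel):

```c
#include <assert.h>
#include <stddef.h>

/* Bytes required for a tightly packed RGBA8 image, matching the
 * GL_RGBA + GL_UNSIGNED_BYTE upload above. */
static size_t rgba8_bytes(size_t width, size_t height) {
    return width * height * 4;
}
```

For the 1125x2436 whiteboard this is 10,962,000 bytes, roughly 10.5 MB per upload, which is exactly why the demo uploads the image once and caches the texture instead of re-uploading it every frame.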

On Android, the same two approaches apply: the demo implements multi-channel rendering in Java and multi-texture rendering in C++ via the NDK; the GLSL shaders are identical on both platforms.

In practice, live streaming whiteboards constantly receive new image data, making texture updates a performance bottleneck. The author recommends multi‑channel rendering for live streams that require more than two layers (watermarks, nicknames, logos) because of its better extensibility.
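That extensibility argument can be sketched as a simple layer list: each additional element (watermark, nickname, logo) is just one more pass with its own viewport and texture, with no shader changes. Structure and names below are illustrative, not from the demo:

```c
#include <assert.h>
#include <stddef.h>

/* One pass per layer: a viewport plus the texture drawn in that pass. */
typedef struct {
    int x, y, w, h;   /* viewport rectangle for this layer */
    unsigned texId;   /* GL texture bound before the draw */
} Layer;

/* In the real renderer each iteration would issue glViewport,
 * glBindTexture and glDrawArrays; here we only count the passes. */
static int draw_layers(const Layer *layers, size_t count) {
    int drawCalls = 0;
    for (size_t i = 0; i < count; ++i) {
        /* glViewport(layers[i].x, layers[i].y, layers[i].w, layers[i].h); */
        /* glBindTexture(GL_TEXTURE_2D, layers[i].texId); */
        /* glDrawArrays(GL_TRIANGLE_STRIP, 0, 4); */
        drawCalls++;
    }
    return drawCalls;
}
```

Adding a logo is appending one entry; the cost is one extra draw call per layer per frame, which is usually cheap next to the per-frame texture uploads a live stream already performs.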

The series concludes with thanks to contributors and links to previous articles and resources.

Tags: Graphics · iOS · Android · OpenGL ES · Live Whiteboard · Multi-Channel Rendering · Multi-Texture
Written by

Sohu Tech Products

A knowledge-sharing platform for Sohu's technology products. As a leading Chinese internet brand with media, video, search, and gaming services and over 700 million users, Sohu continuously drives tech innovation and practice. We’ll share practical insights and tech news here.
