
Creating Animated Wave Effects with WebGL and requestAnimationFrame

This tutorial walks through building a dynamic wave animation in WebGL: converting canvas pixel coordinates to clip space, generating point vertices with sine functions, driving smooth updates with requestAnimationFrame, and customizing point sizes via buffer attributes, with complete JavaScript snippets and shader modifications.

JD Tech

This article continues the WebGL fundamentals series and demonstrates how to render a moving wave pattern. The first step is a correctly sized <canvas id="glcanvas" width="700" height="500"></canvas> element; the width and height attributes must be set directly, because sizing the canvas with CSS alone stretches the drawing buffer and distorts the WebGL coordinate system.

Because WebGL clip-space coordinates range from -1 to 1 with the origin at the canvas center, helper functions convert pixel offsets (measured from that center) into WebGL space:

function webglX(num) { return num / (width / 2); }  // pixel offset → clip-space x
function webglY(num) { return num / (height / 2); } // pixel offset → clip-space y
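For the 700×500 canvas above, these helpers map pixel offsets from the center onto the -1…1 clip-space range. A quick check of the edge cases:

```javascript
var width = 700, height = 500;  // must match the canvas attributes

function webglX(num) { return num / (width / 2); }
function webglY(num) { return num / (height / 2); }

console.log(webglX(350));   // 1  → right edge
console.log(webglY(-250));  // -1 → bottom edge
console.log(webglX(0));     // 0  → center
```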

Animation is driven by requestAnimationFrame, which synchronizes redraws with the browser’s refresh rate. A simple loop decrements a counter, generates new vertex data, and uploads it each frame:

var num = 0;                       // wave phase counter, in "degrees"
function render() {
    requestAnimationFrame(render); // schedule the next repaint
    num = num - 1;                 // shift the phase each frame
    var data = createPoints(num);
    setPoints(data, 1000);         // 1000 points = 100 columns × 10 rows
}
render();                          // kick off the animation loop
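Decrementing num once per frame ties the wave’s speed to the display’s refresh rate. For frame-rate-independent motion, the phase can instead be derived from the timestamp requestAnimationFrame passes to its callback. The helper below is a sketch (not from the original article) that advances 60 “degrees” of phase per second, matching num = num - 1 on a 60 Hz display:

```javascript
// Phase counter from elapsed milliseconds: -60 degrees per second,
// equivalent to decrementing `num` once per frame at 60 fps.
function phaseAt(elapsedMs) {
    return -elapsedMs * 60 / 1000;
}

// Usage inside the loop, with the timestamp requestAnimationFrame provides:
//   function render(timestamp) {
//       requestAnimationFrame(render);
//       setPoints(createPoints(phaseAt(timestamp)), 1000);
//   }

console.log(phaseAt(1000));  // -60
console.log(phaseAt(500));   // -30
```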

The createPoints function builds a grid of points, applying a sine wave to the y‑coordinate and optionally to the x‑coordinate for richer motion. Degrees are converted to radians with:

function numToDeg(num) { return Math.PI * num / 180; }  // degrees → radians
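Despite its name, numToDeg converts degrees to radians, which is the unit Math.sin expects:

```javascript
function numToDeg(num) { return Math.PI * num / 180; }

console.log(numToDeg(180) === Math.PI);  // true — a half turn
console.log(Math.sin(numToDeg(90)));     // 1 — sine peaks at 90°
```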

A representative vertex‑generation snippet looks like this:

function createPoints(gap) {
    var amplitude = 30;            // wave height in pixels
    var arr = [];
    var n = 100, m = 10;           // 100 columns × 10 rows = 1000 points
    for (var i = 0; i < n; i++) {
        for (var j = 0; j < m; j++) {
            // columns spread across the width, rows stacked up from the bottom,
            // with a sine offset on y whose phase shifts as `gap` changes
            var x = webglX(-(width / 2) + i * (width / n));
            var y = webglY(-(height / 2) + j * 20
                    + amplitude * Math.sin(numToDeg(gap + i * 10)));
            var z = -1;
            arr = arr.concat([x, y, z]);
        }
    }
    return new Float32Array(arr);
}
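The sine term on its own determines the wave’s shape in pixels, so it is worth isolating. The sketch below uses illustrative values (a 30 px amplitude and a 10-degree phase step per column, not mandated by the article): the column index i shifts the phase along x, and gap moves the whole wave over time.

```javascript
function numToDeg(num) { return Math.PI * num / 180; }

// Pixel offset added to a point's y-coordinate for column `i` at phase `gap`.
function waveOffset(gap, i, amplitude) {
    return amplitude * Math.sin(numToDeg(gap + i * 10));
}

console.log(waveOffset(0, 0, 30));  // 0  — the wave starts at zero
console.log(waveOffset(0, 9, 30));  // 30 — peak a quarter-cycle later
```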

Uploading the vertices to the GPU is handled by setPoints, which creates a buffer, binds it, defines the attribute layout, clears the canvas, and draws the points:

function setPoints(data, num) {
    var vertexBuffer = gl.createBuffer();
    if (!vertexBuffer) { console.log('Failed to create the buffer object.'); return -1; }
    gl.bindBuffer(gl.ARRAY_BUFFER, vertexBuffer);
    // the vertex data is replaced every frame, so DYNAMIC_DRAW is the better usage hint
    gl.bufferData(gl.ARRAY_BUFFER, data, gl.DYNAMIC_DRAW);
    var a_position = gl.getAttribLocation(gl.program, 'a_p');
    gl.vertexAttribPointer(a_position, 3, gl.FLOAT, false, 0, 0);
    gl.enableVertexAttribArray(a_position);
    gl.clearColor(0.0, 0.0, 0.0, 1.0);
    gl.clear(gl.COLOR_BUFFER_BIT);
    gl.drawArrays(gl.POINTS, 0, num);  // num counts points, not floats
}

To vary point sizes, a second buffer is created and linked to a custom shader attribute. The JavaScript side prepares the size data and calls:

function setSize(sizes, n) {
    var sizeBuffer = gl.createBuffer();
    if (!sizeBuffer) { console.log('Failed to create the buffer object.'); return -1; }
    gl.bindBuffer(gl.ARRAY_BUFFER, sizeBuffer);
    gl.bufferData(gl.ARRAY_BUFFER, sizes, gl.STATIC_DRAW);
    // one float per point, feeding the `size` attribute in the vertex shader
    var a_pointsize = gl.getAttribLocation(gl.program, 'size');
    gl.vertexAttribPointer(a_pointsize, 1, gl.FLOAT, false, 0, 0);
    gl.enableVertexAttribArray(a_pointsize);
}
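One way to prepare that size data is to grow the size with the row index, so that upper rows render as larger points. This is a hypothetical helper (not from the original article), assuming the 100 × 10 grid from createPoints, the same column-major point order, and that the row index grows upward:

```javascript
// One Float32 size per point, emitted in the same i-outer / j-inner order
// as the position data so each size lines up with its vertex.
function createSizes(n, m) {
    var arr = [];
    for (var i = 0; i < n; i++) {
        for (var j = 0; j < m; j++) {
            arr.push(2 + j);  // bottom row 2 px, each row above 1 px larger
        }
    }
    return new Float32Array(arr);
}

var sizes = createSizes(100, 10);
console.log(sizes.length);        // 1000
console.log(sizes[0], sizes[9]);  // 2 11 — bottom vs top of the first column
```

The result would then be handed to the function above, e.g. setSize(createSizes(100, 10), 1000).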

The vertex shader must declare the attribute to receive the size value:

attribute float size;
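Put in context, a minimal vertex shader using this attribute could look like the following sketch, assuming a_p is the position attribute name that setPoints looks up:

```glsl
attribute vec4 a_p;    // per-vertex position, filled by setPoints
attribute float size;  // per-vertex point size, filled by setSize

void main() {
    gl_Position = a_p;
    gl_PointSize = size;  // drives the rendered size of each gl.POINTS vertex
}
```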

Combining these techniques yields a smooth, controllable wave animation where upper rows can have larger amplitudes or point sizes than lower rows. The full source code is available at https://github.com/jdf2e/webgl-demo.

Tags: frontend, animation, graphics, JavaScript, WebGL, Shader, requestAnimationFrame
Written by

JD Tech

Official JD technology sharing platform. All the cutting‑edge JD tech, innovative insights, and open‑source solutions you’re looking for, all in one place.
