
Implementing a Particle Effect with GPGPU in three.js

This article explains how to recreate a complex particle animation in three.js by leveraging GPGPU techniques: compute shaders simulate hundreds of thousands of particle positions on the GPU, while custom render shaders draw them in real time.

Rare Earth Juejin Tech Community

The author noticed a popular particle animation and decided to reproduce it using three.js, focusing on the underlying particle effect rather than cross‑window communication.

Because rendering a large number of moving particles with THREE.Points alone would hit performance limits, the solution uses GPGPU (General‑Purpose GPU computing) to off‑load particle position calculations to the GPU.
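To see why the data lives in a texture at all, here is a minimal plain-JavaScript sketch of how the GPGPU data texture is sized (the `512` width matches the sample code below; the rest is illustrative):

```javascript
// Each texel of the data texture stores one particle as an RGBA float
// quadruple (x, y, z, w), so a width x width texture holds width^2 particles.
const width = 512;
const count = width ** 2;       // 262144 particles
const floatsPerTexel = 4;       // RGBA channels per particle
const buffer = new Float32Array(count * floatsPerTexel);
// Updating this many positions in JavaScript every frame would stall the
// main thread; a compute shader instead updates all texels in parallel.
```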

Key steps include:

Create a GPGPU object using GPUComputationRenderer (provided by three.js).

Generate a data texture to store particle positions and fill it with random values.

Create GPGPU variables, each linked to a compute shader that defines how the positions are updated each frame.

Initialize the GPGPU system and, in the render loop, feed the computed texture back to the particle material.

Sample code for setting up the GPGPU object and data texture:

const width = 512;
const size = 256;
const count = width ** 2; // 262144 particles
const gpgpu = new kokomi.GPUComputer(this.base, { width });
const posDt = gpgpu.createTexture();
const data = posDt.image.data;
for (let i = 0; i < count; i++) { // one iteration per texel, 4 floats each
  data[i * 4 + 0] = THREE.MathUtils.randFloatSpread(size);
  data[i * 4 + 1] = THREE.MathUtils.randFloatSpread(size);
  data[i * 4 + 2] = THREE.MathUtils.randFloatSpread(size);
  data[i * 4 + 3] = 1;
}
const posVar = gpgpu.createVariable(
  "texturePosition",
  testObjectComputeShader,
  posDt,
  { uFreq: { value: 1 } }
);
gpgpu.init();

The particle geometry uses a custom BufferGeometry with position and reference attributes, while the material is a ShaderMaterial that samples the GPGPU‑computed texture in the vertex shader:

const geometry = new THREE.BufferGeometry();
const positions = new Float32Array(count * 3);
const references = new Float32Array(count * 2);
for (let i = 0; i < width; i++) {
  for (let j = 0; j < width; j++) {
    const idx = i + j * width;
    positions[idx * 3 + 0] = Math.random();
    positions[idx * 3 + 1] = Math.random();
    positions[idx * 3 + 2] = Math.random();
    references[idx * 2 + 0] = i / width;
    references[idx * 2 + 1] = j / width;
  }
}
geometry.setAttribute("position", new THREE.BufferAttribute(positions, 3));
geometry.setAttribute("reference", new THREE.BufferAttribute(references, 2));

const material = new THREE.ShaderMaterial({
  vertexShader: testObjectVertexShader,
  fragmentShader: testObjectFragmentShader,
  uniforms: {
    texturePosition: { value: null },
    uPointSize: { value: 1 },
    uPixelRatio: { value: this.base.renderer.getPixelRatio() }
  },
  transparent: true,
  blending: THREE.AdditiveBlending,
  depthWrite: false
});
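The `reference` attribute is what ties each vertex to its texel in the position texture. This plain-JavaScript sketch (using a tiny 4×4 grid instead of 512×512, purely for illustration) verifies that the mapping from the loop above is invertible:

```javascript
// Fill "reference" UVs exactly as in the geometry setup above, then check
// that a vertex's UV recovers the (i, j) texel it was assigned.
const width = 4; // tiny stand-in for the 512 used in the article
const references = new Float32Array(width * width * 2);
for (let i = 0; i < width; i++) {
  for (let j = 0; j < width; j++) {
    const idx = i + j * width;
    references[idx * 2 + 0] = i / width;
    references[idx * 2 + 1] = j / width;
  }
}
// Vertex 6 corresponds to i = 2, j = 1 when width = 4 (6 = 2 + 1 * 4).
const u = references[6 * 2 + 0];
const v = references[6 * 2 + 1];
const texelX = Math.round(u * width); // 2
const texelY = Math.round(v * width); // 1
```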

The vertex shader samples the position texture using the reference UVs:

uniform float uPointSize;
uniform float uPixelRatio;
uniform sampler2D texturePosition;
attribute vec2 reference;
void main(){
  vec3 p = texture(texturePosition, reference).xyz;
  gl_Position = projectionMatrix * modelViewMatrix * vec4(p, 1.0);
  gl_PointSize = uPointSize * uPixelRatio;
}

The fragment shader simply outputs a solid color for each particle.
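The article does not show this shader; the color here is an arbitrary placeholder, but a minimal solid-color fragment shader would look something like:

```glsl
void main() {
  // Flat color for every particle; alpha blends additively via the material.
  gl_FragColor = vec4(0.4, 0.7, 1.0, 1.0);
}
```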

Compute shaders are then used to evolve particle positions. A basic example samples the current position and applies a curl noise function from the lygia library:

#include "/node_modules/lygia/generative/curl.glsl"
void main(){
  vec2 uv = gl_FragCoord.xy / resolution.xy;
  vec3 pos = texture(texturePosition, uv).xyz;
  pos = curl(pos);
  gl_FragColor = vec4(pos, 1.0);
}

More advanced shaders combine curl noise with fractal Brownian motion (fbm) and mix the results to create a smoother, animated sphere, optionally adding attraction forces via signed‑distance functions.
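The article does not list this shader, so the following is only an illustrative sketch: it assumes lygia's `fbm` helper, a `uTime` uniform, and arbitrary frequency/strength constants, and mixes a curl-displaced position toward a sphere surface by an fbm-driven factor (the `resolution` and `texturePosition` uniforms are injected by the GPGPU system):

```glsl
#include "/node_modules/lygia/generative/curl.glsl"
#include "/node_modules/lygia/generative/fbm.glsl"
uniform float uTime; // assumed uniform, advanced each frame from JS
void main() {
  vec2 uv = gl_FragCoord.xy / resolution.xy;
  vec3 pos = texture(texturePosition, uv).xyz;
  // Drift the particle along an animated curl-noise field.
  vec3 curled = curl(pos * 0.02 + uTime * 0.05);
  // fbm modulates how strongly the particle is pulled toward a sphere.
  float n = fbm(pos * 0.05);
  pos = mix(pos + curled, normalize(pos) * 64.0, n * 0.1);
  gl_FragColor = vec4(pos, 1.0);
}
```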

Finally, the particle system is added to the scene and updated each frame:

const points = new THREE.Points(geometry, material);
this.scene.add(points);
this.update(() => {
  const mat = points.material;
  mat.uniforms.texturePosition.value = gpgpu.getVariableRt(posVar);
});

The article concludes with suggestions for improvement, such as using more sophisticated distortion techniques and refining the entanglement animation, and provides a link to the full source code on GitHub.

Tags: JavaScript, Three.js, WebGL, Shader, Particle System, GPGPU
Written by

Rare Earth Juejin Tech Community

Juejin, a tech community that helps developers grow.
