Implementing a Naked‑Eye 3D Effect on Android with OpenGL
This article explains how to create a naked‑eye 3D visual effect in an Android app by splitting an image into three layers, using OpenGL for GPU‑accelerated rendering, registering device sensors to track rotation, applying matrix transformations, and smoothing sensor noise with a low‑pass filter.
The article describes a step‑by‑step implementation of the naked‑eye 3D effect for the Ziroom client app on Android, extending earlier Flutter, native Android, and Jetpack Compose versions.
The effect works by dividing a picture into three layers—foreground, midground, and background—and moving the outer layers in opposite directions when the device rotates, giving the illusion of depth.
OpenGL is chosen because the GPU can efficiently handle the large number of scaling and translation operations required, and rendering the layers as textures avoids black‑edge artifacts by scaling the images before moving them.
Static image rendering: First, a vertex shader and a fragment shader are defined to pass texture coordinates through and apply a transformation matrix.
// vertex shader
attribute vec4 av_Position;
attribute vec2 af_Position;
uniform mat4 u_Matrix;
varying vec2 v_texPo;
void main() {
v_texPo = af_Position;
gl_Position = u_Matrix * av_Position;
}
During GLSurfaceView creation, the shader program is loaded and the three texture images are bound to the GPU.
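The fragment shader itself is not shown in the excerpt; a minimal counterpart consistent with the `v_texPo` varying above would simply sample the bound layer texture (the sampler uniform name here is an assumption):

```glsl
// fragment shader (sampler name assumed)
precision mediump float;
varying vec2 v_texPo;          // texture coordinate from the vertex shader
uniform sampler2D s_Texture;   // the currently bound layer texture
void main() {
    gl_FragColor = texture2D(s_Texture, v_texPo);
}
```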
public class My3DRenderer implements GLSurfaceView.Renderer {
@Override
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
mProgram = loadShaderWithResource(mContext,
R.raw.projection_vertex_shader,
R.raw.projection_fragment_shader);
// load textures
texImageInner(R.drawable.bg_3d_back, mBackTextureId);
texImageInner(R.drawable.bg_3d_mid, mMidTextureId);
texImageInner(R.drawable.bg_3d_fore, mFrontTextureId);
}
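// The texImageInner helper is not shown in the excerpt; a typical version
// (names assumed) decodes the drawable and uploads its pixels to the given
// texture id so each layer can be sampled by the fragment shader:
private void texImageInner(int drawableRes, int textureId) {
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
    Bitmap bitmap = BitmapFactory.decodeResource(mContext.getResources(), drawableRes);
    GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0); // upload pixels to the GPU
    bitmap.recycle(); // the GPU now owns a copy; free the CPU-side bitmap
}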
}
The viewport is set to full screen, and a simple identity (orthographic) projection matrix suffices because the images have the same aspect ratio as the screen.
public void onSurfaceChanged(GL10 gl, int width, int height) {
GLES20.glViewport(0, 0, width, height);
Matrix.setIdentityM(mProjectionMatrix, 0);
}
Each frame draws the three layers with the same logic, differing only in the texture ID and the transformation matrix.
public void onDrawFrame(GL10 gl) {
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
GLES20.glUseProgram(mProgram);
drawLayerInner(mBackTextureId, mTextureBuffer, mBackMatrix);
drawLayerInner(mMidTextureId, mTextureBuffer, mMidMatrix);
drawLayerInner(mFrontTextureId, mTextureBuffer, mFrontMatrix);
}
private void drawLayerInner(int textureId, FloatBuffer textureBuffer, float[] matrix) {
    // Bind this layer's texture (vertex and texcoord attribute buffers are assumed
    // to be set up and enabled once during initialization).
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId);
    // Upload the layer's transformation matrix to the shader.
    GLES20.glUniformMatrix4fv(uMatrixLocation, 1, false, matrix, 0);
    // Draw the full-screen quad as a 4-vertex triangle strip.
    GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
}
Sensor handling: The accelerometer and magnetic field sensors are registered to obtain the device's rotation matrix, which is then converted to Euler angles.
mSensorManager = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
SensorEventListener mSensorEventListener = new SensorEventListener() {
@Override
public void onSensorChanged(SensorEvent event) {
// obtain rotation matrix and orientation
float[] R = new float[9];
float[] values = new float[3];
SensorManager.getRotationMatrix(R, null, mAcceleValues, mMageneticValues);
SensorManager.getOrientation(R, values);
float degreeX = (float) Math.toDegrees(values[1]);
float degreeY = (float) Math.toDegrees(values[2]);
updateMatrix(degreeX, degreeY);
}
};
mSensorManager.registerListener(mSensorEventListener, mAcceleSensor, SensorManager.SENSOR_DELAY_GAME);
// The magnetic field sensor must be registered as well, or getRotationMatrix
// never receives fresh magnetic values (field name assumed):
mSensorManager.registerListener(mSensorEventListener, mMagneticSensor, SensorManager.SENSOR_DELAY_GAME);
Matrix update: For each layer a separate 4×4 matrix is built. The background and foreground layers are translated according to the device's X/Y rotation and then uniformly scaled to avoid empty borders.
private void updateMatrix(float degreeX, float degreeY) {
// background
float maxTransXY = MAX_VISIBLE_SIDE_BACKGROUND - 1f;
float transX = (maxTransXY / MAX_TRANS_DEGREE_Y) * -degreeY;
float transY = (maxTransXY / MAX_TRANS_DEGREE_X) * -degreeX;
float[] backMatrix = new float[16];
Matrix.setIdentityM(backMatrix, 0);
Matrix.translateM(backMatrix, 0, transX, transY, 0f);
Matrix.scaleM(backMatrix, 0, SCALE_BACK_GROUND, SCALE_BACK_GROUND, 1f);
Matrix.multiplyMM(mBackMatrix, 0, mProjectionMatrix, 0, backMatrix, 0);
// mid (no movement)
Matrix.setIdentityM(mMidMatrix, 0);
// foreground
maxTransXY = MAX_VISIBLE_SIDE_FOREGROUND - 1f;
transX = (maxTransXY / MAX_TRANS_DEGREE_Y) * -degreeY;
transY = (maxTransXY / MAX_TRANS_DEGREE_X) * -degreeX;
float[] frontMatrix = new float[16];
Matrix.setIdentityM(frontMatrix, 0);
Matrix.translateM(frontMatrix, 0, -transX, -transY - 0.10f, 0f);
Matrix.scaleM(frontMatrix, 0, SCALE_FORE_GROUND, SCALE_FORE_GROUND, 1f);
Matrix.multiplyMM(mFrontMatrix, 0, mProjectionMatrix, 0, frontMatrix, 0);
}
The article also discusses two non-intuitive details: the rotation direction is opposite to the image movement, and the default device orientation for the X-axis should be set to –45° (the phone held upright in the hand) rather than 0°.
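The translation bound can be checked in isolation: scaling a layer to, say, 110% leaves a 0.1 margin in normalized device coordinates, so the full rotation range maps onto exactly that margin and no empty border is ever revealed. The constants below are assumptions for illustration; the article's actual values are not shown.

```java
public class ParallaxMath {
    // Assumed constants; the article's real values are not shown.
    static final float MAX_VISIBLE_SIDE_BACKGROUND = 1.1f; // background drawn at 110% scale
    static final float MAX_TRANS_DEGREE_X = 25f;  // X-rotation range mapped onto the margin
    static final float MAX_TRANS_DEGREE_Y = 45f;  // Y-rotation range mapped onto the margin

    // Maps a Y-axis rotation (degrees) to a horizontal NDC offset for the background.
    static float backgroundTransX(float degreeY) {
        float maxTransXY = MAX_VISIBLE_SIDE_BACKGROUND - 1f; // spare margin created by scaling
        return (maxTransXY / MAX_TRANS_DEGREE_Y) * -degreeY;
    }

    // Maps an X-axis rotation (degrees) to a vertical NDC offset for the background.
    static float backgroundTransY(float degreeX) {
        float maxTransXY = MAX_VISIBLE_SIDE_BACKGROUND - 1f;
        return (maxTransXY / MAX_TRANS_DEGREE_X) * -degreeX;
    }
}
```

At the maximum 45° rotation about Y the offset is exactly the −0.1 margin from the 10% overscale; the foreground applies the same mapping with the sign flipped, which is what makes the two layers slide apart.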
To smooth the noisy sensor data, a simple low‑pass filter is applied before converting the values to angles.
private final SensorEventListener mSensorEventListener = new SensorEventListener() {
@Override
public void onSensorChanged(SensorEvent event) {
if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
mAcceleValues = lowPass(event.values.clone(), mAcceleValues);
}
if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
mMageneticValues = lowPass(event.values.clone(), mMageneticValues);
}
// ... compute degreeX, degreeY and call updateMatrix
}
};
With these steps the final interactive 3D effect runs smoothly on Android devices, as demonstrated by the animated screenshots in the original article.
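The lowPass helper itself is not shown in the excerpt; a common exponential-smoothing form (the smoothing factor here is an assumption) looks like this:

```java
public class LowPassFilter {
    // ALPHA is an assumed smoothing factor; the original article's value is not shown.
    private static final float ALPHA = 0.25f;

    // Blends each new sensor reading with the previous filtered value;
    // the first sample passes through unchanged.
    public static float[] lowPass(float[] input, float[] output) {
        if (output == null) return input;
        for (int i = 0; i < input.length; i++) {
            output[i] = output[i] + ALPHA * (input[i] - output[i]);
        }
        return output;
    }
}
```

Smaller ALPHA values smooth more aggressively at the cost of added latency; since SENSOR_DELAY_GAME delivers samples frequently, a fairly small factor still feels responsive.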
All source code referenced in the tutorial is available at the following GitHub link:
https://github.com/qingmei2/OpenGL-demo/blob/master/app/src/main/java/com/github/qingmei2/opengl_demo/c_image_process/processor/C06Image3DProcessor.java

Sohu Tech Products
A knowledge-sharing platform for Sohu's technology products. As a leading Chinese internet brand with media, video, search, and gaming services and over 700 million users, Sohu continuously drives tech innovation and practice. We’ll share practical insights and tech news here.