
Detailed Implementation of Jenkins Job Creation and Build Optimization in ZhuanZhuan's CI Platform

This article provides a comprehensive walkthrough of how ZhuanZhuan uses Jenkins together with the custom Beetle platform to automatically generate jobs, configure config.xml, optimize front‑end builds, manage compile packages, and address Sonar scan performance issues, complete with diagrams and script examples.

转转QA

Background: Jenkins is a powerful continuous integration tool, and combined with its many plugins it becomes extremely versatile. At ZhuanZhuan, the entire CI workflow is managed by a self‑developed platform called Beetle, while Jenkins serves as the underlying scheduler for compilation tasks. A previous short article introduced the basic solution; this piece expands on the details.

Implementation Scheme: The interaction flow is illustrated below.

The following sections explain each step in depth.

Job Creation

The branch name is automatically generated from the project name (e.g., testdemo-112-45), and the Jenkins job, GitLab branch, and Beetle identifiers all share this name.

Jenkins creates a job via its API, which requires a config.xml file. Beetle retrieves the appropriate config.xml from Apollo based on the project type (fe, node, ssr, rpc, web).
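The flow above can be sketched as a small script: Beetle fills the repository URL and branch into the template fetched from Apollo, then submits it through the Jenkins `createItem` API. The template placeholders, repository address, and host names here are assumptions for illustration, not ZhuanZhuan's actual values.

```shell
# minimal config.xml template as it might come from Apollo (placeholders are assumptions)
cat > config.tpl <<'EOF'
<project>
  <scm class="hudson.plugins.git.GitSCM">
    <url>__GIT_URL__</url>
    <branch>__BRANCH__</branch>
  </scm>
</project>
EOF

JOB_NAME="testdemo-112-45"
GIT_URL="git@gitlab.xxxx.com:demo/testdemo.git"   # hypothetical repository address

# fill in repository URL and branch name before sending the file to Jenkins
sed -e "s|__GIT_URL__|$GIT_URL|" -e "s|__BRANCH__|$JOB_NAME|" config.tpl > config.xml

# create the job via the Jenkins REST API (commented out: requires a live Jenkins)
# curl -X POST "http://build.xxxx.com/jenkins/createItem?name=$JOB_NAME" \
#      -u user:api_token -H "Content-Type: application/xml" --data-binary @config.xml
```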

config.xml Structure

The main sections of config.xml include:

Discard old builds – keep only the latest five builds.

Restrict where this project can be run – assign a machine label (e.g., zhuanzhuan-java) so that adding or removing build agents automatically updates scheduling.

Source code address – repository URL and branch name are filled in by Beetle before sending the file to Jenkins.

Workspace cleanup – ensures each build starts from a clean state; permissions are tightened to prevent code leakage.

Shell command execution – defines the JDK version and the build script; the JDK version is selected per project based on Beetle configuration.
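Put together, a trimmed config.xml covering these sections might look like the sketch below. Element values are illustrative, and the exact tags depend on the Jenkins plugins installed; only the overall shape is intended.

```xml
<project>
  <!-- Discard old builds: keep only the latest five -->
  <properties>
    <jenkins.model.BuildDiscarderProperty>
      <strategy class="hudson.tasks.LogRotator">
        <daysToKeep>-1</daysToKeep>
        <numToKeep>5</numToKeep>
      </strategy>
    </jenkins.model.BuildDiscarderProperty>
  </properties>
  <!-- Restrict where this project can be run -->
  <assignedNode>zhuanzhuan-java</assignedNode>
  <canRoam>false</canRoam>
  <!-- Source code address: URL and branch filled in by Beetle -->
  <scm class="hudson.plugins.git.GitSCM">...</scm>
  <!-- Shell command execution: JDK selection and build script (paths are assumptions) -->
  <builders>
    <hudson.tasks.Shell>
      <command>export JAVA_HOME=/usr/local/jdk1.8.0
sh /home/scripts/build-rpc.sh</command>
    </hudson.tasks.Shell>
  </builders>
</project>
```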

config.xml updates – stored in Apollo and take effect immediately.

Script update strategy – build scripts on each slave are refreshed every minute via a cron job, allowing rapid bug fixes and automatic vulnerability blocking (e.g., fastjson, log4j).

Sample build‑rpc.sh script – performs pre‑build checks (log4j version, superpom, config files), runs mvn package, uploads the artifact, and notifies Beetle of the new version.
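The build‑rpc.sh flow can be sketched as follows. Only the vulnerability pre‑check is shown executable (against a stand‑in dependency listing); the Maven, upload, and notification steps are commented out because they need a real build environment, and all host names and endpoints are assumptions.

```shell
# pre-build check: refuse to package if a banned log4j version appears in the dependency tree
BANNED="log4j-core:jar:2.14"
# in the real script the listing comes from: mvn dependency:tree > deps.txt
printf 'com.example:demo:jar:1.0\norg.apache.logging.log4j:log4j-core:jar:2.17.1\n' > deps.txt
if grep -q "$BANNED" deps.txt
then
    echo "blocked: banned dependency $BANNED"
    exit 1
fi
echo "dependency check passed"

# mvn clean package                                                              # build the artifact
# curl -T target/package.tar.gz "http://pkg.xxxx.com/$JOB_NAME/package.tar.gz"   # upload (hypothetical host)
# curl -X POST "http://beetle.xxxx.com/api/newversion?job=$JOB_NAME"             # notify Beetle (hypothetical)
```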

Job Compatibility Optimizations

A general config.xml covers most cases, but special projects require overrides. If a project‑specific {projectName}.xml exists in Apollo, it takes precedence over the default configuration for that project type.
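The precedence rule amounts to a simple lookup, sketched below with local files standing in for Apollo (the directory layout and file names are assumptions):

```shell
# decide which config template to use for a project (layout is hypothetical)
pick_config() {
    # $1: project name, $2: project type (fe, node, ssr, rpc, web)
    if [ -f "configs/$1.xml" ]
    then
        echo "configs/$1.xml"     # project-specific override wins
    else
        echo "configs/$2.xml"     # fall back to the per-type default
    fi
}

mkdir -p configs
touch configs/rpc.xml configs/special-project.xml
pick_config special-project rpc   # -> configs/special-project.xml
pick_config testdemo rpc          # -> configs/rpc.xml
```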

Front‑end Build Acceleration

Static‑resource projects spend a lot of time on npm install. The optimization decides whether to reuse node_modules based on timestamps:

Retrieve the timestamp of the last successful build via http://build.xxxx.com/jenkins/job/$JOB_NAME/lastSuccessfulBuild/api/json?pretty=true .

Compare the modification times of package.json and package-lock.json with the last successful build time; run npm install only if they are newer.

If a previous build failed and node_modules may be incomplete, provide a parameterized option to force a clean install.

Use high‑performance physical build machines for the npm run build step.

The core script is shown below.

# fetch metadata of the last successful build
curl -s "http://build.xxxx.com/jenkins/job/$JOB_NAME/lastSuccessfulBuild/api/json?pretty=true" > build.info
buildresult=$(grep -w "result" build.info | awk -F: '{print $2}' | tr -d ' ",')
lastbuildtime=$(grep -w -m 1 "timestamp" build.info | awk -F: '{print $2}' | tr -d ' ,')
# Jenkins timestamps are in milliseconds; convert file mtimes (seconds, GNU stat) to match
pkjsonmodifytime=$(( $(stat -c %Y package.json) * 1000 ))
pklkjsonmodifytime=$(( $(stat -c %Y package-lock.json) * 1000 ))

# a project-supplied cstatic.sh has highest priority and takes over the whole build
if [ -f cstatic.sh ]
then
    sh -x cstatic.sh
else
    if [ -d node_modules ]
    then
        # parameterized build: force a clean install when reuse is disabled
        if [ "$use_old_node_modules" = "false" ]
        then
            rm -rf node_modules/
            npm install
        else
            if [ "$buildresult" = "SUCCESS" ]
            then
                # last build succeeded and old node_modules is allowed:
                # reinstall only if dependencies changed after the last successful build
                if [ "$pklkjsonmodifytime" -gt "$lastbuildtime" ] || [ "$pkjsonmodifytime" -gt "$lastbuildtime" ]
                then
                    npm install
                fi
            else
                # previous build failed: clean packages prone to partial installs
                rm -rf node_modules/node-sass/
                npm install
            fi
        fi
    else
        npm install
    fi
    npm run build
fi

The variable use_old_node_modules is passed as a parameter during the build.

Additional front‑end commands such as yarn are also supported, and projects can provide a custom cstatic.sh script for extra processing.

To handle runtime version differences between projects, a project may include an .nvmrc file specifying the required Node.js version.
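A minimal sketch of how the build script could honor .nvmrc; the version number is an example, and the nvm invocation is commented out because it requires nvm on the build agent:

```shell
# .nvmrc at the project root pins the Node.js version (example value)
echo "16.20.0" > .nvmrc

# in the build script, switch versions before npm runs
if [ -f .nvmrc ]
then
    required=$(cat .nvmrc)
    echo "using Node.js $required"
    # nvm install "$required" && nvm use   # nvm reads .nvmrc automatically
fi
```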

Compile Package Management

After a successful build, the artifact is uploaded to a package management server; Jenkins itself does not retain the output. Packages contain compiled code, configuration files for offline, sandbox, and production environments, enabling reuse across multiple deployments.

It is recommended to name the package simply package.tar.gz (e.g., /xxx/testdemo-112-45/package.tar.gz) because the branch path already provides uniqueness, simplifying deployment scripts.
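The packaging and naming convention can be sketched like this; the directory names and upload host are assumptions, with /xxx kept as the placeholder root from the example above:

```shell
JOB_NAME="testdemo-112-45"
PKG_ROOT="/xxx"   # placeholder package-server root, as in the example path

# bundle compiled output plus per-environment config into one archive
mkdir -p dist conf/offline conf/sandbox conf/production
tar -czf package.tar.gz dist conf

# the branch-named directory keeps the fixed file name unique
target="$PKG_ROOT/$JOB_NAME/package.tar.gz"
echo "uploading to $target"
# curl -T package.tar.gz "http://pkg.xxxx.com$target"   # hypothetical upload endpoint
```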

Beetle works with the release system to clean up old packages and historical releases.

Build Machine Management

Refer to the earlier article “ZhuanZhuan Continuous Integration – Compilation Solution” for details on managing build agents.

Existing Issues

SonarQube scanning is also triggered via Jenkins, but the scan duration is currently too long. The team is investigating alternatives such as BlueShield and CodeCC, and welcomes suggestions.

Summary

At ZhuanZhuan, Jenkins is used solely as a lightweight compilation scheduler; all records and packages are managed by other systems, allowing easy cleanup and preventing storage bloat. While Sonar scan performance remains a challenge, ongoing research aims to find better solutions.
