Slash Webpack Build Time: Persistent Caching, Node Upgrade & Rust‑Based SWC
This article explains how a large cloud‑based project reduced its 14‑minute Webpack build to a fraction of the time by upgrading Node, enabling Webpack 5's filesystem cache, adopting Rust‑based SWC, and applying advanced cache strategies such as unsafeCache, safeCache, and LRU management.
Author: Ding Nan. I don’t write code; I move code.
Background
Our cloud‑based static project contains a very large codebase and depends on roughly 100 npm packages; a single build takes about 14 minutes.
Our self‑developed tool, hammer, can speed up deployment, but it requires manual steps.
The server that stores local build artifacts keeps filling up, forcing us to clean out old files regularly, which is inconvenient.
Solution Approach
Upgrade Node from 8.x to 12.x.
Enable Webpack 5's persistent (filesystem) caching, which speeds up rebuilds by up to 7×.
Replace Babel with the Rust‑based SWC compiler; our tests show a build‑time reduction of about 1.5 minutes, though the SWC ecosystem is not yet production‑ready.
Key Code
<code>module.exports = {
  // ...
  cache: {
    // Use filesystem cache instead of the default memory cache
    type: 'filesystem',
    buildDependencies: {
      // Re-cache when configuration files change
      config: [__filename]
    }
  },
  optimization: {
    // "single" creates a runtime chunk shared across all generated chunks
    runtimeChunk: 'single',
    moduleIds: 'deterministic',
  },
}
</code>
Webpack includes boilerplate code (the runtime and manifest) in the entry chunk. If it is not extracted into its own chunk, output file names change even when the source code does not, breaking cache hits. Webpack 4 used HashedModuleIdsPlugin to keep module IDs stable; Webpack 5 enables moduleIds: 'deterministic' by default.
Cache Methods (From the Build Perspective)
Webpack V4
cache-loader: place before expensive loaders such as babel-loader or vue-loader.
dll: generate a dynamic link library for rarely‑changed dependencies (e.g., react, lodash) using DllPlugin and DllReferencePlugin; at runtime, modules are required via __webpack_require__.
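To make the cache-loader placement concrete, here is a minimal webpack 4 rule sketch. The loader must sit before the expensive loaders it shields, because loaders run right-to-left and cache-loader short-circuits the chain on a hit. The paths and rule shapes are illustrative, not taken from the project's actual config.

```javascript
// webpack 4 sketch: cache-loader caches the output of the loaders
// that follow it, so babel-loader / vue-loader are skipped on a hit.
const path = require('path');

module.exports = {
  module: {
    rules: [
      {
        test: /\.js$/,
        // cache-loader first: it reads/writes the cache, then defers
        // to babel-loader only when the cached entry is stale.
        use: ['cache-loader', 'babel-loader'],
        include: path.resolve(__dirname, 'src'),
      },
      {
        test: /\.vue$/,
        use: ['cache-loader', 'vue-loader'],
        include: path.resolve(__dirname, 'src'),
      },
    ],
  },
};
```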
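The dll setup can be sketched as a separate, one-off build config; the entry name, output paths, and library naming below are assumptions for illustration.

```javascript
// webpack.dll.config.js (sketch): bundle rarely-changed vendors once.
// The emitted manifest maps module requests to IDs inside the DLL.
const path = require('path');
const webpack = require('webpack');

module.exports = {
  entry: { vendor: ['react', 'lodash'] },
  output: {
    path: path.resolve(__dirname, 'dll'),
    filename: '[name].dll.js',
    // Expose the DLL bundle under a global the main build can find.
    library: '[name]_library',
  },
  plugins: [
    new webpack.DllPlugin({
      name: '[name]_library',
      path: path.resolve(__dirname, 'dll/[name].manifest.json'),
    }),
  ],
};
```

The main build then references the prebuilt bundle with `new webpack.DllReferencePlugin({ manifest: require('./dll/vendor.manifest.json') })`, so those vendors are resolved through the manifest instead of being recompiled.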
Webpack V5
Filesystem cache, as shown in the key code above. In development webpack still uses MemoryCachePlugin, while production builds switch to IdleFileCachePlugin, which offers far better recompilation speed than cache-loader.
Some Principles
Webpack 5’s persistent cache optimizes the entire build pipeline. When a file changes, only the affected part of the dependency tree is recompiled, yielding up to a 98% speed increase for a 16,000‑module SPA. The cache is stored on disk.
In a continuous build process, the first run performs a full compilation and serializes the artifacts to disk. Subsequent builds read the disk cache, validate modules, and unpack module contents. Module relationships are still verified each build, using the same dependency‑analysis logic as a fresh compilation.
Resolver caches can also be persisted. When a resolver cache entry validates a match, it speeds up dependency lookup; if validation fails, the normal resolver logic runs.
Cache Safety Design
unsafeCache
In Webpack 4, unsafeCache relied on timestamp comparison for both resolver and module caches. When enabled, Webpack records the last modification time of each resolved file and returns the cached result for identical references.
Webpack 5 drops this strategy; unsafeCache is enabled only for cache‑enabled dependencies under node_modules, checking serialized file information before rebuilding.
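For illustration, here is a webpack 4-style sketch that opts the unsafe caches in only for node_modules, where files rarely change between builds. The regex and the predicate are assumptions for the example, not the webpack 5 internals described above.

```javascript
// Sketch: restrict the timestamp-based unsafe caches to node_modules.
module.exports = {
  resolve: {
    // Reuse cached resolve results for matching requests instead of
    // re-walking the filesystem.
    unsafeCache: /node_modules/,
  },
  module: {
    // Skip re-building modules whose cached entry is still considered
    // valid; only applied to dependencies, never to project sources.
    unsafeCache: (module) => /node_modules/.test(module.resource || ''),
  },
};
```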
safeCache
Module relationships are recorded using a content‑hash algorithm and stored in a ModuleGraph weak map, providing a more reliable cache than timestamp‑based comparison.
Cache Capacity Limits
Cache size cannot grow indefinitely; classic LRU (Least Recently Used) algorithms are employed to evict stale entries.
Single‑linked list: add/delete O(1), lookup O(n).
Double‑linked list with hash table: O(1) for all operations.
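To make the trade-off concrete, here is a minimal LRU sketch in JavaScript. A Map iterates in insertion order, so re-inserting a key on access moves it to the "most recently used" end and the first key is always the eviction candidate, giving O(1) get/set without a hand-rolled doubly linked list. The class name and API are illustrative, not part of webpack.

```javascript
// Minimal LRU cache backed by a Map (insertion-ordered in JS).
class LRUCache {
  constructor(capacity) {
    this.capacity = capacity;
    this.map = new Map();
  }

  get(key) {
    if (!this.map.has(key)) return undefined;
    const value = this.map.get(key);
    // Refresh recency: delete and re-insert at the end of the Map.
    this.map.delete(key);
    this.map.set(key, value);
    return value;
  }

  set(key, value) {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.capacity) {
      // Evict the least recently used entry: the first key in the Map.
      this.map.delete(this.map.keys().next().value);
    }
  }
}
```

Production caches (including webpack's) add size accounting and serialization on top, but the eviction core is this small.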
LRU Analysis
LRU works well for hot data, but periodic bulk operations can sharply reduce hit rates, leading to cache pollution.
LRU Algorithm Improvements
Improved algorithms such as LIRS and LRU‑K mitigate this kind of cache pollution; interested readers can explore them further.
WeDoctor Frontend Technology
The official WeDoctor Group frontend public account, sharing original tech articles, event news, job postings, and day‑to‑day updates from our tech team.