Developing a Lighthouse Plugin for Web and Mini‑Program Performance Monitoring
This article explains how Harbor Front‑End extends Lighthouse with a custom plugin, defining gatherers, audits, and categories, to uniformly monitor web pages and mini‑programs. It covers project setup, a sample native API‑call logger, and the configuration steps that produce detailed performance reports alongside standard Lighthouse metrics.
Preface
Harbor Front‑End uses Lighthouse as a performance- and quality-auditing tool for its applications. To support both web pages and mini‑programs, the team extended the standard Lighthouse plugin mechanism with custom Gatherers, Audits, and Categories. This article shares the experience and implementation details.
Lighthouse Overview
Lighthouse is an open‑source automated auditing tool that evaluates web applications and pages, producing per‑category scores and best‑practice recommendations.
How to Use
Four usage methods are available; see the official documentation (https://github.com/GoogleChrome/lighthouse#using-lighthouse-in-chrome-devtools) for details.
Module Implementation
The overall architecture consists of three core modules:
Gatherers
Audits
Categories
See the architecture diagram in the original article.
Report Output
Lighthouse reports display five default categories: Performance, Progressive Web App, Accessibility, Best Practices, and SEO. Specific categories can be shown by adjusting the config.
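For instance, a config along these lines (a sketch using Lighthouse's standard onlyCategories setting; the filename is arbitrary) restricts the report to chosen categories:

```javascript
// custom-config.js -- limit the report to selected categories.
// Saved as a file, it can be passed to the CLI via --config-path.
const config = {
  extends: 'lighthouse:default',
  settings: {
    // Only audits belonging to these categories are run and reported.
    onlyCategories: ['performance', 'best-practices'],
  },
};

module.exports = config;
```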
Lighthouse Plugin Development Guide
1. Create a Lighthouse Plugin Project
Initialize an npm package containing plugin.js and package.json:

// plugin.js
module.exports = {
  // Add custom audits here
  audits: [{path: 'lighthouse-plugin-example/test-audit.js'}],
  category: {
    title: 'title',
    description: 'description',
    auditRefs: [{id: 'test-id', weight: 1}],
  },
};

// package.json
{
  "name": "lighthouse-plugin-example",
  "main": "plugin.js",
  "peerDependencies": {"lighthouse": "^5.6.0"},
  "devDependencies": {"lighthouse": "^5.6.0"}
}

Note: Lighthouse must be declared as a peerDependency so the host project's copy is reused and a duplicate installation is avoided.
2. Add a Custom Audit
Create test‑audit.js:

const { Audit } = require('lighthouse');

class TestAudit extends Audit {
  static get meta() {
    return {
      // id must match the auditRef id declared in plugin.js
      id: 'test-id',
      title: 'Test audit passed',
      failureTitle: 'Test audit failed',
      description: 'A minimal audit that always passes.',
      requiredArtifacts: ['devtoolsLogs'],
    };
  }
  static audit(artifacts) {
    return {score: 1};
  }
}

module.exports = TestAudit;

3. Use the Plugin
CLI execution:

lighthouse https://example.com --plugins=lighthouse-plugin-example

Using Lighthouse as a Node module with a custom config:
module.exports = {
  extends: 'lighthouse:default',
  plugins: ['lighthouse-plugin-example'],
  settings: {/* ... */},
};

Practical Application in a Mini‑Program Project
Why a Custom Plugin?
Harbor Front‑End aims to use Lighthouse uniformly for both web and mini‑programs. Mini‑programs cannot run Lighthouse directly, so tools like Taro or AntMove are used to transpile them to a browser‑compatible form, after which a custom Lighthouse Plugin collects mini‑program‑specific metrics.
Example: Counting Alipay Mini‑Program Native API Calls
The solution consists of three parts: injecting collection logic into business code, a Gatherer to retrieve the collected data, and an Audit to process it.
Business Code Injection
Define a global logger window.$$nativeCall and proxy the mini‑program API object:

// Global call log, defined before the proxy is installed.
window.$$nativeCall = {
  calls: [],
  push: function(type, callInfo) {
    this.calls.push({timestamp: Date.now(), type, callInfo});
  },
};

window.my = $$myproxy(window.my); // proxy for the Alipay mini‑program API

function $$myproxy(my) {
  return new Proxy(my, {
    get(target, key) {
      const keyValue = target[key];
      if (typeof keyValue === 'function') {
        return $$myProxyFn(keyValue, target, key);
      }
      window.$$nativeCall.push('api-attr-called', JSON.stringify({key: `my.${key}`}));
      return keyValue;
    },
  });
}

function $$myProxyFn(fn, that, key) {
  return function(...args) {
    window.$$nativeCall.push('api-method-called', JSON.stringify({key: `my.${key}`, args}));
    try {
      const result = fn.apply(this, args);
      window.$$nativeCall.push('api-method-success-called', JSON.stringify({key: `my.${key}`, args, result}));
      return result; // preserve the original return value for callers
    } catch (err) {
      window.$$nativeCall.push('api-method-error-called', JSON.stringify({key: `my.${key}`, args, error: err.message}));
      throw err; // re-throw so the failure remains visible to callers
    }
  }.bind(that);
}

The collected data is later accessed by a custom Gatherer.
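The same Proxy technique can be exercised outside the mini‑program runtime. The sketch below is hypothetical: the Alipay API object is replaced by a plain object and the global logger by a local one, since window is unavailable in Node. It shows how method calls and attribute reads get recorded:

```javascript
// Stand-in for window.$$nativeCall: a local call log.
const nativeCall = {
  calls: [],
  push(type, callInfo) {
    this.calls.push({ timestamp: Date.now(), type, callInfo });
  },
};

// Stand-in for the Alipay `my` object.
const my = {
  version: '1.0',
  getSystemInfoSync() { return { platform: 'test' }; },
};

// Same get-trap structure as $$myproxy above.
const proxied = new Proxy(my, {
  get(target, key) {
    const value = target[key];
    if (typeof value === 'function') {
      // Wrap methods so each invocation is logged before delegation.
      return (...args) => {
        nativeCall.push('api-method-called', JSON.stringify({ key: `my.${String(key)}`, args }));
        return value.apply(target, args);
      };
    }
    // Plain attribute reads are logged directly.
    nativeCall.push('api-attr-called', JSON.stringify({ key: `my.${String(key)}` }));
    return value;
  },
});

proxied.getSystemInfoSync(); // recorded as api-method-called
proxied.version;             // recorded as api-attr-called
console.log(nativeCall.calls.map(c => c.type));
// → [ 'api-method-called', 'api-attr-called' ]
```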
Custom Gatherer
const { Gatherer } = require('lighthouse');

class CustomNativeCall extends Gatherer {
  async afterPass(passContext) {
    const { driver } = passContext;
    // Read the call log that was injected into the page.
    const $$nativeCall = await driver.evaluateAsync('window.$$nativeCall');
    return $$nativeCall ? $$nativeCall.calls : [];
  }
}

module.exports = CustomNativeCall;

Custom Audit
const { Audit } = require('lighthouse');

class ApiMethodCalledAudit extends Audit {
  static get meta() {
    return {
      id: 'api-method',
      title: 'Native API method calls',
      failureTitle: 'Native API method calls',
      description: 'Counts how often each mini-program API method is invoked.',
      requiredArtifacts: ['CustomNativeCall'],
    };
  }
  static audit(artifacts) {
    const { CustomNativeCall } = artifacts;
    const apiMap = {};
    CustomNativeCall.forEach(log => {
      if (log.type === 'api-method-called') {
        const {key} = JSON.parse(log.callInfo);
        apiMap[key] = (apiMap[key] || 0) + 1;
      }
    });
    const result = Object.keys(apiMap).map(apiKey => ({api: apiKey, count: apiMap[apiKey]}));
    return {
      details: Audit.makeTableDetails(ApiMethodCalledAudit.getHeadings(), result),
      displayValue: `Found ${result.length} API methods called`,
      score: 1,
    };
  }
  static getHeadings() {
    return [
      {itemType: 'text', key: 'api', text: 'Name'},
      {itemType: 'numeric', key: 'count', text: 'Call Count'},
    ];
  }
}

module.exports = ApiMethodCalledAudit;

Plugin Registration
module.exports = {
  audits: [{path: 'lighthouse-plugin-example/api-method-called-audit.js'}],
  category: {
    title: 'Container',
    description: 'Mini‑program specific checks',
    auditRefs: [{id: 'api-method', weight: 1}],
  },
};

With these pieces, data collection, gathering, and analysis are fully integrated.
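The counting step inside ApiMethodCalledAudit is plain JavaScript and can be sanity-checked in isolation. Below is a sketch with hypothetical sample logs; the helper mirrors the aggregation in the audit's audit() method:

```javascript
// Same aggregation as in ApiMethodCalledAudit.audit, extracted as a helper.
function countApiMethodCalls(logs) {
  const apiMap = {};
  logs.forEach(log => {
    if (log.type === 'api-method-called') {
      const { key } = JSON.parse(log.callInfo);
      apiMap[key] = (apiMap[key] || 0) + 1;
    }
  });
  // One row per distinct API, with its call count.
  return Object.keys(apiMap).map(api => ({ api, count: apiMap[api] }));
}

// Hypothetical sample of collected logs.
const sample = [
  { type: 'api-method-called', callInfo: JSON.stringify({ key: 'my.request' }) },
  { type: 'api-method-called', callInfo: JSON.stringify({ key: 'my.request' }) },
  { type: 'api-attr-called',   callInfo: JSON.stringify({ key: 'my.SDKVersion' }) },
  { type: 'api-method-called', callInfo: JSON.stringify({ key: 'my.alert' }) },
];

console.log(countApiMethodCalls(sample));
// → [ { api: 'my.request', count: 2 }, { api: 'my.alert', count: 1 } ]
```

Attribute reads are ignored here; they are handled by the separate api-attr-called audit.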
Log Types Collected
api-attr-called : API attribute access
api-method-called : API method invocation
api-method-success-called : Successful method call
api-method-error-called : Exception during method call
set-data : setData call count and size
set-data-success : setData latency
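As an illustration of how such logs might be summarized, the sketch below assumes set-data entries carry a size field in their callInfo; the exact payload shape is project-specific and may differ from the real logger:

```javascript
// Hypothetical summary over set-data log entries.
function summarizeSetData(logs) {
  const entries = logs.filter(log => log.type === 'set-data');
  // Total payload size across all setData calls, in bytes (assumed field).
  const totalSize = entries.reduce(
    (sum, log) => sum + JSON.parse(log.callInfo).size, 0);
  return { count: entries.length, totalSize };
}

// Hypothetical sample log entries.
const setDataLogs = [
  { type: 'set-data', callInfo: JSON.stringify({ size: 1024 }) },
  { type: 'set-data', callInfo: JSON.stringify({ size: 2048 }) },
  { type: 'api-method-called', callInfo: JSON.stringify({ key: 'my.alert' }) },
];

console.log(summarizeSetData(setDataLogs));
// → { count: 2, totalSize: 3072 }
```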
Audit Catalog
api-async-same-args-called
api-attr-called
api-deprecated-called
api-duplicate-called
api-error-called
api-long-time-called
api-method-called
api-sync-called
page-node-used
set-data-called
set-data-size
Categories Defined in the Plugin
performance : Metrics such as setData frequency, data size, page node count.
container : Mini‑program API usage, native attribute access, etc.
best‑practice : Recommendations like image WebP conversion, avoiding deprecated APIs, error handling.
Plugin Usage Example (npm package)
module.exports = {
  extends: 'lighthouse:default',
  plugins: ['lighthouse-plugin-miniprogram/plugins/container'],
  passes: [{
    passName: 'defaultPass',
    gatherers: ['lighthouse-plugin-miniprogram/gatherers/custom-native-call'],
  }],
  settings: {/* ... */},
};

Result Snapshot
The plugin produces a report similar to the standard Lighthouse UI, showing the custom audit tables.
Conclusion
This article demonstrates how Harbor Front‑End integrates Lighthouse into its quality‑monitoring pipeline and extends it for mini‑program performance analysis. The approach is still evolving, and feedback is welcomed.
References
Lighthouse developers – https://developers.google.com/web/tools/lighthouse/
Lighthouse testing insights – https://juejin.im/post/5dca05f45188250c643b7d76#heading-3
Web performance optimization map – https://github.com/berwin/Blog/issues/23
Chrome DevTools Protocol – https://chromedevtools.github.io/devtools-protocol/
Lighthouse architecture – https://github.com/GoogleChrome/lighthouse/blob/master/docs/architecture.md
HelloTech
Official Hello technology account, sharing tech insights and developments.