How to Quantify Software Engineers’ Performance: Insights from a Tech VP
In a March 23, 2019 talk at GTLC Beijing, Zhang Chu, Vice President of Technology at Mafengwo, discussed why measuring software engineers' work matters and why it is difficult, warned against common metric pitfalls, and shared practical frameworks such as OGSM and OKR for building goal‑focused, balanced performance evaluations.
Why Measurement Matters
Quoting Peter Drucker, “What gets measured gets managed,” Zhang emphasizes that without quantification, effective management and improvement are impossible.
Five Key Dimensions of Engineering Work
Creative work: Engineering is brain‑power intensive; output depends on inspiration, trial and error, and team dynamics.
Black‑box nature: Many deliverables hide internal complexity; the same feature may require vastly different effort at different times.
Experience quantification: Experienced engineers deliver higher quality, but their expertise is context‑specific and hard to measure.
Time management: Interruptions, meetings, and multitasking reduce productivity; poor time management leads to stress and burnout.
Collaboration: Engineers work with product, design, QA, and business teams, making individual contribution hard to isolate.
Common Pitfalls in Performance Metrics
Counting lines of code – rewards verbose or unnecessary code over quality.
Bug count – can incentivize risk‑averse behavior or, where fixes are rewarded, inflate defect volume.
Project completion time – often leads to padded estimates and hidden delays.
Potential vs. output – over‑reliance on projected potential can distort assessments.
Practical Approaches
Zhang advocates focusing on goals rather than tasks, using frameworks such as OGSM and OKR. He shares a case study of building a unified login system (SCS) where the true goal was reducing the number of separate systems from 50 to 3.
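To make the goal-vs-task distinction concrete, an OKR-style goal can be expressed as measurable key results with a baseline and a target, so progress is computed rather than asserted. The sketch below is illustrative only; apart from the 50→3 system-count goal mentioned in the talk, all names and the interim value are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class KeyResult:
    name: str
    start: float    # baseline measurement
    target: float   # desired end state
    current: float  # latest measurement

    def progress(self) -> float:
        """Fraction of the way from baseline to target, clamped to [0, 1]."""
        span = self.target - self.start
        if span == 0:
            return 1.0
        return max(0.0, min(1.0, (self.current - self.start) / span))

@dataclass
class Objective:
    name: str
    key_results: list = field(default_factory=list)

    def score(self) -> float:
        """Average progress across all key results."""
        if not self.key_results:
            return 0.0
        return sum(kr.progress() for kr in self.key_results) / len(self.key_results)

# The consolidation goal from the talk: cut separate systems from 50 to 3.
# A hypothetical mid-project measurement of 20 remaining systems:
obj = Objective("Unified login system (SCS)", [
    KeyResult("separate login systems", start=50, target=3, current=20),
])
print(round(obj.score(), 3))  # 0.638
```

Framing the work this way rewards progress toward the business outcome (fewer systems) instead of counting tasks completed along the way.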
Balancing Factors
Effective evaluation must balance short‑term outputs (e.g., team contributions, attendance) with long‑term objectives, and incorporate qualitative aspects like communication and knowledge sharing.
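One simple way to operationalize such balancing is a weighted blend of normalized scores. The weights and inputs below are hypothetical, not values from the talk; this is a minimal sketch of the idea that no single dimension should dominate:

```python
def blended_score(short_term: float, long_term: float, qualitative: float,
                  weights: tuple = (0.4, 0.4, 0.2)) -> float:
    """Combine component scores (each in [0, 1]) using illustrative weights
    for short-term output, long-term objectives, and qualitative factors
    such as communication and knowledge sharing."""
    ws, wl, wq = weights
    return ws * short_term + wl * long_term + wq * qualitative

# Hypothetical engineer: strong on delivery, weaker on long-term goals.
print(blended_score(0.8, 0.6, 0.9))  # 0.74
```

In practice the weights themselves become a management decision, which keeps the trade-off between short-term output and long-term objectives explicit rather than implicit.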
Hierarchical Differentiation
Performance criteria should vary by seniority: junior engineers are measured on timely delivery and code quality; mid‑level engineers on architecture and risk mitigation; senior engineers and managers on strategic impact and KPI alignment.
Evaluation Cycle
Shortening assessment periods to monthly can provide clearer feedback and adapt to fast‑changing internet product cycles.
Overall, Zhang stresses the need to shift mindset from task‑centric metrics to goal‑oriented, quantifiable outcomes that align individual effort with business value.
Mafengwo Technology
The external communication channel of the Mafengwo technology team, regularly sharing articles on engineering practices, tech exchange events, and recruitment.