Understanding Cumulative Flow Diagrams and Their Impact on Agile Project Delivery
This article explains the purpose and interpretation of cumulative flow diagrams in lean software development, illustrates how WIP and lead time affect delivery, and discusses common pitfalls such as code coupling, optimistic estimates, and non‑project activities that hinder iteration progress.
In the previous article we introduced the burn‑up chart, and here we turn to the cumulative flow diagram (CFD) used in lean software development.
The pink line represents the planned cumulative development completion, and the light‑blue line the planned cumulative testing completion. These lines bend every two weeks because each iteration is a two‑week window during which the team plans how much new work to take on based on the previous iteration’s leftovers.
Removing the two planned lines leaves the true CFD, shown below.
In the diagram, segment a measures the amount of work that has been developed but not yet verified (work in progress, WIP), while segment b measures the time from the start of development to verification (lead time). Both should be kept as small as possible, as illustrated by point C.
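Both measurements can be read directly from the cumulative stage counts that make up a CFD. The sketch below uses made-up daily totals (the `dev_done` and `test_done` arrays are hypothetical, not this project's data) to show how WIP and lead time fall out of the two curves:

```python
# Hypothetical cumulative counts of stories finished by each stage, one
# entry per day.
dev_done  = [2, 5, 9, 12, 16, 20]   # cumulative "development complete"
test_done = [0, 1, 3, 6, 11, 16]    # cumulative "testing complete"

# WIP (segment a): the vertical gap between the two curves on a given
# day -- work that has been developed but not yet verified.
wip = [d - t for d, t in zip(dev_done, test_done)]

# Lead time (segment b): the horizontal gap -- how many days pass before
# the testing curve reaches the level the development curve had on day i.
def lead_time(day):
    target = dev_done[day]
    for later in range(day, len(test_done)):
        if test_done[later] >= target:
            return later - day
    return None  # not yet verified within the observed window

print(wip)           # → [2, 4, 6, 6, 5, 4]
print(lead_time(1))  # → 2 (the 5th story was verified two days later)
```

Keeping WIP low narrows the vertical band between the curves, which in turn shortens the horizontal distance, which is exactly why point C is the desirable shape.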
Why was early delivery so poor?
1. Code coupling and legacy development habits affect iteration development – High‑priority requirements often interfere with each other at the code level, forcing developers to modify code for multiple requirements simultaneously. In addition, developers tend to implement a second, related requirement together with the first, believing this saves time, when in fact it delays delivery of the first.
These issues meant that only a few requirements were completed in the first two iterations, and quality problems slowed progress further. From mid‑May onward, development speed finally matched testing speed, so testing could be completed within a day of development finishing.
2. Developers’ estimates are always optimistic – The original plan of 9 weeks became 13 weeks on the burn‑down chart because estimates were based on traditional development quality standards, not the higher‑quality expectations of the new iterative model. Moreover, non‑project activities consumed far more time than anticipated (the team had budgeted 40% of its time for such tasks, but the real figure was higher).
To track this, the team used a simple table beside the story wall, recording three categories: (a) project‑internal time, (b) communication with other teams, and (c) online monitoring and issue resolution. The table is illustrated below.
Recording rules: each person fills in the table daily after stand‑up, to half‑hour precision, noting actual hours worked (which may be more or fewer than eight because of overtime). The team agreed that the data would not be shared outside the team and would play no part in personal performance evaluation.
Two weeks of data showed that senior staff spent only about 30% of their time on the project and junior staff about 50%, both far below the expected 60% allocation.
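Percentages like these come from a simple aggregation of the table's daily entries. A minimal sketch, with invented entries (the names, category labels, and hours are all hypothetical, chosen only to mirror the reported shape of the data):

```python
from collections import defaultdict

# Hypothetical rows from the table beside the story wall:
# (person, category, hours), where the categories are
# "project" (a), "cross-team" (b), and "ops" (c).
entries = [
    ("senior_dev", "project", 2.5), ("senior_dev", "cross-team", 3.0),
    ("senior_dev", "ops", 3.5),
    ("junior_dev", "project", 4.0), ("junior_dev", "cross-team", 2.0),
    ("junior_dev", "ops", 2.0),
]

# Sum hours per person per category.
totals = defaultdict(lambda: defaultdict(float))
for person, category, hours in entries:
    totals[person][category] += hours

# Share of each person's total recorded time spent on the project.
for person, cats in totals.items():
    worked = sum(cats.values())
    pct = 100 * cats["project"] / worked
    print(f"{person}: {pct:.0f}% of time on the project")
```

Tracking actual hours rather than assuming an eight‑hour day is what made the gap between the planned 60% allocation and the real 30–50% visible.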
The project completed its development iterations by the end of June 2014, finished full testing in mid‑July, and the pilot phase concluded.
A reader might wonder why an earlier chart showed 25 weeks to go live when the project actually finished in seven months; the answer involves Baidu’s queue‑up deployment situation, which will be covered in the next article.
Continuous Delivery 2.0
Tech and case studies on organizational management, team management, and engineering efficiency