Monday, December 26, 2011

That velocity thing

Jim Highsmith, a guru in the agile space, has an interesting post on that velocity thing. Velocity, you will recall, is the agile name (originally an XP name) for the rate at which throughput can be delivered to customers or, if not delivered directly to customers, the rate at which throughput, meaning finished deliverables, can be queued for rollout according to some rollout strategy and workflow.

Now, the idea of throughput is not an agile thing, per se. Go back to the mid-1980s and Eliyahu Goldratt's Theory of Constraints (TOC). TOC is all about optimizing, at the enterprise or project level, for maximum throughput to customers. Velocity was not a Goldratt idea, but Eliyahu certainly focused on cadence, using the metaphor of the drum-buffer-rope.

Of course, project managers may know Goldratt for his theory of the "Critical Chain", a risk management strategy for ensuring on-time delivery. Critical chain is an outgrowth of TOC, and it is Goldratt's way of bringing some of his throughput management ideas into the project management body of knowledge.

I digress, as often happens here at Musings, so back to Highsmith's lament: he says that, having positioned velocity as a calibration metric for team capacity, project managers should not then count on teams to achieve it as predicted performance, because an emphasis on performance may trump quality, the ultimate goal of an agile project. Even the customer may become part of the problem. In his words:

  • Velocity is increasingly being used as a productivity measure (not the capacity calibration measure that it was intended to be) that focuses too much attention on the volume of story points delivered.
  • Focusing on volume detracts from the quality of the customer experience delivered and investing enough in the delivery engine (technical quality).
  • Giving the product owner/manager complete priority control makes the problem worse—we have gone from customer focus to customer control that further skews the balance of investing in new features versus the delivery engine. 

Dean Leffingwell makes a similar point in his book "Agile Software Requirements: Lean Requirements Practices for Teams, Programs, and the Enterprise". He says that if the velocity metric is turned back on the team by managers, the team will do one of three things:
  1. Practice continuous improvement to meet management objectives
  2. Sacrifice quality in the name of speed, or
  3. Sandbag estimates to create velocity buffers

Of course, my point is not to use the metric to amp up productivity (that's Highsmith's and Leffingwell's fear) but to use the metric as an expectation suitable for estimating. If you can't estimate, what's the point of the metric?
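
To make the estimating use concrete, here is a minimal sketch of velocity as an expectation rather than a target. It's my own illustration, not Highsmith's or Leffingwell's; the numbers, function names, and 120-point backlog are hypothetical. The idea: calibrate from iterations already finished, then use the average to forecast how many iterations the remaining backlog needs.

```python
from math import ceil

def average_velocity(points_completed_per_iteration):
    """Calibrate the expected velocity from iterations already finished."""
    return sum(points_completed_per_iteration) / len(points_completed_per_iteration)

def iterations_remaining(backlog_points, velocity):
    """Estimate how many iterations it takes to burn down the remaining backlog."""
    return ceil(backlog_points / velocity)

# Hypothetical calibration data: story points actually completed per iteration.
history = [21, 18, 24, 19]

v = average_velocity(history)  # an expectation for estimating, not a quota
print(f"Expected velocity: {v:.1f} points per iteration")
print(f"Iterations to clear a 120-point backlog: {iterations_remaining(120, v)}")
```

The direction of use is the whole point: the history calibrates the expectation, and the expectation feeds the estimate; it is not handed back to the team as a productivity quota.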

And they've got a point about sacrificing velocity if quality is the ultimate driver. But that puts 'better' as the nemesis of 'good', and it calls into question the real advantage of agile, which, in my mind, is delivering best value (which could be different from best quality, though I've not thought that all the way through).

And, you ask, what's my idea of best value? Simply put:

  • Best value is the most valuable set of outcomes achievable, as judged by the customer, for the investment available from the sponsor.
  • Best value is the most bang for the buck: a best compromise of scope and quality in the context of a fixed investment and a critical need date.
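
One way to read "bang for the buck" in practice, offered here as my own sketch rather than anything the post prescribes, is a simple value-per-cost ranking of candidate backlog items against a fixed investment. The item names, scores, and budget below are made up for illustration.

```python
# Hypothetical backlog items: (name, customer value score, estimated cost).
backlog = [
    ("feature A", 8, 5),
    ("feature B", 5, 2),
    ("feature C", 9, 8),
    ("feature D", 3, 3),
]

def best_value_selection(items, budget):
    """Greedy 'bang for the buck': rank by value per unit of cost, then fill the budget."""
    ranked = sorted(items, key=lambda item: item[1] / item[2], reverse=True)
    chosen, spent = [], 0
    for name, value, cost in ranked:
        if spent + cost <= budget:
            chosen.append(name)
            spent += cost
    return chosen, spent

selection, spend = best_value_selection(backlog, budget=10)
print(selection, spend)  # ['feature B', 'feature A', 'feature D'] at a cost of 10
```

A greedy ratio cut is only an approximation of the underlying knapsack problem, but it captures the compromise described above: scope traded off within a fixed investment, with the customer's value judgment doing the ranking.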

By the way, I'm all about throughput. You really can't do a decent job as an agile manager unless you can benchmark for throughput and then hold teams accountable.

The issue is: accountable for what? I say the answer to Highsmith's lament is to get the iteration backlog right with the customer and the project team at the point of release planning. Once planned for best value, then bring on the throughput!
