Thursday, December 20, 2012

How old is the wine? Exposing risk

Mike Cohn has a posting on a "risk exposure burn down chart".

It's a takeoff on the more familiar story and task burn down charts applied universally in agile. Mike's chart is pretty straightforward: estimate the risk impact (as modified by likelihood) and call this 'risk exposure'. When the risk is mitigated, or passes by without impact, the exposure is burned off.

Nice. But is this old wine in a new bottle?

Some years ago -- actually, many years ago -- I took over a department of about 85 engineers, mostly system engineers, with a budget (then) of about $10M annually. One of the first things my new boss asked me to do was to figure out what my (our) exposure was.

Exposure? I understood this to be a question about risk, but what's the metric here, and how do you measure it? For instance, the metric could be scope exposure, schedule exposure (as in Mike's chart), quality, or budget. Or, it could be some other "measure of effectiveness" -- MoE -- that was applied in the various programs.

In the moment, what my boss was asking for was an estimate of the budget risk (impact and likelihood) lurking in the work package assignments to which my department engineers had committed themselves.

That $10M department budget was all signed out to various projects for work that had to be done, but what if the work could not be done for $10M? How much risk was there for me as department director and for the various program managers who needed the work done?

So, I had my metric: monetized budget; and I had a way to measure it.

Management and Monte Carlo
What I did, of course, was build a version of a risk register, much like Mike's but with some differences to account for risk management: for each work package, I asked my WP managers to give me a 3-point estimate (no single points allowed!) of the remaining effort required to get 'done' -- the cost-to-complete according to a completion standard more or less understood by all (a similar idea exists in agile, of course).
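A minimal sketch of such a register, assuming each work package carries an (optimistic, most likely, pessimistic) triple for remaining cost-to-complete. The WP names and dollar figures are illustrative, not the author's actual data:

```python
# Hypothetical risk register: 3-point estimates of remaining
# cost-to-complete per work package, in $K (illustrative figures).
work_packages = {
    "WP-101": (80, 100, 150),
    "WP-102": (40, 60, 90),
    "WP-103": (200, 250, 400),
}

# No single points allowed: every entry carries the full spread.
for wp, (opt, ml, pess) in work_packages.items():
    print(f"{wp}: optimistic={opt}K, most likely={ml}K, pessimistic={pess}K")
```

The point of insisting on three points is that the spread, not any single number, carries the risk information.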

A Monte Carlo simulation of the WP estimates gave me -- actually, my business analyst -- a single-point measurement (the expected value statistic) to compare with the single-point budget. Each statistic is a deterministic number, so I can do arithmetic on it. Of course, the optimistic, most likely (ML), and pessimistic numbers are random numbers -- values in a distribution -- so I can't do arithmetic with them except by simulation; thus, their columns are not added. The difference between these two points was my (our) exposure, measured as the risk-weighted expected value of the portfolio.
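The simulation step can be sketched as follows, assuming a triangular distribution per work package (a common simplification; the original spreadsheet may well have used another shape) and an illustrative $450K budget commitment:

```python
import random

random.seed(1)  # reproducible sketch

# Illustrative 3-point estimates (optimistic, most likely, pessimistic)
# for remaining cost-to-complete, in $K; not the author's actual data.
work_packages = [
    (80, 100, 150),
    (40, 60, 90),
    (200, 250, 400),
]
budget = 450  # assumed single-point budget commitment, $K

TRIALS = 20_000
totals = []
for _ in range(TRIALS):
    # Draw one sample per WP from a triangular distribution and sum
    # across the portfolio -- the sum is only valid by simulation,
    # which is why the estimate columns themselves are never added.
    totals.append(sum(random.triangular(o, p, ml) for o, ml, p in work_packages))

expected = sum(totals) / TRIALS  # expected-value statistic: deterministic
exposure = expected - budget     # risk-weighted exposure vs. the budget

print(f"expected cost-to-complete: {expected:,.0f}K")
print(f"budget: {budget}K, exposure: {exposure:,.0f}K")
```

With enough trials the simulated mean converges to the sum of the per-WP triangular means, and the gap between that number and the committed budget is the monetized exposure.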

The spreadsheet below illustrates how it was done: