Thursday, June 23, 2016

Risk Matrix -- One more time!



In 1711 Abraham De Moivre came up with the mathematical definition of risk as:

The Risk of losing any sum is the reverse of Expectation; and the true measure of it is, the product of the Sum adventured multiplied by the Probability of the Loss.

Abraham de Moivre, De Mensura Sortis, 1711 in the Ph. Trans. of the Royal Society

I copied this quote from a well argued posting by Matthew Squair on Critical Uncertainties entitled Risk and the Matrix.

His thesis makes sense to anyone who appreciates that truly understanding risk events is dubious at best:
For new systems we generally do not have statistical data .... and high consequence events are (usually) quite rare leaving us with a paucity of information.

So we end up arguing our .... case using low base rate data, and in the final analysis we usually fall back on some form of subjective (and qualitative) risk assessment.

The risk matrix was developed to guide this type of risk assessment; it's actually based on decision theory, de Moivre's definition of risk, and the principles of the iso-risk contour

Well, I've given you de Moivre's definition of risk in the opening to this posting. What then is an iso-risk contour?

Definitions:
"iso" from the Greek, meaning "equal"
"contour", typically referring to a plotted line (or curve) meaning all points on the line are equal. A common usage is 'contour map' which is a mapping of equal elevation lines.

So, iso-risk contours are lines on a risk mapping where all the risk values are the same.

Fair enough. What's next?

Enter: decision theorists. These guys provide the methodology for constructing the familiar risk matrix (or grid) that is dimensioned impact by probability. The decision guys recognized that unless you "zone" or compartmentalize or stratify the impacts and probabilities it's very hard to draw any conclusions or obtain guidance for management. Thus, rather than lists or other means, we have the familiar grid.

Each grid value, like High-Low, can be a point on a curve (a curve is a generalization of a line, since 'line' has the connotation of a straight line), but Low-High is also a point on the same curve. Notice we're sticking with qualitative values for now.

However, we can assign arbitrary numeric scales so long as we define the scale. The absence of definition is the Achilles' heel of most risk matrix presentations that purport to be quantitative. And, these are scales simply for presentation, so they are relative, not absolute.
So for example, we can define High as being 100 times more of an impact than Low without the hazard of an uncalibrated guess as to what the absolute impact is.

If you then plot the risk grid using log-log scaling, the iso-contours will be straight lines. How convenient! Of course, it's been a while since I've had log-log paper in my desk. Thus, the common depiction is linear scales and curved iso-lines.
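The arithmetic behind this is easy to sketch. The snippet below (my own illustrative relative scores, not a calibrated scale) shows that Low-High, Medium-Medium, and High-Low all sit on one iso-risk contour, and that on log-log axes that contour is a straight line:

```python
import math

# De Moivre: risk = probability x impact (relative scores, not calibrated).
def risk(p, i):
    return p * i

# Low-High, Medium-Medium, and High-Low cells on the same contour:
assert risk(1, 100) == risk(10, 10) == risk(100, 1)

# On log-log axes the contour is straight: log(p) + log(i) = log(risk),
# a line of slope -1 with intercept log(risk).
for p, i in [(1, 100), (10, 10), (100, 1)]:
    assert math.isclose(math.log10(p) + math.log10(i), math.log10(100))
```

On linear axes the same constant-product contour plots as a hyperbola, which is why the usual depiction shows curved iso-lines.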

Using the lines, you can make management decisions to ignore risks on one side of the line and address risks on the other.

There are two common problems with risk matrix practices:
  1. What do you do with the so-called "bury the needle" low probability events (I didn't use 'black swan' here) that don't fit on a reasonably sized matrix (who needs 10K to 1 odds on their matrix?)
  2. How do you calibrate the thing if you wanted to?
For "1", where either the standard that governs the risk grid or common sense places an upper bound on the grid, the extreme outliers are best handled on a separate list dedicated to cautious 360 situational awareness.

For "2", pick a grid point, perhaps a Medium-Medium point, that is amenable to benchmarking. A credible benchmark will then "anchor" the grid. Being cautious of "anchor bias" (See: Kahneman and Tversky), one then places other risk events in context with the anchor.

If you've read this far, it's time to go.


Read in the library at Square Peg Consulting about these books I've written
Buy them at any online book retailer!
http://www.sqpegconsulting.com
Read my contribution to the Flashblog

Monday, June 20, 2016

A brief history of project scheduling



Patrick Weaver gave a talk at Primavera06 about the history of scheduling. His talk is captured in an interesting paper: "A brief history of scheduling: back to the future"

Patrick is somewhat of a historian and prognosticator on matters such as these.
So, what is the history of scheduling? I certainly remember the days of yester-year before MSProject and its adult cousin Primavera; I remember when the only scheduling tool I had was graph paper, and I remember when the mainframe scheduling tools began to replace hand-drawn bar chart schedules and simple networks.

Weaver writes:
Modern schedule control tools can trace their origins to 1765. The originator of the ‘bar chart’ appears to be Joseph Priestley (England, 1733-1804); his ‘Chart of Biography’ plotted some 2000 famous lifetimes on a time scaled chart “…a longer or a shorter space of time may be most commodiously and advantageously represented by a longer or a shorter line.”

Priestley’s ideas were picked up by William Playfair (1759-1823) in his ‘Commercial and Political Atlas’ of 1786. Playfair is credited with developing a range of statistical charts including the line, bar (histogram), and pie charts

We learn these additional nuggets:
  • The science of ‘scheduling’ as defined by Critical Path Analysis (CPA) celebrated its 50th Anniversary in 2007.
  • In 1956/57 Kelly and Walker started developing the algorithms that became the ‘Activity-on-Arrow’ or ADM scheduling methodology for DuPont.
  • The PERT system was developed at around the same time for the US Navy's Polaris missile program 
  • PERT lagged CPM (the Critical Path Method) by 6 to 12 months (although the term ‘critical path’ was invented by the PERT team).
  • Later the Precedence (PDM) methodology was developed by Dr. John Fondahl; his seminal paper was published in 1961 describing PDM as a ‘non-computer’ alternative to CPM.

Of course, one of the most profound developments was the arrival and market penetration of the low cost PC-based personal scheduling tools like MSProject. In Weaver's view that made schedulers out of everyone, but not everyone is a scheduler, or can even spell the word.

In my personal opinion, the integration of Monte Carlo tools with low cost scheduling applications like MSProject was equally profound. The MC simulation "fixes" some of the most egregious problems with PERT, and the simulation idea has largely run PERT into obsolescence. The main thing that PERT does not handle well is the schedule join, or merge, and the statistical merge bias that goes with it.
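The merge bias is easy to demonstrate with a minimal Monte Carlo sketch (my own toy numbers, assuming normally distributed path durations):

```python
import random

random.seed(1)

# Two parallel paths, each with a mean duration of 10 days, merge at a
# milestone. A single-path PERT view puts the milestone at day 10;
# simulation shows the expected finish of the merge is later, because
# the milestone must wait for whichever path runs long.
def simulate(trials=20000):
    total = 0.0
    for _ in range(trials):
        a = random.gauss(10, 2)  # path A duration, days
        b = random.gauss(10, 2)  # path B duration, days
        total += max(a, b)       # the merge waits for the later path
    return total / trials

mean_of_max = simulate()
assert mean_of_max > 10.5  # merge bias: later than either path's mean
```

For two independent normal paths the expected bias works out to σ/√π, roughly 1.1 days in this toy case.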


On the dark side, MSProject has profoundly muddled the whole idea of the WBS. The fact that one of the fields in MSP is called "WBS" is most unfortunate. There is a whole generation of project analysts who believe they created a WBS by the simple act of making that field visible. Not so: schedule is the "verbs" of the project; WBS is the "nouns". Together, WBS + schedule = project narrative.

Now, in Weaver's view, we return to the future:
The changing role of the scheduler has been almost as interesting:
• The mainframe era saw scheduling as:
  • A skilled profession
  • Central to the success of projects
• Then came the PCs…… everyone and no-one was a ‘scheduler’ in the 1980s and 90s
• However, in the 21st century, the new ‘enterprise’ era sees scheduling as:
  • A skilled profession
  • Central to the success of projects




Friday, June 17, 2016

Qualities of good employees



Jeff Haden at the Owner's Manual on Inc.com has written about employee qualities

By Haden's telling, great employees have these qualities:
  • They ignore job descriptions...
  • They’re eccentric...
  • But they know when to dial it back.
  • They publicly praise...
  • And they privately complain.
  • They speak when others won’t.
  • They like to prove others wrong.
  • They’re always fiddling.
I like the fiddling thing. This means curiosity. And, it could mean a propensity for innovation. If companies play into this, they'll give the fiddlers some space and time. Somewhat like Google's "20% of your time to do anything" policy.

I have managed units that had eccentric people. I'm careful to say I didn't manage people, and certainly not an eccentric. As oft said: "things are managed; people are led". In fact, eccentrics are to be tolerated at least, and nurtured at best. You never know what will turn up.

As a case in point, take a look at the famous Bell Labs where the transistor was invented and Claude Shannon discovered the limits of bandwidth. Talk about eccentrics! Like shooting fish in a barrel.



Tuesday, June 14, 2016

Morse code on a Smart Phone?



OMG! Now we learn about an experiment to "simplify" typing on a smart phone -- perhaps someday coming to you. Google has been thinking about a variant of Morse Code they call TAP

Don't remember your Morse? Perhaps you remember this:


Of course, on a smart phone, it's much "simpler". You merely 'tap' on two large buttons, one for dot and one for dash. The code is simpler also:


So, what's the connection to project management? Well, consider the ideas of simple, complicated, and complex:


  • Simple: The least complicated or complex possible (for the situation)
  • Complicated: a lot of parts
  • Complex: complicated, but also burdened with a lot of part-to-part interactions, many of which are difficult to predict

If you were the project manager working with the product manager on TAP how would the conversation go? Presumably the goal is a faster typing experience, one attuned to the world of digraphs and trigraphs like LOL and OMG, etc. And, faster is supposed to be better--less overhead for the same message content.

But is two-button TAP really simpler than a 26-letter keyboard, 10 more for numbers, and a few more for special characters? It would seem so: 2 is surely simpler than 36+. Of course, the information content behind each of the 36+ is much greater than the information content behind one tap of TAP. Thus, it seems from an information encoding perspective that TAP is going backward to 1890.

On the other hand, just as Morse was developed with efficiency in mind (the most common letter in the English alphabet is "E", so the Morse symbol is one "dot"), I imagine that TAP will go the same way. Whole thoughts, like LOL, will be simply encoded and perhaps the actual throughput will go up. We'll have to see how the smart phone generation handles this.
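Morse's frequency-based economy is easy to illustrate with a toy subset of the code table (TAP's actual table isn't given here, so I'm only assuming it would follow similar logic):

```python
# A few entries from the Morse table: common letters get short codes.
MORSE = {'E': '.', 'T': '-', 'A': '.-', 'O': '---', 'Q': '--.-'}

def taps(word):
    # Count the dot/dash taps needed to send the word (ignoring gaps).
    return sum(len(MORSE[ch]) for ch in word)

# 'E', the most common English letter, is a single dot...
assert len(MORSE['E']) == 1
# ...so a word of common letters costs far fewer taps than rare ones.
assert taps('EAT') < taps('QQQ')
```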

On the other hand, it could go the way of other Google innovations. Perhaps the Google folks should read Everett Rogers' "Diffusion of Innovations".





Saturday, June 11, 2016

Cash on the barrel-head



Has everyone got a handle on EBITDA?

NO?

In a few words, EBITDA is a measure of cash earnings from the real business, the day to day stuff that creates value for customers, users, and stakeholders: cash, as Earnings, Before any Interest payments, Taxes, or non-cash charges like Depreciation of tangible assets and Amortization of intangibles.

Profit is an opinion; cash is a fact
Tom Pike
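The arithmetic runs backward from the bottom of the income statement; here's a minimal sketch with made-up figures:

```python
# Start from net income and add back the items EBITDA excludes
# (all figures invented for illustration).
net_income   = 600
interest     = 100   # financing cost, not operations
taxes        = 200
depreciation = 80    # non-cash charge on tangible assets
amortization = 20    # non-cash charge on intangibles

ebitda = net_income + interest + taxes + depreciation + amortization
assert ebitda == 1000
```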

Fair enough. But since this is a project management blog, why should we care?

Well, for one we PMs are in the value business; mostly we're in the earned value business. When the project is successful and earns its value, then it's ready for the business. The deliverables can go on to deliver on EBITDA.

What about NPV and EVA you ask?

Haven't we PMs been told by CFOs that the way the business goes about measuring financial success in the business domain is with measures of discounted cash flow (DCF), like NPV (net present value) and EVA (economic value add)?

And, haven't we all seen the IRR (internal rate of return) calculations that demonstrate that this project can't miss (at least in terms of discount)?
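For reference, the DCF arithmetic behind those NPV and IRR claims is nearly a one-liner (`npv` here is my own helper, and the cash flows are invented):

```python
# NPV: discount each year's cash flow back to today and sum.
def npv(rate, cash_flows):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Pay 1000 today; receive 500 at the end of each of the next 3 years.
flows = [-1000, 500, 500, 500]

assert npv(0.10, flows) > 0   # attractive at a 10% discount rate
assert npv(0.25, flows) < 0   # not at 25%; the IRR sits in between
```

The IRR is simply the rate at which NPV crosses zero, about 23% for these flows.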

I took up this very thing with a CFO in the private equity business with whom I've done business for many years, Steve McBrayer. He writes:
[Private equity] is ..." much more cash flow oriented than [publicly traded businesses]. I think it is the nature of the private equity industry in general – Capitalism at its finest in my opinion – very investor focused.
  • Our primary measure is EBITDA (basically a proxy for cash flow). Our focus is converting EBITDA into operating cash flow as efficiently as possible, while balancing the business needs (investments) against these demands.
  • I like to see the business unit bonus program limited to about 5% of EBITDA (I don’t always get there, but that is a reference point)
  • I use 10% of EBITDA as my reference point for Cap-Ex (some business units get more if they have high growth potential and are more “value added”, and some get less; for example, a pure distribution business with little value add and lower growth prospects)
  • At the end of the day, the primary responsibility of the CFO is to prioritize the various projects competing for the limited capital budget.  The valuation tools [EVA and NPV] ... are useful, but not the primary tool that I use...[which] is business judgment: does it make sense; who is responsible for the project; do I have confidence in them; is it critical?
OMG! Business judgment: what will they think of next?




Wednesday, June 8, 2016

Leading Change -- Kotter



John P. Kotter is a change guy. I've been going through his 1996 classic "Leading Change"


Here's my take: it's a good book, but a little long on the narrative since the essentials are right up front: 8 leadership steps towards change management.

Now, admittedly, this is more aimed at the business readiness swim lane, and the foreplay necessary to get the business and the customer ready, than it is aimed at some of the change management tactics for scope control. Nevertheless, here's my paraphrase of Kotter's 8:

1. Put a value on short term versus long term
2. Gather a coalition of the willing
3. Develop the vision, goals, and strategy
4. Communicate
5. Push action to the practitioners
6. Be incremental
7. Consolidate gains
8. Leverage culture


Now, in Kotter's actual formulation for point #1, he wrote more about creating a sense of urgency than simply putting a value on the short term. But, actually, I'm not a fan of crying wolf on urgency just to get the team moving. Frankly, I'm more about finding a legitimate reason to value a short term goal; with that in hand, you should be able to get some action going.

His point about #5 was the ole "empowerment" thing. It was probably less worn in 1996 than it is 20 years later. The issue is that the empowered may not know how to use their power. That hasn't changed since the tension between power and experience was observed by British General Dill when conferring with his American counterparts early in the WW II era:

Those with the power have no experience; those with the experience have no power
General Sir John Dill
Placentia Bay Conference, August 1941



I really like the last one about culture. We've mused on culture a few times here. Just click here to get a sense of where I'm coming from.




Sunday, June 5, 2016

Thinking: it's about systems first



I've pointed to "Thinking in Systems: a primer" by D. Meadows in prior posts, and here I go again, largely because I think she's offered a lot that is useful to managers of all stripes, not just systems people.

But, there are traps!
Here's an abridged list of 'traps' she cautions against, a list of ills we should all be cautious about:
Success to the successful
If the winners of a competition are systematically rewarded with the means to win again, a reinforcing feedback loop is created by which, if it is allowed to proceed uninhibited, the winners eventually take all, while the losers are eliminated.
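A toy simulation of that reinforcing loop (my own illustrative numbers, not Meadows') makes the end state concrete:

```python
# Two competitors share a fixed pool; each round the current leader
# captures 10% of the laggard's share: a reinforcing feedback loop.
a, b = 0.5, 0.5
for _ in range(50):
    gain = 0.1 * min(a, b)
    if a >= b:
        a, b = a + gain, b - gain
    else:
        a, b = a - gain, b + gain

assert abs(a + b - 1.0) < 1e-9  # the pool is conserved...
assert max(a, b) > 0.99         # ...but the winner has taken nearly all
```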

The Tragedy of the Commons
When there is a commonly shared resource, every user benefits directly from its use, but shares the costs of its abuse with everyone else. Therefore, there is very weak feedback from the condition of the resource to the decisions of the resource users. The consequence is overuse of the resource, eroding it until it becomes unavailable to anyone.

Drift to Low Performance
Allowing performance standards to be influenced by past performance, especially if there is a negative bias in perceiving past performance, sets up a reinforcing feedback loop of eroding goals that sets a system drifting toward low performance.

Rule Beating Trap
Rules to govern a system can lead to rule beating: perverse behavior that gives the appearance of obeying the rules or achieving the goals, but that actually distorts the system. Rule beating is usually a response of the lower levels in a hierarchy to overrigid, deleterious, unworkable, or ill-defined rules from above.

Seeking the Wrong Goal
System behavior is particularly sensitive to the goals of feedback loops. If the goals (the indicators of satisfaction of the rules) are defined inaccurately or incompletely, the system may obediently work to produce a result that is not really intended or wanted.

