Friday, May 28, 2010

Complexity and the extreme

I was struck by two essays that were published side by side today, one extolling the recently announced breakthrough in synthetic biology, a program of projects extending some 15 years, and another warning about unpredictable risks in systems of extreme complexity.

On the one hand, Olivia Judson, an evolutionary biologist and research fellow in biology at Imperial College London, tells us that although "[the] problem with creating life from the drawing board is that evolved biological systems are complex, and often behave in ways we cannot (thus far) predict," we should not be too concerned because we are going slowly (15 years to this point) and the benefits are profound: an understanding of life itself.  I'm not so sure.  There is an entire body of knowledge around "Complex Adaptive Systems" (CAS), most of the systems studied being biological, and the extent of our inability to predict their outcomes is also profound.

But, as I said, side by side with Ms. Judson's faith in our ability to understand biological complexity is a counterpoint, motivated by the oil debacle in the Gulf but really about the complexity of the systems that support our way of life, and about where human understanding intersects with our emotional and objective decision making.

David Brooks, a political analyst and not a risk manager in our sense, in his piece makes six points:
1. People have trouble imagining how small failings can combine to lead to catastrophic disasters. (Some call this a cause-effect network in which the sum is larger than the parts: a coherent reinforcement of small risks)
2. People have a tendency to get acclimated to risk. 
3. People have a tendency to place elaborate faith in backup systems and safety devices. ( Even if the devices are untested, not validated, or only theoretically effective, and/or not maintained)
4. People have a tendency to match complicated technical systems with complicated governing structures. (There is a familiar idea in system engineering that the architecture of systems often mimics the architecture of the developing organization, or: owners look like their dogs!)
5. People tend to spread good news and hide bad news. 
6. People in the same field begin to think alike, whether they are in oversight roles or not.


Thursday, May 27, 2010

Complexity and the marching army

Every project manager has an intuitive feel for the fact that cumulative fixed costs escalate with a slipping schedule. We in the industry call this the cost of the marching army.  Whether usefully engaged or not, they are on the project's charge number and their costs mount as the schedule slips. No news here.

Now, a little more thought about this leads in this direction: there actually are two cost densities to be concerned about.  One density is cost per unit of value delivered, dimensioned in cost dollars per dollar of value.  Some may want to call such a density a cost/value efficiency, and that's ok.

Another cost density is cost per unit of schedule duration, dimensioned in cost dollars per time unit.  This is the density that controls the marching army cost.
To get total cost in the project, we multiply each density by its units consumed or delivered. So far, just arithmetic.
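A minimal sketch of that arithmetic, with invented figures (none of these numbers come from a real project):

```python
# Two cost densities, per the discussion above (all figures are hypothetical).
cost_per_value_dollar = 0.80   # cost dollars per dollar of value delivered
cost_per_week = 50_000         # marching-army burn rate: cost dollars per week

value_delivered = 1_000_000    # dollars of value delivered
duration_weeks = 40            # schedule units consumed

# Multiply each density by its units consumed or delivered, then sum.
variable_cost = cost_per_value_dollar * value_delivered  # 800,000
fixed_cost = cost_per_week * duration_weeks              # 2,000,000
total_cost = variable_cost + fixed_cost

print(int(total_cost))  # 2800000
```

Notice how the marching-army term scales directly with duration: every week of slip adds another 50,000 whether or not any value is delivered in that week.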

What about interdependencies?  They usually add grief to any project.  Does one affect the other?  Peel the onion one more layer and you come to complexity as a driver on cost/value density, and to complexity as a driver on the number of schedule units to apply to the cost/schedule density.  Complexity creates the interdependency between fixed and variable costs.

How so?  Complexity itself is a matter of two primary elements: [1] the number of elements and the number of relationships between them, and [2] the feasibility of the elements.  The latter drives special tools, skills, and practices into the cost/value density.  The former, because implementing relationships is such a cost driver, drives the number of schedule units for design, testing, and validation.
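The count of potential relationships grows much faster than the count of elements, which is why relationships dominate the schedule. A quick computation makes the point:

```python
def max_relationships(n_elements: int) -> int:
    """Maximum pairwise relationships among n elements: n*(n-1)/2."""
    return n_elements * (n_elements - 1) // 2

# Ten times the elements yields roughly a hundred times the relationships.
print(max_relationships(5))   # 10
print(max_relationships(50))  # 1225
```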

So, in a word: complexity!  That's the correlating element between the marching army and those who create value.


Monday, May 24, 2010

All things bell shaped

Chapter 14 of the GAO Cost Estimating Manual, on "Cost Risk and Uncertainty", is a good read, easily understood, and very practical in its examples.  Here's one illustration that I particularly like.  When you look at it, it's understood in a moment that the repeated random throw of two dice generates a probability density function [PDF] with a bell-shaped curve that tends towards a true Normal distribution.

Statisticians call this phenomenon the Central Limit Theorem: summing random occurrences over a large population tends to wash out the asymmetry and uniformity of the individual events.  A more 'natural' distribution ensues.  The name for it is the Normal distribution, more commonly: the bell curve.
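You can see the effect in a few lines of simulation (a minimal sketch of the two-dice illustration, not taken from the GAO manual): each die is uniform, but the sum of two is already tending toward the bell:

```python
import random
from collections import Counter

random.seed(1)  # fixed seed so the run is repeatable

# Repeatedly throw two dice and tally the sums.
throws = [random.randint(1, 6) + random.randint(1, 6) for _ in range(100_000)]
pdf = Counter(throws)

# Crude text histogram: 7 is the peak, tails fall away symmetrically.
for total in range(2, 13):
    share = pdf[total] / len(throws)
    print(f"{total:2d}: {'#' * int(share * 200)}")
```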

Here's what it looks like to a project manager.  Notice that regardless of the distribution of cost adopted by work package managers for each individual work package, in the bigger picture, at the summation of the WBS, there will tend to be a bell-shaped variation in the WBS budget estimate.  In part, use of these ideas addresses the project manager's need to understand the parameters of variation in the project budget as evidenced by the estimates of the WBS.  This diagram is (again) from Chapter 14 of GAO's manual:

If the risk analyst generates these data from a simulation, like a Monte Carlo simulation, then the numeric statistics like variance and standard deviation are usually reported, along with the cumulative probability, more commonly called the "S" curve.  In the diagram, on the right side, we see the cumulative curve plotted and labeled on the vertical axis as the confidence level.  With a little inspection, you will realize that the cumulative curve is just the summation of the probabilities of the bell curve that is adjacent on the left.
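A toy Monte Carlo roll-up shows both ideas at once: per-package distributions summing to a bell-shaped total, and the confidence level read off the sorted totals.  The work packages and their triangular (low, most likely, high) figures below are invented for illustration:

```python
import random
import statistics

random.seed(42)

# Hypothetical work packages: (low, most likely, high) cost estimates
# chosen by each work package manager.
work_packages = [(80, 100, 150), (40, 60, 90), (200, 250, 400), (10, 20, 25)]

N = 20_000
totals = sorted(
    sum(random.triangular(lo, hi, mode) for lo, mode, hi in work_packages)
    for _ in range(N)
)

mean = statistics.mean(totals)
sd = statistics.stdev(totals)

# The "S" curve: the confidence level at a given budget is simply the
# fraction of trials at or below that budget.  Reading it the other way
# gives the budget that carries ~80% confidence.
p80 = totals[int(0.80 * N)]
print(round(mean), round(sd), round(p80))
```

The individual packages are skewed, but the rolled-up totals cluster around the mean in the bell shape the Central Limit Theorem predicts.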

The GAO manual, and especially Chapter 14, has a lot more information that is well explained.  Give it a read.


Thursday, May 20, 2010

Crystal Clear: the book

About five years ago, Alistair Cockburn wrote a book, "Crystal Clear: A Human-Powered Methodology for Small Teams", that is still a pretty good read today for those getting started in agile methods but concerned about the people aspect.  Cockburn is probably one of the foremost thought leaders in the agile community on the ability of developers, testers--indeed, the whole team--to respond predictably and consistently, whether agile or not.

Of course, one of my favorite papers by Cockburn about the vagaries of human performance, especially when embedded on an agile team, is this one on the non-linear characteristics of people.  I think you'll find this one a good read as well, with some provocative ideas to ponder.

Cockburn has given some serious thought to scaling the methodology.  Agile is best suited for small teams that do not have mission-critical requirements.  "Clear" is his "one small team" method.  Crystal Orange, described in another book, is "Clear" scaled up a bit.

If you're interested in "Clear", here's a nice summary book report by Donna Davis that gives you a pretty good idea of what you are going to find in the book.


Monday, May 17, 2010

Things to count

"Unless we know how things are counted, we don't know if it's wise to count on the numbers"
John Allen Paulos
Mr. Paulos, in writing this thought, was rephrasing Albert Einstein's familiar caution: "Not everything that can be counted counts, and not everything that counts can be counted."

John Paulos goes on to say that, more often than not, the really important thing is not the statistical tests applied to the data observations (although it is important to choose wisely what to observe, Dr. Einstein's point), but rather what we do to prepare before the data is gathered, and what we do with the analysis and conclusions once the data is gathered and processed.

Part of the preparation is constructing a data dictionary that gives full explanation and definition to the data elements in the observation.  Ambiguities of definition will invalidate or damage the credibility of any subsequent analysis.  Second, analysts should identify ambiguous cause-and-effect links that might erroneously suggest functional relationships where none actually exist.

And, of course, once data is gathered, it's typical to aggregate data by common affinity.  Affinity is a matter of judgment, and judgment is subject to many biases.  So, now we see the opportunity for qualitative biases to color objective facts.

Of course, Einstein is telling us that sometimes the stuff that counts is not actually countable.  This is an entry point for utility assignments.  Utility is a tool for giving relative quantitative weights to qualitative properties so that some of our counting tools can be used.  Having made this point, now cycle back to John Paulos's caution and begin again!
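As a minimal sketch, with invented qualitative properties, weights, and scores, a utility assignment might look like this:

```python
# Hypothetical utility scoring: qualitative properties get relative weights
# so that "things that count but can't be counted" can enter the arithmetic.
weights = {"team morale": 0.5, "stakeholder goodwill": 0.3, "brand risk": 0.2}
scores = {"team morale": 7, "stakeholder goodwill": 4, "brand risk": 9}  # 0-10 judgments

# Weighted sum: a single relative number standing in for qualitative value.
utility = sum(weights[k] * scores[k] for k in weights)
print(round(utility, 1))  # 6.5
```

And, per Paulos's caution, both the weights and the scores are judgments, so the biases discussed above apply to them just as surely as to any other data.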


Friday, May 14, 2010

Some ideas: managing quantitative risk in schedules

There are a lot of ideas about managing quantitative risk in schedules; I put down a few in this presentation on SlideShare.


Tuesday, May 11, 2010

Systems Assurance Guidebook

The National Defense Industrial Association, NDIA, in 2008 published a systems assurance [SA] guidebook entitled "Engineering for Systems Assurance".

From the executive summary, the threat:
"For decades, industry and defense organizations have tried to build affordable, secure, and trustworthy systems.  Despite significant strides towards this goal, there is ample evidence that adversaries retain their ability to compromise systems."

And this definition of SA:
"System Assurance is the justified confidence that the system functions as intended, and is free of exploitable vulnerabilities...."

In an article in this month's Crosstalk, the journal of defense software engineering, there is a discussion of how SA fits into the DoD's program acquisition framework, the general lifecycle of large-scale programs [projects] in the DoD.

The whole concept is built around an idea called an "assurance case".  In the case, the program manager and system engineer assert, with proof, that the functions are indeed free of exploitable vulnerabilities.

Frankly, it's good to know that someone is on the case!  What with China, Google, and a host of others, including the social networks, SA is more important than ever before.  The guidebook is worth a read.


Sunday, May 9, 2010

Dilbert on Agile

Among some comments to a posting on TechCrunchIT, is this snippet about Dilbert's run-in with Agile:

Dilbert: We need 3 more programmers.
Boss: Use agile programming methods.
Dilbert: Agile programming does not mean doing more work with less people.
Boss: Find me some words that do mean that and ask again

Dilbert is a creation of Scott Adams.


Tuesday, May 4, 2010

Principles for every day

In a prior blog post, I talked about my seven top values to manage by. Values are what we believe in, but they don't necessarily point toward action. For action, we need principles. Principles, the way I define them, are the actionable response to values.  Here is my top ten for everyday working success.  The emphasis is on teams and teamwork, but these principles can stretch further.

Teams are the structure of choice to execute complex interdisciplinary projects
Multifunctional teams accept and embrace complexity, disorder, and uncertainty more effectively than individuals working alone because of the mutual support for problem solving and the opportunity for group creativity

No team will be chartered without a compelling purpose and mission
A compelling mission is the most effective motivator for cohesion and commitment to results.

Communication and collaboration will be frequent and without reticence
Teams are only better than groups of individuals when teammates achieve synergy.

Teams will be made small, but encouraged to network for scale
Larger teams require internal structures and authority figures to manage the scale.
Small teams can have the effect of large teams by networking and committing to joint objectives.

Team assignments will be made presuming commitment and accountability
Assignments made only on the basis of position and availability are discouraged.
Assignments focus on completing the complement of technical, functional, and decision-making skills.

Time and activities to promote trust will be planned into the project time line. Strangers do not trust. Virtual teams need more time and specific opportunity to overcome factors unique to the displacement of team members

A safe working environment will be provided
Safety is the first step to trust. A role as nemesis will be accepted

Team members will be encouraged to listen actively, give the benefit of the doubt, respond constructively, and acknowledge achievements of others
The 'golden rule' of team behavior

Team results and measurements will be evaluated for collective achievement. Individuals are valued for their skill and ingenuity. Collective achievement is valued for its best value fit to customer expectation

Self-organized teams will be granted a measure of autonomy
Leadership comes from within the team. Processes conform to the conventions of the enterprise to ensure that all assertions and claims to certifying bodies are valid


Saturday, May 1, 2010

Of plans and the planning mystic

General Dwight D. Eisenhower (DDE) once said during the European campaign of 1944:
Planning is everything; plans are nothing

As one of the foremost military planners in American history, what does it say when General Eisenhower's primary value assignment--did someone say utility?--is to planning and not to the plan itself?  Simply put [well, nothing is really simple, especially in war, but also in projects]: planning is the creative part, where ideas, innovation, and a myriad of facts and opinions--the 'dots' in the current vernacular--are connected in a particular pattern, a particular network.

Eisenhower was referring to assembling and evaluating the dots--the act of planning--as being more valuable than a particular pattern of dots--a plan--that is a point solution for a moment in time.  Indeed, as a plan is a network of interconnected dots, so is there a critical path through that network, and that path is probabilistic, subject to risks and uncertainties, and likely to give way to another path through the network as events unfold.  Even more so, some uncertainty may pop up and require a whole new network.  DDE would probably say the hard work has been done: most of the dots are available and evaluated; some new dots have to be integrated, and then a reconstituted point solution has to be brought forward.

Dudley Pope, now deceased, was a popular English author who wrote novels about British naval tactics in the French-British-Spanish wars of the early 19th century.  Describing one of his dashing sea captain's thinking about plans and planning during high risk and uncertainty, Pope's hero, a frigate captain, said:
 "....that was one lesson .... learned over the years: do not explain your entire plan to subordinates all at once; do a section at a time, as it becomes necessary.  Rarely can a plan be carried out from beginning to end in its entirety.  There is usually some hitch somewhere in the middle, so the plan has to be amended to fit the new conditions.  Subordinates, however, are slow to change to a sudden new situation if their heads are full of the old plan.  Somehow, they seem to resist any new modification.  But if you tell them a section at a time--keeping them just ahead of events--then they react quickly and decisively".
There's a certain arrogance there about the relationship of leaders and followers that offends contemporary ears, but the message of agile thinking and the hazards of inflexibility in plans is clear.

[Pope, D. "Ramage's Challenge", McBooks Press, New York, 2002, p. 225]
