Friday, May 28, 2010

Complexity and the extreme

I was struck by two essays published side by side today: one extolling the recently announced breakthrough in synthetic biology, the culmination of a program of projects extending some 15 years, and another warning about unpredictable risks in systems of extreme complexity.

On the one hand, Olivia Judson, an evolutionary biologist and biology research fellow at Imperial College London, tells us that although "...one problem with creating life from the drawing board is that evolved biological systems are complex, and often behave in ways we cannot (thus far) predict," we should not be too concerned, because we are going slowly (15 years to this point) and the benefits are profound: an understanding of life itself.  I'm not so sure.  There is an entire body of knowledge around "Complex Adaptive Systems" (CAS), most of the systems studied being biological, and the extent of our inability to predict their outcomes is also profound.

But, as I said, side by side with Ms Judson's faith in our ability to understand biological complexity is a counterpoint, motivated by the oil debacle in the Gulf but really about the complexity of the systems that support our way of life, and about where human understanding intersects with our emotional and objective decision making.

David Brooks, a political analyst and not a risk manager in our sense, in his piece makes six points:
1. People have trouble imagining how small failings can combine to lead to catastrophic disasters. (Some call these cause-effect networks, in which the sum is larger than the parts: a coherent reinforcement of small risks.)
2. People have a tendency to get acclimated to risk. 
3. People have a tendency to place elaborate faith in backup systems and safety devices. (Even if the devices are untested, not validated, only theoretically effective, and/or not maintained.)
4. People have a tendency to match complicated technical systems with complicated governing structures. (There is a familiar idea in systems engineering that the architecture of a system often mimics the architecture of the developing organization, or: owners look like their dogs!)
5. People tend to spread good news and hide bad news. 
6. People in the same field begin to think alike, whether they are in oversight roles or not.
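Brooks's first point, that small failings combine into catastrophe, has a simple quantitative face in basic reliability arithmetic. A minimal sketch (the function, the six safeguards, and the 5% figure are illustrative assumptions, not from Brooks's piece): even when each safeguard is individually quite reliable, the chance that at least one fails grows quickly as they accumulate.

```python
# Illustrative only: compounding of independently small failure probabilities.
def combined_failure_probability(failure_probs):
    """P(at least one failure) = 1 - product of (1 - p_i),
    assuming the failures are independent."""
    p_all_ok = 1.0
    for p in failure_probs:
        p_all_ok *= (1.0 - p)
    return 1.0 - p_all_ok

# Six hypothetical safeguards, each with only a 5% chance of failing:
probs = [0.05] * 6
print(round(combined_failure_probability(probs), 3))  # -> 0.265
```

Roughly one chance in four that something in the chain fails, from components that each look safe on their own; and real accident chains are worse, because the failures are rarely independent.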


1 comment:

  1. John,

    A good book is John Gall's "The Systems Bible," in which he extols naive and simple-minded approaches to complex systems.

    The professional systems engineering society, INCOSE, speaks to methods of managing complexity in some domains - Systems of Systems, for example.

    But the biological systems emerging now may completely swamp our ability to comprehend the "system."

    Like all major "accidents," the BP blowout was a series of minor mistakes compounded by poor decision making. I grew up in the Texas Panhandle, with my father running a drilling company. The blowout preventer should have worked; the fire and gas system should have shut down all operating equipment (the prime movers over-revved when the gas entered the engine room); the driller and tool pusher had stated there was going to be a problem with moving off the drill hole before the cement had set (this is not the cement we see in our driveways, but a special chemical that solidifies underwater at extreme pressures). And the list went on.

    Gall has a great quote we used when designing and building fault tolerant process control equipment.

    In a complex system, malfunction and even total nonfunction may not be detected for long periods, if ever. — John Gall
