Friday, May 28, 2010

Complexity and the extreme

I was struck by two essays published side by side today: one extolling the recently announced breakthrough in synthetic biology, the product of a program of projects spanning some 15 years, and another warning about unpredictable risks in systems of extreme complexity.

On the one hand, Olivia Judson, an evolutionary biologist and research fellow in biology at Imperial College London, tells us that although "...one problem with creating life from the drawing board is that evolved biological systems are complex, and often behave in ways we cannot (thus far) predict," we should not be too concerned, because we are going slowly (15 years to this point) and the benefits are profound: an understanding of life itself.  I'm not so sure.  There is an entire body of knowledge around "Complex Adaptive Systems" (CAS); most of the systems that have been studied are biological, and the extent of our inability to predict their outcomes is also profound.

But, as I said, side by side with Ms. Judson's faith in our ability to understand biological complexity is a counterpoint, motivated by the oil debacle in the Gulf but really about the complexity of the systems that support our way of life, and about the intersection of human understanding with our emotional and objective decision making.

David Brooks, a political analyst (and not a risk manager in our sense), makes six points in his piece:
1. People have trouble imagining how small failings can combine to lead to catastrophic disasters. (Some describe this as cause-effect networks in which the whole is larger than the sum of its parts: a coherent reinforcement of small risks. See the sketch after this list.)
2. People have a tendency to get acclimated to risk. 
3. People have a tendency to place elaborate faith in backup systems and safety devices. (Even when the devices are untested, not validated, only theoretically effective, or not maintained.)
4. People have a tendency to match complicated technical systems with complicated governing structures. (There is a familiar idea in systems engineering that the architecture of a system often mimics the architecture of the organization that developed it, much as owners look like their dogs!)
5. People tend to spread good news and hide bad news. 
6. People in the same field begin to think alike, whether they are in oversight roles or not.
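To make the first point concrete, here is a minimal sketch in Python with entirely hypothetical numbers (five safety barriers, a 1% per-barrier failure rate, a small common-cause probability). It shows how individually small risks, which look negligible if you assume the barriers fail independently, can reinforce one another once a shared condition couples them:

```python
import random

# Hypothetical numbers for illustration only: five safety barriers,
# each failing on demand with probability 0.01.
N_BARRIERS = 5
P_FAIL = 0.01

# Under the (optimistic) independence assumption, catastrophe requires
# every barrier to fail at once.
p_independent = P_FAIL ** N_BARRIERS
print(f"Independent barriers: P(catastrophe) = {p_independent:.2e}")

# Now suppose a common cause (say, one shared power supply or one
# shared maintenance culture) occasionally degrades all barriers
# together. Estimate the coupled risk by Monte Carlo simulation.
P_COMMON_CAUSE = 0.001   # chance the shared degrading condition is present
P_FAIL_DEGRADED = 0.5    # per-barrier failure probability when it is

def trial() -> bool:
    """Return True if every barrier fails on a single demand."""
    p = P_FAIL_DEGRADED if random.random() < P_COMMON_CAUSE else P_FAIL
    return all(random.random() < p for _ in range(N_BARRIERS))

random.seed(1)
trials = 2_000_000
hits = sum(trial() for _ in range(trials))
print(f"With a common cause:  P(catastrophe) ~ {hits / trials:.2e}")
```

With these made-up inputs, independence predicts a catastrophe probability of about 1e-10, while the common-cause version lands near 3e-5, roughly five orders of magnitude worse. That gap is exactly the "coherent reinforcement of small risks" Brooks is pointing at, and it also explains point 3: elaborate faith in backups usually rests on an unstated independence assumption.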
