Friday, July 1, 2011

Cockburn on thermodynamics

Many of us learned recently from Alistair Cockburn that he studied engineering in college and took a course in thermodynamics, for which he received a passing grade. However, he declares that he "...can't recall anything about thermodynamics." He asks: "...does that invalidate the worth of my degree?"

No, his degree shows a pathway (his word) of self-improvement and education. And, for the most part, a college degree is more an indicator of the ability to perform in an academic or theoretical environment than it is a measure of actual retention of the constituent knowledge base.

On the other hand....
It's most unfortunate for a software leader like Cockburn not to recall his thermo instruction, particularly the famous "2nd Law of Thermodynamics" and its most important spinoff: the concept of ENTROPY.

What is entropy?
Entropy is a measure of the randomness or disorder in a system. Even a stable system at rest has an irreducible, non-zero entropy; that is, a finite amount of the system's energy is present but not available to do useful work.

And from this stable state of equilibrium, entropy can only go up as conditions change and the system departs from absolute stability.

The practical effect is that some of the energy that goes into changing state is diverted into waste, thereby raising entropy. In mechanical systems, this shows up most obviously as waste heat. In other systems, like information systems, the entropy effects show up as wasted memory, wasted CPU cycles, and unused bandwidth.

The corollary: a system in some elevated state of instability can be made more stable. And, as stability increases, entropy decreases and wasted energy (power × time) is leaned out.

Entropy for project managers
Now, in the modern practice of information theory and computer science, the concept of entropy is hugely important. The 2nd Law is alive and well!

As a practical matter we really can't, or usually don't, measure entropy directly, since it's not economical to discover the true minimum state of equilibrium. What we do is measure the change in entropy:
  • Every bit of non-value add work leaned from a process is a change in process entropy
  • Every improvement in system downtime (rate of failure) is a change in system entropy
  • Every improvement in design errors (error density of design units) is a change in design entropy
And, in a computer science application, the randomness created by keystroke timing and other unpredictable events is harvested and put to useful work in operating systems, where it seeds the random number generation behind cryptography and security. Windows, Linux, Unix, etc. all collect and use entropy [the measure of disorder] in this way.
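
As a small illustration (a minimal Python sketch I'm adding here, not part of the original discussion), an application can draw on the operating system's entropy pool and estimate the Shannon entropy of the bytes it gets back:

```python
import math
import os
from collections import Counter

def shannon_entropy_bits_per_byte(data: bytes) -> float:
    """Estimate Shannon entropy H = -sum(p * log2 p) over the byte values observed."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Draw 64 KB from the OS entropy pool (fed by keystrokes, interrupts, and other
# unpredictable events the kernel collects).
sample = os.urandom(64 * 1024)

# A well-mixed pool should come out close to the 8 bits/byte maximum.
print(f"Estimated entropy: {shannon_entropy_bits_per_byte(sample):.3f} bits per byte")
```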

In a past engagement developing an ERP system and bringing it to operations in a large-scale enterprise, my team was constantly aware of the entropy of our work product. We didn't know the absolute stable state we might be able to achieve, but we had enough history to know we weren't there yet.

Our basic entropy metric was the rate of discovery of new problems. This can be modeled with a Poisson distribution with an average arrival rate 'lambda'.
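
To make that concrete (a hypothetical Python sketch with made-up weekly counts, not the project's actual data), lambda can be estimated from history as the mean of the observed counts, and the Poisson model then gives the odds of the next reporting period being worse than usual:

```python
import math

# Hypothetical weekly counts of newly discovered problems (illustrative data only)
weekly_new_problems = [7, 5, 9, 6, 8, 4, 7, 6]

# Maximum-likelihood estimate of the Poisson rate is just the sample mean
lam = sum(weekly_new_problems) / len(weekly_new_problems)

def poisson_pmf(k: int, lam: float) -> float:
    """P(K = k) for a Poisson distribution with mean lam."""
    return math.exp(-lam) * lam**k / math.factorial(k)

# Probability of discovering 10 or more new problems next week
p_10_or_more = 1.0 - sum(poisson_pmf(k, lam) for k in range(10))

print(f"Estimated lambda: {lam:.2f} problems/week")
print(f"P(10 or more new problems next week): {p_10_or_more:.3f}")
```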
Who do we blame for this complication of the body of knowledge (a search of the PMBOK does not bring up entropy)?

We blame the Bell System and the telephone industry. Claude Shannon (at Bell Labs, in 1948) adopted the term 'entropy' for his measure of the uncertainty and randomness in a communication channel; in effect, the residual disorder that remains after all means to get lean have been tried.

Recently, a posting by John Baez et al. explains Shannon and the idea of measuring only the difference in entropy rather than entropy itself. Baez is a little challenging to read, but hey: no pain, no gain!
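
For readers who want the formula itself (a short Python sketch of my own, with illustrative numbers, not anything from Baez's post), Shannon's entropy of a discrete distribution is H = -sum(p * log2 p), and the "difference" view simply compares H before and after a change:

```python
import math

def entropy_bits(probs) -> float:
    """Shannon entropy H = -sum(p * log2 p) of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical distributions of where effort goes in a process, before and after
# a lean improvement (illustrative numbers only; each list must sum to 1.0).
before = [0.25, 0.25, 0.25, 0.25]   # effort scattered evenly -- maximum disorder
after  = [0.70, 0.15, 0.10, 0.05]   # effort concentrated on value-adding work

h_before = entropy_bits(before)
h_after = entropy_bits(after)

print(f"H(before) = {h_before:.3f} bits")
print(f"H(after)  = {h_after:.3f} bits")
print(f"Change in entropy: {h_after - h_before:+.3f} bits")
```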

