No, his degree shows a pathway (his word) of self-improvement and education. And, for the most part, a college degree is more an indicator of the ability to perform in an academic or theoretical environment than a measure of actual retention of the constituent knowledge base.
On the other hand...
It's most unfortunate for a software leader like Cockburn not to recall his thermo instruction, particularly the famous "2nd Law of Thermodynamics" and its most important spinoff: the concept of ENTROPY.
What is entropy?
Entropy is a measure of randomness or disorder in a system. A stable system at rest has an irreducible non-zero entropy; that is, a finite amount of system capability that is present but not available to do work.
And from this stable state of equilibrium, entropy can only go up as conditions change and the system departs a bit from absolute stability.
The practical effect is that some of the energy that goes into changing state is diverted into waste, thereby raising entropy. In mechanical systems, this is most evident as waste heat. In other systems, like information systems, entropy effects show up as wasted memory, CPU cycles, and unused bandwidth.
The corollary: a system in some elevated state of instability can be made more stable. And, as stability increases, entropy decreases, and wasted energy is leaned out.
Entropy for project managers

Now, in the modern practice of information theory and computer science, the concept of entropy is hugely important. The 2nd Law is alive and well!
As a practical matter, we really can't, or usually don't, measure entropy directly, since it's not economical to discover the true minimum state of equilibrium. What we do is measure the change in entropy:
- Every bit of non-value-add work leaned out of a process is a change in process entropy
- Every reduction in system downtime (failure rate) is a change in system entropy
- Every reduction in design errors (error density of design units) is a change in design entropy
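The common thread in the list above is that we track a delta, not an absolute. A minimal sketch of that idea, with hypothetical names and numbers chosen for illustration:

```python
# Illustrative sketch (names and figures are hypothetical): treat the
# drop in a waste metric -- say, failures per 1,000 runs -- as a proxy
# for the change in system entropy, since we measure deltas, not
# absolute entropy.
def entropy_change_proxy(waste_before: float, waste_after: float) -> float:
    """Negative value = the system got leaner (entropy went down)."""
    return waste_after - waste_before

# Failure rate improves from 12 to 7 failures per 1,000 runs:
delta = entropy_change_proxy(12.0, 7.0)
print(f"Change in waste metric: {delta:+.1f}")
```

The sign convention mirrors the corollary above: a negative delta means waste was leaned out and the system became more stable.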
Our basic entropy metric was the rate of discovery of new problems. This is modeled with a Poisson distribution with an average rate 'lambda'.
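For readers who want to see the model concretely, here is a small sketch of the Poisson probability mass function; the rate of 3 problems per week is an assumed figure for illustration, not from the original project:

```python
# Sketch: modeling the discovery of new problems as a Poisson process
# with an average rate lambda (here, an assumed 3 new problems/week).
from math import exp, factorial

def poisson_pmf(k: int, lam: float) -> float:
    """Probability of observing exactly k new problems in one
    interval, given an average discovery rate lam."""
    return (lam ** k) * exp(-lam) / factorial(k)

lam = 3.0
# The chance of a "quiet" week (zero new problems) is exp(-lam).
print(f"P(0 new problems) = {poisson_pmf(0, lam):.4f}")
print(f"P(3 new problems) = {poisson_pmf(3, lam):.4f}")
```

As the process gets leaner, lambda falls, and quiet intervals become more likely: the drop in lambda is itself a measurable change in entropy.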
Who do we blame for this complication of the body of knowledge (a search of the PMBOK does not bring up entropy)?
Claude Shannon (in 1948) coined the term 'entropy' to describe the telephone bandwidth unavailable for communication purposes; in effect, the residual disorder and randomness in the communication channel after every means of getting lean has been tried.
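Shannon's formula itself is short enough to sketch. It gives the average uncertainty of a message source in bits, H = -Σ p·log₂(p); the coin examples below are illustrative, not from Shannon's paper:

```python
# Shannon entropy: the average uncertainty, in bits, of a discrete
# message source with symbol probabilities p.
from math import log2

def shannon_entropy(probs):
    """Entropy in bits; zero-probability symbols contribute nothing."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # a fair coin: exactly 1 bit per toss
print(shannon_entropy([0.9, 0.1]))  # a biased coin: less than 1 bit
```

The more predictable (less disordered) the source, the lower its entropy, which is the bridge between the thermodynamic picture above and Shannon's channel.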
Recently, a posting by John Baez et al. explains Shannon and the concept of measuring only the difference in entropy rather than entropy itself. Baez is a little challenging to read, but hey: no pain, no gain!