Saturday, December 30, 2017

Trust vs. acceptance


As I've written elsewhere, there is not really -- and cannot be -- trust between strangers, because trust requires a transfer (or exchange) of power based on a mutually shared belief that interests are entangled.

To wit: you can trust me on this, because I accept some responsibility for our mutual success.

And I'll only do that if I know you well enough to believe in your competence, your integrity, and a sense of values that aligns well enough with what I value. This, of course, lets out strangers.

On the other hand, even if I don't have trust, I can have acceptance. I can believe, based on observation or context, that you are likely to follow the norms I follow. If that weren't the case, we probably couldn't hurtle down the highway at 80 mph with oncoming traffic of strangers. We don't know the other drivers, of course, but we have acceptance of their belief in the value of the traffic norms.

With that in mind, I was struck by this statement in a novel I'm reading:
"Trust is something an intelligence officer does not give on first meeting. But, I [can] accept you"
"The Quantum Spy" David Ignatius

Given that tens of thousands of PMs work in classified environments, and given the breaches of confidence and norms in the past few years, is it any wonder that trust in our industry is devolving into mere acceptance -- with a "show me" to follow?



Wednesday, December 27, 2017

Big data -- a quotable snip


For years and years, PMs have been "little data" people. We plot data points, compute averages, look at 1-sigma, 2-sigma (and sometimes 6-sigma) limits, ponder the Central Limit Theorem, and wonder about the law of large numbers.

Fair enough

What then is "big data" if not simply more numbers, more data points?
Andrew Gelman -- an eminent authority in statistical analysis -- has this pithy answer:
"'Big Data' is more than a slogan; it is our modern world in which we learn by combining information from diverse sources of varying quality."
The biggie, of course, is the key phrase "diverse sources of varying quality". That certainly fits the project world -- we are not, after all, operations or manufacturing or distribution where processes are well defined and the data is all about process quality.

And so, what is it we do when the data quality varies?
  • Get more, to see if there is a discernible and useful pattern
  • Use Bayesian techniques to refine hypotheses based on observations and feedback (sketched just below)
  • Throw out the obviously bad stuff, though try not to throw it out just because it's an inconvenient counterpoint
Of these, Bayesian techniques are the most powerful.
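To see the mechanics, here's a minimal sketch in Python of a Bayesian update. The hypothesis, likelihoods, and observations are all made-up numbers for illustration, not data from any real project:

```python
# Minimal Bayesian update: P(H|E) = P(E|H) * P(H) / P(E)
# Hypothesis H: "this data source is reliable"
# Evidence E:  a data point that checks out against an independent source
# All probabilities below are illustrative assumptions.

prior = 0.50            # initial belief that the source is reliable
p_e_given_h = 0.90      # P(point checks out | source is reliable)
p_e_given_not_h = 0.30  # P(point checks out | source is unreliable)

# Five inspections: True = checks out, False = doesn't (made-up data)
observations = [True, True, False, True, True]

belief = prior
for checks_out in observations:
    if checks_out:
        likelihood, alt_likelihood = p_e_given_h, p_e_given_not_h
    else:
        likelihood, alt_likelihood = 1 - p_e_given_h, 1 - p_e_given_not_h
    evidence = likelihood * belief + alt_likelihood * (1 - belief)
    belief = likelihood * belief / evidence  # Bayes' rule
    print(f"observation={checks_out}, updated belief={belief:.2f}")
```

Each pass through the loop is one round of observation and feedback: the posterior from one data point becomes the prior for the next.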
Not familiar with Bayes? Search this blog site; you'll find a lot of stuff here.





Saturday, December 23, 2017

Christmas math



You know, I wrote a book on quantitative methods in project management. But somehow I missed covering these equations!






Tuesday, December 19, 2017

Fidelity, faithfulness, loyalty, and commitment



Fidelity, faithfulness, and commitment often come down to the tension between:
  • What the customer/sponsor/user wants, and
  • What the project charter/scope calls for.

Why so? Why isn't it straightforward? The business case begets the project charter; the charter begets the project plan; and the project team is committed. Simple, right?

Wrong!

It's never that simple -- though on paper that's the way it should be.

The reality is a tension between "fidelity to user expectation" and "fidelity to user specification".

Expectation vs. specification. How to manage this? Any gap between them should always be a decision, and not just a consequence of wandering off track. And, consider this:
  • If you have the latitude to shift "loyalty" from specification to expectation, you are in what the community generally calls an "agile" environment. 
  • Some will see a shift in loyalty as a breach of commitment, and a lack of faithfulness to the project charter.
  • Indeed, there may be two decisions, one for each criterion, with the customer as the referee: does the customer want to honor the spec, or shift to expectation? (And does the customer have the latitude to make this decision?)
Measurements and value
At the end of the day, how are you measured: by fidelity, faithfulness, and loyalty to the charter and specification, or by commitment to customer satisfaction?

Keep this thought close by:
What is really at stake is a "best value" outcome: "the most useful and important scope that's affordable."



Friday, December 15, 2017

The technical debt thing



Steve McConnell has a pretty nice video presentation on Managing Technical Debt that casts the whole issue in business-value terms. I like that approach, since we always need some credible (to the business) justification for spending Other People's Money (OPM).

McConnell, of course, has a lot of bona fides to speak on this topic, having been a past editor of IEEE Software and having written a number of fine books, like "Code Complete".

In McConnell's view, technical debt arises for three reasons:
  1. Poor practice during development, usually unwitting
  2. Intentional shortcuts (take a risk now, pay me later)
  3. Strategic deferrals, to be addressed later
In this presentation, we hear about these ideas:
  • Risk-adjusted cost of debt (an expected value),
  • Opportunity cost of debt and the cost of debt service -- which Steve calls the debt service cost ratio (DSCR), a term taken from financial debt management -- and
  • The present value of the cost of debt.
In other words, there are a lot of ways to present this topic to the business, and a lot of ways to value the debt, whether it arises from point 1, 2, or 3 in the prior list.
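
By way of illustration, here's a minimal sketch of the expected-value idea -- risk-adjusting each debt item by the probability it ever has to be repaid. The items and figures are my own assumptions, not McConnell's:

```python
# Risk-adjusted (expected-value) cost of technical debt:
#   expected cost = P(debt must be repaid) * cost of repaying it later
# Items and numbers are illustrative assumptions only.

debt_items = [
    # (description,               p_repay, cost_if_repaid)
    ("shortcut in error handling",   0.8,    40_000),
    ("deferred database redesign",   0.3,   120_000),
    ("unwitting poor practice",      0.5,    25_000),
]

for name, p_repay, cost in debt_items:
    print(f"{name}: expected cost = ${p_repay * cost:,.0f}")

total = sum(p * c for _, p, c in debt_items)
print(f"total risk-adjusted debt: ${total:,.0f}")
```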

One point well made is that technical debt often comes with an "interest payment". In other words, if the debt is not attended to, and the object goes into production, then there is the possibility that some effort will be constantly needed to address issues that arise -- the bug that keeps on bugging, as it were.

Interest payments factored in
To figure out the business value of "pay me now, or pay me later", the so-called interest payments need to be factored in.

In this regard, a point well taken is that debt service may crowd out other initiatives, soaking up resources that could be more productively directed elsewhere. Thus, opportunity cost and debt service are related.

Bottom line: carrying debt forward is not free, so even strategic deferrals come with a cost.
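
To put a number on that bottom line, here's a hedged sketch comparing the present value of fixing a debt item now against deferring it and paying the "interest". The costs, horizon, and discount rate are illustrative assumptions only:

```python
# Fix now vs. defer with "interest payments" (recurring workaround effort).
# All figures are illustrative assumptions.

fix_now_cost = 50_000     # cost to retire the debt today
annual_interest = 15_000  # yearly effort spent working around the debt
fix_later_cost = 80_000   # cost to retire it at the end of the horizon
years = 3                 # deferral horizon
discount_rate = 0.10      # the business's discount rate (assumed)

# Present value of deferring: interest payments plus the eventual fix
pv_defer = sum(annual_interest / (1 + discount_rate) ** t
               for t in range(1, years + 1))
pv_defer += fix_later_cost / (1 + discount_rate) ** years

print(f"PV of fixing now: ${fix_now_cost:,.0f}")
print(f"PV of deferring:  ${pv_defer:,.0f}")
# If deferring costs more in present-value terms, the "strategic
# deferral" is not actually the cheaper option.
```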





Tuesday, December 12, 2017

ISO 21500 Project Management



ISO published the first version of the 21500 standard on project management in December 2012. The official site for 21500 is at iso.org.

The purported benefits are these, as reported by ISO:
  • Encourage transfer of knowledge between projects and organizations for improved project delivery
  • Facilitate efficient tendering processes through the use of consistent project management terminology
  • Enable the flexibility of project management employees and their ability to work on international projects
  • Provide universal project management principles and processes


The good news for the PMP crowd is that there are not many differences between the two standards, since ISO used the PMBOK as the foundation for 21500. A few things are added, and a few things are rearranged, but it's largely the same content. For instance, Stakeholder Management has been added as a knowledge area, and the 42 processes in the PMBOK have been consolidated into 39 in 21500.

Companion standards
  • ISO 10006, Quality management systems -- Guidelines for quality management in projects
  • ISO 31000, Risk management -- Principles and guidelines



Saturday, December 9, 2017

Avoid -- if you can -- batch transfers


Hiss! We no longer say "waterfall"; now we say "batch transfer"

Fair enough. New label; old wine. I get it.

But, still, to be avoided -- if you can
What replaces the sequential batch transfer (formerly: sequential waterfall. Hiss!)?

Collaboration! (*) And, less optimally, smaller "batches" (did someone say "lean"?)

Mike Cohn has an excellent posting on the "collaboration-replaces-batch-transfer" thing. And, he has some wise advice on smaller batches, to wit:
Small is good, until it's too small. When is that? When there are too many things to manage; when there is danger of a small item getting forgotten; when the forest is not evident because of all the saplings.(**)
-------------------
(*) Collaboration: working together to plan our joint journey down the waterfall, as it were
(**) Systems engineering speak for "there are so many things that I can't judge the interactions, predict chaotic responses, or understand the value-add of integration"



Wednesday, December 6, 2017

Cockburn on Thermodynamics



Many of us learned recently from Alistair Cockburn that he studied engineering in college and took a course in thermodynamics, for which he received a passing grade. However, he declares that he "can't recall anything about thermodynamics," and he asks: does that invalidate the worth of his degree?

No. His degree shows a pathway (his word) of self-improvement and education. And, for the most part, a college degree is more an indicator of the ability to perform in an academic or theoretical environment than a measure of actual retention of the constituent knowledge base.

On the other hand....
It's most unfortunate for a software leader like Cockburn not to recall his thermo instruction, particularly the famous 2nd Law of Thermodynamics and its most important spinoff: the concept of ENTROPY.

What is entropy?
Entropy is a measure of the randomness or disorder in a system. A stable system at rest has an irreducible, non-zero entropy -- that is, a finite amount of system capability that is present but not available to do work.

And from this stable state of equilibrium, entropy can only go up as the system departs a bit from absolute stability when conditions change.

The practical effect is that some of the energy that goes into changing state is diverted into waste, thereby raising entropy. In mechanical systems, this is most evident as waste heat. In other systems, like information systems, the entropy effects are wasted memory, wasted CPU cycles, and unused bandwidth.

The corollary: a system in some elevated state of instability can be made more stable. And as stability increases, entropy decreases, and wasted energy is leaned out.

Entropy for project managers
Now, in the modern practice of information theory and computer science, the concept of entropy is hugely important. The 2nd Law is alive and well!

As a practical matter, we really can't, or usually don't, measure entropy directly, since it's not economic to discover the true minimum state of equilibrium. What we do is measure the change in entropy:
  • Every bit of non-value add work leaned from a process is a change in process entropy
  • Every improvement in system downtime (rate of failure) is a change in system entropy
  • Every improvement in design errors (error density of design units) is a change in design entropy
And in a computer science application, the randomness of keystroke timings and other unpredictable events is harnessed and put to useful work in operating systems: Windows, Linux, Unix, and the rest collect this entropy into a pool that seeds their random number generators.

In a past engagement, developing and bringing to operations an ERP system in a large-scale enterprise, my team was constantly aware of the entropy of our work product. We didn't know the absolute stable state we might be able to achieve, but we had enough history to know we weren't there yet.

Our basic entropy metric was the rate of discovery of new problems. This is modeled with a Poisson distribution with an average rate of 'lambda', as sketched below.
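
As a sketch of that model -- in Python, with an assumed rate, since the actual project numbers are long gone -- the Poisson distribution gives the probability of discovering k new problems in a period when the historical average is lambda per period:

```python
import math

# Poisson model of problem discovery: P(k) = lambda^k * e^(-lambda) / k!
# lam = average number of new problems discovered per period
# (an illustrative assumption, not real project data)

lam = 4.0  # e.g., four new problems per week, on average

for k in range(9):
    p = (lam ** k) * math.exp(-lam) / math.factorial(k)
    print(f"P({k} new problems this week) = {p:.3f}")

# A falling lambda over successive periods is the signal that the
# "entropy" of the work product -- its residual disorder -- is going down.
```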
Who do we blame for this complication of the body of knowledge (a search of the PMBOK does not bring up entropy)?

We blame the Bell System and the telephone industry. Claude Shannon (in 1948) borrowed the term 'entropy' from thermodynamics to measure the randomness, or uncertainty, in a communication source -- in effect, the residual disorder in the communication channel after all means to get lean have been tried.
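
For the curious, Shannon's entropy has a compact formula: H = -sum(p * log2(p)), in bits per symbol. A minimal sketch, with made-up message probabilities:

```python
import math

# Shannon entropy of a message source, in bits per symbol.
# The symbol probabilities are an illustrative assumption.

p = [0.5, 0.25, 0.125, 0.125]  # probabilities of four message symbols

H = -sum(pi * math.log2(pi) for pi in p if pi > 0)
print(f"entropy = {H:.2f} bits per symbol")  # 1.75 bits for this source
```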

Recently, a posting by John Baez et al. explains Shannon and the concept of measuring only the difference in entropy rather than entropy itself. Baez is a little challenging to read, but hey: no pain, no gain!





Sunday, December 3, 2017

Decision triangle and metacognition



Project management, risk management, and other related disciplines are replete with triangles. Here's one more for the collection, taken from an article on "metacognition" by Marvin Cohen and Jared Freeman.

As you can readily see, it's all about decision making under stress [indeed, that's a paraphrase of their paper's title] when information potential [in the form of choices and perhaps disorder] may be at a maximum but data may be incomplete, conflicting, or unreliable.


Their principal conclusion:
Proficient decision makers are recognitionally skilled: that is, they are able to recognize a large number of situations as familiar and to retrieve an appropriate response. Recent research in tactical decision making suggests that proficient decision makers are also metarecognitionally skilled. In novel situations where no familiar pattern fits, proficient decision makers supplement recognition with processes that verify its results and correct problems

Of course, my eye is drawn to the word 'familiar'. In point of fact, there is a decision bias described by Tversky and Kahneman, which they named "availability bias". In a word, we tend to favor alternatives that are similar to things we can readily bring to mind -- that is, things that are readily available in our mind's eye.

Back to Cohen and Freeman:
"More experienced decision makers adopt more sophisticated critiquing strategies. They start by focusing on what is wrong with the current model, especially incompleteness. Attempting to fill in missing arguments typically leads to discovery of other problems (i.e., unreliable arguments or conflicts among arguments)."

Of course, there's the issue of calling the question and getting to convergence -- or, in sales: getting to yes! Discovery is good, but it is also the mark of a more experienced decision maker to stay on course and evaluate new discoveries only if they are truly on the path to a decision on the current problem.




Read in the library at Square Peg Consulting about these books I've written
Buy them at any online book retailer!
http://www.sqpegconsulting.com
Read my contribution to the Flashblog