Friday, February 16, 2018

Illuminating the path


Here below I shamelessly reprint a posting from herdingcats.
Why? Because Glen makes a profound point: the purpose of plans is to illuminate possibilities, not to constrain critical thinking, common sense, and accommodation of changing facts. Everyone who practices project management or team management should understand that.

From Herding Cats:
In the space of two days I had evolved two plans, wholly distinct, both of which were equally feasible. The point I am trying to bring out is that one does not plan and then try to make circumstances fit those plans. One tries to make plans fit the circumstances 
George Patton
General, U.S. Army

"Any suggestion that plans and planning are not part of project management, no matter the approach - agile or traditional - wilfully ignores the purpose of a plan. That purpose is not the force the team to follow a path, but to illuminate the possible paths that can be followed to reach the goal."

end quote (emphasis added)


Read in the library at Square Peg Consulting about these books I've written
Buy them at any online book retailer!
http://www.sqpegconsulting.com
Read my contribution to the Flashblog

Sunday, February 11, 2018

Organization v management


We organize with hierarchies; we manage through networks
 More or less, that idea is the theme of historian Niall Ferguson's book "The Square and the Tower".

Simply said, the organizing principle of bureaucracy is the hierarchy -- span of control; process flow-down; interchangeable people (to an extent) such that no one is indispensable -- everyone ages out!

But effective management more or less ignores the organization chart. Effective management reaches and influences any and all who can make a difference.

Hierarchy is way too restrictive for effective management; it promotes stovepipes and "we/they" stresses that are not value-additive. And, it's one-dimensional: command and control. Networks, on the other hand:
  • Are multidimensional: cultural, financial, personal, professional, political (and, all of the above)
  • Are fluid with circumstances
  • Respond to multiple cultures -- virtual, domestic, foreign, business
  • Convey power in multiple ways: real power (hierarchy); associative power (who you know, and how well you know them); cultural or personality power
  • Facilitate promotion: way more people get to know you  
"They" say: if you can't effectively network, you can't expect to be influential and consequential. "They" are probably right
 



Thursday, February 8, 2018

Accuracy v Fairness



Do you use a lot of data in your project? Most everyone does.
Things to keep in mind:
  • All data has some bias(es) built in
  • Just because it's objectively accurate doesn't mean it's acceptably fair
Bias in data
There are library shelves full (or virtually full at library websites) of papers explaining biases that get built into data, even if unwittingly. Kahneman and Tversky probably have some of the best reading on this subject. Factors:
  • (Planning) Experiment design flaws
  • (Collection) Measurement and counting method error sources
  • (Collection) Expectations influences on observations
  • (Analysis) Statistical errors and data selection errors (bias)
  • (Interpretation) Framing to make a point
Accuracy v Fairness
You probably ran into this in school: grading on the curve. Grades were objectively one thing, but in fairness--we are told--grades posted were something else.
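To put a number on it, here's a minimal sketch (hypothetical scores, Python) of what "grading on the curve" does: the raw scores are the objectively accurate data; the posted grades are shifted so the class average lands where policy says it should.

```python
raw_scores = [52, 61, 68, 74, 88]   # the objectively accurate measurements
target_mean = 75                     # where policy/management wants the average to land

raw_mean = sum(raw_scores) / len(raw_scores)   # 68.6, what the data actually says
curve = target_mean - raw_mean                 # a simple uniform shift; other curves rescale instead
posted_grades = [min(100, s + curve) for s in raw_scores]

print(round(raw_mean, 1))   # accurate
print(posted_grades)        # "fair": same students, different numbers
```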

What's fair isn't what's accurate, necessarily
Fairness is about impact--what the downstream causation and consequences are for having applied data results to an issue. When people are in the loop, it's all the harder: race, class, win-lose, animosities.

In the project office, the first instinct is (or should be) objectivity.


Risk management
But then objectivity meets management goals (*), and before you know it, we're back to "grading on the curve". If "fair" and "accurate" are not the same, then there is a gap. And, what is the nature of this gap? Risk! And, who is the risk manager? The PMO, naturally

(*) Political, humanitarian, economic, social justice, or competitive factors enter the picture.



Monday, February 5, 2018

Bitcoin project budgets


Cryptocurrencies are coming to a project near you! Indeed, you could be first on your block to budget in a cryptocurrency, of which there are now hundreds according to a study by NIST.

It's all magic you say? You're in good company: Arthur C. Clarke once wrote, “Any sufficiently advanced technology is indistinguishable from magic.”

Now, I've worked on projects with foreign currency funding; that puts the company's treasurer on the project team. The treasurer works the currency hedges and other exchange-related stuff.

Would a project paid for in Bitcoin be different? Yes, and no. Yes, the volatility of cryptocurrency values adds a degree of risk not usually on the risk matrix. No, currency-hedged projects are not that unusual. We've done it before.

What is money?
This question doesn't usually arise in project management circles, but hey! It's a new monetary world. 
Most of the world's money today, some say 90%, is virtual: no paper, no coinage, no gold or other tangible back-up. Virtual money that is one of the five world exchange currencies has properties that transfer well to cryptocurrency:
  • Conveniently portable, more so than gold bullion for sure
  • Persistent: it never wears out and can't be destroyed the way coinage and paper can (the Canadians have gone to a plastic replacement for paper, but nonetheless....)
  • Value universally understood by comparison to tangibles (we know the value of a Coke the world over)
Ergo: cryptocurrencies can qualify as money

The blockchain
The thing that makes a cryptocurrency work is an underlying technology called a "blockchain". The important takeaway about a blockchain is that it is a secure distributed database with no central authority but with enforced data integrity.

Now, a secure distributed database can be useful for a lot of stuff, and for distributed projects it could be very useful indeed.

Here's what NIST says about what a blockchain is:
Blockchains are immutable digital ledger systems implemented in a distributed fashion (i.e., without a central repository) and usually without a central authority. At their most basic level, they enable a community of users to record transactions in a ledger that is public to that community, such that no transaction can be changed once published.

A blockchain is essentially a decentralized ledger [database] that maintains transaction records on many computers simultaneously. Once a group, or block, of records is entered into the ledger, the block’s information is connected mathematically to other blocks, forming a chain of records.

Because of this mathematical relationship, the information in a particular block cannot be altered without changing all subsequent blocks in the chain and creating a discrepancy that other record-keepers in the network would immediately notice.

In this way, blockchain technology produces a dependable ledger without requiring record-keepers to know or trust one another, which eliminates the dangers that come with data being kept in a central location by a single owner.
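To make the "mathematical relationship" NIST describes a bit more concrete, here's a minimal hash-chained ledger sketch (illustrative Python only, not any particular cryptocurrency; a real blockchain adds consensus, signatures, and much more). Each block carries the hash of the previous block, so altering any block breaks the hashes downstream.

```python
import hashlib
import json

def block_hash(block):
    # Hash the block's contents, including the previous block's hash
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain, transactions):
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"prev_hash": prev, "transactions": transactions}
    block["hash"] = block_hash({"prev_hash": prev, "transactions": transactions})
    chain.append(block)
    return chain

def verify(chain):
    # Recompute every hash; any tampering shows up as a mismatch
    for i, block in enumerate(chain):
        expected_prev = chain[i - 1]["hash"] if i > 0 else "0" * 64
        recomputed = block_hash({"prev_hash": block["prev_hash"],
                                 "transactions": block["transactions"]})
        if block["prev_hash"] != expected_prev or block["hash"] != recomputed:
            return False
    return True

chain = []
append_block(chain, ["Alice pays Bob 5"])
append_block(chain, ["Bob pays Carol 2"])
print(verify(chain))                                # True
chain[0]["transactions"] = ["Alice pays Bob 500"]   # tamper with history
print(verify(chain))                                # False: the other record-keepers notice
```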



Friday, February 2, 2018

Collingridge's dilemma



A simple but profound observation:
When change is easy, the need for it cannot be foreseen; when the need for change is apparent, change has become expensive, difficult and time consuming.
David Collingridge
"The Control of Technology"



Tuesday, January 30, 2018

My backlog is blocked!



Yikes! My backlog is blocked! How can this be? We're agile... or maybe we've become de-agiled. Can that happen?

Ah yes, we're agile, but perhaps not everything in the portfolio is agile; indeed, perhaps not everything in the project is agile.

In the event, coupling is the culprit.

Coupling? Coupling is system engineering speak for transferring one effect onto another, or causing an effect by some process or outcome elsewhere. The coupling can be loose or tight.
  • Loose coupling: there is some effect transference, but not a lot. Think of double-pane windows decoupling the exterior environment from the interior
  • Tight coupling: there is almost complete transference of one effect onto another. Think of how a cyclist puts (couples) energy into moving the chain; almost none is lost flexing the frame.

In the PM domain, it's the coupling of dependencies: we tend to speak of strong or weak dependencies, corresponding roughly to tight or loose coupling.

The most common remedy is to buffer between effects; the effect has to carry across the buffer. See Goldratt's Critical Chain method for more about decoupling with buffers.

But buffers may not do the trick. We need to think of objects, temporary or permanent, that can loosen the coupling from one backlog to another (agile-on-agile), or from the agile backlog to structured requirements (agile-on-traditional).

With loose coupling, we get the window-pane effect: stuff can go on in environment A without strongly influencing environment B; this is sort of an "us vs. them" approach, some might say stovepiping.

Obviously, then, there are some risks with loose coupling in the architecture that bear against the opportunity to keep the backlog moving. To wit: we want to maintain pretty tight coupling on communication among project teams while, at the same time, loosening the coupling between their deliverables.

There are two approaches:
  • Invent a temporary object to be a surrogate or stand-in for the partner project/process/object. In other words, we 'stub out' the effect into a temporary effect absorber.
  • Invent a service object (like a window pane) to provide the 'services' to get from one environment to another.
Of course, you might recognize the second approach as a middle layer, or the service layer of a service-oriented architecture (SOA), or just an active interface that does transformation and processing (coupling) from one object/process to another.
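Here's a minimal sketch of both approaches (illustrative names only, Python): a temporary stub that stands in for the partner team's unfinished deliverable, and a service/adapter object that translates between the two environments.

```python
class PricingService:
    """The interface both teams code against (the 'window pane')."""
    def quote(self, item_id: str) -> float:
        raise NotImplementedError

class PricingStub(PricingService):
    """Approach 1: a stand-in that absorbs the dependency until the real thing ships."""
    def quote(self, item_id: str) -> float:
        return 0.0   # canned answer; good enough to keep our backlog moving

class LegacyPricingAdapter(PricingService):
    """Approach 2: a service object that adapts the other team's (traditional) system."""
    def __init__(self, legacy_system):
        self.legacy_system = legacy_system
    def quote(self, item_id: str) -> float:
        # transformation/processing lives here, not in our backlog items
        return float(self.legacy_system.lookup_price(item_id)) / 100.0

def total_order_cost(items, pricing: PricingService) -> float:
    # Our team's work depends only on the interface, not on the partner's schedule
    return sum(pricing.quote(i) for i in items)
```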

With all this, you might see the advantages of an architect on the agile team!



Saturday, January 27, 2018

GIGO ... or quality in; quality out


Garbage in; garbage out... GIGO to the rest of us
Re GIGO: the issue is raised frequently when I talk about Monte Carlo simulation, and the concern is not without merit. These arguments are made:
  • You have no knowledge of what distribution applies to the uncertainties in the project (True)
  • You are really guessing about the limits of the three-point estimates which drive the simulation (Partly true)
  • Ergo: poor data/information in; poor data/information out (Not exactly! The devil is in the details)
Here are a few points to consider (file under: Lies, damn lies, and statistics):

Who cares about the distribution?
First: There's no material consequence to the choice of distribution for the random numbers (uncertain estimates) that go into the simulation. As a matter of fact, for the purposes of PM, the choices can be different among tasks in the same simulation.
  • Some analysts choose the distribution for tasks on the critical path differently than tasks for paths not critical.
  • Of course, one of the strengths of the simulation is that most scheduling simulation tools identify the 'next most probable critical path' so that the PM can see which path might become critical.
Re why the choice is immaterial: it can be demonstrated -- by simulation and by calculus -- that for a whole class of distributions, the sum of a large number of them tends, in the limit, to a Normal distribution.

  • X_sum = X_1 + X_2 + X_3 + .... + X_N; for large N, X_sum is approximately Normal, no matter what the distribution of the individual X's is
As in "all roads lead to Rome", so it is in statistics: all distributions eventually lead to Normal (at least those that are practical for this purpose: single mode and defined over their entire range [no singularities]).

To be Normal
To be Normal, or Normal-like, means that the probability (more correctly, the probability density function, pdf) has an exponential form. See: Normal distribution in Wikipedia.

We've seen this movie before in this blog. For example, take a few uniform distributions (not Normal in any respect): when summed, the result took on a very Normal appearance. And it was more than an appearance; the underlying functional mathematics also became exponential in the limit.
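A quick simulation makes the point (illustrative figures, Python): sum enough decidedly non-Normal task estimates and the total looks Normal.

```python
import random
import statistics

def project_total(n_tasks=30):
    # each task estimate is Uniform(5, 15), deliberately not Normal
    return sum(random.uniform(5, 15) for _ in range(n_tasks))

totals = [project_total() for _ in range(10_000)]

mean = statistics.mean(totals)    # about 300 (30 tasks x 10 average)
stdev = statistics.stdev(totals)  # about 15.8

# Roughly 68% of the simulated totals fall within one sigma of the mean,
# which is the signature of a Normal-looking result
within_one_sigma = sum(abs(t - mean) <= stdev for t in totals) / len(totals)
print(round(mean, 1), round(stdev, 1), round(within_one_sigma, 2))
```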

Recall what you are simulating: you are simulating the sum of a lot of budgets from work packages, or you are simulating the sum of a lot of task durations. Therefore, the sim result is a summation, and the summation is an uncertain number because every element in the sum is, itself, uncertain.

All uncertainty is distributed... no point solution
All uncertain numbers have distributions. However, the distribution of the sum need not be the same as the distribution of the underlying numbers in the sum; in fact, it almost never is. (Exception: the sum of Normals is itself Normal.) Thus, it really does not matter what distribution is assumed; most sim tools just default to the Triangular and press on.
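Here's a minimal sketch of that kind of roll-up (assumed three-point figures, Python), using the Triangular default: simulate each work package, sum the trial, and read off the percentiles of interest.

```python
import random

# (optimistic, most likely, pessimistic) cost for each work package; assumed figures
work_packages = [(8, 10, 15), (20, 25, 40), (4, 5, 9), (12, 15, 22)]

def one_trial():
    # random.triangular takes (low, high, mode)
    return sum(random.triangular(low, high, mode) for (low, mode, high) in work_packages)

trials = sorted(one_trial() for _ in range(10_000))

p50 = trials[len(trials) // 2]        # median project cost
p80 = trials[int(len(trials) * 0.8)]  # a common budgeting confidence level
print(round(p50, 1), round(p80, 1))
```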

And, the sim also tends to discount the GIGO (garbage in/garbage out) problem. A few bad estimates are likewise immaterial at the project level; they are highly discounted by their low probability. They fatten the tails a bit, but project management is a one-sigma profession. We largely don't care about the tails beyond that, certainly not six sigma!

Second: most folks, when asked to give a 3-point estimate, simply take the 1-pointer they intended to use and put a small range around it. They usually resist giving up anything, so the most optimistic is usually just a small bit more optimistic than the 1-pointer; and then they put something out there for the most pessimistic, usually not giving it a lot of thought.

When challenged, they usually move the most-likely 1-pointer a bit to the pessimistic side, still not wanting to give up anything (prospect theory at work here). And they are usually reluctant to be very pessimistic, since that calls into question the 1-pointer (anchoring bias at work here). Consequently, you get two big biases working toward a more optimistic outcome than should be expected.

Bias can be overcome
Third: with a little coaching, most of the bias can be overcome. There is no real hazard in a few WAGs because, unlike an average of values, the small probability at the tails tends to highly discount the contribution of the WAGs. What's more important than reeling in a few wild WAGs is getting the 1-pointer better positioned. This not only helps the MC sim but also helps any non-statistical estimate as well.

Bottom line: the garbage, if at the tails, doesn't count for much; the most likely value, if a wrong estimate, hurts every methodology, statistical or not.

