Sunday, August 20, 2017

Average is not always the answer

Don't all data sets have an average?
Actually, no.
  • Yes, data taken from a common distribution along a common scale has a parametric value* that emerges from a long run of repetitions

    The repetition-weighted sum is what we commonly call the average, even if it's not the most frequent outcome, and even if it's not an allowable outcome (the average of integers is often not an integer, as the average value of the roll of one die is 3.5)
  • Data that clusters about a central value always has an average. This clustering idea we call the "central limit theorem"
Here are a couple of issues:
  • No, not every data set we run across in PM is from a common distribution or has a common scale, and so the concept of average doesn't apply. Example: big projects and little projects don't have a common scale: big numbers for the former; small numbers for the latter
  • Not every data set clusters
Three ideas you may not have thought about:
  1. Clustering: When there's clustering, there will be an average. Usually, data taken from a common scale clusters. You can't average feet and inches, nor apples and oranges. Common scale required!
  2. Not clustering: Data taken from many different scales doesn't cluster, but rather follows a "power law". This gives rise to the so-called "80/20" rule and other ideas commonly shown on a Pareto Chart of descending values ... so, no cluster effect, and NO AVERAGE.

    Almost all issues dealing with money are power-law stuff.
    There's just no meaningful average of big projects (on one scale) with little projects (on another scale). The 80/20 rule is more the way to look at it since a Pareto Chart is scale-free!
  3. Independent events: Something happens; when will it happen again? You're doing something; when will it be finished? This isn't clustering, and it's not 80/20 stuff. You read this page; when will you read it again? Or, how long will it take you to finish reading?

    This leads to the so-called "additive rule": just add a constant to what you know to get the next outcome. Where does the constant come from? You estimate it! This leads to: "just give me five more minutes to get this done" sort of thing.

    Of course, there's no average because there's no common scale because everything is independent, memory-less (the present does not depend on the past, though the past may figure into your experience to give an estimate of the constant).
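To make the three ideas concrete, here's a minimal Python sketch. The specific distributions (normal, Pareto, exponential) are my illustrative assumptions, not from the source: clustered data has a mean that is "typical", power-law data leaves most points well below its mean, and memoryless waits keep the same expected remaining time no matter how long you've already waited.

```python
import random

random.seed(42)
N = 100_000

# 1. Clustered data (one common scale): the mean is meaningful.
clustered = [random.gauss(100, 10) for _ in range(N)]

# 2. Power-law data (many scales mixed, like big and little projects):
# a few huge values drag the mean up, so most points sit below it.
power_law = [random.paretovariate(1.2) for _ in range(N)]

def frac_below_mean(data):
    m = sum(data) / len(data)
    return sum(x < m for x in data) / len(data)

below_clustered = frac_below_mean(clustered)   # roughly half: mean is "typical"
below_power = frac_below_mean(power_law)       # large majority: mean misleads

# 3. Memoryless events (the additive rule): having already waited 5 units,
# the expected REMAINING wait is still the same constant (~10 here),
# just as if you were starting fresh.
waits = [random.expovariate(1 / 10) for _ in range(N)]
remaining = [w - 5 for w in waits if w > 5]
mean_remaining = sum(remaining) / len(remaining)
```

With this (arbitrary) seed, roughly half the clustered points fall below their mean, the large majority of the power-law points fall below theirs, and the mean remaining wait stays near 10 even after 5 units have already passed.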
* Parametric values, like the average, are calculated; often they are not observed or even realizable in the data set
This discussion is adapted from the chapter on Bayes's Rule in "Algorithms to Live By" by Christian and Griffiths

Read in the library at Square Peg Consulting about these books I've written
Buy them at any online book retailer!
Read my contribution to the Flashblog

Thursday, August 17, 2017

The Lord Beaverbrook model

Lord Beaverbrook, a genius at getting things done within a big bureaucracy under the extreme pressure of war, had this philosophy:
Organization is the enemy of improvisation
It is a long jump from knowing to doing
Committees take the punch out of .....  [fill in the blank]

Lord Beaverbrook was a close advisor and confidant of Winston Churchill during WW II. His genius: making processes work. His first job was making the production of aircraft stupendous enough to win the "Battle of Britain" in the air in 1940.


Monday, August 14, 2017

Everyone estimates!

We all make estimates; and we all make estimates all the time.
  • When I make the 20-mile trip each Tuesday to a client site, I estimate adjustments to a baseline based on weather, road construction, and any accidents I know about.
Really, no one sets out to do anything meaningful without some estimate in mind re time, or cost, or risk; usually we can also notionally estimate the scope.

So, now we hear from renowned Agilist Mike Cohn, author of "Agile Estimating and Planning", about his views on estimating new scope:

Rule 1: Estimating is not a vote. Well, actually, there's a lot of voting in estimating as a group agrees on a consensus, so I don't agree with Mike on this one, but he's welcome to his idea.
Rule 2: Most estimating is relative, and so most people know notionally whether something is bigger or smaller, or more or less complex, than an understood baseline
Rule 3: Everyone participates, even if you don't know what you're talking about. Seems curious, but Mike's idea is that most estimating is relative (see above), so even the uninformed can make some contribution to the discussion. My take: perhaps, but the uninformed often don't have "permission" to be in the decision-making.
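Rule 2's relative sizing can be sketched in a few lines of Python. The story names and point values here are hypothetical examples, not Cohn's:

```python
# Relative estimation: size new work against an understood baseline.
# The baseline story and its point value are hypothetical.
baseline_points = {"login form": 3}  # a story the team already knows well

def relative_estimate(anchor_points, ratio):
    """Scale a known baseline by a notional bigger/smaller ratio."""
    return round(anchor_points * ratio)

# "About twice as complex as the login form" -> 6 points
new_story = relative_estimate(baseline_points["login form"], 2.0)

# "Maybe half the size of the login form" -> 2 points (rounded)
small_story = relative_estimate(baseline_points["login form"], 0.5)
```

The point: nobody needs an absolute scale, only a shared anchor and a notional ratio, which is why even relatively uninformed participants can contribute.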

Anyway, to end with the beginning in mind: We all make estimates; and we all make estimates all the time.


Friday, August 11, 2017

Auteur model of innovation

Good grief!  Yet another model to learn about.  Now we hear about the auteur model of innovation, an idea coined by John Kao, an innovation guru.

Auteur: an innovator that has a distinctly personal style and maintains artistic control over all design and production aspects of his/her work [a definition adapted for projects from the film industry where the auteur model has been practiced for many years].

Sounds a bit autocratic! Whatever happened to the empowered team and, even more broadly, the wisdom of the crowds? What happened to the embedded customer model -- think: Scrum -- and other customer-intimacy models, like the market-leader ideas of Treacy and Wiersema?

Indeed! Where is systems engineering in this model?
In our world, the project world, and especially the technology project world where anyone can have an idea, the auteur model is a rare occurrence; a successful implementation, sustained over time, is even rarer. But take a look at the poster child for the auteur model: today that is Apple.

Without its innovative leader, the company foundered; with its bold visionary, the company prospered. And, in a decidedly not-agile way of doing things, it is said Apple never asks its customers anything.  They sure did not ask about a name for the iPad!
Obviously, the dependency on one personality is Risk 1 for this model -- both an upside opportunity and a downside risk. For the successful, it's evident opportunity triumphs! Obviously, Tim Cook is having a good run post Steve Jobs, but I think everyone would agree Jobs set the bar. But it's evident that Apple, and others with this model, go a step or two further.
  • Step 1: make  your own market where one didn't exist;
  • Step 2: exploit that market with the simplest possible product that has the highest possible wow! factor. 
  • And, step 3: guard the gate!  Be selective about letting others ride the coattails of your own auteurist [is that a word?]
Mr. Kao writes: "Mr. Jobs is undeniably a gifted marketer and showman, but he is also a skilled listener to the technology. He calls this “tracking vectors in technology over time,” to judge when an intriguing innovation is ready for the marketplace."

Skilled listener? Perhaps so. One of the mantras of the agile movement today is to place value on simplicity. Apple has certainly gotten that. Jobs told us that the floppy was dead 18 months before it was indeed dead; and now Cook has done away with the headphone jack (gasp!)

The best possible example of the wisdom of simplicity. Not that the things Apple builds are simple; they are simply minimally complex!


Tuesday, August 8, 2017

Dilbert on Agile

Among some comments to a posting on TechCrunchIT is this snippet about Dilbert's run-in with Agile:

Dilbert: We need 3 more programmers.
Use agile programming methods.
Agile programming does not mean doing more work with less people.
Find me some words that do mean that and ask again

Dilbert is a creation of Scott Adams


Saturday, August 5, 2017

When to stop?

You're doing something. Fair enough
What if there's no obvious end-point?
  • Like finding the "just-right" SME for the team ... you never know when that is going to happen
  • When should you stop looking at candidates?
Actually, there's a computer science answer to this.
Look no farther than the "optimal stopping" algorithm!

As explained by authors Christian and Griffiths*, there are two ways to make a mistake trying to find the "best" SME, assuming you are interviewing candidates in random order:
  1. Stop too soon, hiring "suboptimally", missing out on the best candidate you never get to interview
  2. Stop too late, passing up the pick of the litter hoping there is an even better candidate
Obviously, the first person you interview is the "best" so far ... there's only one. The next one you interview has a 50/50 chance of being the best so far; the third person has a one-in-three chance, etc. But how long should this go on?

After a lot of research, and numerous academic papers, the computer science community has arrived at an algorithm (of course they have): "Look, then leap". 

Look, then leap
Step 1: set a fixed time to look over the field. Don't choose anyone (or anything) in the "look" phase
Step 2: after the look phase, leap! Leap on the best candidate that then comes along, or go back to a "best" in the look phase if they are still available.

What's the chance that you'll get the "best" candidate? About one chance in three ... actually, in the long run, 37%. Here's an abbreviated chart of the odds, as derived for the optimal stopping algorithm:
  • 3 candidates; take the best after 1; 50% chance of getting the best
  • 6 candidates; take the best after 2; 43% chance of the best
  • 10 candidates; take the best after 3; 40% chance of the best
  • Huge number; take the best after 37% of huge; 37% chance of getting the best
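The chart above can be checked with a quick Monte Carlo simulation in Python. This is a sketch under the standard "secretary problem" assumptions (random order, no going back to rejected candidates), so it's slightly stricter than the "go back if still available" variant described above:

```python
import random

def look_then_leap(n, cutoff, trials=100_000, seed=1):
    """Estimate the chance of hiring the single best of n candidates:
    reject the first `cutoff` outright (the look phase), then take the
    first later candidate who beats everyone seen so far (the leap)."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        ranks = list(range(n))   # n - 1 marks the best candidate
        rng.shuffle(ranks)
        best_seen = max(ranks[:cutoff])
        pick = ranks[-1]         # forced to take the last one otherwise
        for r in ranks[cutoff:]:
            if r > best_seen:
                pick = r
                break
        wins += (pick == n - 1)
    return wins / trials
```

Running it, look_then_leap(3, 1) lands near 0.50 and look_then_leap(10, 3) near 0.40, matching the chart.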
Actually, look before leaping is not bad advice for a whole host of activities!
* "Algorithms to Live By", Brian Christian and Tom Griffiths


Wednesday, August 2, 2017

A commentary on Agile

I was looking back at some prior essays on Agile and came across the April 2012 PMNetwork magazine. Specifically I was attracted (again) to page 58 for an interview with some agilists on the state of the practice.

Here are a couple of quotes from Jim Highsmith worth tucking away:

Agile project management embraces both “doing” agile and “being” agile—and the latter is the hardest. It defines a different management style: one of facilitation, collaboration, goal- and boundary-setting, and flexibility.

... agile is changing the way organizations measure success, moving from the traditional iron triangle of scope, schedule and cost to an agile triangle of value, quality and constraints.

My take:
Doing agile and being agile: Good insight, but these ideas, while certainly agile, are not unique to Agile.
  • To my way of thinking, all enlightened project managers have been doing this all along, or they should have been.
Now, I certainly agree: Agile calls for a reset of managers' and management's approach (aka style) to projects.
  • Fixed price, for a fixed scope, in a fixed schedule is ok if you're in "we've done it before" or some kind of production, but not if you are trying to cure cancer, etc.

Value shift: Agile shifts the discussion from fixed value to best value. And, what is best value?
It's the best the team can do, with the resources committed, towards achieving project goals that will ultimately lead to business success.

Who says what's "best"? In the Agile space, that is a collaboration of the project team, the sponsor, and whoever holds the customer/user's proxy. That's the key:
  • The customer/user--through their proxy--gets an input to the value proposition because they may use or buy the outcomes, but the customer/user has no money at stake; it's other people's money, OPM
  • The sponsor also gets an input  because it is their money at stake. (The sponsor may be a contracting office, as in the public sector)
  • The project team gets an input because they are in the best place to judge feasibility.

Measuring success: Highsmith's second idea is certainly Agile, but it may be too agile for some. Why so? First, there's still "other people's money" (OPM) ... you can't work with OPM and not have metrics of performance to stack up against the money. So, the cost-schedule-scope tension may be hard to manage, but at least there are metrics.

I don't have a problem with another paradigm, say Highsmith's value, quality and constraints, so long as they come with metrics that align with the value that sponsors put on money.

That's why I associate myself with best value: It's OPM with metrics that align with a value proposition leading not only to project success but to business success as well.
