Showing posts with label risk decision.

Tuesday, November 1, 2022

NSA warns about Taiwan tensions


Does your project depend on supplies from Taiwan?
Is your project doing project work in Taiwan?

According to national security officials, you should be thinking about how to be anti-fragile (*) if the balloon goes up.

NSA Director of Cybersecurity Rob Joyce has some critical lessons on how companies can withstand an escalation in China-Taiwan tensions, and why such conflicts matter in the first place.

"We had advance warning of the Russia invasion" of Ukraine, said Joyce during a keynote at Mandiant's mWISE security conference today. "What would you do if tomorrow you got advanced warning of a China-Taiwan conflict? What business decisions would you have to make?"
_________________

(*) Anti-fragile: able to sustain and survive shocks to your plan without catastrophic failure of the overall project





Saturday, March 19, 2022

Visionary's approach to Risk Management


Leap!
And expect a parachute to follow

From the Netflix series "Inventing Anna"

Actually, in the book "The Network" by Scott Woolley -- a pseudo-biography of Edwin Armstrong (inventor of the regenerative vacuum-tube amplifier and FM radio, among many other electronics inventions ... 42 patents and the IEEE Medal of Honor) and David Sarnoff (longtime head of RCA and founder of NBC) -- there are many "Leap!" projects described, including but not limited to:

  • AM and FM modulation, commercial radios and radio networks (NBC)
  • Transoceanic radio telegraphy
  • Television, color television, and television networks
  • Communication satellites (RCA)
These guys were amazing!




Tuesday, November 17, 2020

Feeling the pressure


" [The concern my some is] that with all good intentions, some project managers might start cutting corners. It's easily done. Don't be fooled by the trappings. ....

This sort of success comes with a lot of pressure. There are deadlines, penalties, [finances], and [executive changes]. [PMs] are stuck in the middle. Priorities can become murky.

It would be natural for some to feel the pressure and choose speed over quality" 
Louise Penny, novelist

I'm sure you can tell from the brackets that I took the excerpt from one of Ms Penny's crime novels, but nonetheless her character's words probably ring all too true to many readers. One wonders if she is writing about the falsified engineering found in the diesel car industry or the calamitous decision-making in the aviation industry of late.

Risk assessment and confirmation bias

I put it down to executive-level risk assessments. Looking the other way, or deliberately hiding problems, is always the path to trouble. There is a political adage that might apply: the coverup is always worse than the underlying transgression.

Even if that is understood, the pressure of the moment is often telling. One sign: the stressed PM is looking everywhere for confirmation .... and making themselves susceptible to confirmation bias. It is likely they will hear what they want to hear.

It's like a bad email

Most people handle email (and social media) poorly, sending email (or media) when they are mad or when they think no one else will find out. Never make an important decision when mad, and always assume what you write will appear on the front page.

The same goes for high-risk assessments. Unless life itself hangs in the balance, there is time to consider the consequences more thoughtfully. Take that time.





Saturday, October 10, 2020

Be Bold; be brave; be calculating



I would hardly think today of making my first flight on a strange machine in a 27-mile wind . . .

I look with amazement upon our audacity in attempting flights with a new and untried machine under such circumstances.

Yet faith in our calculations and the design of the first machine, based upon our tables of air pressures, secured by months of careful laboratory work, and confidence in our system of control … had convinced us that the machine was capable of lifting and maintaining itself in the air

— Orville Wright, from “How We Made the First Flight” (*)

I hope you were able to read the last sentence, as long as it is -- my editor would have been apocalyptic!

So, what have we got here with O. Wright's statement that can inform project management?

He begins with audacity! Audacity: "a willingness to take bold risks"
To be audacious! Audacity is a risk attitude that is at first personal, but then infects the whole project culture.

But not recklessly bold risks. Audacious is one thing; willful recklessness is entirely different.

Then comes the skill and science

So then comes the science, the engineering, and the risk management to leaven the audacity. After all, as we learn from author Jo Nesbo, "Someone will no doubt come up with an opinion with the benefit of hindsight, but we prefer to be wise in foresight".

In this case, wisdom in foresight requires:

  • Calculations and tables of metrics
  • Careful laboratory work
  • Confidence in the system engineering
  • Measurable goal: capable of lifting and maintaining itself in the air

And what does the world get with this elixir of bold thinking, careful consideration of risk, and skill-and-science?

  • Heavy machines that fly
  • Semiconductors of near atomic size
  • Electric, hydrogen, and possibly other propulsion technologies that upend the transportation industry
  • Social media
  • Private space travel
  • And all the other stuff yet to be envisioned!


----------
(*) Quotation courtesy of Herding Cats




 




Sunday, July 1, 2018

If you only know one thing about Risk Management .....


If you only know one thing about Risk Management, know this:
Schedule slack is your most powerful tool
Poorly developed instincts and skills in the use of this most powerful tool are leading causes of poor risk management

If you are a Systems person -- a strategic thinker; an integrator; an "it all has to work" person -- you'll translate schedule slack into "loose coupling"

Loose coupling is your most powerful tool
This all sounds like a scheduling matter, but the side effects of missing or misused slack are profound (slack is like a nail; it works everywhere):
  • Time to effect design, manufacturing, or delivery mitigation is lost
  • Pressures mount to "do something"
  • Short-cuts are taken
  • The thing may not work at the end (oops, that's career limiting)
And the list of rules for avoiding slack misuse is relatively short, so everyone should be able to keep these bullets in mind:
  •  It shall be: All schedules require slack; a schedule without slack is but a hope, and is risky all the way
  • At the end: Slack is always sequenced after a risky event is to occur. NEVER put the slack first, hoping it will all go away
  • Don't add risk unwittingly: Unnecessary coupling (to wit: bundling) just adds risk where there was none. Decouple everything; don't purposely couple anything. 
If you bundle (tightly couple), the statistics are against you:
  • If two independent events each have a 9-in-10 chance of success, then when tightly coupled with no slack between them, success of the pair is only about 8-in-10 (0.9 × 0.9 = 0.81), a loss of about 10 points
  • It gets worse fast: a pair with a 7-in-10 chance of success degrades to less than 1-in-2 (0.7 × 0.7 = 0.49), a loss of about 20 points. Good grief!
The effect of slack? NO loss of points .... a cheap way to fight risk (see the sketch below)
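To see the arithmetic behind those bullets, here is a minimal sketch (in Python, with the same illustrative probabilities): joint success of tightly coupled, slack-free events is just the product of the individual probabilities.

def coupled_success(*probabilities):
    """Probability that every tightly coupled, slack-free event succeeds."""
    result = 1.0
    for p in probabilities:
        result *= p
    return result

print(coupled_success(0.9, 0.9))   # 0.81 -- roughly 8-in-10
print(coupled_success(0.7, 0.7))   # 0.49 -- less than 1-in-2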
 
 



Sunday, June 24, 2018

I don't want to lose what I've got



Don't want to slip backward, take a demotion, accept less pay, or risk your savings? Is the answer the same if there is a prospect of doing a whole lot better if you'll just take a risk?

Prospect theory may be for you!

Daniel Kahneman and Amos Tversky may be a project manager's best friends when it comes to understanding decision making under conditions of risk. 

Of course, they've written a lot of good stuff over the years ..... my favorite is "Judgment under Uncertainty: Heuristics and Biases". You can find more about this paper in a posting about the key points at HerdingCats.

The original prospect thinking
Tversky and Kahneman are the original thinkers behind prospect theory. Their 1979 paper in Econometrica is perhaps the best original document; it's entitled "Prospect Theory: An Analysis of Decision under Risk". It's worth a read [about 28 pages] to see how it fits project management.

What's a prospect?  What's the theory?
 A prospect is an opportunity--or possibility--to gain or lose something, that something usually measured in monetary terms.

Prospect theory addresses decision making when there is a choice between multiple prospects, and you have to choose one.

A prospect can be a probabilistic chance outcome, like the roll of dice, where there is no memory from one roll to the next. Or it can be a probabilistic outcome where there is context and other influences, or it can be a choice to accept a sure thing. 

A prospect choice can be between something deterministic and something probabilistic.

The big idea
So, here's the big idea: The theory predicts that for certain common conditions or combinations of choice, there will be violations of rational decision rules

Rational decision rules are those that say "decide according to the most advantageous expected value [or the expected utility value]".  In other words, decide in favor of the maximum advantage [usually money] that is statistically predicted.

Violations driven by bias:
Prospect theory postulates that violations are driven by several biases:

  • Fear matters: Decision makers fear losing their current position [if it is not already a loss] more than they value an uncertain opportunity for gain. And decision makers fear a sure loss so much that they will gamble on an uncertain chance of recovery [if it might avoid the sure loss]
  • % matters: Decision makers assign more value to the "relative change in position" than to the "end state of their position"
  • Starting point matters: The so-called "reference point" from which gain or loss is measured is all-important. The reference point can either be the actual present situation, or the situation to which the decision maker aspires. Depending on the reference point, the entire decision might be made differently.
  • Gain can be a loss: Even if a relative loss is an absolute gain, it affects decision making as though it is a loss
  • Small probabilities are ignored: if the probabilities of a gain or a loss are very, very small, they are often ignored in the choice. The choice is made on the opportunity value rather than the expected value.
  • Certainty trumps opportunity: in a choice between a certain payoff and a probabilistic payoff, the bias is for the certain payoff, even if the probabilistic payoff is statistically more generous.
  • Sequence matters: depending upon the order or sequence of a string of choices, even if the statistical outcome is invariant to the sequence, the decision may be made differently.

Quick example
Here's a quick example to get everyone on the same page: The prospect is a choice [a decision] between receiving an amount for certain and taking a chance on receiving a larger amount.

Let's say the amount for certain is $4,500, and the chance is an even bet on getting $10,000 or nothing. The expected value of the bet is $5,000.

In numerous experiments and empirical observations, it's been shown that most people will take the certain payout of $4,500 rather than risking the bet for more.

The Certainty Effect: Tversky and Kahneman call the effect described in the example the "Certainty effect". The probabilistic outcome is underweighted in the decision process; a lesser but certain outcome is given a greater weight.

The Reflection Effect: Now, change the situation from a gain to a loss: In the choice between a certain loss of $4,500 and an even bet on losing $10,000 or nothing, most people will choose the bet, again an expected value violation. In other words, the  preference....certain outcome vs probabilistic outcome...is changed by the circumstance of either holding onto what you have, or avoiding a loss.

These two effects are summarized in their words:

....people underweight outcomes that are merely probable in comparison with outcomes that are obtained with certainty. This tendency, called the certainty effect, contributes to risk aversion in choices involving sure gains and to risk seeking in choices involving sure losses.
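To make the two effects concrete, here is a minimal sketch. The value and probability-weighting functions (and the parameters 0.88, 2.25, 0.61, 0.69) are the commonly cited median estimates from Tversky and Kahneman's later (1992) follow-up work, not figures from this post; the dollar amounts are the ones in the example above.

ALPHA, LAMBDA = 0.88, 2.25            # diminishing sensitivity; loss aversion
GAMMA_GAIN, GAMMA_LOSS = 0.61, 0.69   # probability-weighting parameters

def value(x):
    """Subjective value: concave for gains, convex and steeper for losses."""
    return x ** ALPHA if x >= 0 else -LAMBDA * (-x) ** ALPHA

def weight(p, gamma):
    """Probability weighting: moderate probabilities are underweighted."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

def gamble(outcome, p, gamma):
    """Subjective worth of 'outcome with probability p, else nothing'."""
    return weight(p, gamma) * value(outcome)

# Gain frame: $4,500 for sure vs. an even bet on $10,000 (expected value $5,000)
print(value(4_500), gamble(10_000, 0.5, GAMMA_GAIN))    # the sure thing scores higher
# Loss frame: lose $4,500 for sure vs. an even bet on losing $10,000
print(value(-4_500), gamble(-10_000, 0.5, GAMMA_LOSS))  # the bet hurts less than the sure loss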

Other Effects:  There are two other effects described by prospect theory, but they are for Part II....coming soon!




Sunday, December 3, 2017

Decision triangle and metacognition



Project management, risk management, and other related disciplines are replete with triangles. Here's one more for the collection, taken from an article on "metacognition" by Marvin Cohen and Jared Freeman.

As you can readily see, it's all about decision making under stress [indeed, that's a paraphrase of their paper's title] when information potential [in the form of choices and perhaps disorder] may be at a maximum but data may be incomplete, conflicting, or unreliable.


Their principal conclusion:
Proficient decision makers are recognitionally skilled: that is, they are able to recognize a large number of situations as familiar and to retrieve an appropriate response. Recent research in tactical decision making suggests that proficient decision makers are also metarecognitionally skilled. In novel situations where no familiar pattern fits, proficient decision makers supplement recognition with processes that verify its results and correct problems

Of course, my eye is drawn to the word 'familiar'. In point of fact, there is a decision bias described by Tversky and Kahneman, named by them the "availability bias". In a word, we tend to favor alternatives which are similar to things we can readily bring to mind--that is, things that are readily available in our mind's eye.

Back to Cohen and Freeman:
"More experienced decision makers adopt more sophisticated critiquing strategies. They start by focusing on what is wrong with the current model, especially incompleteness. Attempting to fill in missing arguments typically leads to discovery of other problems (i.e., unreliable arguments or conflicts among arguments)."

Of course, there's the issue of calling the question and getting to convergence--or, in sales: getting to yes! Discovery is good, but it is also the mark of a more experienced decision maker to stay on course and only evaluate new discoveries if they are truly in the path to a decision on the current problem.





Sunday, October 29, 2017

Of facts and theories


"I have no data yet. It is a capital mistake to theorize before one has data. Insensibly, one begins to twist facts to suit theories, instead of theories to suit facts"
Sherlock Holmes
Yes, Mr Holmes has hit upon the dilemma of various reasoning strategies:
  • Inductive reasoning: from a specific observation to a generalization of causation
  • Deductive reasoning: from a generalized theory to a predicted set of specifics
As he correctly posits, inductive reasoning is hazardous. Just a slight error in facts, or in fashioning causation, or most frequently confusing causation with correlation, may lead to quite incorrect theories.

Thus, the strength of Bayes reasoning (*), a form of deductive reasoning. Aren't we all Bayesians?
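For readers who want to see a Bayes update in action, here is a minimal sketch with made-up numbers, purely for illustration: start with a prior belief in a theory, observe a fact, and revise the theory to suit the fact.

prior = 0.50                 # P(theory) before seeing the data
p_fact_if_true = 0.80        # P(observed fact | theory is true)  -- invented
p_fact_if_false = 0.30       # P(observed fact | theory is false) -- invented

evidence = prior * p_fact_if_true + (1 - prior) * p_fact_if_false
posterior = prior * p_fact_if_true / evidence

print(round(posterior, 2))   # 0.73 -- the theory is revised to suit the fact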

-------------------
(*) Look up Bayes' theorem on Wikipedia; it's a means of deducing conditions which are predictive of facts, a form of statistical reasoning.




Monday, September 18, 2017

"I wouldn't do that"



"Doubt is unpleasant, but certainty is absurd"
Voltaire

Interested in quantitative risk management and in understanding risk metrics?
This one may be for you:
Matthew Squair is working on a new book, entitled "I wouldn't do that". He's made a draft of Chapter 1 available, at least for a short time.
Why another book on risk management? The shelves are already full (even if only virtually full). Squair addresses the question. He writes:
" ..... in the face of the uncertainties posed by complex systems and new technologies, can we also achieve an acceptable level of safety. Answering this question both philosophically and practically is the purpose of this book"
Squair's focus as a consultant is risk-safety .... connecting those dots. But he really writes in a larger context. Thus, if you're at all interested in managing with uncertainty, and in self-educating with a very readable explanation of quantitative risk considerations, look through the Chapter 1 sample.

Here's the opening statement:
A nuclear reactor’s defences are overwhelmed by a tsunami, an aircraft manufacturer’s newest aircraft is brought low by design faults while another aircraft’s crew is overwhelmed by a cascading series of failures.

Why do these things happen?

Can we predict such events and protect ourselves from them, or are we destined to suffer like disasters again and again?
Do you actually care? You do! But, maybe not all that much

Squair posits that for there to be risk, there has to be value first. That is, if there is really nothing at stake then there really is not a risk to be considered. It's an interesting idea, really: start thinking first of value, or the value of an outcome, and then posit the risk proposition to that value.

Squair, again:
" ... there is an exposure to a proposition about which we care and about which uncertainty exists. If there is complete certainty then risk does not exist. For risk to be meaningful it also implies that we must value a specific outcome of the proposition."





Tuesday, February 7, 2017

Gambling leadership


Every [leader] is a gambler
Admiral John McCain 
 (the senator's grandfather)
Admiral Halsey's air commander, WW II in the Pacific

And, so what do we make of that idea?*
  • To exert leadership, do we layer a propensity to gamble on top of the usual uncertainty that goes along with projects? Is it additive, or compounding?
  • As a gambler, should we expect certain rules to apply ... as in poker?
  • Should we put aside reserves since "the house always wins ... in the long run"**
  • What if we are naturally risk averse ... can we be an effective leader nonetheless?
  • Are managers just leaders who are not gamblers? (Ouch! if you're a manager)
To the first point: it's situational, but certainly risks compound, though you won't see that in a traditional risk impact-likelihood matrix, which considers each risk independently. Compounding makes risk management in the real situation much more an "adult" manager's task (see the sketch after these points)
To the second point: No! The rules that govern probability examples in textbooks and games in the casino don't really apply to project management. Why? Because in projects, coins aren't fair (see: politics), they have more than two sides, and projects are not stationary over long periods (see: stuff happens!)
To the third point: "It's reserves, stupid!". Of course you need reserves. If you need an explanation, you're really not managing a project, you're hoping
To the fourth point: That's actually hard. Yes, some can adopt a "professional" risk profile different from their natural profile, but it's exhausting and hard to maintain over time
And, to the last point: Not exactly. See the fourth point.
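On the compounding point, here is a minimal sketch with made-up likelihoods, and with the simplifying assumption that the risks are independent (which, as the second point notes, real projects routinely violate): even modest individual probabilities compound quickly.

risks = [0.10, 0.15, 0.20, 0.10]   # hypothetical per-risk probabilities of occurring

p_none_occur = 1.0
for p in risks:
    p_none_occur *= (1 - p)

print(f"Chance at least one risk occurs: {1 - p_none_occur:.0%}")   # about 45%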


*Admiral Halsey, as a WWII Pacific fleet commander, was perhaps the greatest naval gambler America had -- he even gambled with typhoons; and, he lost his share of bets
**The Las Vegas strip is financed by losers



Thursday, December 1, 2016

Khan Academy on game theory



Chapter 12 of my book, "Maximizing Project Value", posits game theory as a tool useful to project managers who are faced with trying to outwit or predict other parties vying for the same opportunity.

When John von Neumann first conceived game theory, he was out to solve zero-sum games in warfare: I win; you lose. But one of his students challenged him to "solve" a game which is not zero-sum. To wit: there can be a sub-optimum outcome that is more likely than an outright win or loss.

For the most part, this search for compromise, or for some outcome that is not a complete loss, runs throughout the business world, the public sector (except, perhaps, elective politics), and is certainly the situation in most project offices.

The classic explanation for game theory is the "prisoner's dilemma" in which two prisoners, both arrested for suspected participation in alleged crimes, are pitted against each other for confessions.

The decision space is set up with each "player" unable to communicate with the other. Thus, each player has his/her own self interest in mind, but also has some estimate of how the other player will react. The decision space then becomes something like this:
  1. If only you confess, you'll get a very light sentence for cooperating
  2. If you don't confess but the other guy does, and you're found guilty, you'll get a harsher sentence
  3. If both of you confess, the sentence will be harsher than if only you had confessed, but less harsh than if you alone had held out while the other guy confessed
  4. If neither of you confesses -- risking, in effect, trust that the other guy will not sell you out -- you and the other prisoner might both go with a fourth option: confess to a different but lesser crime with a certain light sentence.

From there, we learn about the Nash Equilibrium which posits that in such adversarial situations, the two parties often reach a stable but sub-optimum outcome.

In our situation with the prisoners, option 4 is optimum -- a guaranteed light sentence for both -- but it's not stable. As soon as you get wind of the other guy going for option 4, you can jump to option 1 and get the advantage of an even lighter sentence.

Option 3 is actually stable -- meaning there's no advantage to go to any other option -- but it's less optimum than the unstable option 4.
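Here is a minimal sketch that checks those stability claims, using hypothetical sentence lengths (years in jail, lower is better) that fit the four options above; the numbers are mine, chosen only to match the ordering described.

# Strategies: 'confess' or 'hold out'.  payoff[(mine, theirs)] = (my years, their years)
payoff = {
    ("confess",  "confess"):  (5, 5),    # option 3: both confess
    ("confess",  "hold out"): (1, 10),   # option 1: only I confess
    ("hold out", "confess"):  (10, 1),   # option 2: only the other guy confesses
    ("hold out", "hold out"): (2, 2),    # option 4: both hold out, lesser charge
}

strategies = ("confess", "hold out")

def best_reply(theirs):
    """My best strategy (fewest years), given the other prisoner's choice."""
    return min(strategies, key=lambda mine: payoff[(mine, theirs)][0])

# A pair of choices is a Nash equilibrium if each choice is a best reply to the other.
for mine in strategies:
    for theirs in strategies:
        stable = best_reply(theirs) == mine and best_reply(mine) == theirs
        print(f"{mine} / {theirs}: {'stable' if stable else 'not stable'}")
# Only confess/confess is stable, even though hold out/hold out is better for both.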

Now, you can port this to project management:
  • The prisoners are actually two project teams
  • The police are the customer
  • The crimes are different strategies that can be offered to the customer
  • The sentences are rewards (or penalties) from the customer

And so the lesson is that the customer will often wind up with a sub-optimum strategy because either a penalty or reward will attract one or the other project teams away from the optimum place to be. Bummer!

There are numerous YouTube videos on this, and books, and papers, etc. But an entertaining version is at the Khan Academy, with Sal Khan doing his usual thing with a blackboard and a voice-over.


And, you can read Chapter 12 of my book: "Maximizing Project Value" (the green/white cover below)




Tuesday, August 16, 2016

Evaluating prospects -- alternatives



Daniel Kahneman and Amos Tversky may be a project manager's best friends when it comes to understanding decision making under conditions of risk. 

Of course, they've written a lot of good stuff over the years ..... my favorite is "Judgment under Uncertainty: Heuristics and Biases".

The original prospect thinking
Tversky and Kahneman are the original thinkers behind prospect theory. Their 1979 paper in Econometrica is perhaps the best original document; it's entitled "Prospect Theory: An Analysis of Decision under Risk". It's worth a read [about 28 pages] to see how it fits project management.

What's a prospect?

A prospect is an opportunity or possibility with both an upside advantage and a downside risk. Said another way: by opting for a prospect, you might gain something you don't have or lose something that you do have, that something usually measured in monetary terms.

Prospect theory addresses decision making when there is a choice between multiple prospects, and you have to choose one.

And, a prospect choice can be between something deterministic or certain and something probabilistic or uncertain.

What's the Theory? The big idea
So, here's the big idea: The theory predicts that for certain common conditions or combinations of choice, there will be violations of rational decision rules

Rational decision rules are those that say "decide according to the most advantageous expected value [or the expected utility value]". In other words, decide in favor of the maximum advantage [usually money] that is statistically predicted.

Ah yes! Statistics .... lies, damn lies, and statistics!
Shocking news -- sometimes, we ignore the statistics. Ergo: violations of rational decision rules.

Evaluating alternatives and prospects: Violations of decision rules driven by bias:
Prospect theory postulates that violations of decision rules are driven by several biases which we all have, to some degree or another:
  • Fear matters: Decision makers fear losing their current position [if it is not already a loss] more than they value an uncertain opportunity for gain. And decision makers fear a sure loss so much that they will gamble on an uncertain chance of recovery [if it might avoid the sure loss]
  • % matters: Decision makers assign more value to the "relative change in position" than to the "end state of their position"
  • Starting point matters: The so-called "reference point" from which gain or loss is measured is all-important. (A small gain matters more to those that have nothing, than the same amount matters to those that have a lot) The reference point can either be the actual present situation, or the situation to which the decision maker aspires. Depending on the reference point, the entire decision might be made differently.
  • Gain can be a loss: Even if a relative loss is an absolute gain (to wit: I didn't get as much as I expected), the lesser outcome affects decision making as though it is a loss
  • Small probabilities are ignored: if the probabilities of a gain or a loss are very, very small, they are often ignored in the choice.  The choice is made on the opportunity value rather than the expected value.
  • Certainty trumps opportunity: a bird in the hand ... in a choice between a certain payoff and a probabilistic payoff, the bias is for the certain payoff, even if the probabilistic payoff is statistically more generous.
  • Sequence matters: the near-term counts for more. Depending upon the order or sequence of a string of choices, even if the statistical outcome is invariant to the sequence, the decision may be made differently.
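Two of these biases -- the reference point, and "a gain can be a loss" -- lend themselves to a quick sketch. The value function and its parameters below are the commonly cited ones from Kahneman and Tversky's later work; the dollar figures are mine, purely for illustration.

ALPHA, LAMBDA = 0.88, 2.25             # diminishing sensitivity; loss aversion

def value(change_from_reference):
    """Kahneman-Tversky style value function: losses loom larger than gains."""
    x = change_from_reference
    return x ** ALPHA if x >= 0 else -LAMBDA * (-x) ** ALPHA

end_state = 1_000                      # what you actually end up with

for reference in (0, 2_000):           # present situation vs. an aspiration level
    felt = value(end_state - reference)
    print(f"Reference ${reference:,}: the ${end_state:,} end state feels like {felt:+.0f}")
# From a $0 reference the $1,000 is a gain; measured against a $2,000 aspiration it
# registers as a loss -- and, with loss aversion, a disproportionately painful one.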





Saturday, July 2, 2016

The case for SLOW programming


Every other posting about software these days is about 'go fast' or 'quick delivery' etc. Agile everywhere; Agile every time!

But, there's a case for SLOW!

How about when you're asked to code up some morality; when you're asked to code decisions which are philosophical, steeped in moral judgment, and perhaps quite personal?

This project issue is embedded in this essay about coding the autonomous vehicle.

Let's say you're doing the coding for some decision making the car's control must address:
You’re driving through an intersection and three people step into the road; the only way to avoid hitting them is to steer into a wall, possibly causing serious injury to yourself.

Would you sacrifice yourself?

Now change the equation: There are four people in the road and you have two family members in the back seat.

Do you still steer into the wall?

Would it matter if one of the pedestrians was a baby? If one of them was old, or very attractive?
 
Should we just put this scenario into some story cards?
 "As an autonomous vehicle I want to ... when ..., or else ....",
Code it up, and call it DONE? Then, release it to production as soon as the functionality is certified?
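For what it's worth, here is a deliberately naive sketch of what "code it up and call it DONE" might look like -- the types and thresholds are invented, not from any real vehicle system -- just to show how much morality gets buried in a single branch.

from dataclasses import dataclass

@dataclass
class Scenario:
    pedestrians_in_path: int
    occupants: int
    can_swerve_into_wall: bool

def choose_action(s: Scenario) -> str:
    # The story card: "As an autonomous vehicle I want to minimize casualties ..."
    if s.can_swerve_into_wall and s.pedestrians_in_path > s.occupants:
        return "swerve"    # who decided pedestrians and occupants weigh the same?
    return "brake"         # and who certified this branch as 'DONE'?

print(choose_action(Scenario(pedestrians_in_path=3, occupants=1, can_swerve_into_wall=True)))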

Or, do we do a BDUF* (gasp!), and thoughtfully go through the morality tree? In other words, take it SLOW!

And, is this a case of the wicked problem where there are only vicious circles, endlessly conflicting needs, and no satisfactory point of entry or exit?

Did I mention -- I don't have the answers to any of this. But I hope Google and others do.

*BDUF: Big design up front, the thing we are supposed to rid ourselves of



Wednesday, May 25, 2016

The Brain and the Decision


Since at least the 1970s, we've all been acutely aware of how brain processes, in concert with environment and context, influence decision making. Fair enough.

For those that are not so acutely aware, begin with the work of Daniel Kahneman and Amos Tversky -- their stuff is the classic treatment
Now comes more along the same line of thought -- no pun intended.
No less an authority than Oppenheimer Funds posted an ad about financial decision making that had some good points for project managers:
  •  First, as a matter of review, rational thought and emotional thought are supported differently in the brain: the latter in the limbic region and the former in the frontal lobe. So, each of us is likely equipped differently for the two thought influences ... Mars and Venus, as it were
  • Second, all decision making is a mix of the rational and the emotional. Some studies, reported elsewhere, have shown that under conditions of brain injury, a rational brain may get into an endless do-loop and not break out to a decision without a dose of emotion.
  • Our personal biology, beyond even the brain, is an influencer: reactions to gut feelings, etc in the "body-brain" system
  • Stress responses trigger hormones, and these in turn, through the body-brain thing, influence risk taking.
That is, risk appetite can vary with stress. It's not altogether fixed by context, experience, and track record. Risk appetite can be "in the moment"




Tuesday, April 5, 2016

Weight of evidence



If you are into risk assessments, you may find yourself evaluating the data

Evaluating the data usually means testing its applicability with a hypothesis. The process usually cited in project management chapters about risk is this: 
A hypothesis is formed. And then the evidence against the hypothesis—observable data—is evaluated. If the evidence against is scant, the hypothesis is assumed valid; otherwise: false.
Guessing?

Is this guessing? The hypothesis is true because no one seems to object? After all, how much evidence is against? Enough, or have you taken the time to really look?

Most of us would agree: the evidence-against-the-hypothesis approach does not fit the circumstances of many project situations.

There are many cases where you've come to the fork in the road; what to do? Famed baseball player Yogi Berra once posited: "When you come to the fork in the road, take it!"


In the PM context, Yogi is telling us that with no data to evaluate, either road is open. In Bayes* terms: a priori, it's 50:50 that one road has an advantage over the other.

Weight of Evidence
Enter: weight-of-evidence**, a methodology for when there is some data, but yet there is still a fork in the road. In this case, we consider or evaluate each "road"—in project terms, models or suppositions—and look at the ratio of probabilities. 
  • Specifically, the probability of model-1 being the right way to go, given the data, versus the probability of model-2, given the data.
Each such ratio, given the observable data, conditions, or circumstances, is denoted by the letter K: K equals the ratio of probabilities

Philosophers and mathematicians have more or less agreed on these strength ratios:
Strength ratio, aka “K”      Implied strength of evidence favoring one over the other
1 – 3                        Not really worth a mention
3 – 20                       Positive
20 – 150                     Strong
> 150                        Very strong

Why form a ratio?
It's somewhat like a tug-of-war with a rope:
  • Each team (numerator team and denominator team) pulls for their side.
  • The analogy is that the strength of the pull is the strength or weight of evidence. Obviously, the weight favors the team that pulls the hardest. Equal weight for each team is the case where the rope does not move.
  • K is representative of the strength of the pull; K can be greater than 1 (numerator team wins), less than 1 (denominator team wins), or equal to 1 which is the equal weight case.
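A quick sketch of the mechanics, with invented likelihoods: K is simply the ratio of how probable the observed data is under one model versus the other.

p_data_given_model_1 = 0.30   # probability of the observed data if model 1 is right (invented)
p_data_given_model_2 = 0.06   # probability of the same data if model 2 is right (invented)

K = p_data_given_model_1 / p_data_given_model_2
print(K)   # 5.0 -- "positive" evidence for model 1, on the 3-20 rung of the table above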
More data
The importance and elegance of the methodology is felt when there are several data sets—perhaps from different sources, conditions, or times—and thus there are many unique calculations of "K". 

You might find you have a set of K’s: K1 from one pair of teams, but K2 from another, and so on. What to do with the disparate K’s?

Sum the evidence
The K’s represent the comparative weight of evidence in each case. Intuitively, we know we should sum up the "evidence" somehow. But, since "K" is a ratio, we really can't sum the K’s without handling (carefully) direction.
  • That is: how would you sum the odds of 2:1 (K = 2) and 1:2 (K = 1/2, the same strength of evidence but the opposite conclusion)? We know that the weights are equal but pulling in opposite directions. Less obvious: suppose the opposing odds were 2:1 and 1:5?
Add it up
Fortunately, this was all sorted some 70 years ago by mathematician Alan Turing. His insight: 
  • What really needs to happen is that the ratios be multiplied, such that 2:1 * 1:2 = 1.
  • To wit: evidence in opposite direction cancels out and unity results.
But, hello! I thought we were going to sum the evidence. What's with the multiplying thing?
Ah hah!  One easy way to multiply—70 years ago to be sure—was to actually sum the logarithms of K
It's just like the decibel idea in power: adding 3 dB to the power is the same as multiplying the power by 2.
Is it guessing?
You may think of it this way: when you have to guess, and all you have is some data, you can always wow the audience by intoning: "weight of evidence"!

Appendix:
Geeks beyond this point
Does everyone remember the log of 2 is 0.3? If you do, then our example of summing odds of 2:1 and 1:2 becomes: Log(2) + Log(1/2) = 0.3 – 0.3 = 0.
Of course the anti-Log of 0 is 1, so we are back at the same result we had by intuitive reasoning.
On first examination, this might seem an unusual complication, but it's a take-off on the slide rule and the older decibel (dB) measurement of power. They both multiply by summing: a 3db increase in power (or volume) means the power has doubled.
An example
What if we had four sets of evidence: odds of {10:1; 2:1; 1:2; and 1:20}. What’s the weight of evidence?
Using logarithms: log(10) = 1, log(2) = 0.3, log(1/2) = -0.3, and log(1/20) = -[log(10) + log(2)] = -1.3
Sum: {1 + 0.3 – 0.3 – 1.3} = -0.3, or odds of about 1:2
Not all that obvious
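Here is the same worked example as a minimal sketch: sum the base-10 logarithms of each K, then convert back to odds.

from math import log10

odds = [(10, 1), (2, 1), (1, 2), (1, 20)]   # the four sets of evidence above

total_log_weight = sum(log10(num / den) for num, den in odds)
combined_K = 10 ** total_log_weight

print(round(total_log_weight, 2), round(combined_K, 2))   # -0.3 and 0.5, i.e. odds of 1:2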
_____________________________________________
*Bayes refers to a protocol of evaluating probabilities, given some a-priori conditions, with the idea of discovering, a posteriori, an underlying probable “truth”. Such discovery depends on an opportunity to gather more data, perhaps with other conditions attached.
**A practical application of the weight of evidence method is cryptanalysis. It was first developed into a practical methodology in WWII by famed mathematician Alan Turing working at storied Bletchley Park. Something of it is explained in the book "Alan Turing: The Enigma"


Read in the library at Square Peg Consulting about these books I've written
Buy them at any online book retailer!
http://www.sqpegconsulting.com
Read my contribution to the Flashblog