Friday, December 31, 2010

It's New Year's Eve!

Be always at war with your vices, at peace with your neighbors, and let each new year find you a better man. ~Benjamin Franklin

Wednesday, December 29, 2010

The Closer!

Michael Young has a posting on PMHut about closing the project. It's ambitiously titled "A complete guide to closing projects."

Except I don't think it's complete, or at least not complete enough.

It's a good discussion as far as it goes, but here's what he says about archiving data:

Following delivery of the Post Implementation Review Report, the project database is archived. Building a repository of past projects serves as both a reference source and as a training tool for project managers. Project archives can be used when estimating projects and in developing metrics on probable productivity of future teams.

Now, that's necessary, but not sufficient. What you really have to do is update the enterprise estimating models, not just archive the project. Maintaining the models is really the only way to improve estimating. If I say that a "hard" specification requires "X" hours with "Y" skills, the credibility of X and Y is on the line in proposals and in execution.

Too many organizations "build a repository of past projects" without putting the effort into mining the information in that repository to refine a model of how the organization really works.

And of course, the outliers have to be dealt with, either in the footnotes, or in the distribution of possible outcomes.  After all, X and Y should be expected values if they are single numbers, and if not, then they should be ranges, better yet: percentile rankings.  To wit: a hard specification requires X hours at the 95th percentile.  Now we have something we can work with.
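
The percentile idea is easy to make concrete. Here's a minimal Python sketch, with a hypothetical history of actual hours mined from the archive (the numbers are invented for illustration; a real model would come from the repository):

```python
import math
import statistics

# Hypothetical actual hours for "hard" specifications, mined from the
# project archive [numbers invented for illustration]
actual_hours = [80, 95, 100, 105, 110, 120, 130, 150, 170, 240]

mean = statistics.mean(actual_hours)  # the naive single-number "X"

def percentile(data, p):
    """Nearest-rank percentile: smallest value >= p percent of the data."""
    ordered = sorted(data)
    k = max(1, math.ceil(p / 100 * len(ordered)))
    return ordered[k - 1]

p95 = percentile(actual_hours, 95)

print(f"mean X = {mean:.0f} hours; 95th-percentile X = {p95} hours")
```

Note how the outlier [240 hours] pulls the 95th percentile well above the mean; that's exactly the information a single-number X throws away.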

My advice: don't just close; be a Closer!


Monday, December 27, 2010

Schedule heresy

Here's a little heresy on schedules, just before the holiday break:

I don't like, and don't recommend, MSProject and similar tools for managing a project! 


Plan v Manage

Day to day, there's too much administration, and too many faux assumptions about dependencies, for it to be an effective management tool, especially for smaller projects. There's always a mad scramble, and a lot of time and effort is taken up evaluating ad-hoc task-level interactions, most of which can be worked out by other means.

On the other hand, it's a great planning tool to get a project started, including consideration of dependencies, resource conflicts, and other artifacts. As a planning tool, so long as dependencies are restricted to finish-to-start, and there are no fixed dates, it's a good tool to host Monte Carlo simulations. It's just that once a project is under way, milestone charts, gated criteria, and earned value spreadsheets are better tools on account of their efficiency, even on large projects.
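
For flavor, here's a minimal Monte Carlo of a finish-to-start chain, sketched in Python; the three tasks and their [low, likely, high] day estimates are invented for illustration, and a real simulation would be hosted in the scheduling tool itself:

```python
import random

random.seed(11)

# Three tasks in a strict finish-to-start chain; [low, likely, high] day
# estimates are invented for illustration
tasks = [(3, 5, 9), (4, 6, 12), (2, 3, 5)]

N = 10_000
totals = sorted(
    # with pure finish-to-start logic, project duration is just the sum
    sum(random.triangular(lo, hi, mode) for lo, mode, hi in tasks)
    for _ in range(N)
)
mean = sum(totals) / N
p80 = totals[int(0.8 * N)]

print(f"mean finish: {mean:.1f} days; 80th percentile: {p80:.1f} days")
```

The point of the restriction to finish-to-start and no fixed dates is visible here: the simulated duration reduces to a simple sum, which is what makes the simulation tractable.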

Walk the talk

I finished an engagement a couple of years ago that went several years, consumed multiple hundreds of millions of dollars, involved hundreds of project staff, and had four blocks of deliveries over two years. 

And, we never had a task-level project schedule. 

What we had were swim lane milestones and major dependencies identified between swim lanes.  Within the lanes, there were one or more teams and planned interteam dependencies, each team with their milestone schedules, and within teams, there were small working groups, also with milestone schedules.

We set up pipelines for sequential delivery, and gates to frame the pipelines.  We managed the pipelines with Excel.  [Look for more on pipelines in a future post]

We also did resource leveling with Excel.... that's a good thing!  The algorithms in schedule tools can return some silly answers.  I always marvel at Excel--it's truly amazing what you can do quantitatively with that tool!
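
As a sketch of the kind of first-cut leveling pass one can build in a spreadsheet, here it is in Python; the tasks, hours, and 40-hour weekly capacity are invented, and dependencies are ignored, just as a simple leveling pass might do:

```python
# Tasks, hours, and weekly capacity are invented for illustration
CAPACITY = 40  # hours of team capacity per week

tasks = [("design", 30), ("build", 35), ("test", 20), ("docs", 15)]

weeks = []      # remaining capacity in each week opened so far
schedule = {}   # task -> week number
for name, hours in tasks:
    for w, free in enumerate(weeks):
        if free >= hours:          # fits in an already-open week
            weeks[w] -= hours
            schedule[name] = w + 1
            break
    else:                          # otherwise open a new week
        weeks.append(CAPACITY - hours)
        schedule[name] = len(weeks)

print(schedule)
```

A greedy pass like this is transparent: you can see exactly why each task landed where it did, which is more than can be said for some schedule-tool leveling algorithms.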

We successfully delivered business value.  The first block was late--and that was a value bummer--but the next three hit their milestones on time.  We did not successfully earn the intended project EV package of cost-performance-schedule.  It was an ERP project [Oracle business systems] for a multi-billion$ enterprise.  Re EV: ERP says it all!

In any event: plan with MSProject [or Primavera, or another similar tool], but manage with milestones, gates, and EV.


Saturday, December 25, 2010

Happy Holidays

From all the folks that make "Musings" possible:


Thursday, December 23, 2010

The Theory of the Mosaic

And so we have the "Theory of the Mosaic", which, roughly told, goes like this:

From many disparate and small bits, hiding in public and knowable to those who seek, comes a revelation of the larger idea and greater knowledge

This is much more than just seeing the forest in the presence of trees; this is finding the trees in the first place, and then placing them in context, in juxtaposition, and in relationship so that not only the forest, but all the attributes and nuances of the forest are revealed.

Project management?

You betcha!

The proposal project

Let's begin with the competition to win new business. Bidding competitively is a project in itself; it's only after you win that the execution project begins.

One of the tools used by practitioners of the Theory of the Mosaic is the "expert network". These are the relationships that extend in myriad directions and trade "bits" in the marketplace of knowledge [sometimes rumor or conjecture]. In competition, the network extends not only to the potential customer, but to the customer's customer, regulator, appropriator, and suppliers. It even extends to the direct competition. How many of us have sat down with our direct competitor for a chat about the 'opportunity'?

Assembling all this information [in many cases, just bits of data, not even information] into a narrative that can be then transformed--through the proposal process--into a WBS, cost, schedule, and performance promise for a winning offer is no small task and requires all the discipline and commitment to an objective that is the mark of successful project management.

Of course, one of the tenets of the Theory is that information is hiding in public. Expert networks that ferret out the public information are the secret to success. Usually, there is a bright line--that is, no peek at proprietary competitive information that is unethical or illegal is permitted--but often the line gets blurred, the rules change [sometimes in mid-stream], or international ambiguity [read: culture] mislevels the playing field.

Guarding against such corruption is no small matter. Just refer to the infamous USAF refueling tanker competition for many "don't do this" examples.

The execution project

No project of any scale operates without interpersonal relationships, a dollop of politics, and the obscure actions of many people working on their part. Enter: "expert networks". The project manager for sure, work stream managers, and cost account managers all 'work their networks': up and out to the sponsors, and down and in to the worker-bees.

Assembling the mosaic

For those experienced in brainstorming, assembling a mosaic from an expert network is really no different.

First, the science:
Like items are grouped
Relationships are labeled
Sequencing is labeled
Small items are grouped under bigger items
Gaps are identified; gap filler plans are formulated
Narrative headlines are written

Then, the art:
Interpretations are made; the 'big picture' emerges from the 'pixels'
Importance and urgency are weighed
Actionable 'intelligence' is separated for execution plans

Finally: act on the intelligence!


Wednesday, December 22, 2010

Quotation for project managers

A word from Henry Ford:

 "Quality means doing it right when no one is looking".

Thanks to Luis Coehlo at "" for this bit of wisdom.

In later years, this went on to "Quality is Job One", but then Ford lost the recipe. Now, in a resurgence of Henry's guidance, Ford is regaining the high ground, in part because of a commitment to quality, and part because of another piece of ageless advice:

Keep it simple, stupid!

Of course, simple and complex are not two sides of the same coin.  The simplest idea that gets the job done can still be quite complex.  My definition of simple is that it's the least complexity that is functionally complete and meets 'quality' measures in the large sense of the word: fitness of form, function, effectiveness, efficiency, availability, etc.

In the December 9th, 2010 print edition of "The Economist", there is a great interview--"Epiphany in Detroit"--with CEO Alan Mulally on the quality and management turn-around at Ford.  It's certainly no secret that many of the things that made Boeing a great aircraft innovator, developer, and production house are being applied at Ford.

There's a lesson in this interview for all program managers tasked with turning around a project with quality problems.

It's no secret that the first thing Mulally did to get on top of quality was insist on candid discussion from his functional managers and instill a culture of safety from prosecution when a problem was raised.  The second thing he did was tune in to what outside objective evaluators had to say; again, he changed the culture from defense to offense.

So, communication was the big stick.  There's no doubt that "communication in a commonly understood language is the key that unlocks the culture".  [This I paraphrase from the US Ambassador to China, from a recent interview on Charlie Rose]


Tuesday, December 21, 2010

Risk-informed decision making

At NASA's risk management homepage you can navigate to a paper on risk-informed decision making. As defined there:

Risk-informed decision making, as described in this paper, is the formal process of analyzing various decision alternatives with respect to their impact on the PMs, of assessing uncertainty associated with their degree of impact, and of selecting the optimal decision alternative using formal decision theory and taking into consideration program constraints, stakeholder expectations, and the magnitude of uncertainties.

Probably the most useful idea in this paper is that risk-informed is different from risk-based. The former takes risk into consideration; the latter adjusts all values for risk and makes a utility decision.

In effect, although some quantitative models are introduced and suggested, the main idea is that risk informs decision making but there are other factors that may intervene and override. Just common sense, really.

Nevertheless, the paper proposes three big steps that are useful to review:

1. Formulation and selection of decision alternatives: the decision alternatives are generated by quantitative and qualitative analyses, past experience, and engineering judgment. Unacceptable alternatives are removed after deliberation.

2. Analysis and ranking of decision alternatives: the screened alternatives are ranked.

3. Actual decision making: the final decision can be made only after a deliberation takes place (that is, we are describing a risk-informed rather than risk-based process). Deliberation is necessary because there may be aspects of the particular decision that cannot be considered in a formal way.
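
A toy version of the ranking step, sketched in Python; the alternatives, benefit scores, and risk figures are invented, and the benefit-discounted-by-risk score is just one plausible choice:

```python
# Alternatives and scores are invented; the score discounts benefit by risk
alternatives = {
    "A": {"benefit": 8, "risk": 0.30},
    "B": {"benefit": 6, "risk": 0.10},
    "C": {"benefit": 9, "risk": 0.60},
}

def score(name):
    alt = alternatives[name]
    return alt["benefit"] * (1 - alt["risk"])

ranked = sorted(alternatives, key=score, reverse=True)
print(ranked)  # the ranking informs -- but does not make -- the decision
```

That last comment is the paper's point: in a risk-informed process, the ranked list goes to deliberation; it does not mechanically dictate the choice.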


Sunday, December 19, 2010

Top 50 Industrial Engineering and Project Management Blogs

This is not my list, but we here at "Musings" made the list of 50 Industrial Engineering and Project Management blogs, actually placing 10th in the project management category, as compiled by Masters of

And, I am happy to report that we are in good company with many blogs listed that we follow here.

Of course, in a list as long as 50 there's always something to discover.

One for me was under the category of Operations Research Blogs. One blog there caught my eye: posted this past September, it is entitled "The Flaw of Averages and why everything is late" and refers to a book similarly titled: "The Flaw of Averages: Why We Underestimate Risk in the Face of Uncertainty" by Sam Savage.

Sam talks about the 7 Deadly Sins of Averaging in his paper published in OR/MS Today from the Institute for Operations Research and the Management Sciences.
The Seven Deadly Sins of Averaging
1. The Family with 1 1/2 Children
2. Why Everything is Behind Schedule
3. The Egg Basket
4. The Risk of Ranking
5. Ignoring Restrictions
6. Ignoring Optionality
7. The Double Whammy

Of course, for the risk-astute project manager, "expected value" is the statistic of choice, not an arithmetic average. Expected value is a risk-adjusted average of all the possibilities that go into an estimate. Thus, it is a richer piece of information, incorporating more of the distribution of possibilities than a simple average does. Nevertheless, it does no harm to understand Savage's 7 points--just don't throw away useful information by failing to understand expected value as well.
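
Savage's "why everything is behind schedule" sin is easy to demonstrate: at a merge point, the expected finish is the expected value of the maximum, not the maximum of the averages. A quick simulation, with invented uniform estimates:

```python
import random

random.seed(7)

# Two parallel tasks, each uniformly 5..15 days [mean 10 days each].
# The merge point finishes when the LATER task does, so a plan built
# on averages [10 days] understates the expected finish.
N = 100_000
finishes = [max(random.uniform(5, 15), random.uniform(5, 15)) for _ in range(N)]
expected_finish = sum(finishes) / N

print(f"average-based plan: 10.0 days; expected finish: {expected_finish:.1f} days")
```

Each task averages 10 days, yet the merge averages noticeably more; that gap is the flaw of averages, and it's exactly the information a proper expected value of the merge preserves.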


Friday, December 17, 2010

More about process improvement project risks

In "Process Part I", I put it to you that businesses are run from a vertical--that is, functional--perspective, but most process improvement projects attempt to improve the business by improving cross-functional--that is, horizontal--process performance.

Trust and verify

The issue for project managers is to convince stakeholders--most of whom have a vested interest in the vertically oriented functional metrics--to trust process metrics that are 'invented' or put in place by the project outputs. Without trust, there will be no meaningful outcomes to justify the effort to develop the outputs.


One issue is that, to support process metrics, business data has to be reorganized. Data gathered functionally--that is, vertically--during the normal course of business activity has to be reported horizontally. That requires reorganizing the data schema. The 'get it in' schema is often too inefficient and ineffective to support the 'report it out' needs of the process.

Enter the data warehouse:

One use of a data warehouse is to store the vertical data from the P&L database in a horizontal form so it can be read out in a process dimension. However, what appears simple on paper--changing the view of the data--is not simple in practice.

Just the facts!

One principle of system engineering of which project managers are well aware is that "view" doesn't change the underlying facts.  One example familiar to project managers is the WBS: the sum of the horizontals [which is one 'view']  equals the sum of the verticals [which is another view].

The same applies to business data.  In the case of a DW, the horizontal sums should reconcile to what the P&L database is reporting vertically.
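
The cross-foot discipline is simple to state, even if the validation takes a year. A sketch, with invented figures; only the reconciliation check matters:

```python
# Figures invented for illustration; only the reconciliation matters
pnl = {  # cost by function: the vertical, 'certified' view
    "engineering": 120, "operations": 200, "support": 80,
}
warehouse = {  # the same cost re-binned by end-to-end process: the horizontal view
    "order-to-cash": 150, "procure-to-pay": 130, "service": 120,
}

assert sum(pnl.values()) == sum(warehouse.values()), "views do not reconcile"
print("views reconcile at", sum(pnl.values()))
```

In practice the hard part is everything hiding behind those invented numbers: the data transformations, timing cutoffs, and query logic that have to agree before the two views sum alike, period after period.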

DW project risks

The risk arises in the validation task.  The P&L data is 'certified' and 'validated'; the horizontal process view is not.  Therein is the risk: data transformations, data timing, and query logic all bear upon results.

Project managers are well advised to take this risk seriously. 

It took one of my projects about a year to validate the DW so that it would add up to the P&L reliably.  It's not just a matter of arithmetic.  It takes time, and repeated success, to earn the trust of stakeholders whose livelihood may depend on the results.


Wednesday, December 15, 2010

Acquisition reform

Last month, I had a post on EVM reform. Now, that is actually set in the context of acquisition reform in the DoD, a Secretary Gates initiative.

In the October 2010 issue of "Air Force Magazine", there is a description of the five reforms the Air Force is putting in place to prevent debacles like the solicitation for the replacement airborne refueling tanker--twice canceled, and only this month in some more hot water--and some other big ticket programs in trouble, most notably the F-35 fighter aircraft.

Here is what the Air Force is calling reform:

1. Add 7,000 uniformed and civilian procurement specialists, all being new hires, and most being interns with no experience. Get 'em young, I guess, is the idea. About half of these folks are already on-board. [Note: within all of DoD, Gates is planning on 20,000 new hires for acquisition....that's a whole agency in most places!]

2. Resist change and requirements volatility by elevating to the executive level the approval needed to make a change.  The Air Force plans to "....insinuate acquisition pros into the requirements process" early, and then block up requirements such that IOC is at an 80% level with 'block 1'.  Haven't we been doing that for decades?  How is this a reform?  Perhaps we should try the flip side: provoke change while there is still time to deal with it, and be ever open to common sense.

3. Stabilize the budget.  A noble objective to be sure, but good luck with that one in the political climate we'll have for the next generation.

4. Improve the quality of the source selection.  Again, it's always a good idea to work on process improvement.  In a somewhat shocking statement, the article says that the "....Air Force will execute the source selection exactly like we said we would" in the source selection rules.  What a concept!

5. Align authority with responsibility, the bane of all large command-and-control bureaucracies.  The first step is to increase the number of PEOs from 6 to 17 to allow a better span of control.  I hope it works, but we have been working on the A&R problem for 50 years, at least.


Monday, December 13, 2010

Quotation for project managers

Be eternally suspicious, take nothing for granted, investigate everything. Program success is obtained only by enormous attention to detail everywhere.
Quoted on HerdingCats

Saturday, December 11, 2010


Microtasking

Is microtasking something that is coming to a project near you? Should you be the first on your block to do it?

In a recent article, microtasking was explained as subdividing a task into tasks so small that they could be executed in a few seconds, perhaps a minute.

What kind of tasks are these? Mostly repetitive tasks. One task is filling in blanks on a form; another is transcribing sentences in a document; another could be loading a database.

Is this practical? Well, there are a couple of companies providing microtasking as a service, and other companies building tools, primarily web-based tools, to make it possible for others to do it for themselves, or contract to have it done.

Could it work in a project? Is it the ultimate Agile, or the ultimate 'federalism' of project management? I don't know. I'm not aware that it's been tried.

But like social networking, cloud computing, virtual teams, and a host of other technology-driven ideas, microtasking, or perhaps its bigger cousin, millitasking, could be coming to a project near you!


Thursday, December 9, 2010

Grady Booch on DoD software management

Back to this month's "Crosstalk", the issue on architecture. This issue also has an interview with Grady Booch, now with IBM but probably best known as one of the three main authors of the Unified Modeling Language, UML. [Historical note: IBM bought Rational a few years ago, and Booch was a key guy at Rational inventing UML]

Booch gave his opinion about the DoD software community:

It really used to be, decades ago, that the DoD was leading the marketplace in the delivery of software-intensive systems. The harsh reality is that the commercial sector is leading best practices and really pushing the arc relative to software engineering and software development. So, in that regard, the DoD is behind the times. That is not to say that they are not pushing the limits in some areas. The kind of complexity we see in certain weapons systems far exceeds anything one would see commercially, but ultimately, there are a lot of things that the DoD can learn from the commercial world.

I wonder if the commercial "best practices" he refers to are Agile, or something from the CMMI? I'm not sure I would put Agile under "best practices" for general development, but certainly for cases of evolving and emerging requirements that are not fixed in anyone's mind, Agile is a risk management solution for that dilemma.

And, I certainly wouldn't call Agile "software engineering". Test driven design--TDD--might qualify as an engineering practice, so also refactoring, but most of the rest of Agile is management not engineering.

Those are my opinions. Booch had something different to offer:

Three recommendations for large scale organizations

Booch has three recommendations for DoD software managers and practitioners, but really these apply to any large scale organization doing software intensive systems:

1.  Increase leverage of open-source tools and resources. This is not just a cost issue; it's an issue of leverage for innovation and transparency.  Specifically, Booch says to push '' along, the services arm of 'SourceForge'.

2.  Build more elaborate and deployed infrastructure for collaboration. [I think he means: it's not that programs should be managed from Facebook or Twitter....they shouldn't...] There should be more emphasis on leveraging the worldwide expertise of DoD and its contractors.

3.  Go beyond functional modeling and get down to modeling the system itself.  Make architecture an artifact of the project. To this, I say: Amen!

The interview also takes up the DoDAF [DoD architecture framework] as a means to bring more emphasis and utility of architecture to DoD programs. Booch thinks DoDAF is effective for modeling the 'enterprise of the warfighter', but is less effective in modeling software-intensive systems.

Perhaps so. However, the DoDAF is certainly in evidence in the battlefield robots now under development and being deployed. We'll see how that works out. Advantage: USA [for now]


Tuesday, December 7, 2010

Of infamy and innovation

Today is December 7th, "...a day that will live in infamy" said FDR a day later in an address to Congress. The events of December 7th, 1941 ushered the United States into WW II as an armed belligerent. In turn, WW II ushered profound changes into the culture and society of the United States.

The enormous industrialization of WW II all but put defined process control on the map. Decades later, "six sigma" emerged, but only after TPM and other quality movements that had their roots in the projects to arm millions of service men and women.

The scope of WW II projects was unprecedented, leading to the military-industrial complex that defined and codified program management, system engineering, risk management, analog simulation, and a host of other project practices heretofore unknown or undefined.

WW II unleashed innovation as no other world event. The modern research university was empowered. During the war, the laboratories at MIT and CalTech and Stanford were at the forefront of new ideas, inventions, and applications. Since then, a multitude of research universities have been drivers of the innovation explosion in the United States.

Although the war drove atomic science, atomic science drove quantum mechanics, an understanding of the sub-atomic structure. From this we have all manner of semiconductors that have in turn been the underpinning of the information age.

And, let us not forget that WW II empowered 50% of our workforce for the first time. Women entered the workforce in large numbers, doing jobs never open to them before. They have never looked back.

And finally, WW II begat the 'GI Bill' that sent millions to college and all but invented the modern middle class, from which yet more innovation, inventiveness, and entrepreneurship has arisen.

It sounds like "...there's nothing like a good war".  But that's not the case.  The emergency of warfare has always raised the bar.  Before the U.S. civil war in the mid 19th century, railroads as a means of tactical support for forces were unheard of; so also electronic messaging...the telegraph in those days.  Innovation, as a consequence of great national emergency, is the sidebar that always gets a boost.


Sunday, December 5, 2010

Crosstalk reviews architecture

"Crosstalk" has a new online website that presents the magazine in a truly online format. I personally like the "flip through the magazine" functionality....a really easy way to see the whole issue quickly.


This month's edition is dedicated to architecture, ordinarily the domain of system engineering, but a discipline I firmly believe PMs should embrace as necessary in every project.


Architecture is the overarching narrative that pulls the whole WBS together. And since the WBS is the object of the schedule, architecture helps to integrate all the project pieces. Architecture is an abstraction of the WBS; it's that level of detail that's usually of interest and importance to sponsors, stakeholders, and beneficiaries.... therefore, it's important to PMs.

And, here's the clincher for me: architecture plays directly into risk management.  To see why, consider these properties of architecture:

Topology and protocols:
Architecture tells us the topology of the system, product, or process. Topology tells us about hierarchy, interconnectedness, and whether nodes are reached by point-to-point, hub-and-spoke, or some mesh circuitry.  In some cases, architecture gives the protocols, that is: the rules, by which elements of the system tie together.  Architecture gives form to requirements. 
Cohesion, coherence, and coupling:
Cohesion is a measure of "stickiness", the degree to which elements of the project outputs will hang together under stress, work together well in the environment, and not do chaotic or disparate things when stimulated differently.  High cohesion is good and lowers risk.
Coherence is a measure of sympathetic reinforcement.  Coherence gives rise to the adage: "the sum is greater than the parts".  High coherence is generally good and lowers risk.
Coupling is a measure of interference or dependency between units, subsystems, and modules.  In general good architecture respects natural boundaries; disrespect leads to strong coupling and propagation of errors, stress, and failures.  Loose coupling that traps effects before they propagate to other components is generally good, and lowers risk.
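
In software terms, loose coupling often comes down to trapping faults at the subsystem boundary so they don't propagate. A minimal, illustrative sketch; the class and its failure mode are invented:

```python
class SensorSubsystem:
    """Illustrative subsystem; the class and failure mode are invented."""

    def _read_raw(self):
        raise IOError("sensor offline")  # a failure internal to the subsystem

    def read(self, default=0.0):
        """Boundary method: trap internal faults so they don't propagate."""
        try:
            return self._read_raw()
        except IOError:
            return default

print(SensorSubsystem().read())
```

The internal fault never crosses the boundary; callers see a safe default instead of a propagated failure, which is the risk-lowering property of loose coupling in miniature.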

Summary: pay attention to architecture!


Friday, December 3, 2010

WBS yet again

From time to time, the debate reemerges about the definition of the WBS. And so it happened again last month with a series of exchanges about 'work' vs 'the product of the work'.

This time, the fireworks began with a posting by Mike Clayton, followed by several responses from readers and critics.

In my response to Mike's post, I said:
Hey Mike: On this side of the pond, the WBS originated in the Defense Department, going back into the ’60s at least, as now given in MIL HDBK 881A, now in its umpteenth upgrade and reprinting. PMI is a very late comer to the ‘definition’ game. DoD has always defined the WBS in terms of the product of the work, not the work itself. The 881A definition is: “A product-oriented family tree composed of hardware, software, services, data, and facilities. The family tree results from systems engineering efforts during the acquisition of a defense materiel item.” You can read all about it at

But really, I think all the controversy can be reduced to one word: "Microsoft".

Microsoft can be blamed for everything.  Microsoft begat MSProject, and MSProject captured the market for an inexpensive and easy-to-use scheduling tool decades ago. Being mostly a database of tables and fields, with some application code written around it, MSProject allows the user to expose certain fields that have a built-in data definition. One of these fields is entitled "WBS".

Verbs, nouns, and narrative
However, schedules are the world of 'verbs': actions that are to be scheduled. The WBS, on the other hand, is the world of the 'nouns', things that are memorials to completed actions.

The project 'narrative' is just the verbs from the schedule put into sentences where the nouns from the WBS are the objects of the verbs [hopefully, everyone recalls sentence diagramming from the 5th grade]

Application smarts
MSProject's application is not smart enough to distinguish between the 'nouns' and the 'verbs'. So, even if you have been diligent by making the summary row a noun with the subordinate rows containing the scheduled verbs, when the WBS column is exposed all records [rows] in the database [schedule] acquire a WBS number in an indentured and sequential order. The numbering is part of the application functionality.

So, naturally there is a confusion between schedule and WBS if you do not give the field [aka 'column' in the application] a user-defined 'title'. I like 'index', as shown in the figure below, but pick your own. Caution: do not rename the 'field name' since the field name is sacrosanct in the database.
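
The auto-numbering behavior is easy to mimic, which shows why nouns and verbs alike acquire a number. An illustrative Python sketch; the rows and outline levels are invented:

```python
# Rows are invented; 'level' is the outline [indent] level in the tool
rows = [
    (1, "Deliverable A"),        # a noun
    (2, "design the widget"),    # a verb
    (2, "test the widget"),      # a verb
    (1, "Deliverable B"),
    (2, "write the report"),
]

numbered = []
counters = []                    # one counter per outline level
for level, text in rows:
    del counters[level:]         # outdent: drop deeper-level counters
    while len(counters) < level: # indent: open new counters at zero
        counters.append(0)
    counters[level - 1] += 1
    numbered.append(".".join(map(str, counters)) + " " + text)

print("\n".join(numbered))
```

The numbering is purely positional: every row gets an indentured number regardless of whether it holds a WBS noun or a scheduled verb, which is exactly the source of the confusion.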


Wednesday, December 1, 2010

Prospect Theory

Prospect Theory is an explanation of choosing among alternatives [aka "prospects"] under conditions of risk.  Amos Tversky and Daniel Kahneman are credited with the original thinking and coined the term "prospect theory".

Prospect Theory postulates several decision-making phenomena, a couple of which were discussed in the first posting.  Here are two more:

The Isolation Effect
If there is a common element to both choices in a decision, decision makers often ignore it, isolating the common element from the decision process.  For instance, if there is a bonus or incentive tied to outcomes, for which there is a choice of methods, the bonus is ignored in most cases.

Here's another application: a choice may have some common elements that affect the order in which risks are considered; the ordering may isolate a sure-thing, or bury it in a probabilistic choice.

Consider these two figures taken from Tversky and Kahneman's paper.  In the first figure, two probabilistic choices are given, and they are independent of each other.  The decision is between an expected value of $750 in one choice and $800 in the other.  The decision making is pretty straightforward: take the $800.

In the second figure, the choice is a two-step process.  In the first step, $3000 is given as a certainty, with the option of the other path, which has an EV of $3200.  This decision must be made before the consequences are combined with the chance of $0.

The decision outcome [at the square box] is either sure thing $3000  or expected value $3200.  But, there is then a probabilistic activity that weights this decision such that at the far left chance node the prospect is either ($0, $750) or ($0, $800). 

So, the EV of the prospect is the same in both figures. However, in the second figure the tree has the 'certainty' advantage over the first: the sure-thing $3000 is available to be picked at the decision node.
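
The arithmetic behind both figures is plain expected value. A sketch, assuming the stage probabilities of the classic Tversky-Kahneman isolation example [a 25% chance of reaching the second stage, then a sure $3000 versus an 80% chance of $4000], which reproduce the post's $750 and $800:

```python
# Stage probabilities assumed from the classic Tversky-Kahneman isolation
# example: a 25% chance of reaching the second stage, then a sure $3000
# versus an 80% chance of $4000 [an EV of $3200]

def ev(outcomes):
    """Expected value of (probability, payoff) pairs."""
    return sum(p * x for p, x in outcomes)

a = ev([(0.25, 3000), (0.75, 0)])                    # sure-thing branch
b = ev([(0.25 * 0.80, 4000), (1 - 0.25 * 0.80, 0)])  # risky branch

print(round(a), round(b))  # the $750 and $800 of the post
```

On paper the $800 branch dominates either way; the isolation effect is that decision makers facing the two-stage framing tend to treat the $3000 as a sure thing and take it anyway.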

The Value Function

Quoting Tversky and Kahneman: "An essential feature of the ..... theory is that the carriers of value are changes in wealth or welfare, rather than final states.  ...... Strictly speaking, value should be treated as a function in two arguments: the asset position that serves as reference point, and the magnitude of the change (positive or negative) from that reference point. "

The point here is that the authors postulate that every prospect has to be weighted with a factor that represents this value idea.  The weightings do not have to sum to 1.0 since they are not probabilities; they are utility assignments of value.  Weightings give rise to the apparent violations of rational decision making; they account for overweighting certainty; taking risks to avoid losses and avoiding risks to protect gains; and ignoring small probabilities, among other sins.
