Sunday, April 30, 2023

Getting to "Zero Trust" architecture and philosophy


In PMO school, they teach you that trust is everything when building a successful project team.
Fair enough.

But now comes "Zero Trust", and the "Zero Trust Architecture" which is more like a philosophy than an architecture. And, of course, the acronyms: ZT and ZTA.

I don't know if NIST (*) coined this phrase, 'zero trust', but they have a proposed zero trust architecture you can read about here. 

Motive
The motive for developing ZTA was a realization that security threats to an enterprise's intellectual property (IP), whether corporate proprietary or government classified, are now more often inside the perimeter of a security firewall. Indeed, with the proliferation of remote working, the cloud, and bring-your-own-device (BYOD), the very idea of a perimeter is somewhat by-the-by. And so IP protection can no longer be just a matter of a security firewall around the enterprise.

Philosophy
So if you are philosophically in touch with 'zero trust', the idea is that every element of IP is subject to an enforced need-to-know, and an enforced limitation on copy and dissemination. The perimeter really no longer exists; even where one does, passing through it gains an intruder relatively little because of ZT gates on the IP itself.

The idea is to move from protecting a perimeter or a network segment to protecting the actual resource that is the IP of the enterprise. In effect, it is realized that there will be persistent active threats in the network; the security objective is to block them from accessing the actual IP.
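To make the philosophy concrete, here is a minimal sketch, in Python, of what a per-request 'ZT gate' on a resource amounts to. All names and checks here are illustrative assumptions, not NIST's specification:

```python
# A minimal, illustrative zero-trust "gate": every request is evaluated on
# its own, with no implicit trust from network location. All names here
# (Subject, Device, evaluate_access) are hypothetical, not from NIST.
from dataclasses import dataclass

@dataclass
class Subject:
    identity_verified: bool   # authenticated for *this* request, not a past session
    need_to_know: set         # the resources this subject is entitled to see

@dataclass
class Device:
    posture_ok: bool          # managed, patched, uncompromised per the latest check

def evaluate_access(subject: Subject, device: Device, resource: str) -> bool:
    """Per-request, least-privilege access decision."""
    if not subject.identity_verified:
        return False          # authenticate the subject, every time
    if not device.posture_ok:
        return False          # authenticate the device, every time
    return resource in subject.need_to_know   # least privilege on the IP itself

# Even a request from "inside the perimeter" is evaluated the same way:
alice = Subject(identity_verified=True, need_to_know={"design-doc-42"})
laptop = Device(posture_ok=True)
print(evaluate_access(alice, laptop, "design-doc-42"))   # True
print(evaluate_access(alice, laptop, "payroll-db"))      # False: no need-to-know
```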

ZT according to NIST
NIST says this: "Zero trust (ZT) is the term for an evolving set of cybersecurity paradigms that move defenses from static, network-based perimeters to focus on users, assets, and resources. A zero trust architecture (ZTA) uses zero trust principles to plan industrial and enterprise infrastructure and workflows.

ZT is not a single architecture but a set of guiding principles for workflow, system design and operations that can be used to improve the security posture of any classification or sensitivity level. Zero trust (ZT) provides a collection of concepts and ideas designed to minimize uncertainty in enforcing accurate, least privilege per-request access decisions in information systems and services in the face of a network viewed as compromised.  

Zero trust assumes there is no implicit trust granted to assets or user accounts based solely on their physical or network location (i.e., local area networks versus the internet) or based on asset ownership (enterprise or personally owned). Authentication and authorization (both subject and device) are discrete functions performed before a session to an enterprise resource is established. 

Zero trust is a response to enterprise network trends that include remote users, bring your own device (BYOD), and cloud-based assets that are not located within an enterprise-owned network boundary. Zero trust focuses on protecting resources (assets, services, workflows, network accounts, etc.), not network segments, as the network location is no longer seen as the prime component to the security posture of the resource."

NIST continues: 
"In this new paradigm, an enterprise must assume no implicit trust and continually analyze and evaluate the risks to its assets and business functions and then enact protections to mitigate these risks. In zero trust, these protections usually involve minimizing access to resources (such as data and compute resources and applications/services) to only those subjects and assets identified as needing access as well as continually authenticating and authorizing the identity and security posture of each access request."

Their conclusion:
"When balanced with existing cybersecurity policies and guidance, identity and access management, continuous monitoring, and best practices, a ZTA can protect against common threats and improve an organization’s security posture by using a managed risk approach.

_______________
(*) NIST: National Institute of Standards and Technology
 


Like this blog? You'll like my books also! Buy them at any online book retailer!

Thursday, April 27, 2023

Ghostwriting for Project Communications



So, you work in project communications, perhaps directly for the PM. 
Great job, and it can be a lot of fun being creative.

In the 'old days', that probably meant long-form press releases and updates to the project web page or communications dashboard.

Today, it's those plus social media; scripts and plans for podcasts and videos; and other duties as assigned (ODAS).

And even more so today, it may mean consulting or incorporating some generative AI artifact ... text, video, art, image ... in your creation, for which you may have been the creative "prompter".

But, if you're a ghost writer (or creator) for the 'boss', who gets the credit? Do you have a byline as a contributor? And, does the boss get the credit for imaginative and effective writing or production when it's really you? And (gasp!) does the AI thing get some of the byline or creative credit, even if you're the prompter? 

Welcome to the world of ghostwriting. 
The person you're writing for gets the credit, usually, and you're lucky if you're recognized outside your PMO. You probably knew all that coming into the job. Why else is it called 'ghost' writing?
 
But what if you disagree in some fundamental way with the content of the communication you've been asked to write or produce? What then?

Two cases:
  1. Opinion: You're voicing the boss's opinion, not your own. You can write 'B' for the public, but believe 'A' privately.
  2. False facts: If not about 'beliefs' but rather about misleading or even factually incorrect material, you have an obligation to push back. Indeed, you could put yourself in legal peril for liability or defamation. 

Life is too short

If you can't live with the material you're writing, then don't write it. Find new material, and find a way to persuade the boss to let you use it.

And if all that doesn't work, you may have to fire the boss! (Aka: get a new job)




Like this blog? You'll like my books also! Buy them at any online book retailer!

Sunday, April 23, 2023

Comp by AI assessment


It's been reported variously that AI is getting into the real-time compensation process. The upshot is that similar work does not engender similar pay, even on the same project. 

We know all about you
It's been alleged that companies are amassing "deep data sets", individualized by name, for "gig workers" which provide all the information needed to fashion a customized compensation package to stimulate the behavior the project expects from that worker. Some details on the research into this practice are found here.

Discriminatory?
On the one hand, really productive people could expect compensation in accord with their value; others may feel it's "wage discrimination" based on a myriad of factors known only to the vagaries of the AI engine. 

Fair, unbiased?
Most of us know by now that these generative engines, built loosely on neural-network models, are largely "black boxes". Even the CEO of Google has said that. And not only are the engine internals obscure, but their outcomes are not wholly predictable, nor are the outcomes and results entirely repeatable.

It's also been reported that many of the biases that lurk in the "deep data sets" find their way into outcomes. 

Value demonstrated
Maybe we should continue with the mainstream comp system which relies on demonstrated value to the project. The project at large is judged that way; why not the people component as well?




Like this blog? You'll like my books also! Buy them at any online book retailer!

Saturday, April 22, 2023

Measuring stuff



There's a lot of stuff to measure if you're running a PMO.
And all that stuff shares these characteristics:
  • Measurable (attributes can be represented on a calibrated scale)
  • Calculable (numbers from the scale can be handled with arithmetic)
  • Estimable (enough is known about the entity that a credible numerical forecast can be made)
  • Determinable (essentially, not random, and thus not requiring statistical methods)
And, why do we care?
Enter: risk management, for one thing. But we also use deterministic methods to measure, calculate, and estimate cost, schedule duration, and quality characteristics.

What about apples and oranges?
Apples can be measured; oranges can be measured.
Can they be compared, one with the other?
Perhaps.
Some characteristics, like weight and density, are comparable.
Some characteristics, like susceptibility to damage if dropped, are really not comparable. It's apples and oranges, after all. They are dissimilar in certain ways that require they be separated for analysis.

That latter idea, that dissimilarity is a separator for analysis, brings us to risk management. 
To compare two dissimilar risks violates the rule that risks to be compared must be similarly mensurable.

In a posting during the COVID-19 period, Matthew Squair, writing at 'Critical Uncertainties', makes the point that early on ....
(and I quote Squair)

".... lockdown sceptics were pointing out that your risk of drowning in a pool, in California, was much higher than that of dying from Covid 19 so why to worry? if you feel this is intuitively wrong, in fact wronger [sic] than wrong, you're right.

One of these risks is based on an independent probability. That is if I drown in a pool it's not going to have an affect on the probability of my neighbour drowning. But, on the other hand if I have Covid 19 you'll find the probability of my neighbour having Covid 19 is dependent on that; that is, the probabilities are dependent.

In one we truck along with a base rate of events unaffected by each other, in the other the events can affect each other and the risk can suddenly blow up.

To be very clear the two risks are immensurable and not directly comparable."

He goes on to point out that many such immensurable comparisons were being made in the Covid space, such as the risk of getting Covid itself compared, incorrectly, to the risk of a blood clot from a vaccine, etc.
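To see Squair's distinction in numbers, here's a toy Python simulation, with all rates and sizes invented for illustration, contrasting an independent risk with a dependent (contagious) one:

```python
# Toy simulation: independent events track a base rate; dependent events
# can blow up. All probabilities and sizes are invented for illustration.
import random

random.seed(1)
N, DAYS = 10_000, 60                 # population and time horizon

# Independent risk: each person faces the same small daily probability,
# and one event has no effect on any other.
p_drown = 1e-4
drownings = sum(random.random() < p_drown
                for _ in range(N) for _ in range(DAYS))

# Dependent risk: each infected person can infect others the next day,
# so today's events change tomorrow's probabilities.
p_transmit, contacts = 0.05, 10
infected = 5
for _ in range(DAYS):
    if infected >= N:
        break                        # everyone reachable is already infected
    new = sum(random.random() < p_transmit
              for _ in range(infected * contacts))
    infected = min(N, infected + new)

print(f"independent events: {drownings:>6} (holds to the base rate)")
print(f"dependent events:   {infected:>6} (can grow exponentially)")
```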



Like this blog? You'll like my books also! Buy them at any online book retailer!

Tuesday, April 18, 2023

Mixed methods, Agile and Other




There comes a point where more planning can not remove the remaining uncertainty, instead execution must be used to provide data and remove uncertainty.

This quote comes from a nicely argued case -- from the agile blog 'leading answers' -- for mixing agile methods into rather traditional businesses, like the oil and gas exploration/production business.

If ever there was a business that benefits from Boehm's Spiral Model, OGM (oil, gas, minerals) is certainly one. (Disclosure: In the past, I've owned some OGM leases in Texas, so I've a bit of personal experience with this.)

So, what have you got here?
  • A lot of risk acknowledged up front (can't know everything -- thus the opening quote)
  • A need to run with pilot projects before committing to production
  • A need to tie into legacy systems (in the OGM case, distribution systems)
  • A lot of deliverables that can be done incrementally and then integrated
  • Small teams (small being relative), co-located (or the virtual version thereof), personally committed, with risk hanging on every move.
  • A degree of local autonomy -- even if virtual -- required to meet the challenges of the moment
Sounds like an environment that needs agility, if not agile methods, on a lot of the stuff.

Of course, there's "one big thing":

You can't go around self-organizing (agile speak) willy-nilly! There are regulatory constraints everywhere and safety-first doctrine hanging on every move.

So, yes, there is a big bureaucracy that watches over it all; it's certainly more intrusive than a coach or a servant-leader (more agile speak; I'm sure they never heard this stuff in an oil field or on an offshore rig!). In fact, I'll bet the rig boss is a force to be reckoned with!

Agile in the Enterprise
So, the bureaucracy has to be reckoned with, aka 'the enterprise'. To that point, take a read of my post about 'agile in the enterprise', or better yet, take a look at my book, 'Project Management the Agile Way: Making It Work in the Enterprise'.



Like this blog? You'll like my books also! Buy them at any online book retailer!

Friday, April 14, 2023

"Against the Gods" : a thinking perspective on risk management



If you are in the project management (read: risk management) business, one of the best books that describes the philosophy and foundation for modern risk management is Peter L. Bernstein's "Against the Gods: The Remarkable Story of Risk".

Against the Gods is historical, somewhat philosophical, and devoid of math!
It's a book for "thinkers".

Between the covers of this "must read" we learn this bit:
The essence of risk management lies in maximizing the areas where we have some control over the outcome while minimizing the areas where we have absolutely no control over the outcome and the linkage between effect and cause is hidden from us.

Peter Bernstein
"Against the Gods: The Remarkable Story of Risk"

Knowledge and control
Dealing with risk necessarily breaks down into that in which more knowledge will help us understand and deal with risk (climate change), and that in which effects are truly random and no amount of additional knowledge is going to help (rolling dice).

Bernstein goes on to develop one of the key themes of the book which is the idea that probability theory and statistical analysis have revolutionized our ability to understand and manage risk.

Picking apart Bernstein's "essence" separates matters into control and knowledge:
  • We know about it, and can fashion controls for it
  • We know about it, and we can't do much about it, even if we understand cause and effect
  • We know about it, but we don't understand the elements of cause and effect, and so we're pretty much at a loss.
  • We don't know about it, or we don't know enough about it, and more knowledge would help.
Of course, Donald Rumsfeld, in 2002, may have put it more famously:
" ....... because as we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns—the ones we don't know we don't know."
No luck
So there is an ah-hah moment here: if all things have a cause and effect, even if they are hidden, there is no such thing as luck. (Newtonian physics to the rescue once again.)

Thus, as a risk management regimen, we don't have to be concerned with managing luck! That's probably a good thing. (Oops: as luck may have it, if our project is about the subatomic level, then the randomness of quantum physics is in charge. Thus: luck?)

Indeed, our good friend Laplace, a French mathematician of some renown, said this:
Present events are connected with preceding ones by a tie based upon the evident principle that a thing cannot occur without a cause that produces it. . . .
All events, even those which on account of their insignificance do not seem to follow the great laws of nature, are a result of it just as necessarily as the revolutions of the sun.
Bernstein on Bayes' (with help from ChatGPT)

Following up on the idea of the knowledge-control linkage to risk management, Bayes' Theorem comes to mind. Bayes' is all about forming a hypothesis, testing it with real observations, and using those outcomes to refine the hypothesis, eventually arriving at a probabilistic description of the risk.

Laplace, mentioned above, is one of the architects of the probability theory that underlies Bayes'. Thus, one of the most interesting discussions in the book centers on Bayes' theorem, which Bernstein describes as "one of the most powerful tools of statistical analysis ever invented."

Bayes' theorem is both a manner of reasoning about random and unknown effects and a mathematical formula that allows us to update our beliefs about the probability of an event occurring based on new evidence. It is a powerful tool for making predictions and decisions based on incomplete information, and it has applications in fields ranging from medicine to finance to engineering.

Bernstein's discussion of Bayes' theorem in "Against the Gods" is particularly interesting because he highlights the fact that Bayesian reasoning is often at odds with our intuition. Humans have a tendency to overestimate the likelihood of rare events and underestimate the probability of more common events. Bayes' theorem provides a framework for overcoming these biases and making more accurate predictions.
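For the numerically inclined, here's a small Python sketch of a Bayes' update, with invented numbers, that shows exactly the intuition gap Bernstein describes: a warning about a rare risk is far less conclusive than it feels.

```python
# Bayes' theorem as a belief update. Numbers are invented for illustration.
def bayes_update(prior: float, p_ev_given_true: float,
                 p_ev_given_false: float) -> float:
    """P(risk | evidence), per Bayes' theorem."""
    numerator = p_ev_given_true * prior
    return numerator / (numerator + p_ev_given_false * (1 - prior))

prior = 0.01        # assume 1% of projects carry this defect
hit_rate = 0.90     # an indicator flags 90% of true defects...
false_alarm = 0.10  # ...but also flags 10% of healthy projects

posterior = bayes_update(prior, hit_rate, false_alarm)
print(f"P(defect | one flag)  = {posterior:.1%}")    # ~8.3%, not 90%

# Each further independent observation refines the estimate:
posterior2 = bayes_update(posterior, hit_rate, false_alarm)
print(f"P(defect | two flags) = {posterior2:.1%}")   # ~45%
```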

Cognitive Bias in risk management
Bernstein talks a lot about cognitive biases and their impact on decision-making under uncertainty.

According to Bernstein, cognitive biases are mental shortcuts that people use to simplify complex decisions. These shortcuts can lead to errors in judgment and decision-making. Cognitive biases can be influenced by a number of factors, including emotions, personal experience, and cultural values.

One example of a cognitive bias that Bernstein discusses in the book is the availability bias, which is the tendency to overestimate the likelihood of events that are more easily recalled from memory; others follow below.

One key point Bernstein makes is that humans have a natural tendency to be overconfident in their abilities to predict and control events. This is known as the "illusion of control" bias. People often believe they have more control over events than they actually do, leading them to take on more risk than is rational.

Another common cognitive bias is the "confirmation bias," in which people seek out information that confirms their preexisting beliefs, while ignoring or dismissing information that contradicts those beliefs. This can lead to a lack of objectivity in decision-making.

Bernstein also discusses the "hindsight bias," in which people tend to believe that an event was more predictable after it has already occurred. This bias can lead to overconfidence in future predictions, as people may believe that they could have predicted the outcome of an event that has already occurred.

Overall, Bernstein suggests that understanding and being aware of cognitive biases is essential to making better decisions and managing risk effectively. By recognizing these biases, individuals can take steps to mitigate their impact on their decision-making processes.






Like this blog? You'll like my books also! Buy them at any online book retailer!

Monday, April 10, 2023

Connecting the dollar dots: Cost-Price-Margin


Though this posting is not intended to be rigorous for accountants or tax preparers, there is nonetheless a need for the PMO to be in touch with the three "dollar dots" that monetize project value: Cost, Price, and Margin.

And, as my book title (*) says, the PMO should be one of those seeking maximum project value. So, here's a quick look at the three 'dollar dots':

1. Cost: All the money required to get from project charter to the final report. For anything other than a Kool-Aid stand, cost can be a bit tricky: direct costs, capital costs (**), and indirect costs. The first two, at least, are project inputs which are pretty much the responsibility of the PMO to estimate and then manage, after the accountants set the rules. Indirect costs may also be a cost input, but the PMO has much less to say about them. (***)

2. Price: What the customer pays (and when they pay, to wit: purchase, lease, or rent). Price is largely a marketing responsibility to determine. There are many considerations: Cost input is one of them, so the PMO's cost decisions do connect to price. Indeed, the price-point for the customer deliverable may strongly influence the project budget, not only in terms of development cost, but also the deliverable design that feeds into post-project production and post-delivery service costs. 

And beyond an intended price-point, there are other market considerations for price as well, and these considerations are usually multi-factored (discounts for some customers; meeting the competition; loss leaders; inventory clearance sales; etc.).

3. And then there's margin. For simplicity, margin is price less cost. So again, the PMO connects to this dot through the 'cost input'. In real life, margin is a pretty complex computation involving both accounting rules and tax rules. In fact, the accounting margin and the margin reported for taxes (profit) may be quite different. So, leave these computations to the accountants!
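To connect the dots numerically, here is a deliberately simplified Python sketch, with illustrative numbers only; real margin involves the accounting and tax rules noted above:

```python
# The three 'dollar dots', simplified. Figures are invented for illustration;
# real cost and margin follow the enterprise's accounting and tax rules.
direct_cost   = 800_000    # labor and materials charged to the project
capital_cost  = 150_000    # capitalized items (cash may hit the project budget)
indirect_cost = 120_000    # overhead allocated by the accountants

cost  = direct_cost + capital_cost + indirect_cost
price = 1_400_000          # set by marketing, informed by cost and the market

margin = price - cost      # the simple 'price less cost' view
print(f"cost = ${cost:,}; price = ${price:,}; margin = ${margin:,}")
print(f"margin as % of price = {margin / price:.1%}")
```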

___________

(*) "Managing Project Value" (cover picture below) 

(**) For accounting purposes, those project items that are "capitalized" will have a value on the business' balance sheet until they are depreciated over time as non-cash expenses on a P&L statement. However, the actual cash expense incurred when the items are purchased may go against the project's budget, depending on the accounting rules of your particular enterprise.

(***) Indirect costs are usually an allocation based on resource utilization. The allocation rules are generally set by the accountants. These costs are largely out of the control of the PMO even though they may appear on the project budget. 

Some direct costs may be subject to rules: for labor, accountants may set a "standard cost" per labor category, such that the project is charged that standard rate for internal labor rather than the actual salary of the employee.



Like this blog? You'll like my books also! Buy them at any online book retailer!

Friday, April 7, 2023

Is it activity, methods, or outcomes?



Back in yesteryear, I recall the first time I had a management job big enough that my team was too large for line-of-sight from my desk and location.

Momentary panic: "What are they doing? How will I know if they are doing anything? What if I get asked what are they doing? How will I answer any of these questions?"

Epiphany: What I thought were important metrics got reordered. I realized 'activity' becomes less important, whereas outcomes rise to the top.
  • Activity becomes much less important. Where and when they worked could be delegated to managers and team leaders working locally and close to the action.
  • Methods are still important because Quality (in the large sense) is buried in Methods. So, I decided that I can't let methods be delegated willy-nilly. Methods are something to be trained and practiced so that quality, predictability, and reliability are built-in.
  • Outcomes now become the biggie: are we getting results according to expectations?
There's that word: "Expectations", which of course is tied to 'results'.
In any enterprise large enough to not have line-of-sight to everyone, there are going to be lots of 'distant' managers, executives, investors, and customers who have 'expectations'. And, they have the money! 

But not only do they have the money, they have a big say about how the money is going to be allocated and spent. So, you don't get a free ride on making up your own expectations (if you ever did).

Results, tied to Expectations
The PMO job is to map expectations into doable results, which is another version of 'deliverables'. Deliverables are, of course, multi-dimensional: feature, function, schedule (or availability to users and customers), and cost-price-margin, which connects the dots between the project (cost), the customer (price), and the business (margin).

Back to my epiphany.
At the End of the Day:
  • I had 800 on my team
  • 400 of them were in overseas locations
  • 400 of them were in multiple US locations
  • I had multiple offices
  • It all worked out: we made money!




Like this blog? You'll like my books also! Buy them at any online book retailer!

Tuesday, April 4, 2023

Minimize your maximum cost


Full disclosure: I wrote this posting myself, but I did ask ChatGPT for some ideas to include. 

It's always a PMO objective to minimize cost if scope and quality and schedule are constant. But they never are. So, those parameters are usually intertwined and mutually dependent variables along with cost. 

But suppose for discussion that scope and quality are held constant (not to be traded off to save cost or schedule), and the primary objective is minimization of cost. Here are a few ideas.

Labor-dominant projects
I'm talking about projects where labor is 60% or more of the cost. Many software projects fall in this box, but many other intellectual content (IC) projects do as well: HR, finance, marketing, just to name a few.

Productivity
Assuming competence is not in question, the first order of business is productivity, which is always a ratio: output valued by the customer per unit of labor required for achievement. As in all ratios, the PMO can work on maximizing the numerator and minimizing the denominator. 

Getting the numerator right the first time minimizes the cost of waste and rework and minimizes schedule mishaps. The skill required: good communications with the people who establish the value proposition. 

Minimize the "marching army" cost
But the numerator is also about finding useful outcomes for the "white space" that crops up: you have staff in place, you can't afford to let them scatter when there is downtime, so you have to have a ready backlog of useful second-tier stuff. Staff you can't afford to lose, but who may have downtime nonetheless, are often labeled the "marching army".

The denominator is sensitive to organizational stability and predictability, personal skills, tools, interferences, teamwork, and remote working. Anything that the PMO can do about the first five is more or less mainstream PMO tasking. 

Remote working:
But the issue of large-scale remote working is somewhat new since the Covid thing. Loosely coupled to that is a greater emphasis on work-life balance rather than "do whatever it takes", often for no overtime pay. 

That, in turn, has spawned more of the "do the minimum not to get fired" mindset. All of that has cast a shadow on remote working.

Cost-free synergism.
Consequently, the pendulum has swung in the direction of minimizing remote working in order to get the synergistic production (at nearly zero cost) from casual contacts with other experts and innovators, to say nothing of problem avoidance and thereby waste and rework avoidance.

Risk management and scheduling
When it comes to labor, the first risk is dependable and predictable availability, particularly if the staff are so-called gig workers. Many PMOs limit W-9 contractors to less than 25% of the workforce for just this reason.
One antidote is loose coupling of schedule tasks to allow for the occasional misstep in staffing. After all, even W-2 employees have matters that interfere.

Material-dominant projects
Here is where a lot of construction projects, hardware development, and critical (or scarce) material projects come in.

Material impacts are largely mitigated by the usual strategies of earliest possible order, acceptance of interim and partial shipments, incentives for faster delivery, and strategic stockpiles of frequently used items.

The workforce for many of these types of projects is often contracted by specific trades licensed for specific work. It's typical that these contractors operate in a matrix-management environment of multiple and independent customers vying for a scarce and technical workforce. The impact is uncertainty of schedule and availability, and a cascade of dependencies that have to be reworked.

The customary approach to scarcity is cost incentives to direct resources to your need. 
The mitigation for cascading dependencies is to schedule as loosely as possible, so that slack among tasks forms risk-management buffers against a slipping schedule.
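As a sketch of that last idea, here's a minimal Python pass over an invented task network, computing the slack (float) that forms the risk buffer against a slipping schedule:

```python
# Forward/backward pass over a toy task network to compute slack (float).
# Task names, durations, and dependencies are invented for illustration.
tasks = {   # task: (duration_days, predecessors); listed in a valid topological order
    "order_materials": (10, []),
    "site_prep":       (5,  []),
    "foundation":      (7,  ["site_prep"]),
    "framing":         (12, ["order_materials", "foundation"]),
    "inspection":      (2,  ["framing"]),
}

# Forward pass: earliest start and finish for each task.
early = {}
for name, (dur, preds) in tasks.items():
    es = max((early[p][1] for p in preds), default=0)
    early[name] = (es, es + dur)

project_end = max(ef for _, ef in early.values())

# Backward pass: latest start and finish, then slack = late start - early start.
late = {}
for name in reversed(list(tasks)):
    succs = [s for s, (_, preds) in tasks.items() if name in preds]
    lf = min((late[s][0] for s in succs), default=project_end)
    late[name] = (lf - tasks[name][0], lf)

for name in tasks:
    slack = late[name][0] - early[name][0]
    flag = "  <- critical, no buffer" if slack == 0 else ""
    print(f"{name:16s} slack = {slack:2d} days{flag}")
```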
  


Like this blog? You'll like my books also! Buy them at any online book retailer!