Saturday, November 30, 2013

Security design principles



Security of software systems is all the buzz these days, with the emergence of official and unofficial surveillance and hacking. So one might wonder: why look back nearly 40 years to a 1974 paper on system security principles?

Answer: Some stuff is timeless, and some stuff is still valid after four decades.

We refer, of course, to the classic by Jerome H. Saltzer and Michael D. Schroeder entitled "The Protection of Information in Computer Systems", arguably the most important part of which is the famous set of eight design principles.

Saltzer and Schroeder's Design Principles
Each principle refers to the "protection mechanism":

Principle of Economy of Mechanism
... should have a simple and small design.

Principle of Fail-safe Defaults
... should deny access by default, and grant access only when explicit permission exists.

Principle of Complete Mediation
... should check every access to every object.

Principle of Open Design
... should not depend on attackers being ignorant of its design to succeed. It may however be based on the attacker's ignorance of specific information such as passwords or cipher keys.

Principle of Separation of Privilege
... should grant access based on more than one piece of information.

Principle of Least Privilege
... should force every process to operate with the minimum privileges needed to perform its task.

Principle of Least Common Mechanism
... should be shared as little as possible among users.

Principle of Psychological Acceptability
... should be easy to use (at least as easy as not using it).
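By way of illustration (mine, not the paper's), here is a minimal Python sketch of two of these principles working together: fail-safe defaults and complete mediation. The ACL, users, and objects are all hypothetical:

```python
# A minimal sketch of fail-safe defaults and complete mediation.
# Everything here (the ACL, users, and objects) is hypothetical;
# this is an illustration, not anything from the paper.

# Access-control list: (user, object) -> set of granted actions
ACL = {
    ("alice", "payroll.db"): {"read"},
    ("bob", "payroll.db"): {"read", "write"},
}

def is_allowed(user: str, obj: str, action: str) -> bool:
    """Complete mediation: every access goes through this one check.
    Fail-safe default: no ACL entry means access is denied."""
    return action in ACL.get((user, obj), set())

def read_object(user: str, obj: str) -> str:
    if not is_allowed(user, obj, "read"):
        raise PermissionError(f"{user} may not read {obj}")
    return f"<contents of {obj}>"   # stand-in for the real read

print(read_object("alice", "payroll.db"))   # granted: explicit entry exists
try:
    read_object("carol", "payroll.db")      # denied: no entry, deny by default
except PermissionError as e:
    print(e)
```

The design point to notice: absence of an explicit grant is a denial, and every access decision flows through exactly one check.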



Thursday, November 28, 2013

PM Flashblog E-book... read now!


The PM Flashblog E-book has been published, and you may have seen links to it on dozens of other websites. This E-book is a compilation of many of the posts that were flashed out simultaneously on August 24/25 (depending on your time zone).

My contribution is on page 46, but there are lots of interesting posts throughout:

http://www.sqpegconsulting.com



Compliments to Allen Ruddock for putting this E-book together.




Monday, November 25, 2013

The four faces of risk


1
When you say "risk management" to most PMs, what jumps to mind is the quite orthodox conception of risk as the duality of an uncertain future event's impact and the probability of that event happening.

Around these two ideas -- impact and frequency -- we've discussed in this blog and elsewhere the conventional management approaches. This conception is commonly called the "frequentist" view/definition of risk, depending as it does on the frequency of occurrence of a risk event. This is the conception presented in Chapter 11 of the PMBOK.

The big criticism of the frequentist approach, particularly in project management, is that too often there is no quantitative back-up or calibration for the probabilities, and sometimes not for the impact either. This means the PM is just guessing. Sponsors push back and the credibility of the risk register suffers. If you're going to guess at probabilities, skip down to Bayes!
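For the record, the frequentist arithmetic itself is trivial; here's a minimal sketch with an entirely hypothetical risk register (which is rather the point: the numbers are easy, it's calibrating them that's hard):

```python
# A sketch of the frequentist arithmetic: exposure = probability x impact.
# The register entries (and their numbers) are hypothetical; in practice,
# as noted above, the probabilities are too often uncalibrated guesses.
risk_register = [
    {"event": "key vendor slips delivery", "probability": 0.30, "impact": 50_000},
    {"event": "requirements churn rework", "probability": 0.50, "impact": 20_000},
]

for risk in risk_register:
    exposure = risk["probability"] * risk["impact"]
    print(f"{risk['event']}: expected exposure ${exposure:,.0f}")
```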

However... (there's always a however, it seems) there are three other conceptions of risk that are not frequentist in their foundation. Here are a few thoughts on each:

2
Failure Mode and Effects Analysis (FMEA): Common in many large-scale and complex system projects, and used widely at NASA and in the US DoD. FMEA focuses on how things fail, and seeks to thwart such failures, thus designing risk out of the environment. Failures are selected for their impact with essentially no regard for frequency, because most of the important failures occur so infrequently that statistics are meaningless. Example: run-flat tires. Another example: WMD countermeasures.
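Here's a sketch of that posture, with hypothetical failure modes: severity of effect drives the ranking, and frequency doesn't even get a column:

```python
# A sketch (hypothetical data) of the FMEA posture described above:
# rank failure modes by the severity of their effects, with no
# frequency column at all, then design the worst ones out.
failure_modes = [
    {"mode": "tire loses pressure at speed", "severity": 9, "fix": "run-flat tires"},
    {"mode": "cabin light flickers",         "severity": 2, "fix": "accept as-is"},
]

for fm in sorted(failure_modes, key=lambda f: f["severity"], reverse=True):
    print(f"severity {fm['severity']}: {fm['mode']} -> {fm['fix']}")
```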

3
Bayes/Bayes theorem/Bayesians: Bayesians define risk as the gap between a present (or, more properly, 'a priori') estimate of an event and an observed outcome/value of the actual event (more properly called the posterior value).

There is no hint of the frequentist in Bayes; it's simply about gaps: what we think we know versus what it turns out we should have known. The big criticism, leveled by frequentists, is about the 'a priori' estimate: it's often a guess, a 50/50 estimate to get things rolling.

Bayes analysis can be quite powerful... it was first conceived in the 18th century by an English mathematician and preacher named Thomas Bayes. But it came into its own in WWII, when it became the basis for much of the theory behind antisubmarine warfare.

But it can also be a flop: our 'a priori' may be so far off base that there is never a reasonable convergence of the gap, no matter how long we observe or how many observations we take.
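To see the gap idea in action, here's a minimal sketch of Bayesian updating; the Beta prior and the observation stream are my hypothetical illustration, not anything canonical:

```python
# A minimal sketch of Bayesian updating (the prior and the observation
# stream are hypothetical). Start with the 50/50 'a priori' estimate that
# a task finishes on time, then let observations move the posterior;
# the "risk" is the gap between prior belief and posterior estimate.

prior_a, prior_b = 1, 1             # Beta(1, 1): the 50/50 guess to get rolling
prior_mean = prior_a / (prior_a + prior_b)

a, b = prior_a, prior_b
for on_time in [1, 0, 1, 1, 1, 0]:  # 1 = finished on time, 0 = late
    a, b = a + on_time, b + (1 - on_time)
    posterior_mean = a / (a + b)
    gap = abs(posterior_mean - prior_mean)
    print(f"P(on time) = {posterior_mean:.2f}, gap from prior = {gap:.2f}")
```

With enough honest observations the gap usually settles down; with a wild prior and only a few observations, it may not, and that's the flop case.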

4
Insufficient controllability, aka autonomous operations: the degree to which we have command of events. Software particularly, and autonomous systems generally, are considered a "risk" because we lack absolute control. See also: control freak managers. See also the movie 2001: A Space Odyssey. Again, no conception of frequency.

Do you have a comment? Optional, of course.
John - Instructor


Saturday, November 23, 2013

Work from home infographic


If you work from home, full time, or just a few days a week, you'll identify with this infographic big time.


The Work From Home Disadvantage






Wednesday, November 20, 2013

It's urgent -- but not important


Kotter* says: Provoke urgency to get change moving

In your project life, you are going to be faced from time to time with establishing and promoting urgency and importance as tools you can go to in effecting change.

A few words about these popular choices:
  • One thing to keep in mind is that urgency and importance are not the same thing. Many things that are urgent are simply not that important... they are urgent only because a temporal sequencing issue puts them at the head of the line and demands they be done now.
  • And, many important things may have weak sequencing demands, so long as, in the end, they get done.
Kotter, of course, is using urgency as a prod to get things going. The caution is the familiar bromide:  "Nothing is urgent if everything is urgent".




* John P. Kotter, "Leading Change"


Monday, November 18, 2013

Leveling up..


Ever been told you're working above your pay grade? Maybe you're "leveling up". That's the label you get when you work or consult with others at a higher level, even if they are only an intellectual or experience level beyond yours.

Consider the challenges*:

John Baez tells us:
Sometimes, in your ... career, you find that your slow progress, and careful accumulation of tools and ideas, has suddenly allowed you to do a bunch of new things that you couldn’t possibly do before. ...when they’ve all become second nature, a whole new world of possibility appears.

You have “leveled up”, if you will. Something clicks, but now there are new challenges, and now, things you were barely able to think about before suddenly become critically important.

It’s usually obvious when you’re talking to somebody a level above you, because they see lots of things instantly when those things take considerable work for you to figure out.

Talking to somebody two or more levels above you is a different story. They’re barely speaking the same language, and it’s almost impossible to imagine that you could ever know what they know.

Somebody three levels above is actually speaking a different language. They probably seem less impressive to you than the person two levels above, because most of what they’re thinking about is completely invisible to you.


Friday, November 15, 2013

Insufficient controllability


"Insufficient controllability"... it just rolls out when you say it. And, it's one definition of risk*. A related malady is: "autonomy", as in autonomous machines achieved with (gasp!) autonomous software.

Critical Uncertainties (Matthew Squair) brings us these gems in a posting about machine autonomy.
Squair tells us:
From this perspective the approach of the authors to risk can be seen as a reflection of our human fear of loss of control. The greater the autonomy of automation the greater our perception of risk. Thus a loss of control to something as intangible as a software program is always going to be perceived as a risk.

It seems reasonable when you think about it. Perhaps "insufficient controllability" belongs on the long list of cognitive biases that inform risk attitude. It certainly explains "control freak" management styles!



* Brun, W. (1994). Risk perception: Main issues, approaches and findings. In G. Wright & P. Ayton (Eds.), Subjective probability (pp. 395-420). Chichester: John Wiley and Sons.
