Wednesday, October 29, 2014

Requirements entropy framework

This one may not be for everyone. From the Winter 2014 issue of Systems Engineering comes a paper about "... a requirements entropy framework (REF) for measuring requirements trends and estimating engineering effort during system development."

Recall from your study of thermodynamics that the 2nd Law is about entropy, a measure of disorder. And Lord knows, projects have entropy, to say nothing of requirements!

The main idea behind entropy is that there is disorder in all systems, natural and otherwise, and a degree of residual disorder that can't be zeroed out. Thus all capacity can never be used; a trend can never be perfect; an outcome will always carry a bit of noise. The challenge is to get as close to the ideal as possible. The information-theoretic version of this insight is credited to Bell Labs scientist Claude Shannon.
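Shannon's measure makes "disorder" a number. A minimal sketch in Python (the function name is mine, not from the paper): entropy is zero when an outcome is certain and grows as outcomes become more evenly spread.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits.
    Zero-probability outcomes contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A perfectly ordered system (one certain outcome) has zero entropy;
# spreading probability evenly across more outcomes raises it.
print(shannon_entropy([1.0]))        # 0.0 bits
print(shannon_entropy([0.5, 0.5]))   # 1.0 bit
print(shannon_entropy([0.25] * 4))   # 2.0 bits
```

The same arithmetic applies whether the "outcomes" are symbols in a message or, as the REF proposes, requirements scattered across quality states.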

Now, in the project business, we've been in the disorder business a long time. Testers are constantly looking at the residual disorder in systems: the velocity of trouble reports, their degrees of severity, and so on.

And requirements people work the same way: tracking the velocity and nature of changes to the backlog, and so on.

One always hopes the trend line is favorable and the system entropy is going down.

So, back to the requirements framework. Our systems engineering brethren are out to put a formal trend line on the messiness of stabilizing requirements.

Here's the abstract of the paper for those who are interested. The complete paper is behind a paywall:
This paper introduces a requirements entropy framework (REF) for measuring requirements trends and estimating engineering effort during system development.

The REF treats the requirements engineering process as an open system in which the total number of requirements R transitions from initial states of high requirements entropy HR, disorder, and uncertainty toward the desired end state as R increases in quality.

The cumulative requirements quality Q reflects the meaning of the requirements information in the context of the SE problem.

The distribution of R among N discrete quality levels is determined by the number of quality attributes accumulated by R at any given time in the process. The number of possibilities P reflects the uncertainty of the requirements information relative to the desired end state. The HR is measured or estimated using R, N, and P by extending principles of information theory and statistical mechanics to the requirements engineering process.
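The paper's exact formula for HR is behind the paywall, but the abstract's description suggests something close to a Shannon entropy over the distribution of R requirements among the N quality levels. A hedged sketch of that reading (my own construction, not the paper's method):

```python
import math

def requirements_entropy(counts):
    """Hypothetical sketch: entropy (in bits) of the distribution of R
    requirements across N quality levels, where counts[i] is the number
    of requirements currently sitting at quality level i.
    This is plain Shannon entropy, not the paper's exact HR formula."""
    R = sum(counts)
    probs = [c / R for c in counts if c > 0]
    return -sum(p * math.log2(p) for p in probs)

# Early in a project: requirements scattered evenly across four
# quality levels -> high entropy (maximum disorder).
early = [25, 25, 25, 25]
# Later: most requirements have accumulated enough quality attributes
# to sit at the top level -> low entropy.
late = [2, 3, 5, 90]
print(requirements_entropy(early))  # 2.0 bits
print(requirements_entropy(late))   # about 0.62 bits
```

On this reading, the favorable trend the post mentions earlier (system entropy going down) is just this number falling toward zero as the population concentrates at the desired quality level.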

The requirements information I increases as HR and uncertainty decrease, and ΔI is the additional information necessary to achieve the desired state from the perspective of the receiver. The HR may increase, decrease or remain steady depending on the degree to which additions, deletions and revisions impact the distribution of R among the quality levels.

Current requirements volatility metrics generally treat additions, deletions, and revisions the same and simply measure the quantity of these changes over time. The REF measures the quantity of requirements changes over time, distinguishes between their positive and negative effects in terms of HR and ΔI, and forecasts when a specified desired state of requirements quality will be reached, enabling more accurate assessment of the status and progress of the engineering effort.
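The contrast with naive volatility metrics can be made concrete. A hypothetical illustration (again my own, not the paper's): a raw change count scores every edit the same, while an entropy-based view asks whether each change moved the distribution toward or away from order.

```python
def naive_volatility(changes):
    """Conventional metric: additions, deletions, and revisions
    are all counted identically."""
    return len(changes)

def entropy_delta(h_before, h_after):
    """Entropy-based view: a negative delta means the change reduced
    disorder (progress toward the desired state); a positive delta
    means it added disorder (rework, destabilization)."""
    return h_after - h_before

changes = ["add", "revise", "delete"]
print(naive_volatility(changes))   # 3 -- same score whether helpful or not
print(entropy_delta(2.0, 1.5))     # -0.5 (a beneficial revision)
print(entropy_delta(1.5, 1.75))    # 0.25 (a destabilizing addition)
```

That sign distinction is what lets a trend line forecast when requirements will stabilize, rather than merely reporting how busy the backlog is.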

Results from random variable simulations suggest the REF is an improved leading indicator of requirements trends that can be readily combined with current methods. The additional engineering effort ΔE needed to transition R from their current state to the desired state can also be estimated. Simulation results are compared with measured engineering effort data for Department of Defense programs, and the results suggest the REF is a promising new method for estimating engineering effort for a wide range of system development programs.
