Tuesday, November 13, 2007

Classical Requirements Verification Methods

Back in August, I started introducing Program Management with an introduction to the System Development Life Cycle. I have talked in more detail about some of the early phases of the life cycle, like Operational Concept Development and System Requirements Specification, but right now I would like to skip ahead to the Test phase.

We are at a point in our basic switch replacement process when we have to start planning to test the candidates that have made it through our paper evaluation.

In this part of the process, we need to verify that the devices we are considering actually meet our stated requirements. I will provide a future post on the testing process, but before I do it is good to have an understanding of the four classical requirements verification methods:

  • Inspection
  • Demonstration
  • Test
  • Analysis
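One common way to keep track of these method decisions is a verification cross-reference matrix that maps each requirement to the method that will verify it. A minimal sketch in Python (the requirement IDs are the ones used in this post; the matrix itself is my own illustration, not an artifact from the project):

```python
# Verification cross-reference matrix: maps each requirement ID to the
# classical method that will be used to verify it. IDs are from this post.
VERIFICATION_MATRIX = {
    "PC2": "inspection",     # port connectors on the front panel
    "AE1": "inspection",     # at least 32 10/100/1000BASE-T ports
    "EN1": "inspection",     # operating temperature range (vendor docs)
    "CM5": "demonstration",  # manageable locally via console
    "IO1": "test",           # IPv4 unicast at full line rate
    "ZZ1": "analysis",       # average jitter under 10 ms
}

def requirements_for(method):
    """Return the requirement IDs verified by the given method."""
    return sorted(rid for rid, m in VERIFICATION_MATRIX.items() if m == method)

print(requirements_for("inspection"))  # ['AE1', 'EN1', 'PC2']
```

A matrix like this also makes it easy to spot requirements that have no assigned verification method before testing begins.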

Inspection

Inspection is observation using one or more of the five senses, simple physical manipulation, and mechanical and electrical gauging and measurement to verify that the item conforms to its specified requirements.

For instance, we have this requirement on the physical characteristics of our basic switch:

PC2. Port connectors provided on the front panel

We will verify this requirement through inspection. That is, we will look at the switch and observe where the port connectors are located.

In fact, we will verify many of our requirements through inspection, including all of the requirements indicating port types and counts, like:

AE1. Provides at least 32 10/100/1000BASE-T ports

In addition, for anything that is beyond our capability to test, we must rely on vendor documentation, and we consider reviewing that documentation to be inspection as well. For instance:

EN1. Operates in temperatures between 32 and 104°F

Some companies would put the item in a temperature chamber and cycle the temperature while conducting performance tests. We do not have that capability, so we will satisfy ourselves with an inspection of the vendor documentation, using the reported operating temperature range to determine whether the units satisfy this requirement.
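The documentation check itself boils down to a simple range comparison. A sketch, with made-up vendor figures (only the 32 to 104°F requirement comes from this post):

```python
def covers(required, documented):
    """True if the vendor's documented operating range covers the
    required range. Both arguments are (low, high) tuples in degrees F."""
    req_lo, req_hi = required
    doc_lo, doc_hi = documented
    return doc_lo <= req_lo and doc_hi >= req_hi

# EN1 requires operation from 32 to 104 F. The vendor figures below
# are hypothetical, standing in for what a data sheet might report.
print(covers((32, 104), (23, 113)))  # True: documented range covers EN1
print(covers((32, 104), (40, 113)))  # False: 32 F is below the documented low
```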

Demonstration

Demonstration is the actual operation of an item to provide evidence that it accomplishes the required functions under specific scenarios.

Consider this requirement:

CM5. Manageable locally using console access

We will plug a local console device into the item and demonstrate that we can use it to perform a sampling of management functions.

Test

Test is the application of scientific principles and procedures to determine the properties or functional capabilities of items.

Test is similar to demonstration, but is more exacting, generally requiring specialized test equipment, configuration, data, and procedure in order to verify that the item satisfies the requirement.

Consider this requirement:

IO1. Passes IPv4 unicast packets at full line rate

We will verify this requirement by testing. We will use specialized test equipment (SmartBits Data Sheet), connect it to the item in a particular way, configure the control software just so, and run specific data through it according to a repeatable procedure, from which we will get a binary result indicating whether the item did or did not satisfy the requirement.
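To give a sense of what "full line rate" means numerically, here is a back-of-the-envelope calculation of the theoretical maximum frame rate on a gigabit Ethernet port, using the standard 8-byte preamble/SFD and 12-byte interframe gap overhead (the calculation is my own illustration, not from the post):

```python
def max_frame_rate(frame_bytes, link_bps=1_000_000_000):
    """Theoretical maximum Ethernet frames per second on a link.

    Each frame on the wire is preceded by 8 bytes of preamble/SFD and
    followed by a 12-byte minimum interframe gap, so a frame occupies
    (frame_bytes + 20) byte-times of link capacity.
    """
    wire_bits = (frame_bytes + 8 + 12) * 8
    return link_bps / wire_bits

# Worst case: minimum-size 64-byte frames on a gigabit port.
print(round(max_frame_rate(64)))    # 1488095 frames/sec
# Best case: maximum-size 1518-byte frames.
print(round(max_frame_rate(1518)))  # 81274 frames/sec
```

A line-rate test then amounts to offering frames at exactly this rate and checking that none are dropped; the test equipment automates that comparison.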

Analysis

Analysis is the use of established technical or mathematical models or simulations, algorithms, or other scientific principles and procedures to provide evidence that the item meets its stated requirements.

As test is a more involved version of demonstration, analysis is testing on steroids. In analysis, many tests may be performed, but the results of any given test do not give a pass or fail indication; rather, all of the results must be taken in concert, and we must perform some further operation to determine whether the item satisfies the requirement.

Let me say that I do not believe we have any requirements that require analysis in this case. After all, this is the basic switch. However, if we had a requirement like this:

ZZ1. Provides an average jitter of less than 10 milliseconds

In this case, we would have to verify this requirement using analysis. We would perform many tests, collect the resulting data, measure the time between output packets, and then calculate the average. We might repeat this many times over a 24-hour period and show whether the average jitter ever went above 10 milliseconds.
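A sketch of that analysis step in Python, assuming we already have per-packet arrival timestamps from a test run (the timestamps below are made up for illustration; "jitter" here is computed as the mean absolute deviation of the inter-packet gaps from their own mean):

```python
def average_jitter(arrival_times):
    """Average jitter of a packet stream, in the same units as the input.

    arrival_times: packet arrival timestamps in seconds, in order.
    Jitter is taken as the mean absolute deviation of the inter-packet
    gaps from the mean gap.
    """
    gaps = [b - a for a, b in zip(arrival_times, arrival_times[1:])]
    mean_gap = sum(gaps) / len(gaps)
    return sum(abs(g - mean_gap) for g in gaps) / len(gaps)

# Illustrative run: packets nominally 10 ms apart with small wobble.
times = [0.000, 0.010, 0.021, 0.030, 0.042, 0.050]
jitter_s = average_jitter(times)
print(jitter_s < 0.010)  # True: this run's average jitter satisfies ZZ1
```

The analysis part is everything outside the function: repeating such runs over the 24-hour period, then judging the collection of averages against the 10-millisecond threshold, since no single run is itself a pass or fail.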


3 Comments:

At 5/11/2009 07:47:00 PM , Blogger J.R. said...

Love this description, but I have a quibble. In non-IT-centric requirements verification, analysis usually refers to pencil-and-paper (or computer-based) modeling of the problem. So for a requirement of "The beam shall hold a five-gallon bucket of water without deflecting to greater than 5 microstrain," you'd have:

- Inspection: "Yeah, that's not going anywhere."
- Analysis: "A bucket that size, full of water, should weigh X kilograms. The beam is MxN millimeters in cross section and L millimeters long. A simple statics model suggests that it will only deflect by 2.5 microstrain."
- Demonstration: "Look, it's not deflecting."
- Test: "We verified that the water was no less than five gallons, and that its density was as expected. The bucket and beam have been measured and are both to spec. Strain gauges connected to the beam registered no more than 4 microstrain."

For computer hardware it's tougher to say what constitutes a "computer model," though. Does hardware-in-the-loop automatically make it a demonstration? Does measuring throughput with a packet sniffer make it a test? I don't think so, but it's much harder to draw the lines in network testing.

At 5/12/2009 08:47:00 AM , Blogger Mark (the Brush Valley Brewer) said...

I'd say that was an excellent summary. I'll use it in the future if you don't mind.

Regarding "hardware-in-the-loop," I normally think of demonstration as "hardware-in-real-life," that is, in the field, operating, doing the thing it was built to do in the way it was built to do it. For me, "hardware-in-a-contrived-setup" is a test.

If you're only showing (running out of gerunds here) one aspect of the device — that is, the light comes on when power is applied — it's harder to tell a test from a demonstration. Though I think your summary makes it clearer. "Look, the light came on when I plugged it in," is a demonstration.

Plugging an Intrusion Prevention device into a test network with a simulation of a bot army and showing that an un-patched Windows XP box survives more than 90 seconds is a test. Plugging the same thing into your enterprise network border and showing that your help desk calls for the day go down by 90% is a demonstration.

One more analogy.

A skinny guy with a goatee and a black cape and top hat made a rabbit disappear. I saw it with my own eyes. That's a demonstration.

At 11/05/2009 02:16:00 PM , Blogger Pete said...

Great example! Mind if I reference them?
