Back in August, I introduced Program Management with an overview of the System Development Life Cycle. I have talked in more detail about some of the early phases of the life cycle, like Operational Concept Development and System Requirements Specification, but right now I would like to skip ahead to the Test phase.
We are now at the point in our basic switch replacement process where we must start planning to test the candidates that have made it through our paper evaluation.
In this part of the process, we need to verify that the devices we are considering actually meet our stated requirements. I will cover the testing process in a future post, but before I do, it helps to understand the four classical requirements verification methods:
Inspection is observation using one or more of the five senses, simple physical manipulation, and mechanical and electrical gauging and measurement to verify that the item conforms to its specified requirements.
For instance, we have this requirement on the physical characteristics of our basic switch:
PC2. Port connectors provided on the front panel
We will verify this requirement through inspection. That is, we will look at the switch and observe where the port connectors are located.
In fact, we will verify many of our requirements through inspection, including all of the requirements indicating port types and counts, like:
AE1. Provides at least 32 10/100/1000BASE-T ports
In addition, for anything that is beyond our capability to test, we must rely on vendor documentation. We consider this review of documentation to be a form of inspection as well. For instance:
EN1. Operates in temperatures between 32 and 104°F
Some companies would put the item in a temperature chamber and cycle the temperature while conducting performance tests. We do not have that capability. Instead, we will inspect the vendor documentation for the reported operating temperature range to determine whether the units satisfy this requirement.
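A documentation inspection like this reduces to a simple range comparison: does the vendor's documented operating range cover the range EN1 requires? A minimal sketch, assuming entirely hypothetical vendor figures (they are not from any real data sheet):

```python
# Illustrative check for requirement EN1: operates between 32 and 104 F.
# The vendor-documented ranges used below are made-up examples.

REQUIRED_MIN_F = 32.0   # EN1 lower bound, degrees Fahrenheit
REQUIRED_MAX_F = 104.0  # EN1 upper bound

def meets_en1(vendor_min_f: float, vendor_max_f: float) -> bool:
    """Pass only if the documented range fully covers the required range."""
    return vendor_min_f <= REQUIRED_MIN_F and vendor_max_f >= REQUIRED_MAX_F

# A data sheet claiming 23 to 113 F covers 32-104 F; one claiming
# 41 to 104 F misses the required lower bound.
print(meets_en1(23.0, 113.0))  # True
print(meets_en1(41.0, 104.0))  # False
```

The point is only that inspection of documentation yields the same pass/fail judgment as any other verification method; the evidence is just the vendor's published claim rather than our own measurement.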
Demonstration is the actual operation of an item to provide evidence that it accomplishes the required functions under specific scenarios.
Consider this requirement:
CM5. Manageable locally using console access
We will plug a local console device into the item and demonstrate that we can use it to perform a sampling of management functions.
Test is the application of scientific principles and procedures to determine the properties or functional capabilities of items.
Test is similar to demonstration, but is more exacting, generally requiring specialized test equipment, configuration, data, and procedure in order to verify that the item satisfies the requirement.
Consider this requirement:
IO1. Passes IPv4 unicast packets at full line rate
We will verify this requirement by testing. We will use specialized test equipment (SmartBits Data Sheet), connect it to the item in a particular way, configure the control software just so, and run specific data through it according to a repeatable procedure, from which we will get a binary result indicating whether the item did, or did not, satisfy the requirement.
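To make the binary result concrete, here is a sketch of the pass/fail reduction for one trial. The theoretical line rate below is the standard figure for minimum-size frames on gigabit Ethernet; the tester frame counts are hypothetical, not actual SmartBits output:

```python
# Illustrative pass/fail reduction for requirement IO1.
# On gigabit Ethernet, each minimum-size frame occupies 64 bytes of frame
# + 8 bytes preamble + 12 bytes inter-frame gap = 84 bytes = 672 bit times,
# so line rate is 1,000,000,000 / 672 = ~1,488,095 frames per second.

LINK_BPS = 1_000_000_000            # 1000BASE-T link speed
BITS_PER_MIN_FRAME = (64 + 8 + 12) * 8

line_rate_fps = LINK_BPS / BITS_PER_MIN_FRAME

def passes_io1(frames_offered: int, frames_forwarded: int) -> bool:
    """Binary result: pass only if no frames were lost at line rate."""
    return frames_forwarded == frames_offered

print(int(line_rate_fps))                     # 1488095
print(passes_io1(148_809_523, 148_809_523))   # True: zero loss
print(passes_io1(148_809_523, 148_808_900))   # False: frames dropped
```

Note that the test itself does all the work; the pass/fail determination at the end is trivial. That is what distinguishes test from analysis, where the raw results still need further processing.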
Analysis is the use of established technical or mathematical models or simulations, algorithms, or other scientific principles and procedures to provide evidence that the item meets its stated requirements.
As test was like a more involved version of demonstration, so analysis is like testing on steroids. In analysis, many tests may be performed, but the results of any given test do not give a pass or fail indication; rather, all of the results must be taken in concert, and we must perform some further operation in order to determine whether the item satisfies the requirement.
Let me say that I do not believe we have any requirements that require analysis in this case. After all, this is the basic switch. However, if we had a requirement like this:
ZZ1. Provides an average jitter of less than 10 milliseconds.
In that case, we would have to verify the requirement using analysis. We would perform many tests, collect the resulting data, measure the time between output packets, and then calculate the average. We might repeat this many times over a 24-hour period and show whether the average jitter ever exceeded 10 milliseconds.
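The "further operation" that makes this analysis rather than test can be sketched in a few lines. This is only an illustration of the hypothetical ZZ1 requirement: jitter is taken simply as the variation between consecutive inter-packet gaps, and the capture timestamps are invented for the example:

```python
# Illustrative analysis sketch for the hypothetical ZZ1 requirement.
# Arrival timestamps and the jitter definition here are assumptions
# made for the example, not output from any real capture.

def average_jitter_ms(arrival_times_ms):
    """Mean absolute change between consecutive inter-packet gaps."""
    gaps = [b - a for a, b in zip(arrival_times_ms, arrival_times_ms[1:])]
    deltas = [abs(g2 - g1) for g1, g2 in zip(gaps, gaps[1:])]
    return sum(deltas) / len(deltas)

def satisfies_zz1(runs):
    """Pass only if every run's average jitter stays under 10 ms."""
    return all(average_jitter_ms(times) < 10.0 for times in runs)

# Two hypothetical capture runs (packet arrival times in milliseconds):
steady = [0, 20, 41, 61, 82, 102]    # gaps vary by about 1 ms
bursty = [0, 20, 55, 60, 110, 115]   # gaps swing wildly

print(satisfies_zz1([steady]))           # True: average jitter ~1 ms
print(satisfies_zz1([steady, bursty]))   # False: bursty run exceeds 10 ms
```

No single captured packet tells us anything by itself; only the aggregation across all the runs yields the pass/fail answer, which is exactly the distinction between analysis and test.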