Aerospace Systems Engineer

Validation & Verification

Test Development
Much of what we test is transitions from one state/mode/condition to another. These transitions have several items in common that must be considered in crafting a valid test. In the following, I use “state” generically: it could be the state of a component (active/inactive, enabled/disabled) or a system or subsystem mode (standby, operate, maintenance).

Steps toward developing a valid test.

1. Define the state

         1. What are the salient characteristics of the state?  What defines being in the state?
         2. What are the monitors and values of those monitors when in the state?
  
2. Define any behavioral constraints on entering the state.  These may include transition dependencies (a state diagram may be necessary).
  
3. Validation has two aspects:
         1. Is the state achieved?  This is defined by the monitors for the state.
         2. Is the transition behavior executed correctly?  This is defined by the behavioral constraints.
  
4. Define test procedure
         1. Preconditions (may need to be set to confirm a function executed successfully)
         2. Action (trigger)
         3. Result (see validation)




Interface Validation

ICD Verification and Audits
ICDs define the interface between two systems. ICDs address the Physical, Protocol, and Application level details of the interface. ICDs provide design information to the designers implementing the system functionality.
1.  Where there are no options there is no design latitude and no ambiguity (e.g., the physical envelope of a device).
2.  Where design latitude provides implementation options, the ICD documents the design agreements (e.g., device B configured as 1553 RT 5).
3.  Some systems may have utilization options (e.g., control directive 123 or control directive ABC will turn on the device; typically there will be factors to consider in the choice between these control directives).  IRSs are sometimes used to enforce a particular implementation.
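One way to make cases 1 and 2 concrete is to record each documented agreement as data and check an implementation's configuration against it. This is a sketch only; the item names, levels, and values are invented:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class IcdAgreement:
    level: str         # "physical", "protocol", or "application"
    item: str          # the interface item being agreed upon
    agreed_value: str  # the documented value

# Hypothetical agreements mirroring cases 1 and 2 above.
ICD = [
    IcdAgreement("physical", "device_envelope_mm", "120x80x40"),  # no latitude
    IcdAgreement("protocol", "rt_address_device_B", "5"),         # documented design agreement
]

def violations(implementation: dict) -> list:
    """Return the ICD items whose implemented value differs from the agreement."""
    return [a.item for a in ICD if implementation.get(a.item) != a.agreed_value]
```

A compliant configuration yields an empty list; any mismatch names the item that departs from the documented agreement.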

Verification

ICDs

ICDs need not be verified. We do, of course, want to demonstrate and convince ourselves that the ICD is, in fact, correct! That is different from verification.

The design required to achieve the functionality defined by the requirements (Spec A, Spec B, and their parent) demands that the ICD be adhered to. If it is not, the system won't work. Therefore, successful verification of the requirement indicates that the ICD was adhered to; there is no way for the function to execute successfully unless the interface is adhered to.  Where the interface provides implementation latitude (#3 above) and a specific implementation is not identified in the requirement, the designer chooses. Verification doesn't care which choice is made.

IRS

An IRS contains requirements: "shall" statements. If the IRS approach is chosen, the IRS-imposed design implementation (the IRS requirement) is verified.

Managing ICD Development and Implementation

What are the salient items of the ICD?

Differentiating between descriptive material and the essential interface design elements in the ICD is important. The spec-writing convention of using "will" helps identify these necessary interface design elements; tracking of the "wills" can then be managed.  "Wills" are not verified, but we do want to assure that they are correct.

How do we ensure ICD items are correct?
This is often what is meant by "verifying" the ICD. It is really a method of confirming that the interface is correctly documented. It is NOT verification in the sense that we verify requirements.
*      Peer Reviews by Subject Matter Expert and Systems Engineering provide a level of assurance of the correctness of the data.
*      Placing the ICD under configuration control and the process of "boarding" the ICD add another layer.
*      Component acceptance testing provides an opportunity to confirm that the expected interface behavior agrees with the documented behavior. This also provides risk reduction prior to assembly.
*      These things assure us, and the customer, that we are managing our interface risk.

How do we ensure both sides know what "truth" is?
Achieving successful functionality requires that both sides of the interface have the same understanding of the interface. There should be only one interface document between interfacing entities. An audit of the respective requirements specifications can be used to assure that the interfacing entities both reference the appropriate specific section of the same ICD for the same functions.
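Such an audit amounts to a simple cross-check of references. As a sketch, with invented document numbers and function names:

```python
# Hypothetical audit data: for each shared function, each interfacing
# product's spec records which ICD and section it references.
spec_a_refs = {"power_on": ("ICD-42", "3.2.1"), "telemetry": ("ICD-42", "4.1")}
spec_b_refs = {"power_on": ("ICD-42", "3.2.1"), "telemetry": ("ICD-42", "4.2")}

def audit_disagreements(a, b):
    """Return the functions where the two specs cite different ICD sections."""
    return sorted(f for f in a.keys() & b.keys() if a[f] != b[f])
```

Here the audit would flag "telemetry", where the two specs cite different sections and therefore may be working from different versions of "truth".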

ICD validation occurs when the interfacing products' requirements are verified.

A validation plan, mapping the interfacing products' functional requirements to the associated ICD "wills," provides a mechanism to track validation of adherence to the ICD.
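Tracked as data, such a plan also exposes coverage gaps. A sketch, with invented requirement and "will" identifiers:

```python
# Hypothetical validation plan: functional requirements mapped to the ICD
# "will" statements their verification exercises.
plan = {
    "REQ-101": ["WILL-1", "WILL-2"],
    "REQ-102": ["WILL-2", "WILL-3"],
}
all_wills = {"WILL-1", "WILL-2", "WILL-3", "WILL-4"}

def uncovered_wills(plan, all_wills):
    """A "will" not exercised by any requirement has no validation path."""
    exercised = {w for wills in plan.values() for w in wills}
    return sorted(all_wills - exercised)
```

An uncovered "will" is a flag: no requirement verification will ever exercise that interface element, so its correctness must be confirmed some other way.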

One should NEVER SAY we will have a "validation" product for ICDs. That would just open us up to mindless tracking of a meaningless effort. The true value is in the engineering exercise of incrementally defining, coordinating, documenting, communicating, and evaluating the interface agreements so that there are NO SURPRISES when the two things interface for the first time.

Analysis
Analysis follows a standard SE approach; tools and implementations will vary.