A Practical Approach to Verification and Validation
Dr. Eugene W.P. Bingue, U.S. Navy, dr.bingue@gmail.com
Dr. David A. Cook, Stephen F. Austin State University, cookda@sfasu.edu
A Practical Approach to V&V 1
V&V - It's all about Quality! A Practical Approach to V&V 2
The Three Domains of V&V (diagram): Requirements Validation asks "Used Right?" in the USER DOMAIN; Program Validation asks "Fit for Intended Use?" in the PROBLEM DOMAIN; Requirements Verification asks "Built Well?" in the TOOL DOMAIN. A Practical Approach to V&V 3
Verification and Validation Definitions
Verification: the process of determining that a model implementation accurately represents the developer's conceptual description and specifications. Did I build the system right?
Validation: the process of determining the degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model. Did I build the right system?
A Practical Approach to V&V 4
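To make the distinction concrete, here is a minimal sketch in Python. The toy model, tolerance, and reference measurements are illustrative assumptions, not values from the slides: the verification check compares the implementation against its specified equation, while the validation check compares model output against real-world data for the intended use.

```python
# Hypothetical illustration of the verification/validation distinction.
# fall_distance() is a toy model: distance fallen under gravity in t seconds.

def fall_distance(t, g=9.81):
    """Toy model implementation: d = 0.5 * g * t**2."""
    return 0.5 * g * t ** 2

def verify_against_spec():
    """Verification: does the code implement the specified equation?
    (Did I build the system right?)"""
    t = 3.0
    specified = 0.5 * 9.81 * t ** 2          # value the specification calls for
    assert abs(fall_distance(t) - specified) < 1e-9

def validate_against_real_world(observed_drops):
    """Validation: does the model match real-world measurements closely
    enough for the intended use? (Did I build the right system?)"""
    tolerance = 0.05                          # illustrative 5% acceptability criterion
    for t, measured in observed_drops:
        predicted = fall_distance(t)
        assert abs(predicted - measured) / measured <= tolerance

verify_against_spec()
validate_against_real_world([(1.0, 4.9), (2.0, 19.7)])   # made-up field data
```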
V&V vs. Testing: Testing is a discrete phase; VV&A should occur during each phase. "The first mistake that people make is thinking that the testing team is responsible for assuring quality." (Brian Marick, as quoted in Pressman, Software Engineering: A Practitioner's Approach) A Practical Approach to V&V 5
Non-Simulation It works well, and as I expected! It gives the right solutions! A Practical Approach to V&V 6
Simulation A Practical Approach to V&V 7
Accreditation (for simulations): Accreditation is the official certification that a model or simulation is acceptable for use for a specific application. Three steps: identify gaps in the program (what it WON'T do), assess the risks, and recommend acceptable uses and list limitations. A Practical Approach to V&V 8
Why V&V? "When Quality is vital, independent checks are necessary, not because people are untrustworthy but because they are human." (Watts Humphrey, Managing the Software Process) A Practical Approach to V&V 9
A Mature Process for System Development (V-model diagram; source: Ould and Unwin, Testing in Software Development, 1988). The left leg moves from the users' view through the designers' and developers' views: requirements analysis (user's views/reqts), system specification, system design, module design, and module coding (coded units). Each left-leg product has a matching test plan - user trial plan, acceptance test plan, integration test plan, and unit test plan - that drives the right leg: unit testing, system and integration testing, acceptance testing, and the delivered system. These cross-links between development products and test activities form the V&V view. A Practical Approach to V&V 10
Basic (and Practical) VV&A: the Right Product, Built Right - a taxonomy for V&V overlaid on the V-model (diagram). Down the development leg: inspect the conceptual model, inspect requirements, validate equations/algorithms, inspect the design, inspect the code, and verify equations/algorithms. Across the whole process: formal document review and inspection of CM practices. Up the testing leg: inspect test plans and test results, unit testing, integration testing, system and acceptance testing, functionality testing, and VV&C of input/default data. A Practical Approach to V&V 11
Potential Verification & Validation Techniques (chart; source: DMSO Best Practices). The chart groups candidate V&V techniques into four categories - informal (audit, desk checking, face validation, inspections, reviews, Turing test, walkthroughs), static, dynamic, and formal - with the static, dynamic, and formal techniques listed in full in the taxonomy slide below. A Practical Approach to V&V 12
V&V Techniques
Informal V&V techniques are among the most commonly used. They are called informal because their tools and approaches rely heavily on human reasoning and subjectivity, without stringent mathematical formalism.
Static V&V techniques assess the accuracy of the static model design and source code. Static techniques do not require machine execution of the model, although mental execution can be used. These techniques are very popular and widely used, and many automated tools are available to assist in the V&V process. Static techniques can reveal a variety of information about the structure of the model, the modeling techniques used, data and control flow within the model, and syntactical accuracy (Whitner and Balci, 1989).
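As an illustration of a static technique that never executes the model, here is a minimal sketch. It assumes the model is written in Python; the file name and the specific checks (syntax analysis plus calling-structure extraction) are illustrative choices, not prescribed by the slides.

```python
# Minimal sketch of static analysis: syntax analysis and calling-structure
# extraction without executing the model. The file name is hypothetical.
import ast

def static_check(path="orbit_model.py"):
    with open(path) as f:
        source = f.read()

    # Syntax analysis: ast.parse raises SyntaxError if the code is malformed.
    tree = ast.parse(source)

    # Calling-structure / control analysis: which functions call which.
    calls = {}
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            callees = [n.func.id for n in ast.walk(node)
                       if isinstance(n, ast.Call) and isinstance(n.func, ast.Name)]
            calls[node.name] = sorted(set(callees))
    return calls

if __name__ == "__main__":
    for caller, callees in static_check().items():
        print(f"{caller} -> {', '.join(callees) or '(no calls)'}")
```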
V&V Techniques (continued)
Dynamic V&V techniques require model execution; they evaluate the model based on its execution behavior. Most dynamic V&V techniques require model instrumentation: the insertion of additional code (probes or stubs) into the executable model to collect information about model behavior during execution. Dynamic V&V techniques are usually applied in three steps: (1) the executable model is instrumented, (2) the instrumented model is executed, and (3) the model output is analyzed and dynamic model behavior is evaluated.
Formal V&V techniques (or formal methods) are based on formal mathematical proofs of correctness and are the most thorough means of model V&V. The successful application of formal methods requires the model development process to be well defined and structured. Formal methods should be applied early in the model development process to achieve maximum benefit. Because formal techniques require significant effort, they are best applied to complex problems that cannot be handled by simpler methods.
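The three-step dynamic approach can be illustrated with a small sketch. The toy model, the quantities recorded by the probe, and the evaluation assertions are assumptions for illustration: a probe decorator instruments the model (step 1), the instrumented model is executed (step 2), and the collected behavior is evaluated with simple assertion checks (step 3).

```python
# Sketch of dynamic V&V via instrumentation; the model and checks are illustrative.
import functools
import time

trace = []                                    # information collected by the probe

def probe(fn):
    """Step 1: instrument the model by wrapping it with a data-collecting probe."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        trace.append({"call": fn.__name__, "args": args, "result": result,
                      "elapsed_s": time.perf_counter() - start})
        return result
    return wrapper

@probe
def step_population(pop, growth_rate=0.02):
    """Toy model step: one year of exponential population growth."""
    return pop * (1.0 + growth_rate)

# Step 2: execute the instrumented model.
pop = 1000.0
for _ in range(10):
    pop = step_population(pop)

# Step 3: analyze output and evaluate dynamic behavior (assertion checking).
assert all(t["result"] > 0 for t in trace), "population must stay positive"
assert trace[-1]["result"] > trace[0]["result"], "population should grow"
print(f"{len(trace)} probed calls, final population {pop:.1f}")
```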
Verification and Validation Technique Taxonomy
Informal Techniques: audit, desk check, face validation, inspection, review, Turing test, walk-through
Static Techniques: cause-effect graphing; control analyses (calling structure, concurrent process, control flow, state transition); data analyses (data dependency, data flow); fault/failure analysis; interface analyses (model interface, user interface); semantic analysis; structural analysis; symbolic evaluation; syntax analysis; traceability assessment
Dynamic Techniques: acceptance test; alpha test; assertion check; beta test; bottom-up test; comparison test; compliance tests (authorization, performance, security, standards); debugging; execution tests (monitor, profile, trace); fault/failure insertion test; field test; functional (black-box) test; graphical comparison; interface tests (data, model, user); object-flow test; partition test; predictive validation; product test; regression test; sensitivity analysis; special input tests (boundary value, equivalence partitioning, extreme input, invalid input, real-time input, self-driven input, stress, trace-driven input); statistical techniques; structural (white-box) tests (branch, condition, data flow, loop, path, statement); submodel/module test; symbolic debugging; top-down test; visualization/animation
Formal Techniques: induction, inductive assertion, inference, lambda calculus, logical deduction, predicate calculus, predicate transformation, proof of correctness
What activities do you select? It depends upon: available time, available funds, confidence in the developers and the development process, accreditation needs (very important for simulations), type of activity, user needs and desires, and criticality of the application. Formally, you should document the activities you will perform in a V&V Plan. A Practical Approach to V&V 16
Lessons Learned: Tricks and Traps in V&V A Practical Approach to V&V 17
Lesson 1: Identify intended uses of the product early. Create use cases, scenarios, or an SRS. Verify and validate the requirements. Do it again. Keep the requirements separate and current. Insist on a design (or future maintenance will be problematic). Plan for V&V early. Insist on user involvement in V&V of the requirements. A Practical Approach to V&V 18
Lesson 2: Software Engineering 101. Ten 1,000-line programs are easier to V&V than one 10,000-line program. Separate different classes of uses and users, and plan and design accordingly. You MUST have a design. A Practical Approach to V&V 19
Lesson 3: Determine acceptability criteria as early as possible. Determine how you will know when the product is good enough. Know what the user really needs - perfect vs. the 80% solution. This is another way of saying that the requirements must be very clear, and the user must agree with the developers as to what the requirements are. A Practical Approach to V&V 20
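One practical way to pin down "good enough" is to write the agreed acceptability criteria as executable checks. The thresholds, criterion names, and reference data in this sketch are purely illustrative assumptions, not values from the slides.

```python
# Illustrative acceptability criteria expressed as executable checks.
# All thresholds and reference values are made up for the example.
ACCEPTABILITY = {
    "max_relative_error": 0.10,   # model output within 10% of reference data
    "max_response_time_s": 2.0,   # user-visible response time budget
}

def acceptable(predictions, references, response_time_s):
    """Return (passed, reasons) against the agreed acceptability criteria."""
    reasons = []
    worst = max(abs(p - r) / abs(r) for p, r in zip(predictions, references))
    if worst > ACCEPTABILITY["max_relative_error"]:
        reasons.append(f"worst relative error {worst:.2%} exceeds "
                       f"{ACCEPTABILITY['max_relative_error']:.0%}")
    if response_time_s > ACCEPTABILITY["max_response_time_s"]:
        reasons.append(f"response time {response_time_s:.2f}s exceeds budget")
    return (not reasons), reasons

ok, why = acceptable([98.0, 205.0], [100.0, 200.0], response_time_s=1.4)
print("ACCEPT" if ok else "REJECT", why)
```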
Lesson 4: Keep track of complex requirements: accuracy, fidelity, speed, response time, interfaces, interoperability, and real-time requirements. You will need domain-specific expertise for these areas. A Practical Approach to V&V 21
Lesson 5: Start the V&V early (which is a nice way of saying FUND the V&V early). Manage, organize, and update the V&V artifacts. Do not confuse V&V with testing. A Practical Approach to V&V 22
Lesson 6: Use a taxonomy. Inspect and evaluate: the conceptual model (SRS, briefing, conversation); requirements (formal and informal); equations and algorithms; design (validity, coupling, cohesion); code (documentation and coding standards); equations/algorithms/dimensional analysis; test plans and test results; input data, default values, and constants; functionality (final user approval); configuration management; documentation; and risks. A Practical Approach to V&V 23
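Checking input data, default values, and constants is one taxonomy item that is easy to automate. The sketch below is hypothetical: the parameter names, units, and allowed ranges are assumptions chosen for illustration.

```python
# Hypothetical check of input data, default values, and constants.
# Parameter names, units, and allowed ranges are illustrative only.
EXPECTED = {
    # name: (default, minimum, maximum, unit)
    "gravity":    (9.81, 9.7,  9.9, "m/s^2"),
    "time_step":  (0.01, 1e-6, 1.0, "s"),
    "drag_coeff": (0.47, 0.0,  2.0, "dimensionless"),
}

def check_inputs(config):
    """Flag inputs that are missing, out of range, or silently defaulted."""
    findings = []
    for name, (default, lo, hi, unit) in EXPECTED.items():
        if name not in config:
            findings.append(f"{name}: missing, default {default} {unit} will be used")
            continue
        value = config[name]
        if not (lo <= value <= hi):
            findings.append(f"{name}: {value} {unit} outside [{lo}, {hi}]")
    return findings

for issue in check_inputs({"gravity": 9.81, "time_step": 5.0}):
    print("V&V finding:", issue)
```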
Lesson 7: Know the limits of YOUR expertise. Know when to use: Subject Matter Experts (SMEs), who usually don't understand program development but can help you understand the real world; statisticians and mathematicians; domain experts, who can translate SME input into program requirements; and specialized domain-specific developers. A Practical Approach to V&V 24
Lesson 8: Configuration Management is critical. Small changes to the program can invalidate V&V results, so re-evaluate program results and perform incremental V&V after any change that might affect the validity of the program. Limit access to the code and requirements. Update requirements and design as needed. Save the test cases you use for V&V - you will need to reuse them frequently! A Practical Approach to V&V 25
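Saved V&V test cases pay off as regression tests after each change. Here is a minimal sketch under assumed names (run_model, the baseline file, and the tolerance are all illustrative): it re-runs the saved cases and compares the results to baseline outputs recorded when the program was last validated.

```python
# Sketch of reusing saved V&V test cases as a regression check after changes.
# run_model(), the baseline file (assumed to exist), and the tolerance are assumptions.
import json

TOLERANCE = 1e-6

def run_model(case):
    """Stand-in for the real program under V&V."""
    return case["x"] * 2.0 + 1.0

def regression_check(baseline_path="vv_baseline.json"):
    """Re-run every saved case and compare to the previously validated output."""
    with open(baseline_path) as f:
        baseline = json.load(f)   # e.g. [{"case": {"x": 1.0}, "expected": 3.0}, ...]
    failures = []
    for entry in baseline:
        actual = run_model(entry["case"])
        if abs(actual - entry["expected"]) > TOLERANCE:
            failures.append((entry["case"], entry["expected"], actual))
    return failures

if __name__ == "__main__":
    for case, expected, actual in regression_check():
        print(f"REGRESSION: {case} expected {expected}, got {actual}")
```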
Final Lessons: The cost of V&V is a small part of overall development cost, and V&V saves you time and money. Without V&V, you get stuck in the code-fix, code-fix loop late in the development process, and the risk of overall program failure increases dramatically. V&V pays for itself and saves you $$ in decreased future maintenance costs. A Practical Approach to V&V 26