Design Verification




Design verification is an essential step in the development of any product. Also referred to as qualification testing, design verification ensures that the product as designed is the same as the product as intended. Unfortunately, many design projects do not complete thorough design qualification, resulting in products that do not meet customer expectations and require costly design modifications.

Project activities in which design verification is useful:
- Concept through to Detailed Design
- Specification Development
- Detailed Design through to Pre-Production
- Production

Other tools that are useful in conjunction with design verification:
- Requirements Management
- Configuration Management
- FMEA

Introduction

Many customers hold the testing of products in the same regard as the actual design. In fact, many development projects specify design verification testing as a major contract requirement, and customers will assign their own people to witness testing and ensure that it is completed to satisfaction. Although testing is potentially costly, the expense of not testing can be far greater; projects that are not specifically required to test are therefore wise to include testing as part of the development program.

Testing may occur at many points during the design process, from concept development to post-production. This tool focuses mainly on prototype testing; however, many of the guidelines provided can be applied to all testing.

Development tests conducted with materials, models or subassemblies are useful for determining the feasibility of design ideas and gaining insights that further direct the design. The results of these tests cannot be considered verification tests; however, their use can be crucial.

Prototype testing occurs with items that closely resemble the final product. These tests generally stress the product up to and beyond specified use conditions and may be destructive. Testing may occur at many levels. Generally, the more complex the product, the more levels of testing. For a complex system, tests might be conducted at the unit level, then the subsystem level, and finally at the system level. Testing with prototypes allows the correction of deficiencies and subsequent re-testing before large commitments are made to inventory and production readiness.

Design Verification.doc Page 1 of 10 V0.0

Proof testing is another type of design verification testing that employs prototypes. Rather than testing to specification, proof tests are designed to test the product to failure. For example, if a table is designed to support a certain amount of weight, prototype testing will be used to ensure that the table supports the specified weight plus a pre-determined safety factor. Proof testing would continue loading the table until failure is reached, likely beyond the specified limits. These tests are often used to identify where eventual failures might occur; this information is useful for identifying potential warranty issues and costs.

Acceptance testing is a form of non-destructive testing that occurs with production units. Depending on the criticality of failures, testing costs and the number of units produced, tests may be conducted on initial production units, on random or specified samples (e.g., every 10th unit), or on every single unit produced.

Application of Design Verification Testing

Verification Methods

There are a number of methods that can be used in verification testing. Some are relatively inexpensive and quick, such as inspection, while others can be costly and quite involved, such as functional testing. Descriptions of the most common verification methods follow.

Demonstration. Demonstrations can be conducted in actual or simulated environments. For example, if a specification for a product requires that it be operable with one hand, likely the simplest method for verifying this requirement is to have someone actually operate the product with one hand. As a record of the test, it may be acceptable to simply have the test witnessed or, alternatively, recorded on video. The cost of a demonstration will vary according to its complexity; however, most are relatively inexpensive.
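The arithmetic behind the table proof-test example above can be sketched in a few lines. The load values, safety factor and function names below are invented for illustration; they are not taken from the source.

```python
# Sketch of the table proof-test example: a prototype must survive the
# specified load times a pre-determined safety factor, while a proof test
# keeps loading until failure to find the real margin. All numbers are
# invented example values.

SPEC_LOAD_KG = 100.0    # load the table is specified to support
SAFETY_FACTOR = 1.5     # pre-determined safety factor

def verification_load(spec_load: float, safety_factor: float) -> float:
    """Load a prototype must survive to pass verification testing."""
    return spec_load * safety_factor

def proof_margin(failure_load: float, spec_load: float) -> float:
    """Ratio of the observed failure load to the specified load."""
    return failure_load / spec_load

pass_load = verification_load(SPEC_LOAD_KG, SAFETY_FACTOR)
print(f"verification pass load: {pass_load:.0f} kg")    # 150 kg

# Proof test continues loading until failure, e.g. observed at 240 kg:
margin = proof_margin(failure_load=240.0, spec_load=SPEC_LOAD_KG)
print(f"proof test failed at {margin:.1f}x the specified load")
```

A margin well above the safety factor suggests headroom; a failure between the specified load and the pass load would flag a potential warranty exposure of the kind the text describes.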
Inspection. Inspection is usually used to verify requirements related to physical characteristics. For example, if a specification requires that the product is a certain colour, a certain height or labelled in a specific manner, inspection would be used to confirm that the requirements have been met. Inspection is typically one of the less expensive verification methods.

Analysis. Analysis is often used in the design of a product. It can also be used to verify the design, and is often the preferred method when testing is not feasible or the cost of testing is prohibitive, and risk is minimal. For example, analysis may be used to support the argument that a product will have a lifecycle of 25 years.

Similarity. If a design includes features or materials that are similar to those of another product that has met or exceeded current specifications, an analysis illustrating this similarity may be used to verify a requirement. For example, if a specification requires that a product be water resistant, and materials that have been proven water resistant in other applications have been chosen, an analysis of similarity could be used.

Testing. Testing can be one of the most expensive verification methods, depending on complexity as well as equipment and facility requirements. However, sometimes it is the only acceptable means of verifying aspects of a design. For example, a product may be required to survive transportation over various terrain (e.g., dirt roads). The most common method for verifying this requirement is transportation testing, where the product is placed in a test bed that moves up and down and vibrates to simulate worst-case transportation. Although this testing requires relatively expensive and specialized equipment, it allows the testers to observe the test and is more economical than using a truck to validate by demonstration.

Selection of Method(s)

Often a number of verification methods may be equally appropriate to verify a requirement. If this is the case, the cost and the time required to complete the verification should be considered. For example, to verify that a product satisfies a requirement to fit through a standard 30 by 7 doorway, inspection (measure the height and width of the product) or demonstration (move the product through the doorway) can be used.

Sometimes it is necessary or useful to employ two or more methods of verification. For example, if a specification requires that a product be usable by persons from the 1st to the 99th percentile, a demonstration may be conducted with representatives from each extreme, and an analysis completed to prove accessibility to all other sized persons within the specified range.

Identification of Verification Activities

Initial identification of verification activities should be done concurrently with specification development. For each specification developed, a method for verifying the specification should be determined. Usually the entry at this stage will include the method(s) to be employed and a general idea of how the test will be conducted. This forces the designer to make sure that the specification is objective and verifiable, and also allows the test engineers to get a head start on putting together a detailed test plan and procedures. The one caution is that this parallel development puts responsibility on the designer to make sure that test engineering is promptly informed of any changes to specifications; this is normally of minimal concern in integrated team environments. If verification activities are not identified during the preparation of the specification, the design engineer needs to ensure enough notice is given to test engineering to allow timely planning and preparation.

The communication and identification of required testing between design and test can occur through various modes and will generally depend on the overall approach to the design project (e.g., integrated team versus department based) and established company procedures. With an integrated team approach, the test engineer may take the product specification and work jointly with designers and other members of the team to identify and plan tests. If the design approach is department or functionally based, the design engineers may be required to complete and forward test requests to the test engineering department, as the test engineers are not intimately involved in the development of the design.

Preparation of Verification Activities

The preparation of verification activities involves:
- Determining the best approach to conducting the verifications
- Defining measurement methods
- Identifying opportunities to combine verification activities (i.e., a single demonstration or test might be used to verify a number of requirements)
- Identifying necessary tools (e.g., equipment, software) and facilities (e.g., acoustical rooms, environmental chambers)
- Identifying a high-level verification schedule

Once the above items have been addressed, the overall verification plan should be reviewed with the design team to address any issues before detailed planning occurs. Issues that may arise include insufficient in-house equipment, facilities or expertise, and problems with the schedule.

Many tests require specialized equipment and facilities that are not available in-house (e.g., environmental chambers); out-of-house facilities that can conduct these tests must therefore be identified. At this time, estimates for out-of-house testing are usually obtained. These help to determine which test facility to use or, if costs exceed budget constraints, whether to redefine the verification requirements such that verification can be conducted in-house. If tests will be subcontracted, this will generally be managed by test engineering.

Problems with the verification schedule may arise for a number of reasons. The time allowed to complete the verification may be insufficient; in this case, some trade-offs may be necessary. Time may need to be increased, or the number or duration of tests decreased. Sometimes a brainstorming session with the development team may lead to creative solutions. Another schedule problem may be that certain verification activities need to take place during certain weather conditions (e.g., snow) while the period for verification falls during summer months. It is likely undesirable to delay a project in expectation of weather conditions, so alternative means must be considered.

Detailed Verification Planning and Procedures

Once all of the issues surrounding initial preparation have been resolved, verification procedures can be developed. Written procedures should be developed for even the simplest of verification activities. This increases the quality and accuracy of results, and also ensures that repeated tests are conducted in an identical manner. The size of these procedures will depend on the complexity of the activities to be performed, and they can therefore be as short as a few lines or as large as a substantial document. Attachment A contains an outline for verification procedures. The format should be tailored as appropriate, and only those items in the outline relevant to an individual verification activity should be included.

An important consideration when developing detailed verification plans and procedures is the order in which activities are conducted. Verification time can be substantially reduced if all tests requiring a similar set-up are conducted sequentially. Also, shorter activities can be scheduled to occur while longer activities that do not require constant monitoring are in progress.
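A simple way to sketch this kind of sequencing is a two-part sort key: group activities that share a set-up so they run back-to-back, with a secondary ordering within each group (here, by how destructive the test is, so the least destructive runs first). The activity names, set-ups and destructiveness scores below are invented examples.

```python
# Ordering verification activities: primary key groups tests that share a
# set-up; secondary key runs less destructive tests first within a group.
# All activity data below is invented for illustration.

activities = [
    {"id": "T1.4", "setup": "vibration table", "destructiveness": 2},
    {"id": "T1.1", "setup": "env. chamber",    "destructiveness": 0},
    {"id": "T1.3", "setup": "vibration table", "destructiveness": 3},
    {"id": "T1.2", "setup": "env. chamber",    "destructiveness": 1},
]

# Tuple sort key: set-up first, then destructiveness score.
ordered = sorted(activities, key=lambda a: (a["setup"], a["destructiveness"]))

for a in ordered:
    print(a["id"], "on", a["setup"])
# Runs both chamber tests together, then both vibration tests.
```

In practice the "destructiveness" ranking would come from the test engineer's judgment, but encoding it makes the schedule reproducible when tests are added or removed.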
Two final considerations also relate to the order in which activities are conducted. If testing is destructive, it should be conducted in order from least to most destructive to limit the number of test units required. Additionally, it is sometimes beneficial to order verification activities such that the outputs of one test can be used as inputs to subsequent tests.

Conducting Verification Activities

Execution of Verification Activities

It is extremely important that test procedures be followed to the letter when conducting verification activities. A failure to do so may invalidate results, and may have more dire consequences if the customer believes it was done intentionally to increase the probability of passing verification, or if future product failures lead to legal action. If for some reason it is discovered that procedures require modification, these changes should be documented and the necessary approvals obtained before continuing with the affected verification activity. If a verification activity is continued after a modification rather than started over, this should be noted in the record of results.

Recording of Results

Careful collection and recording of data is extremely important. The customer may contractually require these records, and they may be a prerequisite for obtaining certifications (e.g., Canadian Standards Association). Attachment B provides an example outline for recording the results of verification activities. Depending on the requirements of the development project, the verification records may be sufficient to report the results; in other cases, a formal test report may be necessary. Attachment C provides an outline for a formal test report. All test records and reports should be reviewed and approved as defined by company procedures. If formal procedures are not in place, the test engineering lead, the project manager, a customer representative or some other agreed-upon authority can review these items.

Highlighting Non-Conformance

If a non-conformance (e.g., anomaly or failure) is discovered through verification activities, it is important to first verify that the non-conformance lies with the product and is not due to test equipment or other extenuating factors. If the non-conformance is product related, details should be fed back to the designers as quickly as possible rather than waiting for the completion of a test record or report. In highly integrated teams, the optimum method for feedback may be to have the designer witness the non-conformance first-hand. In any case, a non-conformance report should be generated. It is important that the test engineer maintain these reports to ensure that all non-conformances are adequately addressed.
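The bookkeeping described above can be sketched as a minimal non-conformance log: each entry records whether the issue is product related, and the test engineer can check that nothing is left unaddressed. The field names, statuses and example entry are invented, not taken from the source.

```python
# Minimal non-conformance log sketch. The field names, status values and
# the example entry are invented for illustration.

from dataclasses import dataclass

@dataclass
class NonConformance:
    procedure: str          # verification procedure that revealed it
    description: str
    product_related: bool   # False => traced to test equipment or
                            # other extenuating factors instead
    status: str = "open"    # stays "open" until adequately addressed

reports: list[NonConformance] = []

def log_nc(procedure: str, description: str, product_related: bool) -> NonConformance:
    """Record a non-conformance so it cannot be silently forgotten."""
    nc = NonConformance(procedure, description, product_related)
    reports.append(nc)
    return nc

nc = log_nc("T1.3", "hinge cracked below rated load", product_related=True)
nc.status = "closed"  # after designers confirm a fix

# The test engineer reviews the log to confirm nothing remains open.
open_items = [r for r in reports if r.status == "open"]
print(f"{len(open_items)} non-conformance(s) still open")
```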
References

Burgess, John A., Design Assurance for Engineers and Managers, Marcel Dekker, Inc., New York, 1984, ISBN 0-8247-7258-X, pp. 150-165.

MIL-STD-1540D (1999), Product Verification Requirements for Launch, Upper Stage and Space Vehicles, U.S. Department of Defense, Government Printing Office, Washington, D.C.

Attachment A: Verification Procedure Outline [1]

Procedure Number: Any numbering scheme can be used, but it is useful to correlate the number with the applicable specification number. It may be useful to add a prefix to designate the type of activity (e.g., T1.2.3 for a test, D1.2.2 for a demonstration, etc.). If a verification activity is designed to address multiple specifications, the procedure should be written once and the other procedure sheets should simply include a procedure number and a reference to the written procedure. If multiple verification activities are used to validate a single requirement, consider using the same procedure number for each plus a unique suffix (e.g., T1.2.3a, T1.2.3b).

Method: Analysis, similarity, demonstration, test and/or inspection.

Applicable Requirements: An identification of the requirements to be verified, or the inclusion of the actual written requirement. If multiple requirements are to be addressed with the verification activity, all should be included with the main test procedure.

Purpose/Scope: What is to be done and why?

Items Under Test: A definition of the items to be tested including, as applicable, part and/or model numbers, material type, size, shape, etc., as well as the number of units to be tested.

Precautions: An identification of hazards, safety considerations, or special factors (e.g., level of cleanliness required).

Special Conditions/Limitations: For example, special requirements for the verification activity such as video recordings of a specific portion of the activity, or the required presence of certain witnesses during the activity.

Equipment/Facilities: Equipment or facilities required, including any requirements for calibration of equipment.

Data Recording: Requirements relating to the recording of test data (e.g., specific formats, frequency of readings, etc.).

[1] Adapted from p. 158 of Design Assurance for Engineers and Managers by Burgess.

Acceptance Criteria: Pass/fail criteria, including required accuracy of results.

Procedures: Detailed script outlining how the verification activity is to be performed (e.g., application of loads, environmental conditions, sequence, data collection at specific steps, etc.). This section should be written such that any tester can easily and accurately follow the procedure.

Troubleshooting: The actions to be taken if an inadvertent event occurs (e.g., premature failure).

Post-Test Activities: Any activities that must be conducted once verification activities are completed (e.g., disposal requirements, teardown of test equipment, etc.).
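The numbering scheme Attachment A suggests can be sketched as a small helper. The prefix letters T and D come from the attachment's own examples; the remaining prefixes, and the function itself, are assumed extensions for illustration.

```python
# Sketch of the Attachment A numbering scheme: a one-letter prefix for
# the verification method, the applicable specification number, and an
# optional letter suffix when several activities verify one requirement.

METHOD_PREFIX = {
    "test": "T",           # T1.2.3 per the attachment's example
    "demonstration": "D",  # D1.2.2 per the attachment's example
    "inspection": "I",     # assumed by analogy with T and D
    "analysis": "A",       # assumed
    "similarity": "S",     # assumed
}

def procedure_number(method: str, spec_number: str, suffix: str = "") -> str:
    """Build a procedure number such as 'T1.2.3' or 'T1.2.3a'."""
    return f"{METHOD_PREFIX[method]}{spec_number}{suffix}"

print(procedure_number("test", "1.2.3"))              # T1.2.3
print(procedure_number("demonstration", "1.2.2"))     # D1.2.2
print(procedure_number("test", "1.2.3", suffix="a"))  # T1.2.3a
```

Deriving the number from the specification number, as the attachment recommends, makes it trivial to trace a procedure back to the requirement it verifies.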

Attachment B: Verification Results Outline [2]

Procedure Number: See Attachment A for description.

Method: Analysis, similarity, demonstration, test and/or inspection.

Applicable Requirements: An identification of the requirements to be verified, or the inclusion of the actual written requirement. If multiple requirements are to be addressed with the verification activity, all should be included with the main test procedure.

Items Under Test: See Attachment A for description.

Date: The date(s) and time of day during which the verification activity was conducted.

Testers: Names of the persons conducting the verification activity.

Witnesses: Names of any witnesses to the verification activity. If a witness only observes a portion of the test, that portion should be identified.

Equipment/Facilities: Equipment or facilities used, including the manufacturer's name as well as model and serial numbers as applicable. If equipment requires calibration, indicate the date on which the latest calibration was completed.

Results: Include all data collection as indicated in the test procedures. Attach copies of any auto-generated test data. Clearly identify pass or failure with respect to acceptance criteria.

Comments: Description of test conditions, unusual circumstances, etc.

Recommendations: If the verification record will also serve as the verification report, any conclusions and recommendations should be presented.

Signatures: Signature(s) of the tester(s).

[2] Adapted from p. 158 of Design Assurance for Engineers and Managers by Burgess.

Attachment C: Verification Report Outline [3]

Title Page: May contain the procedure number, the name of the procedure and the item under test. The date and revision number, if applicable, can be included.

Approvals: Usually includes the name and signature of the person who prepared the report, as well as any others whose approval is required.

Table of Contents:

Index of Tables & Figures:

Summary: Should include an identification of applicable requirements, an indication of what the verification activity was to achieve, a description of test conditions, and a summary of conclusions.

Introduction: A brief introduction to the test report.

Description of Test: The description might include:
- Items under test
- Facilities and equipment
- Description of verification methods
- Instrumentation and measurements
- Results
- Analysis and discussion of results
- Conclusions and recommendations

Appendices: Calculations, data sheets, verification procedures, etc.

References:

[3] Adapted from p. 158 of Design Assurance for Engineers and Managers by Burgess.