SEVEN KEY TACTICS FOR ENSURING QUALITY


INTRODUCTION

Besides avoiding disasters and fatal flaws, quality assurance (QA) delivers significant benefits for banks. Strong QA planning provides the groundwork for effective and efficient testing, which ensures new solutions will meet and even exceed customer expectations. This e-book offers seven key tactics financial institutions can execute in order to improve quality planning within their product development initiatives.

1. ESTABLISH A QA STRATEGY

In developing new banking products, QA covers the systematic process of checking whether the new solution addresses all the specified requirements. A comprehensive QA strategy provides the guidance and structure for those related activities. In the diagram below, a typical approach to a software development effort is depicted in the seven chevrons in the middle of the graphic: development starts with a solution design and works toward installation and then support of the completed new system. The activities shown here include how joint reviews (JRs) serve as a forum to resolve any outstanding quality issues before the next development step can proceed. Verification of the application requirements and the application design is critical to establishing and documenting a mutual understanding, as is the case in any high-quality initiative. The QA strategy outlines the framework for when and how these key activities occur within a program or software development initiative.

[Figure: Establishing the QA Strategy. Quality planning, control of documents, verification, validation and audit surround the seven development chevrons (Solution Design, Requirements, Design, Build, Test, Install, Support), with joint reviews (JR) between steps. Defect management, problem resolution and control of quality records capture approvals, joint review results, verification results, test results, issues and defect reports.]

2. UNDERSTAND DIFFERENCES BETWEEN A TEST PLAN AND TEST STRATEGY

Bankers often have a hard time distinguishing between test strategies and test plans, but one cannot exist without the other. Plans are concrete, while strategy documents provide vision and governance. In the IT world, a test strategy defines the QA activities to be performed during the software life cycle. It describes the responsibilities and authorities for accomplishing software QA activities and identifies the required coordination of those activities with other activities within a project. A test plan, on the other hand, describes the plans for quality testing of software applications and software systems. It describes the software test environment to be used for the testing, identifies the tests to be performed and provides schedules for test activities. The following overview summarizes some of the key distinguishing points between a test strategy and a test plan.

TEST STRATEGY
Purpose: Defines governance and assigned ownership.
Test approach: Defines the overall approach for assuring the solution, including the test approach for each phase within the strategy: system integration testing (SIT), user acceptance testing (UAT) and performance testing.
Timing: Defines entry, exit and acceptance criteria for test phases.
Deliverables include: Test communications, reporting and metrics, allocation release plan, defect management and the approach for change requests.

TEST PLAN
Purpose: Defines testing terminology and testing objectives, and the scope of testing, including what functionality will be tested.
Test approach: Covers when specific testing techniques will be employed, e.g., regression testing, data validation, test limits.
Timing: Provides a testing schedule and calendar to stage specific activities.
Deliverables include: References to test scripts, documentation of existing testing constraints, definition of defect levels and test criteria.
Other: Includes different types of test plans for integrated testing, application testing, systems testing, etc.

3. IMPLEMENT KEY QA ACTIVITIES

Beyond developing a test strategy and test plan, there are other QA activities to implement at the beginning of a program to create a foundation for success for the overall effort. They include:

- Develop document control
- Ensure receipt of quality records
- Require joint reviews to facilitate document approvals
- Expect a QA summary report

A program and its related supporting projects generate multiple supporting documents. Thorough quality practices require complete document approval and sound recordkeeping. Program stakeholders need to know that only one version of the truth exists, and they need to know where to access that critical document. Evidence that a quality activity was completed must be logged and uploaded into a readily available document repository. These records include approvals, verification results and actual test results. The joint review process is also established during the planning phase of the initiative and is critical to QA governance. Before the client or program sponsor begins their phase of solution testing, they should review the findings in a QA summary report. This report details the defects and resolutions their partner uncovered and addressed before transitioning testing activities to the client or sponsor.

4. VERIFY REQUIREMENTS

Verifying the requirements of the new solution is another critical activity within the QA process. The QA team must first identify the deliverables to be verified and then define quality standards for each deliverable type. Next, they must review, approve and release the verification standards for use in the initiative. Authors of any quality documents should produce deliverables that meet these standards. The QA team should log problems when quality standards are not met. A problem is resolved once the deliverable meets the quality standards; after confirming this, the team releases the deliverable for use by the program. A partial example of the classification and logging of defect codes for one recent FIS-managed QA program is shown below.

BR-01 (Not complete)
Description: Terms and units of measure are not defined.
Impact: Multiple interpretations of the solution; incorrect design is developed.
Example: The acronym CST is used in a requirement, but it has not been defined.

BR-02 (Not traceable)
Description: A business requirement is not uniquely identifiable.
Impact: Inability to trace requirements forward or backward.
Example: Each requirement needs to be numbered, e.g., 3.1.3 Requirement Description.

BR-03 (Not traceable)
Description: A business requirement cannot be traced back to a stated business scope item (i.e., the scope of the solution has crept beyond the business need/objective).
Impact: Scope creep; the solution delivers more than the stated business need/objective.
Example: Refer to a requirements traceability matrix.

BR-08 (Not relevant)
Description: The requirement is a solution.
Impact: Design is constrained by the requirement.
Example: Requirement is stated as: "BeB will call a standard Connectware Deposit Account Balance inquire service to verify the funds in the original account."

BR-11 (Not testable)
Description: The requirement cannot be checked for correctness by a person or a tool.
Impact: It is not possible to confirm that the requirement has been met or, conversely, not met.
Example: Untestable requirements such as "System performance shall run fast" and "The system shall be easy to use."
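Some of these verification checks lend themselves to lightweight automation. The Python sketch below shows how a team might flag requirements that would be logged against defect codes like those above; the Requirement structure, glossary and rules are illustrative assumptions, not FIS tooling.

    from dataclasses import dataclass

    @dataclass
    class Requirement:
        req_id: str            # unique identifier, e.g., "3.1.3"
        text: str              # requirement statement
        scope_item: str = ""   # business scope item the requirement traces back to

    GLOSSARY = {"SIT", "UAT", "QA"}          # acronyms defined for the program (assumed)
    VAGUE_TERMS = ("fast", "easy to use")    # wording that makes a requirement untestable

    def verify(req: Requirement) -> list[str]:
        """Return the defect codes logged against a single requirement."""
        defects = []
        # BR-01: not complete -- an undefined acronym or unit of measure
        if any(t.isupper() and len(t) > 1 and t not in GLOSSARY for t in req.text.split()):
            defects.append("BR-01")
        # BR-02: not traceable -- missing unique identifier
        if not req.req_id:
            defects.append("BR-02")
        # BR-03: not traceable -- cannot be traced back to a business scope item
        if not req.scope_item:
            defects.append("BR-03")
        # BR-11: cannot be checked for correctness by a person or a tool
        if any(term in req.text.lower() for term in VAGUE_TERMS):
            defects.append("BR-11")
        return defects

    print(verify(Requirement("3.1.3", "System performance shall run fast.", "Payments")))
    # -> ['BR-11']

Checks like these do not replace human review, but they catch mechanical defects early so reviewers can focus on relevance and correctness.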

5. UTILIZE A REQUIREMENTS TRACEABILITY MATRIX

One technique for ensuring full test coverage of a client's requirements is the use of a requirements traceability matrix. A solution architect creates a solution design that captures the client's requirements. QA experts then use the requirements traceability matrix to ensure the solution design requirements align with the product definition documents created by the various product teams involved in the creation of the client's new solution. The matrix ensures the allocation of each bank requirement back to the product requirements. This matching process helps achieve a high degree of confidence that all bank requirements receive optimal testing coverage.

[Figure: Requirements Traceability Matrix. Ensures client requirements have been fully allocated throughout the project and fully covered by tests, tracing contract items to solution requirement items and end-to-end solution test cases, and, at the application level, requirement items to design items, source code and test cases.]
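In practice, the matrix can be kept as a simple mapping from requirement identifiers to the test cases that exercise them, which makes coverage gaps easy to surface. The Python sketch below assumes that layout; the requirement and test-case IDs are hypothetical.

    rtm = {
        "REQ-001": ["E2E-TC-01", "APP-TC-07"],   # requirement covered by two test cases
        "REQ-002": ["E2E-TC-02"],
        "REQ-003": [],                           # not allocated to any test case -- a coverage gap
    }

    def coverage_gaps(matrix: dict[str, list[str]]) -> list[str]:
        """Return requirement IDs not yet covered by any test case."""
        return [req_id for req_id, tests in matrix.items() if not tests]

    def coverage_ratio(matrix: dict[str, list[str]]) -> float:
        """Fraction of requirements traced to at least one test case."""
        covered = sum(1 for tests in matrix.values() if tests)
        return covered / len(matrix) if matrix else 0.0

    print("Uncovered requirements:", coverage_gaps(rtm))   # -> ['REQ-003']
    print(f"Coverage: {coverage_ratio(rtm):.0%}")          # -> 67%

Running a check like this at each joint review keeps the allocation of bank requirements to product requirements and test cases visible throughout the program.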

6. EMPLOY FAILURE MODE AND EFFECTS ANALYSIS AS NEEDED

Besides carefully tracing and testing the bank's requirements and maintaining solid QA documentation, other quality techniques can add significant value to banking initiatives. Failure mode and effects analysis (FMEA) is one of them. FMEA originated in the 1940s in the United States as a way to improve manufacturing for the war effort. Although it began more than 70 years ago, the process remains valuable to the banking industry today. Because financial institutions compete globally against more nimble, nontraditional financial services providers, they need to develop products quickly and with the high quality that encourages customer loyalty. When QA and technical resources are scarce, testing priorities must be determined in order to best mitigate project risk, and FMEA lays out a clear and straightforward process for performing that critical ranking. The design elements or components that compose a complex solution are analyzed by functional lines of business in order to:

- Identify the potential failures that could impact the customer experience
- Identify the potential causes of each failure
- Assess the resulting impact on the customer experience should the failure occur
- Determine the probability that the cause will occur and subsequently result in the failure
- Identify the current controls, or the ability to detect the failure or cause

The FMEA focuses QA and testing resources within development efforts. Few banks have the luxury of unlimited time and technical resources, and prioritizing testing in this way means fewer defects need correcting at more expensive points in a product's life cycle.
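A common way to perform the ranking FMEA calls for is the risk priority number (RPN): severity, occurrence and detection are each rated on a 1-10 scale and multiplied, and the failure modes with the highest RPN receive testing attention first. The Python sketch below illustrates that standard scoring approach; the failure modes and ratings are invented for illustration, not drawn from an actual banking program.

    from dataclasses import dataclass

    @dataclass
    class FailureMode:
        component: str
        failure: str
        severity: int     # impact on the customer experience (1 = negligible, 10 = severe)
        occurrence: int   # probability the cause occurs and leads to the failure
        detection: int    # ability of current controls to detect it (10 = hardest to detect)

        @property
        def rpn(self) -> int:
            """Risk priority number = severity x occurrence x detection."""
            return self.severity * self.occurrence * self.detection

    failure_modes = [
        FailureMode("Funds transfer", "Balance inquiry times out", severity=8, occurrence=4, detection=6),
        FailureMode("Online enrollment", "Duplicate customer record created", severity=6, occurrence=3, detection=4),
        FailureMode("Statement rendering", "Interest shown with rounding error", severity=9, occurrence=2, detection=7),
    ]

    # Highest RPN first: these components get scarce QA and testing resources first.
    for fm in sorted(failure_modes, key=lambda f: f.rpn, reverse=True):
        print(f"RPN {fm.rpn:>3}  {fm.component}: {fm.failure}")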

7. ESTABLISH PROCESSES FOR CONTINUAL QA IMPROVEMENT

Over time, the long-term impact of QA on a program needs to be evaluated and assessed so the program can evolve and improve. A robust continuous improvement methodology should collect quality metrics from a wide variety of sources, analyze the data and contribute to the adjustment of QA processes. The methodology should also incorporate best practices from other complex initiatives and provide a forum for open discussion of lessons learned. These improvement activities can occur as milestones are reached in the program and during a postmortem discussion of the initiative. The continual improvement methodology powers a repeating Plan, Do, Check, Act (PDCA) cycle that benefits and evolves from the changes it spurs, so that each repetition of the cycle results in better QA than the last.

[Figure: QA improvement driven by a repeating Plan, Do, Check, Act (PDCA) cycle.]

©2015 FIS and/or its subsidiaries. All Rights Reserved.