Considerations When Validating Your Analyst Software Per GAMP 5




WHITE PAPER: Analyst Software Validation Service
Blair C. James, Stacy D. Nelson

Introduction

The purpose of this white paper is to assist users in understanding what must be addressed when validating their Analyst Software in accordance with GAMP 5.

What is Software Validation?

Software validation, in the context of government regulation, is distinct from hardware qualification (i.e., IQ/OQ/IPV).1 Rules and regulations for laboratory systems can be found in 21 CFR 211 (Good Manufacturing Practice), 21 CFR 58 (Good Laboratory Practice), 21 CFR 820 (Quality System Regulation for Medical Devices), and 21 CFR Part 11 (Electronic Records and Electronic Signatures), as well as several others.2,3 An analysis of the relevant regulations is beyond the scope of this paper. However, the spirit of the regulations can be summarized as follows: validation is a required activity that should document that the system is secure, reliable, and fit for its intended purpose. It must also provide procedures so that the system remains in a validated state throughout its operation.

The FDA considers software validation to be "confirmation by examination and provision of objective evidence that software specifications conform to user needs and intended uses, and that the particular requirements implemented through software can be consistently fulfilled."4 Note the terms "objective evidence" and "particular requirements." Confirmation of conformity to user needs and intended uses is obtained by comparing actual system performance against particular, pre-determined requirements. During validation, this comparison is accomplished by executing test procedures and collecting objective evidence (screen shots, printed reports, result data files, etc.). The point is not to produce a mountain of documentation, but rather to demonstrate that the software satisfies the intended use. It is also important to demonstrate that validation activities were properly planned and that the tests were executed according to the plan.

Who is Responsible for Validation?

Under OECD guidance,5 validation is the responsibility of Test Site Management. Under GLP, it is the responsibility of the System Owner or Business Process Owner; often, but not always, this is the GLP laboratory manager. While the laboratory manager will be a key member of the validation team, the team must include representatives of all validation stakeholders, and the Quality Assurance team certainly has a role to play. Ultimately, upper management must provide the impetus and resources for validation, and must also formally accept that the system is validated. While an organization can use third parties to design and perform the validation, the responsibility for validation and for the maintenance of a validated state cannot be delegated.

To Validate or Not?

The most important update in GAMP 5 is the focus on risk management as it relates to patient safety.4 GAMP 5 requires validation if there could be an impact on patient safety, product quality, or data integrity.6 Therefore, the decisions of whether to validate, what to validate, and how to validate are largely an exercise in risk management. When performing a risk assessment in accordance with GAMP 5, it is important to assess the probability and severity of harm to the patient if a fault occurs, rather than the probability of the fault occurring.
Risk must therefore be assessed based on critical functionality. In Analyst, for example, acquisition of samples is more critical than formatting reports, so more in-depth testing is needed for acquisition. GAMP 5 also recognizes that higher system complexity increases the likelihood of risk.1 For example, a system configured to use the Analyst Administrator Console (AAC) for centralized security and project file management is more complex than a standalone Analyst system; therefore, additional tests are required to validate a system with AAC.
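To make the risk-based approach concrete, the sketch below shows one way a validation team might record and rank functional risks so that testing depth follows criticality. The scoring scales, example values, and threshold are illustrative assumptions only; GAMP 5 does not prescribe them.

```python
# Illustrative risk ranking for Analyst functions (assumed scales and values,
# not prescribed by GAMP 5): severity and probability of harm are scored 1-3,
# and detectability is scored 1-3, where 3 means the failure is hard to detect.

from dataclasses import dataclass

@dataclass
class FunctionalRisk:
    function: str
    severity: int       # severity of harm to the patient if the function fails
    probability: int    # probability of harm (not of the fault itself)
    detectability: int  # 3 = failure unlikely to be detected before it has an impact

    @property
    def priority(self) -> int:
        # Simple multiplicative score; a higher score calls for deeper testing.
        return self.severity * self.probability * self.detectability

risks = [
    FunctionalRisk("Sample acquisition", severity=3, probability=2, detectability=3),
    FunctionalRisk("Quantitation",       severity=3, probability=2, detectability=2),
    FunctionalRisk("Report formatting",  severity=1, probability=1, detectability=1),
]

for risk in sorted(risks, key=lambda r: r.priority, reverse=True):
    depth = "in-depth" if risk.priority >= 12 else "minimal"
    print(f"{risk.function:20s} priority={risk.priority:2d} -> {depth} testing")
```

In this illustrative scheme, sample acquisition ranks highest and would receive the deepest testing, while report formatting would be tested minimally.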

The Cost of Compliance vs. the Cost of Non-compliance

The decision to forego validation when it is required amounts to accepting the unmanageable risk of non-compliance. A brief review of recent judgments against pharmaceutical companies and independent/contract laboratories shows that the cost of compliance may be thousands of dollars, but the cost of non-compliance can be millions of dollars, along with lost revenue, lost productivity, possible process rework, and damaged investor and customer confidence and goodwill. Compliance can be achieved by validating critical sub-systems thoroughly and other functions minimally. This does not eliminate risk, but it reduces risk to manageable levels while still controlling validation effort and expense. The decision to fully validate all components of a system reduces risk further, but at a higher cost that is not necessarily proportional to the additional risk reduction.

Prospective vs. Retrospective Validation

Ideally, the validation process should begin at the earliest stages of system procurement. It is not possible to select a computerized system wisely without an explicit understanding of the needs of stakeholders and the requirements of regulators: to ensure that a system meets its intended needs, one must have a clear definition of what those requirements are. In practice, it is sometimes necessary to validate an existing system already in use. In this case, it is still necessary to clearly state the requirements of the system and to verify that those requirements are met. The validation process continues throughout the lifecycle of the system. As changes to the system are made, it will be necessary to confirm that the system remains validated in the face of those changes. Planning for the ultimate retirement of the system is also part of the validation process.

Controls: Technical and Procedural

To satisfy the regulations, it is necessary to place controls on the system. Controls are of two types: technical controls and procedural controls. Technical controls are enforced through hardware and software; they reduce human effort through automation, thereby reducing the incidence of human error. Procedural controls are processes that are documented, approved, and enforced, typically in a Standard Operating Procedure (SOP). An example involving both technical and procedural controls is a laboratory door with an electronic lock. Procedurally, the lab should have an SOP describing the assignment, distribution, and maintenance of identification devices, such as entry of a pass code, presentation of an electronic identifier such as a badge, or use of biometric identity verification. The badge reader or biometric lock is a technical control, because the lock uses hardware and software to allow or deny entry to the lab. A procedural control would instruct a user to identify themselves to the lock in order to open the door, and would require that the badge remain in the sole possession of the employee to whom it was assigned. If an employee lends her badge to another employee, who then uses it to access a restricted area, the procedural control is compromised. As this example shows, it is important to put both technical and procedural controls in place.

Standard Operating Procedures

As described above, SOPs are a major part of the procedural controls of the system. Some requirements, such as training, cannot be satisfied through technical controls and must be satisfied through procedural controls.
Important SOPs include issuance and control of usernames and passwords, training procedures, change control procedures, documentation maintenance procedures, backup and restoration of data, and archival and retrieval of data. It is also worthwhile to document your company's software validation procedures in an SOP.

Change Control

Change is inherent in any computerized system. As new requirements are identified, errors are found, and procedures are revised, changes to the system will be necessary. It is essential that changes to a validated system be carefully controlled. Any change contemplated should be documented, analyzed, and tested. It is not adequate to test only the change: a change to one subsystem might affect other, seemingly unrelated, parts of the system. Minimally, a change should be requested in writing via a Change Request. The Change Request must be analyzed and approved by the technical resources involved, the risk assessment may need to be updated, and finally the Change Request must be approved by the Quality Assurance Unit. The Change Control policy should be documented in an SOP. Failure to control change will result in a system that is no longer validated and will expose the business to the risk of non-compliance.

Software Categories

Originally, the GAMP Good Practice Guide: Validation of Laboratory Computerized Systems classified computer software into five categories.3 GAMP 5 made some changes to the categorization of software: Category 2 was discontinued, and the remaining categories were not renumbered. Therefore, Analyst remains in Category 4, configured commercial off-the-shelf (configurable COTS) software.7

Analyst is configurable because it accommodates the storage and persistence of user names, passwords, customized audit trails, and instrument configuration. The effort required to validate a configurable system such as Analyst is greater than that required to validate operating systems, firmware, and standard software. Analyst can also be customized using Microsoft Visual Basic (VB); custom or bespoke software requires even greater validation effort.

GAMP 5 Increased Awareness of Configurable and Networked Systems

GAMP 5 recognizes that most computerized systems are now based on configurable packages that utilize computer networks. It therefore recommends that software testing focus on the specific configuration of the program rather than on core operational characteristics, especially when the supplier can demonstrate testing of the core operational functionality of the system.8 Supplier audit programs take on greater importance under GAMP 5, because supplier certificates are increasingly accepted in lieu of on-site supplier audits.

Changes to the V Validation Life Cycle

Because GAMP 5 recognizes that most systems are configurable software, it suggests a simplified V validation lifecycle, as shown in Figure 1.

[Figure 1. GAMP 5 Validation Lifecycle: Plan, Specify, Configure, Verify, and Report, with Risk Management applied across the lifecycle.]

Important Documents

At a minimum, the validation documentation set should contain documents 01-08, 10, and 13 listed in the table below. Documents 09, 11, and 12 are special documents provided by AB SCIEX for software validation of Analyst. The table also includes a mapping of these documents to the GAMP 5 validation lifecycle.

     Validation Document                                  Lifecycle Category
01   Validation Plan (VP)                                 Plan
02   Validation Risk Assessment (VRA)                     Plan and Risk Management
03   User Functional Requirements Specification (UFRS)    Specify
04   System Design Specification (SDS)                    Specify
05   Test Plan                                            Plan
06   Installation Qualification Protocol                  Configure and Verify
07   Operational Qualification Protocol                   Verify
08   Performance Qualification Protocol                   Verify
09   21 CFR Part 11 Assessment                            Verify
10   Traceability Matrix                                  Plan
11   QAU Review                                           Verify/Report
12   Vendor Assessment                                    Verify/Report
13   Validation Summary Report (VSR)                      Report

It is important for the validation document set to be well organized. This can be accomplished by numbering the documents in a clear, easy-to-read manner. For example, each document is numbered 01-13 as shown above. Each document also has a code corresponding to the company, document, and version, such as CMPY-SDS-01. The following example shows the typical introduction ("Terms used in this document") section of the Analyst document set:

1. Introduction
Terms used in this document:
  VENDOR       AB SCIEX, Inc.
  COMPANY      COMPANY NAME (CMPY)
  TEAM         Analyst Software Validation Project Team, defined in the Software Validation Plan CMPY-SVP-01
  SYSTEM       Defined in the System Design Specification CMPY-SDS-01
  Analyst IQ   Analyst Software Installation Qualification, created by the TEAM in accordance with the Software Validation Plan CMPY-SVP-01

This type of cross-referencing makes for ease of use and modularity, and can assist any auditor in finding information quickly. The following sections provide an overview of each validation document.
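Before turning to the individual documents, the short sketch below illustrates how the numbering scheme above lends itself to simple automated checks, in this case confirming that the minimum document set (01-08, 10, and 13) is present. The code is only an illustration derived from the table above; it is not an AB SCIEX deliverable.

```python
# Hypothetical completeness check for the validation document set described in
# the table above. The document numbers and titles come from the table; the
# notion of a "minimum set" (01-08, 10 and 13) comes from the text.

DOCUMENTS = {
    "01": "Validation Plan (VP)",
    "02": "Validation Risk Assessment (VRA)",
    "03": "User Functional Requirements Specification (UFRS)",
    "04": "System Design Specification (SDS)",
    "05": "Test Plan",
    "06": "Installation Qualification Protocol",
    "07": "Operational Qualification Protocol",
    "08": "Performance Qualification Protocol",
    "09": "21 CFR Part 11 Assessment",
    "10": "Traceability Matrix",
    "11": "QAU Review",
    "12": "Vendor Assessment",
    "13": "Validation Summary Report (VSR)",
}

MINIMUM_SET = {"01", "02", "03", "04", "05", "06", "07", "08", "10", "13"}

def missing_documents(present: set) -> list:
    """List the minimum-set documents that are not yet in the document set."""
    return [f"{num} {DOCUMENTS[num]}" for num in sorted(MINIMUM_SET - present)]

# Example: a document set that still lacks the Traceability Matrix and the VSR.
print(missing_documents({"01", "02", "03", "04", "05", "06", "07", "08"}))
```

A similar check could flag documents whose codes do not follow the CMPY-XXX-NN convention described above.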

The Validation Plan (VP)

The Validation Plan is a key strategic planning document9 that describes the entire validation effort and covers the system lifecycle from inception to retirement. The regulations place great emphasis on the Validation Plan, because it is the key to controlling the validation project. At a minimum, the Validation Plan should describe the scope of the validation project, the work to be done, the schedule of activities, and the individuals responsible for planning, execution, testing, and approval. Instructions for testing (including protocol execution and collection of objective evidence), post-validation activities, deliverables, and instructions for identifying and documenting anomalies may be included in the Validation Plan or in the Test Plan, depending upon the needs of the System Owner.

Validation Risk Assessment (VRA)

The VRA documents the risks in accordance with GAMP 5 and includes prescribed mitigations for each risk.

User Functional Requirements Specification (UFRS)

The UFRS contains objectively stated requirements that the system is intended to fulfill. It must address Analyst technical controls, procedural controls, capacities, accuracy, security, fault tolerance, physical environment, and training requirements, among others. It is critical that the UFRS be a complete statement of the needs and objectives of the acquiring organization. A typical UFRS will contain up to several hundred unique requirements.

The System Design Specification (SDS)

Because Analyst software is configurable, it is adaptable to variations in instrument and peripheral equipment setup, security, and data processing. Therefore, it is necessary to describe the intended configuration in a System Design Specification (SDS). Some examples of the Analyst features addressed in the SDS are security and user roles, audit trail settings, equipment configuration, and quantitation settings.

Test Plan (TP)

The Test Plan ensures that each test is supported by a user requirement. At a minimum, it serves as a forward-pointing traceability matrix showing the relationship of the tests to the user requirements.

Qualification Protocols

The qualification of the Analyst software can be separated into four phases: Design Qualification (DQ), Installation Qualification (IQ), Operational Qualification (OQ), and Performance Qualification (PQ). Since it is not always clear in which phase a particular requirement or test belongs, the following guidance may be helpful. Because Analyst is GAMP 5 Category 4 software, the DQ is performed by AB SCIEX, the vendor. This can be verified by performing a vendor audit or by accepting vendor certifications. AB SCIEX provides a standard postal audit that addresses the common elements of a vendor audit; it identifies the development and verification methodology and the quality procedures, such as ISO 9001, as implemented by AB SCIEX.

IQ, OQ, and PQ testing involves the execution of a pre-defined set of tests. A test script contains the instructions, the expected results, and the acceptance criteria.
It also has a place for the tester to indicate whether the test passed or failed. Good test scripts reference each applicable requirement specifically; a single test script might address one requirement or several. Normally, test scripts are organized around a specific range of functionality, such as security or data acquisition. Test scripts must be carefully designed and should include both positive and negative tests. For example, a test for password acceptance will verify the result of entering a valid password as well as the result of entering an invalid password (a hypothetical sketch of such a check appears below). If a test step fails, an Anomaly Report is required. The report should identify the nature of the deviation, the test script or procedure in which the deviation occurred, the proposed corrective action, and responsibility for implementation and verification.

21 CFR Part 11 Assessment

The Analyst software can be configured to meet the requirements of 21 CFR Part 11, Electronic Records; Electronic Signatures (Part 11). Part 11 regulates the security, reliability, and integrity of laboratory data, and the security and integrity of electronic signatures. The predicate rules contain relatively few signature requirements. Where signatures are required, such as in a data audit trail, Part 11 defines how an electronic signature must be derived and what the electronic signature means. Many Part 11 requirements will be met with a combination of technical and procedural controls.
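As an illustration of the positive and negative testing described earlier, applied to a Part 11-relevant access control, the sketch below shows how such a check might look if automated. Analyst test scripts are normally executed manually, with objective evidence collected at each step; the authenticate function here is a hypothetical stand-in for the system under test, not the Analyst API, and the requirement IDs are invented for illustration.

```python
# Hypothetical positive/negative test for password acceptance, written against an
# assumed login interface ("authenticate") that stands in for the system under test.
# Real Analyst OQ scripts are typically executed manually, with screen shots or
# printed reports collected as objective evidence for each step.

VALID_USERS = {"analyst_user": "C0rrect-Passw0rd!"}

def authenticate(username: str, password: str) -> bool:
    """Stand-in for the system's login check (an assumption for illustration)."""
    return VALID_USERS.get(username) == password

def test_valid_password_is_accepted():
    # Positive test: expected result is a successful login (traces to, e.g., UFRS-SEC-001).
    assert authenticate("analyst_user", "C0rrect-Passw0rd!") is True

def test_invalid_password_is_rejected():
    # Negative test: expected result is a rejected login (traces to, e.g., UFRS-SEC-002).
    assert authenticate("analyst_user", "wrong-password") is False

if __name__ == "__main__":
    test_valid_password_is_accepted()
    test_invalid_password_is_rejected()
    print("PASS: positive and negative password checks")
```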

The Analyst software is not compliant with Part 11 until it has been configured correctly and appropriate SOPs are in place. The 21 CFR Part 11 Assessment contains a checklist to verify that the 21 CFR Part 11 regulations have been met.

Traceability Matrix

The Traceability Matrix (TM) shows the relationship of each user requirement to a corresponding test script (in the case of technical controls) or SOP (for procedural controls). The TM makes it possible to confirm that each user requirement has been addressed and satisfied by the validation procedure.

Quality Assurance Review

The Quality Assurance (QA) or Quality Control (QC) department must be actively engaged in the validation effort, and management's approval of the validation depends on the recommendation of QA. Fortunately, GAMP 5 simplified the document approval process: technical experts are now empowered to approve technical documentation, while QA ensures that documents meet the applicable regulations. For example, QA should review a UFRS against the applicable regulations, but the technical review of the UFRS is the responsibility of technical subject matter experts. Thus, QA no longer needs to sign a design specification, because it can rely on the technical subject matter experts. QA should verify that design specifications are being produced for projects (i.e., verify that processes are being followed), but QA does not need to sign every document in a project.1 After a final review for completeness, the QA department must submit recommendations to management regarding the release of the system for use. The best way to ensure that the QA department can recommend the release of the system is to include QA in the validation effort throughout the lifecycle. The Quality Assurance Review document provides a checklist to ensure that the above criteria have been met. It also requires signatures denoting the pass/fail status of all validation documents.

Planning for Maintenance of the Validated State

Analyst validation is not a one-off process; the validation effort should encompass the entire system lifecycle, from inception to retirement. The most important tool for maintaining a system in its validated state is the Change Control procedure. By carefully following a pre-defined plan for evaluating and approving changes to the system, the physical environment, and the procedural environment, a system can be maintained in a validated state over time.

Replicate System Validation

When more than one instrument is installed in a laboratory at the same time, the system can be replicated. In that case, it would be redundant and costly to perform a complete software validation on each system. The following provides guidance for tailoring the software validation for replicated systems. First, the Validation Plan must state that several instruments are being validated at the same time. The strategy is to test all requirements on one system, called the First in Family, and then to test a subset of requirements on the replicated systems. The Test Plan must identify the tests to be performed on the First in Family and the tests to be performed on the replicated systems. Tests to be performed on replicated systems include security, audit trail configuration, and acquisition settings; testing is thus limited to confirming that the configuration is correct. Additionally, an acquisition of a small number of samples (about 10) must be run to ensure that the instrument is acquiring properly.
Quantitation and reporting would not need to be tested, because these functions have already been tested on the First in Family. There should be two sets of IQ/OQ/PQ protocols: one for the First in Family and one for the replicated systems. The Validation Traceability Matrix should trace both the First in Family and the replicated systems; more than one trace table may be required (a hypothetical sketch of such a matrix follows the Validation Summary Report section below). The Validation Summary Report must list the validation status of each system and any anomalies encountered for each system.

Vendor Assessment

The Vendor Assessment contains the standard postal audit for AB SCIEX.

Validation Summary Report (VSR)

The VSR contains the results of the software validation project, including the decision as to whether the system passed or failed.
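To tie the Traceability Matrix and the replicate-system strategy together, the sketch below shows one way to represent a matrix that distinguishes First in Family testing from the subset executed on replicated systems. The requirement IDs, test script names, and flags are illustrative assumptions; in practice the matrix is maintained as a controlled document within the validation document set.

```python
# Hypothetical traceability matrix covering a First in Family system and its
# replicates. Requirement IDs, test script names and the "run on replicates"
# flag are illustrative assumptions, not part of the Analyst document set.

TRACE = [
    # (requirement, verified by, run on replicated systems?)
    ("UFRS-SEC-001  Unique user accounts",   "OQ-SEC-01 test script",   True),
    ("UFRS-AUD-003  Audit trail enabled",    "OQ-AUD-02 test script",   True),
    ("UFRS-ACQ-010  Sample acquisition",     "PQ-ACQ-01 test script",   True),
    ("UFRS-QNT-020  Quantitation accuracy",  "PQ-QNT-01 test script",   False),
    ("UFRS-TRN-030  Analyst user training",  "SOP-TRN-05 (procedural)", False),
]

def tests_for(system: str) -> list:
    """All items for the First in Family; only the flagged subset for replicates."""
    first_in_family = system == "first-in-family"
    return [test for _, test, on_replica in TRACE if first_in_family or on_replica]

print("First in Family:   ", tests_for("first-in-family"))
print("Replicated system: ", tests_for("replicate-02"))
```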

Conclusion

Analyst validation need not be an onerous undertaking. By adopting the best practices prescribed by GAMP 5, other regulatory bodies, and professional societies, validation can be performed efficiently. GAMP 5 brought several changes to software validation, including:
- validation based on risk management, with more testing required for functionality that could impact patient safety, product quality, or data integrity;
- increased awareness of configurable and networked systems;
- changes to the V validation lifecycle; and
- a simplified document approval process.
In addition to regulatory compliance, the processes and business objectives of the organization are enhanced by proper validation.

Contact Us

Contact your local AB SCIEX sales representative or email AB SCIEX Validation Services at SoftwareValidation@absciex.com.

REFERENCES

1. GAMP Good Practice Guide: Validation of Laboratory Computerized Systems. International Society for Pharmaceutical Engineering, 2005.
2. The Good Automated Manufacturing Practice (GAMP) Guide for Validation of Automated Systems in Pharmaceutical Manufacture (GAMP 4). International Society for Pharmaceutical Engineering, 2001.
3. IEEE Standard for Software Verification and Validation (IEEE Standard 1012). The Institute of Electrical and Electronics Engineers, 2005.
4. General Principles of Software Validation; Final Guidance for Industry and FDA Staff. U.S. Department of Health and Human Services, Food and Drug Administration, 2002.
5. www.oecd.org
6. GAMP 4 to GAMP 5 Summary. ISPE, 2008.
7. GAMP 5 Newsletter: Holistic Approach to Science-Based Risk Management, 13 March 2008.
8. Martin, K. C. and Perez, A. R. GAMP 5 Quality Risk Management Approach. Pharmaceutical Engineering, May/June 2008, Vol. 28, No. 3.
9. Draft Guidance for Industry: 21 CFR Part 11; Electronic Records; Electronic Signatures, Validation (withdrawn). U.S. Department of Health and Human Services, Food and Drug Administration, 2001.

For Research Use Only. Not for use in diagnostic procedures.
Microsoft and Windows are registered trademarks of Microsoft Corporation. © 2010 AB SCIEX. The trademarks mentioned herein are the property of AB Sciex Pte. Ltd. or their respective owners. AB SCIEX is being used under license.
IN NO EVENT SHALL AB SCIEX BE LIABLE, WHETHER IN CONTRACT, TORT, WARRANTY, OR UNDER ANY STATUTE OR ON ANY OTHER BASIS, FOR SPECIAL, INCIDENTAL, INDIRECT, PUNITIVE, MULTIPLE OR CONSEQUENTIAL DAMAGES IN CONNECTION WITH OR ARISING FROM AB SCIEX PRODUCTS AND SERVICES OR THE USE OF THIS DOCUMENT.
Publication number 0490110-01
Headquarters: 353 Hatch Drive, Foster City, CA 94404 USA. Phone 650-638-5800. www.absciex.com
International Sales: For our office locations please call the division headquarters or refer to our website at www.absciex.com/offices