PERFORMANCE EVALUATION AUDIT CHECKLIST EXAMPLE. EIIP Volume VI




APPENDIX E - PERFORMANCE EVALUATION AUDIT (Final 7/96)

PERFORMANCE EVALUATION AUDIT CHECKLIST EXAMPLE


The Computer Audit Checklist provides guidelines for evaluating the adequacy of a computer that could be used to calculate emissions, store inventory data, and generate tables of results for the final inventory report. The testing, calibration, and evaluation of the operating procedures of instruments are all important for assuring the quality of data; a computer is also an instrument and should be treated as such. Because the inventory staff depend on computers to complete the inventory, an assessment of the testing, operation, and use of the computer would be one of the most important audits conducted during the planning and development phases of the inventory.


AUDIT CHECKLIST
COMPUTER DATA ACCURACY

Procedure: Select a date with results passing and not passing quality acceptance criteria, if possible. Determine whether the reported results accurately reflect the raw data. Use an alternate method to verify the accuracy of the computer manipulations that yield calculated results.

1. Identify data maintained only on the hard drive.
2. Locate and identify data maintained on hard copies.
3. Are quality control data referenced to sample results (standards, blanks, calibrations, replicates, duplicates, spikes, instrument conditions, surrogates, internal standards, etc.)? (LAB DATA)
   a. Are the references to quality control data protected, or can they be easily changed?

   b. Are the references sufficient to associate quality data with individual sample results?
4. Are data outside of acceptance criteria flagged?
   a. If yes, are the actions taken described?
5. Are the detection limits for target analytes clearly defined or referenced in the raw data? (LAB)
   a. Are they accurately represented in the report?
   b. Were changes made to reflect dilutions?
6. Are the data reduction steps defined in a written procedure?
   a. Is the procedure sufficient to recalculate results?
   b. Are the data sufficient to verify results?
   c. Were errors found in the data verified? If yes, explain and identify results (use an additional comment sheet if necessary).
7. Are the data accurately reported?
   a. Are the units accurate?
   b. Are the data accurately flagged to note bias or confidence in the results reported?
8. Can results be traced back to the original data associated with a specific test article or data set? If not, describe the problem.
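The procedure above calls for an alternate method to verify the accuracy of computer manipulations. The following is a minimal sketch of such an independent cross-check, with hypothetical field names and a hypothetical blank-corrected, dilution-adjusted calculation standing in for whatever reduction steps the audited system actually performs:

```python
# Sketch of an "alternate method" cross-check: recompute each reported
# result independently from the raw data and flag disagreements. The
# record fields and the reduction formula below are hypothetical
# examples, not part of the checklist itself.

TOLERANCE = 1e-6  # relative tolerance for agreement

def recalculate(raw_reading, dilution_factor, blank):
    # Hypothetical reduction step: blank-correct, then apply dilution.
    return (raw_reading - blank) * dilution_factor

def audit_results(records):
    """Return the records whose reported value disagrees with an
    independent recalculation from the raw data."""
    discrepancies = []
    for rec in records:
        expected = recalculate(rec["raw"], rec["dilution"], rec["blank"])
        if abs(expected - rec["reported"]) > TOLERANCE * max(abs(expected), 1.0):
            discrepancies.append((rec["sample_id"], rec["reported"], expected))
    return discrepancies

records = [
    {"sample_id": "S-001", "raw": 12.40, "dilution": 10.0, "blank": 0.40, "reported": 120.0},
    {"sample_id": "S-002", "raw": 8.15, "dilution": 5.0, "blank": 0.15, "reported": 41.0},
]
print(audit_results(records))  # only S-002 should disagree
```

Any sample that appears in the output fails the checklist's accuracy test and should be traced back to its raw data (item 8).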

AUDIT CHECKLIST
COMPUTER PROGRAM DOCUMENTATION AND VALIDATION

1. Are the following documents present?
   a. Software management plan
   b. Software development plan and results
   c. Software test and acceptance plan
   d. Software user's operations document
   e. Software maintenance document
   f. Assessment of hardware
2. Are the following documented:
   a. Program
   b. Table of definitions
   c. System size and timing requirements
   d. Definitions of subsystems
   e. Requirements for:
      -- Hardware
      -- Electricity
      -- Security
   f. Backup and disaster recovery procedures
   g. Quality requirements for:
      -- Reliability
      -- Maintainability
      -- Flexibility for expansion
   h. Testing procedures
3. Does software management include the following:
   a. Independent validation

   b. Definitions/identifications of interfaces
   c. Definition of software tools
      -- Identification of program language
      -- Identification of network software requirements
   d. Configuration control (control, release, and storage of master copies)
4. Is there a flow chart or text showing functional flow?
5. Are input and output fields identified?
6. Are there written procedures for software revisions?
   a. Are revisions tested to determine how the entire program is affected?
   b. How are revisions implemented?
   c. Are revisions documented?
7. Does the User's Guide or software description include:
   a. Description of the system (hardware)
   b. Identification of the person to contact when problems occur
   c. How to access the system
   d. How to input data
   e. How to generate reports
   f. How to update data
   g. Description of error codes
   h. Procedures to follow if the system goes down

8. Do testing procedures include the following:
   a. Description of the test procedures to perform
   b. Expected outcome
   c. Documentation of results
   d. Recommendations for handling problems
9. Has security been addressed (statements or passwords to safeguard the accuracy of computer program operation)?
10. Can someone knowledgeable of the program language understand, from reading the program, what it does to the data to yield the expected or desired results? (If not, explain on a separate page.)
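Items 6a and 8 describe revision testing: a fixed set of test procedures with expected outcomes, re-run after every software revision, with the results documented. A minimal sketch of such an acceptance test table, using a hypothetical emission calculation and made-up test values:

```python
# Sketch of the revision testing described in items 6a and 8: a fixed
# table of test cases with expected outcomes, re-run against the program
# after every revision, with every result documented. The emission
# formula and its test values are hypothetical.

def emission_rate(concentration, flow):
    # Hypothetical calculation under test: mass rate = concentration * flow.
    return concentration * flow

# Items 8a-8b: test procedures to perform, each with an expected outcome.
TEST_CASES = [
    {"concentration": 2.0, "flow": 10.0, "expected": 20.0},
    {"concentration": 0.0, "flow": 10.0, "expected": 0.0},
    {"concentration": 1.5, "flow": 4.0, "expected": 6.0},
]

def run_acceptance_tests():
    """Item 8c: document the result of every case, pass or fail."""
    results = []
    for case in TEST_CASES:
        got = emission_rate(case["concentration"], case["flow"])
        results.append({"case": case, "got": got,
                        "passed": abs(got - case["expected"]) < 1e-9})
    return results

results = run_acceptance_tests()
print(all(r["passed"] for r in results))
```

Because the test table is fixed, re-running it after a revision shows directly whether the change affected the rest of the program (item 6a), and the returned records serve as the documentation item 8c asks for.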

AUDIT CHECKLIST
COMPUTER PROGRAM OPERATION

1. Is a password required to access the system?
2. Operator training
   a. Are operators adequately trained?
   b. Is training documented?
3. When testing the system, are there system delays that hamper the job?
4. Are error messages given if an entry error is made (e.g., data out-of-range)?
5. Does the system prevent entry of data which are out-of-range?
6. Are there prompts if fields are missed that are required for data manipulations?
7. How are default parameters assigned by the system?
8. Does the system carry over data from one screen to the next in order to minimize entry errors?
9. Are data entered into the central database via computer-readable media? If yes, do the data include information concerning:
   a. The source of the data
   b. When the data were collected
   c. The conditions under which they were collected
   d. Pointers to link the data to quality control data

   e. Quality control (QC) flags indicating the level of data acceptability
10. If the data are entered by prompting the system to access a previously existing data file, are the data validated by:
    a. Comparison of the number/size of files transferred
    b. A log which documents the files transferred
    c. Documenting or creating a record of the data, date, and name of the person transferring the data
    d. Periodic audits of data transfer
       -- Are the audits documented?
11. Are data entered into the central data system via a link from an instrument? If yes, is the following information given:
    a. Data generated
    b. Date and time generated
    c. Identification of the instrument
    d. QC flags indicating the level of acceptability of the data
    e. Instrument operating conditions (LAB)
12. If data are entered by direct link from an instrument, are the data validated by:
    a. Voltage or calibration checks
    b. Comparison of results
    c. Comparison of hardcopy output with database contents
    d. Are data reasonableness checks either built into the data capture system or the instruments?
13. Data changes
    a. How are data corrections made?
    b. Are corrections verified?

    c. Are corrections documented in a written log (who, when, authorization)?
    d. Is there a computer-generated record of the changed and unchanged data?
    e. If changes were made to data transferred from another source, was the original source corrected?
    f. If changes are made to flags from a central database:
       -- Who determines the need to make the change?
       -- Is authorization for the revision documented?
       -- Is the change adequately documented?
14. Data reduction, analysis, and assessment
    a. If data quality flags are used, are they defined?
    b. Are the qualifying flags correct?
    c. Can new flags be created? If so, how are they created?
    d. Are the mathematical expressions used by the system available in written format?
    e. Were the mathematical expressions reviewed for accuracy?
    f. Was the validation of the mathematical expressions documented?
    g. Did revisions affect the overall performance of data manipulations?
    h. If mathematical expressions are modified:
       -- Is the reason for the modification documented?
       -- Are old data recalculated with the new formulas?

    i. Are printouts of report modifications routinely checked for accuracy?
       -- By whom?
       -- Documented?
       -- Percent checked: ____%
15. Data output
    a. Are there written procedures for generating data output (graphs, charts, reports)?
    b. Are the data used to generate the output adequately identified?
    c. After the final outputs are generated, is the database "locked" so that no further changes can be made without managerial consent?
    d. Are outputs generated in a timely fashion (no unusual delays caused by the database)?
    e. Are final copies of output properly archived?
       -- Secure
       -- Precautions against fire
       -- Limited access
    f. Are reports electronically generated?
    g. If reports are electronically generated, how is it done?
       -- Direct computer link
       -- Magnetic media
       -- Verified upon transmission
16. Backups
    a. Are system backups performed?
    b. If backups are performed, indicate the frequency.
    c. Are backups partial or total?

    d. Is there an individual responsible for backups? If so, who?
    e. On what media are backups stored?
       -- Magnetic tape
       -- Disks
       -- Diskettes
       -- Other
    f. Are the media storing backups properly labelled?
    g. Are backups documented?
       -- Written log
       -- Project identified
       -- Computer system identified
       -- Other
    h. How are data from backups stored short term?
    i. How are backup data stored long term?
    j. Are the backup data arranged for expedient retrieval from the storage area(s)?
    k. Are long-term backup data stored off site or in a location other than the original data?
    l. Is the data storage area adequate?
       -- Security assured
       -- Access limited
       -- Fire precautions
       -- Environmentally controlled
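Item 16g asks for a written backup log identifying the project and computer system. A minimal sketch of such a log entry is below; the field names, project name, and media label are illustrative, not taken from the checklist:

```python
# Sketch of the backup documentation item 16g calls for: a log entry
# recording the date, project, computer system, media label, whether the
# backup was partial or total, and what was backed up. All names and
# labels below are illustrative.

from datetime import date

def record_backup(log, project, system, media_label, total, files):
    log.append({
        "date": date.today().isoformat(),   # when the backup was made
        "project": project,                 # item 16g: project identified
        "system": system,                   # item 16g: computer system identified
        "media": media_label,               # item 16f: labelled media
        "type": "total" if total else "partial",  # item 16c
        "file_count": len(files),
    })

backup_log = []
record_backup(backup_log, project="Area Source Inventory",
              system="inventory-db", media_label="TAPE-0042",
              total=True, files=["emissions_q1.csv", "emissions_q2.csv"])
print(backup_log[0]["type"], backup_log[0]["file_count"])
```

Keeping the log as structured records rather than free text makes the retrieval question in item 16j easy to answer: entries can be searched by project, system, or media label.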

17. Hardware maintenance
    a. Are there procedures for conducting and documenting preventative maintenance?
    b. Is there a regularly scheduled preventative maintenance program? Frequency?
    c. Is preventative maintenance documented (who, what, when)?
    d. Is non-routine maintenance performed by in-house staff? If yes, how is it documented?
    e. How is non-routine maintenance documented?
    f. What provisions, if any, are made for system downtime?

    g. Has downtime adversely affected the project? If yes, explain.
    h. If the system fails because of electrical glitches or a power outage, what happens to the system?
       -- Is a backup power source available?
       -- Does backup power start automatically or manually?
       -- How are power failures indicated if the system is running?
       -- Does the system lose the data being processed?
       -- Does the system restart where it left off?

       -- If data are lost, does the system indicate the loss?
       -- While the system is running, is there a backup procedure to minimize data loss if the system goes down?
       -- If the system goes down, are the time down and time restored easy to determine?
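Item 17h probes whether a run can survive a power failure: whether in-flight data are lost and whether the system restarts where it left off. One common way to achieve this is periodic checkpointing of intermediate results; the sketch below illustrates the idea with an invented checkpoint file and invented work items, not anything prescribed by the checklist:

```python
# Sketch of a "backup procedure to minimize data loss": checkpoint
# intermediate results after each unit of work so that, after a power
# failure, the run resumes where it left off instead of starting over.
# The checkpoint path and the work items are illustrative.

import json, os, tempfile

CHECKPOINT = os.path.join(tempfile.gettempdir(), "inventory_checkpoint.json")

def load_checkpoint():
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            return json.load(f)
    return {"done": [], "totals": 0.0}

def save_checkpoint(state):
    # Write to a temp file and rename, so a crash mid-write never leaves
    # a half-written checkpoint behind.
    tmp = CHECKPOINT + ".tmp"
    with open(tmp, "w") as f:
        json.dump(state, f)
    os.replace(tmp, CHECKPOINT)

def process(sources):
    state = load_checkpoint()
    for source_id, value in sources:
        if source_id in state["done"]:
            continue                   # already processed before the crash
        state["totals"] += value
        state["done"].append(source_id)
        save_checkpoint(state)         # restart point recorded on disk
    return state["totals"]

if os.path.exists(CHECKPOINT):
    os.remove(CHECKPOINT)              # start clean for this demonstration
print(process([("S1", 1.0), ("S2", 2.5)]))
```

Because the list of completed sources is on disk, re-running `process` with the same inputs skips finished work, which is exactly the "starts where it left off" behavior the checklist asks the auditor to verify.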
