1 Quality Assurance and Quality Control Project Plan




The purpose of this section is to describe the quality assurance/quality control (QA/QC) program that will be used during the system-specific field testing to ensure that data and procedures are of measurable quality and support the quality objectives and test plan objectives for this verification test. The QA/QC plan shall be tailored to meet the specific field testing requirements of the system being evaluated and the range of parameters being evaluated (see Footnote 1). The QA/QC plan is written as part of the system-specific Field Testing Protocol and should be read and used with the complete Field Testing Protocol as a reference.

1.1 Verification Test Data Quality Indicators (DQIs)

Several data quality indicators (DQIs) have been identified as key factors in assessing the quality of the data and in supporting the verification process. These indicators are:

Precision
Accuracy
Representativeness
Comparability
Completeness

Each DQI is described below, and the goals for each DQI are specified. Performance measurements will be verified using statistical analysis of the data for the quantitative DQIs of precision and accuracy. If any QA objective is not met during the tests, an investigation of the causes will be initiated. Corrective action will be taken as needed to resolve the difficulties. Data failing to meet any of the QA objectives will be flagged in the verification report, and a full discussion of the issues impacting the QA objectives will be presented.

1.1.1 Precision

Precision refers to the degree of mutual agreement among individual measurements and provides an estimate of random error. Analytical precision is a measure of how far an individual measurement may deviate from the mean of replicate measurements. Precision is evaluated from analysis of field and laboratory duplicates and spiked duplicates. The standard deviation (SD), relative standard deviation (RSD), and/or relative percent difference (RPD) calculated from sample analyses are used to quantify precision. RPD is calculated by the following formula:

RPD = ( |C1 - C2| / [(C1 + C2) / 2] ) x 100%

Where:
C1 = Concentration of the compound or element in the sample
C2 = Concentration of the compound or element in the duplicate

Footnote 1: Specific sampling methods may vary among the range of systems that may be tested, and the QA/QC elements of the sampling protocol should reflect the specific system being tested. The methods for preserving, transporting, and maintaining chain of custody are not expected to vary by system but should conform to scientifically acceptable methods for each parameter. The range of parameters tested may vary in accordance with the Field Testing Protocol.
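To illustrate how the RPD formula above could be applied to a duplicate pair, the following is a minimal sketch in Python. It is not part of the Field Testing Protocol; the function name and the example concentrations are illustrative assumptions only.

def relative_percent_difference(c1, c2):
    """Relative percent difference (RPD) between a sample and its duplicate,
    per the formula above: |C1 - C2| divided by the mean of C1 and C2, times 100."""
    mean = (c1 + c2) / 2.0
    if mean == 0:
        return 0.0  # both results are zero; treat as perfect agreement
    return abs(c1 - c2) / mean * 100.0

# Example: a TSS field duplicate pair (mg/L); values are illustrative only.
sample, duplicate = 182.0, 175.0
rpd = relative_percent_difference(sample, duplicate)
print(f"RPD = {rpd:.1f}%")  # flag for review if this exceeds the Table 7-2 precision limit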

Field duplicates of both influent and effluent samples will be collected. The field duplicates will be collected at a frequency of one duplicate for every ten samples of influent and effluent collected. The laboratory will run duplicate samples as part of the laboratory QA program; duplicates are analyzed at a frequency of one duplicate for every ten samples analyzed.

The data quality objective for precision is based on the type of analysis performed. Table 7-2 shows the laboratory precision that has been established for each analytical method. The data quality objective varies from a relative percent difference of ±10% to ±30%.

1.1.2 Accuracy

Accuracy is defined for water quality analyses as the difference between the measured or calculated sample value and the true value of the sample. Spiking a sample matrix with a known amount of a constituent and measuring the recovery obtained in the analysis is one method of determining accuracy. The accuracy of an analytical method for measuring a constituent in a given matrix can also be monitored using laboratory performance samples with a known concentration in a specific matrix. Accuracy is usually expressed as the percent recovery of a compound from a sample. The following equation will be used to calculate percent recovery:

Percent Recovery = [(AT - Ai) / As] x 100%

Where:
AT = Total amount measured in the spiked sample
Ai = Amount measured in the un-spiked sample
As = Spiked amount added to the sample

During the verification test, the laboratory will run matrix spike samples at a frequency of one spiked sample for every ten samples analyzed. The laboratory will also analyze liquid and solid samples of known concentration as laboratory control samples. The accuracy objectives by parameter or method are shown in Table 7-2.

1.1.3 Comparability

Comparability will be achieved by using consistent and standardized sampling and analytical methods. All analyses will be performed using EPA or other published methods as listed in the analytical section (Table 6-2). Any deviations from these methods will be fully described and reported as part of the QA report for the data. Comparability will also be achieved by using National Institute of Standards and Technology (NIST) traceable standards, including the use of traceable measuring devices for volume and weight. All standards used in the analytical testing will be traceable to verified standards through the purchase of verifiable standards and the maintenance of a standards logbook for all dilutions and preparation of working standards. Comparability will be monitored through QA/QC audits and review of the test procedures used and the traceability of all reference materials used in the laboratory.

1.1.4 Representativeness

Representativeness is the degree to which data accurately and precisely represent a characteristic of a population, parameter variations at a sampling point, a process condition, or an environmental condition.

The test plan design calls for grab and composite samples of influent and effluent to be collected and then analyzed individually or as flow-weighted composites. The sampling locations are designed for easy access and are directly attached to the pipes that carry the wastewater. This design will help ensure that a representative sample of the flow is obtained in each grab or composite sample bottle. The sample handling procedure includes thorough mixing of the composite container prior to pouring the samples into the individual containers. The laboratory will follow set procedures (in accordance with good laboratory practice) for thorough mixing of any samples prior to sub-sampling in order to ensure that samples are homogeneous and representative of the whole sample.

The <System> will be operated in a manner consistent with the supplied O&M manual, so that the operating conditions will be representative of a normal installation and operation of this equipment. Representativeness will be monitored through QA/QC audits (both field and laboratory), including review of the laboratory procedures for sample handling and storage, review and observation of the sample collection, and review of the operating logs maintained at the test site. At least two field and laboratory audits will be performed by the VO or their representative.

1.1.5 Completeness

Completeness is a measure of the number of valid samples and measurements obtained during a test period. Completeness will be measured by tracking the number of valid data results against the requirements specified in the test plan. Completeness will be calculated by the following equation:

Percent completeness = (V / T) x 100%

Where:
V = number of measurements that are valid
T = total number of measurements planned in the test

The goal for this data quality objective is to achieve a minimum of 80% completeness for samples scheduled in the test plan.

1.2 Project Management

1.2.1 Management Team

NSF or another ANSI-certified test facility shall be responsible for management of the verification test, including meeting the Field Testing Protocol objectives and the data quality objectives, as measured by the DQIs. A quality control officer shall be identified in the QA/QC section of the system-specific field protocol. The quality control officer will be responsible for audits, assessment, and review of procedures and quality data. The phone number, email address, and mailing address for each person named shall be provided.

1.2.2 Project Objectives

The primary objective of the field verification protocol is to measure the performance of this technology through a well-defined test plan that includes measurement of key parameters in the wastewater under real-world operating conditions. This objective will be accomplished by implementing the sampling and analysis program described in the Field Testing Protocol and by meeting the data quality objectives described in this Quality Assurance and Quality Control Project Plan. The test plan includes methods used for selecting the systems to be evaluated and for measuring the influent to and effluent from the unit and all removed residuals. The primary parameters being measured are TSS, CBOD5, temperature, and pH; optional secondary parameters that may be measured include TKN, NH3-N, TP, alkalinity, NO3/NO2, and total coliform.

1.2.3 Project Schedule

After system selection, testing shall be conducted for 12 consecutive months. A sampling schedule by day will be established prior to startup of the unit. The schedule will be confirmed with the homeowners, the manufacturer, and the testing laboratory or laboratories, and adjusted as necessary.

1.3 Measurements and Data Acquisition

1.3.1 Sample Collection and Chain of Custody

There are two basic types of samples that may be collected for this verification test: grab samples and flow-weighted composite samples. Samples will be collected only when the unit is actively receiving or discharging wastewater. The sample bottles required for the various analyses will be provided by the designated analytical laboratory. The bottles will contain the appropriate preservative and will be labeled by analysis type. Samples will be placed in coolers with ice to maintain temperature and will be delivered to the laboratory the day of sample collection.

Chain of custody will be maintained for all samples collected during the verification test. The unit operators responsible for sample collection will fill out a chain of custody form. The form will be signed and dated for each set of samples delivered to the analytical laboratory. The receiving technician will acknowledge receipt of the samples by signing the chain of custody form and providing a copy of the form to the sample delivery person. Copies of the completed chain of custody forms will be included with all laboratory reports transmitting final analytical results.

1.3.2 Analytical Methods

All of the analytical methods used during the verification test will be EPA-approved methods or methods from Standard Methods for the Examination of Water and Wastewater, 20th Edition. Table 6-2 shows the analytical methods that will be used for the verification test and the typical detection limits that are achieved by these methods.
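To make the custody trail described in Section 1.3.1 concrete, here is a minimal sketch in Python of how a chain of custody record and its signed transfers might be represented. The field names and example entries are illustrative assumptions only and do not reflect the layout of the actual chain of custody form.

from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class CustodyTransfer:
    """One signed hand-off of a sample set (e.g., unit operator to receiving technician)."""
    released_by: str
    received_by: str
    signed_at: datetime

@dataclass
class ChainOfCustodyRecord:
    """Record for one set of samples delivered to the analytical laboratory."""
    sample_ids: List[str]
    collected_by: str
    collected_at: datetime
    transfers: List[CustodyTransfer] = field(default_factory=list)

# Example entry; names, sample IDs, and times are illustrative only.
record = ChainOfCustodyRecord(
    sample_ids=["INF-001", "EFF-001"],
    collected_by="Unit Operator",
    collected_at=datetime(2024, 1, 15, 8, 30),
)
record.transfers.append(
    CustodyTransfer(released_by="Unit Operator",
                    received_by="Receiving Technician",
                    signed_at=datetime(2024, 1, 15, 14, 0))
)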

1.3.3 Analytical Quality Control

The quality control procedures for blanks, spikes, duplicates, calibration of equipment, standards, reference check samples, and other quality control measurements will follow the guidance in the EPA methods, the analytical laboratory's SOPs, and the analytical laboratory's Quality Assurance and Quality Control Manual. Table 7-1 shows the frequency of analysis of the various quality control checks. Table 7-2 shows the quality control limits that will be used by the laboratory for these analyses and to ensure compliance with the DQIs for accuracy and precision. Field and laboratory duplicates will be performed at a frequency of one duplicate per ten samples collected. Samples will be spiked for accuracy determination at a frequency of one sample per ten samples analyzed by the laboratory. Accuracy and precision will be calculated for all data using the equations presented earlier in this section.
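As an illustration of how the matrix spike recoveries referenced above might be computed and screened against the Table 7-2 accuracy limits, the following is a minimal Python sketch. The function name, example values, and the limits passed in are illustrative assumptions, not part of the laboratory SOPs.

def percent_recovery(spiked_result, unspiked_result, spike_amount):
    """Percent recovery of a matrix spike, per Section 1.1.2:
    (AT - Ai) / As x 100, where AT is the spiked-sample result,
    Ai is the un-spiked result, and As is the amount spiked."""
    return (spiked_result - unspiked_result) / spike_amount * 100.0

# Example: CBOD5 matrix spike (mg/L); values are illustrative only.
recovery = percent_recovery(spiked_result=48.0, unspiked_result=25.0, spike_amount=25.0)
low, high = 75.0, 125.0  # Table 7-2 accuracy limits for CBOD5 (percent recovery)
if low <= recovery <= high:
    print(f"Recovery {recovery:.0f}% within limits")
else:
    print(f"Recovery {recovery:.0f}% outside {low:.0f}-{high:.0f}%; initiate corrective action")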

Table 7-1. Summary of Calibration Frequency and Criteria

pH (Method 150.1)
Calibration frequency: Initial calibration daily; check calibration after every 10 non-calibration analyses.
Calibration points: Initial calibration: two buffers (4 and 7, or 7 and 10) with an independent buffer at 8 (ICV). Continuing calibration check: pH 4, 7, or 10 buffer (CCV), depending on the initial calibration range.
Acceptance criteria: ICV ± 0.1 s.u.; CCV ± 0.1 s.u.

Temperature
Calibration frequency: Once per quarter; NIST traceable.

Turbidity
Calibration frequency: Once per month, unless the ICV is out of range.
Calibration points: Five-point calibration curve; ICV every ten samples.
Acceptance criteria: ICV ± 10% of true value.

Alkalinity
Calibration frequency: Calibrate pH meter every run (see pH above).
Calibration points: Calibrate at pH 8; run check at pH 3; ICV check.
Acceptance criteria: ICV ± 10%.

Total Suspended Solids
Calibration frequency: Calibrate balance daily with NIST-traceable weights.
Calibration points: QC standard each run; blank each set.
Acceptance criteria: QC standard within supplier specifications; blank < MDL.

CBOD5
Calibration frequency: Calibrate DO probe daily with Winkler titration.
Calibration points: QC check sample each set; blank each set.
Acceptance criteria: QC ± 10%.

COD
Calibration frequency: Calibrate at the start of each run; check calibration after every 10 non-calibration analyses.
Calibration points: Initial calibration: four-point standard curve plus blank. Continuing calibration check: mid-range standard (ICV) every ten samples; blank check every ten samples.
Acceptance criteria: Initial: blank < MDL. Continuing: CCV ± 10% of theoretical value; blank < MDL.

PO4-P (Total and Soluble)
Calibration frequency: Calibrate at the start of each run; check calibration after every 10 non-calibration analyses.
Calibration points: Initial calibration: six-point standard curve, blank, and ICV check. Continuing calibration check: ICV and CCV standards and a blank every 10 samples.
Acceptance criteria: Initial: correlation coefficient > 0.995; blank < MDL; ICV ± 15%. Continuing: CCV ± 15% of theoretical value; ICV ± 15%; blank < MDL.

TKN
Same as PO4-P (frequency, calibration points, and acceptance criteria).

NH3-N
Same as PO4-P.

NO3/NO2
Same as PO4-P.

Total Coliform
Calibration frequency: Media quality check; QC check monthly.
Calibration points: Positive QC check monthly.
Acceptance criteria: Confirmation of 5% of positive samples.
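The continuing-calibration and blank criteria in Table 7-1 lend themselves to a simple automated screen. The Python sketch below is illustrative only: the function names are assumptions, and the limits passed in are taken from the COD row of Table 7-1 as an example.

def ccv_within_limits(measured, true_value, tolerance_fraction):
    """True if a continuing calibration verification (CCV) result is within
    +/- tolerance_fraction of its theoretical (true) value, e.g. 0.10 for +/-10%."""
    return abs(measured - true_value) <= tolerance_fraction * true_value

def blank_acceptable(blank_result, mdl):
    """True if a blank result is below the method detection limit (MDL)."""
    return blank_result < mdl

# Example: COD continuing calibration check (Table 7-1: CCV ± 10% of theoretical value,
# blank < MDL). Values are illustrative only.
print(ccv_within_limits(measured=108.0, true_value=100.0, tolerance_fraction=0.10))  # True
print(blank_acceptable(blank_result=1.0, mdl=5.0))  # True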

Table 7-2. Summary of Analytical Accuracy and Precision Limits

Sample Matrix  Analysis                Units          Reference Method  Accuracy (% Recovery)  Precision (Rel. % Diff.)
Water          pH                      S.U.           EPA 150.1         N/A                    0-10%
Water          Temperature             °C             SM 2550B          N/A                    0-10%
Water          Turbidity               NTU            EPA 180.1         N/A                    0-10%
Water          Alkalinity              mg/L as CaCO3  EPA 310.1         80-120%                0-10%
Water          Total Suspended Solids  mg/L           EPA 160.2         N/A                    0-10%
Water          CBOD5                   mg/L           EPA 405.1         75-125%                0-20%
Water          COD                     mg/L           EPA 410.4         80-120%                0-20%
Water          PO4-P (Total)           mg/L           EPA 365.2         80-120%                0-10%
Water          PO4-P (Soluble)         mg/L           EPA 365.2         80-120%                0-10%
Water          TKN                     mg/L           EPA 351.2         80-120%                0-10%
Water          NH3-N                   mg/L           EPA 350.1         80-120%                0-10%
Water          NO3/NO2                 mg/L           EPA 353.2         80-120%                0-10%
Water          Total Coliform          cfu/100 mL     SM 9223           N/A                    N/A
Solid          Total Solids            %              EPA 160.1         N/A                    0-30%

N/A = Not applicable

Laboratory blank water of known quality will be used for all laboratory analyses. If contamination is detected in the blank water, the analysis will be stopped and the problem corrected. Laboratory blanks, method blanks, and any other blank water data will be reported with all analytical results. Laboratory control samples, where applicable, will be used to verify that methods are performing properly. The control samples will be blank water spiked with constituents from standards obtained from certified source material. Balances will be calibrated each day with NIST-traceable weights, and a calibration logbook will be maintained to demonstrate that the balances are accurate. Field blanks will be prepared at the test site and sent to the laboratory with the samples for two sampling events.

1.3.4 Data Reduction, Handling, and Reporting

1.3.4.1 Reporting Units Requirements

All analytical results will be reported in standard units of mg/L, ug/L, mg, grams, etc. Flow rates and volumes will be reported as gallons per minute, gallons per day, and gallons. Analysis of solids will clearly indicate whether the concentration is on a dry-weight or wet-weight basis. Table 7-2 shows a summary of the reporting units.

1.3.4.2 Documentation

All field and laboratory activities will be thoroughly documented through the use of field logbooks, chain of custody sheets, laboratory notebooks and bench sheets, and instrument records. A field logbook and a maintenance checklist will be maintained at the test site. Daily activity entries documenting operating conditions, observations, and maintenance activities will be made in the logbook. Each sample collected will be noted in the logbook, and any other pertinent information will be recorded. Completed pages in the logbook will be signed and dated. Original chain of custody forms will be sent with all samples sent to the chemical laboratory.

The laboratory will produce a final data report that includes all chemical test results, physical measurements, QA/QC data for blanks, accuracy (recovery), precision (percent difference), and laboratory control or matrix check samples. Any deviations from the standard protocols will be discussed in a narrative, any data that do not meet the QA/QC requirements will be flagged, and a narrative will be prepared discussing the findings of any corrective action. The laboratory will maintain all logbooks, bench sheets, instrument printouts, etc., in accordance with the laboratory QA/QC Manual. The QA/QC or Laboratory Coordinator will make these records available for inspection upon request.

1.3.4.3 Document Handling

During the test period, the original field logbook will be kept at the test site. At the end of the test period, the original logbook will be sent to the VO manager for storage in a secure central project file. Original laboratory data reports with the original chain of custody forms will be placed in the central project file at NSF or the other ANSI-certified testing agency. Copies of these reports and any electronic data will be sent to the Project Manager (TO) and the QA/QC officer for review in preparation for writing the verification report. Other copies of the data or logbooks may be distributed to other project team members.

1.3.4.4 Data Reduction and Validation

All measurements and analytical results will be reported in units that are consistent with the methods used, as shown in Table 7-3. The laboratory analysts will record raw data in laboratory notebooks or on bench sheets using standard formats. Each analytical method will contain instructions for recording and calculating the results. The laboratory analyst will have primary responsibility for verifying that the results recorded are accurate. Data review and QA/QC review will be the responsibility of the laboratory staff, following the standard data review and verification procedures for the laboratory. Data transcribed for entry into a computerized database will be checked against the laboratory bench sheets or instrument printouts. The final data report will be signed by an authorized laboratory manager or supervisor in accordance with laboratory policy.
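The completeness objective defined in Section 1.1.5 is one of the checks applied during this review. The following is a minimal Python sketch of how the tally could be computed; the helper function and the example sample counts are illustrative assumptions, while the 80% threshold is the goal stated in the test plan.

def percent_completeness(valid_results, planned_results):
    """Percent completeness per Section 1.1.5: valid results / planned results x 100."""
    if planned_results == 0:
        raise ValueError("planned_results must be greater than zero")
    return valid_results / planned_results * 100.0

# Example: 24 influent/effluent samples planned, 21 judged valid after QA/QC review.
# Counts are illustrative only.
completeness = percent_completeness(valid_results=21, planned_results=24)
print(f"Completeness = {completeness:.0f}%")
if completeness < 80.0:  # test plan goal: minimum 80% completeness
    print("Completeness goal not met; flag in the verification report")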

The final data reports and any electronic data received by the project team from the laboratory will be checked in full. The Project Manager will cross-check 100% of the data in the final reports from the laboratory against a printout of the spreadsheets developed to summarize the data. The QA/QC officer will review the final data reports and all QA/QC information. The QA/QC officer will issue a QA/QC review report discussing the quality of the data, how it compares to the DQIs, and any data that should be flagged as invalid or questionable.

1.4 Assessments

At least one field audit will be conducted by NSF QA/QC staff or a designee during the test. The audit(s) will serve to observe the sample collection procedures being used, to observe the operation of the unit and the condition of the test site, and to review the field logbook(s). A written report will be prepared by the auditor and submitted to the NSF QA/QC Officer and the NSF Manager.

At least one laboratory audit will be performed by NSF QA/QC staff or a designee during the test to observe sample receipt, handling, and storage, and to confirm that proper analytical methods, QA/QC procedures, and calibrations are being used. NSF will verify that the analytical laboratory has an assessment program that includes internal and external audits, quality reports to management, and other internal checks that are part of the system used to ensure that the QA/QC procedures are being implemented and maintained. The assessment procedures will be part of the QA/QC program and will be followed during the time the analytical work is being performed for the verification test.

1.5 Corrective Action

Field-related activities that could require corrective action include problems with sample collection or labeling, improper or missed entries in logbooks, or operational problems with the unit. Monitoring these activities will be the responsibility of the laboratory collecting the samples, with external audits by NSF-designated staff. If a problem occurs, it will be noted in the field logbook, and the laboratory will notify NSF and the manufacturer of the system being evaluated of any deficiencies. The problem, once identified, will be corrected. If a change in field protocol related to sample collection or handling is needed, the change will be approved by the NSF Manager. All corrective actions will be thoroughly documented and discussed in the verification report.

Laboratory corrective actions will be taken whenever:

There is a non-conformance with sample receiving or handling procedures;
The QA/QC data indicate that any analysis is out of the established control limits;
Audit findings indicate a problem has occurred; or
Data reporting or calculations are determined to be incorrect.

The analytical laboratory shall have a corrective action plan as part of the laboratory QA/QC Manual. These procedures will be followed, including notifying the laboratory QA/QC Manager and the TO. All corrective actions will be thoroughly documented and reported to the TO. All data impacted by a correction will be so noted, and a discussion of the problem and the corrective action will be included with the data report.

All corrective actions, either in the field or in the laboratory, will be reported to the Project Manager. The Project Manager will review the cause of the problem and the corrective action taken by the TO. The review will include consideration of the impact of the problem on the integrity of the test, and a determination will be made as to whether the test can continue or whether additional action is needed. Additional action could include adding days to the test period, restarting the test at day one, or other appropriate action as determined by the Project Manager. The Project Manager will respond to any notification of corrective action within twenty-four hours of being notified of the problem. The response may be to continue the testing, to cease testing until further notice, or other appropriate communication regarding the problem. The response by the Project Manager will be in writing by email, fax, or letter.