DIVISION OF ENVIRONMENTAL RESPONSE AND REVITALIZATION

DATA QUALITY OBJECTIVES PROCESS SUMMARY

DERR-00-DI-32
Internal Guidance Document: January 2002
DERR DQO PROCESS SUMMARY

Table of Contents

Foreword
Introduction
Step 1: State the Problem
Step 2: Identify the Decision
Step 3: Identifying the Inputs to the Decision
Step 4: Define the Boundaries of the Study
Step 5: Develop a Decision Rule
Step 6: Specify Tolerable Limits on Decision Errors
Step 7: Optimizing the Design
Foreword

The Division of Environmental Response and Revitalization (DERR) has developed a summary of the US EPA Guidance for the Data Quality Objectives Process (EPA QA/G-4) (EPA 1994). This interpretation of the DQO Process is meant to serve as a tool for DERR site coordinators and project-specific DERR staff (support staff). By following the prescribed steps of the DQO Process, DERR staff (and other project personnel) will gain insight into, and assistance with, addressing, formulating, and documenting the type and quality of data needed for sound environmental decisions. A description of the methods for collecting and assessing those data is also part of the DQO Process described herein.

The development, review, and successful implementation of the DQO Process is a mandatory part of the DERR planning process and the DERR Quality Assurance/Quality Control (QA/QC) Program. This process will help DERR develop, plan, and implement projects that ensure the data generated and used in decision making are of the type and quality needed for their intended purpose.

This document provides a summarized version of the DQO Process and includes the specific requirements necessary for appropriate project planning within DERR and for organizations that conduct environmental data collection operations on behalf of the Ohio EPA/DERR. It is one of the Ohio EPA/DERR QA/QC documents that describe DERR policies and procedures for planning, implementing, and assessing the effectiveness of DERR's quality system.
I. Background

With the creation and implementation of the Division of Environmental Response and Revitalization (DERR) Quality Management Plan (QMP) (October 2000), DERR is now required to follow the Data Quality Objectives (DQO) Process as established by the US EPA document, Guidance for the Data Quality Objectives Process (EPA QA/G-4). This change to the project planning process is one component of the overall Quality Assurance/Quality Control Program within the division.

DERR engages annually in a number of data collection activities. These data are used for environmental decision making and to support project-specific activities and actions. Collecting data of known quality and quantity builds overall confidence in the data. For that confidence to exist, and for the data to be used in support of decisions, an established, structured process for data collection is needed.

II. What are Data Quality Objectives (DQOs)?

DQOs are qualitative and quantitative criteria for clarifying project objectives, defining the appropriate types of data needed, and defining the tolerable levels of potential decision errors for the project. The DQO Process is a systematic planning process for generating environmental data appropriate and sufficient for their intended use. The process is designed to answer four basic questions:

1). What data are needed?
2). Why are they needed?
3). How will the data be used?
4). What tolerance does the user have for decision errors?

The DQO Process is a method to ensure that the collection and analysis of data for a project meet the requirements of the specific project goal. The conceptual site model is an integral part of the DQO Process and can provide direction for defining tolerable limits and data needs.

III. The Data Quality Objectives Process

The DQO Process will be the structured process for planning all data collection activities within DERR.
The DQO Process is DERR's recommended planning process for environmental decision making. DQOs provide consideration for a number of issues during the planning stages of a project.
Some of these items/issues include:

- Early involvement of the decision maker(s)
- A graded approach to data quality requirements
- Effective sampling and analysis programs
- A basis for judging the usability of the collected data

The DQO Process is essentially a systematic planning process for generating environmental data that will be sufficient for their intended use. From consideration of the qualitative and quantitative statements and the general questions indicated above, specific underlying principles for environmental data collection can be realized through implementation of the DQO Process. These principles (or items for consideration) include, but are not necessarily limited to:

1. Collected data may have the potential for error.
2. Absolute certainty comes with a high price.
3. The DQO Process defines tolerable error rates.
4. Without DQOs, the quality of decisions is unknown.
5. The DQO Process is based on the scientific method.

The DQO Process involves seven steps and is considered dynamic through the life cycle of a project. The term "dynamic" is used because the process affords project personnel the ability to revisit the planned goals and objectives for a project and make adjustments where and when needed for successful completion of the project.

Before the seven-step DQO Process is started, several issues/items must be taken into account that help in preparing for each step of the process. These include getting the right people involved in the right way with the project: Who are the stakeholders, the decision maker, the technical experts, and the personnel with appropriate statistical training? Other items proven useful when implementing the DQO Process include gathering existing site data/knowledge, considering the overall project objectives, and being realistic about resource and sociopolitical constraints.

The seven steps of the DQO Process are as follows:

1. State the Problem
2. Identify the Decision
3. Identify Inputs to the Decision
4. Define Boundaries
5. Develop a Decision Rule
6. Specify Limits on Decision Errors
7. Optimize the Data Collection Design
Data Quality Objectives Process - DERR Summarized Overview

The following pages include a summary of the DQO Process used by DERR staff for environmental data collection activities. The material and/or language has been summarized to provide a usable, general understanding of this process for all DERR staff. The key points for each step of the DQO Process are put into general terms for ease of understanding. This summary will help facilitate appropriate implementation of the process and will further support DERR's collection of data of sufficient quality and quantity to support DERR decision making.
STEP 1 - STATE THE PROBLEM

I. Establish the planning team, including the decision makers. This should include the project manager, technical staff, data users, and stakeholders.

II. Describe the problem and develop a conceptual model of the environmental hazard to be investigated. Describe the conditions that may pose a threat to human health or the environment, or circumstances of potential noncompliance with regulations, and the reasons for undertaking the study. Conduct searches of the literature and of ongoing and past studies to help define the problem. Development of the conceptual model at this time is critical. The conceptual model is often a diagram that shows:

A. known or expected locations of contaminants,
B. potential sources of contaminants,
C. media that are contaminated or may become contaminated, and
D. exposure scenarios (locations of human health or ecological receptors).

III. Identify available resources, constraints, and deadlines. Limitations on resources and time constraints for collecting data need to be identified. Available personnel, contracts (if applicable), and intermediate and final time frames for collecting data should be identified.

Outputs of this step are as follows:

1). A list of the planning team members and their roles.
2). Identification of decision makers.
3). A concise description of the problem and a conceptual model of the environmental problem to be investigated.
4). A summary of available resources and relevant deadlines for the study, including budget, availability of personnel, and schedule.

Reference: US EPA Guidance for the Data Quality Objectives Process (EPA QA/G-4, August 2000), Chapter 1, pages 1-4 through 1-6.
STEP 2 - IDENTIFY THE DECISION

The purpose of this step in the DQO Process is to develop a Decision Statement that identifies both the purpose of the study and the potential actions that may be taken once the data are analyzed. If the principal study question is not obvious, the study may fall into the category of exploratory research, in which case this particular step may not be needed.

I. Identify the principal study question. (p. 2-2)

A. What are the unknown conditions or unresolved issues being studied? In other words, what do you know versus what you don't know?

Ex. Does the concentration of contaminants in groundwater exceed acceptable levels?

II. What are the alternative actions that may be taken to solve the problem? (p. 2-2)

A. The purpose of this question is to develop a list of possible remedial actions that may be taken depending on the answer to the study question.

Ex.
1. Design and implement a groundwater cleanup action
2. Hook up residents to the city water supply
3. No action

III. Combine the principal study question and the alternative actions into a Decision Statement. This statement is the output of this DQO step. (p. 2-3)

A. The following template may be helpful: Determine whether or not [unknown environmental conditions/issues/criteria from the principal study question] require (or support) [taking alternative actions].

Ex. Determine whether or not the concentration of contaminants in groundwater exceeds acceptable levels and, if it does, design and implement a groundwater cleanup action and/or hook up residents on private wells to the city water supply.

IV. If multiple decisions are to be resolved during the study, examine how the decisions relate to one another, then prioritize each decision and determine the sequence in which they should be resolved. (p. 2-3)

The output of this step is as follows:

1). A Decision Statement that links the principal study question to possible actions that will solve the problem.
Reference: US EPA Guidance for the Data Quality Objectives Process (EPA QA/G-4, August 2000), Chapter 2, pages 2-1 to 2-4.
STEP 3 - IDENTIFYING THE INPUTS TO THE DECISION

The following represent the basic elements required to identify inputs to the decision:

I. Identify the kinds of information needed
II. Identify the sources of information
III. Determine the basis for setting the Action Level
IV. Confirm the appropriateness of proposed sampling and analysis methods

I. Identify the Kinds of Information Needed

Determining the kinds of information needed can be accomplished by answering the following questions:

A. Is information on the physical properties of the media required?
B. Is information on the chemical or radiological characteristics of the matrix needed?
C. Can existing data be used to make the decision?
D. Are new measurements of environmental characteristics needed?

If it is decided that new measurements are needed, develop a list of the characteristics that must be measured to make the decision. If the decision can be based on existing data, examine the sources of those data to ensure acceptability. If consideration is given to integrating new data with existing data, examine the parameters of the existing data so that new samples are collected and/or analyzed in a similar manner.

II. Identify the Sources of Information

It is necessary to identify and document the sources of information needed to resolve the decision. The sources of information may include results of previous data collections, historical records, regulatory guidance, professional judgment, scientific literature, or new data collections.

III. Determine the Basis for Setting the Action Level

Action Levels are concentrations of contaminants that are based on regulatory requirements, risk assessments, performance criteria for analytical methodology, or a reference standard. It is important to understand how the Action Level will be derived, or rather what information will be used to determine it.
Establishing an actual numerical value for the Action Level is not required at this stage; however, a potential Action Level should be established. Note that if the Action Level is based on a regulatory requirement, its numerical value will be known at this stage in the process. If the Action Level is based on a risk assessment or other performance criterion, specification of the numerical value should be deferred until after the study boundaries have been determined.

IV. Confirm the Appropriateness of Proposed Sampling and Analysis Methods

A list of sampling and analytical methods, derived from evaluating pertinent environmental characteristics, should be developed that is appropriate for the scope of the problem being investigated. It is imperative to specify the sampling considerations required for detecting analytes at low concentrations and the procedures required to collect the necessary sample quantities, as well as to identify analytical methods that have appropriate detection limits. Importance should be given to minimizing bias, as this is an important performance characteristic of sampling and analysis. Methods known to exhibit large biases should be avoided if possible. Additional considerations include requirements for certification of personnel and laboratory accreditation. Laboratories analyzing environmental samples should follow standard protocols and procedures or use performance-based methods.

Outputs of this step are as follows:

1). A list of environmental characteristics that will be measured to enable decision making.
2). A list of information sources or methods/assumptions that indicate how each Action Level will be derived.
3). A list of information that may be applicable to uses of the data in future investigations.
4). Confirmation that sampling and analytical methods exist to meet the detection limit criteria required, given the magnitude of the Action Level.

Reference: US EPA Guidance for the Data Quality Objectives Process (EPA QA/G-4, August 2000), Chapter 3, pages 3-1 through 3-5.
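As a simple illustration of the Step 3 output on detection limits, the following sketch screens candidate analytical methods against a potential Action Level. The method names, detection limits, and Action Level are hypothetical examples, not values drawn from EPA guidance:

```python
# Hypothetical screening check: which candidate analytical methods have a
# detection limit below the potential Action Level? (All values illustrative.)

def methods_meeting_action_level(methods, action_level):
    """Return only the candidate methods whose detection limit is below the Action Level."""
    return {name: dl for name, dl in methods.items() if dl < action_level}

candidate_methods = {            # detection limits in ug/L (illustrative)
    "Method A (screening)": 25.0,
    "Method B (standard)": 5.0,
    "Method C (low-level)": 0.5,
}
potential_action_level = 10.0    # ug/L (illustrative)

suitable = methods_meeting_action_level(candidate_methods, potential_action_level)
for name, dl in suitable.items():
    print(f"{name}: detection limit {dl} ug/L is below the Action Level")
```

Here only Methods B and C would be carried forward; Method A's detection limit exceeds the Action Level, so a more sensitive method or a different approach would be needed.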
STEP 4 - DEFINE THE BOUNDARIES OF THE STUDY

I. Define the target population of interest.
1. The target population is the set of all environmental samples about which the decision maker wants to draw a conclusion, and it includes the various media to be sampled.
2. In environmental studies, the target population is often the set of all possible samples from various media that, taken together, constitute the geographic area of interest.

II. Determine the spatial boundaries that define the physical area to be studied and, generally, where samples will be collected.
1. Define the geographic area applicable for decision making. A metropolitan city limit, soil within a property (to a depth of six inches), or a length of shoreline are examples of defined geographic areas.
2. Divide the population into subsets that have relatively homogeneous characteristics. Dividing the target population into subpopulations that are relatively homogeneous within each area or subunit, together with an appropriate sampling design (Step 7), can reduce the total number of samples required to meet the tolerable limits on decision errors.

III. Determine the temporal boundaries that describe the time frame the study will represent and when samples should be collected.
1. Determine when to collect data. Consider time-related phenomena, such as weather and seasons, and how they may affect the data and data collection.
2. Determine the time frame for decision making. Evaluate the population and determine the optimum time frame for collecting data, given that the medium may change over time.

IV. Determine the practical constraints on collecting data.
1. Practical constraints could include site access, availability of equipment or staff, and conditions that would prevent sampling, such as flooding or freezing temperatures.

V. Define the scale of decision making.
1. Determine the smallest subpopulation, area, volume, or time for which separate decisions must be made (e.g., operable units, species age, gender, etc.).
2. Decisions may be based on risk, technological considerations, temporal considerations, financial scale, or others.
Outputs of this step are as follows:

1). Detailed descriptions of the characteristics that define the population to be sampled.
2). Detailed descriptions of the geographic limits (spatial boundaries) that are appropriate for the data collection and decision making.
3). The time frame for collecting data and making the decision.
4). A list of practical constraints that may interfere with data collection.
5). The appropriate scale for decision making.

Reference: US EPA Guidance for the Data Quality Objectives Process (EPA QA/G-4, August 2000), Chapter 4, pages 4-1 through 4-9.
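The boundary-setting outputs above lend themselves to a simple structured record. The following sketch shows one way a project team might capture the Step 4 outputs so they can be carried forward into Steps 5-7; every field value is a hypothetical example, not a requirement:

```python
# Illustrative record of Step 4 outputs for a hypothetical site.
# Field names and values are examples only.

study_boundaries = {
    "target_population": "surface soil (0-6 in.) within the property boundary",
    "spatial_subunits": ["former process area", "drum storage area", "background area"],
    "temporal_boundary": "one sampling event, outside the spring high-water season",
    "practical_constraints": ["site access agreement required",
                              "no sampling when ground is frozen"],
    "decision_scale": "each 0.5-acre exposure unit is decided separately",
}

for field, value in study_boundaries.items():
    print(f"{field}: {value}")
```

Documenting the boundaries in one place makes it easier to check, in Step 7, that the proposed sampling design actually covers every subunit and respects every constraint.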
STEP 5 - DEVELOP A DECISION RULE

I. Background:

A. Steps 1-5 of the DQO Process are primarily focused on identifying qualitative criteria, including:
1. The nature of the problem that initiated the study and a conceptual model of the environmental hazard to be investigated.
2. The decisions that need to be made and the order of priority for resolving them.
3. The type of data needed (i.e., geographic area, environmental medium, overall timing of data collection, etc.).
4. A decision rule that defines how the data will be used to choose among alternative actions. (EPA QA/G-4, p. 0-8)

B. In Step 5, assume that all information determined in the previous steps (the Step 1-4 outputs) is "perfect":
1. The site problems have been correctly identified.
2. The conceptual model is appropriate for the site.
3. The proper decisions to be made have been identified.
4. The appropriate sources of information were identified.
5. The appropriate potential Action Levels were identified.
6. The appropriate possible measurement methods were identified.
7. The spatial/temporal boundaries of the decision have been decided.

C. Step 5 activities:
1. Define the appropriate statistical parameter(s) (e.g., mean, median, etc.).
2. Specify the Action Level(s) (this sets boundaries for outcomes of the decision process).
3. Develop the logic for action: measurement and analysis methods must be sufficiently sensitive (and appropriate to the media, etc.) so that the measurement method detection limit is below the Action Level.
4. Combine the true value of the selected population parameter and the Action Level with the scale of decision making (Step 4) and the alternative actions (Step 2).
D. Outputs from the four previous steps are integrated with the Step 5 inputs.
1. A theoretical "if...then" decision rule is decided. (This rule determines when to choose possible alternative actions.)

II. Activities (referenced above in the background information):

A. Define the population parameter. Select a population parameter that summarizes the critical characteristic or feature of the population and that will be compared to the Action Level to make a decision.

1. Mean (e.g., the average of all sediment sample concentrations within the boundaries of the site; a measure of central tendency)
a. Compares only the middle of the population (the group of all sample sets) to the Action Level; extreme values below detection limits will dilute the population mean (e.g., the mean [Cd]), and a few higher values will elevate it.
b. Unless the sample set concentrations are all close to the same value, the mean is not a very useful tool, particularly if there are hot spots (skewed or outlier data).
c. Good for comparing chronic carcinogenic exposure concentrations to Action Levels.
d. Allows an estimate of the total amount of a contaminant in a defined area of soil or a water body.

2. Median (e.g., the centermost [Cd] value among all sediment samples within the boundaries of the site)
a. More useful than the mean for skewed (nonsymmetrical, or not normally distributed) data, because it is not influenced by extreme values in the sample distribution.
b. Also preferred if many sample values are below detection limits (no more than 50% of the sample values may be below detection limits, or a true median cannot exist).

3. Percentile (the percent of samples below a given value (e.g., the Action Level))
a. For use with more acutely toxic contaminants.
b. Useful when many of the samples analyzed are below detection limits.
c. Can require larger sample sizes than the previous two parameters.
4. Total amount (all samples used)

NOTE: The complexity of the decision rule and data collection design depends upon the complexity of the chosen parameter. (The above parameters are ranked from least to most complex.)

B. Define the Action Level
1. Types of Action Levels
a. Predetermined
1). Fixed standards (e.g., drinking water or technology-based standards)
b. DQO Process determined
1). Background or specific risk-based standards (e.g., conservative or real concern thresholds)
NOTE: More conservative standards can require more stringent detection limits (i.e., more sensitive analytical methods, which are usually more expensive).

C. Detection Limits
1. Detection limits for the measurement methods identified in Step 3 must be below the Action Level. If not, a more sensitive method or a different approach will be necessary.
2. Detection limits are defined for a specific purpose. For quantitative identification purposes they will be more stringent, ruling out most probabilities of false positive results in the matrix while ensuring a high probability of an accurate positive identification.
3. When comparing a mean concentration to a threshold Action Level, detection limits must be determined to define the reliability of quantitation.

Outputs of this step are as follows:

A. Construct the theoretical "if...then" decision rule.
1). Combine the selected population parameter and Action Level with the scale of decision making (Step 4) and the alternative actions (Step 2). (e.g., If the true mean dioxin concentration in the surficial 2 inches of soil of a decision unit (20' x 100') exceeds 1 ppb, then remove a 6-inch layer of soil. If the true mean is not greater than 1 ppb, then do nothing.)

Reference: US EPA Guidance for the Data Quality Objectives Process (EPA QA/G-4, August 2000), Chapter 5, pages 5-1 to 5-5.
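The theoretical "if...then" decision rule above can be sketched in a few lines of code. The following fragment uses the dioxin example from this step; the sample results are hypothetical, and the sample mean is used as a stand-in for the (unknowable) true mean:

```python
# Minimal sketch of an "if...then" decision rule for one decision unit,
# using the dioxin example from Step 5. Sample values are hypothetical.

def decision_rule(mean_concentration_ppb, action_level_ppb=1.0):
    """Return the alternative action selected for one decision unit."""
    if mean_concentration_ppb > action_level_ppb:
        return "remove 6-inch layer of soil"
    return "no action"

unit_samples_ppb = [0.4, 1.8, 2.1, 0.9]                   # hypothetical dioxin results
mean_ppb = sum(unit_samples_ppb) / len(unit_samples_ppb)  # = 1.3 ppb
print(decision_rule(mean_ppb))                            # -> remove 6-inch layer of soil
```

In practice the rule is applied separately to each decision unit defined by the scale of decision making in Step 4, and Step 6 addresses how often the sample mean may mislead the rule.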
STEP 6 - SPECIFY TOLERABLE LIMITS ON DECISION ERRORS

In this step of the DQO Process, the user no longer imagines that perfect information and unlimited data will be available for making decisions. The purpose of Step 6 is to specify quantitative performance goals for choosing between the alternative actions in order to determine the probabilities of making errors in the decision. To do this, a user should conduct the following activities:

I. Determine the sources of potential errors in the data set
II. Establish a plausible range of values for the chemicals of concern
III. Define the types of potential decision errors and the consequences of making them
IV. Determine how to manage the errors
V. Select the baseline condition that will be assumed to be true in any event
VI. Specify a range of possible parameter values (the gray area)
VII. Assign probability values at several true value points above and below the Action Level that reflect your tolerable probability for the occurrence of decision errors

The activities of Step 6:

I. Sources of Error: A decision error occurs when the sample data set misleads you into making the wrong decision (the wrong response action). Sample data are subject to various random and systematic errors throughout the collection process. The combination of all errors is called the total study error. Typically, there are two types of study error:

A). Sampling Design Error: This occurs when the data collection design does not fully capture the variability in the decision unit. This error leads to random errors, such as variability and imprecision, and systematic errors, such as bias, in estimating a chemical of concern.

B). Measurement Error: This is error introduced by variability in sample collection, handling, preparation, analysis, data reduction, transmission, and storage. Both random and systematic measurement errors occur because of imperfections in the measurement and analysis system selected for a project.
II. Establish a Plausible Range of Values: The user must review the current and historical data and approximate the upper and lower boundaries for the chemical of interest using best professional judgment. An example would be to use the mean of the chemical of interest as the relevant value and the highest and lowest concentrations of contamination as the upper and lower boundaries of the range.
III. Defining Decision Errors: When uncertainty in the data does not allow the user to determine whether the true value is above or below the Action Level, it becomes necessary to adopt a presumed (baseline) condition that is assumed true unless there is enough evidence to refute it; an analogy is "innocent until proven guilty." When the user examines the data set and makes an incorrect decision based upon a faulty data set, the errors can be either False Rejection errors (false positives) or False Acceptance errors (false negatives).

IV. Managing Decision Errors: Although the possibility of making decision errors cannot be totally eliminated, a user must learn to manage the errors when making decisions on response actions. The user must determine the final goal of the data and what type of error would have the biggest impact on the use of the data. If sampling design errors are believed to be the biggest concern, a larger sample size or the development of better sampling techniques may be required. If the measurement errors are suspect, multiple analyses of each sample or the use of more precise and accurate analytical methods may be required. However, the consequences must also be balanced against the cost of limiting these errors. If moderate errors in sampling design or measurement would not compromise a project, a user may consider the screening data satisfactory for the preliminary stages of the project and reconfirm the decision later based upon confirmation samples.

V. Selection of the Baseline Condition: The baseline condition may already be selected for the user, as established by regulatory considerations. If there is no established baseline, one should be defined based upon the relative consequences of decision errors, such as a judgment between the threat to public health and the spending of unnecessary resources. Please refer to the Step 3 and Step 5 outline pages for this selection and determination.
In the DQO Process, the baseline condition is the Action Level determined in Step 5. The user should evaluate the potential consequences of decision errors at several points within the false positive and false negative ranges.

VI. Specify Gray Area Values: The gray area is a region where it is not feasible to control the false positive errors at low levels, because the cost of sampling and analysis would be high while the potential consequence of selecting the wrong course of action is low. The first boundary of the gray area is usually the Action Level.
Generally, the gray area is narrower when significant expense is devoted to sampling and analysis in order to reduce uncertainty. It is recommended that the gray area be widened during the early phases of the study and narrowed at later stages to determine whether the chemicals of concern are only slightly different from the Action Level.

VII. Assign Probability Values Above and Below the Action Level to Reflect Error Tolerance: The user should specify limits on false positives and false negatives in the data set with regard to the Action Level, based upon the consequences expected from these errors. The rationale for setting these limits should include consideration of regulatory guidelines, potential cost impacts, human health, ecological conditions, and socio-political consequences. US EPA guidance recommends the most stringent limits typically encountered: 1% for both false positives and false negatives. Other guidance can be found in EPA policies and programs if a user has concerns that justify an alternate tolerance.

Two software programs may be helpful during this step: the "Data Quality Assessment Statistical Toolbox" (DataQUEST), which is very helpful for the statistical side of the process, and "Decision Error Feasibility Trials" (DEFT), which can be used in the final steps of the DQO Process and helps the scoping team, especially in Step 7, when optimizing the design. These should be evaluated based upon project complexity.

Outputs of this step are as follows:

1). Baseline Condition = regulatory or risk limit, or background.
2). Gray Area = range of acceptable values.
3). Decision Error Limits = percent of allowable false positives and false negatives.

Reference: US EPA Guidance for the Data Quality Objectives Process (EPA QA/G-4, August 2000), Chapter 6, pages 6-1 through 6-15.
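To make the idea of tolerable decision-error limits concrete, the following sketch approximates the probability of a decision error for the simple rule "act if the sample mean exceeds the Action Level," assuming measurements are roughly normal with a known standard deviation. All numeric values are hypothetical; real projects would use project-specific variability estimates (or tools such as DEFT):

```python
# Hedged sketch: probability that the sample mean falls on the wrong side
# of the Action Level, under a normality assumption. Values are hypothetical.

from statistics import NormalDist
from math import sqrt

def decision_error_probability(true_mean, action_level, sigma, n):
    """P(sample mean lands on the wrong side of the Action Level).

    sigma: assumed standard deviation of individual measurements
    n: number of independent samples averaged
    """
    sampling_dist = NormalDist(mu=true_mean, sigma=sigma / sqrt(n))
    p_exceed = 1.0 - sampling_dist.cdf(action_level)
    # If the true mean is at or below the Action Level, acting is the error;
    # if it is above, failing to act is the error.
    return p_exceed if true_mean <= action_level else 1.0 - p_exceed

# True mean of 9 (just below an Action Level of 10), sigma = 3, n = 9 samples:
print(round(decision_error_probability(9.0, 10.0, sigma=3.0, n=9), 3))  # -> 0.159
```

Note how the error probability shrinks as the sample size grows or as the true mean moves further from the Action Level; this is exactly the behavior the gray area and the tolerable-limit assignments in this step are meant to manage.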
STEP 7 - OPTIMIZING THE DESIGN

I. Purpose: The purpose of this final step is to optimize the sampling and analysis design. The technical and/or scoping team needs to be re-assembled for this final purpose of the DQO Process.

II. Activities: Most typically, there will be more than one design option for the technical/scoping team to review. The final decision will incorporate the type and number of samples deemed necessary to characterize a site or an area of the site. The scoping team is expected to review each design option and balance the site assessment sampling against resources such as funding, personnel, and temporal constraints while meeting the DQOs established in the previous steps. The technical/scoping team should review the outcomes of the previous DQO Process steps to determine exactly how the limits on decision errors will prescribe the number and location of samples to be collected and the types of analyses per sample. From the information of the previous DQO steps, the team can then confirm the budget for sampling and analysis and the project schedule. Once the final design is selected, the team leader and/or project coordinator must ensure that the chosen design is properly documented.

The output of this step is as follows:

1). The most resource-effective design that is expected to achieve the DQOs.

Reference: US EPA Guidance for the Data Quality Objectives Process (EPA QA/G-4, August 2000), Chapter 7, pages 37 through 40.
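The cost trade-off the team weighs in this step can be illustrated with a common normal-theory sample-size approximation for testing a mean against the Action Level, n ≈ ((z_alpha + z_beta) * sigma / delta)^2 (tools such as DEFT apply closely related formulas). The values below are hypothetical, and the formula is a planning sketch, not a substitute for a statistician's design review:

```python
# Hedged sketch of the sample-size / decision-error trade-off in Step 7.
# All inputs are hypothetical planning values.

from statistics import NormalDist
from math import ceil

def required_samples(sigma, delta, alpha=0.05, beta=0.20):
    """Approximate n for a one-sample test of a mean.

    sigma: assumed standard deviation of measurements
    delta: width of the gray region (smallest difference that matters)
    alpha: tolerable false positive rate; beta: tolerable false negative rate
    """
    z_alpha = NormalDist().inv_cdf(1.0 - alpha)
    z_beta = NormalDist().inv_cdf(1.0 - beta)
    return ceil(((z_alpha + z_beta) * sigma / delta) ** 2)

# Tightening error limits or narrowing the gray region drives n (and cost) up:
print(required_samples(sigma=3.0, delta=2.0))              # baseline design      -> 14
print(required_samples(sigma=3.0, delta=2.0, alpha=0.01))  # stricter alpha       -> 23
print(required_samples(sigma=3.0, delta=1.0))              # narrower gray region -> 56
```

Running candidate designs through a calculation like this lets the team see, before sampling begins, whether the tolerable limits set in Step 6 are affordable or whether the gray area or error limits need to be revisited; that feedback loop is what makes the DQO Process dynamic.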
GLOSSARY Assessment - the evaluation process used to measure the performance or effectiveness of a system and its elements. As used here, assessment is an all-inclusive term used to denote any of the following:
ENGINEERING SERVICE CENTER Port Hueneme, California 93043-4370 USER S GUIDE UG-2061-ENV GUIDANCE FOR HABITAT RESTORATION MONITORING: FRAMEWORK FOR MONITORING PLAN DEVELOPMENT AND IMPLEMENTATION Prepared
Ecology Quality Assurance Glossary Edited by William Kammin, Ecology Quality Assurance Officer Accreditation - A certification process for laboratories, designed to evaluate and document a lab s ability
GROUNDWATER REMEDY COMPLETION STRATEGY: Moving Forward with Completion in Mind Table of Contents 1. Introduction...2 1.1 Purpose and Scope...2 1.2 Background...3 2. Elements of a Groundwater Remedy Completion
Environmental Handbook Hazardous Materials This handbook outlines the process steps necessary to address hazardous materials during the development of a project. TxDOT Environmental Affairs Division Release
Chapter 22: Overview of Ecological Risk Assessment Ecological risk assessment is the process of gaining an understanding of the likelihood of adverse effects on ecological receptors occurring as a result
Quantitative Pathogen Risk Analyses 1 Conducting Quantitative Pathogen Risk Analyses via Monte Carlo Simulation Ben G. Fitzpatrick, PhD President, Tempest Technologies White Paper April 2014 www.tempest-tech.com
July 2013 Risk-Based Decision Making for Site Cleanup The Oklahoma Department of Environmental Quality (DEQ) has adopted a risk based decision making process to provide a framework for determining cleanup
ITRC PROJECT PROPOSAL Risk Assessment Guidance and Training: State-of-the-Art Principles and Practice Please use brief statements or bullet items to input the requested information PROPOSAL DATE: March
NGSS Science and Engineering Practices 1. Asking questions (for science) and defining problems (for engineering) A practice of science is to ask and refine questions that lead to descriptions and explanations
Content Sheet 7-1: Overview of Quality Control for Quantitative Tests Role in quality management system Quality Control (QC) is a component of process control, and is a major element of the quality management
SoBran, Inc. Corporate Quality Management Plan for Projects and Activities This proposal or quotation includes data that shall not be disclosed outside the government and shall not be duplicated, used,
PROJECT MANAGEMENT PLAN Outline VERSION 0.0 STATUS: OUTLINE DATE: Project Name Project Management Plan Document Information Document Title Version Author Owner Project Management Plan Amendment History
Revision History COPY The top row of this table shows the most recent changes to this controlled document. For previous revision history information, archived versions of this document are maintained by
5.0 ENVIRONMENTAL IMPACT ASSESSMENT METHODS The methods that are used to conduct the environmental impact assessment (EIA) of the Project are described in this section. The EIA uses a methodological framework
Actuarial Standard of Practice No. 23 Data Quality Revised Edition Developed by the General Committee of the Actuarial Standards Board and Applies to All Practice Areas Adopted by the Actuarial Standards
INTERNATIONAL STANDARDS FOR THE PROFESSIONAL PRACTICE OF INTERNAL AUDITING (STANDARDS) Introduction to the International Standards Internal auditing is conducted in diverse legal and cultural environments;
Overview of EPA Quality System Requirements Course Goals At the completion of this course, you will: Understand EPA's quality system requirements Understand the roles and responsibilities in implementing
Validation and Calibration Definitions and Terminology ACCEPTANCE CRITERIA: The specifications and acceptance/rejection criteria, such as acceptable quality level and unacceptable quality level, with an
United States Office of Environmental EPA/240/B-01/007 Environmental Protection Information Agency Washington, DC 20460 Data Quality Objectives Decision Error Feasibility Trials Software (DEFT) - USER'S
Observations on OMB s Proposed Risk Assessment Bulletin Reflections on Terrorism Risk and Homeland Security Dr L James Valverde, Jr Director, Economics and Risk Management Insurance Information Institute
PROJECT MANGEMENT PLAN EXAMPLES Prepare Project Support Plans and Documentation - Project Risk Assessment Examples Example 54 10.0 PROJECT RISK This section outlines a methodology which will be used to
9. Quality Assurance Setting The overall goal of a well-designed and well-implemented sampling and analysis program is to measure accurately what is really there. Environmental decisions are made on the
Guidelines for Preparation of Review Protocols Type of document: Policy _x_ Guideline Procedure Version: 1.0, January 1 2001 Decision: Steering Group, Date? A Campbell Systematic Review is meant to review
PROJECT MANAGEMENT PLAN CHECKLIST The project management plan is a comprehensive document that defines each area of your project. The final document will contain all the required plans you need to manage,
Developing a Business Case Figures don t lie but liars figure. Attributed to Mark Twain and others The business case document is a formal, written argument intended to convince a decision maker to approve
Evaluating Laboratory Data Tom Frick Environmental Assessment Section Bureau of Laboratories Overview of Presentation Lab operations review QA requirements MDLs/ PQLs Why do we sample? Protect physical,
Model Answer/suggested solution Research Methodology M. Com (Third Semester) Examination, 2013 Paper Title: Research Methodology Paper Code: AS-2375 * (Prepared by Dr. Anuj Agrawal, Assistant Professor,
DIVISION OF ENVIRONMENT QUALITY MANAGEMENT PLAN PART III: DRY CLEANING PROGRAM QUALITY ASSURANCE MANAGEMENT PLAN Revision 3 January 8, 2013 Kansas Department of Health and Environment Division of Environment
IEA Guidance Note G2 IEA Guidance Note G4 Guidance on Evaluation Inception Reports January 2015 1 Table of contents List of Abbreviations... ii Introduction... 1 General guidelines... 1 Content... 2 Opening
This is Dr. Chumney. The focus of this lecture is hypothesis testing both what it is, how hypothesis tests are used, and how to conduct hypothesis tests. 1 In this lecture, we will talk about both theoretical
2009, NUMBER 12 2 ND EDITION PERFORMANCE MONITORING & EVALUATION TIPS DATA QUALITY STANDARDS ABOUT TIPS These TIPS provide practical advice and suggestions to USAID managers on issues related to performance
DIVISION OF ENVIRONMENT QUALITY MANAGEMENT PLAN PART III: VOLUNTARY CLEANUP AND PROPERTY REDEVELOPMENT PROGRAM QUALITY ASSURANCE MANAGEMENT PLAN January 29, 2015 Kansas Department of Health and Environment
Master of Science in Management BUSINESS STRATEGY SYLLABUS Academic Year 2011-2012 Professor: Yuliya Snihur Email: email@example.com Office hours: by appointment COURSE OUTLINE Strategy involves
II. Figure 5: 6 The project cycle can be explained in terms of five phases: identification, preparation and formulation, review and approval, implementation, and evaluation. Distinctions among these phases,
Gathering and Managing Environmental Quality Data for Petroleum Projects Dr. David W. Rich, President, Geotech Computer Systems, Inc., Englewood, CO ABSTRACT Petroleum environmental projects face special
Actuarial Standard of Practice No. 23 Data Quality Revised Edition Developed by the General Committee of the Actuarial Standards Board and Applies to All Practice Areas Adopted by the Actuarial Standards
Presentation Outline 1. Problem Management Process 2. Defining the Problem 3. Formulating the Hypothesis 4. Identifying the Facts 5. Analysing the Facts 6. Developing the Solution Objectives Provide an
Cleanup Levels for CERCLA Remedial Actions Presented October 10, 2014 David M. Buxbaum, Senior Attorney U.S. EPA Region IV Office of Regional Counsel The information provided in the presentation is based
CDM Executive Board Part of version 04, page 1 Please note that this is only an extract of the full guidelines for completing the CDM-NMB and CDM-NMM, as it is the annex 4 of the twentieth meeting of the
Cumulative Effects Assessment in Environmental Impact Assessment Reports Required under the Alberta Environmental Protection and Enhancement Act Introduction This document provides general guidance from
Filed: -0- EB--0 Tab Page of ASSET MANAGEMENT PLANNING PROCESS.0 INTRODUCTION Hydro One s Asset Management Plan is a systematic approach to determine and optimize on-going operating and maintenance expenditures
3-D Data Visualization Taking the Next Step in Data Presentation Stephen Dyment USEPA Office of Superfund Remediation and Technology Innovation firstname.lastname@example.org 15 th Annual OSC Readiness Training
Quantifying uncertainty in human health risk assessment using probabilistic techniques S. Washburn, D. Arsnow, and R. Harris ENVIRON Corporation, 214 Carnegie Center, Princeton, New Jersey, 08540 USA EMail:
Sampling 5.3 SAMPLING 5.3.1 Introduction Data for the LULUCF sector are often obtained from sample surveys and typically are used for estimating changes in land use or in carbon stocks. National forest
A private sector assessment combines quantitative and qualitative methods to increase knowledge about the private health sector. In the analytic phase, the team organizes and examines information amassed
18.6.1 Terms concerned with internal quality control procedures Quality assurance in analytical laboratories Quality assurance is the essential organisational infrastructure that underlies all reliable
PROJECT RISK MANAGEMENT http://www.tutorialspoint.com/pmp-exams/project_risk_management.htm Copyright tutorialspoint.com Here is a list of sample questions which would help you to understand the pattern
APPENDIX Journal Article Reporting Standards (JARS), Meta-Analysis Reporting Standards (MARS), and Flow of Participants Through Each Stage of an Experiment or Quasi-Experiment 245 Journal Article Reporting
United States Environmental Protection Agency Office of Solid Waste RCRA Corrective Action Workshop On Results-Based Project Management: Fact Sheet Series EPA March 2000 www. FACT SHEET #1 HISTORY OF RCRA
Risk Management Plan Author: Adrian Hill Designation: Compliance Manager Phone: 03 8601 2000 Fax: 03 9670 1057 Release Date: 27 February 2006 Version: 2 1 Document History Document Control Number 2 Revision
Executive Summary of the FCSAP Long-Term Monitoring Planning Guidance 0 DISCLAIMER Her Majesty is not responsible for the accuracy or completeness of the information contained in the reproduced material.
Procurement Programmes & Projects P3M3 v2.1 Self-Assessment Instructions and Questionnaire P3M3 Portfolio Management Self-Assessment P3M3 is a registered trade mark of AXELOS Limited Contents Introduction
QUANTITATIVE RISK ASSESSMENT FOR ACCIDENTS AT WORK IN THE CHEMICAL INDUSTRY AND THE SEVESO II DIRECTIVE I. A. PAPAZOGLOU System Reliability and Industrial Safety Laboratory National Center for Scientific
INTERNATIONAL FOR ASSURANCE ENGAGEMENTS (Effective for assurance reports issued on or after January 1, 2005) CONTENTS Paragraph Introduction... 1 6 Definition and Objective of an Assurance Engagement...
Complementary Additional Programme 2014-2015 / Concept note Assessing the Environmental and Health Impacts of mining in Sub- Saharan African Countries Geographical scope/benefitting country(ies) Duration
Age to Age Factor Selection under Changing Development Chris G. Gross, ACAS, MAAA Introduction A common question faced by many actuaries when selecting loss development factors is whether to base the selected
Response Levels and Wildland Fire Decision Support System Content Outline In wildland fire management, practitioners are accustomed to levels of incident management, initial attack response, dispatch levels,
MEMORANDUM OF UNDERSTANDING BETWEEN THE ENVIRONMENTAL PROTECTION AGENCY AND THE NUCLEAR REGULATORY COMMISSION CONSULTATION AND FINALITY ON DECOMMISSIONING AND DECONTAMINATION OF CONTAMINATED SITES I. Introduction
INDUSTRIAL HYGIENE EXPOSURE ASSESSMENTS: WORST-CASE VERSUS RANDOM SAMPLING By Jerome E. Spear, CSP, CIH The two types of sampling strategies to consider when planning an exposure assessment study are worst-case
Check list of important issues in contaminated site management Activity 2.1.2 Check list of important issues in contaminated site management Introduction The aim of this activity is to improve the supervision
SR Letter 11-7 Attachment Board of Governors of the Federal Reserve System Office of the Comptroller of the Currency April 4, 2011 SUPERVISORY GUIDANCE ON MODEL RISK MANAGEMENT CONTENTS I. Introduction,