GUIDELINES FOR REVIEWING QUANTITATIVE DESCRIPTIVE STUDIES

These guidelines are intended to promote quality and consistency in CLEAR reviews of selected studies that use statistical techniques and other quantitative approaches but do not attempt to assess the causal impact of a program or policy.[1] The guidelines describe the characteristics that reviewers assess for each selected study and are a framework to support consistency across reviews. Given the range in the intent, approach, and findings of descriptive studies, the guidelines are not expected to pertain in full to each study. For studies that combine quantitative descriptive analysis with other types of analyses (such as implementation analysis), reviewers will use these and other relevant review guidelines.

Profiles of selected studies on the Clearinghouse for Labor Evaluation and Research (CLEAR) website convey the results of these reviews.[2] Profiles provide clear and concise information on the study design, methods, and findings, with enough information on the quality of the study and its limitations to place the findings in the appropriate context.

The guidelines are presented in the form of a checklist that reviewers examine and complete in the course of their reviews (Table 1). They were developed using existing guidelines and resources for similar research (listed in the final section of this document). CLEAR does not use a rating system for descriptive studies. For each criterion, reviewers indicate their assessment of whether the issues were appropriately addressed in the study (yes, no, or mixed) and briefly note the information supporting their assessment. At the end, reviewers summarize the study's key strengths and limitations and their implications for the findings. This information is used to develop the Considerations for Interpreting the Findings section of the study profile. For some studies, these considerations might be well aligned with the limitations reported by the study authors; for others, the considerations noted by CLEAR might differ from or be more comprehensive than those of the authors.

[1] Examples of studies that would be eligible for review under these guidelines include, but are not limited to, analyses of means and distributions of outcome variables, including service receipt, wages, and employment; analyses of trends in outcomes; comparisons of outcome means and trends across subgroups defined by cohorts, individual characteristics, geographic area, or service receipt; comparisons of outcome means and trends between program exiters and populations targeted by the program; correlational analyses examining relationships between individual and geographic characteristics and outcomes; cost-benefit analyses; and meta-analyses. Examples of studies that would not be reviewed under these guidelines include implementation studies, which are covered by other CLEAR guidelines; qualitative case studies; literature reviews; and analyses of the history of programs.

[2] CLEAR produces Highlights on the website for all studies. The highlights include basic information on a given report's objective, setting, methods, and findings. Selected studies receive the more comprehensive review against the descriptive guidelines discussed in this document.

Table 1. Checklist for Assessing Technical Quality of Quantitative Descriptive Studies

1. Study Design

Criterion 1.1: Is the study design clear and appropriate for addressing the research questions?
- Demonstrates how the overall research strategy was designed to meet the study's aims (for example, what the study will do)
- Discusses the rationale for the study design (for example, why the study does it this way)
- Presents a convincing argument for different features of the design (for example, reasons for different components or stages of research; selection of any groups for examination and description of any comparisons [for example, across groups or over time]; and the rationale for particular methods or data sources, multiple methods, or time frames)

Criterion 1.2: Are the program(s) or conditions applying to the group(s) of interest clearly described in sufficient detail to understand and replicate?

Criterion 1.3: Are key features of the design, including time, place, and context (such as labor market conditions), clearly described? This includes the sampling design, if applicable.

Criterion 1.4: Does the study explain limitations of the design and draw appropriate implications for interpreting findings?

2. Data Quality

Criterion 2.1: Are data sources clearly identified and appropriate for addressing the research questions?
- Documents data sources and variables used to address specific research questions
- Discusses any strengths and weaknesses of the data sources

Criterion 2.2: Do key variables have face validity, and does the study discuss their reliability and validity?

Criterion 2.3: Are issues of data completeness, consistency, and accuracy, as well as steps researchers took to resolve these issues, addressed clearly, in sufficient detail, and appropriately?
- These issues could include, as relevant, response rates, potential reasons for nonresponse, attrition, movement in and out of the sample, and missing or inconsistent data (see the sketch at the end of this section).

Criterion 2.4: Is the description of constructed variables clear, and do constructed variables make sense given the outcome of interest for the research question?
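To illustrate the kind of completeness check Criterion 2.3 asks reviewers to look for, the following is a minimal sketch in Python (assuming the pandas library) that computes a unit response rate and an item nonresponse rate for a hypothetical survey extract. The variable names and toy data are illustrative assumptions, not part of the CLEAR guidelines.

    # Hypothetical survey extract: 'responded' flags unit response;
    # 'wage' is subject to item nonresponse among respondents.
    import pandas as pd

    frame = pd.DataFrame({
        "case_id": range(1, 11),
        "responded": [1, 1, 0, 1, 1, 1, 0, 1, 1, 1],
        "wage": [12.5, 9.0, None, 15.0, None, 11.0, None, 14.0, 10.5, 13.0],
    })

    unit_response_rate = frame["responded"].mean()         # 8 of 10 = 0.80
    respondents = frame[frame["responded"] == 1]
    item_missing_rate = respondents["wage"].isna().mean()  # 1 of 8 = 0.125

    print(f"Unit response rate: {unit_response_rate:.1%}")
    print(f"Item nonresponse for wage: {item_missing_rate:.1%}")

A study reporting rates like these, along with the reasons for nonresponse and the steps taken to address it, would satisfy the spirit of Criterion 2.3.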

3. Data Collection

Criterion 3.1: Are the data collection methods, sources, and instruments clearly described and appropriate for the research questions?
- If the study uses administrative data or surveys conducted by federal agencies (for example, the American Community Survey), some or all of the data collection criteria might not apply. Studies using these types of data should discuss or refer to the publicly available materials about the data's reliability and unbiasedness, as well as the basic data collection methods.
- Discusses creation of the analytic sample, including details on sampling methods, if appropriate

Criterion 3.2: Does data collection reflect sound and systematic methods to produce reliable data?
- Discusses who collected the data and the procedures used
- Describes quality assurance procedures in data collection and verification
- Discusses how data collection settings or methods might have influenced the data collected
- Discusses instrumentation for surveys, if appropriate

Criterion 3.3: Does data collection reflect methods that produce unbiased results?
- Presents evidence of the independence and objectivity of the research team
- Documents consent procedures and the information and incentives provided to respondents, if applicable

4. Study Sample

Criterion 4.1: Does the study examine a population relevant to the research questions?

Criterion 4.2: Is the sampling design clearly defined and defensible?
- Indicates whether the sample is purposive or representative
- Discusses sample identification and recruitment procedures, if relevant
- If a sample of respondents cannot be drawn to represent a relevant universe, this is acknowledged and explained
- The approach to selection reflects the purpose of the study and the use and interpretation of the findings
- Discusses what can be generalized to the wider population from which the sample is drawn or the site selection is made, and the limitations on drawing wider inferences
- Discusses methods for drawing samples from extant data sources or identifying and sampling respondents for data collection

Criterion 4.3: Are inclusion and/or exclusion restrictions clear and defensible?

Criterion 4.4: Is the analytic sample appropriate and described clearly and in adequate detail?
- Gives the rationale for the sufficiency of the sample size for answering the research question(s) of interest (a back-of-the-envelope precision check is sketched after this section)

Criterion 4.5: Does the study discuss limitations of the sample and/or sampling procedure?
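Criterion 4.4 asks for a rationale for the sufficiency of the sample size. As a concrete example of such a rationale, a reviewer might check the study's sample against the standard precision formula for estimating a proportion, n >= z^2 * p(1 - p) / e^2. The following Python sketch applies it; the confidence level, assumed proportion, and target margin of error are illustrative assumptions, not thresholds set by CLEAR.

    # Minimum sample size to estimate a proportion (say, an employment rate)
    # within a +/- 5 percentage point margin of error at 95% confidence.
    import math

    z = 1.96   # critical value for 95% confidence
    p = 0.5    # assumed proportion; 0.5 is the most conservative choice
    e = 0.05   # target margin of error

    n_required = math.ceil(z**2 * p * (1 - p) / e**2)
    print(n_required)  # 385

A descriptive study whose analytic sample falls well below such a threshold should explain why its precision is still adequate for the research questions.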

5. Analysis Methods

Criterion 5.1: Are the analysis methods clearly described, appropriate for the research questions, sufficiently rigorous, and correctly executed?
- Describes and gives the rationale for the methods of analysis, including the use of specific analysis methods, models, and procedures for hypothesis testing
- The description of the analysis methods should be sufficiently detailed to understand how the analysis was conducted and how the empirical findings are to be interpreted
- The reviewer should have some confidence that the findings could be replicated based on the description of the methods

Criterion 5.2: Does the report clearly explain and justify key analysis decisions?

Criterion 5.3: Are appropriate statistical procedures used?
- These procedures could include methods to account for stratification, methods to account for clustering, and sample weights (see the sketch following the checklist).

Criterion 5.4: Are limitations of the analytic methods discussed, especially those that could lead to bias?
- These limitations could include the treatment of missing data, confounding factors, omitted variables, endogeneity, and statistical power.
- Discusses how limitations of the analytic methods could affect interpretation of the findings
- Discusses sensitivity tests conducted and their results

6. Findings and Conclusions

Criterion 6.1: Are the findings fully supported by the data and analysis?
- Findings are presented accurately and objectively, without introducing a point of view.
- Findings make sense as a whole and are coherent; seemingly odd or inconsistent findings are acknowledged and addressed appropriately.
- Findings are placed in an appropriate context given the limitations of the study's design, data sources, and analytic methods.

Criterion 6.2: Are the conclusions supported by the findings?
- Conclusions are based on a reasonable interpretation of the findings.
- Conclusions do not appear to reflect biases on the part of the researchers or authors.
- Conclusions are placed in appropriate context with respect to the theory proposed and/or conclusions drawn in previous literature.
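To make the design adjustments named in Criterion 5.3 concrete, here is a minimal sketch (assuming the numpy and statsmodels libraries, with simulated data) of one common approach: estimating a weighted mean of an outcome through an intercept-only weighted least squares model, with a cluster-robust covariance so the standard error accounts for clustering. The data and variable names are hypothetical, and this is one illustration rather than the procedure the guidelines prescribe.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 200
    cluster = np.repeat(np.arange(20), 10)    # 20 clusters of 10 observations
    weights = rng.uniform(0.5, 2.0, size=n)   # hypothetical sample weights
    # Outcome with a cluster-level component, so clustering matters.
    outcome = 10 + rng.normal(0, 1, size=20)[cluster] + rng.normal(0, 1, size=n)

    # Intercept-only WLS: the coefficient equals the weighted mean of the outcome.
    model = sm.WLS(outcome, np.ones(n), weights=weights)
    result = model.fit(cov_type="cluster", cov_kwds={"groups": cluster})
    print(result.params[0], result.bse[0])    # weighted mean, clustered SE

A study that instead reports naive (unclustered, unweighted) standard errors for a clustered, weighted sample would warrant a note under this criterion.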

References

These guidelines were developed and synthesized from the following sources:

Higgins, J.P.T., and S. Green (editors). Cochrane Handbook for Systematic Reviews of Interventions, version 5.1.0 (updated March 2011), chapters 7, 13, and 16. The Cochrane Collaboration, 2011. Available at www.cochrane-handbook.org. Accessed December 6, 2013.

Mathematica Policy Research. NCEE Guidance for REL Study Proposals, Reports, and Other Products. Mathematica project materials, not publicly available. April 2013.

Office of Adolescent Health. Evaluation Technical Assistance Update. Frequently Asked Questions: Reporting Implementation Findings. Produced for OAH and Administration for Children, Youth, and Families Teen Pregnancy Prevention grantees. Washington, DC: OAH, December 2011. Available at http://www.hhs.gov/ash/oah/oahinitiatives/assets/ta_update_3.pdf. Accessed December 6, 2013.

U.S. Department of Education. What Works Clearinghouse Study Review Guide Template. Washington, DC: U.S. Department of Education, 2014. Available at http://ies.ed.gov/ncee/wwc/studyreviewguide.aspx. Accessed December 6, 2013.