Practical Use of Program Evaluation among Sexually Transmitted Disease (STD) Programs
Practical Use of Program Evaluation among Sexually Transmitted Disease (STD) Programs. January 2007. Department of Health and Human Services, Centers for Disease Control and Prevention.
Acknowledgements

A manual of this nature is labor intensive and draws from many sources. We thank individuals from the Academy for Educational Development (AED) who assisted us in the development and testing phases of this manual: Stacey Little, PhD, Susan Rogers, PhD, and Richard Sawyer, PhD. We also thank the following people from the Division of STD Prevention for their assistance in reviewing initial drafts of this document: Lydia Blasini-Alcivar, PhD, David Byrum, MPH, Dayne Collins, BS, Janelle Dixon, Norm Fikes, Deymon Fleming, MPH, Kim Seechuk, MPH, and Steve Shapiro, BS.

The Division of STD Prevention extends special thanks to those individuals in the field of STD prevention who contributed their time by providing input on the document layout and content or by reviewing the relevance of the different chapters of this document to STD programs. These individuals are: Gail Bolan, MD (California); Susan Bulecsa, MSN (Florida); F. Bruce Coles, DO (New York); Derek Coppedge, MBA (Kansas); Nyla DeArmitt, BS (Missouri); Annabeth Elliott, RN (Idaho); Alice Gendelman, MPH (California); Heather Hauck, MSW (New Hampshire); Heidi Jenkins, BS (Connecticut); Robert Johnson, GCPH (Virginia); Kristine Judd, BSPH (Michigan); Laurie Kops, BS (Montana); Leandro Mena, MD (Mississippi); Pete Moore, MPH (North Carolina); David Morgan, BS (South Dakota); Glen Olthoff (Baltimore City); Rupa Sharma, MSc (Arkansas); Mark Stenger, MA (Washington); Melanie Taylor, MD (Arizona); Drew Thomits, BS (New Hampshire); Craig Thompson, BS (Mississippi); and Wendy Wolf, MPA (San Francisco).

To better meet the needs of our funded partners, we pilot-tested all the sections of this document for usability and practicality in real-life STD programs. Four funded STD programs (California, Idaho, Michigan, and North Carolina) went through an intensive step-by-step evaluation skill-building process and successfully applied the content of this document to a programmatic area of their choice.
We thank the following individuals for their valuable input and hard work: Michael McElroy, MPH (California); Susan Watson, MPH (California); Annabeth Elliott, RN (Idaho); Kristine Judd, BSPH (Michigan); Bruce Nowak, BS (Michigan); Monica Brown, MPH (North Carolina); Lumbe Davis, MPH (North Carolina); Kawanna Glenn, BS (North Carolina); Monica Melvin, BS (North Carolina); and Chantha Prak, BS (North Carolina).
Practical Use of Program Evaluation among Sexually Transmitted Disease (STD) Programs

January 2007

Yamir Salabarría-Peña, Dr.P.H., M.P.H.E.
Betty S. Apt, BA
Cathleen M. Walsh, Dr.P.H.

Department of Health and Human Services
Centers for Disease Control and Prevention
National Center for HIV, STD, and TB Prevention
Division of STD Prevention
INTRODUCTION & OVERVIEW

REFERENCES

Patton, M.Q. (1997). Utilization-Focused Evaluation: The New Century Text. 3rd ed. Sage Publications, Thousand Oaks, CA.

Smith, M.F. (1989). Evaluability Assessment: A Practical Approach. Kluwer, Norwell, MA.

U.S. Department of Health and Human Services. Centers for Disease Control and Prevention (2001). Introduction to Program Evaluation for Comprehensive Tobacco Control Programs. Office on Smoking and Health. evaluation_manual/evaluation.pdf

U.S. Department of Health and Human Services. Centers for Disease Control and Prevention. Office of the Director, Office of Strategy and Innovation. Introduction to Program Evaluation for Public Health Programs: A Self-Study Guide. Atlanta, GA: Centers for Disease Control and Prevention.

U.S. Department of Health and Human Services. Centers for Disease Control and Prevention. Framework for Program Evaluation in Public Health. MMWR 1999;48(No. RR-11). ftp://ftp.cdc.gov/pub/publications/mmwr/rr/rr4811.pdf

U.S. Government Accountability Office (2005). Performance Measurement and Evaluation. GAO SP.

SUGGESTED CITATION: Salabarría-Peña, Y., Apt, B.S., Walsh, C.M. Practical Use of Program Evaluation among Sexually Transmitted Disease (STD) Programs. Atlanta (GA): Centers for Disease Control and Prevention.

Should you need help or more information regarding this manual or any specific tool, please contact CDC/DSTDP program evaluation staff at (404) or [email protected].
INTRODUCTION & OVERVIEW

Evaluation Standards

The evaluation standards exist to guide the evaluation and to ensure that it is well designed and meets the needs of programs. These standards are integrated throughout this manual. Utility refers to designing an evaluation that meets the needs of the stakeholders. Feasibility ensures that the evaluation is practical and realistic. Propriety is concerned with the ethics of the evaluation, such as the protection of human rights. Accuracy ensures that the evaluation produces valid and reliable findings.

Table of Contents

INTRODUCTION TO THE MANUAL
OVERVIEW OF PROGRAM EVALUATION
STEP 1: ENGAGE STAKEHOLDERS
1.1 Determine how and to what extent to involve stakeholders in program evaluation
STEP 2: DESCRIBE THE PROGRAM
2.1 Understand your program focus and priority areas
2.2 Develop your program goals and measurable (SMART) objectives
2.3 Identify the elements of your program and get familiar with logic models
2.4 Develop logic models to link program activities with outcomes
STEP 3: FOCUS THE EVALUATION
3.1 Tailor the evaluation to your program and stakeholders' needs
3.2 Determine resources and personnel available for your evaluation
3.3 Develop and prioritize evaluation questions
STEP 4: GATHER CREDIBLE EVIDENCE
4.1 Choose appropriate and reliable indicators to answer your evaluation questions
4.2 Determine the data sources and methods to measure indicators
4.3 Establish a clear procedure to collect evaluation information
4.4 Complete an evaluation plan based on program description and evaluation design
INTRODUCTION & OVERVIEW

Table of Contents (continued)

STEP 5: JUSTIFY CONCLUSIONS
5.1 Analyze the evaluation data
5.2 Determine what the evaluation findings say about your program
STEP 6: ENSURE USE OF EVALUATION FINDINGS AND SHARE LESSONS LEARNED
6.1 Share with stakeholders the results and lessons learned from the evaluation
6.2 Use evaluation findings to modify, strengthen, and improve your program
GLOSSARY AND APPENDICES
Glossary of Key Terms
Appendix A: Evaluation Designs
Appendix B: Syphilis Case Illustrating the Application of the Manual
Appendix C: Sample Logic Models of STD Programs
Appendix D: Sample Evaluation Plans of STD Programs

CDC Framework for Program Evaluation

[Figure: the CDC Framework for Program Evaluation, a cycle of six steps (1 Engage stakeholders; 2 Describe the program; 3 Focus the evaluation design; 4 Gather credible evidence; 5 Justify conclusions; 6 Ensure use and share lessons learned) surrounding the four standards: utility, feasibility, propriety, and accuracy.]

Source: Centers for Disease Control and Prevention. Framework for Program Evaluation in Public Health. MMWR 1999;48(No. RR-11).

Program Evaluation Steps

Step 1: Engage Stakeholders deals with engaging individuals and organizations with an interest in the program in the evaluation process.

Step 2: Describe the Program involves describing the program or activity to evaluate by defining the problem, formulating program goals and objectives, and developing a logic model showing how the program is supposed to work.

Step 3: Focus the Evaluation Design determines the type of evaluation to implement, identifies the resources needed to implement the evaluation, and develops evaluation questions.

Step 4: Gather Credible Evidence identifies how to answer the evaluation questions and how to develop an evaluation plan that includes, among other elements, indicators, data sources and methods, and the timeline.

Step 5: Justify Conclusions is about collecting, analyzing, and interpreting the evaluation data.

Step 6: Ensure Use and Share Lessons Learned identifies effective methods for sharing and using evaluation results.
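Programs that track their evaluation plans electronically sometimes find it useful to treat each plan entry as a structured record. The Python sketch below is only an illustration of the plan elements named in Steps 3 and 4 (evaluation question, indicator, data source, collection method, timeline); the class name, field names, and every field value are hypothetical examples chosen for this sketch, not content prescribed by this manual.

```python
from dataclasses import dataclass, field

@dataclass
class EvaluationPlanRow:
    """One row of an evaluation plan: an evaluation question paired with
    its indicator, data source, collection method, and timeline."""
    evaluation_question: str
    indicator: str
    data_source: str
    collection_method: str
    timeline: str
    stakeholders: list = field(default_factory=list)  # who needs the answer (Step 1)

# Hypothetical example row for a partner-services component.
row = EvaluationPlanRow(
    evaluation_question="Are interviewed syphilis cases' partners notified promptly?",
    indicator="% of interviewed cases with partners notified within 30 days",
    data_source="Case-management records",
    collection_method="Quarterly record abstraction",
    timeline="Q1-Q4 2007",
    stakeholders=["DIS supervisors", "program manager"],
)
print(row.indicator)
```

A flat record like this mirrors the tabular evaluation-plan worksheets described in Step 4.4, so the same rows can move between a spreadsheet and a script without restructuring.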
INTRODUCTION & OVERVIEW

Because of its ongoing nature, performance measurement can serve as an early warning to program managers of changes in program performance. Program evaluation can be used to answer why the program is doing poorly or well and to identify adjustments that may improve performance.

Planning

Program evaluation is part of the program planning continuum. Planning asks, "What are we doing, and what should we do to achieve our goals and objectives?" Program evaluation provides information on progress toward goals and objectives, identifies what is working well and/or poorly, and recommends what can be changed to help the program better meet its intended goals and objectives.

Surveillance

Surveillance is the continuous monitoring of, or routine data collection on, various factors (e.g., behaviors, attitudes, deaths) over a regular time interval, and it responds to "What's happening?" Program evaluation provides a more in-depth examination of why something is happening in the implementation and contextual aspects of the program. Data gathered by surveillance systems are invaluable for performance measurement and program evaluation, especially for monitoring long-term and population-based outcomes. However, some surveillance systems have limited flexibility when it comes to adding questions that a particular evaluation needs to answer.

WHAT IS CDC'S FRAMEWORK FOR PROGRAM EVALUATION?

CDC's Framework for Program Evaluation in Public Health

In 1999, CDC published a Framework for Program Evaluation in Public Health (CDC, 1999). The Framework, depicted graphically in this introduction, has two components: (1) six steps and (2) four sets of standards for conducting quality evaluations of public health programs. As previously mentioned, this framework is the foundation of this manual.
The underlying logic of the Evaluation Framework is that a well-done evaluation does not merely gather accurate evidence and draw valid conclusions; it produces results that are used to make a difference. The following is a brief overview of the framework.

Introduction to the Manual

Welcome to this manual, which illustrates, in a user-friendly manner, the "how to" of planning and implementing evaluation activities in STD-related programs.

WHY THE NEED FOR THIS MANUAL?

In 2002, the National Coalition of STD Directors (NCSD) conducted a needs assessment of STD program infrastructure among their membership (representatives from STD programs in all 50 states, plus 7 cities and 8 US territories). More than three quarters of the respondents reported a need for guidance from CDC on conducting STD program evaluation and for more evaluation resources. In response to the expressed needs, CDC's Division of STD Prevention (DSTDP) supported the development of this manual, which illustrates the theory and practice of program evaluation tailored to STD programs, as well as workshops on the content of the manual. This process started in 2004 and was completed in

WHAT IS THE PURPOSE OF THIS MANUAL?

This manual provides guidance on how to design and implement a program evaluation via a step-by-step approach. Its goals are to (1) build the evaluation capacity of STD programs so that they can internally monitor their program activities, understand what is and is not working, and improve their efforts; (2) establish a common evaluation language across project areas; (3) show that program evaluation is an activity we can all do and that it yields important benefits; and (4) integrate evaluation into routine program practice.
INTRODUCTION & OVERVIEW

The manual is meant to be used by those in STD-related programs who are responsible for conducting evaluation activities, whether they have extensive, little, or no evaluation experience, and even in the face of limited resources for program evaluation. This manual presents program evaluation as an integral part of the program planning process. It provides a way of thinking about evaluation from the get-go, as opposed to treating it as an afterthought or something that can wait until the end of a program. In addition, the manual focuses on an evaluation process that is participatory and responsive to the program's needs, stakeholders, and resources. Since evaluating an entire STD program may not be feasible due to resource constraints, we emphasize the evaluation of program activities or components and focus on evaluations that can measure program contribution rather than attribution (causality).

This manual touches on basic principles of program evaluation, but the information included is not exhaustive. For more detailed information on various evaluation topics, please make use of the references provided in the manual and in Tool 3.2, which includes links to other evaluation resources.

HOW IS THIS MANUAL ORGANIZED?

This manual is based on CDC's Framework for Program Evaluation in Public Health and on evaluation material developed by other Divisions at CDC (Division of Adolescent and School Health, Division of Tuberculosis Prevention, Office of Strategy and Innovation, and the Office on Smoking and Health), all of which is based on CDC's framework. Our framework divides evaluation into six progressive steps:

Step 1: Engage stakeholders.
Step 2: Describe the program.
Step 3: Focus the evaluation design.
Step 4: Gather credible evidence.
Step 5: Justify conclusions.
Step 6: Ensure use and share lessons learned.

This manual includes an overall introduction to program evaluation and then is organized by main step.
Each step is divided into sub-steps (e.g., 1.1, 2.1, etc.), and for each sub-step there is a targeted tool.

Program evaluation offers the opportunity to review, analyze, and modify STD prevention efforts as necessary. It also helps improve program performance and measure progress toward the achievement of goals and objectives.

WHAT IS THE DIFFERENCE BETWEEN PROGRAM EVALUATION AND OTHER PROGRAM ELEMENTS?

Program evaluation is sometimes confused with other essential program components because it is one of several ways to answer questions about a program. These questions can also be answered using other approaches, which might be characterized as academic research, performance measurement, planning, and surveillance.

Academic Research

While there is some overlap between program evaluation and academic research, there are differences between the two that are important to distinguish in the context of evaluating STD-related activities or components. The main differences have to do with purpose and with how findings are used. The main purposes of program evaluation are to identify a program's achievements in meeting its goals and objectives, identify program operations, components, or activities that need to be improved, and solve practical problems. The main purposes of academic research are to create new knowledge in a field that may be generalized to programs or populations throughout that field, and to test hypotheses. An evaluation might determine whether a specific outreach activity is reaching the right target population, whereas an academic research study might aim to find out how people's attitudes influence their likelihood of seeking STD testing.
An evaluation is conducive to making programmatic decisions (e.g., modifying the activity, allocating more resources, implementing the activity in different counties); research, in contrast, is conducive to examining the relationship between attitudes and testing and then trying to translate the findings into practical implications.

Performance Measurement

Performance measurement is the ongoing monitoring of a set of indicators (i.e., performance measures) of program progress for the purpose of accountability to interested parties such as funders, legislators, and the general public. Performance measures respond to "How are we doing?"
INTRODUCTION & OVERVIEW

Evaluation findings should be used to make decisions about program implementation and to improve program effectiveness. Program evaluation examines the how and the why, and it can be conducted to address the evaluation issues outlined below. It is intended to document program progress, demonstrate accountability to funders and policymakers, or identify ways to make the program better.

Implementation: whether an intervention is implemented as planned or a target population is reached; if not, why not?
Effectiveness: whether a program is achieving its goals and objectives or its intended outcomes; if not, why not?
Efficiency: whether program activities are being conducted using resources (e.g., budget, time) appropriately; if not, why not?
Causal attribution: whether progress on goals and objectives is due to the STD program activities, as opposed to other things going on at the same time.
Cost-benefit analysis: identifies all relevant costs and benefits of a program (intervention, activity), usually expressed in dollar terms.
Cost-effectiveness analysis: determines the cost of meeting a single goal or objective and can be used to identify the least costly alternative for meeting it.

You have just been presented with issues that you can address using program evaluation. However, evaluation has limits. For instance, program evaluation cannot determine the success of a program that has no goals, measurable objectives, or program theory to evaluate against. Program evaluation needs to be based on questions about a program that stakeholders are interested in answering; in the absence of such questions, merely collecting data does not equal evaluation. Last but not least, if the results of an evaluation are not used for programmatic decision making, it is not program evaluation.

WHY EVALUATE STD PROGRAMS?
Program evaluation is vital in allowing STD prevention programs to determine their accomplishments, assess how resources have been invested, identify what should be improved, and take action accordingly. The demands of policymakers and other stakeholders for accountability and result-oriented programs have increased, which means that strong program evaluation is more essential now than ever. Ongoing evaluation is critical to developing and sustaining high-quality and appropriately targeted STD prevention efforts.

Each sub-step has a corresponding tool showing the "how to" of each evaluation activity. Each evaluation tool includes concepts and systematic step-by-step guidance on conducting different evaluation activities as they apply to STD programs. The following illustrates how the steps/tools and concepts are organized.

STEP AND TOOL

1. Engage Stakeholders
1.1 Determine how and to what extent to involve stakeholders in program evaluation

2. Describe the Program
2.1 Understand your program focus and priority areas
2.2 Develop your program goals and measurable (SMART) objectives
2.3 Identify the elements of your program and become familiar with logic models
2.4 Develop logic models to link program activities with outcomes

3. Focus the Evaluation Design
3.1 Tailor the evaluation to your program and stakeholders' needs
3.2 Determine resources and personnel available for your evaluation
3.3 Develop and prioritize evaluation questions

4. Gather Credible Evidence
4.1 Choose appropriate and reliable indicators to answer your evaluation questions
4.2 Determine the data sources and methods to measure indicators
4.3 Establish a clear procedure to collect evaluation information
4.4 Complete an evaluation plan based on program description and evaluation design
5. Justify Conclusions
5.1 Analyze the evaluation data
5.2 Determine what the evaluation findings say about your program

6. Ensure Use and Share Lessons Learned
6.1 Share the results and lessons learned from the evaluation with stakeholders
6.2 Use evaluation findings to modify, strengthen, and improve your program

CONCEPTS

Who are stakeholders? Why is stakeholder involvement important? How to identify, involve, and retain stakeholders in evaluation.

Why is it important to determine needs? What are the benefits of a needs assessment, and when should one be conducted? How to carry out a needs assessment.

What are goals and objectives? How to write an objective the SMART way. What are process and outcome objectives?

What are the elements of a program? What is a logic model, and what are its benefits? How to use goals and objectives to develop a logic model. What are the challenges and rewards of logic models? What types of logic models can be constructed? How to construct a logic model.

What are process and outcome evaluations? How to choose the focus of an evaluation. When are the results of an evaluation needed?

Who will conduct the evaluation (skills of a qualified evaluator)? What financial resources are available (technical assistance services at DSTDP, how to recruit an evaluator, how to develop a budget)?

What is the purpose of evaluation questions, and how to develop and prioritize them?

What is an indicator? What is the link between indicators, performance measures, and program evaluation? How to develop appropriate indicators.

What data sources and data collection methods can be used? What is the relationship between indicators, data sources, and data collection methods?

What factors to consider when developing data collection procedures (developing instruments, data collectors' skills, training data collectors).

What is the purpose of an evaluation plan, and what are its components? How to construct an evaluation plan.

How do you analyze evaluation data (quantitative/qualitative)? What is the purpose of data interpretation, and what should be considered when doing so? What are some factors to consider when developing recommendations?

How to share evaluation results. How to promote the use of evaluation results. How to use evaluation findings for decision making.
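To illustrate the quantitative side of "How do you analyze evaluation data?", the short Python sketch below computes one hypothetical process indicator, the percentage of interviewed cases whose partners were notified within 30 days, from a handful of made-up case records. The field names, the sample records, and the 30-day threshold are all assumptions invented for this sketch, not values taken from this manual.

```python
# Hypothetical case records; in practice these would come from a
# program's case-management or surveillance data system.
cases = [
    {"case_id": 1, "interviewed": True,  "days_to_partner_notification": 12},
    {"case_id": 2, "interviewed": True,  "days_to_partner_notification": 45},
    {"case_id": 3, "interviewed": False, "days_to_partner_notification": None},
    {"case_id": 4, "interviewed": True,  "days_to_partner_notification": 8},
]

def partner_notification_indicator(records, within_days=30):
    """Percent of interviewed cases with partners notified within `within_days`."""
    interviewed = [r for r in records if r["interviewed"]]
    if not interviewed:
        return None  # denominator of zero: the indicator is undefined
    met = [r for r in interviewed
           if r["days_to_partner_notification"] is not None
           and r["days_to_partner_notification"] <= within_days]
    return 100.0 * len(met) / len(interviewed)

print(round(partner_notification_indicator(cases), 1))  # prints 66.7
```

Note that the denominator counts only interviewed cases, which is exactly the kind of definitional choice an indicator worksheet should record so the numbers stay comparable across reporting periods.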
INTRODUCTION & OVERVIEW

Each tool has the following components:

Introduction: briefly describes what the tool is about and its relationship to the previous sub-step/tool, to provide continuity and progression.
Diagram: connects the concepts to be discussed in the tool with concepts previously addressed.
Learning Objectives: specify what can be accomplished by going through the tool.
Content: divides information into subheadings (i.e., guiding questions) and is loaded with STD-specific examples to simplify concepts and connect with the reader, focusing on the "how to" of the topic.
Checklist: lists key aspects of the tool.
Conclusion: summarizes the tool content.
Next Steps: connects the tool with the next one.
Key Terms: defines concepts addressed in the tool.
Acronyms: spells out acronyms used in the tool.
References: provides the resources that were used to craft the tool and can help readers locate more detailed information.
Exercise: provides hands-on experience in applying the tool content to STD programs (e.g., worksheets, exercises/questions with answer key, flow charts, matrices, etc.). Most of the tools include exercises or case studies.
Case Study: describes how the concepts addressed in the tool have been or can be applied to a particular scenario.
Appendix: adds to the information already discussed in the tool.
Answer Key: provides possible responses to the exercise included.

In addition to the introduction and the steps/tools, we have included a glossary and appendices at the end of the manual. In the appendices you will find a case study of a syphilis project illustrating how one step/tool builds on another, information on evaluation designs, and sample logic models and evaluation plans developed by the STD programs that pilot-tested this manual.

WHAT'S NEXT?

The following section introduces program evaluation and its uses, explains the difference between program evaluation and other program elements, and describes CDC's Framework for Program Evaluation.
Overview of Program Evaluation

You will see program evaluation used in multiple contexts. In this manual we use the following definition, which conveys the essence of program evaluation:

"...the systematic collection of information about the activities, characteristics, and outcomes of programs to make judgments about the program, improve program effectiveness, and/or inform decisions about future development."
(Michael Quinn Patton, Utilization-Focused Evaluation, 1997)

We conduct program evaluation according to a set of systematic protocols in order to accurately determine the value or merit of a program and make decisions accordingly, including decisions related to improving the program. Understanding a program as a set of planned activities directed toward bringing about specified change(s) in an identifiable audience, we conclude that STD-related activities (e.g., a counseling session with an individual with a positive syphilis result), interventions (e.g., a gonorrhea media campaign in a defined population), and components (e.g., partner services) all deserve program evaluation.

Program evaluation is influenced by program constraints (e.g., staffing, budget, skills). Therefore, evaluation should be practical and feasible and must be conducted within the confines of resources, time, and political context. Moreover, it should serve a useful purpose, be conducted in an ethical manner, and produce accurate findings.
Monitoring and Evaluation Plan Primer for DRL Grantees
Monitoring and Evaluation Plan Primer for DRL Grantees I. What is a monitoring and evaluation plan? A monitoring and evaluation plan (M&E plan), sometimes also referred to as a performance monitoring or
BUSINESS DEVELOPMENT OUTCOMES
BUSINESS DEVELOPMENT OUTCOMES Small Business Ownership Description Total number of employer firms and self-employment in the state per 100 people in the labor force, 2003. Explanation Business ownership
Real Progress in Food Code Adoption
Real Progress in Food Code Adoption August 27, 2013 The Association of Food and Drug Officials (AFDO), under contract to the Food and Drug Administration, is gathering data on the progress of FDA Food
Englishinusa.com Positions in MSN under different search terms.
Englishinusa.com Positions in MSN under different search terms. Search Term Position 1 Accent Reduction Programs in USA 1 2 American English for Business Students 1 3 American English for Graduate Students
Real Progress in Food Code Adoption
Real Progress in Food Code Adoption The Association of Food and Drug Officials (AFDO), under contract to the Food and Drug Administration, is gathering data on the progress of FDA Food Code adoptions by
University System of Georgia Enrollment Trends and Projections to 2018
University System of Georgia Enrollment Trends and Projections to 2018 Introduction: Projections of USG Headcount Enrollment Enrollment projections use past trends and information on other variables to
State Tax Information
State Tax Information The information contained in this document is not intended or written as specific legal or tax advice and may not be relied on for purposes of avoiding any state tax penalties. Neither
How To Be A Health Care Worker
Working with Epidemiologists for Heart Disease & Stroke Prevention Program Development Betty C. Jung, RN MPH CHES Connecticut Department of Public Health [email protected] Albert Tsai, PhD, MPH
Annual Salary Report
Annual Salary Report For additional information, please contact Jeanette Janota, Surveys & Analysis American Speech-Language-Hearing Association Rockville, MD 20850 800-498-2071, ext. 8738 [email protected]
STATISTICAL BRIEF #273
STATISTICAL BRIEF #273 December 29 Employer-Sponsored Health Insurance for Employees of State and Local Governments, by Census Division, 28 Beth Levin Crimmel, M.S. Introduction Employees of state and
State by State Summary of Nurses Allowed to Perform Conservative Sharp Debridement
State by State Summary of Nurses Allowed to Perform Conservative Sharp Debridement THE FOLLOWING ARE ONLY GENERAL SUMMARIES OF THE PRACTICE ACTS EACH STATE HAS REGARDING CONSERVATIVE SHARP DEBRIDEMENT
Annual Salaries. For additional information, please contact:
Annual Salaries For additional information, please contact: Jeanette Janota, Surveys & Analysis American Speech-Language-Hearing Association 2200 Research Boulevard Rockville, MD 20850-3289 800-498-2071,
CLINICAL PRIVILEGE WHITE PAPER Psychology
Psychology CLINICAL PRIVILEGE WHITE PAPER Psychology Background Psychology is a broad field that includes the scientific study of mental processes and behavior. It is similar to the medical field of psychiatry.
National Association of Black Accountants, Inc. National Policies and Procedures Manual
Introduction The purpose of the (NPPM) is to provide additional guidance on implementing the Bylaws of the Association. This manual provides a comprehensive set of policies, procedures and guidelines that
By Tim Bates and Joanne Spetz, University of California, San Francisco
Education Data Sources: A User s Guide By Tim Bates and Joanne Spetz, University of California, San Francisco Introduction The Institute of Medicine (IOM) Committee on the Future of recommended that stakeholders
Economic Impact and Variation in Costs to Provide Community Pharmacy Services
Economic Impact and Variation in Costs to Provide Community Pharmacy Services Todd Brown MHP, R.Ph. Associate Clinical Specialist and Vice Chair Department of Pharmacy Practice School of Pharmacy Northeastern
NOTICE OF PROTECTION PROVIDED BY [STATE] LIFE AND HEALTH INSURANCE GUARANTY ASSOCIATION
NOTICE OF PROTECTION PROVIDED BY This notice provides a brief summary of the [STATE] Life and Health Insurance Guaranty Association (the Association) and the protection it provides for policyholders. This
PRO-NET. A Publication of Building Professional Development Partnerships for Adult Educators Project. April 2001
Management Competencies and Sample Indicators for the Improvement of Adult Education Programs A Publication of Building Professional Development Partnerships for Adult Educators Project PRO-NET April 2001
STATISTICAL BRIEF #435
STATISTICAL BRIEF #435 April 2014 Premiums and Employee Contributions for Employer-Sponsored Health Insurance: Private versus Public Sector, 2012 Karen E. Davis, MA Introduction Employer-sponsored health
MAINE (Augusta) Maryland (Annapolis) MICHIGAN (Lansing) MINNESOTA (St. Paul) MISSISSIPPI (Jackson) MISSOURI (Jefferson City) MONTANA (Helena)
HAWAII () IDAHO () Illinois () MAINE () Maryland () MASSACHUSETTS () NEBRASKA () NEVADA (Carson ) NEW HAMPSHIRE () OHIO () OKLAHOMA ( ) OREGON () TEXAS () UTAH ( ) VERMONT () ALABAMA () COLORADO () INDIANA
Family Involvement in Adolescent Substance Abuse Treatment February, 2008
Family Involvement in Adolescent Substance Abuse Treatment February, 2008 Sharon L. Smith, Steve Hornberger, MSW, Sherese Brewington-Carr, M.H.S. Cathy Finck, Cassandra O Neill, MA, Doreen Cavanaugh, Ph.D.,
The Economic Impact of Physicians
The Economic Impact of Physicians A Fact Sheet Examining the Economic Contribution Physicians Make to Their Communities and to Their Affiliated Hospitals Prepared by: Merritt Hawkins, the nation s leading
A conversation with CDC s Alcohol Program, September 5, 2014
A conversation with CDC s Alcohol Program, September 5, 2014 Participants Robert Brewer, MD, MSPH Epidemiologist; Lead, Excessive Alcohol Use Prevention Team (Alcohol Program), Division of Population Health
Major Process Future State Process Gap Technology Gap
Outreach and Community- Based Services: Conduct education, training, legislative activity, screening and communication within the community and build appropriate partnerships and coalitions to promote
Core Competencies for Public Health Professionals
Core Competencies for Public Health Professionals Revisions Adopted: May 2010 Available from: http://www.phf.org/programs/corecompetencies A collaborative activity of the Centers for Disease Control and
Addiction Counseling Competencies. Rating Forms
Addiction Counseling Competencies Forms Addiction Counseling Competencies Supervisors and counselor educators have expressed a desire for a tool to assess counselor competence in the Addiction Counseling
2014 INCOME EARNED BY STATE INFORMATION
BY STATE INFORMATION This information is being provided to assist in your 2014 tax preparations. The information is also mailed to applicable Columbia fund non-corporate shareholders with their year-end
NON-RESIDENT INDEPENDENT, PUBLIC, AND COMPANY ADJUSTER LICENSING CHECKLIST
NON-RESIDENT INDEPENDENT, PUBLIC, AND COMPANY ADJUSTER LICENSING CHECKLIST ** Utilize this list to determine whether or not a non-resident applicant may waive the Oklahoma examination or become licensed
STATE GRANT AND SCHOLARSHIP OPPORTUNITIES
STATE GRANT AND SCHOLARSHIP OPPORTUNITIES Below you ll find a list of key grant and scholarship programs by state along with links to resources to learn more about these and other financial aid opportunities.
Definition of Foundational Public Health Services
Definition of Foundational Public Health Services FOUNDATIONAL CAPABILITIES A. Assessment (Surveillance and Epidemiology). The foundational definition of this capability includes: 1. Ability to collect
A-79. Appendix A Overview and Detailed Tables
Table A-8a. Overview: Laws Expressly Granting Minors the Right to Consent Disclosure of Related Information to Parents* Sexually Transmitted Disease and HIV/AIDS** Treatment Given or Needed Alabama 14
Consent to Appointment as Registered Agent
Consent to Appointment as Registered Agent This form is used by the person or business entity that agrees to act as the registered agent for a limited liability company. Every person or business entity
NEW JERSEY 2/17/15. Welcome! Gaining Ground Performance Management Pre- Workshop Webinar February 18, 2015 2:00 pm 3:30 pm
Welcome! Gaining Ground Performance Management Pre- Workshop Webinar February 18, 2015 2:00 pm 3:30 pm Please Dial in from your land- line: 210-339- 7212 ParLcipant code: 4146073 NEW JERSEY Performance
Current State Regulations
Current State Regulations Alabama: Enacted in 1996, the state of Alabama requires all licensed massage therapists to * A minimum of 650 classroom hours at an accredited school approved by the state of
Table of Contents. Steps in Developing a Success Story... 1. Success Story Example... 6. Style Reminders... 8. Glossary... 9
Table of Contents Steps in Developing a Success Story... 1 Success Story Example... 6 Style Reminders... 8 Glossary... 9 Additional Resources... 10 ii Division of Adolescent and School Health Steps in
US Department of Health and Human Services Exclusion Program. Thomas Sowinski Special Agent in Charge/ Reviewing Official
US Department of Health and Human Services Exclusion Program Thomas Sowinski Special Agent in Charge/ Reviewing Official Overview Authority to exclude individuals and entities from Federal Health Care
U.S. Department of Labor Office of Workforce Security Division of Fiscal and Actuarial Services
U.S. Department of Labor Office of Workforce Security Division of Fiscal and Actuarial Services Evaluating State UI Tax Systems using The Significant Tax Measures Report State Summary Tables o State Benefit
1 Engage Stakeholders
1 Engage Stakeholders The first step in program evaluation is to engage the stakeholders. Stakeholders are people or organizations who are invested in the program, are interested in the results of the
CPA FIRM NAMES. April 2009 Working Draft
WHITE PAPER CPA FIRM NAMES April 2009 Working Draft Copyright (c) 2009 by the American Institute of Certified Public Accountants, Inc. License are hereby granted for reuse or reprint of this matter for
Workers Compensation Cost Data
Workers Compensation Cost Data Edward M. Welch Workers Compensation Center School of Labor and Industrial Relations Michigan State University E-mail: [email protected] Web Page: http://www.lir.msu.edu/wcc/
Hail-related claims under comprehensive coverage
Bulletin Vol. 29, No. 3 : April 2012 Hail-related claims under comprehensive coverage Claims for hail damage more than doubled in 2011 compared with the previous three years. Hail claims are primarily
Appendix E. A Guide to Writing an Effective. Executive Summary. Navy and Marine Corps Public Health Center Environmental Programs
Appendix E A Guide to Writing an Effective Executive Summary Navy and Marine Corps Public Health Center Environmental Programs MISSION Ensure Navy and Marine Corps readiness through leadership in prevention
Adoption of Electronic Health Record Systems among U.S. Non- Federal Acute Care Hospitals: 2008-2014
ONC Data Brief No. 23 April 2015 Adoption of Electronic Health Record Systems among U.S. Non- Federal Acute Care Hospitals: 2008-2014 Dustin Charles, MPH; Meghan Gabriel, PhD; Talisha Searcy, MPA, MA The
Understanding Socioeconomic and Health Care System Drivers to Increase Vaccination Coverage
Understanding Socioeconomic and Health Care System Drivers to Increase Vaccination Coverage Jason Baumgartner Life Sciences Consulting Director, Quintiles April 2011 Discussion Topics Title: Understanding
Impacts of Sequestration on the States
Impacts of Sequestration on the States Alabama Alabama will lose about $230,000 in Justice Assistance Grants that support law STOP Violence Against Women Program: Alabama could lose up to $102,000 in funds
State Tax Information
State Tax Information The information contained in this document is not intended or written as specific legal or tax advice and may not be relied on for purposes of avoiding any state tax penalties. Neither
D6 INFORMATION SYSTEMS DEVELOPMENT. SOLUTIONS & MARKING SCHEME. June 2013
D6 INFORMATION SYSTEMS DEVELOPMENT. SOLUTIONS & MARKING SCHEME. June 2013 The purpose of these questions is to establish that the students understand the basic ideas that underpin the course. The answers
STEP. Workplans: A Program Management Tool. René Lavinghouze MA Kathleen Heiden RDH, MSPH. Step 2: Describing the Program
STEP C Workplans: A Program Management Tool René Lavinghouze MA Kathleen Heiden RDH, MSPH 2C_1 STEP C Workplans: A Program Management Tool Contents How to Use This Packet 4 Tip Sheets at a Glance 5 Case
Saving Lives, Saving Money. A state-by-state report on the health and economic impact of comprehensive smoke-free laws
Saving Lives, Saving Money A state-by-state report on the health and economic impact of comprehensive smoke-free laws 2011 Table of Contents Executive Summary...............................................................................2
Is the Uniform Certified Public Accounting Exam Uniform?
Is the Uniform Certified Public Accounting Exam Uniform? Richard B. Griffin, Ph.D., CMA Professor of Accounting Department of Accounting, Economics, Finance, and International Business The University of
