Standards for Defining and Measuring Human Services & Community Development Program Outcomes
February 2002

Introduction

This document provides an overview of minimum standards, or guidelines, for defining and measuring program outcomes for human services and community development programs, developed by Community Research Partners (CRP) for United Way of Central Ohio. These standards may change over time as best practices in defining and measuring outcomes evolve. CRP will continually revise its standards to reflect the state of the art in the field.

The standards for defining and measuring outcomes include the following components:

A. Program Description
B. Outcome Statement
C. Measurement Procedures
D. Participant Information
E. Performance Reporting

A. Program Description: How do we define our program?

1. Need for the Program. Program providers should be able to present a compelling case documenting the need for the program. Need may be expressed in terms of a deficit, aspiration and/or strength that exists in a particular target group. The case should include a description of the characteristics that put the target group for the program at risk, the community/neighborhood condition(s) to be addressed and/or the extent to which the proposed services/programs are not otherwise available in the community.

2. Program Logic Model. Program providers should develop a program logic model describing the issue(s) to be addressed, the intervention strategy and the desired program outcome. In general, outcomes should be expressed in terms of short, intermediate and long-term results. The logic model should show how the intervention and outcomes "logically flow" from the identified need for the program. The logic model is the foundation for defining and measuring program outcomes and is an essential step in the outcome definition and measurement process.

The Logic Model: Conditions → Activities → Program Outcome → Long-term Impact

3. Evidence-Based Practice. Program providers should develop a case in support of the linkages in their program logic model. That is, program providers should be able to support their claim that intervening in a particular manner is likely to produce a specific result, or series of results, for the typical target of the program. Any claims related to linkages should be supported with appropriate facts and/or references. Support for such claims can be demonstrated through:
- Experience of program providers
- Theory about how/why the program works
- Empirical evidence or evaluation results

4. Overview of Program Activities. Program providers should provide a comprehensive description of the program, including a specific statement of the duration and amount of the program necessary to be effective. This criterion requires that providers define the minimal level of intervention (time, number of contacts) necessary to produce the desired effect. Specific program inputs should also be described, such as the staff resources necessary to deliver the program.

Defining a Program
1. Need: Documentation of community and target population needs.
2. Logic Model: How the program intervention will address the need.
3. Evidence-Based Practice: Factual evidence that the intervention can produce the stated results.
4. Program Overview: Description of specific program activities and resource requirements.

B. Outcome Statement: How do we define success?

Outcome statements should include the following seven elements:

The Seven Elements of an Outcome Statement
1. Intent: What is to be accomplished or the change that will occur.
2. Program Target: Who or what will change as a result of the intervention.
3. Geographic Focus: Location where the change will occur or from which participants will be drawn.
4. Program Participation: The point when a person enters the program and outcome tracking begins.
5. Success Measure: What constitutes success for a program participant.
6. Time Frame for Success Measure: The length of time for a typical participant to achieve success.
7. Program Performance Benchmark: The level of success for the program during a reporting period.

1. Intent. Intent expresses what is to be accomplished, or some change that will occur as a result of a specific intervention, activity or program. The prevention of some event or circumstance that would have occurred in the absence of an intervention also qualifies as an expression of intent. Intent should be consistent with a short-term, intermediate or long-term outcome from the program logic model.

2. Program Target. The target identifies "who" or "what" changes as a result of the program intervention. A target might be a specific group of people or the general population of the entire city or county. For programs that target individuals, the characteristics that make a person an appropriate program participant (age, gender, race, needs, risk factors, etc.) should also be expressed. A program target might also be a community or neighborhood condition, policy or organizational practice.

3. Geographic Focus. A program outcome should define the geographic area where the desired change will take place or from which the program participants will be drawn. This is particularly relevant for neighborhood-based programs or programs where the long-term result is to change conditions in a specific locale. Franklin County, Columbus, a local neighborhood, a specific site (school, housing complex) and areas defined by specific physical boundaries are examples of geographic focus.

4. Program Participation. The program outcome should define the point at which a person enters the program and becomes a program participant for the purposes of tracking outcomes. This may be at intake, enrollment in a class or completion of certain entrance requirements. In general, there are four levels of interaction between program staff and individuals:

- Program contacts (outreach/marketing): the individual interacts with program staff (any contact) in order to learn about services, but is not considered a program participant.
- Program entrance: the individual officially enters the program (completes intake, registration, etc.).
- Program participation: the individual completes enough of the program to achieve the program outcome and is still officially a program participant.
- Program exit: the individual officially completes the program or exits without completing it.

5. Success Measure. Providers should define what constitutes success for a program participant. The success measure will be used by program staff, consumers and others to determine whether the program is successful, that is, whether the desired change expressed by the intent of the outcome has occurred. Program success might be defined in terms of change from a pretest to a posttest measure, or the mastery of some knowledge as indicated by a specific score on a knowledge test (posttest only). Sometimes a success measure involves a major life change such as securing employment, housing or a high school diploma. The success measure might also reflect change in a community or neighborhood condition, policy or organizational practice.

6. Time Frame for Success Measure. A program outcome should express the time frame necessary to achieve the program success measure. For example, one program might require 18 months for a homeless person to achieve success (permanent housing), while another program might expect a participant to achieve success (a change in attitude) after a six-week intervention.

7. Program Performance Benchmark. A program outcome should express the level of success that the program will achieve in a specific time frame. For example, if individuals are the program targets, the program performance benchmark is the number of unduplicated program participants who will achieve the success measure during the funding period, expressed as a percentage of the total unduplicated number of program participants served by the program during the funding period.

Example Outcome Statement for the ABC Teen Pregnancy Prevention Program
Intent: Increase knowledge of behaviors that prevent pregnancy.
Program Target: Low-income girls and boys ages 11-17.
Geographic Focus: The Hilltop area.
Program Participation: A person becomes a participant after attending two two-hour classes in a row.
Success Measure: Participants demonstrate on pre- and post-tests that they have gained knowledge of positive steps to avoid pregnancy.
Time Frame for Success Measure: Ten two-hour classes over three months.
Program Performance Benchmark: During the one-year grant period, 40 of the 100 program participants served will achieve the success measure.
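Programs that keep their outcome statements in a case management or tracking system may find it useful to store the seven elements as a structured record. The sketch below is an illustrative Python representation, not part of the standards; the class and field names are assumptions, and the values are drawn from the ABC example above.

```python
from dataclasses import dataclass


@dataclass
class OutcomeStatement:
    """The seven elements of an outcome statement (Section B)."""
    intent: str
    program_target: str
    geographic_focus: str
    program_participation: str   # when a person becomes a tracked participant
    success_measure: str
    time_frame: str              # time needed for a typical participant to succeed
    performance_benchmark: str   # expected level of success for the reporting period


# Populated from the ABC Teen Pregnancy Prevention Program example above.
abc_outcome = OutcomeStatement(
    intent="Increase knowledge of behaviors that prevent pregnancy",
    program_target="Low-income girls and boys ages 11-17",
    geographic_focus="The Hilltop area",
    program_participation="Attends two two-hour classes in a row",
    success_measure="Knowledge gain demonstrated on pre- and post-tests",
    time_frame="Ten two-hour classes over three months",
    performance_benchmark="40 of 100 participants served during the one-year grant period",
)
```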

C. Measurement Procedures: How do we verify what happened?

1. Success Measurement Tool(s). Program providers should acquire an existing tool, or develop a tool, to measure and/or verify program success. Categories of measurement tools include:
- Information from individual participants (interviews, tests, standardized tools)
- Observation/client records (program staff and/or outside evaluators)
- Information from groups of clients (focus groups)
- Information from other providers, employers, landlords, family
- Public records or other statistical data
- Community data indicators
- Public opinion surveys

2. Measurement Methods. Program managers should document the administrative procedures that they will employ to measure program results. Administrative procedures include:
- Participant tracking process
- Sample size (see the note on sampling at the end of this section)
- Sample selection methods
- How the measurement tool identified above is administered
- Who administers the tool and under what circumstances
- How data are maintained and analyzed

Example Measurement Plan for the ABC Teen Pregnancy Prevention Program
Tool: Standardized pre/post tests of knowledge of the training curriculum.
Measurement Methods:
- Written test
- Administered by program staff in one-on-one counseling sessions
- Administered to all participants upon entry into the program and to all successful completers
- Data maintained in an Access database
Client Tracking:
- Case file established with demographic information
- Entrance interview/pre-test
- Class attendance records
- Records of participation in class exercises, homework and class tests
- Exit interview/post-test
- One-year follow-up
- Summary reports on the status of all persons served

Note on sampling: Samples for measuring program success should be selected randomly from all program participants who have officially entered the program and are being tracked for outcome measurement purposes. When this number is between 0 and 30, the acceptable sample size is 100%. When this number is between 31 and 300, the acceptable sample size is 30. When this number is 300 or more, the acceptable sample size is 10%.
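The sampling rule in the note above can be expressed as a short calculation. The following is an illustrative Python sketch, not part of the standards: the function name is invented for this example, and it reads "the acceptable sample size is 30" as a count of 30 participants rather than 30 percent.

```python
def acceptable_sample_size(tracked_participants: int) -> int:
    """Minimum acceptable sample size under the note on sampling.

    tracked_participants is the number of participants who have officially
    entered the program and are being tracked for outcome measurement.
    Assumed reading of the rule: 0-30 -> sample everyone (100%);
    31-300 -> a sample of 30 participants; 300 or more -> 10%.
    The note also requires that the sample be drawn randomly.
    """
    if tracked_participants <= 30:
        return tracked_participants             # 100% of participants
    if tracked_participants < 300:
        return 30                               # flat sample of 30
    return round(tracked_participants * 0.10)   # 10% of participants


# Illustrative check: 25 -> 25, 250 -> 30, 500 -> 50.
if __name__ == "__main__":
    for n in (25, 250, 500):
        print(n, "->", acceptable_sample_size(n))
```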

D. Participant Information: What do we know about program participants?

Specific information about program participants should be collected and maintained. Information about program participants should provide an overview of program activity, accomplishments and challenges for the funding period. Except for outreach and marketing, all data should be expressed as unduplicated numbers for a 12-month funding period.

1. Program Contacts (outreach/marketing activities). Number and type of contacts (phone, face-to-face, mail, etc.) with potential program participants.

2. Program Entrance. New participants who entered the program and their demographic characteristics.

3. Program Participation. Program participants' progress through the program and the point at which participants completed enough of the program to achieve the desired program outcome.

4. Program Exits. Data on persons who left the program, including:
- Participants who completed the program according to program standards.
- Participants who exited the program prior to completion and their reasons for exiting.
- Barriers to program completion.

5. Program Success Rate Calculation. Participants who achieved the success measure, expressed both as a number and as a percent of all program participants. The formula for computing outcome achievement is the number of participants who achieved the success measure divided by the number of unduplicated program participants. This should be compared to the Program Performance Benchmark identified in the Outcome Statement.

Calculating Program Success Rate:
Success Rate = Number of Participants Who Achieved the Success Measure ÷ Number of Unduplicated Program Participants
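As a worked illustration of the success rate formula, the sketch below (illustrative Python, not part of the standards) applies it to the ABC Teen Pregnancy Prevention Program benchmark from Section B, where 40 of 100 unduplicated participants are expected to achieve the success measure. The function name and the "actual" reporting-period figures are hypothetical.

```python
def success_rate(achieved_success: int, unduplicated_participants: int) -> float:
    """Success rate: participants who achieved the success measure divided by
    all unduplicated program participants (Section D, item 5)."""
    if unduplicated_participants == 0:
        return 0.0
    return achieved_success / unduplicated_participants


# ABC Teen Pregnancy Prevention Program benchmark (Section B):
# 40 successes expected out of 100 unduplicated participants served.
benchmark_rate = success_rate(40, 100)            # 0.40, i.e. 40%

# Comparison of a reporting-period result against the benchmark
# (46 of 112 are hypothetical figures for illustration only).
actual_rate = success_rate(46, 112)
meets_benchmark = actual_rate >= benchmark_rate   # True when the program meets or exceeds its benchmark

print(f"Benchmark: {benchmark_rate:.0%}  Actual: {actual_rate:.0%}  Meets benchmark: {meets_benchmark}")
```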

E. Performance Reporting: How do we communicate results to our stakeholders?

1. Reporting Timeline. Reporting of information on outcomes should occur on a timeline corresponding to the funder's investment cycle. Some information is reported as part of an application for funding, along with other typical application requirements. Other reporting takes place throughout the life of the program to report on its progress. Suggested reports, their content and their focus are indicated below:

Report: Program funding plan/application
Content: Program description (see Section A); Outcome statement (see Section B); Participant information (see Section D); Measurement procedures (see Section C)
Focus: Projection of what will be accomplished for a requested level of investment during a 12-month period or other specified investment period

Report: Outcome report
Content: Participant information (see Section D); Measurement procedures (see Section C); Program improvement plans (see items 2 and 3 below)
Focus: Procedures and results of the measurement process; accomplishments and activity during the prior 12-month period or other specified investment period; program improvement plan based on evaluation

2. Regular Review of Evaluation Data. Program providers should review program evaluation data (the reports defined in item 1 above) on a regular basis. This information should also be presented to the agency board and other interested stakeholders, and should provide a basis for considering various strategies to improve program operation.

3. Annual Program Improvement Plan. Program providers and organizational leadership should produce and implement an annual program improvement plan based on a review of outcome measurement data and other regular reports. This plan should define how program evaluation data were used to improve the program. Specifically, the program improvement plan should identify what worked and what did not, analyze dropouts and non-completion, and set out strategies for changing the program to produce greater success.

Community Research Partners is a non-profit organization established in 2000 by United Way of Central Ohio, the City of Columbus and the John Glenn Institute for Public Service and Public Policy to advance human services and community development policy and programs through measurement, evaluation and research.

Community Research Partners
341 S. Third St., Suite 10
Columbus, OH 43215
Phone: 614/224-5917
Fax: 614/224-8132
www.communityresearchpartners.org