Pennsylvania Commission on Crime and Delinquency (PCCD) Research Based Programs Initiative Grantee Outcomes Report Template




Guidelines for Reporting on Big Brothers Big Sisters

As a requirement of funding under PCCD's Research-based Programs Initiative, all grantees must submit a cumulative outcomes report. Grantees awarded funding before 2010 should report in the 3rd quarter of the 3rd year of grant funding; grantees awarded funding in 2010 should report in the 3rd quarter of the 2nd year of funding.

The purpose of the Outcomes Report is to relate the grantee's experience in implementing the program and to summarize the data collected to measure program reach, implementation quality, and impact. Beyond meeting PCCD's reporting requirements, the Outcomes Report should also serve as a valuable tool for communicating the program's impact to local stakeholders.

It is recommended that prior to completing the report, grantees print copies of their grant application and quarterly E-grants reports. These should be used to respond, in narrative format, to all of the outcome report questions. Please answer using complete sentences. Please also include the numerator, denominator, and calculation explanation for any data or percentages provided. For example, rather than (a) reporting that 60% of participants demonstrated change or (b) reporting that 60 participants demonstrated change in a specified outcome, report that 60 of 100 surveyed participants (60%) demonstrated change.

Please report only on data that is reflective of participants and services funded by PCCD. If multiple grants have been funded, a separate outcomes report is required for each grant. The EPISCenter has provided guidance and clarification for responding to the required questions in the gray shaded boxes below the questions. Also, please pay special attention to the Notes included in the template.
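The reporting convention above can be sketched as a small calculation; the figures and variable names here are hypothetical, for illustration only:

```python
# Illustrative only: hypothetical figures showing the required
# "numerator of denominator (percent)" reporting format.
demonstrated_change = 60  # numerator: participants who demonstrated change
surveyed = 100            # denominator: all participants surveyed

percent = demonstrated_change / surveyed * 100
statement = (f"{demonstrated_change} of {surveyed} surveyed participants "
             f"({percent:.0f}%) demonstrated change")
print(statement)
```

Reporting all three numbers this way lets the reader verify the percentage directly.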
If you need assistance completing this report, please contact the EPISCenter by email at episcenter@psu.edu or by phone at (814) 863-2568. You are strongly encouraged to submit your draft report to your assigned EPISCenter Prevention Coordinator for feedback prior to submitting the report to PCCD. The final report should be attached in E-grants with your quarterly report.

Grant ID #:
Grantee name:
Person completing this report (name, phone, & email):
Evidence-based program being implemented:
Initial amount awarded:
Geographic location/county served:

Is this project a new implementation or an expansion of an existing program?

Period covered by this report:
Grant start date:
Current funding year & quarter:
Report submission date:

Describe any major changes to the project plan from what was originally proposed, and why those changes were necessary. Include a description of any Project Modification Requests (PMRs) you submitted as well:

At the time of writing a grant application, it is impossible to foresee all the influences that may lead to implementation barriers and challenges. These challenges can lead to changes to the envisioned project plan. Discuss the challenges you encountered and any resulting changes to your proposed implementation.

SECTION 1 DESCRIPTION OF POPULATION SERVED

In this section, you will describe the population that has actually been reached by the program since the beginning of the grant period.

1. Total number of youth served directly by the program to date:
- Indicate the total number of unduplicated eligible youth that entered the inquiry process.
- Indicate the total number of matches funded by PCCD.
- Indicate the total number of youth matched. Count each individual only once.
- Of the total number of youth matched, indicate the number of youth matched with more than one mentor.
- Indicate the number of youth who have applied for the program, have been screened, and are eligible to be matched, but currently remain on a waitlist.

2. Total number of adults served directly by the program to date:
- Count all volunteers that have been screened and trained but are waiting for a match.
- Count all volunteers matched. Count each individual only once.
- If available, indicate the number of parents/caregivers served by the program.

3. Of the total number of youth program participants that were matched, how many successfully completed the program?

Successful completion is defined as youth participating in a match relationship for 1 year or more. Count each individual only once. Note: Responses to questions 3 and 4 combined should equal the total number of youth participants that were matched.

4. Of the total number of youth program participants that were matched, how many did not complete the program successfully?

Unsuccessful completion is defined as youth participating in a match relationship for less than 1 year. Count each individual only once.

5. Describe the overall demographics of the population served.
- Age range served:
- Gender ratio (percent male/percent female):
- Races/ethnicities served:
- General risk characteristics of your participants:
- Any other characteristics you can identify based on the data available to you:

Share any demographic information collected or available through observation. Also discuss the make-up or family structure of the youth served, indicating single parents, divorced parents, grandparents raising grandchildren, and foster parents.

6. Provide a listing of all the zip codes and schools served by the program. (This information will be used by PCCD to measure the saturation of each program type across the Commonwealth.)
- Zip Codes Served: Use the addresses of the youth to list all zip codes served.
- Name of School District(s) Served: Indicate all the school districts located in the targeted recruitment areas.
- Name(s) of Individual Schools Served: Indicate the schools that the youth participants attend.

7. Was the program universally implemented or was it targeted at a specific at-risk or referred population? If targeted, what criteria were used to select participants and what was the target population? (Programs targeting an entire grade of students are considered universal.)
- Universal or targeted: The funded BBBS model is typically offered as a targeted community-based prevention program.
- Criteria for referral or participation: If a specific eligibility criterion was used, such as income requirements, age, or geographic residency, please explain it.
- Referral process:
- Referral sources:
- Did you encounter any barriers to recruitment or referral?

8. Describe the dosage or amount of service provided. Depending on the program you are implementing, this would be the number of lessons delivered, number of mentoring hours, or number of sessions participated in. Please include the level of participation by all program participants, as well as the percentage of participants who received the full amount of service (e.g., the percentage of youth served who received all of the lessons of a curriculum).

The dosage of BBBS is defined by:
- the number of hours of interaction between a Big and Little per week/month
- the number of contacts or activities engaged in by a Big and Little per week/month
- the length of a match relationship

PCCD's definition of full dosage is based on the research by Public/Private Ventures that led to the recognition of BBBS as an evidence-based program. The research indicates match contact of 3-4 hours, 3 times per month, for approximately 1 year.

Indicate the percentage of youth with 9 or more hours of mentor contact each month of their match relationship. Note: Add together the totals of the 3 performance measures/indicators collecting data on hours of contact. Then divide by the total number of youth that have been matched (and reported on in the performance measures). Multiply the quotient by 100. Please show the calculation.

Indicate the percentage of youth engaged in 3 or more one-on-one activities with their mentor each month. Note: Add together the totals of the 3 performance measures/indicators collecting data on one-on-one mentor activities. Then divide by the total number of youth that have been matched (and reported on in the performance measures). Multiply the quotient by 100. Please show the calculation.

Indicate the percentage of youth matched for 12 months or more with the same mentor.
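As a sketch of the dosage calculation described in the notes above (all counts here are hypothetical assumptions for illustration, not data from any grantee):

```python
# Hypothetical totals from the 3 performance measures/indicators that
# collect data on hours of mentor contact (one total per measure).
hours_measure_totals = [20, 18, 22]

# Total number of youth matched and reported on in the performance measures.
total_youth_matched = 80

numerator = sum(hours_measure_totals)            # youth meeting the contact threshold
percent = numerator / total_youth_matched * 100  # quotient multiplied by 100

# Show the calculation, not just the result, as the template requires.
print(f"{numerator} of {total_youth_matched} matched youth "
      f"({percent:.1f}%) had 9 or more hours of mentor contact per month")
```

The same arithmetic applies to the one-on-one activities percentage, with the three activity-measure totals in the numerator.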

SECTION 2 DESCRIPTION OF PROGRAM GOALS

In this section you will describe the short- and long-term goals of your implementation (i.e., changes in knowledge, attitudes, skills, intentions, and behaviors targeted by the program). Do not enter your impact/outcome data in Section 2.

1. Describe the theory of change (logic model) of the program. This is a description of how the program creates change in targeted behaviors by addressing certain risk and protective factors and teaching specific skills or knowledge. The logic model defines what you measure and how you determine if the program is having the intended impact.

Explain the theory on which BBBS is based and how caring relationships between youth and adults, supported by a dedicated organization, effect change. BBBS emphasizes the development of youth assets to counter the negative effects of risk factors, especially those risks that cannot be altered, such as living in poverty or in a single-parent home. BBBS has been proven to impact substance use initiation, violence, school attendance and performance, attitudes towards schoolwork, and peer and family relationships.

To answer this question, refer to the logic model and implementation manual created by the EPISCenter. The BBBSA Web site, the BBBSA practice standards, the Center for the Study and Prevention of Violence Blueprints summary, Public/Private Ventures' Making a Difference: An Impact Study of Big Brothers Big Sisters, and other research articles on program effectiveness are a few of the additional possible sources of information that can be used in your response.

2. Given the program's logic model described above, list the short-term or intermediate outcomes (sometimes called proximal indicators) that you have measured during the grant period. Include any baseline data used as comparison and the targets you are striving to meet.

Short-term outcomes are often seen as new skills that are fostered during the bonding that occurs within the mentoring relationship. BBBS has been proven to have longer-term impacts on substance use, violence, academics, and peer and caregiver relationships. Short-term changes in confidence, caring, and competence precede these outcomes. For example, long-term changes in academic performance are driven by short-term changes in scholastic competence, educational expectations, and confidence in academic ability. The following community-level risk and protective factors, as assessed through the Pennsylvania Youth Survey (PAYS), are targeted by BBBS as some of the short-term outcomes:

PAYS Protective Factors Targeted for an Increase
- Community Opportunities for Prosocial Involvement
- Community Rewards for Prosocial Involvement
- Belief in the Moral Order

Additional Protective Factors Targeted for an Increase
- Exposure to community/cultural norms that are not favorable to antisocial behaviors and substance use
- Promotion of healthy beliefs and clear standards
- Goal setting and positive future orientation
- Positive orientation to school and increased scholastic confidence and competency
- Positive parent-child affect and parental trust
- Improved relations with prosocial peers
- Communication and interpersonal skills
- Decision-making and critical-thinking skills
- Coping and self-management skills

PAYS Risk Factors Targeted for a Decrease
- Academic failure beginning in late elementary school/poor academic performance
- Lack of commitment to school
- Early initiation of drug use
- Early initiation of problem behavior/early and persistent antisocial behavior
- Rebelliousness
- Friends who engage in problem behaviors/friends' delinquent behavior and friends' use of drugs/negative peer influences
- Favorable attitudes towards antisocial behavior, problem behaviors, and substance use
- Favorable attitudes towards Alcohol, Tobacco, and Other Drug (ATOD) use
- Low perceived risk of drug use
- Family conflict

Use your performance measures, the data analyzed from the POE/YOS evaluations, and other data sources, such as the PA Youth Survey (PAYS), to report.

3. List the long-term behavioral outcomes (sometimes called distal outcomes) you are seeking to change in your target population (as established in your grant application). Include any baseline data used as comparison and the targets you are striving to meet.

In your grant application, you were required to identify locally prioritized risk and protective factors and behavioral outcomes specific to your community. Summarize the long-term behavioral outcomes you intended to impact in your community and in the population you targeted with BBBS.

BBBS has been proven to have impacts on substance use, violence, academics, and peer and caregiver relationships. The 18-month randomized control research trial found that Little Brothers and Little Sisters:
- Were at least 46% less likely than controls to initiate drug use (stronger effect for minority males and females)
- Were 27% less likely to initiate alcohol use (stronger effect for females)
- Were 1/3 less likely to hit someone
- Were 52% less likely to skip school
- Were 37% less likely to skip a class
- Showed small gains in grade point averages and were more confident of their performance in schoolwork
- Had improved relations and levels of trust with their parents and their families
- Had improved peer relationships

Involvement in BBBS is also believed to buffer the negative effects of the single-parent home. Children from single-parent homes often live in poverty, enjoy less parental time and supervision, and have fewer opportunities for positive youth development. They are twice as likely to drop out of high school and more likely to be placed in foster care or juvenile justice facilities. Females from single-parent homes have three times the risk of bearing children as unwed teenagers. Males whose fathers are absent face a much higher probability of growing up unemployed, incarcerated, and uninvolved with their own children.

School dropout, out-of-home placement, teen pregnancy, unemployment, and incarceration may be long-term targets your community has also identified. Use your performance measures, the data analyzed from the POE/YOS evaluations, and other data sources, such as the PA Youth Survey (PAYS), to report.

SECTION 3 INDICATORS OF PROGRAM IMPACT

In this section you will describe how you measured the impact your program had on its target audience, including what data you collected, how it was analyzed, and what the data tells you.

Notes: In this section, please indicate the total number of participants surveyed and included in the reported outcomes, and please include the numerator and denominator for all calculations.

1. Describe the process you used to measure the impact of your program. Include a brief description of any survey or outcome measurement instruments used. Provide details about the person(s) responsible for data collection, data entry, and data analysis (such as whether they were paid or volunteer and whether you used

internal or external evaluators). Include information about plans for continued data collection and analysis after the end of the grant period.

Describe the outcome measurement tool(s) used (i.e., YOS, POE, SOR). Indicate how evaluation tools were administered and at what time points (e.g., online, by phone, or in person; at intake, at service points, at annual anniversaries, or upon exit). Briefly indicate the person(s) responsible for data collection, entry, and analysis, as well as their role with the program and qualifications. Indicate whether staff or an external evaluator was used and any fees paid for evaluation services.

2. Indicate the total number of participants surveyed and explain any challenges in collecting or analyzing the survey data.

Indicate the total number of program participants completing the YOS, POE, or SOR.

3. Given the objectives and benchmarks identified in the grant application (as you described in SECTION 1 above), list any quantifiable changes in attitudes, skills, knowledge, intentions, or behaviors among the population served. The response should include ALL quantifiable measures of client change, and MUST include:
- changes in antisocial behavior as defined in the grant's performance measures
- the impact on placement and recidivism, if applicable
- any other peripheral benefits realized by the community, families, etc.

Indicate the total number of youth reaching an anniversary that were surveyed and for whom outcomes data were reported in your performance measures/indicators. Note: This response can be derived from the performance measure/indicator titled Number of Annual/12-month Program Outcome Evaluations (POE) or Youth Outcomes Surveys (YOS) Completed, Returned, and Analyzed.

Indicate changes in antisocial behavior by matched youth. A change in antisocial behavior is defined as a reduced likelihood of engaging in misconduct and risk-taking delinquent behaviors. Note: This response should include data reported in the performance measure/indicator titled YOS Number of Youth With Decreased Favorable Attitudes Towards Antisocial Behavior or POE Number of Youth With Increased Ability To Avoid Delinquency.

Additional significant quantitative data demonstrating changes in attitudes, beliefs, intentions, skills, knowledge, or behaviors can be reported from the YOS, POE, SOR, or other measurement tools. Additional qualitative information related to the development of youth participants, family benefits, sibling influences, or mentor-mentee relationship dynamics collected through case manager interactions, parent reports, or volunteer reports can be included.

4. If any outcome goals of the project were not met, please describe the reason(s), if known:

5. Please describe any ways in which you exceeded the expectations of the project as proposed.

Share your program successes! Discuss the ways in which you met targets set for recruitment, matches, and outcomes.

6. Describe how you measured your clients' satisfaction with the program and explain any available indicators of client satisfaction, quantitative or qualitative.

Indicate the use of client satisfaction measurement tools, including the SOR, and the information collected from them. Also address youth, caregiver, or volunteer testimonials or reports.

SECTION 4 INDICATORS OF IMPLEMENTATION QUALITY AND FIDELITY

In this section, you will describe how you assessed the quality and fidelity of implementation, and the results you found.

1. Describe how you assessed implementation quality and fidelity to the developer's proven program model:
- What components of implementation quality did you assess (e.g., adherence, dosage, quality of delivery, participant responsiveness, and program reach/targeted population)?
- For each component assessed, what type of measurement occurred (e.g., observations, implementer self-reports, etc.)?
- For each component assessed, what instruments/measurement tools were used?
- Who provided the instruments/measurement tools used (the developer, the EPISCenter, etc.)?
- For each component assessed, how often did you measure it?

2. Summarize the results of implementation quality and fidelity monitoring:
- Include areas in which you were successful in delivering high quality.
- Include areas you felt were weaknesses in program fidelity.
- Indicate how you would change your implementation to improve fidelity if you were starting the project from scratch.

Discuss the systems developed for quality monitoring that ensured quality service. Discuss the ways in which you adhered to the national BBBSA practice standards. Discuss your submission of the Annual Agency Self-Assessments and the required PCCD year-2 quality assurance review process. Share feedback received from the national office. Explain whether or not the format and frequency of program delivery met the standards of the researched Blueprints model. Discuss how staff and volunteers were trained to promote quality, and how you promoted their understanding of the research that has led to BBBS's recognition as a Blueprints program. Indicate the ways in which you tracked match intensity (e.g., contact hours, number of contacts/activities). Discuss how you promoted frequent contact. Discuss the SOR outcome findings and how they were used by case managers to promote the strength and length of matches. Discuss additional ways you promoted sustained matches.

SECTION 5 LESSONS LEARNED

1. With the experience and knowledge you now have, what would you have done differently?

2. What lessons have you learned that would benefit other sites that are considering implementing this program?

3. Describe the level of investment/buy-in among the practitioners implementing the project (e.g., the teachers implementing a PATHS curriculum or the therapists delivering MST services).

Describe the investment and commitment to model adherence and national practice standards by the program director, program coordinator, case managers, and any other relevant program support staff. Describe the orientation and training of volunteers and the level of investment in their mentoring relationships.

SECTION 6 PROGRAM SUSTAINABILITY

1. Who has the responsibility of sustainability planning for your program?

2. To whom and in what ways have you communicated the impact of the program (e.g., your Collaborative Board, School Board, CTC Board, County Commissioners, or other officials such as JPO or CYS, United Way, PTO, news outlets, other local prevention efforts, etc.)?

3. Since the beginning of the grant, what new relationships have you created that have enhanced, or could enhance, support for the program long term?

4. Since the beginning of the grant, what relationships have you strengthened that have enhanced, or could enhance, support for the program long term?

5. What specific steps have you taken to sustain the program beyond PCCD funding (e.g., detailing the budget, meeting with stakeholders, securing local investment, applying for additional grants)? If you have applied for additional funding from any source to support the program, please list the sources and the status of each application.

6. What were the results of the steps taken as described in #5? Include any accomplishments related to sustaining the program beyond PCCD funding.

THANK YOU FOR YOUR TIME!