Stages of Implementation Analysis: Where Are We?




National Implementation Research Network (NIRN)
Frank Porter Graham Child Development Institute
University of North Carolina at Chapel Hill

Citation and Copyright

This document is based on the work of the National Implementation Research Network (NIRN).

© 2013 Karen Blase, Melissa van Dyke, and Dean Fixsen

This content is licensed under the Creative Commons license CC BY-NC-ND (Attribution-NonCommercial-NoDerivs). You are free to share, copy, distribute, and transmit the work under the following conditions:

Attribution: You must attribute the work in the manner specified by the author or licensor (but not in any way that suggests that they endorse you or your use of the work).
Noncommercial: You may not use this work for commercial purposes.
No Derivative Works: You may not alter, transform, or build upon this work.

Any of the above conditions can be waived if you get permission from the copyright holder.
Email: nirn@unc.edu
Web: http://nirn.fpg.unc.edu

The mission of the National Implementation Research Network (NIRN) is to contribute to the best practices and science of implementation, organization change, and system reinvention to improve outcomes across the spectrum of human services.

When creating Implementation Teams to provide supports that are effective, integrated, efficient, and sustainable, the first task is to map the current implementation landscape. The goal is to build on current strengths and to collect information that informs planning of the best path toward developing implementation capacity in the provider organization.

Background

Human service provider organizations (e.g. child welfare units, child care settings, community centers, education settings, healthcare clinics, residential care facilities) are attempting to make use of interventions (e.g. evidence-based programs and other innovations) to improve outcomes for children, families, individuals, and communities. For the past few decades, policy makers, researchers, and technical assistance providers have focused on interventions. The same attention and support have not been given to the implementation of those interventions. Consequently, in most cases human service organizations have been left to their own ingenuity to figure out how to make use of evidence-based programs. In a few instances, evidence-based program developers have created a purveyor group that can provide effective supports for implementation of that intervention. The lack of attention to implementation methods has led to what some have termed the quality chasm: we know what to do, but we are not making use of that knowledge to improve outcomes in human services.

The National Implementation Research Network encourages policy makers, practitioners, and communities to make greater use of evidence-based programs and other innovations (collectively called interventions in this document). The United States far outspends any other country on human services, yet our outcomes rank near the bottom of the 30 or so most developed countries globally. Evidence-based interventions hold the promise of better outcomes. Common sense tells us that children, families, individuals, and communities cannot benefit from interventions they do not experience. Thus, the promise of evidence-based interventions will not be realized unless they are used fully and effectively in practice, every day, for everyone who could benefit. The growing science of implementation and the documentation of implementation best practices provide guidance for effectively and efficiently supporting evidence-based programs in human service provider organizations.

To realize benefits on a socially important scale, policy makers and directors of provider organizations must invest in creating effective implementation supports for practitioners.

Implementation Supports for Interventions

Implementation capacity is embodied in Implementation Teams. An Implementation Team consists of three or more full-time individuals who know the interventions well, are skilled specialists in implementation science and best practices, and are well versed in the many uses of improvement cycles to continually advance practices, organizations, and systems. Implementation Team members do the work of implementation in organizations and systems. To create an Implementation Team, current positions are re-assigned, functions are repurposed, team members develop new competencies, and reporting relationships are realigned so that no new costs are added. Implementation Teams are built into organization and system structures to provide lasting and sustainable supports for using a variety of evidence-based interventions and other innovations fully and effectively. Implementation Team members conduct ImpleMap interviews.

Readers are encouraged to visit the National Implementation Research Network website (http://nirn.fpg.unc.edu) and the State Implementation and Scaling-up of Evidence-based Programs website (www.scalingup.org) for further information about implementation science, Implementation Teams, and infrastructures to support implementation on a large scale.

Stages of Implementation Analysis

EBP or Evidence-Informed Innovation:

This tool provides the team with the opportunity to plan for and/or assess the use of stage-based activities to improve the success of implementation efforts for EBPs or evidence-informed innovations. The tool can be used to assess current stage activities (e.g. "We are in the midst of Exploration") or past efforts related to a stage (e.g. "We just completed most of Installation. How did we do? What did we miss?").

For activities scored as Not Yet Initiated, the planning team may wish to:
a) Examine the importance of the activity in relation to achieving success
b) Identify barriers to completion of the activity
c) Ensure that an action plan is developed (sub-activities, accountable person(s) identified, timeline, evidence of completion) and monitored

A strength-of-stage score can be computed for each stage to help guide and measure effective use of the stages.
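The tool itself does not spell out the arithmetic behind "Average % in Each Category" or the strength-of-stage score. The sketch below shows one plausible way a team might tally its ratings, assuming each category percentage is simply the share of activities given that rating and the strength-of-stage score is the percentage of activities rated In Place; the function name, rating labels, and scoring rule are illustrative assumptions, not part of the NIRN tool.

# Hypothetical scoring helper (assumed, not prescribed by the tool):
# percent per category = share of activities with that rating;
# strength-of-stage score = percent of activities rated "In Place".
from collections import Counter

RATINGS = ("In Place", "Initiated or Partially In Place", "Not Yet Initiated")

def score_stage(activity_ratings):
    """Return (percent_by_category, strength_of_stage) for one stage.

    activity_ratings: list of rating strings, one per stage-related activity.
    """
    counts = Counter(activity_ratings)
    total = len(activity_ratings)
    percent_by_category = {r: 100.0 * counts[r] / total for r in RATINGS}
    strength_of_stage = percent_by_category["In Place"]  # assumed definition
    return percent_by_category, strength_of_stage

# Example: an Exploration analysis with ten activities
exploration = ["In Place"] * 6 + ["Initiated or Partially In Place"] * 3 + ["Not Yet Initiated"]
percents, strength = score_stage(exploration)
print(percents)   # {'In Place': 60.0, 'Initiated or Partially In Place': 30.0, 'Not Yet Initiated': 10.0}
print(strength)   # 60.0

A team using a different rule (for example, weighting partially initiated activities at half credit) would only need to change the strength-of-stage line.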

Stage-Related Activities for Exploration

Indicate whether the analysis is Current or Past, and rate each activity as In Place, Initiated or Partially In Place, or Not Yet Initiated.

1. Form an Implementation Team or re-purpose/expand a current group
2. Develop a communication plan to describe the exploration process (e.g. activities, participants, timeline, benefits, risks) to key stakeholder groups
3. Analyze data to determine need and the prevalence of need
4. Select targeted areas to address the need (e.g. child, adult, family outcomes)
5. Review and identify programs, practices, and interventions that match the target area and address the need
6. Review and discuss eligible programs and practices (use the Hexagon) in relation to:
a) Need
b) Fit
c) Resources and Sustainability
d) Strength of Evidence
e) Readiness for Replication
f) Capacity to Implement
7. Select programs/practices for continued exploration based on the assessment results above
8. Develop methods to promote exploration and assess buy-in across the range of impacted stakeholders
9. Analyze the information and results of exploration activities
10. Work group makes a recommendation to the appropriate level (e.g. state-level team, local partners, alliance, funders)

Average % in Each Category:
Strength of Exploration Score:

What should we do to further strengthen our Exploration process? Are there Exploration activities we need to revisit? And what are the next right steps?

Stage-Related Activities for Installation

Indicate whether the analysis is Current or Past, and rate each activity as In Place, Initiated or Partially In Place, or Not Yet Initiated.

1. Identify structural and functional changes needed (e.g. policies, schedules, space, time, materials, re-allocation of roles and responsibilities, new positions needed)
a) at the provider/agency level
b) at the local level (e.g. collaborative groups)
c) at the District or County level
2. Make the structural and functional changes needed to initiate the new program, practice, or framework
a) at the provider/agency level
b) at the local level (e.g. collaborative groups)
c) at the District or County level
3. Develop selection protocols for first implementers
a) at the provider/agency level
b) at the local level (e.g. collaborative groups)
c) at the District or County level
4. Select first implementers
a) Agency administrators
b) Practitioners/Front line
c) Other:
5. Identify training resources and logistics
6. Train the first cohort of implementers
a) Practitioners
b) Agency administrators
c) Trainers:
d) Coaches:
e) Other:
7. Develop coaching and support plans for practitioners
8. Evaluate readiness and sustainability of data systems at the consumer level (e.g. child, adult, family)
9. Evaluate readiness and sustainability of the fidelity data system
10. Analyze and problem-solve around the sustainability of training, coaching, and data systems
11. Establish communication links to report barriers and facilitators during the next stage (e.g. Initial Implementation)

Average % in Each Category:
Strength of Installation Score:

What might we do to further strengthen our Installation process? Are there Installation activities we need to revisit? And what are the next right steps to engage in or revisit Installation activities?

Stage-Related Activities for Initial Implementation

Indicate whether the analysis is Current (monitored at least bi-weekly for the first 4 months) or Past, and rate each activity as In Place, Initiated or Partially In Place, or Not Yet Initiated.

1. Communication plan(s) developed to inform stakeholders of launch dates and activities and to convey support
2. Communication protocols developed for identifying barriers and adaptive challenges and for problem solving at each level (e.g. weekly implementation team meetings to identify issues, create plans, review results of past problem-solving efforts, and forward issues to the next level as appropriate)
3. Leadership develops a support plan to promote persistence
4. Written coaching plan developed at the relevant levels (e.g. school, teacher; agency, practitioner)
5. Coaching system in place (see Best Practices for Coaching Systems)
6. Data systems in place for measuring and reporting outcomes
7. Data systems in place for measuring and reporting fidelity
8. Document that reviews initial implementation challenges; revisions recommended for the Implementation Drivers based on the review of challenges and with sustainability considerations:
a) Recruitment and Selection
b) Training and Booster Training
c) Coaching processes and data
d) Outcome data measures and reporting process
e) Fidelity measures and reporting processes
f) Agency administrative policies and practices
g) Other levels of administrative policies and practices
9. If appropriate, plan for the next cohort of implementers

Average % in Each Category:
Strength of Initial Implementation Score:

What might we do to further strengthen our Initial Implementation process? Are there Initial Implementation activities we need to revisit? And what are the next right steps to engage in or revisit Initial Implementation activities?

Stage-Related Activities for Full Implementation

Indicate whether the analysis is Current (every 6 months) or Past (when there has been a shift back to Initial Implementation due to turnover), and rate each activity as In Place, Initiated or Partially In Place, or Not Yet Initiated.

1. Monitoring and support systems are in place for each Implementation Driver:
a) Recruitment and Selection
b) Training and Booster Training
c) Coaching processes and data
d) Outcome data measures and reporting process
e) Fidelity measures and reporting processes
2. Feedback process from practitioners to agency administrators is in place and functional (e.g. practitioner participation on Leadership and Implementation Teams, changes made to facilitate best practices)
3. Feedback process from agencies (e.g. schools, care settings, clinics) to the next levels of administration is in place and functional
4. Feedback process to the State or to TA support is in place and functional (e.g. a system is in place for agencies to feed information and feedback to the appropriate State and/or TA entities)
5. Agency Leadership and Implementation Teams use data to make decisions (e.g. clinical outcomes, behavior, academics, and fidelity)
6. Improvement processes are employed to address issues through the use of data, development of plans, monitoring of plan execution, and assessment of results (PDSA cycles)

Average % in Each Category:
Strength of Full Implementation Score:

What might we do to further strengthen and maintain Full Implementation? Are there activities we need to revisit? And what are the next right steps to engage in or revisit Full Implementation activities?