Conducting Implementation-Informed Evaluations: Practical Applications and Lessons from Implementation Science August 19, 2015 A joint webinar with AERA Division H
Today's Hosts and Presenter: Teresa Duncan, Director, REL Mid-Atlantic (Host); Caryn Ward, Senior Implementation Specialist/Scientist, University of North Carolina at Chapel Hill (Presenter); Rolf Blank, Senior Fellow, NORC (Co-Host) 2
Today's Agenda: Welcome; About the REL Program; Collaboration between REL Mid-Atlantic and Division H; About AERA's Division H; Presentation; Questions and Answers; Wrap Up 3
Why this webinar? Meaningful organizational change is hard. A focus on evidence-based practices is important, but these practices are only effective when fully implemented. Implementation science has lessons to share. 4
The Regional Educational Laboratories 5
About the Regional Educational Laboratory (REL) Program: Began in 1965. 10 Regional Educational Laboratories across the United States. The 10 RELs respond to the education needs of their regions by working in alliances with practitioners and by producing rigorous research, providing technical assistance, and disseminating useful and accessible research-based products. This webinar is an example of our work. 6
Services Provided by the RELs 7
About Division H: Research, Evaluation, and Assessment in Schools. One of 12 divisions of the American Educational Research Association. Division H is AERA's home for people dedicated to applied research and evaluation, along with assessment and accountability, in the schools. Nearly 2,500 members. This webinar is part of the work of the Division H Professional Development Committee. 8
Keep on the lookout for news about a full-day pre-conference workshop on implementation science at AERA 2016 (April 8-12, Washington, DC). Maximum number of attendees: 50. 9
Conducting Implementation-Informed Evaluations: Practical Applications and Lessons from Implementation Science RELMA/AERA Division H Webinar Caryn S. Ward, PhD August 2015
SISEP Center State Implementation and Scaling up of Evidence-based Practices (SISEP) www.scalingup.org United States Department of Education - Office of Special Education 11
SISEP Team Karen Blase, Ph.D. Dean Fixsen, Ph.D. Barbara Sims Kathleen Ryan Jackson, D. Ed. Jonathan Green 12
http://implementation.fpg.unc.edu 13
Learning Objectives Participants in the webinar will: Increase their knowledge and understanding of the Active Implementation Frameworks; Explore examples of an implementation-informed Decision Support Data System, including fidelity assessments; Identify how Improvement Cycles can be used to carry out practices and effective implementation methods 14
What Is Challenging About Implementation and Evaluation of Education Initiatives? 15
Why Is Implementation of * * So Hard? It is a change that challenges old practices. Some control and power is taken away from people. It is a multi-year process (avg. 4-8 years to implement). We don't always have the expertise and resources needed. CHANGE. CHANGE. CHANGE. CHANGE!!! More accountability for the adults and the system. It is not a linear, step-by-step process. One size does not fit all. 16
Implementation Science Literature Implementation is defined as "a specified set of activities designed to put into practice an activity or program of known dimensions" (Fixsen, Naoom, Blase, Friedman & Wallace, 2005). [Graphic: Research - IMPLEMENTATION GAP - Practice] Why Focus on Implementation? "Students cannot benefit from innovations they do not experience." (Fixsen) 17
Formula for Success WHAT: Effective Innovations WHO & HOW: Effective Implementation WHERE: Enabling Contexts WHY: Educationally Significant Outcomes 18
Formula for Success WHAT: Effective Innovations ("Students cannot benefit from instruction they do not experience") WHERE: Enabling Contexts WHY: Educationally Significant Outcomes 19
Formula for Success WHAT: Effective Innovations WHO & HOW: Effective Implementation WHERE: Enabling Contexts WHY: Educationally Significant Outcomes 20
Innovations Investing in innovation Survey of 5,847 schools Avg. 9 innovations per school 7.8% had evidence to support effectiveness A drain on precious resources! US Department of Education, 2011 21
Formula for Success WHAT: Effective Innovations WHO & HOW: Effective Implementation WHERE: Enabling Contexts WHY: Educationally Significant Outcomes 22
Implementation: Longitudinal Studies of a Variety of Comprehensive School Reforms (CSR)
Evidence-based | Actual Supports (Years 1-3) | Outcomes (Years 4-5)
Every teacher trained | Fewer than 50% of the teachers received some training | Fewer than 10% of the schools used the CSR as intended
Every teacher continually supported | Fewer than 25% of those teachers received support | The vast majority of students did not benefit
Aladjem & Borman, 2006; Vernez, Karam, Mariano, & DeMartini, 2006 23
Effective Implementation Best data show these methods, when used alone, do not result in use of innovations as intended: diffusion/dissemination of information; training; passing laws/mandates/regulations; providing funding/incentives; organization change/reorganization. 5 to 15% return on investment. NECESSARY BUT NOT SUFFICIENT 24
Formula for Success WHAT: Effective Innovations WHO & HOW: Effective Implementation WHERE: Enabling Contexts WHY: Educationally Significant Outcomes 25
Our Current Context GOAL: Improved Outcomes for ALL Students 26
What DOES work? 27
Formula for Success WHAT: Effective Innovations WHO & HOW: Effective Implementation WHERE: Enabling Contexts WHY: Educationally Significant Outcomes 28
Making It Happen: Active Implementation Frameworks EFFECTIVE & USABLE INTERVENTIONS: What exactly are people saying and doing that makes things better for children and families? STAGES: What steps lead to successful implementation? DRIVERS: What critical supports are needed to make this change? What is the infrastructure? TEAMS: Who takes responsibility for and helps guide the change process? IMPROVEMENT CYCLES: How can we create more hospitable environments, efficiently solve problems, and get better? Copyright Dean Fixsen and Karen Blase 29
Implementation Drivers Common features of successful supports to help make full and effective use of a wide variety of innovations 30
[Implementation Drivers graphic] Implementation Drivers = Infrastructure. Competency Drivers: Selection, Training, Coaching, Performance Assessment (Fidelity). Organization Drivers: Decision Support Data System, Facilitative Administration, Systems Intervention. Leadership: Technical and Adaptive. The drivers are Integrated & Compensatory and support Effective Instructional Practices & System Change, leading to Improved Student Outcomes. Fixsen & Blase, 2008 31
Leadership Different challenges call for different strategies (Heifetz & Laurie): Technical Strategies and Adaptive Strategies. Fixsen & Blase, 2008 32
Competency Drivers (Performance Assessment (Fidelity), Coaching, Training, Selection) develop, improve, and sustain competent and confident use of the innovation through: careful selection of new and lead staff; designing and employing effective training plans; designing and supporting a coaching system; and routine use of fidelity assessments to inform the process. Fixsen & Blase, 2008 33
Competency Drivers Tools You Can Use: Training Plan Template, Coaching System Worksheet, Coaching Service Delivery Plan Template 34
[Implementation Drivers graphic, repeated] Fixsen & Blase, 2008 35
Decision Support Data System (DSDS) 36
Best Practices of DSDS Data Elements 37
Best Practices of DSDS Data Processes 38
Best Practices of DSDS Data Use 39
[Implementation Drivers graphic, repeated] Fixsen & Blase, 2008 40
Example: MTSS Implementation
Construct | Measure | Frequency/Focus
Outcome (long & short term) | State Academic Assessments; Benchmark Assessments; Universal Screening; Progress Monitoring; Discipline Data | 3x a year to 1x a year/student
Fidelity of use of an innovation | School Wide Assessment of MTSS (SAM; Stockslager et al.) | 1x a year - School
Fidelity of use of an innovation | Observational Tool for Instructional Supports & Systems (OTISS; Fixsen et al.) | 6x a year - Classroom
Use of evidence-based implementation processes | Building Drivers Best Practices Assessment (NIRN) | 2x a year - Building Implementation Team
Use of evidence-based implementation processes | District Capacity Assessment (NIRN) | 2x a year - District Implementation Team
Scaling data: stage of implementation by school | Stages of Implementation Analysis: Where Are We Now (NIRN) | 1x a year
41
Implementation-Informed Evaluations Summary: What type of data should be collected? Readiness and ability of participants to engage in the innovation; the use of evidence-based implementation processes to support the use of a selected innovation; fidelity of use of an innovation for each practitioner; implementation climate; and how well defined and operationalized the innovation is for use in typical service settings 42
Lessons Learned... Data calendars are essential: who, what, when, and where for each level of the unit. Get data flowing ASAP. Exploration Stage: use outcome data to determine need; use implementation data to determine fit, capacity, etc. Fidelity data should be flowing within 6 months of Installation Stage work 43
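The "who, what, when, where for each level of the unit" point lends itself to a simple structured record. The sketch below is a minimal illustration in Python, not a SISEP or NIRN tool; the class and field names are hypothetical, and the entries reuse measures from the MTSS example above.

```python
from dataclasses import dataclass

@dataclass
class DataCalendarEntry:
    """One row of a data calendar: who collects what, when, and at which level of the unit."""
    measure: str    # what is collected (outcome, fidelity, or implementation measure)
    owner: str      # who is responsible for collecting and reporting it
    schedule: str   # when / how often the data should be flowing
    level: str      # level of the unit: student, classroom, building, or district

# Hypothetical entries drawn from the MTSS example above; owners are placeholders.
data_calendar = [
    DataCalendarEntry("Universal Screening", "School assessment team", "3x a year", "student"),
    DataCalendarEntry("OTISS observations", "Instructional coaches", "6x a year", "classroom"),
    DataCalendarEntry("Building Drivers Best Practices Assessment", "Building Implementation Team", "2x a year", "building"),
    DataCalendarEntry("District Capacity Assessment", "District Implementation Team", "2x a year", "district"),
]

# Quick check that data are flowing at every level of the unit.
print(sorted({entry.level for entry in data_calendar}))
```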
Implementation Questions: What if we are lacking a well-defined or operationalized innovation, or a fidelity assessment? 44
Improvement Cycles Changing on purpose to support the new way of work. PDSA Cycle: Plan - What did you intend to do? Do - Did you do it? Study - What happened? Act - What can be changed and improved? Cycle - Continue until the goal is reached 45
Improvement Cycles The PDSA cycle logic applies to: Discovering/developing essential components of innovations Developing competency drivers Initiating organization change to accommodate/more precisely support the innovation and the implementation methods 46
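To make the Plan-Do-Study-Act logic concrete, here is a minimal sketch of the loop in Python. It is illustrative only and not part of the Active Implementation Frameworks materials; every function and field name is hypothetical, standing in for the real planning, doing, studying, and acting that a team carries out.

```python
from dataclasses import dataclass, field

@dataclass
class PDSACycle:
    plan: str                                     # Plan: what do you intend to do?
    done: bool = False                            # Do: did you do it?
    findings: list = field(default_factory=list)  # Study: what happened?
    changes: list = field(default_factory=list)   # Act: what can be changed and improved?

def improve(next_plan, do, study, act, goal_reached, max_cycles=20):
    """Repeat Plan-Do-Study-Act until the goal is reached (with a safety cap)."""
    history = []
    while len(history) < max_cycles:
        cycle = PDSACycle(plan=next_plan(history))  # Plan
        cycle.done = do(cycle.plan)                 # Do
        cycle.findings = study(cycle)               # Study
        cycle.changes = act(cycle)                  # Act
        history.append(cycle)
        if goal_reached(history):                   # Cycle: continue until the goal is reached
            break
    return history

# Toy usage: pretend the goal is met once three cycles have produced findings.
history = improve(
    next_plan=lambda h: f"Cycle {len(h) + 1}: try the revised coaching protocol",
    do=lambda plan: True,
    study=lambda cycle: ["what happened during this cycle"],
    act=lambda cycle: ["what to change before the next cycle"],
    goal_reached=lambda h: len(h) >= 3,
)
print(len(history))  # 3
```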
Improvement Cycles require: Discipline to have a plan and stick to it. Studying and acting, not just "plan, do; plan, do; plan, do" (as one colleague puts it). Repetition until the goal is reached or the problem is solved (as another colleague says: often thwarted, never stymied) 47
Improvement Cycles Rapid-Cycle Problem Solving: rapid cycle, short-term, few people; initially identifying problems and solutions. Usability Testing: medium cycle, medium-term, more people; testing the feasibility of solutions and developing administrative supports. Practice-Policy Communication Cycle: longer cycle, longer-term, layers of people; executive leadership and others with authority to change system units 48
Usability Testing A planned series of PDSA cycles to test an innovation, components of an innovation, or implementation processes. Makes use of a series of PDSA cycles to refine and improve the innovation elements or the implementation processes. Used proactively to test the feasibility and impact of a new way of work prior to rolling out the innovation or implementation processes more broadly. Occurs during initial implementation of the process or procedure being tested 49
Usability Testing Process Use the PDSA cycle processes with small groups of 4 or 5 typical users. Plan - Plan to use OTISS. Do - Engage in training and conduct live co-observations. Study - Debrief as a team and identify successes, changes, and improvements needed in the process of using the tool (training, introducing it to principals and teachers, the process of conducting observations, inter-rater agreement, etc.). Act - Apply those changes to the next set of users. Cycle - Go again with the next set of users until the goal is reached. Repeat the process (PDSA) with the next set of implementers 4 or 5 times. 50
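One way to picture this cohort-by-cohort process is as a loop that applies the debrief changes before the next small group is tested. The sketch below is a hedged illustration in Python: the 0.85 inter-observer agreement threshold echoes the example criterion given later in the deck, and the function names, cohort data, and agreement scores are all hypothetical.

```python
def usability_test_rounds(cohorts, run_cohort, agreement_target=0.85):
    """Run PDSA-style usability cycles with small cohorts until an exit criterion is met.

    cohorts: small groups of 4 or 5 typical users, tested one group at a time.
    run_cohort: callable that trains a cohort, conducts the co-observations, and returns
                (inter-observer agreement, list of changes identified in the debrief).
    """
    applied_changes = []
    for round_number, cohort in enumerate(cohorts, start=1):
        # Plan / Do: train this cohort and conduct live co-observations.
        agreement, changes = run_cohort(cohort, applied_changes)
        # Study / Act: record the debrief findings and carry them into the next round.
        applied_changes.extend(changes)
        # Cycle: stop when the criterion is met; otherwise go again with the next cohort.
        if agreement >= agreement_target:
            return round_number, applied_changes
    return len(cohorts), applied_changes

# Toy usage with made-up agreement scores that improve as changes are applied.
scores = iter([0.70, 0.80, 0.88])
rounds_needed, changes_made = usability_test_rounds(
    cohorts=[["user1", "user2", "user3", "user4"]] * 4,
    run_cohort=lambda cohort, prior_changes: (next(scores), ["refine an item definition"]),
)
print(rounds_needed)  # 3
```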
Usability Testing Pre-Requisites: A defined "it" (the innovation); prepared people; a clear teaming structure; capacity to collect and report data 51
Why Usability Testing Is a very efficient way to develop, test, and refine: Innovation critical components Implementation supports Communication Training Data processes & usefulness of reports End user experience is improved much more by 4 tests with 5 users each than by a single test with 20 users IT IS NOT A PILOT! 52
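The claim that four 5-user tests beat a single 20-user test can be illustrated with a rough problem-discovery model that is not from the slides: assume each user independently surfaces a given problem with probability p (0.3 is an assumed value here), so one round with n users surfaces about 1 - (1 - p)^n of the problems. A single large test spends most of its users on diminishing returns, while four small rounds buy three rounds of fixes in between.

```python
def share_of_problems_found(users, p=0.3):
    """Share of existing problems surfaced in one test round, assuming each user
    independently encounters a given problem with probability p (an assumed value)."""
    return 1 - (1 - p) ** users

# One big test: 20 users see almost every problem, but there is only one round of fixes.
print(f"Single test, 20 users: {share_of_problems_found(20):.1%} of problems surfaced")

# Iterative testing: each 5-user round already surfaces most of the current problems,
# and the design gets improved three times between the four rounds.
print(f"Each 5-user round:     {share_of_problems_found(5):.1%} of current problems surfaced")
```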
Choosing Usability Targets Choose components, implementation supports, or data processes that are important to getting outcomes AND that you are worried will not be executed well, are logistically difficult to execute (setting, resources, competing activities, barriers), and/or are outside the typical comfort zone or roles of the person doing or receiving them. K. Blase 53
Usability Testing: Example Observational Tool for Instructional Supports & Practices (OTISS; Fixsen et al., 2015) Goals & Selected Targets: Inform OTISS development (process, items); Inform implementation supports (communication, training, data process); Scale to all of the state's instructional coaches (50+) 54
Usability Testing: Testing Dimensions Limited number of cases or events within a given test: enough to sample with variability in order to detect systematic problems rather than individual challenges; staged to efficiently/rapidly get a sense of challenges; small enough to give you quick early returns of data. Metrics are both functional and practical to collect; quantitative and qualitative information are both useful 55
Usability Testing Plan Cohort 1: State team (N = 6) Cohort 2: Region Leads (N = 8) Cohort 3: Selected Coaches (N = 8) & Region Leads (N = 2) Cohort 4: Coaches & Leads (N = 50) 56
Creating a Plan For Each Cycle Which team or team members lead the effort? (e.g., state team of 5 leads) What's the scope of the usability test? (e.g., how many, how long, where, when, type of classrooms) What are the criteria for identifying who will be included in the test? (e.g., observers and sites) What are the criteria for needing another cycle or declaring it good enough? (e.g., 85% inter-observer agreement; 95% of steps on protocol followed) 57
Creating a Plan For Each Cycle What are the relevant measures/key outputs that can be quickly revealed? (e.g., opinions of teachers or principals, percentage of assessments done on schedule) Who will be responsible for reporting the data and on what schedule? (e.g., observers enter data at the time of observation, debrief occurs within 24 hours, aggregated reporting within 5 days of observations) Who will be responsible for summarizing the data? (e.g., office staff, aggregated reporting within 5 days of observations) Who will study the data and decide next steps? (e.g., state team) 58
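The planning questions on these two slides can be captured as a simple record with an explicit "good enough" test. The sketch below is a hypothetical illustration in Python: the 85% inter-observer agreement and 95% protocol-adherence thresholds come from the example criteria above, while the field names and sample values are placeholders.

```python
from dataclasses import dataclass

@dataclass
class CyclePlan:
    lead_team: str            # which team or team members lead the effort
    scope: str                # how many, how long, where, when, type of classrooms
    inclusion_criteria: str   # who will be included in the test (observers and sites)
    data_reporter: str        # who reports the data and on what schedule
    data_summarizer: str      # who summarizes the data
    decision_makers: str      # who studies the data and decides next steps
    min_agreement: float = 0.85           # example criterion: inter-observer agreement
    min_protocol_adherence: float = 0.95  # example criterion: steps on protocol followed

    def good_enough(self, agreement, protocol_adherence):
        """Decide whether another cycle is needed or the test can be declared good enough."""
        return agreement >= self.min_agreement and protocol_adherence >= self.min_protocol_adherence

plan = CyclePlan(
    lead_team="State team of 5 leads",
    scope="4 observers, 2 weeks, elementary classrooms",
    inclusion_criteria="Volunteer observers and sites from cohort 1",
    data_reporter="Observers, at the time of observation",
    data_summarizer="Office staff, aggregated within 5 days",
    decision_makers="State team",
)
print(plan.good_enough(agreement=0.88, protocol_adherence=0.97))  # True -> no further cycle needed
```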
Usability Testing Plan Cohort 1: State team (N = 6) (Cohort 1 = Core Team) Cohort 2: Region Leads (N = 8) Cohort 3: Selected Coaches (N = 8) & Region Leads (N = 2) Cohort 4: Coaches & Leads (N = 10) 59
Usability Testing Results Usability Testing resulted in: Changes to communication processes and tools with principals and instructional staff Changes to training materials Different videos, emphasis of training points, revision of slides, etc. Minor adjustments to observation protocol Revisions to operational definitions of items Minor adjustments to data collection processes Lessons learned re scheduling 60
Get Started, Get Better K. Blase 61
Usable Interventions Tools You Can Use: Hexagon Tool, Practice Profiles 62
Improvement Cycles Tools You Can Use PDSA Worksheet Communication Protocol Worksheet 63
Want more information? To learn more about the teams and what we do: National Implementation Research Network: http://nirn.fpg.unc.edu SISEP center: http://sisep.fpg.unc.edu To learn more about implementation science via our online learning platform: Active Implementation Hub: http://implementation.fpg.unc.edu 64
http://implementation.fpg.unc.edu 65
Questions? 66
Get Connected! www.scalingup.org SISEP @SISEPcenter For more on Implementation Science http://nirn.fpg.unc.edu www.globalimplementation.org 67
For More Information Allison Metz Allison.metz@unc.edu Caryn Ward Caryn.ward@unc.edu Frank Porter Graham Child Development Institute University of North Carolina Chapel Hill, NC http://nirn.fpg.unc.edu/ www.scalingup.org www.globalimplementation.org 68
Implementation Science Implementation Research: A Synthesis of the Literature Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M. & Wallace, F. (2005). Implementation Research: A Synthesis of the Literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network (FMHI Publication #231). HTTP://NIRN.FPG.UNC.EDU 69
Copyright Dean Fixsen and Karen Blase This content is licensed under Creative Commons license CC BY-NC-ND, Attribution-NonCommercial-NoDerivs. You are free to share, copy, distribute and transmit the work under the following conditions: Attribution You must attribute the work in the manner specified by the author or licensor (but not in any way that suggests that they endorse you or your use of the work); Noncommercial You may not use this work for commercial purposes; No Derivative Works You may not alter or transform this work. Any of the above conditions can be waived if you get permission from the copyright holder. http://creativecommons.org/licenses/by-nc-nd/3.0 70
Your input is very important to us! https://www.surveymonkey.com/r/jwnsv36 71