1 Demystifying Data Collection ANA WEBINAR SERIES-MAY 21, 2015
2 acf.hhs.gov/ana HelpDesk (877)
3 Eastern Region Training & Technical Assistance Center 1 (888) A Resource of ANA
4 For Your Information: The webinar is being recorded. Materials are available. You are muted. Ask questions! Select the Hand button on the control panel on the right side of your screen.
5 Data Collection Introduction
6 Melina Salvador Tribal Home Visiting Evaluation Institute James Bell Associates
7 Accessing and Collecting Data for Effective Tribal Services: Tribal MIECHV Grantees' Experiences with Performance Measurement Data Melina Salvador, MA Tribal Home Visiting Evaluation Institute James Bell Associates
8 Overview Tribal Maternal, Infant, and Early Childhood Home Visiting program Tribal Home Visiting Evaluation Institute What can we learn from Tribal MIECHV grantees about accessing and collecting data? What has worked? What has been challenging? How are grantees using data? How are grantees engaging their communities? Implications for other programs that serve Tribal communities Port Gamble S'Klallam Tribe
9 Tribal Maternal, Infant, and Early Childhood Home Visiting Program Administered by ACF in cooperation with HRSA Funded through the Affordable Care Act (ACA); MIECHV includes a 3% set-aside for the tribal program 25 cooperative agreements awarded to Tribes, Tribal consortia, Tribal organizations, and urban Indian organizations 5-year grants that begin with a needs assessment and a planning year 3 cohorts: 13 awarded in FY 2010, 6 awarded in FY 2011, 6 awarded in FY 2012 Grantees must report to ACF on performance measures, conduct rigorous evaluation, and engage in continuous quality improvement activities
10 Tribal MIECHV Program Goals 1. Supporting the development of healthy, happy, and successful AIAN children and families 2. Implementing high-quality, culturally relevant, evidence-based home visiting programs in AIAN communities 3. Expanding the evidence base around home visiting interventions for Native populations 4. Supporting and strengthening cooperation and coordination and promoting linkages among various early childhood programs, resulting in coordinated, comprehensive early childhood systems
11 Tribal Home Visiting Evaluation Institute (TEI) Provides Technical Assistance on: Tracking and reporting on benchmarks (i.e., performance measures) Rigorous evaluation Data systems Continuous Quality Improvement Ethical dissemination and knowledge translation Taos Pueblo
12 TEI Partners: Technical Assistance Providers James Bell Associates, Inc. Johns Hopkins Bloomberg School of Public Health, Center for American Indian Health University of Colorado School of Public Health, Centers for American Indian and Alaska Native Mental Health MDRC Federal Partners Office of Planning, Research and Evaluation Administration for Children and Families, Office of the Assistant Secretary for Early Childhood Development Office of Child Care
13 Tribal MIECHV Data Collection
14 Data Requirements: Needs Assessment Needs Assessment - Understand Community Health and Well-being Key part of the planning year Grantees gather quantitative and qualitative data on key community health and well-being indicators Needs assessment findings inform home visiting model selection Grantees use the process as a community engagement effort
15 Data Requirements: Performance Benchmarks Benchmarks - Demonstrate Performance Improvement Over Time Legislatively mandated Grantees develop their own performance measures and indicators No client-level data reported 37 constructs TEI helps grantees develop a benchmark plan, prepare for and conduct data collection, and report data to ACF; data systems and data management are also TA topics
16 Data Requirements: Rigorous Evaluation Rigorous Evaluation - Answer a Focused Evaluation Question Using Rigorous Methods Grantees select a question using a community-engaged approach Use rigorous design to answer the question Focus on program impact, adaptations, or implementation strategy TEI helps grantees develop an evaluation question and design using the PICO approach and provides TA on developing IRB protocols, analysis, and ethical dissemination of results
17 Data Requirements: Continuous Quality Improvement Continuous Quality Improvement - Use Data to Identify and Test Changes to Improve Program Grantees select a CQI topic such as screening rates, family retention, or breastfeeding initiation Use benchmark or other program data in a collaborative process to make data-driven improvements to the program TEI assists grantees in preparing for and conducting Plan-Do-Study-Act (PDSA) cycles
18 Benchmarks, Continuous Quality Improvement, and Local Rigorous Evaluation The key difference is how the data are used and the comparisons made All are locally defined: grantees make decisions about the data to collect based on community priorities TEI provides TA on all 3 activities The same data can be used for multiple purposes
19 How are Grantees Accessing Data? Families (self-report): demographic information; service use (e.g., when was your last prenatal exam?); screenings (depression, substance abuse, development, domestic violence) Partnerships (administrative records): Head Start, child care, health clinics, child welfare programs, state partners Home visitors (documentation and observation): visits completed/missed; observational assessment of parents/children; process information (referrals, information provided, etc.)
20 Developing and implementing a performance monitoring system (benchmarks) Grantees develop and operationalize performance measures that correspond to 37 federally mandated benchmark constructs Grantees select appropriate measures that correspond to community priorities and provide useful data for continuous quality improvement Iterative process of collaborating on the benchmark plan as a program team, engaging with the community, and working with TEI: consult with the community, draft the benchmark plan, send it to the TEI liaison, TEI develops feedback, hold a benchmark review call, and receive written feedback
21 Rigorous evaluation Grantees develop an evaluation question that is important to the community and will contribute to the knowledge base Grantees select an evaluation design that is both rigorous and acceptable to the community Grantees are encouraged to narrow the focus of the evaluations: Measure small set of outcomes Examine component of HV program Focus on implementation strategy (e.g., recruitment, retention) Evaluate enhancement or adaptation
22 Evaluations Tailored to Community and Cultural Context Federal guidance allows for an individualized, as opposed to prescriptive, approach that respects tribal sovereignty and the diversity of communities Flexibility to define the evaluation question has led many grantees to examine cultural enhancements to home visiting models Flexibility to define performance measures results in benchmark plans that reflect community context
23 Evaluation Approach All knowledge will be generated through local evaluations No cross-site evaluation Evaluation questions are developed by grantees in consultation with their community to reflect local interests & priorities Evaluation questions are informed by findings of the needs assessment and connected to implementation decisions Tribal ownership of the evaluation process, data, and dissemination is respected IRB and Tribal approval is required
24 Evaluation Approach cont. Evaluations can be limited in size and scope A focused question is answered with rigorous design and methods Flexibility to focus evaluation on a component of home visiting Evaluations will inform grantees, communities, and the field about what works in implementing home visiting in Tribal communities Intensive technical assistance is provided to increase Tribal capacity and empowerment to conduct different types of evaluation
25 Lessons from Tribal MIECHV Grantees
26 Data Planning Supports Community Programs Grantees have used requirements to advance community-wide discussions about overall data needs: "Conversations took place with Head Start and Early Head Start to agree upon a way in which the programs could collaborate so that families were not receiving multiple screenings at the same time points." Grantees examined and improved program processes through benchmark planning: "The benchmark process required us to get concrete about program processes and outcomes."
27 Locally Defined Data: Meaningful, Feasible, and Capacity Building Grantees tailored measures to the community Increased Accuracy: "The benchmark data we do collect will be more accurate because our manner of collection fits our process and our program. This is an important way in which the individuality of each tribe and program was honored."
28 Grantees had to be thoughtful about balancing service provision and data collection needs Increased Feasibility: "Selection of data collection tools to be utilized for each benchmark construct was determined based on the feasibility of collecting meaningful data without undue burden to the client. For example, we stayed away from lengthy or unwieldy tools even if they were validated and considered reliable." White Earth Nation
29 The planning process facilitated individual skill and organizational capacity building Capacity Building: "The process allowed team [members] to expand their personal skills as well as overall organizational capacity in research and development of the appropriate constructs for a benchmark plan... As a result, this has added to their ability to serve the community and to also carry out future work."
30 Engaging Community Members in the Data Collection Plan Drawing on information from the community needs assessment: "We repeatedly referred to needs assessment results." "Community desire to focus on family assets and strengths, as well as parent involvement/engagement, was a key factor in the choice to use [certain] measures." Community Advisory Groups: "We went through each question with the council, discussing order of questions, purpose, wording, rephrasing, and what data the council wanted us to collect."
31 Grantees engaged community members in the review of potential measures: "The team refined the list of potential measures, and the evaluator put together a plan describing how the measures could be administered. This plan was presented to key representatives from the Task Force as part of a benchmark working group meeting. The list was further refined based on feedback, and then the plan was presented to the entire Task Force. After we received approval from the Task Force, it was submitted for approval to Federal partners."
32 Using the Data Grantees plan to use benchmark data for program monitoring and improvement: "We will meet monthly to discuss how the program is doing as reflected by the benchmark plan and create opportunities to reflect and make changes as needed..." Grantees are beginning to translate data for use with Tribal government, partner organizations, and community groups: "This information will be valuable to the entire tribal community when looking at health and parenting issues. Eventually, the team plans to share the information with other tribal departments and the tribal leadership for future planning and program development."
33 High Quality Locally Relevant Data Collection: A Lot of Work but Worth It
34 Grantees have needed to assess measures for utility and cultural relevance: "Choosing an appropriate tool was the most challenging part of developing the plan." "Having a cohort one year ahead of us in the process gave us examples of grantees using various surveys to talk to. We also read many articles about assessments and utilized the compendium of measures to find the appropriate measures."
35 Building a performance measurement system required time, resources, and considerable technical assistance: "At times, it appeared that a better process might have been to have an electronic template, with various [drop]-down choices under each benchmark area that already met appropriate language requirements and standards, from which the local team could make a selection. On the one hand this would have been quicker, but individuals involved might not have learned as much about all areas."
36 Evaluation Resources The Program Manager's Guide to Evaluation. Administration for Children and Families, U.S. Department of Health and Human Services (2nd ed., 2010) Indigenous Evaluation Framework: Telling Our Story in Our Place and Time (2009) W.K. Kellogg Foundation Evaluation Handbook (2004)
37 Takeaways for other tribal programs Strong partnerships facilitate the collection of strong data Accessing and collecting helpful data relies on an understanding of: program processes, community needs, current infrastructure, and grant requirements
38 Takeaways for other tribal programs Accessing and collecting data can be challenging Context of unfamiliarity and/or mistrust of data collection Burden on staff and families Requires (sometimes costly) technology and expertise Having access to useful, accurate data can be beneficial Program improvement Improved collaboration Sustainability
40 Thank You!! Melina Salvador, MA Research Associate James Bell Associates, Inc. Phone:
41 THANK YOU FOR JOINING US Eastern Region Training & Technical Assistance Center (888)