WHAT IS QUALITY DATA FOR PROGRAMS SERVING INFANTS AND TODDLERS?



Head Start programs serving infants and toddlers collect data, and lots of it. Children develop rapidly during the first three years of life. Families' needs change just as rapidly. To ensure that programs are responsive to children's and families' evolving needs and that children and families are adequately supported in reaching their goals, staff collect and track a substantial amount of information (or data). These data are used to inform program planning and decision making at the child and program levels. (See 1304.51(a)(1) and (2) and 1307.3(b)(2)(i) and (ii).)

But is it quality data, and why is that important? With the passage of the Improving Head Start for School Readiness Act of 2007 (the Head Start Act), all Head Start programs, including those that serve infants and toddlers, have been asked to shift toward a more data-driven decision-making culture. In other words, programs are expected to use data in even more meaningful ways to plan and make decisions. This involves using a combination of qualitative data (information from sources such as interviews, open-ended questionnaire items, and focus groups that is represented in verbal or narrative form, or anecdotes, stories compiled to represent particular points) and quantitative data (data that are expressed in numerical terms). It also involves integrating the use of data and data analysis into planning systems to track child progress and improve overall services to infants, toddlers, and their families, including pregnant women/expectant families.

To make the most effective and meaningful decisions and improvement plans, programs need quality data. Quality data provide a foundation for sound decision making and play a critical role in providing objective information for assessing child progress as well as identifying program successes and challenges. When used effectively, quality data can give programs compelling information for improving services to very young children and their families and for documenting and sharing their success stories.

Examples of Data Collected by Programs
- Developmental screenings and ongoing assessments of child progress, including progress toward school readiness goals and early intervention outcomes for infants and toddlers with disabilities
- Home visit and group care quality
- Child/family demographics (including pregnant women/expectant families)
- Family Partnership Agreement goals and families' progress toward achieving them
- Staff qualifications and performance appraisals
- Attendance (child, staff, family) and length of time in program
- Pregnant mother, child, and family health (including physical, nutrition, oral, and mental)
- Safety checks (e.g., indoor/outdoor environments, buses used to transport children, fire or other drills)
- Community resources (e.g., through community assessments and partnerships with community resources)
- Family referrals to and use of community resources
- Program self-assessment results and federal monitoring reports
- Finance/budgets

Migrant and Seasonal Head Start Technical Assistance Center, Introduction to Data Analysis Handbook, 7.
Ibid., 8.

WHAT MAKES DATA QUALITY DATA?

There are many definitions of quality data. In this information sheet, we identify six characteristics of quality data: relevant, timely, accurate, complete, valid, and reliable. These characteristics complement each other and together build a picture of data that is useful in planning.

RELEVANT

Relevant data are connected to the reason they are being collected. In other words, there must be an appropriate purpose for collecting the data; the data should be connected to questions about how well the program is supporting infants, toddlers, and families, including expectant families, and to a program's analysis and decision-making processes. Programs have an abundance of data available to them, so it is important to identify which data are most relevant and useful for which questions in order to determine effectiveness in enhancing quality practices.

Examples:
- At the child level, teachers, home visitors, and family child care providers may want to know how each child's receptive and expressive language skills are developing (language and literacy domain); the information staff collect should be specific to these aspects of language development.
- At the program level, a grantee may want to know how group care or home visit quality might affect child outcome data on language development; the grantee would use group care and home visit quality tools and track ratings or scores that relate to how adults listen and talk to children.

TIMELY

Current data are important in order to lend credibility to the program's process of data analysis and decision making. Data should be captured as quickly as possible after the activity and made available for use in program improvement. Web-based management information systems (MIS) enable programs to capture and share real-time information.

Examples:
- Health care information, such as the number of children who are up to date on immunizations and well-baby check-ups, once entered into the MIS, can be quickly shared and reported to key staff in the organization. This improves the program's ability to ensure children receive timely health care. (A rough sketch of generating such a report appears after this section.)
- Program Information Report (PIR): PIR data on health care services from previous years are valuable when looking at trends and assessing improvement. However, if a program does not have ready access to its current-year data, it will not be able to fully assess its current progress.
- Timely collection of child assessment data: Because infants and toddlers grow so rapidly, having current information about children's development is critical for providing an educational program that is truly tailored to their interests, needs, and abilities. Appropriately supporting a six-month-old means having assessment data showing where the child is developmentally at six months, not where the child was at four months.
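The first example above hinges on being able to regenerate a figure such as "number of children up to date on well-baby check-ups" on demand. As a rough illustration only, the sketch below shows how such a report might be computed from an MIS export. It is written in Python, and the field names (child_id, next_checkup_due) and the report date are hypothetical placeholders, not fields or dates from any particular MIS product.

```python
# Minimal sketch (hypothetical field names): flag overdue well-baby check-ups
# from child records exported from an MIS, and report how many children are
# up to date as of the report date.
from datetime import date

children = [
    {"child_id": "C001", "next_checkup_due": date(2013, 2, 15)},
    {"child_id": "C002", "next_checkup_due": date(2013, 6, 1)},
]

report_date = date(2013, 3, 5)

overdue = [c for c in children if c["next_checkup_due"] < report_date]
up_to_date = len(children) - len(overdue)

print(f"{up_to_date} of {len(children)} children are up to date on check-ups.")
for c in overdue:
    print(f"Follow up with {c['child_id']}: check-up was due {c['next_checkup_due']}.")
```

Because the count is computed directly from current records rather than copied from an older report, staff can rerun it whenever the underlying data change.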

ACCURATE

Data are correct (free from error) for the desired purpose, clear, and in adequate detail. Accurate data represent real situations, and timely data are more likely to be accurate.

Example:
- Staff who work directly with infants, toddlers, and families and conduct observations write observation notes as one of their data-gathering methods. To be accurate, written observation notes should reflect only facts, capture events in the order they occur, and include details such as the time of day, the location, how long the child engages in play or with a particular object or person, and the routines and experiences during which the observation occurs. This information should be captured either during the observation or as close to the actual observation time as possible.

COMPLETE

A program's data collection system should be monitored regularly to ensure that all required pieces of information (or data elements) are there. Missing information and incomplete records can adversely impact a program's effectiveness in evaluating the strengths of the organization as well as discovering the most important issues to address in improving services. While some data might be missing because of timing (e.g., children and families enrolling at different times during a program year, new staff hires), programs should still aim to have all required data elements. (A rough sketch of one way to check records for missing data elements appears at the end of this section.)

Examples:
- If some staff are not collecting and recording information regarding children's progress toward school readiness goals in the physical development and health domain, the program will not have a complete picture of child progress across all the domains.
- Staff files: If files do not contain all the necessary documentation on staff degree attainment, the program will not be able to assess compliance with related staff qualification requirements.

WHAT ABOUT VALID AND RELIABLE DATA?

Valid and reliable data can come from valid and reliable tools. In the early childhood field, the terms valid and reliable (or validity and reliability) are typically associated with tools for screening and ongoing child assessment; assessing parenting, the home environment, and parent well-being; and measuring program implementation and quality. Validity and reliability are important to data quality because they ensure that the tool:
- measures what it was intended to measure (validity); and
- provides dependable and consistent information (reliability).

For programs serving infants and toddlers, using tools that are valid and reliable, and using them in the prescribed manner and for the purposes for which they were developed, ensures that the information the tools provide is meaningful and trustworthy. There are different types of validity and reliability. See Section 3 of Resources for Measuring Outcomes in Head Start Programs Serving Infants and Toddlers from the Office of Planning, Research and Evaluation to learn more about them.

Office of Planning, Research and Evaluation, Resources for Measuring Outcomes in Head Start Programs Serving Infants and Toddlers, Section 3: Information Included for Each Instrument.
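As noted under COMPLETE above, one routine way to monitor completeness is to check exported records against a list of required data elements. The sketch below is a minimal illustration, assuming a hypothetical CSV export named child_records.csv and an illustrative (not authoritative) list of required fields; it simply reports which rows are missing which elements so staff can follow up.

```python
# Minimal sketch: report records in a (hypothetical) CSV export that are
# missing required data elements. REQUIRED_FIELDS is illustrative only,
# not a definitive list of Head Start requirements.
import csv

REQUIRED_FIELDS = [
    "child_id",
    "date_of_birth",
    "enrollment_date",
    "developmental_screening_date",
    "immunization_status",
]

def find_incomplete_records(path):
    """Return (row number, missing fields) for each incomplete record."""
    problems = []
    with open(path, newline="") as f:
        # start=2 because row 1 of the file is the header row
        for row_num, record in enumerate(csv.DictReader(f), start=2):
            missing = [field for field in REQUIRED_FIELDS
                       if not (record.get(field) or "").strip()]
            if missing:
                problems.append((row_num, missing))
    return problems

if __name__ == "__main__":
    for row_num, missing in find_incomplete_records("child_records.csv"):
        print(f"Row {row_num} is missing: {', '.join(missing)}")
```

A report like this, run on a regular schedule, makes gaps visible early, while the timing issues mentioned above (mid-year enrollments, new staff hires) can still be documented and resolved.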

QUESTIONS TO CONSIDER IN MANAGING QUALITY DATA

As mentioned above, programs serving infants and toddlers collect and use multiple sources of data to direct continuous improvement activities at the child level and to create effective and meaningful program improvement and training and technical assistance plans that strengthen the program's foundation, support program excellence, and lead to improved outcomes for very young children and their families. As part of this important work, programs need to think about how to manage the data they have collected and how to plan systems that track child progress and improve overall program services. The charts below provide some questions to help programs get started in thinking about managing data and planning systems. For each question, consider two follow-up prompts: What are we doing well? What do we need to strengthen?

MANAGING DATA THOUGHT PROMPTERS

Data Collection (general)
- What data do we need to collect to assess child development and learning? What other data would be helpful to collect to determine program strengths and areas needing attention?
- Are staff aware of the purpose for collecting data and how data will be used?
- Who collects the data? Who inputs the data? Do staff who enter data receive adequate training and oversight?
- Do we regularly check for data accuracy and completeness and correct problems in a timely manner?

Child Assessment
- Do we use a variety of methods to help us gather data on child progress, including information from families?
- Do we understand the difference between screening and assessment?
- Are we confident that our assessment tools can effectively measure the growth and development of our infants and toddlers?

MANAGING DATA THOUGHT PROMPTERS (continued)

Child Assessment (continued)
- Can the data from our assessment system be easily aggregated into groups of children, e.g., by age, language, program option? (A rough sketch of this kind of aggregation follows this chart.)
- Are staff adequately trained in how to assess children?

Management Information System (MIS)
- How up to date is our computer hardware? Can it support our data collection and tracking efforts? Do we have a plan for providing computer access to essential staff and regularly updating computer hardware and other equipment?
- How effective is our software or web-based system? Can our system generate reports that can be easily customized to provide information in a way that is meaningful to our stakeholders? Can reports be generated in real time? Do we understand the reports?
- What is our expertise in managing the hardware, software, and/or web-based system? Have we designated someone with the necessary expertise to oversee this system?
- What support (financial, human, logistical, and technological) do we need to strengthen this system?
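As a rough illustration of the aggregation question above, the sketch below groups child assessment records by age band and program option and summarizes an assessment score for each group. It uses the pandas library, and the column names and score scale are hypothetical examples, not the output format of any particular assessment tool.

```python
# Minimal sketch (hypothetical columns): aggregate child assessment data
# into groups, e.g., by age band and program option.
import pandas as pd

assessments = pd.DataFrame({
    "child_id":       ["C001", "C002", "C003", "C004"],
    "age_band":       ["8-18 months", "8-18 months", "19-36 months", "19-36 months"],
    "program_option": ["center", "home-based", "center", "home-based"],
    "language_score": [3.2, 2.8, 3.9, 3.5],
})

# Average score and number of children assessed, per age band and program option.
summary = (assessments
           .groupby(["age_band", "program_option"])["language_score"]
           .agg(["mean", "count"]))
print(summary)
```

The same grouping idea extends to home language, classroom, or any other field the program's MIS can export alongside assessment results.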

MANAGING SYSTEMS THOUGHT PROMPTERS

Planning
- Are multiple sources of data gathered and analyzed, including data from the community assessment and annual self-assessment findings, to develop program goals and objectives?
- Do the Policy Council and governing board have regular opportunities to review program data, including children's progress toward school readiness goals?

Management Information System (MIS)
- How do we inform our staff, families, policy committee/council, governing board, and community about our data?
- Does our communication system include regular opportunities to obtain feedback from staff, families, policy committee/council, governing board, and community?
- Are data collected throughout the program's operating period? Are data recorded and reported in a timely manner?
- Do we assess child progress on an ongoing basis? Do we aggregate and analyze child assessment data multiple times during the program's operating period? Do we aggregate and analyze other program data multiple times during the program's operating period?
- Does our ongoing monitoring system include regular opportunities to analyze data and use that information to make course corrections and revise plans?

Self-Assessment
- Do we analyze findings (data) from our annual self-assessment?
- Do we use data from our self-assessment to determine program effectiveness and progress toward meeting program goals?

Human Resources
- Do we have a plan for training staff on using our data systems, including training for new staff and ongoing refresher training?
- What data do we need/use to develop professional development plans for staff?

Fiscal Management
- Are data used to review financial priorities?

Eligibility, Recruitment, Selection, Enrollment, and Attendance
- Do we use data from our community assessment to develop recruitment strategies and eligibility/selection criteria?
- Do we track children's attendance in centers and family child care homes, and child/family attendance in home visits and socializations?
- Do we analyze causes and patterns of chronic absenteeism and provide appropriate child/family supports as needed?

References:

Migrant and Seasonal Head Start Technical Assistance Center. Introduction to Data Analysis Handbook. Washington, DC: Academy for Educational Development, 2006. Accessed March 5, 2013. http://ece.aed.org/publications/mshs/dataanalysis/webdataanalysis.pdf.

Office of Planning, Research and Evaluation. Resources for Measuring Outcomes in Head Start Programs Serving Infants and Toddlers. Washington, DC: U.S. Department of Health and Human Services/Administration for Children and Families/Office of Planning, Research and Evaluation, 2011. http://archive.acf.hhs.gov/programs/opre/ehs/perf_measures/reports/resources_measuring/res_meas_title.html.

Resources:

Horsch, Karen. Indicators: Definition and Use in a Results-Based Accountability System. Boston, MA: Harvard Family Research Project, 1997. http://www.hfrp.org/publications-resources/browse-our-publications/indicators-definition-and-use-in-a-resultsbased-accountability-system.

Office of Head Start. School Readiness Action Steps for Infants and Toddlers. Washington, DC: U.S. Department of Health and Human Services/Administration for Children and Families/Office of Head Start, 2012. http://eclkc.ohs.acf.hhs.gov/hslc/tta-system/ehsnrc/early%20head%20start/early-learning/curriculum/schoolreadiness.htm.

Office of Head Start. School Readiness Goals for Infants and Toddlers in Head Start and Early Head Start Programs: Examples from the Early Head Start National Resource Center. Washington, DC: U.S. Department of Health and Human Services/Administration for Children and Families/Office of Head Start, 2012. http://eclkc.ohs.acf.hhs.gov/hslc/ttasystem/ehsnrc/early%20head%20start/early-learning/curriculum/school-readiness-goals-infants-toddlers.pdf.

UMass Donahue Institute and Administration for Children and Families. Setting the Stage for Data Analysis: Assessing Program Strengths and Risks. Boston, MA: University of Massachusetts Donahue Institute, January 2007. http://www.donahue.umassp.edu/publications/assess-data-analysis.