THE YOUTH COMMUNITY DATA CENTER Spring 2014
THE YOUTH COMMUNITY DATA CENTER
Trusted Guidance for Doing Good.
Acknowledgements

This report was prepared by the Dorothy A. Johnson Center's Community Research Institute. We would like to thank the key members of the team responsible for this report:

- Joshua Church, M.I.S., Web Developer, Community Research Institute
- Julie Strominger, M.S., Research Coordinator, Community Research Institute
- Amber Erickson, M.S.W., Research Manager, Community Research Institute
- T.J. Maciak, M.S., Senior Programmer, Community Research Institute
- Jodi Petersen, Ph.D., Senior Researcher, Community Research Institute
- Lisa Venema, Project Manager, Community Research Institute
- Eleibny Feliz Santana, M.Sc., Database and System Administrator, Community Research Institute

Dorothy A. Johnson Center for Philanthropy, Grand Valley State University
Table of Contents

- The Youth Community Data Center: Description, Features, and Function
- Process Summary: Building YCDC & Collecting Data
- YCDC Data Analysis Capabilities
- Analysis with Pilot Data
- Plan for YCDC Scale Up
- Appendix A: Indicator Definitions
- Appendix B: Flow Chart Diagram
- Appendix C: YCDC Scale Up Cost Estimate
The Youth Community Data Center: Description, Features, & Function

The Youth Community Data Center (YCDC) is an integrated data system designed for use by Expanded Learning Opportunities Network (ELO) members. The need for YCDC became clear after several years of research on the data needs and goals of ELO members. Programs reported that they wanted to use their own data systems for multiple needs and requests; they wanted a system that would allow them to continue their own data collection while contributing to the ELO Network's quality improvement and outcome measurements.

The goals of an integrated data system are to 1) align shared indicators across diverse out of school time (OST) programs to see the combined impact on youth outcomes; and 2) share the overall impact of out of school time programs on youth outcomes with the community.

This is a pilot project meant to test the feasibility of a full-scale data system utilized by all ELO members. Two school sites agreed to participate in the pilot: Harrison Park Elementary and Martin Luther King Jr. Leadership Academy. The OST programs included across these two sites are Kent School Services Network (KSSN); the Gerontology Network; Creative Youth Center; Boy Scouts of America; and the athletic programs at the two schools.

YCDC has a public interface accessible to anyone in the community. The features and functions of the public YCDC are described below:

- YCDC home page: displays ELO media coverage and provides a program spotlight to highlight an individual program's services.
- Program Directory: contains the following information on all programs registered with the data system: name, number of sites, site location, website, phone, email, days/times of programming, schools served, grades served, and topics served.
This is a searchable directory, meaning that interested users can search for programs based on any of the above criteria. Individual programs complete this information when they register with YCDC and can update it at any time. The objective is to serve the needs of community members, program providers, and funders through a current, comprehensive, and interactive directory that can be used as a reference for out of school time programs in the region.

- Locator Map: contains geographic points for all registered ELO programs. The map will also offer a search function, so that users can enter an address and locate programs within a particular distance. The locator map is intended to provide increased access to, and information on, out of school time programs for the community.
- ELO Network Progress: a dashboard showing aggregate data for all youth in all ELO programs, with comparison data for youth in the school system who are not attending any ELO programs. Currently, the data elements included are: number of unique youth served; percent of youth reaching a program's attendance benchmark; percent of youth with chronic absence and satisfactory absence; percent of youth meeting grade level for math and reading Measures of Academic Progress (MAP) scores; percent of youth arrested within the time frame of OST programming; average program quality score; and average social emotional development score. This dashboard meets YCDC's goal of sharing with the community the overall impact of out of school time programs on youth outcomes. (See Appendix A for a definition of each indicator.)
- Links: lists hyperlinks to relevant organizations (e.g., United Way 211, all ELO program websites).

In addition to the above features, ELO members who register with YCDC will have access to other functions within the system.
These functions will vary by the role of the user, as described below:

- Individual program user: will upload individual program data into the YCDC portal and will be able to access the program's individual and aggregate level data through program reports. Data should be updated quarterly.
- Multi-site program user: will upload individual program data for each program site into the YCDC portal and will be able to access individual and aggregate level data for the overall program and each individual site through program reports. Data should be updated quarterly.
- ELO administrator user: will have access to aggregate program reports for programs that opt to share their information.
Programs will upload their data on a quarterly basis using the Data Uploader, which allows individual programs and multi-site programs to upload their data in comma separated values (CSV) format via the YCDC interface. Program data will be merged with data from Grand Rapids Public Schools (GRPS) and the Grand Rapids Police Department (GRPD), and will be used to create the ELO Network Progress dashboard and program reports. Specifics on these reports are described below:

- ELO Network Progress dashboard: links individual youth data across all OST programs and shows aggregate data for the following indicators: number of unique youth served; percent of youth with chronic absence and satisfactory absence; percent of youth reaching a program's attendance benchmark; percent of youth meeting grade level for math and reading MAP scores; percent of youth arrested within the time frame of OST programming; average program quality score; and average social emotional development score.
- Individual program report: program-specific, aggregate level data will include: number of participants; average days attended last quarter; percent of youth reaching the program's attendance benchmark; percent of youth with chronic absence and satisfactory absence; percent of youth meeting grade level for math and reading MAP scores; percent of youth arrested within the time frame of the program; average program quality score; and average social emotional development score. The ELO Network Progress dashboard will also be presented for comparison. Individual-level data for program attendance and social emotional development will be available for each program participant.
- Multi-site program report: aggregate data will include program-specific data, site-specific data, and overall ELO data.
For the overall program, and for each individual site, data will include: number of participants; average days attended last quarter; percent of youth reaching the program's attendance benchmark; percent of youth with chronic absence and satisfactory absence; percent of youth meeting grade level for math and reading MAP scores; percent of youth arrested within the time frame of the program; average program quality score; and average social emotional development score. Individual-level data for program attendance and social emotional development will be available for each program participant.

Process Summary: Building YCDC & Collecting Data

Building an integrated data center and collecting useful data from community programs necessitated a months-long process of facilitation and negotiation. Many key lessons emerged from this process, which can be used to further improve the system as it scales up to include all ELO members, as well as to inform other initiatives and communities that are interested in integrated data and collective impact.
The idea for an integrated data system stemmed from focus groups held with ELO members. Program providers voiced concerns regarding the original idea of using a new data system requiring additional data entry. The main themes that emerged from the focus groups were: 1) providers did not want to enter data into another data system; they already entered data into multiple systems due to multiple sources of funding; 2) providers would only be interested in a new data system if they were to get more out of it than what they were putting into it; and 3) providers did not want to have to collect data in a uniform way across the network or make many changes to the way they already collect data. Therefore, the concept behind YCDC was to build a system that allows members to upload their data as is from their current data system, where it would be linked to outside datasets providing additional information that they do not have, such as school attendance and juvenile offenses. Additionally, the web-based portal is easily accessible and has a user-friendly interface.

Building an online portal and integrated data system began with determining the basic functionality of the system. It was decided that the structure of the portal had to allow for different capabilities based on the roles of those who would be using the portal. Therefore, users must register with the system, provide a unique login name and password, and identify their role within the site. This also ensures the security and confidentiality of the program data entered into the system.

Programmers designed the portal to auto-populate various features based on information programs enter into their profile. This information provides the data needed to fill in the Program Directory and Locator Map. Additionally, the search function queries the profile information to return results. The Data Uploader function allows programs to enter their data via a CSV file.
This was determined to be the most flexible option, as most data systems that programs use allow data to be exported into this format. It is also most helpful for those who are managing the data, as this format can easily be manipulated for integrating datasets and building the aggregate reports.

A major requirement of an integrated data system is the capability to match youths' individual-level data across several datasets. Each program dataset must contain the same identifiers (first name, last name, and date of birth) in order for a master dataset to be created. The work of cleaning, managing, matching, and creating one integrated dataset requires a skilled database administrator (see Appendix B for a graphic representation of the process).
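Putting the two requirements above together (CSV uploads, plus identifier fields present in every record), a minimal upload validator might look like the following sketch. This is an illustration, not YCDC's actual implementation; the column names are assumptions.

```python
import csv
import io

# Identifier columns every upload must contain so records can later be matched
# across datasets (column names are illustrative, not YCDC's actual schema).
REQUIRED_COLUMNS = {"first_name", "last_name", "date_of_birth"}

def validate_upload(csv_text):
    """Return (accepted_rows, rejected_rows) for one program's CSV upload."""
    reader = csv.DictReader(io.StringIO(csv_text))
    missing = REQUIRED_COLUMNS - set(reader.fieldnames or [])
    if missing:
        raise ValueError(f"upload missing required columns: {sorted(missing)}")
    accepted, rejected = [], []
    for row in reader:
        # Reject rows where any identifier is blank; they cannot be matched.
        if all(row[c].strip() for c in REQUIRED_COLUMNS):
            accepted.append(row)
        else:
            rejected.append(row)
    return accepted, rejected

sample = (
    "first_name,last_name,date_of_birth,days_attended\n"
    "Ana,Lopez,2004-03-11,41\n"
    ",Smith,2005-07-02,18\n"
)
ok, bad = validate_upload(sample)
```

Separating accepted from rejected rows lets a program see exactly which records will not appear in its reports, rather than silently dropping them.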
In order for the design of the portal to be fully functional, programmers and database administrators will need data to be submitted in an appropriate format and manner. This includes variables that are clearly labeled across all datasets; select variables that measure similar indicators, so data can be compared across programs; individual-level data with the same identifiers (name and date of birth); and timely submission of a CSV file. Programmers and database administrators must work closely with researchers and select program staff to communicate their needs and ensure the most efficient process.

The process of actually receiving program data to be imported into the portal for the pilot project required many phases of work. First, the research team and the ELO evaluation committee worked with GRPS to identify two schools that could pilot the data system process. Harrison Park Elementary and Martin Luther King Jr. Leadership Academy were chosen based mostly on two factors: 1) there was an established OST program infrastructure, and 2) school principals had been in place for a while and were motivated to assist with the pilot. The research team and the ELO evaluation committee then selected six OST programs between the two schools based on the following criteria:

- Program is a member of ELO.
- Program is providing services within GRPS.
- Number of youth and age range of youth served should be representative of ELO.
- At least one program within each pilot school should be participating in the quality improvement system (implementing the Youth Program Quality Assessment (YPQA)).
- Program needs to be willing to participate and help shape the process.
- Programs should have varying levels of current data capacity.
- Programs should vary in topic to represent the diversity of ELO Network members.

Once the pilot schools and programs had been selected, the next step was to work with programs to decide what data elements to collect.
Indicators from outside datasets (GRPS and GRPD) included school attendance, academic achievement, and juvenile offenses; these were chosen because they are easily accessible, generalizable across ELO programs, and provide information that programs could not typically get on their own. Other data collected across all programs included program quality scores (through the YPQA) and program attendance, so these indicators were also chosen to be part of the data system. In order to be flexible and responsive to programs' requests, it was decided that attendance data could be collected in any format as long as programs also provided an attendance benchmark by which to measure their success in that area; this way, the system could report on attendance in a uniform way across all programs (reporting the percent of youth who reached the attendance benchmark).
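The uniform attendance reporting described above, each program keeping its own benchmark while the system reports one comparable figure, can be sketched as follows. The data shapes are illustrative assumptions, not YCDC's actual structures.

```python
def percent_reaching_benchmark(attendance, benchmarks):
    """attendance: {program: {youth_id: fraction of sessions attended}}.
    benchmarks: {program: fraction required (e.g. 0.75 for 75 percent)}.
    Returns {program: percent of youth at or above that program's benchmark},
    so programs with different benchmarks still report one comparable number."""
    results = {}
    for program, youth in attendance.items():
        bench = benchmarks[program]
        met = sum(1 for frac in youth.values() if frac >= bench)
        results[program] = round(100 * met / len(youth)) if youth else 0
    return results

# Made-up pilot-style data: one program with a 75 percent benchmark.
attendance = {"scouts": {"y1": 0.80, "y2": 0.50, "y3": 0.90, "y4": 0.60}}
benchmarks = {"scouts": 0.75}
report = percent_reaching_benchmark(attendance, benchmarks)
```

The design choice mirrors the report: the benchmark itself varies by program, but the published indicator (percent reaching it) is uniform across the network.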
Additionally, when programs were asked to provide their definition of success for youth in their program, the most common concept that emerged was some type of increased social emotional development. Researchers decided to create a brief survey to measure social emotional development that could be administered across all ELO programs. This provided another uniform indicator across all programs, which helps when analyzing what specific combination of ELO programs, or dosage of time, contributes to desired outcomes (in this case, increased social emotional development).

Once the data elements were determined, the process of actually obtaining data began. To access GRPS data, researchers had to work closely with the school system to develop a research request that specifically outlined the requested data elements and how the data would be gathered and kept confidential and secure. Researchers had already acquired GRPD data through another research project. An opt-out consent process was used through the GRPS school system, where parents could indicate that they did not want their child's data from either dataset (GRPS or GRPD) to be used for this project. This opt-out consent process required working closely with the school system to determine appropriate wording and a protocol for the administration and collection of the consent.

In order to obtain program data, researchers needed to take several steps:

- Inform programs about the project and ask for their participation.
- Develop an MOU for each program to stipulate the requirements of the project.
- Work with programs to determine the type of data they collect and in what format. This one step can be time consuming and complicated, as programs often do not have a good sense of how their data is collected and managed. Researchers met with providers to interpret their data and figure out how it could be used to measure ELO's outcomes of interest.
- Additionally, researchers worked with providers to set up a Secure File Transfer Protocol (a method for the secure transfer of electronic files) so that their data could be sent to researchers. This was a necessary step because the Data Uploader function of the portal had not yet been built.
- Develop a Data Sharing Agreement with each program that specifies the exact data to be shared and in what format.

The next step was to build an integrated dataset using the data from all the pilot programs. Individual-level data was linked across datasets using full name and date of birth. Because the data is formatted in various ways across the programs, a database administrator must spend time cleaning and restructuring the data so researchers can measure outcomes in a standardized way. This is a time and energy intensive process, but it allows programs to largely continue collecting their data as is and eliminates the burden of additional and new data collection.
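The linking step described above (exact matching on full name and date of birth, after light cleaning) might look like this sketch. Field names and the dictionary-based master structure are illustrative assumptions; real cleaning work goes well beyond case and whitespace normalization.

```python
def link_key(rec):
    """Normalize identifiers so trivial case/whitespace differences still match.
    Exact matching still requires the same name and date of birth in every dataset."""
    return (rec["first_name"].strip().lower(),
            rec["last_name"].strip().lower(),
            rec["date_of_birth"].strip())

def build_master(program_datasets):
    """Merge per-program records into one master entry per youth,
    keyed on (first name, last name, date of birth)."""
    master = {}
    for program, records in program_datasets.items():
        for rec in records:
            entry = master.setdefault(link_key(rec), {})
            entry[program] = rec  # keep each program's record side by side
    return master

# Two programs reporting the same youth with cosmetic formatting differences.
datasets = {
    "creative_youth_center": [
        {"first_name": "Ana", "last_name": "Lopez", "date_of_birth": "2004-03-11"}],
    "boy_scouts": [
        {"first_name": " ANA", "last_name": "lopez ", "date_of_birth": "2004-03-11"}],
}
master = build_master(datasets)
```

A master structure like this also yields the "unique youth served" count directly (the number of keys) and shows which youth attend multiple programs.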
The greatest lesson learned from this entire process is that the most difficult work is not in building the technology; it is in fully engaging program providers, listening and responding to their needs, and fostering their cooperation and participation in the data collection process. Because there is no mandate to participate in ELO or an integrated data system, there has to be a real incentive for programs to spend time contributing to a network-wide initiative. In addition, ELO executive committees and researchers need to facilitate opportunities for programs to voice their opinions and perspectives so that any project developments truly reflect their needs. This is the only way to ensure their involvement. More specifically, cataloging programs' data, agreeing on a minimum number of data attributes programs should collect, and carrying out a scheduled data import with the agreed-upon data elements are huge challenges that require consistent communication and a democratic style of decision making. The underlying issue that needs to be continually addressed, and is of utmost importance to the school system and police department, is guaranteeing the security and confidentiality of the data. Accomplishing the above tasks is extremely time-consuming, so project planning should account for longer timelines and be flexible about delays.

Program providers who participated in the pilot also gave their perspective on lessons learned. Key points include:

- Each program should receive a clear set of instructions on what is expected, deadlines, and points of contact for various needs. Because providers are very busy, the information they need to participate in YCDC should be inclusive and in one place.
- The social emotional survey is not appropriate for younger children. A different, more age-appropriate measure should be developed.
- MAP score data should include the growth score and not just percent at or above grade level.
Providers serve youth who often have below average scores. They believe that programming contributes to improved scores, but the improvement may not reach the grade level benchmark; this growth is still an important factor when considering a program's impact.

- There should be a plan for how the portal is disseminated to the community, and especially to parents of children in the school system.

Possible ways to encourage program provider engagement with YCDC include:

- Explain how providers can use the directory and locator map to access other programs in their school or neighborhood that they may collaborate with, or refer students to if needed.
- Emphasize the desire to improve ELO as an entity that works toward improving outcomes for all children in the community; the data system can be used to examine how various OST programs interact with each other to affect youth outcomes. This information can be used to make decisions about programming with regard to collaboration with other providers, dosage, areas in need of improvement, and much more.
- Demonstrate how the reports can be used for evaluation purposes or grant proposals, so that providers are able to use this information in ways that are useful to them.

It is clear that there are multiple layers of benefits to an integrated data system. Programs have accessible and regularly updated information to help improve their services; ELO has data to advocate for its structure and continued sustainability; and funders will have better information on program outcomes to inform their decision making. An important additional step that has not yet been implemented is training on how programs can interpret their reports and use the information for their maximum benefit. This will be necessary to promote their participation so that they continue to see the benefits of their involvement.

YCDC Data Analysis Capabilities

YCDC has the capability to provide descriptive data on the population served by ELO members, as well as a deeper analysis of the cumulative impact of out of school time programming. The system will include data from several datasets: registered ELO programs; the Grand Rapids Public School system; the Grand Rapids Police Department; Youth Program Quality Assessment data; and the ELO Network Social Emotional Development Survey. The descriptive information will provide an up to date profile containing key indicators related to the ELO population.
These include: number of unique students served; percent of youth with chronic absence and satisfactory absence; percent of youth reaching a program's attendance benchmark; percent of youth meeting grade level for math and reading MAP scores; percent of youth arrested within the time frame of out of school programming; average program quality score; and average social emotional development score.

Data can also be used to begin to answer ELO outcomes of interest. Individual-level student data will be matched across the several datasets submitted to YCDC, and propensity score analysis will be used to explore how out of school time programs may or may not work together to produce better outcomes. This analysis will be strengthened by comparing students in OST programming with students not in OST programming (using student data from the school system and police department).
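One piece of the propensity score analysis, the step that pairs each OST participant with a similar non-participant, can be sketched as follows. This assumes propensity scores have already been estimated (for example, with a logistic regression of OST participation on student covariates); the IDs, scores, and the greedy matching rule are illustrative choices, not the report's specified method.

```python
def match_on_propensity(treated, controls, caliper=0.05):
    """Greedy 1:1 nearest-neighbor matching on propensity scores.
    treated/controls: {student_id: propensity score}.
    Returns (treated_id, control_id) pairs whose scores differ by at most
    the caliper; each control is used at most once."""
    pairs = []
    available = dict(controls)
    # Match highest-scoring treated students first, since they typically
    # have the fewest comparable controls.
    for t_id, t_score in sorted(treated.items(), key=lambda kv: -kv[1]):
        if not available:
            break
        c_id = min(available, key=lambda c: abs(available[c] - t_score))
        if abs(available[c_id] - t_score) <= caliper:
            pairs.append((t_id, c_id))
            del available[c_id]  # enforce matching without replacement
    return pairs

# Made-up scores: s1/s2 are OST participants, c1..c3 are non-participants.
treated = {"s1": 0.72, "s2": 0.40}
controls = {"c1": 0.70, "c2": 0.41, "c3": 0.10}
pairs = match_on_propensity(treated, controls)
```

Outcome comparisons (attendance, MAP scores, arrests) would then be made within the matched pairs, reducing the selection bias that comes from OST participation being voluntary.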
ELO is interested in producing the following outcomes: increased school attendance; decreased police contact; increased social emotional scores; increased school achievement; and increased program quality. There are several analyses that can be used to explore if and how these outcomes are achieved. Data from YCDC will be able to answer the following questions:

- Is there a key dosage of programming that will produce the desired outcomes? What is it?
- Is there a combination of certain types of programs that will produce the desired outcomes? What is it?
- Are higher social emotional scores associated with an increase in program and/or school attendance?
- Is out of school time program attendance associated with school attendance?
- Do programs that use the Youth Program Quality Assessment tool produce better outcomes than programs that do not use the tool?
- Do programs with higher program quality scores produce better outcomes than those with lower program quality scores?

The integration of several datasets provides the unique opportunity to explore how ELO programs may interact with one another to impact youth in a variety of ways. Additionally, the comparison between students in programming and students not in programming potentially adds value to any outcome findings. As a result, there is the potential for a sophisticated analysis of youth programming that could effect system or policy level change. This level of analysis and the potential findings would not be possible without the cooperation of involved programs and the integration of data. Individual program data can only answer questions related to that program, which is of less value to an entity such as ELO that is interested in the relationship between all member programs and factors like attendance and criminal involvement.
YCDC will be able to provide ELO with current and consistent data that will help inform decision making around program funding, policy, and best practices, which will in turn serve youth in the most effective and impactful ways possible.
Analysis with Pilot Data

In order to demonstrate the analysis capabilities of the data system and how select findings will be reported, researchers used the pilot data to produce select descriptive statistics. These statistics are meant only to show how the data can be used; they should not be interpreted as conclusive findings in any way. Student information was obtained from the Grand Rapids Public School system; this data contained attendance information by quarter and MAP score results, reported in the form of national percentiles, for students at Martin Luther King Jr. Leadership Academy and Harrison Park Elementary School. Program information was collected from the programs participating in the YCDC pilot phase: Creative Youth Center, Gerontology Network, Boy Scouts, athletics, and Kent School Services Network. The program information was merged with the GRPS information based on first name, last name, and date of birth.

Unique Participants

The number of unique students served by the selected ELO programs over the school year was 401 (20 students participated in two or more programs).

School Attendance

Following discussions around how to effectively convey attendance information, the percent of students categorized as chronically absent and the percent of students categorized with satisfactory attendance were calculated:

                            ELO Program Participants   Non-ELO Program Participants
  Chronic Absence                    27%                         24%
  Satisfactory Attendance            47%                         49%

Program Attendance

An attendance benchmark was determined for each program, and the percent of students that reached the previously set benchmark was calculated. Thirty-eight percent of students in ELO pilot programs reached the attendance benchmark set by their program.
Academic Achievement

Regarding MAP scores, the percent of students performing at grade level or above (defined as performing at the 50th percentile or above) was calculated for students in ELO pilot programs and students not in ELO pilot programs, broken down by grade level (K-2nd, 2nd-5th, and 6th+) and by test (Reading and Math):

                          ELO Program Participants   Non-ELO Program Participants
  Math - Grades K-2                17%                         32%
  Math - Grades 2-5                  %                         20%
  Math - Grades 6+                 18%                         19%
  Reading - Grades K-2             24%                         28%
  Reading - Grades 2-5               %                         31%
  Reading - Grades 6+              27%                         32%

Program Quality

Youth Program Quality Assessments (YPQAs) were performed at select ELO program sites. Self and external scores (on a 0-5 scale) were calculated for four areas of the programs: engagement, supportive environment, interaction, and safe environment:

                          Self-Assessment   External Assessment
  Engagement
  Supportive Environment
  Interaction
  Safe Environment

Social Emotional Development

Pilot programs administered a social emotional development survey to their youth toward the end of their program. The average social emotional score, on a scale, was      The standard deviation was

Police Contact

Information from the Grand Rapids Police Department was used to determine the percent of students in each program who were arrested during the program dates. Less than 5 percent of ELO program participants were arrested during the program dates. For comparison purposes, less than 5 percent of students not in ELO programs were arrested during the school year.
Plan for YCDC Scale Up

The pilot project proved that a scale up to include all ELO member programs is feasible. There will be additional adjustments as new programs are brought on to the project, but the technology can support the increased load. While engagement, relationship-building, and data sharing with programs, the school system, and the police department are time-consuming and complicated, these parties do have data that is beneficial for an integrated data system and are willing to share it. Additionally, the system infrastructure is supported by a unique team that includes a database administrator, web designer, programmer, project manager, and researchers; this team has the skill sets and available resources to manage the various elements of such a multidimensional project.

A defined step by step process for scale up should be developed as soon as funding is available. Providers will need clear instructions on how they can participate, as well as continued engagement that shows how participation will be beneficial. There should be a brief planning period before scale up is implemented in order to carry out any changes the pilot showed to be necessary and to create a protocol for onboarding programs in the most efficient way possible. An estimate of the scale up cost is detailed in Appendix C. There is a yearly cost for hosting the portal and providing quarterly reports, as well as costs associated with incorporating new programs, expanding to additional school systems, and gathering updated police department data.
Appendix A: Indicator Definitions

- Number of unique youth served: a count of the unique number of youth served by all ELO programs or by an individual program.
- Percent of youth reaching a program's attendance benchmark: each program has an individual attendance benchmark (e.g., youth should attend at least 75 percent of program sessions). This indicator measures the percent of youth who met or exceeded that benchmark.
- Average days attended last quarter (program reports only): the average number of days program participants attended the program.
- Percent of youth with chronic absence and satisfactory absence: percent of youth categorized with Chronic Absence or Satisfactory Absence according to Grand Rapids Public Schools definitions.
- Percent of youth meeting grade level for math and reading Measures of Academic Progress (MAP) scores: percent of youth with scores at grade level for the following subjects and grades: Math - Grades K-2: 20; Math - Grades 2-5: 30; Math - Grades 6 and above; Reading - Grades K-2; Reading - Grades 2-5; Reading - Grades 6 and above.
- Percent of youth arrested within the timeframe of out of school time programming: percent of youth arrested during ELO program participation.
- Program quality score: average score for program quality using the Youth Program Quality Assessment (possible scores range between 1 and 5). The self-assessment is completed by the program and the external assessment is completed by an outside evaluator.
- Social emotional development score: average score on the social emotional development survey (possible scores range between 1 and 100).
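Two of the indicators defined above can be computed from a merged, one-row-per-youth dataset as in the sketch below. Field names are illustrative assumptions; "meeting grade level" follows the report's definition of the 50th national percentile or above.

```python
def dashboard_indicators(records):
    """records: one dict per youth with a linkage key and MAP national
    percentiles (field names are hypothetical, not YCDC's schema)."""
    def pct_at_grade_level(field):
        # Grade level is defined in the report as the 50th percentile or above.
        at_level = sum(1 for r in records if r[field] >= 50)
        return round(100 * at_level / len(records))
    return {
        "unique_youth_served": len({r["youth_key"] for r in records}),
        "pct_math_at_grade_level": pct_at_grade_level("math_percentile"),
        "pct_reading_at_grade_level": pct_at_grade_level("reading_percentile"),
    }

# Made-up merged data for three youth.
records = [
    {"youth_key": "a", "math_percentile": 62, "reading_percentile": 45},
    {"youth_key": "b", "math_percentile": 30, "reading_percentile": 55},
    {"youth_key": "c", "math_percentile": 75, "reading_percentile": 80},
]
stats = dashboard_indicators(records)
```

Running the same function on ELO participants and on non-participants would produce the side-by-side dashboard comparison the report describes.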
Appendix B: Flow Chart Diagram

[Youth Community Data Center flow chart. The diagram traces three phases. Contributed Data: program data from Boy Scouts, Creative Youth Center, Gerontology Network, KSSN, and all other programs; social emotional survey data; GRPS demographics and test (MAP) data; and GRPD data. Data Transformation: load data; clean data, standardize fields, assign unique IDs, and populate program participation; merge program data with GRPS demographics and test data using exact matching*; filter data by consent (GRPS, ELO); and de-identify data. Publication: analysis of the combined GRPS and program data, and preparation and publication of research datasets.]

* Exact matching is based on a 100% match of first name, last name, and date of birth.

Community Research Institute of Grand Valley State University, 2014
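The "filter data by consent" and "de-identify data" steps of the pipeline might be sketched as below. The record layout, the opt-out list format, and the use of a salted hash as the de-identified key are all assumptions for illustration; a production system would need a properly managed secret salt and a vetted de-identification protocol.

```python
import hashlib

def filter_and_deidentify(records, opt_out_keys, salt="ycdc-pilot"):
    """Drop records whose (first, last, dob) key appears on the opt-out list,
    then replace identifiers with a salted one-way hash as a stable unique ID.
    The hard-coded salt is a placeholder, not a real security practice."""
    out = []
    for rec in records:
        key = (rec["first_name"].lower(), rec["last_name"].lower(),
               rec["date_of_birth"])
        if key in opt_out_keys:
            continue  # parent opted the child out; exclude the record entirely
        digest = hashlib.sha256(("|".join(key) + salt).encode()).hexdigest()[:12]
        clean = {k: v for k, v in rec.items()
                 if k not in ("first_name", "last_name", "date_of_birth")}
        clean["youth_id"] = digest  # same youth hashes to the same ID everywhere
        out.append(clean)
    return out

records = [
    {"first_name": "Ana", "last_name": "Lopez",
     "date_of_birth": "2004-03-11", "days": 41},
    {"first_name": "Ben", "last_name": "Kim",
     "date_of_birth": "2005-07-02", "days": 18},
]
opted_out = {("ben", "kim", "2005-07-02")}
cleaned = filter_and_deidentify(records, opted_out)
```

Because the same identifiers always hash to the same ID, analysts can still follow one youth across datasets without ever seeing a name or date of birth.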
Appendix C: YCDC Scale Up Cost Estimate

Yearly cost (hosting the portal and providing quarterly reports): $21,100
- Technology supply/hosting cost: $400
- Technology updates to maintain website: 5 hours DBA, $625; 5 hours Web Designer, $625
- Quarterly reports to ELO ED: 10 hours Senior Researcher/Research Manager, $3,400
- Social emotional survey administration (programs print surveys and coordinate within sites):
  - 10 hours Senior Researcher/Research Manager, $1,700
  - 100 hours Project Manager, $10,
  - 20 hours student data management/entry, $1,000
- ELO evaluation team meetings: 18 hours Senior Researcher/Research Manager, $3,060

Onboarding of new programs:

Single-site program: $2,100
- Technology team client meetings: 4 hours Web Designer or DBA, $500
- Research team client meetings, DSA: 6 hours Research Coordinator and 4 hours student, $610
- Portal administration: 2 hours Web Designer or DBA, $250
- Technology report generation: 6 hours Web Designer, $750

Multi-site program: $3,510
- Technology team client meetings: 4 hours Web Designer or DBA, $500
- Research team client meetings, DSA: 6 hours Research Coordinator and 10 hours student, $760
- Portal administration: 6 hours Web Designer or DBA, $750
- Technology report generation: 12 hours Web Designer, $1,500
Expansion to each additional school district: $2,550
- DSA development and consent processing: 10 hours Senior Researcher/Research Manager, $1,700

GRJOI updated quarterly: $8,450
- 10 hours Project Manager, $850
- Data pulls from GRPD and data integration analysis: 20 hours DBA, $2,500; 20 hours Research Coordinator, $1,700
- Annual longitudinal report: 20 hours Research Coordinator and 10 hours Senior Researcher, $3,400

Expansion of GRJOI (one time cost): $10,
- 100 hours GIS Specialist, $5,000
- New DSAs with Kentwood and Wyoming Police, Kent County Court: 30 hours Senior Researcher/Research Manager, $5,100; 30 hours Project Manager, $2,550
- Data merging: 20 hours DBA, $2,500
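The cost model in Appendix C is essentially role-hours multiplied by hourly rates and summed per line item. A sketch of that structure, using the single-site onboarding items with hypothetical hourly rates (the report does not publish its rates, so the values in RATES are assumptions):

```python
# Hypothetical hourly rates chosen for illustration; the report does not
# state the actual rates behind its line items.
RATES = {"web_designer_or_dba": 125, "research_coordinator": 85, "student": 25}

# Single-site onboarding line items from Appendix C, expressed as role-hours.
SINGLE_SITE_ONBOARDING = {
    "technology team client meetings": {"web_designer_or_dba": 4},
    "research team client meetings, DSA": {"research_coordinator": 6, "student": 4},
    "portal administration": {"web_designer_or_dba": 2},
    "technology report generation": {"web_designer_or_dba": 6},
}

def item_cost(hours_by_role, rates):
    """Cost of one line item: sum of hours times each role's hourly rate."""
    return sum(rates[role] * hours for role, hours in hours_by_role.items())

def total_cost(line_items, rates):
    """Roll all line items up into a subtotal."""
    return sum(item_cost(hours, rates) for hours in line_items.values())

print(total_cost(SINGLE_SITE_ONBOARDING, RATES))  # 2110
```

Note that the four itemized amounts sum to $2,110 while the appendix states a $2,100 subtotal; a production cost model would reconcile small discrepancies like this against the published figures rather than assume either is authoritative.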
Dorothy A. Johnson Center for Philanthropy, Grand Valley State University, 201 Front Ave. SW, BIK 200, Grand Rapids, MI