CONTENTS
The Challenge of Data
How and Why of Data Visualization
Interactive Dashboards
Electronic Data Walls
Enhancements to State Reports
Next Steps
THE CHALLENGE OF DATA

Educators are well aware of the questions they need to ask to inform instruction and improve student achievement: Is my achievement gap closing? Are discipline consequences being handled equitably for all subgroups? What is the profile of our dropouts? Who is likely to be in the lowest achievement category on the annual standardized tests? Such information is available from the state through the data warehouse, but only a year (or more) later, long after any interventions based on the data should have taken place.

School districts actually collect an enormous amount of data on teaching and learning that is not fully utilized. For example:

State- and federally-mandated reporting is generally considered a burden that must be borne without any utility to the district. The data is collected, often by a clerical employee, and reported. As long as it passes validations, it is not examined or used further.

Student attendance, discipline, and grade data is collected in the Student Information System and used internally by administrators as isolated information. Staff data is collected in other, equally isolated systems.

Local assessments are collected by teachers, and this data also exists in its own world, often on pieces of paper but occasionally in electronic form such as scan sheets or spreadsheets.

Finally, MCAS and other state-level assessments live in state data warehouses that are accessible only by a few gurus who know the secret handshake. As such, this data cannot be correlated or examined in the context of local data such as local benchmark assessments.

As can be seen, schools are a very data-rich environment. However, the data that would inform decisions is scattered throughout the schools and central office in different forms and different places, and is either unavailable or unknown to decision makers. Experienced educators call this siloed data.
Margaret Rouse, writing in the TechTarget blog, states: "A data silo is a repository of fixed data that an organization does not regularly use in its day-to-day operation. So-called siloed data cannot exchange content with other systems in the organization. The expressions 'data silo' and 'siloed data' arise from the inherent isolation of the information. The data in a silo remains sealed off from the rest of the organization, like grain in a farm silo is closed off from the outside elements."

This is an all-too-familiar scenario in school systems. The problem arose slowly and innocently enough as districts embraced technology to streamline all kinds of operations, from the lunch program to the library to the Student Information System to the employee database. For example, the librarian is well
aware of his data, the Special Education director is well aware of hers, but the two databases live in complete isolation from each other. So, although educators generally have the questions, they do not have access to the answers, because of the enormous difficulty of pulling the data together or the lack of the technology to view the data effectively.

HOW AND WHY OF DATA VISUALIZATION

The Old Way

Sure, it is possible to create a report of discipline incidents, an attendance report, a report of special education interventions, a list of MCAS results, grades, and a folder full of local assessments: a stack of printouts, several pages for each student. But imagine having the copy machine crank out 15 copies for all the members of the team and then attempting to have a discussion about a cohort of students. By the time everyone is on the same page (literally), the common planning time is over.

The New Way

All (or most) of the data is electronic to begin with, but it needs to be put in the proper form and de-siloed. Only then can business intelligence tools and technologies such as Tableau Software be applied to the data to create visuals in an electronic format that everyone on the team can not only view but also interact with at the same time. Only then can everyone be on the same page right from the start of the meeting.
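As a rough sketch of what "de-siloing" means in practice, records from separate systems that share a common student ID can be joined into one flat table. The field names and sample values below are invented for illustration; real district data would come from exports of the actual systems involved.

```python
# Hypothetical sketch: merge two "siloed" record sets on a shared student ID.
# All IDs, names, and field values here are invented.

attendance = {
    "S001": {"name": "Student A", "absences": 3},
    "S002": {"name": "Student B", "absences": 7},
}
assessments = {
    "S001": {"score": 88},
    "S002": {"score": 72},
}

def desilo(attendance, assessments):
    """Merge the two silos into one flat table, one row per student --
    the kind of flat file a tool like Tableau can connect to."""
    rows = []
    for sid, att in attendance.items():
        row = {"student_id": sid, **att, **assessments.get(sid, {})}
        rows.append(row)
    return rows

flat = desilo(attendance, assessments)
```

Once the records live in one table rather than two systems, questions that span silos (say, attendance versus assessment performance) become a simple filter rather than a data-gathering project.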
Why Visualization?

Most users of data have become accustomed to tabular data: rows and columns statically placed on a page. Sometimes we get creative and use bold or colored cells to differentiate the data. But how can one find trends in data in this form? Consider the visualization provided by the Periodic Table of the Elements versus a generic crosstab listing:

Symbol   Element     Atomic Number
Ac       Actinium    89
Al       Aluminum    13
Am       Americium   95
Sb       Antimony    51
Ar       Argon       18
As       Arsenic     33
At       Astatine    85

The two representations may contain identical information, but everyone agrees that the visual profoundly deepens our understanding of the properties of the elements!

Here are some examples of dashboards from the SIMS report, from local assessments, and from MCAS reports. (Note: The underlying data is invented, and student and teacher names are invented; none of the data is from any actual school district.)
INTERACTIVE DASHBOARDS

This example shows how student demographics and attendance data can be displayed in a format that is easy to read and understand. By clicking filters, one can drill down from the district level to the school to the classroom. In this example we are looking at the demographics of Ms. Abraham's fourth grade class at the West Street School. By selecting "All" for teachers, we would be viewing the entire fourth grade at the West Street School. By selecting "All" for schools, we would be viewing the district-wide fourth grade demographics and attendance.
In the same workbook, an administrator could keep a pulse on staff attendance, automatically updated weekly from the data source. Here we see the staff attendance data for the Central Middle School, with a year-to-date running history at the top, last week's absence list, an individual staff history, and a running total of substitute teacher expenses. By selecting "All," the central office can see this information for the entire district.
Administrators can be kept up to date on common assessments with a quick view of how our subgroups are doing in each school and class. The detailed view of individual students and classes can be reached by drilling down, as will be seen below in Data Walls.
ELECTRONIC DATA WALLS

Wouldn't it be great to have an electronic data wall on a smart board that pulled your student achievement, assessment, demographic, and other data together into interactive dashboards? During common planning or professional development time, teachers could interact with the data all in one place and at the same time. When everyone is looking at the same clearly presented information and interacting with it, rich conversations can take place about student learning.

Let's continue with an overview of student growth for a grade level across a series of common local assessments, or unit assessments. In this view we see the overall growth for each class, represented by the size of the balloons. By selecting the filters on the right, we can, for example, view the girls who are not students with disabilities in the East Street School. By changing filters, we can view the overall growth of any selected subgroup. Selecting "All" clears all the filters and shows growth for all students in the district.
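One simple way an overall-growth figure like this might be computed is as the average gain from the first to the last unit assessment in each class. This is only a sketch under that assumption; the formula behind the actual dashboards is not specified here, and the student data is invented.

```python
# Simplified sketch: average growth per class from unit assessment scores.
# The growth formula (last unit minus first unit, averaged per class) is an
# assumption for illustration, as are all names and scores.

students = [
    {"class": "McGrath", "unit1": 60, "unit4": 75},
    {"class": "McGrath", "unit1": 70, "unit4": 78},
    {"class": "Jones",   "unit1": 65, "unit4": 68},
]

def overall_growth_by_class(students):
    """Average (last unit - first unit) gain per class -- the kind of
    figure a dashboard could map to balloon size."""
    gains = {}
    for s in students:
        gains.setdefault(s["class"], []).append(s["unit4"] - s["unit1"])
    return {cls: sum(g) / len(g) for cls, g in gains.items()}

growth = overall_growth_by_class(students)
```

Filtering by subgroup amounts to running the same aggregation over a subset of the student rows, which is why the dashboard can recompute the balloons instantly as filters change.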
This view can be drilled down to specific students by clicking on a balloon. This displays the growth on each unit for the girls in Ms. McGrath's class. The students can be sorted by any of the columns. For example, by clicking on the small sorting icon next to Overall Growth, the names will sort from lowest to highest (as shown in the view above) or from highest to lowest. The same can be done for any of the unit assessments. The color enhancement of the data leads immediately to observations such as "We did well in Unit 2." Using the filters, we can go back to the first page with the balloons and click on different teachers to see their performance for Unit 2, for example Ms. Jones. Using data in this way fosters rich conversations about teaching and learning, which in turn inform instruction.
ENHANCEMENTS TO STATE REPORTS

In Massachusetts, the Department of Elementary and Secondary Education provides, through Edwin Analytics, beautifully detailed reports on MCAS, as it no doubt will for PARCC. But the reports are presented as static crosstabs that do not easily bring out the information needed from them. Fortunately, the state does produce (and is working with us to improve) flat files that can be connected to tools such as Tableau and visualized using all the bells and whistles such tools provide. For example, Edwin has a 606 report displaying the results of a test broken down by standard, providing information on each student's achievement relative to the school, district, and state. Here is a sample Tableau visualization of the 606 report: The bars on the left of the visual show the percent correct in each standard and question type. The color of each bar indicates the relationship between that percent correct and the state average percent correct, varying continuously from dark red through pink and grey to dark green. We can see instantly that, even though these students in grade 4 at the East Street School scored highest in the Knowledge of Language standard, with 76.4% correct, the dark red indicates that it was among the lowest with respect to the state average. This view can be drilled down to the classroom, subgroup (not shown here), and individual student level, or up to the district, by selecting the filters on the right of the visual.
This information can further be combined in a dashboard with local information such as attendance, grades, and local assessments to give a deeper view into teaching and learning. This visual pulls together the results of MCAS by standard (shown at the top), ethnicity from SIMS, Overall Growth from local assessment data, last year's SGP from MCAS, and attendance rate from the student information system.
CONCLUSION

B.I. industry expert and author Frank Buytendijk states: "Measurement impacts our personal lives every single day. If we want to lose some weight, we start by standing on the scale. Based on the outcome, we decide how much weight we need to lose, and every other day we check our progress. If there is enough progress, we become encouraged to lose more, and if we are disappointed, we're driven to add even more effort in order to achieve our goal. In short, measurement drives our behavior."

School leaders often state this as "if you don't measure it, it won't improve." However, the measurement and follow-on analysis should be done collaboratively, with tools that are easy to use and understand and that can be put in the hands of all stakeholders at a low cost. It is not necessary (or desirable) for a school district to employ a Ph.D. in statistics to put the tools in place to establish the data-driven culture that teachers need to succeed.

BKLSchoolVision will guide your school or district through the process of gathering the appropriate data, cleansing it, and building your custom visualizations. We provide training and professional development for teachers and IT staff to update and refresh the data. We would be privileged to help you begin or continue your journey.

ABOUT US

Paul A. Livingston, Ph.D. has over 30 years of experience in public school education. He has taught at the elementary, middle school, and high school levels. Paul was a school administrator for 25 years, serving as a school business manager for 5 years and a school superintendent for 20 years. Recently he has worked with schools in several states on turnaround efforts.

Elaine Braun-Keller, Ph.D. has a background in mathematics and chemical physics. She worked as a college professor, researcher, and computer scientist for over 15 years. She has spent the last 10 years working in K-12 education as a data and technology specialist, helping districts use data to inform instruction.
Her experience in K-12 education also includes two terms on her local school committee.