2013 APQC Education North Star Community Process and Performance Management Conference
About Arlington Public Schools
Pre-K-12 Enrollment: 22,645
Adult Education Enrollment: 14,000
Schools:
- Elementary: 22
- Middle: 5
- High Schools: 3
- Secondary (grades 6-12): 1
- Other Programs: 6

Our Students
APS students hail from 126 nations and speak 98 languages.
- White: 46.8%
- Hispanic: 27.8%
- Black/African American: 10.7%
- Asian: 9.4%
- American Indian/Alaskan Native: 0.4%
- Native Hawaiian/Pacific Islander: 0.1%
- Multiple: 4.8%
Students with subsidized meals: 30.8%

About Arlington, Virginia
Highest level of education: 70.2% bachelor's degrees; 36.7% graduate degrees
Planning
Strategic Plan → Department Plan → School Management Plan → Teacher SMART Goals
All planning is aligned to the five Strategic Plan goals.
Department plans are aligned to specific Strategic Plan goals, strategies, and outcomes.
STRATEGY
B. Provide an infrastructure for learning. APS makes an infrastructure available to students for learning regardless of their location or the time of day. It supports access to information, as well as access to participation in online learning communities. It enables seamless integration of in- and out-of-school learning.

DESIRED OUTCOMES
- APS utilizes state-of-the-art technology that creates engaging, relevant, and personalized learning experiences for all learners regardless of background, language, or disabilities.
- Students and parents are satisfied with the APS learning infrastructure.
- APS employs technology to assess student achievement in authentic and meaningful ways that generate data to diagnose and modify instructional practices.

DATA SOURCES
- APS technology compared to industry standards (e.g., student-to-computer ratios, uptime for core services)
- Student and parent Site-Based and Community Satisfaction Surveys
- Feedback from teacher and employee advisory groups (e.g., CPST, the Collaborative Professional Strategies Team; TCI, the Teachers Council on Instruction)
"You cannot change outcomes without changing the processes that lead to those outcomes." - Dr. Jack Grayson, Founder and Executive Chairman, American Productivity & Quality Center (APQC)
"Teaching middle school is like skinning mules: sometimes you have to hit them on the head with a 2x4 to get their attention." - Mom
Description (what does the process accomplish?): Process for the technical components of online SOL testing
Owner (who is accountable for the process?): Assistant Director, Assessment
Begin Process (what starts the process?): Pearson updates the online testing web site and/or software
End Process (when is the process complete?): Testing is complete
Suppliers (people and organizations which provide inputs): Principals; School Testing Coordinators (STCs); Instructional Technology Coordinators (ITCs); Service Support Center (SSC)
Customers (people and organizations which use the outputs): Schools
Supplier Requirements (key needs of suppliers): Reliable process
Customer Requirements (key customer expectations): Testing runs smoothly; no technology issues during testing; testing is responsive to school needs
Inputs (materials or information used in the process): Testing plans; updated testing web site and/or software
Outputs (the key outcomes or services of the process): Approved testing plan; updated browser as needed; configured testing spaces
Enablers (tools which help the process function): Wireless network; wired network
Guides (policies and procedures which govern the process): State of Virginia Test Implementation Manuals; APS Online Testing Best Practices
SOL Testing Process Flow
1. Browser Update (browser specs updated)
2. Request Testing Plan
3. Complete Testing Plan
4. Review and Approve Testing Plan
5. Update Wireless Infrastructure
6. Set Up Testing Infrastructure according to plan
7. Verify Equipment Setup according to plan. Compliant? No → return to setup; Yes → continue
8. Administer Tests (details not in scope of this document). Testing issues? Yes → Issues Sub-Process
9. Testing complete? No → continue testing; Yes → Testing Complete
Support Sub-Process
1. Testing issue occurs.
2. First two days of testing? Yes → On-Site Support; No → Report and Resolve Issue.
3. Issue resolved? Yes → issue resolved; No → recurring issue? → return to Report and Resolve Issue.
Task Details
Task details can be used to clarify details about the events which occur in the process. They are particularly useful when tasks are complex. Duplicate the task template as appropriate.

Task: Browser Update
Responsibility: SSC
Date(s):
Notes: Update plugins, browser version, etc. to meet requirements for testing

Task: Request Testing Plan
Responsibility: Assessment Office
Date(s):
Notes: Send paperwork to schools asking for their testing plan

Task: Complete Testing Plan
Responsibility: School Testing Coordinator
Date(s):
Notes: Work with the ITC, Principal, and other school staff members as appropriate; obtain the Principal's signature certifying that the scheduling of the test will be in compliance with the technical plan

Task: Review and Approve Testing Plan
Responsibility: Assessment Office
Date(s):
Notes: Send the draft plan to the SSC for review; document which schools have approved plans; maintain a copy of proposed and approved plans

Task: Update Wireless Infrastructure
Responsibility: SSC
Date(s):
Notes: Update the wireless infrastructure to meet the needs of the approved testing plan
RACI Matrix
A RACI Matrix is used to clarify roles and responsibilities for a process. RACI matrices are particularly useful when a process crosses multiple groups or when responsibilities are changing as part of process improvement.

RACI Matrix for the SOL Testing process
R = Responsible, A = Accountable, C = Consulted, I = Informed
Roles: Assessment Office, STC, ITC, SSC, Principal, Wireless Administrator, TTS

- Browser update: A C R
- Request testing plan: A I I
- Complete testing plan: C R R C A
- Review and Approve Testing Plan: A C C R R
- Update Wireless Infrastructure: I C A I R
- Set up testing infrastructure: R R A
- Verify equipment setup: I A C I
- Report issues: I A C I I
- Compile and Analyze Issues: A C C R
- Resolve Technical Issues: I R R A R
- First two days on-site support: A R R I R
- On-site support for ongoing issues: A R R R
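A RACI matrix is easy to represent and sanity-check in software. The sketch below is a hypothetical illustration (not an APS tool): it stores activities as dictionaries mapping role names to RACI letters, and checks the common RACI rule that every activity has exactly one Accountable role. The role-to-letter assignments shown are illustrative examples, not the authoritative cell-by-cell mapping of the matrix above.

```python
# Hypothetical sketch: a RACI matrix as nested dictionaries, with a check
# that each activity has exactly one Accountable ("A") role.
# Role assignments below are illustrative, not the official matrix cells.

RACI = {
    "Request testing plan": {
        "Assessment Office": "A", "STC": "I", "ITC": "I",
    },
    "Update Wireless Infrastructure": {
        "Assessment Office": "I", "STC": "C", "SSC": "A",
        "Principal": "I", "Wireless Administrator": "R",
    },
}

def check_single_accountable(matrix):
    """Return the activities that do NOT have exactly one 'A' role."""
    problems = []
    for activity, roles in matrix.items():
        accountable_count = sum(1 for letter in roles.values() if letter == "A")
        if accountable_count != 1:
            problems.append(activity)
    return problems

print(check_single_accountable(RACI))  # -> [] (both activities have one "A")
```

A check like this is useful when responsibilities shift during process improvement, since it flags activities left with no owner (or with competing owners) after an edit.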
Continuous Improvement
Vision (what is the vision for the process?): Online testing will be conducted routinely with no technical issues and minimal impact on instruction.
Goal (what is the goal state for the process?): All schools will comply with approved testing plans by the spring of 2014.
Current Condition (what is the current condition of the process?):
- Testing plans are created
- Schools deviate from testing plans
- Schools do not certify compliance with testing plans
- Testing plans are not reviewed and approved by the Wireless Administrator
- Testing plans are not kept centrally
- Schools annually request network equipment to conduct online testing
Next Target Condition (what is the next planned condition that will move the process closer to the goal?):
- Schools will keep networking equipment and discontinue the practice of requesting equipment
- Proposed and approved plans will be kept centrally
- Plans will be approved according to the RACI Matrix in this document
- Schools will certify that the testing schedule is consistent with the approved testing plan
Improvement Cycles
March 2013:
- Discontinued the process of providing testing network equipment after the 2012-13 school year
- Informed schools to keep testing network equipment
- Created a central repository for approved testing plans
- Expanded review of testing plans to include the Wireless Administrator
- Limited on-site support to the first two days of testing
Accountability
Division Scorecard and Dashboard → Department Scorecard → School Dashboard → Teacher Evaluation
Department scorecards have two measures that are the same across departments, plus measures determined by each department.
[Sample Department Scorecard charts: "Department Core Processes Maturity Level" (process maturity level, 0-7 scale, tracked over time starting Jul-13 and Sep-13) and "Continuous Improvement Results" (0-100 scale, tracked over time starting Jul-13 and Sep-13, for Processes Documented, AIM Cycles Completed, Costs Reduced (Thsd $), Cycle Time Reduced (hrs.), and Increased Reliability), with placeholder quarterly and daily series charts.]
Department Plan
Consolidated 2012-13 Plan for the Department of Information Services
Date: November 1, 2012

Vision: The Department of Information Services provides the right information to the right people: information anywhere, anytime, on any device.
Mission: The Arlington Public Schools instills a love of learning in its students and prepares them to be responsible and productive global citizens.
Goals: Goal 4: Provide Optimal Learning Environments
Strategies/Outcomes: B. Provide an infrastructure for learning. 1. APS utilizes state-of-the-art technology that creates engaging, relevant, and personalized learning experiences for all learners regardless of background, language, or disability.
Action Plans: B.1: Implement an enterprise-level data warehouse
1. Identify data sources related to students, instruction, assessments, and other functional areas by 1/13
2. Prepare data to be system-ready, including phases of data cleanup and preparation, by 2/13
3. Document all relevant data elements in one central metadata repository, allowing for consistency in data integrity, presentation, and use, by 3/13
4. Migrate all existing student data management functionality and operations into the new Student Integration System by 4/13
5. Develop and deliver a detailed, incremental training plan for all groups of data users (staff, students, administrators, parents) by 6/13
Process Maturity Operational Definition
Data Gathering and Analysis: Department Assistant Superintendent
Data Gathering Instrument: Process Maturity Criteria
Data Gathering Process: Once a year, the department assistant superintendent will sit down with each core process owner and ask the six Process/Performance Management (PPM) questions:
1. How do you clarify what the users' and/or customers' requirements or expectations are for the outcomes of the process you own?
2. Where on the Process Maturity Criteria would you rank your process at this time?
3. How many AIM cycles have you completed to improve this process since our last review, and what did you learn from these improvement cycles?
4. Were any cost reductions and/or cycle-time reductions achieved during this past reporting period?
5. When was the last time you talked to the users of this process to better understand how well they think the process is working and to find out whether they have any suggestions for improvement?
6. Whom do you benchmark this process against? Are you satisfied that this is the best benchmark for this process? What have you learned from your benchmark(s) that has helped you improve this process?
APS Process Maturity Criteria (how we know how well a process is being managed)

Maturity Levels:
- No systematic process and no results
- The beginning of a systematic process with no results
- A systematic process is in place with no results
- The process has no significant deployment gaps
- A systematic process responsive to basic requirements
- An effective systematic process with trend data that has been sustained over time
- An effective systematic process with results that meet or exceed benchmark standards

Criteria:
1) The process has clearly identified starting and ending points.
2) Process customers (users) are identified and customer process requirements/expectations are clearly defined. A process measurement system is designed to collect data related to customer requirements and expectations. Customers validate the key metrics in the process measurement system.
3) The process is completely flowcharted using APS process documentation, including necessary detail so that individual accountability for process steps can be identified.
4) Process steps are standardized for all users.
5) Key process metrics show progress in meeting customer requirements/expectations targets.
6) Key process metrics show customer requirements/expectations consistently met or exceeded.
7) Key process metrics show best-in-class performance (results benchmarked against best-in-class results).
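One way to picture how criteria like these roll up into the 0-7 maturity score used on the department scorecard is to treat each criterion as pass/fail and count consecutive passes from the start. This is a hypothetical sketch under that assumption, not APS's official scoring rule; the criterion labels are abbreviated from the list above.

```python
# Hypothetical sketch: scoring process maturity as the number of consecutive
# criteria met, starting from criterion 1. Not an official APS scoring rule.

CRITERIA = [
    "Clearly identified starting and ending points",
    "Customers identified; requirements defined and measured",
    "Completely flowcharted with individual accountability",
    "Process steps standardized for all users",
    "Metrics show progress toward customer targets",
    "Metrics show requirements consistently met or exceeded",
    "Metrics show best-in-class (benchmarked) performance",
]

def maturity_level(criteria_met):
    """criteria_met: list of booleans, one per criterion, in order.
    Returns the count of consecutive criteria satisfied from the start."""
    level = 0
    for met in criteria_met:
        if not met:
            break
        level += 1
    return level

# A process that is flowcharted and standardized but has no metrics yet:
print(maturity_level([True, True, True, True, False, False, False]))  # -> 4
```

Counting consecutive criteria (rather than total criteria met) reflects the idea that later criteria build on earlier ones: best-in-class metrics mean little if the process is not yet standardized.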
[Chart: "Continuous Improvement Results" — 0-100 scale, Jul-13 vs. Sep-13, for Processes Documented, AIM Cycles Completed, Costs Reduced (Thsd $), and Cycle Time Reduced (hrs.).]
[Chart: "Department Core Processes Maturity Level" — process maturity level on a 0-7 criteria scale, Jul-13 vs. Sep-13.]
Improvement
Continuous Improvement Plan → Performance and Process Management → PLCs and Staff Development → APS Improvement Model (AIM)
The AIM model was designed and deployed by department staff. Performance and process management consists of calendared events and activities at the division and department levels.
AIM to Process, Tools, and Skills Map

AIM Improvement Cycle:
- Set the Vision and Goal
- Remember the Goal
- Determine the Current Condition
- Set the Next Target Condition
- PDSA to the Target Condition

Processes: Strategic/School/Department Planning; Staff Evaluation; Classroom Instruction; PLCs; Scorecards; Classroom Assessments; Grade book

Tools and Skills: Benchmarking; Change Management; Communications Plans; Cost/Benefit Analysis; Data-Based Decisions; Process Mapping; Root Cause Analysis; Voice of Customer
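The AIM cycle above has the shape of a simple control loop: assess the current condition, pick a next target condition short of the goal, run PDSA (Plan-Do-Study-Act) toward that target, and repeat while remembering the goal. The sketch below is a hypothetical illustration of that loop structure with made-up function names; it is not an APS implementation.

```python
# Hypothetical sketch of the AIM cycle as a loop. All names (aim_cycle,
# assess, next_target, pdsa_step) are illustrative, not from APS materials.

def aim_cycle(goal, assess, next_target, pdsa_step, max_iterations=10):
    """Drive a process metric toward `goal`.
    assess() -> current condition (a number);
    next_target(current, goal) -> the next intermediate target condition;
    pdsa_step(target) -> run one Plan-Do-Study-Act iteration toward target."""
    for _ in range(max_iterations):
        current = assess()              # determine the current condition
        if current >= goal:             # remember the goal
            return current
        target = next_target(current, goal)  # set the next target condition
        pdsa_step(target)               # PDSA to the target condition
    return assess()

# Toy usage: a "process" whose maturity improves one level per PDSA cycle.
state = {"level": 2}
result = aim_cycle(
    goal=5,
    assess=lambda: state["level"],
    next_target=lambda current, goal: min(current + 1, goal),
    pdsa_step=lambda target: state.update(level=target),
)
print(result)  # -> 5
```

The key design point the loop captures is that AIM never jumps straight to the goal: each iteration sets a nearby target condition and improves toward it, re-assessing before choosing the next step.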
APS Improvement Model (AIM)
Continuous improvement is like any other skill-building activity. It is not meant to be casual, or occasional, or reserved for only when it's convenient. It must be habitual.
Questions?