DRAFT (May 29, 2012)

Proposal for Evaluation of Current and Proposed Driver Education Formats in North Carolina

In the spring of 2012, the School of Government was approached by the North Carolina Department of Public Instruction (NC DPI) about conducting an evaluation of aspects of Driver Education in North Carolina. To determine what would be included in the evaluation, discussions with DPI and General Assembly Program Evaluation Division staff produced the following grid, which pairs the types of Driver Education teaching currently offered or under consideration with the three evaluation criteria of greatest interest to stakeholders.

                                          Evaluation Criteria
Teaching Approach                COST            QUALITY                ACCESS
                                 (per student)   (outputs and results)  (convenience, digital divide)
TRADITIONAL METHOD (classroom)      1               5                      9
BLENDED (hybrid)                    2               6                     10
ON-LINE (interactive)               3               7                     11
ON-LINE (non-interactive)           4               8                     12

NOTE: teaching approaches cover content instruction only and do NOT include Behind the Wheel (BTW) instruction.

The evaluation plan we propose follows this grid: for each cell, we propose below a method and potential data sources to provide information on that particular combination of evaluation criterion and teaching approach.
COST (Cells 1-4)

The evaluation team will identify the cost of Driver Education per student under each approved approach.

1. Traditional Classroom Method: We propose to identify cost per student in schools using a traditional classroom method with a state-wide approach. This will be faster, easier, less expensive, and less of a burden on schools, but it assumes consistency from school to school, which is unlikely, so the method has some limitations.
Schools included: all schools.
Methods: gather school year 2011-2012 data via survey on total program cost and number of students participating in those classes.
Data Sources: survey of all LEAs through their Finance Officers.
Data Analysis: descriptive statistics including cost per student, range, outliers, and other notable patterns.
Limitations: each LEA may design its own approach to classroom instruction within the established curriculum, so there may be dozens of different actual activities with different cost levels. A state-wide average does not reflect the expected wide range of actual activities and related costs; the result would be a broad estimate. Because the data are self-reported, there is no guarantee of consistency or comparability.
Costs: depend on survey type (mail, electronic, or phone), Qualtrics survey software, drafting and editing the survey, and data analysis.

2. Blended Program: Blended programs also vary in the actual activities that take place, and less is known about what "blended" means. The overall approach would be the same as for traditional instruction.
Schools included: same as above.
Methods: same as above.
Data Sources: same as above, plus personnel involved in the program at selected schools.
Data Analysis: descriptive statistics including cost per student, range, outliers, and other notable patterns; a detailed description of what "blended" means in each school; and a detailed connection of aspects of instruction (personnel, materials, overhead) to total costs.
Limitations: same as above.
Costs: depend on the quality and detail of budget documents, availability of personnel to participate, and location of site-visit schools.

3. Online (Interactive): A true interactive program as envisioned by many in the Driver Education community does not currently exist in North Carolina. The information for this option would be gathered from other states that offer this approach.
States included: Texas and Florida (we still need to confirm which other states offer it).
Methods: case studies. We would gather program descriptions, participation data, and cost data from the vendor and the appropriate oversight authority in each case-study state.
Data Analysis: case studies that present information as comparably as possible in the appropriate state context.
Limitations: information gathered would not necessarily be generalizable to North Carolina.
Costs: depend on the number of states included.

4. Online (Pilots):
Schools included: same as for 1 and 2 above.
Methods: same as for 1 and 2 above, but may include site visits to selected participating schools.
Data Sources: same as above, plus personnel involved in the program at selected schools.
Data Analysis: descriptive statistics including cost per student, range, outliers, and other notable patterns, with a detailed connection of aspects of instruction (personnel, materials, overhead) to total costs.
Limitations: same as above.
Costs: depend on the quality and detail of budget documents, availability of personnel to participate, and location of site-visit schools.

QUALITY (Cells 5-8)

The evaluation team will assess the quality of the NC Department of Public Instruction's Driver Education program in the four areas according to the following output and outcome measures. In all cases, we will attempt to compare results for students who took the Driver Education curriculum in each of the four approaches with students who did not take the curriculum at all. Our ability to complete this portion of the evaluation depends on data access from the DMV.

1. Curriculum Outputs
Number of Students: The evaluation team will assess the percentage of qualified students receiving a certificate, as well as other benchmark points up to receiving a driver license.

2. Curriculum Results
Short Term: The evaluation team will assess the percentage of qualified students receiving a driver license on the first attempt, as well as other benchmark points.
Long Term: The evaluation team will assess the percentage of qualified students involved in traffic violations, accidents, accidents with injuries, and accidents with fatalities.
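The descriptive cost analysis planned for Cells 1-4 (cost per student, range, and outlier detection across LEAs) can be sketched as follows. This is an illustrative sketch only: the survey figures below are hypothetical placeholders, not actual North Carolina data, and the 1.5 x IQR rule is one common way to flag outliers, assumed here rather than specified in the proposal.

```python
# Sketch of the planned descriptive statistics for survey cost data.
# All figures are hypothetical; real data would come from the LEA survey.
import statistics

def cost_per_student(total_cost, n_students):
    """Per-student cost for one LEA's Driver Education program."""
    return total_cost / n_students

def summarize(costs):
    """Mean, range, and outliers (flagged by the 1.5 * IQR rule)."""
    costs = sorted(costs)
    q1, _, q3 = statistics.quantiles(costs, n=4)
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return {
        "mean": statistics.mean(costs),
        "min": costs[0],
        "max": costs[-1],
        "outliers": [c for c in costs if c < lo or c > hi],
    }

# Hypothetical survey responses: (total program cost, students served)
survey = [(47_500, 250), (60_000, 300), (41_000, 200), (52_500, 250),
          (43_000, 200), (66_000, 300), (57_500, 250), (180_000, 200)]
per_student = [cost_per_student(c, n) for c, n in survey]
print(summarize(per_student))
```

A summary like this would expose the "broad estimate" limitation noted above: a single state-wide mean can be pulled far from typical LEA costs by a handful of high-cost programs, which is why the range and outlier list are reported alongside it.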
ACCESS (Cells 9-12)

The evaluation team will assess access to the NC Department of Public Instruction's Driver Education program under the four different options along the following lines:

1. Ease of Access:
Demographics: The evaluation team will identify which students are taking the program under each of the four curricula.
Potential Impact: The evaluation team will explore data relevant to the number of potential additional students served through the new online programs.

2. Digital Divide:
Infrastructure: The evaluation team will explore the potential limitations of the program given the lack of access to broadband internet service in parts of North Carolina.
Potential Expansion: The evaluation team will work with internet providers to identify the degree to which future increases in internet access will facilitate additional enrollment in the new NC Driver Education programs.

PROJECT BUDGET

This budget contains all costs associated with executing the work plan outlined above, but it is based on estimates of the time required to accomplish certain tasks and may not match the final cost of the project. The client will be contacted before any additional costs are incurred, and no additional work will occur without the client's consent to modify the scope of work, work plan, or budget.

Initial Research and Project Design     40 hours
Collecting and Compiling Data           60 hours
Analyzing Data                          60 hours
Writing Report                          60 hours
Editing & Presenting                    20 hours
Total Expected Expense                 240 hours   $9,600.00 ($40/hr)

TIMELINE (June-September)

Phase                      Hours
Project Design                40
Collect & Compile Data        60
Analyzing Data                60
Writing Report                60
Editing & Present             20
TOTAL                        240
DELIVERABLE

Work on the draft report will begin in August. The draft report will include background on the project, an explanation of the data-gathering process, and findings. A draft will be ready for NC DPI by August 15, and a final report will be submitted in electronic form to NC DPI on September 15. Printed versions will not be provided unless requested, with an additional charge to cover printing costs.

SCHOOL OF GOVERNMENT GUIDELINES

1. The School of Government maintains a strong emphasis on the non-partisan, evidence-based nature of our work, in line with our mission to improve the lives of North Carolinians through good government.
2. Knowledge gained from this evaluation may be used in classrooms, writing, presentations, or other learning settings, in line with our educational mission as part of the University of North Carolina at Chapel Hill.
3. Whenever possible and practical, we seek advice, review, and collaboration from all those involved in the evaluation, and we welcome input at all stages of the evaluation process. However, to maintain the integrity of our work, we retain the right of final approval for all information included in final evaluation products.
4. Our evaluations follow the research ethics guidelines established by the Institutional Review Board of The University of North Carolina at Chapel Hill, even in instances where this is not officially required.
5. All work conducted by the School of Government is considered public record. Any information related to this evaluation will be released upon request.

EVALUATION PERSONNEL

Andrew George, Project Director, received his PhD in environmental policy from UNC Chapel Hill in 2010. He was a Royster Society Fellow and an Interdisciplinary Research Fellow, and he received the prestigious Tanner Teaching Award. Dr. George has worked with the School of Government on several evaluations since 2008, including the assessment of the Clean Water Management Trust Fund of North Carolina. Before graduate school, Dr.
George worked for over 15 years with non-profit organizations and local governments. He also teaches two graduate-level courses, Analysis and Evaluation I and II (PUBA 719 and PUBA 720), at UNC Chapel Hill, and Conflict Resolution (ENVIRON 296) at Duke University.

Maureen Berner, Faculty Advisor, first joined the School of Government in 1998, teaching program evaluation, statistics, and budgeting. Between 2003 and 2005, she directed efforts to provide new outreach activities for local governments, based on the UNC model, at the University of Northern Iowa. In 2005, she returned to the School of Government to teach and write on research methods for MPA students and public officials. Berner has been active in research and teaching in both academia and government, and her publications include a variety of books, textbooks, and journal articles. She worked
for four years with the Budget Issues Group at the U.S. General Accounting Office, including a rotation to the U.S. House of Representatives Budget Committee while serving as a Presidential Management Intern. Berner received an MPP from Georgetown University and a PhD in public policy from the LBJ School of Public Affairs, University of Texas at Austin.

CONTACTS

Evaluation Director:
Andrew George, PhD
School of Government
University of North Carolina at Chapel Hill
andrewg@unc.edu
828.280.6956 (cell)
skype: andrew.george08

Faculty Advisor:
Maureen Berner, PhD
Professor
School of Government
University of North Carolina at Chapel Hill
mberner@sog.unc.edu