STATE CAREER AND TECHNICAL EDUCATION (CTE) SELF-ASSESSMENT
Assessing the Progress and Future Planning of CTE: A Self-Assessment Tool for State Agencies

Created by:
Tom Kelsh, Senior Research Associate, MAGI Educational Services, Inc.
Karen Batchelor, State Director for Career and Technology Education, Texas Education Agency
Dan Covington, Director for Fiscal and Information Management, Tennessee Department of Education
Bernie McInerney, Tech Prep Coordinator, New York State Education Department
Kathy Shibley, State Director of Career-Technical and Adult Education, Ohio Department of Education
Purpose The State Career and Technical Education (CTE) Self-Assessment is a comprehensive, voluntary instrument designed to help guide states' program improvement efforts.
Purpose (continued) The instrument identifies many activities, tasks, processes, and collaborations that, if they occur consistently, ensure that CTE programs are being implemented with a high degree of quality.
Purpose (continued) By using this tool in a dynamic, ongoing way, states can identify the many existing CTE practices and policies that constitute quality and use them as building blocks for system-wide continuous improvement, from properly administering their basic grants and Tech Prep programs to using their accountability data to fund local programs.
Purpose (continued) The process of the self-assessment also provides intangible value beyond any written reports or assessments because it:
- builds commitment and ownership on the part of the state-level staff who participate in the process;
- promotes team building and consensus among state CTE leaders;
- increases the capacity for strategic thinking in the field of CTE; and
- builds an understanding of what the federal government requires of states with respect to quality performance.
Purpose (continued) Finally, state CTE teams who engage in the CTE self-assessment prior to the OVAE monitoring visit will be better prepared to take full advantage of the exchange of ideas and technical assistance provided. They will have considered the views of key stakeholders, assembled and digested information on the different components of CTE, and come to a consensus on the current status of their statewide efforts.
Directions for Use
The CTE Self-Assessment asks state agencies to rate their CTE programs according to 30 quality indicators. The ratings should take into account the various pieces of evidence that define each indicator. A five-point rating scale representing a continuum of implementation progress has been developed and is described below.

1 - No Implementation: Our state is not implementing any (or hardly any) of the evidence for this indicator.
2 - Minimal Implementation: Our state is implementing some of the evidence for this indicator, but most of our efforts are in the planning stage. Substantial work is needed to improve our approach.
3 - Moderate Implementation: Our state is implementing most of the evidence for this indicator, but some gaps in implementation exist and improvements could be made.
4 - Complete Implementation: Our state is implementing most of the evidence for this indicator. Our approach is systematic and organized with no major gaps.
5 - Exemplary Implementation: Our state is implementing all of the evidence for this indicator. We have a sound, systematic approach that could serve as a model for other states.
Recommended Steps The following steps are recommended to conduct the state CTE self-assessment. 1. Identify and recruit the key CTE stakeholders to complete the self-assessment. A variety of approaches to conducting this step can prove effective. Regardless of the approach used, however, it is important to enlist input from key stakeholder groups.
Recommended Steps (continued) 2. Gather supporting evidence and data. The instrument should be completed by knowledgeable stakeholders who use as much supporting evidence as possible. Sources of information can include the state plan, reports, minutes of meetings, mission/vision statements, policies, written documentation and data gathered through interviews with stakeholders, student records, program site visits, third-party evaluation evidence, financial records, proposals, local applications, monitoring tools, the state's professional development plan, progress reports, and so forth.
Recommended Steps (continued) 2. Gather supporting evidence and data (continued). Examples of information sources for each indicator can be found in the document, State CTE Self-Assessment Sample Data Sources, beginning on page 40.
Recommended Steps (continued) 3. Complete the self-assessment. Carefully read the evidence for each indicator. If the evidence is in place (i.e., implemented), place a checkmark in the box provided. If you feel that your level of implementation is systematic, without significant weaknesses or gaps, place an asterisk (*) next to the checkbox. Then review these individual assessments and decide on a final rating for the indicator; fill in the appropriate circle. Use the Notes section to record any explanatory or expanded information about the state's performance for that indicator. Once you have rated all of the indicators in each major CTE area, transfer your ratings to the Summary Form, beginning on page 36.
Recommended Steps (continued) 4. Provide feedback to CTE stakeholders involved in Step 1. Throughout the process of completing the CTE self-assessment, information should be fed back to the key CTE stakeholders as part of this dynamic process of inquiry and reflection.
Optional Uses The primary use of the CTE self-assessment is to help guide states' program improvement efforts through careful study of statewide policies, procedures, and activities. However, a number of states have found it helpful to use the tool in other, creative ways:
- as a monitoring tool for reviewing local grantee programs, functions, and expenditures;
- as an instructional device for orienting new staff (or reacquainting veterans) with what constitutes quality in the delivery of CTE programs; and
- as a way of communicating the importance of Perkins/CTE to non-CTE state-level stakeholders.
1. State Administration
Quality Indicator 1.1: Mission
The state has a clearly articulated mission for CTE that is consistent with the State Plan; a consolidated set of policies and procedures exists for translating the mission/vision into action.

Evidence (check if implemented):
1. The mission statement accurately reflects the purpose of the CTE initiative, who is served, the services offered, and the outcomes expected.
2. The mission communicates the belief that all students, including special populations, can meet high standards of academic and technical excellence as well as engage in active, productive learning.
3. The state provides leadership for achieving the mission through a coherent set of policies and procedures that govern all areas of program administration, planning, development, and implementation in accordance with the State Plan and Perkins legislation.
4. The state has oriented relevant local- and state-level stakeholders to the CTE mission and policies/procedures (e.g., through training workshops, dissemination of print resources, electronic resources, personal contact, etc.).
5. The state has a process involving local input for periodically revising the mission and policies/procedures to ensure their continued relevance; modifications are made to reflect the evolving knowledge base in CTE.

Final Rating: No / Minimal / Moderate / Complete / Exemplary Implementation
NOTES (evidence of accomplishments, related data/criteria, key stakeholders involved, critical issues, Web site, etc.):
Final Rating Summary Form

1. State Administration
Quality Indicator 1.1: Mission - The state has a clearly articulated mission for CTE that is consistent with the State Plan; a consolidated set of policies and procedures exists for translating the mission/vision into action.
Level of Implementation: None / Minimal / Moderate / Complete / Exemplary
Quality Indicator 1.2: Secondary-Postsecondary Collaboration - The state has established effective working relationships between and among secondary and postsecondary institutions.
Level of Implementation: None / Minimal / Moderate / Complete / Exemplary
Quality Indicator 1.3: Collaboration with Other State Agencies - The state has established collaborative linkages with other state-level agencies and programs involved in workforce preparation.
Level of Implementation: None / Minimal / Moderate / Complete / Exemplary
Quality Indicator 1.4: Use of Reserve - The state uses its reserve funds to foster program improvement.
Level of Implementation: None / Minimal / Moderate / Complete / Exemplary
Quality Indicator 1.5: Local Monitoring - The state monitors local grantees for compliance with Perkins requirements and performance goals.
Level of Implementation: None / Minimal / Moderate / Complete / Exemplary
STATE CTE SELF-ASSESSMENT SAMPLE DATA SOURCES

1.1 Mission: The state has a clearly articulated mission for CTE that is consistent with the State Plan; a consolidated set of policies and procedures exists for translating the mission/vision into action.
Sources:
a. State plan for CTE (Note: this data source is relevant for all indicators)
b. Written mission statement and/or set of guiding principles
c. Policies/procedures manual or guidelines (Note: this data source is relevant for all components, e.g., Tech Prep, Accountability, Special Populations)
d. Event calendar/contact log/website identifying workshops, resources, and strategies to inform stakeholders about the CTE mission and policies/procedures
e. Sample of orientation products developed and disseminated

1.2 Secondary-Postsecondary Collaboration: The state has established effective working relationships between and among secondary and postsecondary institutions.
Sources:
a. Strategic plan/action plan for joint work
b. Signed interagency agreements or Memoranda of Understanding (MOU)
c. Event calendar/contact log documenting planning sessions, meetings, and jointly sponsored activities; minutes of meetings
d. Inventory of products/processes produced collaboratively; sample of products jointly developed
e. Copy of local application re: secondary-postsecondary linkage requirements
f. Event calendar/contact log/website identifying workshops, resources, and strategies to help grantees create effective linkages
g. Sample of products developed and/or disseminated to assist grantees
h. Sample of completed local application review instruments and/or approved applications re: grantee capacity for establishing effective secondary-postsecondary linkages
NCCTE Webcast Panel: State CTE Self-Assessment
New York State's Perspective
Bernie McInerney, Tech Prep Coordinator, New York State Education Department
Ohio State University, Columbus, OH
March 27, 2006
NYS CTE Perspective
- US Department of Education, Office of Vocational and Adult Education (OVAE)
- Compliance document decision: Perkins Monitoring Checksheets
- Quality: CTE Tech Prep Self-Assessment Tool
NYS Perkins Team Stages
1. Participation in Self-Assessment Tool conference calls
2. Review of Perkins monitoring checklists
3. Decision: Monitoring Checksheets or Self-Assessment Tool
4. Built the NYS Perkins Review website on the Monitoring Checksheets
5. Retained the Self-Assessment Tool as a complement for reference
NYS Perkins Team Stages (continued, post-OVAE monitoring visit)
6. A user-friendly pilot survey developed using the Self-Assessment Tool ("beyond the boxes")
7. Target audiences for the pilot survey: key State Perkins Team staff, Tech Prep Consortia, and a sampling of CTE Directors
8. Survey results reviewed for possible outcomes
Perkins Compliance Checksheets
Compliance Outcome: USDE Monitoring Report
- The Perkins monitoring review of July 2005 went very well, with only a few compliance issues that were quickly resolved
Quality Initiative: Post-USDE Monitoring Report
- NYS CTE Tech Prep pilot online survey based on the Self-Assessment Tool
Survey Form (scroll to bottom)
Designed so individual areas can be chosen
Notes Sections are provided for feedback
NYS CTE Self-Assessment Process and Outcomes
Strategies being surveyed in NYS:
- use in lieu of the Perkins Monitoring Checklist
- tool for reviewing local grantee programs
- incorporation into our State/Local Plans or final reports, e.g., the narrative for the annual Perkins Consolidated Annual Report (CAR) to OVAE
- orientation instrument for new staff
- in-service instrument for experienced staff
CTE Self-Assessment Tool Process and Outcomes (continued)
- use to complement improvement planning and implementation with regional accreditation organizations in postsecondary institutions
- modified tool for use by local grantee programs, and
- learning other strategies from the NCCTE Webcast panelists from Ohio, Tennessee, and Texas!
Thank You
New York State Education Department
Bernie McInerney, Tech Prep Coordinator
bmcinern@mail.nysed.gov
518-474-4157
Pilot Online Survey Form: http://www.emsc.nysed.gov/workforce/techprep/tech.html
Ohio s Experience with State CTE Self-Assessment Kathy Shibley, Ph.D. Director Office of Career-Technical and Adult Education Ohio Department of Education March 27, 2006
Relationship to Monitoring
- Timing: sequencing with the monitoring visit
- Volume of time required
- Alignment with the monitoring checklist
- Benefits of definition
Future Uses
- State Plan
- Monitoring implementation
- Mid-monitoring check
State CTE Self-Assessments
Our Children Are Our Future: No Child Left Behind
Karen Batchelor, Texas Education Agency
Performance-Based Monitoring (PBM) System
- Performance based, data driven
- 2004-2005: CTE pilot year
- PBM district reports for CTE concentrators during 2003-04
- Intervention stages (1-2-3-4) based on the number of indicators below state standards
Texas Assessment of Knowledge and Skills (TAKS)
Student academic performance in:
- Math
- Reading/ELA
- Science
- Social Studies
PBM Indicators for CTE Concentrators
1. CTE overall performance on TAKS
2. CTE SPED (special education) TAKS
3. CTE LEP (limited English proficient) TAKS
4. CTE ED (economically disadvantaged) TAKS
5. CTE Tech-Prep TAKS
6. CTE annual dropout rate
* Total of 21 CTE indicators
CTE Report-Only Measures
7. RHSP/DAP graduation rate: CTE students earning a recommended or distinguished achievement diploma
8. CTE nontraditional course completion (males)
9. CTE nontraditional course completion (females)
PBM Standards
- Performance level 1: 1-5% below standard
- Performance level 2: 5.1-10% below standard
- Performance level 3: 10.1%+ below standard
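The performance levels above amount to a simple banding of each indicator's gap below the state standard. A minimal sketch of that banding follows; the function name and the treatment of gaps under 1% are assumptions for illustration, not TEA's actual rules:

```python
def pbm_performance_level(pct_below_standard: float) -> int:
    """Band an indicator's percentage-point gap below the state
    standard into a PBM performance level (0 = not flagged)."""
    if pct_below_standard >= 10.1:
        return 3   # level 3: 10.1%+ below standard
    if pct_below_standard >= 5.1:
        return 2   # level 2: 5.1-10% below standard
    if pct_below_standard >= 1.0:
        return 1   # level 1: 1-5% below standard
    return 0       # gap under 1%: assumed not flagged

print(pbm_performance_level(3.0))   # 1
print(pbm_performance_level(7.5))   # 2
print(pbm_performance_level(12.0))  # 3
```

Under the intervention model described earlier, a district's stage would then depend on how many of its 21 CTE indicators land in these bands.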
CTE PBM Summary
2004-05:
- Stage 1: 67 districts
- Stage 2: 23 districts
- Stage 3: 13 districts
- Stage 3+: 3 districts
- Stage 4: 21 districts
* 24 on-site visits
2005-06:
- Stage 1: 158 districts
- Stage 2: 30 districts
- Stage 3: 20 districts
- Stage 4: 26 districts
* 26 on-site visits
District PBM Review Team
1. CTE district/campus administrator
2. Parent of a CTE student
3. CTE teacher
4. CTE student
5. Guidance counselor
6. Business/industry partner
* Other team members as desired
Intervention Stages
Stage 1: Program Review and Improvement Plan
Stage 2: Focused Data Analysis; Program Effectiveness Review/Self-Study; Continuous Improvement Plan
Stage 3: Full Compliance Review
Stage 4: Full Compliance Review; CTE/Civil Rights On-Site Review
Program Effectiveness Review (based on the Perkins State Self-Study)
1. Administrative Leadership
2. Local Perkins Application/Plan
3. Tech-Prep/Advanced Technical Credit
4. Special Populations
5. Civil Rights (CR)
6. Fiscal Management
7. Accountability
Modifying the State Self-Study to Develop the Program Effectiveness Review
1. Customized for the LEA
2. Used indicators only (no evidence)
3. Added Civil Rights indicators
4. Added yes/no for each indicator
5. Added a column for identifying strengths
6. Added a column for areas of improvement
CTE Web Resources
Performance-Based Monitoring: www.tea.state.tx.us/pbm
Program Monitoring and Intervention: www.tea.state.tx.us/pmi
Tennessee Self-Assessment Process
Dan Covington, Director, Fiscal and Information Management
Tennessee Department of Education
dan.covington@state.tn.us
OVAE Targeted Monitoring
- September: notification of the targeted monitoring visit
- Tennessee targeted monitoring visit: December 1-2, 2005
- Preparing for the on-site review, OVAE recommended that we:
  - collect documentation for the six topical areas
  - complete the Perkins Self-Assessment Tool
Perkins Self-Assessment Tool: Why We Did It
- Tennessee had experienced staff changes within the Division
- The current staff members were not at the SDE when Tennessee was monitored in 2002
- The tool presented a unique data collection process to ascertain the depth and quality of programs
Preparing for the Perkins Self-Assessment
16 stakeholders were identified:
- Assistant Commissioner (1)
- Department Directors (3)
- Program Area Consultants (2)
- CTSO Consultants (1)
- Field Service Consultants (2)
- Vocational Directors (LEA) (3)
- TCOVE Executive Director (1)
- Postsecondary, TTC & CC (3)
Focusing on Continuous Program Improvement: Survey Methodology
- A representative team of stakeholders, including secondary and postsecondary, was identified
- The assessment was completed by each individual stakeholder
- Results were compiled, and response percentages and comments were tabulated
Why the Self-Assessment Survey?
- Assess the level of compliance with the Perkins III legislation for the quality indicators
- Assess the depth and quality of our career and technical programs as we continue our 20/20 visioning process for program improvement
Why the Self-Assessment Survey? (continued)
- Assist with the visioning process
- Determine where we were on each quality indicator
- Begin a validation process for the OVAE monitors
- Assess where we needed to be
Survey Results Analysis: Highest Ratings
(ordered by average of responses for Complete or Exemplary Implementation)
1. Local Application
2. Fiscal Responsibility
3. State Administration
Survey Results Analysis: Lowest Ratings
(ordered by average of responses for No, Minimal, or Moderate Implementation)
1. Tech Prep
2. Special Populations
3. Accountability
High Ratings Analysis: Division Initiatives
- Submission of automated local applications
- Submission of online accountability data
- Initiation of risk-based monitoring processes
- Building staff capacity
- Strengthening fiscal management processes (FACTS, CATS, staff reassignment)
Lower Ratings Analysis: Areas-of-Need Focus
- Secondary and postsecondary connections
- Ensuring best results for special populations
- Newness of the automated accountability systems
Targeted Improvement Needs
- Accountability data from Tech Prep
- Student follow-up data reliability
- Clarify the mission
- Improve collaboration with other agencies
- Improve services and outcomes for special populations
Significant Concerns: Targets
- Systematic collaboration with Tech Prep and for equal access for special populations
- Secondary/postsecondary collaboration and statewide articulation agreements
- Automated plan applications
- Preparing special populations for further learning and high-skill, high-wage occupations
Significant Concerns: Targets (continued)
- Assess academic attainment in the accountability system
- Use accountability data to shape continuous improvement
- Determine a reliable assessment of technical skills in the accountability system
Use of Survey Results
- A structure for ongoing program improvement planning
- A move beyond mere compliance
- A database for self-improvement
- A baseline database for where we are on program improvement
- A needs assessment document for program emphasis
Tennessee Action Plan
20/20 Task Force visioning based on four pillars:
- Academic Vision
- Articulation/Transitions
- Communication
- Professional Development
Tennessee's Action Plan
Division's ongoing action plan:
- PMOCs (Project Management Oversight Committees)
- eTIGER data reporting
- CATI academic integration
- Web design restructure
- Serving special populations
- Curriculum alignment with postsecondary
Tennessee Action Plan (continued)
Division's ongoing action plan:
- Perkins online Report Card
- 20/20 Vision Task Force
- Name-change legislation
- Transitions from high schools to colleges and careers (SREB)
- Postsecondary Challenge Grants for community colleges
- Statewide articulation agreements
What We Have Learned from the Self-Assessment
- An excellent planning tool
- The Division's 2005 retreat will focus on the assessment results
- We have archived our files to document where we are and will use them to continue to document our strengths and weaknesses
- A format for future monitoring visits
Tennessee Secondary Program Data, 2004-05
Total course enrollment: 296,224
- Agricultural Education: 30,610
- Business Technology: 81,819
- Contextual Academics: 8,437
- Family and Consumer Science: 51,896
- Health Science Education: 18,378
- Marketing Education: 15,007
- Technology Engineering: 8,780
- Trade and Industrial: 80,576
Individual Student Demographic Data
- Total 9-12 CTE students: 170,134
- Total 9-12 high school students: 284,615
- CTE students as a percentage of all 9-12 students: 59.78%
- Students with disabilities: 28,135 (16%)
- Economically disadvantaged: 90,318 (53%)
- Limited English proficiency: 2,520 (1%)
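The demographic shares above can be reproduced from the raw counts. A quick check (note: the subgroup percentages on the slide appear to be truncated to whole numbers rather than rounded, which is an inference from the figures, not a stated method):

```python
# Reproduce the demographic percentages from the raw counts on this slide.
cte_students = 170_134   # total 9-12 CTE students
all_students = 284_615   # total 9-12 high school students

# CTE share of all 9-12 students, rounded to two decimals as on the slide
cte_share = round(cte_students / all_students * 100, 2)
print(cte_share)  # 59.78

# Subgroup shares are percentages of CTE students; the slide appears
# to truncate these to whole numbers (int() drops the fraction).
for label, count in [("disabilities", 28_135),
                     ("econ. disadvantaged", 90_318),
                     ("limited English", 2_520)]:
    print(label, int(count / cte_students * 100), "%")
```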
Our Mission Tennessee Department of Education Helping Teachers Teach and Students Learn