Program Evaluation Plan: A Prescription for Healthcare Training in Tennessee (RX TN)
Date of Interim Report: 11/30/14
Date of Final Report: 9/30/16




Evaluation Methodology: Comparison Group (Non-Experimental)

PURPOSE. The purpose of this evaluation is to collect, analyze, and interpret data pertaining to RX TN that will (a) lead to continuous program improvement and (b) determine the extent to which the various program components are associated with positive outcomes and impacts in the lives of program participants. Because this is a multi-year project, the evaluation will begin with an implementation evaluation to ensure that all program activities are being conducted as planned. While the implementation evaluation will be ongoing, the evaluation focus will broaden to include participant outcomes and impacts later in the project cycle.

To promote engagement in evaluation activities throughout the grant period, the primary stakeholders of this project were identified prior to program implementation. The U.S. Department of Labor (USDOL), the grantor of this project, is the primary stakeholder and seeks to fund programs that lead to improved education and employment outcomes. Students who receive student services provided by this grant, and/or who enroll in one of the grant-funded career training programs, also have a large stake in program implementation and outcomes; these students are thus a crucial source of data during both the implementation and outcomes stages of the evaluation. Each of the career training fields will have a designated curriculum specialist who also has a vested interest in evaluation findings. The program director, the assistant director, all other staff funded by this grant, and community partners, including employers, are the other identified stakeholders of this project.
Members of stakeholder groups have already contributed to the evaluation planning process, and they are expected to remain involved throughout the evaluation in an interactive process with the third-party evaluator. This will ensure that the most pertinent evaluation questions are addressed, that appropriate and feasible methods for data collection are developed and used, and that evaluation results are disseminated and used as appropriate.

EVALUATION QUESTIONS. The evaluation questions addressed by this evaluation were created by working collaboratively with grant team leaders to ensure a shared understanding of the theory of change underlying the program and to align evaluation processes with the informational needs of these stakeholders. Engaging this subset of stakeholders (i.e., the intended users of the evaluation findings) early in the process helps promote ownership of the evaluation, as well as eventual use of its results. Information regarding program implementation and program outcomes will enable program stakeholders to improve program processes and to judge whether program activities are leading to intended outcomes.

The purpose of the implementation evaluation is to determine the extent to which the various project inputs, activities, and outputs are occurring as intended by the program creators. This will allow project staff to (a) make adjustments when program activities are not occurring as planned and (b) determine whether program outcomes and impacts may be attributed to these planned activities. A third-party evaluator will therefore be involved early in the planning stage to provide real-time feedback to program staff, and a significant share of evaluation resources will be spent answering the planning and implementation evaluation questions presented in the top portion of Table 1. By thoroughly addressing implementation, specific recommendations can be made to the USDOL and other institutions as future programs are developed and funded. The central purpose of the evaluation of program outcomes and impacts is to determine the extent to which the program accomplished its goals.
By utilizing matched comparison groups and statistically controlling for differences between the participant and comparison groups (e.g., by using analysis of covariance), results from the evaluation can be used to estimate the value added by the program. In other words, the design used for this evaluation will enable measurement of the impact of various program components on participants by contrasting participant outcomes and impacts with those of non-participants (see the bottom portion of Table 1 for the outcomes/impacts evaluation questions).

METHODS. To address the evaluation questions presented in Table 1, a mixed-methods approach will be used to collect information from multiple sources. Using qualitative and quantitative data sources concurrently will allow the third-party evaluator, with assistance from the data manager, to gain a multi-dimensional perspective that supports a more thorough analysis and promotes triangulation. Table 1 shows the methods and data sources for each evaluation question. Importantly, the table shows that data on both the comparison groups and the program participant groups (described below) will come from the same data sources. All participant and comparison group members will be affiliated with a consortium institution; it will therefore be possible to collect the information needed by USDOL (e.g., name, date of birth, and Social Security Number) to track comparison group members during the student enrollment process.[1] Because evaluation is ongoing, and especially because this is a multi-year program, it is likely that some of these components will be modified along with the changing priorities and data needs of the primary stakeholders. As such, the utility of this plan will be examined on a routine basis.

DESIGN & DATA ANALYSIS.
The various programs of study, along with the diverse nature of the evaluation questions and corresponding methods of data collection, will necessitate a comprehensive yet flexible design and data analysis plan. Random assignment into participant and control groups is not practical for this project because educational training will be provided to all students who apply to, are accepted into, and enroll in a grant-funded program of study. In other words, program directors will select participants from a pool of self-selected candidates based on their assessment of students' aptitude for the programs. When appropriate, the planning and implementation evaluation will address whether the program is organized, managed, and delivered in a consistent manner across RX TN institutions.

Comparison groups (consisting of the same number of students as the participant groups) will be used to measure grant impact on participants in most of the training programs. Overall, program participants will differ from the comparison cohort because only the participant group will have received the various support "prescriptions" included in the RX TN grant. For instance, participants will be provided with skills assessment and readiness services, career exploration and academic planning services, academic preparation and supplementary instruction services, and retention and completion coaching services. With the exception of the Surgical Technology, Patient Care Technician, and Emergency Medical Dispatcher programs, for which no appropriate comparison data are available, posttest comparisons of participant and comparison groups will be used to measure grant impacts (see Table 2 for a description of all comparison groups). When appropriate, comparison and participant groups will be matched on age, gender, and institution to attenuate the effect of nonequivalence on validity. When group differences exist, statistical analyses will control for these variables. The last column of Table 1 indicates the data analysis procedures that will be used to address each evaluation question.

[1] This consortium has agreed to provide these data to the Department of Labor on an annual basis during the grant period of performance.

INTERPRETATION & DISSEMINATION.
So that evaluation results are meaningful and conducive to program improvement, data will be interpreted and conclusions will be drawn by the third-party evaluator in conjunction with the RX TN director and other stakeholders as appropriate. Evaluation results will be shared regularly with RX TN management to ensure that adjustments are made in a timely manner. The third-party evaluator will work with grant management to develop a plan to disseminate findings, some of which will be included in the interim report. (The dissemination plan will include regularly scheduled meetings with stakeholders and/or the development of multiple, tailored evaluation reports aimed at specific stakeholder groups.) The evaluator will also assist grant management with decision-making that leads to the modification and improvement of processes that will be evaluated and reflected in the final report. This dissemination plan is tied to the utility standard of evaluation (Yarbrough, Shulha, Hopson, & Caruthers, 2011) and will ensure that results are used in purposeful ways that support continuous program improvement. Finally, consistent with the grantor's goal to fund programs that supply TAA workers and other adults with the skills and credentials needed to gain high-wage, high-skill employment, the evaluator will make evidence-based suggestions for future replication of the program in the final report.

THIRD-PARTY EVALUATOR SELECTION. The third-party evaluator of RX TN must practice in accordance with the national program evaluation standards and the Guiding Principles of the American Evaluation Association (AEA, 2008). Using AEA's Guiding Principles ensures that program evaluators engage in rigorous evaluations based on systematic, data-based inquiry. Using the Guiding Principles in the selection process also helps ensure that only competent, ethical, and respectful evaluators are chosen to facilitate the evaluation of this program. The process of selecting an evaluator requires adherence to state and federal purchasing guidelines to appropriately bid for these services. Roane State expects to retain a third-party evaluator by April 1, 2013.

REFERENCES.
American Evaluation Association. (2008). Guiding principles for evaluators. American Journal of Evaluation, 29, 397-398.
Yarbrough, D. B., Shulha, L. M., Hopson, R. K., & Caruthers, F. A. (2011). The program evaluation standards: A guide for evaluators and evaluation users (3rd ed.). Thousand Oaks, CA: Sage.
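The matching and covariance-adjustment strategy described in the DESIGN & DATA ANALYSIS section can be sketched in code. The sketch below is illustrative only, not the evaluator's actual procedure: the record fields, the two-year age tolerance, and the use of ordinary least squares to produce the covariate-adjusted group difference are all assumptions made for the example.

```python
import numpy as np

def match_comparison(participants, pool, age_caliper=2):
    """Greedy 1:1 matching: pair each participant with an unused comparison
    student at the same institution, of the same gender, whose age falls
    within `age_caliper` years (an assumed tolerance)."""
    used, pairs = set(), []
    for p in participants:
        for i, c in enumerate(pool):
            if (i not in used
                    and c["institution"] == p["institution"]
                    and c["gender"] == p["gender"]
                    and abs(c["age"] - p["age"]) <= age_caliper):
                used.add(i)
                pairs.append((p, c))
                break
    return pairs

def adjusted_group_effect(outcome, group, covariates):
    """ANCOVA-style adjustment: regress the outcome on a participant
    indicator (1 = participant, 0 = comparison) plus covariates; the
    indicator's coefficient is the covariate-adjusted group difference."""
    X = np.column_stack([np.ones(len(outcome)), group, covariates])
    beta, *_ = np.linalg.lstsq(X, np.asarray(outcome, dtype=float), rcond=None)
    return beta[1]  # coefficient on the group indicator
```

For instance, if participation adds a fixed number of points to a posttest score once age is held constant, `adjusted_group_effect` recovers that difference even when the two groups differ in average age, which is the rationale for controlling statistically when matching leaves residual nonequivalence.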

Table 1. Evaluation Questions, Methods, Data Sources, and Analyses

Planning/Implementation Evaluation

1. What process was used to plan the various program components, including student services?
   Key variables: (a) curriculum selection, creation, and application; (b) design or expansion using grant funds; (c) delivery methods of program curricula; (d) administration structure; (e) support services to participants; (f) types of in-depth assessment of participants' abilities, skills, and interests used for selection into a program and for determining the appropriate program and/or course sequence, and who conducted the assessment; (g) how assessment results were used, and whether they were useful in determining the appropriate program and course sequence for participants; (h) methods used to provide career guidance.
   Methods: Questionnaires; review of the technical proposal, including the work plan.
   Data sources: Director, Assistant Director, Data Manager, Curriculum Specialists, Coordinators.
   Analysis: Content analysis of qualitative data.

2. What can be done to improve the program during both planning and implementation?
   Key variables: (a) satisfaction of key stakeholders (program participants, team leaders, employers); (b) quality of student support services (e.g., academic planning, Healthcare Career Workshop, academic boot camps, digital literacy training, supplementary tutoring in gateway courses, completion coaches); (c) quality of technology components (e.g., shared online learning tools); (d) contribution of partners to program design, curriculum development, recruitment, training, placement, program management, leveraging of resources, and commitment to program sustainability; (e) outreach to TAA workers; (f) dissemination of curriculum to other schools.
   Methods: Interviews and/or focus groups; surveys.
   Data sources: Grant Director, Assistant Director, Curriculum Specialists, Coordinators, Participants (interviews/focus groups); Employers, Participants (surveys).
   Analysis: Content analysis of qualitative data; descriptive analysis of quantitative data.

3. What factors are contributing to partners' involvement (or lack thereof) in the program?
   Method: Interviews and/or focus groups.
   Data sources: Grant Director, Assistant Director.
   Analysis: Content analysis of qualitative data.

4. Which contributions from partners are most critical to the success of the grant? Which contributions are less critical?
   Method: Interviews and/or focus groups.
   Data sources: Grant Director, Assistant Director, Coordinators.
   Analysis: Content analysis of qualitative data, disaggregated by institution.

5. Are program activities and outputs consistent with what was planned? Is there consistency across institutions?
   Methods: Survey; document analysis.
   Data sources: Participants (survey); progress reports (document analysis).
   Analysis: Content analysis of qualitative data and descriptive analysis of quantitative data, each disaggregated by institution; comparison of progress reports to the project plan.

Outcomes/Impacts Evaluation

6. To what extent was the self-paced competency curriculum (a component of student support services) associated with higher COMPASS test scores?
   Method: Document analysis.
   Data source: COMPASS test scores.
   Analysis: Descriptive analysis of quantitative data; ANOVA (group x campus) comparing the COMPASS test scores of students in healthcare programs at TTCs, TAA workers, and others considering training who received the self-paced competency curriculum with those who did not. (See Table 2 for a description of the comparison groups for each program of study.)

7. To what extent were student support services associated with graduation and retention rates at participating institutions?
   Method: Document analysis.
   Data source: Student data.
   Analysis: Descriptive analysis of quantitative data; ANCOVA (group x campus x gender x age) comparing Allied Health/Nursing hold students who received student support services with those hold students who did not (regardless of whether they enrolled in Allied Health/Nursing); ANCOVA (group x campus x age x gender) comparing Allied Health/Nursing students receiving services with those who did not receive services.

8. How many program participants completed a TAACCCT-funded program of study?
   Method: Survey.
   Data source: Participants.
   Analysis: Content analysis of qualitative data; descriptive analysis of quantitative data.

9. To what extent did program participants achieve mastery of key program outcomes?
   Methods: Document analysis; surveys; certification/licensure exam scores or pass rates.
   Data sources: Student data; Employers and Graduates (surveys); National Healthcare Association certification exam, EMD certification by the State of TN Dept. of EMS, NCLEX-RN, and National Board for Certification in OT.
   Analysis: Descriptive analysis; ANCOVAs (group x campus x age x gender) for each program of study; content analysis of qualitative survey responses and descriptive analysis disaggregated by program; ANOVAs (group x campus) of exam scores for each program of study.

10. How many participants earned degrees and certificates in the various grant-funded programs of study?
11. How many participants who completed a grant-funded program of study entered employment in the quarter after the quarter of program exit?
12. How many participants who completed a grant-funded program of study and entered employment (in the quarter following the quarter of program exit) retained employment into the second and third quarters after program exit?
13. What are the average earnings of participants attaining employment?
    Method: Document analysis.
    Data source: Student data.
    Analysis: Descriptive data analysis (e.g., means, frequencies) of quantitative data; degrees/certificates and employment: ANCOVA (group x campus x age x gender) comparing participants with non-participants; salary: ANCOVA (group x campus x gender x age) comparing the salaries of participants with those of non-participants.
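The employment measures in questions 11 through 13 count forward from the quarter of program exit. The tallying logic can be sketched as below; the record layout (an exit-quarter index plus a map of quarters with positive earnings) and the choice to average earnings over the first post-exit quarter of employment are assumptions made for illustration, not the grant's reporting specification.

```python
def employment_outcomes(completers):
    """Tally post-exit employment measures from quarterly wage records.

    Each completer record is assumed to hold:
      "exit_q" - integer index of the quarter of program exit
      "wages"  - {quarter_index: earnings} for quarters with employment
    """
    entered = retained = 0
    earnings = []
    for c in completers:
        q1 = c["exit_q"] + 1                 # quarter after the quarter of exit
        if q1 in c["wages"]:                 # Q11: entered employment
            entered += 1
            earnings.append(c["wages"][q1])  # Q13 window: an assumption
            # Q12: employed in the 2nd and 3rd quarters after program exit
            if q1 + 1 in c["wages"] and q1 + 2 in c["wages"]:
                retained += 1
    avg_earnings = sum(earnings) / len(earnings) if earnings else 0.0
    return {"entered": entered, "retained": retained,
            "avg_earnings": avg_earnings}
```

Keying every measure to the exit quarter, rather than to calendar dates, keeps cohorts that exit in different terms comparable, which is why the evaluation questions are phrased in quarters relative to program exit.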

Table 2. Participant and Comparison Groups by Program of Study

For each program, the participant group consists of students enrolled in that program at the listed institutions during the grant period; the comparison group consists of the Spring 2013 cohort of students at the listed institutions.

Surgical Tech. (AAS)
  Participant vs. comparison difference: N/A; this is a new program at all consortium schools, so no comparison group is available.
  Participant institutions: Roane, Cleveland, Walters.
  Comparison institutions: none.

Allied Health Sciences (AAS)
  Difference: Enhancements to the curriculum to make it available online to RX TN participants.
  Participant institutions: Roane, Volunteer, Cleveland, Chattanooga, Columbia, Southwest, Walters.
  Comparison institutions: Roane, Volunteer.

Medical Informatics (AAS)
  Difference: Enhancements to the curriculum to make it available online to RX TN participants.
  Participant institutions: Roane, Volunteer, Cleveland, Columbia, Dyersburg, Nashville.
  Comparison institutions: Roane, Walters, Volunteer.

LPN to RN Mobility
  Difference: Participants will experience enhanced pathways from technology center LPN training to community college RN training.
  Participant institutions: Roane, Cleveland, Columbia, Dyersburg, Motlow, Northeast, Pellissippi, Southwest, Walters.
  Comparison institutions: Roane, Cleveland, Walters, Motlow, Dyersburg.

Occupational Therapy Asst.
  Difference: Participants exposed to more online and simulation components; enhancements to the curriculum to make it available online to RX TN participants.
  Participant institutions: Roane, Cleveland, Chattanooga.
  Comparison institutions: Roane, Cleveland, Nashville.

Emergency Medical Dispatch - Public Safety
  Difference: N/A; this is a new program at all consortium schools, so no comparison group is available.
  Participant institutions: Roane, Volunteer, Dyersburg, Northeast, Walters; TTCs: Memphis, Nashville.
  Comparison institutions: none.

E.C.G. Tech
  Difference: Enhancements to the curriculum to make it available online to RX TN participants.
  Participant institutions: Roane, Volunteer, Columbia, Dyersburg, Jackson, Northeast, Walters; TTCs: Memphis, Nashville, McMinnville, Murfreesboro.
  Comparison institutions: Roane.

Patient Care Tech
  Difference: N/A; this is a new program at all consortium schools, so no comparison group is available.
  Participant institutions: Roane, Volunteer, Columbia, Dyersburg, Jackson, Walters; TTC: Nashville.
  Comparison institutions: none.

Phlebotomy Tech
  Difference: Enhancements to the curriculum to make it available online to RX TN participants.
  Participant institutions: Roane, Volunteer, Columbia, Dyersburg, Jackson, Northeast; TTCs: Memphis, McMinnville, Murfreesboro.
  Comparison institutions: Roane, TTC-Nashville.