Trade Adjustment Assistance Community College and Career Training Grant Program
Round 4 Grants: Third-Party Evaluations
Technical Assistance for Developing Your Detailed Evaluation Plan

The Trade Adjustment Assistance Community College and Career Training (TAACCCT) Grant Program requires Round 4 grantees to retain a third-party evaluator who will design and execute a rigorous evaluation of each funded project. The third-party evaluation contractor must oversee the design of the evaluation, the impact/outcomes analysis, the implementation analysis, data collection and analysis, and development of the interim and final reports.

The TAACCCT Solicitation for Grant Applications (SGA/DFA PY-13-10) required applicants to submit a summary evaluation plan with their grant applications, including their plans for: 1) a participant impact or outcomes assessment; and 2) a program implementation assessment. Applicants were encouraged to consider randomized controlled trials, although non-experimental designs are allowed as long as they meet evidence standards and provide a convincing argument for how the alternative design (e.g., a quasi-experimental design such as regression discontinuity) would allow for drawing causal inferences about the effects of the program.

SGA/DFA PY-13-10 also required those awarded a grant to secure a third-party evaluator and then submit a detailed evaluation plan. The detailed evaluation plan must rely on the expertise of the third-party evaluator, reflect an expansion of the summary evaluation plan, incorporate feedback received from DOL on the summary plan, and discuss the required components of the evaluation detailed in SGA/DFA PY-13-10. The plans should be double-spaced, use 12-point font, and be no more than 30 pages in length.

Below is a suggested framework and outline for the detailed evaluation plans, with column names defined as follows:

Component: Labels the elements of the suggested detailed evaluation plan outline. (Note: the numbering is not associated with the numbering in the SGA.)
Description: Offers a description of what to include in this section of the plan.
Reference: Indicates which components are required and cites the specific language in the SGA. Where no reference is provided, the component is recommended, but not required.
Recommendation/Tip: Provides information on expectations for a high-quality evaluation plan and offers some tips on the development of the plan.
I. Table of Contents
Description: List of the sections and any tables and figures in the detailed evaluation plan, with page number references.
Tip: The Table of Contents does not count against the suggested 30-page limit.

II. Introduction
Description: What the evaluation will try to achieve and what the goals of the evaluation are.
Recommendation: Include an introduction that provides a high-level summary of the evaluation design, outlining the research questions, data, methods, and reporting that will be provided.

III. Intervention
Description: What the intervention is and how it is supposed to effect change for the target population; how the funded programs build institutional capacity; and what part of the intervention will be evaluated.
Recommendations:
- Discuss whether the funded program is using a particular evidence-based model, and describe the model.
- Describe each component of the intervention, including ancillary components such as coaching, job placement assistance, and tutoring.
- Describe plans for recruiting and enrolling participants, including target populations.
- Describe each intervention that will be evaluated (if the grant is funding more than one).
Tip: The SGA provides potentially useful references for understanding the evidence base for selected models.
IV. Implementation Design
Description: How the third-party evaluator will analyze the steps taken by the institution to create and run the training program; how the third-party evaluator will assess the operational strengths and weaknesses of the project after implementation; and how the third-party evaluator will suggest how implementation might be strengthened, within appropriate timing so as not to interfere with the impact/outcomes analysis.
Reference: SGA, p. 79, section V.D.1.b (for overall implementation analysis); SGA, pp. 78- (for fidelity to the model); SGA, p. 79, section V.D.1.b (for implementation changes).
Recommendations:
- Include a conceptual framework for the implementation analysis (e.g., theory of change, logic model) and describe how the conceptual framework will be used to guide the implementation analysis.
- Examine fidelity to the model or approach implemented, including whether the program, processes, and systems are operating as intended and, if not, how and why.
Tip: Suggestions for changes to the program design that may be needed to strengthen implementation should be discussed in the interim evaluation report(s), including the timing of and reasons for changes. Please note that changes to implementation should be made within the appropriate parameters so the impact/outcomes analysis is not affected.
Tip: For new programs, it may make sense to delay the impact analysis until adequate time has been allowed for full implementation and for any adjustments to the model/program design.

IV.A. Implementation Research Questions
Description: The research questions that will guide the evaluation. Required research questions as articulated in the SGA:
1. How was the particular curriculum selected, used, or created?
2. How were programs/program designs improved or expanded using grant funds? What delivery methods were offered? What was the program administrative structure? What support or other services were offered?
3. Were in-depth assessments of participants' abilities, skills, and interests conducted to select or enroll participants into the program being evaluated? What assessment tools and processes were used? Who conducted the assessments? How were the assessment results used? Were the assessment results useful in determining the appropriate program and course sequence for participants? Was career guidance provided? If so, through what methods?
4. What contributions did each of the partners and other key stakeholders (employers, workforce system, other training providers and educators, philanthropic organizations, and others as applicable) make in terms of: 1) program design, 2) curriculum development, 3) recruitment, 4) training, 5) placement, 6) program management, 7) leveraging of resources, and 8) commitment to program sustainability? What factors affected partners' involvement or lack of involvement? Which contributions from partners were most critical to the success of the grant program? Which contributions from partners had less of an impact?
Reference: SGA, p. 79, section V.D.1.b.i-iv.
Recommendations:
- Take into account all of the required research questions from the SGA, but include additional questions as appropriate based on the intervention being tested.
- Include questions regarding efforts to expand institutional capacity.

IV.B. Implementation Data Strategies
Description: Identification of the data sources that will be utilized to address the research questions, and how data will be collected and analyzed, including which methods will be utilized.
Reference: SGA, p. 79, section V.D.1.b.
Recommendations:
- Include interviews with staff and stakeholders, which are critical for obtaining information for the implementation analysis.
- Use other methods/sources where feasible, including surveys, observations, document review, focus groups, etc.
- When measuring capacity building, include a description of the indicators that will be used.
Tip: It is important to provide a plan for ensuring that human subjects' rights are protected (addressing consent procedures, participant confidentiality, protection of personally identifiable information (PII), data security procedures, and Institutional Review Board (IRB) review if needed).
Tip: Discuss the reliability of the data being collected.

V. Outcomes/Impact Design
Description: The plan for rigorously evaluating the participant outcomes or impacts, including a complete description of the study methodology, and how the proposed methodology is the most rigorous for the participant outcomes or impacts, given the number of participants (including TAA-eligible workers) the project intends to serve.
Recommendations:
- Justify the selected strategy: whether an experimental or non-experimental impact analysis or an outcomes-only analysis will be conducted.
- Describe the estimated sample size and provide power calculations. If small sample sizes prevent using a treatment/control group or treatment/comparison group design, identify other strategies for benchmarking the program's outcomes, or consider pre/post tests to measure changes over time.
- Include an analysis of outcomes, whether it is an outcomes-only or impact study.
Tip: Only impact analyses with carefully designed comparison groups can be used to assess the effectiveness of TAACCCT-funded programs.
Tip: For programs with small sample sizes, benchmarks could include outcomes of other similar programs for the target population.

V.A. Outcomes/Impact Research Questions
Description: The research questions the evaluation will use to guide the data collection and analysis for this component.
Recommendation: While the SGA does not specify questions to be answered, it does indicate that the purpose of the outcomes/impact analysis is to rigorously evaluate participant outcomes and impacts. Thus, research questions should be developed to guide this analysis.
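To illustrate the power calculations recommended in section V, the sketch below computes a minimum detectable effect (MDE) for a binary outcome such as program completion. This is a simplified illustration, not a method prescribed by the SGA: the sample sizes, baseline completion rate, significance level, and power target are all hypothetical placeholders, and a real evaluation would also account for attrition, clustering, and covariate adjustment.

```python
# Illustrative minimum detectable effect (MDE) calculation for a binary
# outcome (e.g., program completion). All inputs below are hypothetical
# placeholders, not values taken from the SGA.
from math import sqrt
from statistics import NormalDist

def mde_proportion(n_treatment, n_control, baseline_rate,
                   alpha=0.05, power=0.80):
    """Smallest treatment-control difference in a proportion that the
    design can detect at the given significance level and power."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # two-sided test
    z_power = z.inv_cdf(power)
    # Standard error of the difference in proportions at the baseline
    # rate, assuming independent groups.
    se = sqrt(baseline_rate * (1 - baseline_rate)
              * (1 / n_treatment + 1 / n_control))
    return (z_alpha + z_power) * se

# A hypothetical program expecting 300 treatment and 300 control
# enrollees, with a 50% completion rate absent the intervention:
mde = mde_proportion(300, 300, 0.50)
print(f"MDE: {mde:.3f}")   # roughly an 11 percentage point difference
```

Running this shows why the SGA's emphasis on sample size matters: a design of this size can only detect fairly large effects, and halving the sample would push the MDE higher still.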
V.B. Outcomes (if most rigorous method selected)
Description: Outcomes to be analyzed, expanding on or refining what was discussed in your summary evaluation plan, and how the nine outcomes required in the SGA will be used in the evaluation. If no control or comparison group will be used, an outcomes-only analysis must be fully justified as the most rigorous analysis feasible.
Recommendations:
- Offer hypotheses for how the intervention will affect the outcomes of interest.
- Specify how and when outcomes will be measured, including a discussion of how the measures are valid, appropriate, and reliable.
Tip: The SGA provides a list of nine required outcomes, but grantees may add others as deemed appropriate (see SGA Appendix L).
Tip: Per the ninth outcome, study the size of the earnings change, in addition to whether there was an earnings increase.

V.C. Experimental Design (if selected method for impact analysis)
Description: How the recruitment plan will yield a sufficient number of qualified applicants (both program and controls) to produce valid estimates of these key outcomes: program completion, credential attainment, placement into employment, and employment retention (Outcomes #2, 5, 7, and 8 in Appendix L), as well as average earnings for those who retain employment. How random assignment will be performed. What procedures will be in place to ensure compliance with random assignment procedures (i.e., that all eligible individuals who apply and are randomly assigned to treatment can receive it, and that those assigned to the control group do not receive the treatment). What procedures will be in place to ensure the fidelity of implementation (i.e., that the features of the intervention occurred in the treatment condition as intended and did not occur in the control condition).
Recommendations:
- Provide information on why the grantee believes it will be able to recruit enough people for the study (given the need to randomly assign individuals to a control group), based on past experience or plans for expanded outreach.
- Provide information on the treatment group: 1) How is entry into the treatment group determined? 2) Will there be placement tests? 3) Are there other explicit or implicit mechanisms underlying allocation to the treatment group? 4) What is the point of random assignment?
- Describe how TAA-eligible workers and veterans will be treated in the evaluation.
- Use a power analysis, such as a minimum detectable effect analysis, to guide determination of appropriate sample sizes.
- Collect as much pre-program data and as many characteristics at entry as possible, especially data on pre-program earnings and employment.
- Indicate if different programs or colleges will be merged in the analysis, and describe if and how merging them will allow the detection of impacts.
Tip: Different evaluation approaches may be needed for TAA-eligible workers and veterans, who cannot be randomly assigned.
Tip: It is important to provide a plan for ensuring that human subjects' rights are protected (addressing consent procedures, participant confidentiality, protection of personally identifiable information (PII), data security procedures, and IRB review if needed).

V.D. Non-Experimental Design (if selected method for impact analysis)
Description: Argument for how the design (e.g., a quasi-experimental design such as regression discontinuity) will allow for drawing causal inferences about the effect of the program. For a comparison group design:
1. The source of the comparison group(s) and how individuals are selected for the comparison group.
2. If matching across groups is used (e.g., demographics, pretest scores, level of education), the statistical techniques for matching should be described, including an explanation of how these techniques are appropriate for the sample size.
3. The procedures that will be in place to ensure the fidelity of implementation (i.e., that the features of the intervention occurred in the treatment condition as intended and did not occur in the comparison condition).
Recommendations:
- Provide information on why the comparison group was selected and why the third-party evaluator/grantee believes it is similar to the treatment group.
- Provide information on the treatment group: 1) How is entry into the treatment group determined? 2) Will there be placement tests? 3) Are there other explicit or implicit mechanisms underlying allocation to the treatment group? 4) Do college staff play a role in determining entry into the treatment group?
- Describe the statistical techniques that will be used to correct for possible selection bias.
- Use a power analysis, such as a minimum detectable effect analysis, to guide determination of appropriate sample sizes.
- Collect as much pre-program data and as many characteristics at entry as possible, especially data on pre-program earnings and employment.
- Indicate if different programs or colleges will be merged in the analysis, and describe if and how merging them will allow the detection of impacts.
Tip: Since treatment and comparison group members need to be as similar as possible, a broad set of variables is preferred to match on observable characteristics. If feasible, include variables related to attitudes/motivations, especially if there is strong treatment group self-selection. Note that a broader set of variables requires larger sample sizes.

V.E. Outcomes/Impact Data Collection and Analysis
Description: The data collection methods and data source(s) that will be used for the outcomes/impact analysis. How the anticipated follow-up data will be successfully collected from participants and the control/comparison group (if using an experimental or non-experimental design). The plan for data analysis, including the statistical methods that will be used to measure the impacts of the training on participants based on participant-level data.
Recommendations:
- Indicate how personally identifiable data will be transmitted securely to the third-party evaluator.
- Specify the source of data on employment outcomes and plans for collecting unemployment insurance wage records or other state data.
- Specify the frequency and schedule for data collection.
- Include a discussion regarding handling PII and data security.
- Discuss strategies for addressing missing data.
- Indicate whether outcomes will be analyzed using descriptive statistics and/or causal analysis.
- Describe the variables to be used for estimation models.
- Describe any subgroup analysis to be conducted, such as by program, college, year of funding, or any demographic subgroup.
- Discuss planned sensitivity analyses to determine the robustness of the findings.
Tip: Sensitivity analyses could be conducted regarding how sensitive results are to the selection of covariates, comparison groups, and the timing of the outcomes.

VI. Limitations
Description: The challenges and limitations likely to be encountered throughout the execution of the evaluation, and their implications for findings.
Recommendation: Discuss limitations related to such issues as internal and external validity (e.g., attrition or non-response bias, selection bias, cross-contamination), the ability to collect certain data, small sample sizes, the inability to assure that the comparison group is sufficiently similar to the treatment group, or other factors that might affect the analysis.
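The comparison-group logic recommended for non-experimental designs in section V.D can be sketched in miniature as one-to-one nearest-neighbor matching on a single pre-program characteristic. This toy example is only illustrative: a real TAACCCT evaluation would match on a broad set of observables (e.g., via propensity scores) and correct for selection bias, and every record below is invented.

```python
# Toy sketch of one-to-one nearest-neighbor matching on a single
# pre-program characteristic (annual earnings). A real evaluation
# would match on many covariates; all records here are invented.

# (pre_program_earnings, employed_after_exit) for each individual
treatment = [(18_000, 1), (22_500, 1), (31_000, 0), (40_000, 1)]
comparison_pool = [(15_000, 0), (19_000, 1), (24_000, 0),
                   (30_500, 1), (38_000, 0), (45_000, 1)]

def match_and_estimate(treated, pool):
    """Match each treated case to its nearest comparison on earnings
    (with replacement) and return the difference in mean outcomes."""
    matched_outcomes = []
    for earnings, _ in treated:
        nearest = min(pool, key=lambda record: abs(record[0] - earnings))
        matched_outcomes.append(nearest[1])
    treated_mean = sum(outcome for _, outcome in treated) / len(treated)
    matched_mean = sum(matched_outcomes) / len(matched_outcomes)
    return treated_mean - matched_mean

impact = match_and_estimate(treatment, comparison_pool)
print(f"Estimated employment impact: {impact:+.2f}")
```

The estimate is only as credible as the matching variables: if treatment group members differ from the pool on unobserved characteristics (motivation, for example), the matched difference will confound the program effect with selection, which is exactly the limitation sections V.D and VI ask the plan to address.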
VII. Reports
Description: Plans to submit a final evaluation report, due to the Department at the end of the grant period of performance; plans to submit at least one interim report, including the evaluation design and evaluation findings to date, at a time determined by the grantee; and a timeline for transmitting these reports.
Reference: SGA, section V.D.
Recommendations:
- Include a discussion of evaluation activities, milestones, and reports in the timeline.
- Discuss how the outcomes/impact analysis results and the implementation analysis findings will be integrated.
- Describe how the third-party evaluator will provide information on the evaluation to the grantee for the purpose of quarterly report submission.
- Discuss the independence of the third-party evaluator, including with respect to publications and release terms.

VIII. Reference List
Description: List of the literature cited in the detailed evaluation design plan.
Tip: This does not count toward the suggested 30-page limit.
Career Pathways as a Framework for Program Design and Evaluation David Fein Abt Associates Inc [email protected] A Webinar September 20, 2012 Acknowledgments Presentation based on paper developed
Guided Pathways to Success in STEM Careers. Request for Proposals
Guided Pathways to Success in STEM Careers Request for Proposals June 2013 Table of Contents Table of Contents... 2 Introduction... 3 Principles and Practices of Guided Pathways to Success... 4 Complete
Workforce Innovation and Opportunity Act Frequently Asked Questions July 22, 2014
Workforce Innovation and Opportunity Act Frequently Asked Questions July 22, 2014 The following Frequently Asked Questions are drafted in the context of the Workforce Innovation and Opportunity Act (WIOA)
Saint Paul Early Childhood Scholarship Program Evaluation
December 31, 2011 Saint Paul Early Childhood Scholarship Program Evaluation Final Evaluation Report 2008-2011 SRI Project 18280 Submitted to Duane Benson, Executive Director Minnesota Early Learning Foundation
CHAPTER 6 STANDARDS FOR NURSING EDUCATION PROGRAMS
Section 1. Statement of Purpose. CHAPTER 6 STANDARDS FOR NURSING EDUCATION PROGRAMS (a) To foster the safe and effective practice of nursing by graduates of nursing education programs by setting standards
Ontario Leadership Strategy. Leadership Succession Planning and Talent Development Ministry Expectations and Implementation Continuum
Ontario Leadership Strategy Leadership Succession Planning and Talent Development Ministry Expectations and Implementation Continuum Contents 1. Purpose 2. Why Succession Planning and Talent Development?
State Transition to High-Quality, College/Career-Ready Assessments: A Workbook for State Action on Key Policy, Legal, and Technical Issues
State Transition to High-Quality, College/Career-Ready Assessments: A Workbook for State Action on Key Policy, Legal, and Technical Issues Updated as of November 14, 2013 Page 1 Table of Contents Part
