INL PROGRAM PERFORMANCE METRIC EXAMPLES
The purpose of this document is to assist Program Officers and Implementers with developing performance measurements that:

1. Are quantifiable
2. Are consistent with the project scope, schedule, and funding
3. Meet SMART criteria
4. Can be combined with other metrics to determine program impact
5. Link to the INL Functional Bureau Strategy (Goals and Objectives)

INL performance metrics can often be placed within a variety of categories, including:

1. Training courses provided
2. Workshops conducted
3. Train-the-trainer programs
4. Mentoring/advising
5. Measuring complex changes over time
6. Development and delivery to design, build, and/or equip

Within each category, performance levels can be measured depending on the intervention:

1. Reaction: measures how participants reacted to the intervention
2. Learning: measures how much knowledge increased as a result of the intervention (requires baseline data, i.e., pre- and post-test scores)
3. Behavior: measures how far participants have retained their newly learned skills and have changed their behavior or applied what they have learned based on the intervention (requires baseline data)

PERFORMANCE METRICS ASSOCIATED WITH INL-SPONSORED TRAINING COURSES

Training programs are formalized learning experiences that use a prescribed curriculum to transfer practical knowledge, skills, and competencies relating to a specific, useful skill to an audience of learners. Performance measures must include both Level 1 and Level 2 as described below. Level 3 is optional but recommended, if possible.

Reaction (Level 1): Performance metrics can measure satisfaction rates over the period of performance for participants undertaking like courses.
The most common method is an opinion survey or questionnaire in which respondents are offered a choice of five to seven pre-coded responses ranging from "strongly agree" to "strongly disagree," with the neutral point being "neither agree nor disagree." At least one question on the Level 1 survey or questionnaire must address whether respondents felt the course was useful in their current job.

Example for Reaction (Level 1) Performance Metric

Opinion surveys will be distributed to participants of all INL-sponsored training courses conducted, and results will indicate at least a 70%* overall satisfaction rate per course.
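The satisfaction-rate check above can be sketched in code. This is a minimal illustration, not INL procedure: the five-point scale and the cutoff treating "agree" or better as satisfied are assumptions, since the source does not define which responses count toward the satisfaction rate.

```python
# Sketch: computing an overall satisfaction rate from Likert-scale survey
# responses. The 5-point scale and the "agree or better counts as satisfied"
# cutoff are illustrative assumptions, not INL policy.

SCALE = ["strongly disagree", "disagree", "neither agree nor disagree",
         "agree", "strongly agree"]
SATISFIED = {"agree", "strongly agree"}

def satisfaction_rate(responses):
    """Percent of responses that fall in the 'satisfied' categories."""
    if not responses:
        raise ValueError("no survey responses collected")
    for r in responses:
        if r not in SCALE:
            raise ValueError(f"unrecognized response: {r!r}")
    hits = sum(1 for r in responses if r in SATISFIED)
    return 100.0 * hits / len(responses)

# Invented responses for one course:
course_responses = ["agree", "strongly agree", "neither agree nor disagree",
                    "agree", "disagree", "strongly agree", "agree",
                    "agree", "strongly agree", "agree"]
rate = satisfaction_rate(course_responses)
print(f"Satisfaction rate: {rate:.0f}% -> "
      f"{'meets' if rate >= 70 else 'below'} 70% standard")
```

With the invented responses above (8 of 10 satisfied), the course clears the 70% recommended minimum.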
Learning (Level 2): Performance metrics could focus on an expected increase in the participants' knowledge, using pre-test scores to establish the baseline and post-test scores to measure what was learned. The quantitative metric is based on the comparison of the two average scores. A qualitative analysis should then discuss whether the increase in knowledge met the anticipated results. If not, the analysis should also discuss changes that can be made to the training course to secure the desired results.

Example for Learning (Level 2) Performance Metric

Participants trained in the INL-sponsored (title/type of course) will have an average increase in knowledge of at least 10%* between pre- and post-test scores.

Behavior (Level 3): Performance metrics could also measure the degree to which participants are applying what they have learned in the training to their workplace activities and/or behavior. A common means of measurement is observation, whereby participants are observed by the course designer (or supervisor) in a practical, real-world situation and their newly acquired skills are evaluated and scored against a standard checklist. This can be done with a random sample of training participants.

Example for Behavior (Level 3) Performance Metric

One month following INL-sponsored training, a sample of 20%* of participants who underwent the (title/type of course) will undergo a retention scorecard (i.e., learning objectives checklist) and earn a score of at least 50%.*

PERFORMANCE METRICS ASSOCIATED WITH INL-SPONSORED TRAIN-THE-TRAINER PROGRAMS

A train-the-trainer model enables experienced personnel to show a less-experienced instructor how to deliver courses, workshops, and seminars. A train-the-trainer model can build a pool of competent instructors who can then teach the material to other people. Performance can be measured at both Level 1 and Level 2 as described below.
Level 3 is optional but recommended, if possible.

Reaction (Level 1): This follows the same methodology as Level 1 of the training courses and workshops. In this case, respondents rate how well they feel the INL-sponsored training has prepared them to become trainers and teach the material to others.

Example for Reaction (Level 1) Performance Metric

Opinion surveys will be distributed to INL-sponsored train-the-trainer participants, and results will indicate at least a 70%* overall satisfaction rate per course.
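The Level 2 metrics for training courses and train-the-trainer programs rest on the same pre/post comparison. A minimal sketch follows; the scores are invented, and the interpretation of "10%" as percentage points (rather than relative gain) is an assumption not stated in the source.

```python
# Sketch: average knowledge gain between paired pre- and post-test scores.
# Scores are in percent; the "10%" target is read here as percentage points,
# which is an interpretive assumption.

def average_gain(pre_scores, post_scores):
    """Mean per-participant gain (post minus pre), in percentage points."""
    if len(pre_scores) != len(post_scores) or not pre_scores:
        raise ValueError("need paired, non-empty pre/post score lists")
    gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
    return sum(gains) / len(gains)

# Invented paired scores for five participants:
pre = [55, 60, 48, 72, 65]
post = [70, 68, 61, 80, 74]
gain = average_gain(pre, post)
print(f"Average gain: {gain:.1f} points -> "
      f"{'meets' if gain >= 10 else 'below'} the 10% target")
```

Pairing the scores per participant (rather than comparing only the two class averages) also makes it possible to flag individuals whose scores declined.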
Learning (Level 2): This follows the same methodology as explained in Level 2 of the performance metrics associated with training programs. Performance metrics could focus on an expected increase in the participants' knowledge, using pre-test scores to establish the baseline and post-test scores to measure what was learned. The quantitative metric is based on the comparison of the two average scores.

Example for Learning (Level 2) Performance Metric

Participants trained in the INL-sponsored (title/type of course) will have an average increase in knowledge of at least 10% between pre- and post-test scores.

Behavior (Level 3): This follows the same methodology as explained in Level 3 of the performance metrics associated with training programs. Performance can be evaluated by observing new instructors' ability to present information effectively, respond to participant questions, and lead activities that reinforce learning. This can be measured by the course designer or a subject-matter expert using a standardized checklist.

Example for Behavior (Level 3) Performance Metric

20%* of train-the-trainer students who underwent the (title/type of course) and were later observed will receive a retention score of at least 50% based on a train-the-trainer retention scorecard (i.e., teaching objectives checklist).

PERFORMANCE METRICS ASSOCIATED WITH INL-SPONSORED ADVISING/MENTORING

Mentoring/advising is a dynamic, collaborative, reciprocal, and sustained relationship between an experienced professional and a less experienced one, focused on the transfer of knowledge, skills, and behaviors to a prescribed level. Performance measures typically include Levels 1 and 3.

Reaction (Level 1): This follows the same methodology as Level 1 of the train-the-trainer programs, whereby mentors can be measured on whether mentees felt their experience with their mentor was useful in their current job.
Example for Reaction (Level 1) Performance Metric

Opinion surveys will be distributed to participants of all INL-sponsored mentoring/advising programs conducted, and results will indicate at least a 70%* overall satisfaction rate per program.

Behavior (Level 3): Performance metrics could measure the increase in knowledge, skills, and behavior at various intervals during the mentoring/advising period using a prescribed survey or checklist.

Example for Behavior (Level 3) Performance Metric

For all mentoring relationships, the mentee will establish specific objectives that are agreed upon with the mentor prior to the second meeting. The mentor will administer, at a minimum, biannual (twice a year) standardized checklists to gauge the number of objectives achieved, with a minimum result of 50%* or higher.
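Scoring a mentee's checklist at a review interval reduces to the share of agreed objectives achieved. A minimal sketch, with an invented checklist (the objective names and data structure are illustrative, not a prescribed INL format):

```python
# Sketch: scoring a mentee's standardized objectives checklist at a
# biannual review. The checklist contents are invented for illustration.

def objectives_achieved(checklist):
    """Percent of agreed objectives marked achieved at a review interval."""
    if not checklist:
        raise ValueError("no objectives agreed with the mentor")
    achieved = sum(1 for done in checklist.values() if done)
    return 100.0 * achieved / len(checklist)

midyear_review = {
    "drafts case summaries independently": True,
    "leads weekly team briefing": True,
    "applies new evidence-handling procedure": False,
    "completes chain-of-custody forms correctly": True,
}
score = objectives_achieved(midyear_review)
print(f"Objectives achieved: {score:.0f}% -> "
      f"{'meets' if score >= 50 else 'below'} the 50% minimum")
```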
PERFORMANCE METRICS ASSOCIATED WITH INL-SPONSORED WORKSHOPS

Workshops are less formal educational modules than training courses. They can take the form of a seminar, lecture, demonstration, simulation, or the like, emphasizing the exchange of ideas and the demonstration and application of techniques and skills. A workshop emphasizes problem-solving and hands-on training and requires the involvement of the participants. Performance is typically measured at Level 1.

Reaction (Level 1): As with training courses, performance metrics can be measured for workshops through opinion surveys or questionnaires, in which respondents are offered a similar choice of pre-coded responses. At least one question on the surveys or questionnaires must address whether respondents felt the workshop was useful in their current job. Measuring satisfaction rates can give additional insight into how effective the workshop content was and whether any modifications may be needed.

Example for Reaction (Level 1) Performance Metric

Opinion surveys will be distributed to participants of all INL-sponsored workshops conducted, and results will indicate at least a 70%* overall satisfaction rate per workshop.

* PA&E recommended minimum standard

PERFORMANCE METRICS ASSOCIATED WITH MEASURING COMPLEX CHANGES OVER TIME

INL interventions may include measuring complex changes by the host country over time, such as systems reform, meeting specific international standards, and increased coordination and information sharing among host country agencies. Performance metrics could compare the acceleration of participants' ability to produce specific outputs within a particular organization's structure. This evidence of change can be demonstrated, for example, through an increase or decrease in tangible outputs, such as cases prosecuted. This metric requires quantitative baseline data against which any change is measured.

1. Increases in tangible outputs include: cases prosecuted, calls into tip lines, land patrols, asset forfeitures, convictions, number of investigations, public perception, implementation of new management tools
2. Decreases in tangible outputs include: backlogged cases, eradication of illicit materials, processing time, response time

INL Program Officers and Implementing Partners can monitor a host of these performance metrics, but they would need to link these metrics directly to INL-funded training, workshops, and mentoring/advising initiatives, and to the dollars spent by the Department of State (DOS), to determine return on investment. Another area to examine would be testing criminological theories for causative factors of change over time.
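Each of these change-over-time metrics reduces to a signed percent change of a tangible output against its quantitative baseline. A minimal sketch, with invented figures:

```python
# Sketch: percent change of a tangible output against a quantitative
# baseline. The output and figures are invented for illustration.

def percent_change(baseline, current):
    """Signed percent change from the baseline period."""
    if baseline == 0:
        raise ValueError("baseline must be nonzero to compute percent change")
    return 100.0 * (current - baseline) / baseline

cases_prosecuted_baseline = 120   # e.g., prior calendar year
cases_prosecuted_current = 150    # reporting period
change = percent_change(cases_prosecuted_baseline, cases_prosecuted_current)
print(f"Cases prosecuted changed by {change:+.0f}% from the baseline period")
```

The sign convention matters: an increase target (e.g., convictions) passes when the change is positive, while a decrease target (e.g., case backlog) passes when it is negative.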
Performance measures typically include Level 3.

Examples for Behavior (Level 3) Performance Metric

(Description of activity) will (increase/decrease) by (xx)% from the baseline period of (specified baseline period) by (defined end time).

(Agency or office within an Agency/Organization) will demonstrate their new skills and knowledge through a (xx)% increase in (type/description of activity) as measured by (defined baseline data).

Investigators from (specific office, agency, or other defined entity) will engage and investigate (xx)% more criminal cases in CY 2015 than the previous benchmark of _.

PERFORMANCE METRICS ASSOCIATED WITH DEVELOPMENT AND DELIVERY TO DESIGN, BUILD, AND/OR EQUIP

Performance metrics that involve drafting legislation, implementing a system, procuring equipment, or other start-up activity are typically measured using a Yes/No metric. These activities can be broken down into smaller, measurable milestones, each assigned a numerical percentage value. Quarterly reporting would indicate the percentage complete, and the final metric would be Yes/No.

Examples for Performance Metric

The (system or type of infrastructure) will be installed/built and operational, with an expected outcome of (description of benefit). It will be 50% complete by (anticipated midpoint date) and 100% complete by (anticipated completion date).

(Governmental Agency or Entity) will draft a new (description of document) that achieves (description of benefit). It will be 50% complete by (anticipated midpoint date) and 100% complete by (anticipated completion date).
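The milestone roll-up described above (weighted milestones reported quarterly as a percentage, with a final Yes/No) can be sketched as follows; the milestones and their weights are invented for illustration:

```python
# Sketch: rolling up weighted milestones into a percent-complete figure
# and a final Yes/No metric. Milestones and weights are illustrative.

milestones = [
    # (description, weight in percent, done?)
    ("requirements and site survey", 20, True),
    ("equipment procured", 30, True),
    ("system installed", 30, False),
    ("system operational and accepted", 20, False),
]

def percent_complete(items):
    """Sum of the weights of completed milestones (weights total 100)."""
    total = sum(weight for _, weight, _ in items)
    if total != 100:
        raise ValueError("milestone weights should sum to 100")
    return sum(weight for _, weight, done in items if done)

done_pct = percent_complete(milestones)
final_metric = "Yes" if done_pct == 100 else "No"
print(f"Quarterly report: {done_pct}% complete; final metric: {final_metric}")
```

Weighting milestones unevenly (rather than counting them equally) lets the percent-complete figure reflect that, say, procurement is a larger share of the effort than the site survey.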
