INL PROGRAM PERFORMANCE METRIC EXAMPLES




The purpose of this document is to assist Program Officers and Implementers with developing performance measurements that:

1. Are quantifiable
2. Are consistent with the project scope, schedule, and funding
3. Meet SMART criteria
4. Can be combined with other metrics to determine program impact
5. Link to the INL Functional Bureau Strategy (Goals and Objectives)

INL performance metrics can often be placed within a variety of categories, including:

1. Training courses provided
2. Workshops conducted
3. Train-the-Trainer programs
4. Mentoring/advising
5. Measuring complex changes over time
6. Development and delivery to design, build, and/or equip

Within each category, performance can be measured at one or more levels, depending on the intervention:

1. Reaction: measures how participants reacted to the intervention
2. Learning: measures how much knowledge increased as a result of the intervention (requires baseline data, i.e., pre- and post-test scores)
3. Behavior: measures the extent to which participants have retained their newly learned skills and have changed their behavior or applied what they learned as a result of the intervention (requires baseline data)

PERFORMANCE METRICS ASSOCIATED WITH INL-SPONSORED TRAINING COURSES

Training programs are formalized learning experiences that use a prescribed curriculum to transfer practical knowledge, skills, and competencies relating to a specific useful skill to an audience of learners. Performance measures must include both Level 1 and Level 2 as described below. Level 3 is optional, but recommended if possible.

Reaction (Level 1): Performance metrics can measure satisfaction rates over the period of performance for participants undertaking like courses. The most common method is an opinion survey or questionnaire in which respondents are offered a choice of five to seven pre-coded responses ranging from "strongly agree" to "strongly disagree," with the neutral point being "neither agree nor disagree." At least one question on the Level 1 survey or questionnaire must address whether respondents felt the course was useful in their current job.

Example for Reaction (Level 1) Performance Metric

Opinion surveys will be distributed to participants of all INL-sponsored training courses conducted, and results will indicate at least a 70%* overall satisfaction rate per course.

* PA&E recommended minimum standard
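As an illustration only (not part of the INL guidance), the sketch below shows one way the Level 1 satisfaction rate could be computed from five-point pre-coded responses; coding answers of 4 ("agree") and 5 ("strongly agree") as "satisfied" is an assumption.

```python
# Minimal sketch: Level 1 (Reaction) satisfaction rate from five-point
# Likert responses (1 = strongly disagree ... 5 = strongly agree).
# Treating 4 and 5 as "satisfied" is an assumption, not INL guidance.

def satisfaction_rate(responses: list[int]) -> float:
    """Return the percentage of respondents who answered 4 or 5."""
    satisfied = sum(1 for r in responses if r >= 4)
    return 100.0 * satisfied / len(responses)

# Hypothetical survey results for one course
course_responses = [5, 4, 4, 3, 5, 2, 4, 5, 4, 4]
rate = satisfaction_rate(course_responses)
print(f"Satisfaction rate: {rate:.1f}%")        # 80.0%
print("Meets 70% PA&E minimum:", rate >= 70.0)  # True
```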

Learning (Level 2): Performance metrics could focus on an expected level of increase in participants' knowledge, using pre-test scores to determine the baseline, followed by post-test scores to measure what was learned. The quantitative metric is based on the average change between the two scores (a sketch of this calculation follows the examples below). A qualitative analysis should then discuss whether the increase in knowledge met the anticipated results. If not, the analysis should also discuss changes that can be made to the training course to secure the desired results.

Example for Learning (Level 2) Performance Metric

Participants trained in the INL-sponsored (title/type of course) will have an average increase in knowledge of at least 10%* between pre- and post-test scores.

Behavior (Level 3): Performance metrics could also measure the degree to which participants are applying what they have learned in the training to their workplace activities and/or behavior. A common means of measurement is observation, whereby participants are observed by the course designer (or a supervisor) in a practical, real-world situation and their newly acquired skills are evaluated and scored against a standard checklist. This can be done with a random sample of training participants (a sampling sketch also follows below).

Example for Behavior (Level 3) Performance Metric

One month following INL-sponsored training, a sample of 20%* of participants who underwent the (title/type of course) will undergo a retention scorecard (i.e., learning objectives checklist) and earn a score of at least 50%.*
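For illustration only (the scoring scale and interpretation are assumptions), this sketch computes the Level 2 average knowledge gain from paired pre- and post-test scores, reading the 10% target as percentage points on a 0-100 test scale:

```python
# Minimal sketch: average knowledge gain between paired pre- and post-test
# scores (0-100 scale assumed). Interpreting the 10% target as percentage
# points is an assumption; a relative-gain reading would divide by the
# pre-test average instead.

def average_knowledge_gain(pre: list[float], post: list[float]) -> float:
    """Return the mean per-participant increase in test score."""
    assert len(pre) == len(post), "scores must be paired per participant"
    return sum(b - a for a, b in zip(pre, post)) / len(pre)

pre_scores = [55, 60, 48, 70, 65]    # hypothetical pre-test scores
post_scores = [72, 71, 66, 78, 80]   # hypothetical post-test scores
gain = average_knowledge_gain(pre_scores, post_scores)
print(f"Average increase: {gain:.1f} points")  # 13.8
print("Meets 10% target:", gain >= 10.0)       # True
```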
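Similarly hedged (the names and data structures are assumptions), this sketch draws the 20% random sample for Level 3 observation and scores a sampled participant against a learning-objectives checklist:

```python
# Minimal sketch: draw a 20% random sample of participants for a Level 3
# retention scorecard, then score against a learning-objectives checklist.
import random

def draw_sample(participants: list[str], fraction: float = 0.20) -> list[str]:
    """Randomly select the given fraction of participants (at least one)."""
    k = max(1, round(len(participants) * fraction))
    return random.sample(participants, k)

def retention_score(objectives_met: int, objectives_total: int) -> float:
    """Return the retention score as a percentage of objectives met."""
    return 100.0 * objectives_met / objectives_total

roster = [f"participant_{i}" for i in range(1, 31)]  # hypothetical roster
sampled = draw_sample(roster)
print(f"Observing {len(sampled)} of {len(roster)} participants")  # 6 of 30
print("Meets 50% threshold:", retention_score(7, 12) >= 50.0)     # True (58.3%)
```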

PERFORMANCE METRICS ASSOCIATED WITH INL-SPONSORED TRAIN-THE-TRAINER PROGRAMS

A train-the-trainer model enables experienced personnel to show less experienced instructors how to deliver courses, workshops, and seminars. A train-the-trainer model can build a pool of competent instructors who can then teach the material to other people. Performance can be measured at both Level 1 and Level 2 as described below. Level 3 is optional, but recommended if possible.

Reaction (Level 1): This follows the same methodology as Level 1 of the training courses and workshops. In this case, respondents rate how well they feel the INL-sponsored training has prepared them to become trainers and teach the material to others.

Example for Reaction (Level 1) Performance Metric

Opinion surveys will be distributed to INL-sponsored Train-the-Trainer participants, and results will indicate at least a 70%* overall satisfaction rate per course.

Learning (Level 2): This follows the same methodology explained in Level 2 of the performance metrics associated with training courses: pre-test scores determine the baseline, post-test scores measure what was learned, and the quantitative metric is based on the average change between the two scores.

Example for Learning (Level 2) Performance Metric

Participants trained in the INL-sponsored (title/type of course) will have an average increase in knowledge of at least 10%* between pre- and post-test scores.

Behavior (Level 3): This follows the same methodology explained in Level 3 of the performance metrics associated with training courses. Performance can be evaluated by observing new instructors' ability to present information effectively, respond to participant questions, and lead activities that reinforce learning. This can be measured by the course designer or a subject-matter expert using a standardized checklist.

Example for Behavior (Level 3) Performance Metric

20%* of train-the-trainer students who underwent the (title/type of course) and were later observed will receive a retention score of at least 50%* based on a Train-the-Trainer retention scorecard (i.e., teaching objectives checklist).

PERFORMANCE METRICS ASSOCIATED WITH INL-SPONSORED ADVISING/MENTORING

Mentoring/advising is a dynamic, collaborative, reciprocal, and sustained relationship between an experienced professional and a less experienced one, focused on the transfer of knowledge, skills, and behaviors to a prescribed level. Performance measures typically include Levels 1 and 3.

Reaction (Level 1): This follows the same methodology as Level 1 of the Train-the-Trainer programs, whereby mentors can be assessed on whether mentees felt their experience with their mentor was useful in their current job.

Example for Reaction (Level 1) Performance Metric

Opinion surveys will be distributed to participants of all INL-sponsored mentoring/advising programs conducted, and results will indicate at least a 70%* overall satisfaction rate per program.

Behavior (Level 3): Performance metrics could measure the increase in knowledge, skills, and behavior at various intervals during the mentoring/advising period, using a prescribed survey or checklist (a sketch of the checklist roll-up follows below).

Example for Behavior (Level 3) Performance Metric

For all mentoring relationships, the mentee will establish specific objectives that are agreed upon with the mentor prior to the second meeting. The mentor will administer, at a minimum, bi-annual (twice a year) standardized checklists to gauge the number of objectives achieved, with a minimum result of 50%* or higher.
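As a hedged illustration (the objectives and data structure are hypothetical), this sketch gauges the share of agreed mentoring objectives achieved at a bi-annual checklist review:

```python
# Minimal sketch: percentage of agreed mentoring objectives achieved at a
# bi-annual standardized checklist review. Objectives are hypothetical.

def objectives_achieved_pct(checklist: dict[str, bool]) -> float:
    """Return the percentage of agreed objectives marked achieved."""
    return 100.0 * sum(checklist.values()) / len(checklist)

# Objectives agreed with the mentor before the second meeting (hypothetical)
midyear_review = {
    "Draft case reports without assistance": True,
    "Lead one evidence-handling briefing": True,
    "Apply chain-of-custody procedures": False,
    "Coordinate with the prosecutor's office": True,
}
pct = objectives_achieved_pct(midyear_review)
print(f"Objectives achieved: {pct:.0f}%")  # 75%
print("Meets 50% minimum:", pct >= 50.0)   # True
```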

PERFORMANCE METRICS ASSOCIATED WITH INL-SPONSORED WORKSHOPS

Workshops are less formal educational modules than training courses. They can take the form of a seminar, lecture, demonstration, simulation, or the like, emphasizing the exchange of ideas and the demonstration and application of techniques and skills. A workshop emphasizes problem-solving and hands-on training and requires the involvement of the participants. Performance is typically measured at Level 1.

Reaction (Level 1): As with training courses, performance metrics for workshops can be measured through opinion surveys or questionnaires, in which respondents are offered a similar choice of pre-coded responses. At least one question on the survey or questionnaire must address whether respondents felt the workshop was useful in their current job. Measuring satisfaction rates can give additional insight into how effective the workshop content was and whether any modifications may be needed.

Example for Reaction (Level 1) Performance Metric

Opinion surveys will be distributed to participants of all INL-sponsored workshops conducted, and results will indicate at least a 70%* overall satisfaction rate per workshop.

PERFORMANCE METRICS ASSOCIATED WITH MEASURING COMPLEX CHANGES OVER TIME

INL interventions may include measuring complex changes by the host country over time, such as systems reform, meeting specific international standards, and increased coordination and information sharing among host country agencies. Performance metrics could compare the acceleration of participants' ability to produce specific outputs within a particular organization's structure. This evidence of change can be demonstrated, for example, through an increase in tangible outputs (such as cases prosecuted) or a decrease in others (such as backlogged cases). This metric requires quantitative baseline data against which any change is measured (a sketch of the baseline comparison follows the examples below).

1. Increases in tangible outputs include: cases prosecuted, calls into tip lines, land patrols, asset forfeitures, convictions, number of investigations, public perception, implementation of new management tools
2. Decreases in tangible outputs include: backlogged cases, eradication of illicit materials, processing time, response time

INL Program Officers and Implementing Partners can monitor a host of these performance metrics, but would need to link them directly to INL-funded training, workshop, and mentoring/advising initiatives, and to the dollars spent by the Department of State (DOS), to determine return on investment. Another area to examine would be testing criminological theories for causative factors of change over time.

Performance measures typically include Level 3.

Examples for Behavior (Level 3) Performance Metrics

(Description of activity) will (increase/decrease) by (xx)% from the baseline period of (specified baseline period) by (defined end time).

(Agency or office within an Agency/Organization) will demonstrate their new skills and knowledge through a (xx)% increase in (type/description of activity) as measured against (defined baseline data).

Investigators from (specific office, agency, or other defined entity) will engage and investigate (xx)% more criminal cases in CY 2015 than the previous benchmark of 2013.
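For illustration only (the figures are hypothetical), this sketch computes the percent change in a tangible output relative to a baseline period:

```python
# Minimal sketch: percent change in a tangible output versus a quantitative
# baseline period, as used in complex-change metrics. Figures hypothetical.

def percent_change(baseline: float, current: float) -> float:
    """Return the percent increase (positive) or decrease (negative)."""
    return 100.0 * (current - baseline) / baseline

cases_2013 = 120   # benchmark year count (hypothetical)
cases_2015 = 150   # reporting year count (hypothetical)
change = percent_change(cases_2013, cases_2015)
print(f"Criminal cases investigated: {change:+.1f}% vs. 2013 baseline")  # +25.0%
```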

PERFORMANCE METRICS ASSOCIATED WITH DEVELOPMENT AND DELIVERY TO DESIGN, BUILD, AND/OR EQUIP

Performance metrics that involve drafting legislation, implementing a system, procuring equipment, or other start-up activity are typically measured using a Yes/No metric. These activities can be broken down into smaller, measurable milestones, each assigned a numerical percentage value. Quarterly reporting would indicate the percentage complete, and the final metric would be Yes/No (a sketch of the milestone roll-up follows the examples below).

Examples for Performance Metrics

The (system or type of infrastructure) will be installed/built and operational, with an expected outcome of (description of benefit). It will be 50% complete by (anticipated midpoint date) and 100% complete by (anticipated completion date).

(Governmental Agency or Entity) will draft a new (description of document) that achieves (description of benefit). It will be 50% complete by (anticipated midpoint date) and 100% complete by (anticipated completion date).
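As a hedged sketch (the milestones and weights are hypothetical), this shows how weighted milestones could roll up into a quarterly percent-complete figure with a final Yes/No metric:

```python
# Minimal sketch: weighted milestones rolled up into a percent-complete
# figure for quarterly reporting, with a final Yes/No metric.
# Milestones and weights are hypothetical.

milestones = [
    ("Requirements and site survey", 0.20, True),
    ("Equipment procured",           0.30, True),
    ("System installed",             0.30, False),
    ("Operational acceptance test",  0.20, False),
]

percent_complete = 100.0 * sum(w for _, w, done in milestones if done)
final_metric = "Yes" if all(done for _, _, done in milestones) else "No"
print(f"Percent complete this quarter: {percent_complete:.0f}%")  # 50%
print(f"Installed and operational? {final_metric}")               # No
```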