Manual on Training Evaluation
Project on Improvement of Local Administration in Cambodia

Five Steps of Training Evaluation

Step 1: Identify the Purposes of Evaluation
Step 2: Select Evaluation Method
Step 3: Design Evaluation Tools
Step 4: Collect Data
Step 5: Analyze and Report Results
Table of Contents

Five Steps of Training Evaluation
Step 1: Identify Purposes of Evaluation
Step 2: Select Evaluation Method
Step 3: Design Evaluation Tools
Step 4: Collect Data
Step 5: Analyze and Report Results
Appendices
Appendix 1: Sample Questionnaire Forms
Appendix 2: Sample Pre/Post Test Forms
Appendix 3: Impact Survey Questions
Appendix 4: Sample Data Input Sheet
Appendix 5: Evaluation Report Checklist
Glossary

Evaluation: An evaluation is a systematic determination of the merit, worth, and significance of something or someone, using criteria against a set of benchmark standards.

Evaluation Methods: Evaluation is methodologically diverse, involving the use of both qualitative and quantitative methods, including case studies, survey research, statistical analysis, and model building, among others.

Evaluation Tools: Evaluation tools are used to collect data. They come in varied forms and can be divided into categories such as questionnaires, surveys, tests, interviews, focus groups, observations, and performance records.

Impact Survey: An impact survey is an evaluation tool that measures the extent to which skills and knowledge learned in the program have translated into improved behavior, and the final results that occurred because the participants attended the training program.

Pre/Post Test: The pre/post test is a common form of evaluating training programs in terms of the knowledge improvement of the participants. Identical tests may be used for the pre- and post-test to compare scores before and after the training, respectively.

Questionnaire: The questionnaire is one of the most common tools used to evaluate training programs. Questionnaires can be used to obtain subjective information about participants' feelings.

Semi-structured Interview: A semi-structured interview is an interview with an individual or individuals that follows a pre-defined set of questions. It is flexible, allowing new questions to be brought up during the interview in response to what the interviewee says.
1 Five Steps of Training Evaluation

What is an Evaluation?

Several definitions of evaluation have been offered; the following are among those most commonly used:

An evaluation is the systematic and objective assessment of an ongoing or completed project, program or policy, its design, implementation and results. The aim is to determine the relevance and fulfillment of objectives, development efficiency, effectiveness, impact and sustainability. (Source: Glossary of Key Terms in Evaluation and Results Based Management)

A program evaluation is the systematic collection of information about the activities, characteristics, and outcomes of programs to make judgments about the program, improve program effectiveness, and/or inform decisions about future programming. (Source: Patton, M.Q. (1997). Utilization-Focused Evaluation: The New Century Text (3rd ed.). Thousand Oaks, CA: Sage.)

Training Management Cycle

A training management cycle can be divided into three major steps: Step 1: Planning; Step 2: Implementation; and Step 3: Evaluation. Evaluation is the final step of the training management cycle, and its results feed back into the next round of training planning to improve future programs, as shown by the arrow in the figure.
Five Steps of Training Evaluation

The process of training evaluation can be divided into five steps: identify the purposes of evaluation; select evaluation methods; design evaluation tools; collect data; and analyze and report results.

Step 1: Identify the Purposes of Evaluation
Before developing evaluation systems, the purposes of evaluation must be determined. Why do we want to evaluate training programs?

Step 2: Select Evaluation Method
Kirkpatrick's four levels of evaluating training programs: reaction, learning, behavior, and results.

Step 3: Design Evaluation Tools
Questionnaire, pre/post test, and impact survey.

Step 4: Collect Data
Who collects the data, when, and how?

Step 5: Analyze and Report Results
Evaluation data analysis and reporting.
2 Step 1: Identify Purposes of Evaluation

Why do we want to evaluate training programs? Before developing evaluation systems, the purposes of evaluation must be determined, because they will affect the types of data collected and the data collection methods.

Learn from Experience for Future Improvement

The most common reason for evaluating training programs may be to determine their effectiveness in order to improve future programs. Evaluation can help us learn from the experience of past training programs. For example, we may want to know which parts of the training were successful and which were not, or whether the approach to the training should be changed. We can use these lessons learned to improve plans for future training programs.

Purposes Identified by the GDLA Task Force

The GDLA Task Force has identified the following purposes for evaluating the training programs it plans and implements for public officials in charge of local administration:

- To determine whether the objectives of the training were achieved.
- To see how the knowledge and skills learned in the training are put into practice.
- To assess the results and impacts of the training programs.
- To assess the effectiveness of the training programs.
- To assess whether the training programs were properly implemented.
- To identify the strengths and weaknesses of the training programs.
- To assess whether the training programs were suitable in terms of contents, timing, participants and other aspects.
- To find problems in the training programs and solutions for improvement.

Accountability Issue

In addition, accountability may be one of the reasons for evaluating training programs. Evaluation can increase the accountability of implementing agencies to concerned stakeholders. For the GDLA Task Force, it can improve accountability as a training agency by reporting the evaluation results of training programs to MOI and JICA.

Other Reasons for Evaluation

Depending on the objectives, contents, participants, and other factors, each training program might have different purposes of evaluation. Therefore, the evaluation design can be adjusted for each training program to meet the specific purposes of the evaluation. Some examples of reasons for evaluating training programs are:

- To determine success or otherwise in accomplishing program objectives.
- To identify the strengths and weaknesses in the human resource development process.
- To compare the costs and benefits of a human resource development program.
- To decide who should participate in future programs.
- To test the clarity and validity of tests, cases and exercises.
- To identify which participants were the most successful with the program.
- To reinforce the key points made to the participants.
- To gather data to assist in marketing future programs.
- To determine whether the program was an appropriate solution for the specific need.
- To establish a database that can assist management in making decisions.
(Source: Handbook of Training Evaluation and Measurement Methods)

References

- Chapter 2: Reasons for Evaluating, Evaluating Training Programs: The Four Levels.
- Developing a Results-Based Approach, Handbook of Training Evaluation and Measurement Methods.
- 1. Overview of Evaluation (pp. 1-11), Building Evaluation Capacity.
3 Step 2: Select Evaluation Method

Four Levels of Evaluation

The four levels of evaluation constitute one of the most commonly used methods for evaluating training programs. The four sequential levels were originally proposed by Donald L. Kirkpatrick, Professor Emeritus at the University of Wisconsin. The concept has been increasingly adopted by private companies to evaluate their training programs, and has gradually been applied to training programs under technical assistance projects of the Japan International Cooperation Agency (JICA). According to this concept, capacity development is realized through four sequential steps: (i) Reaction; (ii) Learning; (iii) Behavior; and (iv) Results.

Level 4: Results. What occurred as the final results?
Level 3: Behavior. How has the behavior of participants changed after the training program?
Level 2: Learning. What have participants learned?
Level 1: Reaction. How do participants react to the training program?
Level 1: Reaction

Evaluation at this level measures how participants react to the training program. It is important to get a positive reaction. Although a positive reaction may not ensure learning, if participants do not react favorably, they probably will not be motivated to learn. Evaluating reaction is the same thing as measuring customer satisfaction.

"If training is going to be effective, it is important that trainees react favorably to it. Otherwise, they will not be motivated to learn." (Kirkpatrick (2006), Evaluating Training Programs)

Level 2: Learning

Evaluation at this level measures the extent to which participants change attitudes, improve knowledge, and/or increase skills as a result of attending the training program. One or more of these changes must take place if a change in behavior is to happen.

Level 3: Behavior

Evaluation at this level measures the extent to which participants' behavior has changed because of attending the training program. For change to take place, four conditions are necessary:

- The person must have a desire to change.
- The person must know what to do and how to do it.
- The person must work in the right climate.
- The person must be rewarded for changing.

Level 4: Results

Evaluation at this level measures the final results that occurred because the participants attended the training program. Examples of final results include increased production, improved quality and decreased costs. It is important to recognize that these results are the reason for having some training programs.
Some Other Evaluation Models

The following are some of the evaluation models and approaches mentioned in the evaluation literature (see "5. Evaluation Models, Approaches, and Designs," Building Evaluation Capacity).

Behavioral Objectives Approach: Focuses on the degree to which the objectives of a program have been achieved. The major question is: "Is the program achieving its objectives?"

Responsive Evaluation: Evaluators respond to the information needs of various audiences or stakeholders. The major question is: "What does the program look like to different people?"

Goal-Free Evaluation: Focuses on the actual rather than the intended outcomes of a program. The major question is: "What are all the effects of the program, including any side effects?"

Expertise/Accreditation Approaches: The purpose is to provide professional judgments of the quality of programs based on expert opinions. The major question is: "How would professionals rate this program?"

Participatory/Collaborative Evaluation: Emphasizes the engagement of stakeholders in the evaluation process so that they may better understand the evaluation and the program being evaluated, and ultimately use the evaluation findings for decision-making purposes. The major question is: "What are the information needs of those closest to the program?"
Organizational Learning: Views evaluation as a social activity in which evaluation issues are constructed by, and acted on by, organization members. The major question is: "What are the information and learning needs of individuals, teams, and the organization in general?"

References

- Chapter 3: The Four Levels: An Overview, Evaluating Training Programs: The Four Levels.
- Chapter 4: A Results-Based HRD Model, Handbook of Training Evaluation and Measurement Methods.
- 5. Evaluation Models, Approaches, and Designs, Building Evaluation Capacity.
4 Step 3: Design Evaluation Tools

Evaluation Tools

Various evaluation tools can be selected depending on the purposes and methods of evaluation:

- Questionnaires
- Surveys
- Tests
- Interviews
- Focus group discussions
- Observations
- Performance records

For the training programs targeting local administration officials, a questionnaire may be used for Level 1 (Reaction); a pre/post test for Level 2 (Learning); and an impact survey for Level 3 (Behavior) and Level 4 (Results).

Level 1: Reaction -> Questionnaire
Level 2: Learning -> Pre/Post Test
Level 3: Behavior & Level 4: Results -> Impact Survey
Five Steps of Questionnaire Design

The questionnaire is probably the most common form of evaluating training programs. Questionnaires to evaluate the reactions of training participants can be developed through the five steps described below.

Step 1: Determine what we want to find out

The first step of questionnaire design is to determine the information we would like to know. This largely depends on the purposes of the evaluation. The following are some common types of information we may want to ask participants about.

- Contents: Was the content appropriate?
- Materials: Were the materials useful?
- Teaching method: Was the teaching method appropriate?
- Trainer/Facilitator: Was the trainer/facilitator effective?
- Motivation to learn: Were you motivated to learn the contents?
- Program relevance: Was the program relevant to your needs?
- Level of understanding: Did you understand the contents?
- Time: Was the timing of the program appropriate?
- Length: Was the program length appropriate?
- Facilities: Were the training facilities appropriate?
- Overall evaluation: What is your overall rating of the program?
- Planned improvements: How will you apply what you have learned?

Questions are developed later, but it might be useful to develop this information in outline form so that related questions can be grouped together.
Step 2: Select the type(s) of questions

The second step in questionnaire design is to select the type(s) of questions. Questions that might be asked in a questionnaire can be classified into two major categories: open-ended and close-ended.

Open-ended questions allow an unrestricted answer; the question is followed by a blank space for the response. Open-ended questions give participants the opportunity to express their own thoughts, but they produce a wide variety of answers and are more difficult to analyze. The following are some examples of open-ended questions:

- Which part of the contents of the training program interests you more than others?
- How do you think we can improve the contents of the training program?

Close-ended questions, on the other hand, ask respondents to select one or more responses from a list. Below are several types of close-ended questions.

Two-option response: Respondents are asked to choose one of two options, such as yes/no, true/false, or agree/disagree. For example:

  Did you like the training program?  Yes / No
Rating scale: Respondents are asked to choose the answer that best reflects their opinion from a complete range of possible answers. The range can be presented in numbers (e.g., 1 to 10) or in words (e.g., strongly agree to strongly disagree). For example:

  How do you rate the contents of the training? (1 = poor, 10 = excellent)

Checklist: A list of items; respondents are asked to check those that apply to the situation. For example:

  Please check the one session from the list below that you liked most in the training program.
  1. Local Administration System
  2. D&D Policy
  3. Organic Law

Ranking scale: Respondents are asked to rank a list of items. For example (with a sample response in parentheses):

  Please rank the following sessions in order of preference.
  1. Local Administration System (2)
  2. D&D Policy (1)
  3. Organic Law (3)
Step 3: Design the questionnaire

The third step in questionnaire design is to develop the questions based on the types of questions planned and the types of information needed.

Good Questions

Good questions are simple, short, specific and straightforward, in order to avoid confusing respondents or leading them to a desired response.

Well-designed Questionnaires

Well-designed questionnaires are easy to understand and answer, which encourages respondents to complete them and to make few mistakes when doing so.

Sample Questionnaires

Questionnaire forms for the Top Management Seminar and the GDLA Officials Training are attached in Appendix 1 as a reference. Other sample questionnaires are available in the reference books.

Step 4: Pretest the questionnaire

The fourth step in questionnaire design is to test the questions. Ideally, the prepared questions are tested on a sample group of participants. If this is not feasible, they can be tested on a group of people at approximately the same job level as the participants.
The following are some of the points to be checked when pretesting the questionnaire:

- Does the respondent understand all the questions?
- Does the respondent have any difficulty in answering the questions?
- Do all close-ended questions have an answer applicable to each respondent?
- Are the skip patterns followed correctly?
- Does any part of the questionnaire suggest bias on your part?
- Does the questionnaire create a positive impression that motivates people to respond?

Step 5: Finalize the questionnaire

Based on the results of the pretest in Step 4, the questionnaire forms are finalized. For sample questionnaire forms, please refer to Appendix 1.
Five Steps of Pre/Post Test Design

The pre/post test is a common form of evaluating training programs in terms of the knowledge improvement of the participants. Identical tests may be used for the pre- and post-test to compare scores before and after the training. Pre/post tests can be developed through the five steps described below.

Step 1: Determine what knowledge and understanding we want to measure

The first step of pre/post test design is to determine what knowledge and understanding we want to measure. Normally this means the key concepts and ideas taught in the training. For example, in the GDLA Officials Training conducted in July 2007, the following topics were used for the pre/post tests.

Local Administration System
- Structure of the local administration system
- Functions of provincial and district governments
- Composition of commune councils

D&D Theory and Practice
- Definitions of decentralization and deconcentration
- Definitions of democratic representation, participation of people, and public sector accountability as principles for the D&D reform
Organic Law
- Official documents on which the draft Organic Law is based
- Agency responsible for drafting the Organic Law
- Composition of the draft Organic Law
- Definition of urban/rural areas in the draft Organic Law
- Sub-national councils in the draft Organic Law

One Window Service
- Model districts
- Functions of the One Window Service Office
- Services provided by the One Window Service Office

Step 2: Select the type(s) of questions

The second step in pre/post test design is to select the type(s) of questions. Questions that might be asked in a pre/post test can be classified into three major categories: true-false questions, multiple choice questions, and open-ended short-answer questions.

True-false questions: Respondents are asked to decide whether a statement is true or false. For example:

  The local administration system in Cambodia is divided into three levels: province/city, district/khan, and commune/sangkat.  True / False
Multiple choice questions: Respondents are asked to choose the appropriate answer from multiple choices. For example:

  How is the local administration system in Cambodia structured?
  a) The local administration system in Cambodia is divided into three levels: province/city, district/khan, and commune/sangkat.
  b) The local administration system in Cambodia is divided into two levels: province/city, and district/khan.
  c) The local administration system in Cambodia is divided into four levels: province/city, district/khan, commune/sangkat, and village.
  d) The local administration system in Cambodia is not legally divided into different levels.

Open-ended short-answer questions: Respondents are asked to explain their answers in short sentences. For example:

  Please explain how the local administration system in Cambodia is structured.
  (Expected answer: The local administration system in Cambodia is divided into three levels: province/city, district/khan, and commune/sangkat.)
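True-false and multiple-choice answers are easy to score automatically once responses have been entered into a computer. The following is a minimal Python sketch of such scoring; the answer key and the participant's responses are hypothetical examples, not the actual answers to the GDLA test.

```python
# Minimal sketch: scoring multiple-choice pre/post test responses against
# an answer key. The key and responses below are hypothetical examples.

ANSWER_KEY = {1: "a", 2: "d", 3: "b", 4: "d", 5: "d"}  # question number -> correct choice

def score_test(responses):
    """Count how many of a participant's answers match the key."""
    return sum(1 for q, correct in ANSWER_KEY.items() if responses.get(q) == correct)

participant = {1: "a", 2: "b", 3: "b", 4: "d", 5: "c"}  # one participant's answers
print(f"Score: {score_test(participant)} / {len(ANSWER_KEY)}")  # -> Score: 3 / 5
```

The same scoring function is applied to the pre-test and the post-test, so the two administrations stay strictly comparable.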
Step 3: Design the pre/post test

The third step in pre/post test design is to develop the questions based on the types of questions planned and the knowledge and understanding to be measured.

Good Questions

Good questions are simple, short, specific and straightforward, in order to avoid confusing respondents.

Well-designed Pre/Post Tests

Well-designed pre/post tests are easy to understand and answer, which encourages respondents to complete the tests and to make few mistakes in doing so.

Sample Pre/Post Tests

Pre/post test forms for the GDLA Officials Training are attached in Appendix 2 as a reference.

Step 4: Pretest the pre/post test

The fourth step in pre/post test design is to test the questions. Ideally, the prepared questions are tested on a sample group of participants. If this is not feasible, they can be tested on a group of people at approximately the same job level as the participants.
The following are some of the points to be checked when pretesting the pre/post test:

- Does the respondent understand all the questions?
- For multiple choice questions, does the respondent understand all the possible answers?
- Does the respondent have any difficulty in answering the questions?

Step 5: Finalize the pre/post test

Based on the results of the pretest in Step 4, the pre/post test forms are finalized. For sample pre/post test forms, please refer to Appendix 2.
Five Steps of Impact Survey Design

An impact survey can be carried out as a follow-up evaluation within several months of completing the training program. The main purpose of the impact survey is to assess the behavioral change of participants (Level 3) and the improvements or outputs in their work that can be linked to the training program (Level 4).

Step 1: Determine what impact we want to find out

The first step of impact survey design is to determine what impact of the training we would like to find out. The following are some common topics that we might want to ask training participants about.

Level 3: Behavior

Evaluation at this level measures whether the knowledge and skills that the training participants learned in the training are applied to their work. The amount of time required for the change to manifest itself depends on the type of training, how soon the participant has an opportunity to practice the skill, how long it takes participants to develop a new behavioral pattern, and other aspects of the job.

Level 4: Results

Evaluation at this level measures the final results that occurred because the participants attended the training program. Examples of final results include increased production, improved quality and decreased costs. It is important to recognize that these results are the reason for having some training programs. However, it is difficult to isolate the effect of the training from other factors affecting the results.
Step 2: Select the data collection method(s)

The most common data collection method for the impact survey is the follow-up questionnaire. Interviews and/or focus group discussions can also be used, especially when qualitative information about the impact of the training program is needed. Interviews have the following advantages and disadvantages that should be considered when selecting them as the data collection method.

Advantages of interviews:
- Good for uncovering feelings and hidden causes.
- Non-verbal signals can indicate key issues.
- Spontaneity: unexpected issues can be followed up.

Disadvantages of interviews:
- Time-consuming.
- An unrepresentative sample can skew the results.
- Can be difficult to quantify.
- Very dependent on the skills of the interviewer.

There are three types of interviews, from which a suitable one is selected for each survey:

1. Structured interview: the questions are set in advance.
2. Semi-structured interview: the general content is pre-determined, but additional exploration is allowed. This form of interview is particularly useful in situations where there are key issues to be investigated, but there is less certainty about the range of respondents' reactions to them.
3. Unstructured interview: a free-flowing conversation rather than a specific set of questions.
Step 3: Design the questionnaire / questions

The third step of impact survey design is to develop questions based on the kinds of impact to be investigated and the data collection methods selected. The following are some examples of questions for an impact survey questionnaire.

Relevance of the training program:

  Please rate, on a scale of 1-10, the relevance of each session to your job, with (1) indicating no relevance and (10) indicating very relevant.
  - D&D Policy
  - Organic Law
  - One Window Service

Use of training materials:

  Have you used the materials since you participated in the training program?  Yes / No
  If yes, please explain what materials you have used and for what purpose.
Knowledge application:

  Is there any specific knowledge of the D&D policy and organic law that you have used in your work after the training program?  Yes / No
  If yes, please explain what knowledge you have used and how.

The following are some of the interview questions used in the impact survey conducted in June. For more details, please refer to Appendix 3 for the questionnaire used in the survey.

1. Is there any specific knowledge of the D&D policy and organic law that you have used in your work after the training course/seminar?
2. Have you used the materials since you participated in the training course/seminar?
3. Did you incorporate anything else you learned in the training course/seminar into your work?
4. Is there anything which has changed your perception, attitude or behavior as a result of the training course/seminar?

Step 4: Pretest the questionnaire / questions

The fourth step is to test the questions/questionnaire. Ideally, the prepared questions/questionnaire are tested on a sample group of participants. If this is not feasible, they can be tested on a group of people at approximately the same job level as the participants.
The following are some of the points to be checked when pretesting the questionnaire/questions:

- Does the respondent understand all the questions?
- Does the respondent have any difficulty in answering the questions?
- Do all close-ended questions have an answer that applies to each respondent?
- Are the skip patterns followed correctly?
- Does any question suggest bias on your part?
- Do the questions create a positive impression that motivates people to respond?

Step 5: Finalize the questionnaire / questions

Based on the results of the pretest in Step 4, the questionnaire/questions are finalized.

References

- Chapter 9: Collecting Data: Application and Business Impact Evaluation, Handbook of Training Evaluation and Measurement Methods.
- 7. Questionnaire Design, How to Conduct Your Own Survey.
- Part Two: Case Studies of Implementation, Evaluating Training Programs.
5 Step 4: Collect Data

Questionnaire

The following are some helpful guidelines to improve the effectiveness of questionnaire data collection.

Keep responses anonymous

If there is no specific reason to identify each participant's questionnaire, it is recommended to keep responses anonymous. Anonymity allows the participants to feel open and comfortable about giving comments that can help improve future programs.

Distribute questionnaire forms in advance

For lengthy evaluations of training programs that span several days, or if you want the participants to evaluate each individual session, it is helpful to distribute questionnaire forms early in the program. This allows the participants to familiarize themselves with the questions, and to answer specific questions as they are covered in the program. Note, however, that the participants should wait until the end of the program to reach a final conclusion on general issues. For this reason, questionnaire forms for general questions could be distributed at the end of the program.

Explain the purpose of the questionnaire and how the information will be used

The purposes of the questionnaire and how the data will be used should be explained clearly to the participants. This helps improve the response rate and encourages the participants to make comments that can be useful for improving future programs.
Allow enough time for completing the questionnaire

If we ask the participants to fill in the questionnaire forms at the end of the program, they may be in a hurry to leave and may provide incomplete information. It is recommended to set aside enough time for filling in the questionnaire forms as a scheduled session before the end of the program.

Pre/Post Tests

The same set of questions prepared in Step 3 is given to the training participants on the first day of training before all the sessions start (pre-test), and on the last day of training after all the sessions have been completed (post-test).

  Pre-test: first day of training, before all the sessions.
  Post-test: last day of training, after all the sessions.
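Once both tests have been scored, the comparison reduces to computing each participant's gain and the group averages, which the analysis in Step 5 builds on. A minimal Python sketch, with invented scores on a 20-point test rather than actual training results:

```python
# Compare pre- and post-test scores to measure knowledge improvement.
# All participant IDs and scores below are invented for illustration.

pre_scores  = {"P01": 8, "P02": 11, "P03": 6, "P04": 9}
post_scores = {"P01": 15, "P02": 17, "P03": 13, "P04": 12}

avg_pre  = sum(pre_scores.values()) / len(pre_scores)
avg_post = sum(post_scores.values()) / len(post_scores)

print(f"Average pre-test score:  {avg_pre:.1f}")   # 8.5
print(f"Average post-test score: {avg_post:.1f}")  # 14.2
print(f"Average gain:            {avg_post - avg_pre:.1f}")

# Per-participant gains, useful for spotting who benefited least.
for pid in pre_scores:
    gain = post_scores[pid] - pre_scores[pid]
    print(f"  {pid}: {pre_scores[pid]} -> {post_scores[pid]} (gain {gain})")
```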
Semi-Structured Interview for Impact Survey

The following are some helpful guidelines for conducting effective interviews.

Introduction

- [Appreciation] Thank you very much for taking time for the interview.
- [Self-introduction] My name is _____ from GDLA, MOI.
- [Purpose of interview] I would like to ask you several questions about our training course/seminar that you attended last year. Your information will be useful for evaluating and improving our future training programs.

Positive attitude

- Face the interviewee. The interviewee knows that you are paying attention and feels accepted.
- Sit straight. Do not cross your arms or legs.
- Smile and be friendly. The interviewee feels comfortable and at ease to talk, and will be willing to give more information.
- Give positive feedback: make eye contact, repeat the answers, and rephrase them. The interviewee feels that their information is useful and would like to contribute more.

Ways to probe answers to open-ended questions

- When you do not understand the interviewee's reply: "What do you mean?" "Could you tell me what you mean by that?" "Would you tell me more about your thinking on that?"
- When you would like to know more details about the answer: "Would you explain that to me in more detail?" "What? Who? When? Where? How?"
6 Step 5: Analyze and Report Results

Data Input

Before summarizing and analyzing the questionnaire or pre/post test data collected in Step 4, the data need to be entered into a computer. Many statistical software programs are available for such data. Unless you have extremely large data sets or must conduct highly sophisticated analysis, a simple program like Excel may be enough. A sample data table is in Appendix 4.

Data Analysis

There are many ways to analyze data, but the analysis should be as simple as possible and limited to what is necessary to draw the required conclusions from the data.

Frequency distribution and average

Frequency distribution and average are two basic ways of presenting data. The following are the average ratings for the session on the progress of D&D at the Top Management Seminar in July 2007 (the accompanying frequency distribution table for contents, materials, and level of understanding is not reproducible in this transcription).

  Progress on D&D: average rating
  - Contents: 9.29
  - Materials: 9.04
  - Level of understanding: 8.50
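Both summaries can also be produced with a few lines of code instead of a spreadsheet. A minimal Python sketch, assuming each item's 1-10 ratings have been entered as a list; the ratings below are made up for illustration, not the seminar's actual data:

```python
from collections import Counter
from statistics import mean

# Hypothetical 1-10 ratings for one session, keyed by questionnaire item.
ratings = {
    "contents":               [9, 10, 9, 8, 10, 9],
    "materials":              [9, 9, 8, 10, 9, 9],
    "level of understanding": [8, 9, 8, 9, 8, 9],
}

for item, values in ratings.items():
    freq = dict(sorted(Counter(values).items()))  # frequency distribution: rating -> count
    print(f"{item}: average {mean(values):.2f}, distribution {freq}")
```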
Using figures to present data

Figures are useful tools for presenting statistical and complex data quickly and easily. Pie charts and bar charts are among the most commonly used figures.

Pie charts are useful for showing the components of something that add up to 100%. As a general rule, a pie chart becomes hard to read as the number of categories increases. An example is the pie chart presenting the popularity of lecture sessions at the Top Management Seminar in July 2007 (Source: Top Management Seminar Implementation Report, GDLA Task Force, October 2007).

Bar charts work better when many categories are compared and relative magnitude is to be shown. An example is the bar chart presenting the distribution of pre/post test scores of the GDLA Officials Training in July 2007 (Source: GDLA Officials Training Implementation Report, GDLA Task Force, October 2007).
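If the data are already on a computer, both kinds of figures can be drawn programmatically, for example with Python's matplotlib library. A minimal sketch follows; the session names and counts are placeholders, not the 2007 survey results:

```python
import matplotlib.pyplot as plt

# Hypothetical data only: "most popular session" votes and pre/post score bands.
sessions = ["Local Administration System", "D&D Policy", "Organic Law", "One Window Service"]
votes = [12, 9, 7, 5]

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(11, 4))

# Pie chart: components of a whole that add up to 100%.
ax1.pie(votes, labels=sessions, autopct="%1.0f%%")
ax1.set_title("Most popular session (hypothetical)")

# Bar chart: comparing pre- and post-test score distributions across bands.
bands = ["0-5", "6-10", "11-15", "16-20"]
pre_counts  = [6, 10, 3, 1]
post_counts = [1, 4, 9, 6]
x = range(len(bands))
ax2.bar([i - 0.2 for i in x], pre_counts,  width=0.4, label="Pre-test")
ax2.bar([i + 0.2 for i in x], post_counts, width=0.4, label="Post-test")
ax2.set_xticks(list(x))
ax2.set_xticklabels(bands)
ax2.set_title("Distribution of test scores (hypothetical)")
ax2.legend()

plt.tight_layout()
plt.show()
```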
Reporting

Who needs to know what?

Before writing an evaluation report, we need to identify the primary evaluation users. Primary users are the specific individuals or organizations whose information needs you have to serve. They might be a program director, a funding agency, program decision makers, or policy makers. Primary users such as program directors and funding agencies usually require the most technical and detailed information, because they make crucial decisions about the training programs.

Forms of communicating evaluation findings

The next step is to consider what forms of communication will be most effective in presenting evaluation findings to the primary users. The following questions may be used as guidance in choosing appropriate forms of communication.

- To what extent, and in what specific ways, is the information relevant to the user's real and compelling problems?
- To what extent is the information practical from the user's perspective?
- To what extent is the information useful and immediately applicable in the user's situation?
- What information will the user consider credible, and what reporting practices will support that credibility?

(Source: Morris, Lynn Lyons, et al. (1987). How to Communicate Evaluation Findings. Sage Publications.)

Evaluation report outline

After knowing what kind of information will be relevant and useful to the primary users, you can develop an evaluation report outline. The following is a sample report outline.

Summary
- Purpose of evaluation
- Evaluation audiences
- Major findings and recommendations

Program Description
- Program background
- Program goals/objectives
- Program participants
- Program activities

Evaluation Design and Methods
- Purpose of the evaluation
- Evaluation designs
- Data collection methods

Findings and Results
- Description of how the findings are organized (e.g., by evaluation questions, themes/issues)
- Results of analyses of quantitative and/or qualitative data collected

Recommendations
- Recommendations for action based on the conclusions

Appendices
- List of participants
- Seminar/training materials
- Questionnaires, pre/post tests
- Program expenditure summary

For more details about evaluation reports, please refer to the evaluation report checklist in Appendix 5.

References

- Chapter 14: Data Analysis, and Chapter 19: Communicating Program Results, Handbook of Training Evaluation and Measurement Methods.
- 8. Analyzing Evaluation Data, and 9. Communicating and Reporting Evaluation Processes and Findings, Building Evaluation Capacity.
- 9. From Questionnaires to Survey Results, and 10. Reporting Survey Results, How to Conduct Your Own Survey.
- 4. Selecting Commonly Used Statistical Methods for Surveys, How to Manage, Analyze, and Interpret Survey Data.
- How to Communicate Evaluation Findings.
- Evaluation Strategies for Communicating and Reporting.
Appendix 1: Sample Questionnaire Forms (Top Management Seminar)
Appendix 2: Sample Pre/Post Test Forms (GDLA Officials Training)

GDLA Officials Training Pre/Post Test

This test consists of 20 multiple choice questions concerning the local administration system, the decentralization and de-concentration policy, the organic law, and the one window service. Each question has four answer choices. Please choose the ONE answer that is most appropriate.

Session 3: Local Administration System in Cambodia & Japan

1. How is the local administration system in Cambodia structured?
a) The local administration system in Cambodia is divided into three levels: province/city, district/khan, and commune/sangkat.
b) The local administration system in Cambodia is divided into two levels: province/city, and district/khan.
c) The local administration system in Cambodia is divided into four levels: province/city, district/khan, commune/sangkat, and village.
d) The local administration system in Cambodia is not legally divided into different levels.

2. Which one of the following is a function of the provincial/municipal government?
a) Preparation of commune development plans
b) Appointment of village chiefs
c) Certification of civil registration
d) Developing provincial/municipal development programs

3. Which one of the following is a function of the district/khan government?
a) Protection of cultural heritage and the natural environment
b) Certification of civil registration
c) Reporting to the central government
d) Approval of commercial activities
4. How is the commune chief selected?
a) Appointed by the Ministry of Interior
b) Elected by voters in the commune/sangkat
c) Selected from village chiefs in the commune/sangkat
d) Selected from the top of the candidate list that receives the majority of votes

5. How many councilors does each commune/sangkat council have?
a) Each commune/sangkat council has 8 councilors.
b) Each commune/sangkat council has 11 councilors.
c) Each commune/sangkat council has 8 to 12 councilors depending on its demography and geography.
d) Each commune/sangkat council has 5 to 11 councilors depending on its demography and geography.

Session 4: D&D Theory and Practice

6. What is decentralization?
a) To transfer power and responsibilities from the central administration to local authorities to manage under their own responsibility, including financial resources, assets and human resources.
b) To transfer power and responsibilities from the central administration to the commune/sangkat councils to manage under their own responsibility, including financial resources, assets and human resources.
c) To distribute power from the central administration to the local administration to exercise its powers or functions on behalf of the central administration, by means of providing resources, service fees, and capacity building.
d) To distribute power from the central administration to the commune/sangkat councils to exercise their powers or functions on behalf of the central administration, by means of providing resources, service fees, and capacity building.
7. What is de-concentration?
a) To transfer power and responsibilities from the central administration to local authorities to manage under their own responsibility, including financial resources, assets and human resources.
b) To transfer power and responsibilities from the central administration to the commune/sangkat councils to manage under their own responsibility, including financial resources, assets and human resources.
c) To distribute power from the central administration to the local administration to exercise its powers or functions on behalf of the central administration, by means of providing resources, service fees, and capacity building.
d) To distribute power from the central administration to the commune/sangkat councils to exercise their powers or functions on behalf of the central administration, by means of providing resources, service fees, and capacity building.

8. What does "democratic representation" mean as one of the principles for the decentralization and de-concentration reforms?
a) The reforms will strengthen the roles of councils at the provincial/municipal, district/khan and commune/sangkat administrations, to be established in accordance with the principles of democracy, by expanding their powers, duties, responsibilities and resources.
b) The reforms will introduce systems and procedures to ensure that people, especially women, vulnerable groups and indigenous minorities, can participate in decision-making at the provincial/municipal, district/khan and commune/sangkat levels.
c) The reforms will strengthen accountability at all levels of administration and facilitate citizens' oversight of the administrative and financial affairs of those administrations.
d) The reforms will bring public services closer to users by allowing citizens to participate in planning and monitoring public services in order to meet local needs and priorities.
9. What does "participation of people" mean as one of the principles for the decentralization and de-concentration reforms?
a) The reforms will strengthen the roles of councils at the provincial/municipal, district/khan and commune/sangkat administrations, to be established in accordance with the principles of democracy, by expanding their powers, duties, responsibilities and resources.
b) The reforms will introduce systems and procedures to ensure that people, especially women, vulnerable groups and indigenous minorities, can participate in decision-making at the provincial/municipal, district/khan and commune/sangkat levels.
c) The reforms will strengthen accountability at all levels of administration and facilitate citizens' oversight of the administrative and financial affairs of those administrations.
d) The reforms will bring public services closer to users by allowing citizens to participate in planning and monitoring public services in order to meet local needs and priorities.

10. What does "public sector accountability" mean as one of the principles for the decentralization and de-concentration reforms?
a) The reforms will strengthen the roles of councils at the provincial/municipal, district/khan and commune/sangkat administrations, to be established in accordance with the principles of democracy, by expanding their powers, duties, responsibilities and resources.
b) The reforms will introduce systems and procedures to ensure that people, especially women, vulnerable groups and indigenous minorities, can participate in decision-making at the provincial/municipal, district/khan and commune/sangkat levels.
c) The reforms will strengthen accountability at all levels of administration and facilitate citizens' oversight of the administrative and financial affairs of those administrations.
d) The reforms will bring public services closer to users by allowing citizens to participate in planning and monitoring public services in order to meet local needs and priorities.
Session 5: Organic Law on Sub-National Democratic Development

11. On which official document is the draft Organic Law based?
a) Royal Decree on Establishment of the National Committee for the Management of Decentralization and De-concentration
b) Strategic Framework for Decentralization and De-concentration Reforms
c) Law on Commune/Sangkat Administrative Management
d) Constitution of the Kingdom of Cambodia

12. Which organization is responsible for drafting the Organic Law?
a) Ministry of Interior
b) National Committee for the Management of Decentralization and De-concentration
c) Council of Administrative Reform
d) Council of Ministers

13. What is the composition of the preliminary draft Organic Law?
a) (1) councils, (2) personnel, (3) elections
b) (1) councils, (2) council committees, chief councilor, governor, (3) personnel
c) (1) councils, (2) council committees, chief councilor, governor, (3) personnel, (4) national council, implementation, functions & resources
d) (1) councils, (2) council committees, chief councilor, governor, (3) personnel, (4) national council, implementation, functions & resources, (5) elections

14. According to the preliminary draft Organic Law, urban areas will be treated differently from rural areas. Which are the urban areas?
a) Phnom Penh
b) Phnom Penh, Sihanoukville, Kep, and Pailin
c) Phnom Penh, Siem Reap, and Battambang
d) Phnom Penh and other urban areas such as Sihanoukville, Battambang, Siem Reap and Poipet
15. Which one of the following statements about sub-national councils is NOT consistent with the preliminary draft Organic Law?
a) For large councils, a Council Management Committee must be selected by the council from among its councilors.
b) Every council must have a Technical Advisory Board.
c) Every council must establish a Procurement Committee.
d) Every council must establish a Women's Consultative Committee.

Session 9: Case Study: One Window Service

16. Which districts are designated as the model districts in Decision 47?
a) Siem Reap District
b) Battambang District
c) Siem Reap District and Battambang District
d) Siem Reap District, Battambang District, and Kampong Cham District

17. Under which administrative structure is the One Window Service Office established?
a) National administration
b) Provincial administration
c) District administration
d) Commune administration

18. What does the One Window Service Office do?
a) Delivers public services to the people, enterprises and organizations carrying out activities within the territory of the district.
b) Designs the district budget and examines its implementation.
c) Prevents corruption among local officials and helps build good relationships and trust between the district administration and the citizens and enterprises located in the district.
d) Solves all citizens' problems in accordance with the existing laws and regulations.
19. Which one of the following statements best describes the services provided by the One Window Service Office?
a) Services have become faster and cheaper.
b) Services have become faster than before, but service fees are the same as before.
c) Service fees have become cheaper than before, but the processing time is the same as before.
d) Service fees and processing time are the same as before.

20. Which one of the following statements about the strengths of the One Window Service Office is NOT correct?
a) The One Window Service Office is located in a central area where people have easy access.
b) The One Window Service Office opens from 7:30 to 9:30 in the morning and from 2:00 to 4:00 in the afternoon to provide good customer service.
c) The prices of the different services are written on the wall so that everyone can see them.
d) The working group of the One Window Service Office is polite and helpful to the people.
Appendix 3: Impact Survey Questions

1. Was the overall training course/seminar beneficial to your work? (Yes / No)
2. If yes, please explain why it was beneficial. If no or don't know, please try to explain why not.
3. Is there any specific knowledge of the D&D policy and organic law that you have used in your work after the training course/seminar? (Yes / No)
4. If yes, please provide at least one concrete example. If no or don't know, please try to explain why not.
5. Have you used the materials since you participated in the training course/seminar? (Yes / No)
6. If yes, please explain what materials you have used and for what purpose. If no or don't know, please try to explain why not.
7. Did you incorporate anything else you learned in the training course/seminar into your work? (Yes / No)
8. If yes, please provide at least one concrete example. If no or don't know, please try to explain why not.
9. Is there anything which has changed your perception, attitude or behavior as a result of the training course/seminar? (Yes / No)
10. If yes, please provide at least one concrete example.
11. What kind of follow-up support would help you do your work better?
12. Do you have any recommendations or other comments on the training course/seminar?
Appendix 4: Sample Data Input Sheet (GDLA Officials Training)
Appendix 5: Evaluation Report Checklist

1. Title Page
- Title identifies what was evaluated, including the target population, if applicable.
- Title is sufficiently clear and concise to facilitate indexing.
- Authors' names and affiliations are identified.
- Date of preparation is included.
- Text and material on the title page are clearly and properly arranged.

2. Executive Summary
- Description of the program/project
- Evaluation questions and purpose of the evaluation
- Concise summary of the main findings
- Recommendations, if applicable
- Overview and description of the structure of the report

3. Table of Contents and Other Sections That Preface the Report
- Table of contents contains at least all first- and second-level headers in the report.
- Titles and page numbers are accurate.
- Lists of tables, figures, and appendices are included, if applicable.
- List of acronyms or abbreviations is included, if appropriate.
- Acknowledgement section with reference to sponsors, data collectors, informants, contributors to the report, etc., is included.

4. Program Description
- Description of the program being evaluated (including historical context, if appropriate)
- Identification of the target population for the program
- Description of program activities

5. Evaluation Design and Methods
- Purpose of the evaluation and evaluation questions
- Evaluation approach or model being used, as well as the rationale for the approach or model
- Methods of data collection
- Design of the evaluation, including the timing of data collection and the use of specific data collection methods
- Sources of information and data
- Limitations of the evaluation

6. Findings and Results
- Details of the evaluation findings are clearly and logically described.
- Charts, tables, and graphs are appropriately labeled and understandable.
- Discussion of evaluation findings is objective and includes both negative and positive findings.
- All evaluation questions are addressed, or an explanation is included for questions that could not be answered.
- Findings are adequately justified.

7. Conclusions and Recommendations
- Discussion and interpretation of findings are included.
- Summary and conclusions fairly reflect the findings.
- Judgments about the program that cover merit and worth are included.
- If appropriate, recommendations are included and are based on the findings in the report.

8. References and Appendices
- A suitable style or format is used consistently for all references.
- References are free of errors.
- References cover all in-text citations.
- All appendices referenced in the text are included in the appendices section, and vice versa.
- Data and information in the appendices are clearly presented and explained.

Adapted from Torres, Rosalie T., et al. (2005). Evaluation Strategies for Communicating and Reporting. Sage Publications.
MBA Dissertation Guidelines
Faculty of Economics, Management And Accountancy University of Malta MBA Dissertation Guidelines As part of the degree formation you are expected to present a dissertation project. This booklet contains
How To Write A Customer Profile Sheet
How to understand your customers needs and expectations. This information sheet will help you to A) improve the way you collect information from and about your customers. B) use the information to plan
Qualitative methods for effectiveness evaluation: When numbers are not enough
Chapter 7 Qualitative methods for effectiveness evaluation: When numbers are not enough 7.1 Introduction 7.2 Methods of collecting qualitative information 7.2.1 Interviews and focus groups 7.2.2 Questionnaires
Reporting Student Progress: Policy and Practice
Reporting Student Progress: Policy and Practice Table of Contents Contents Introduction... 5 Policy... 6 Formal Reports... 6 Performance Scale... 6 Letter Grades... 7 Reporting on Daily Physical Activity...
Municipal Affairs. Performance Appraisal of a Chief Administrative Officer
Municipal Affairs Performance Appraisal of a Chief Administrative Officer Revised January 2014 Capacity Building, Municipal Services Branch Performance Appraisal of a Chief Administrative Officer Alberta
Chapter 7: Monitoring and Evaluation
Chapter 7: Monitoring and Evaluation The Training Program is relatively new and is still developing. Careful monitoring and evaluation of the Program by the people most closely involved in it will help
MIDLANDS STATE UNIVERSITY
MIDLANDS STATE UNIVERSITY FACULTY OF SCIENCE AND TECHNOLOGY COMPUTER SCIENCE AND INFORMATION SYSTEMS DEPARTMENT.../.../2011 Preamble Students are expected to produce a fully functional Dissertation projects
CONDUCTING IN-DEPTH INTERVIEWS: A Guide for Designing and Conducting In-Depth Interviews for Evaluation Input
P ATHFINDER I NTERNATIONAL T OOL S ERIES Monitoring and Evaluation 2 CONDUCTING IN-DEPTH INTERVIEWS: A Guide for Designing and Conducting In-Depth Interviews for Evaluation Input By Carolyn Boyce, MA,
Usability Evaluation with Users CMPT 281
Usability Evaluation with Users CMPT 281 Outline Usability review Observational methods Interview methods Questionnaire methods Usability ISO 9241-11: The extent to which a product can be used by specified
Building a Training Program
Building a Training Program A Workshop presented to [Insert audience s name here] [Insert trainer s name here] [Insert date/year of training here] [Insert location of training here] Learning Goals At the
Using Focus Groups in Program Development and Evaluation
Using Focus Groups in Program Development and Evaluation by Roger A. Rennekamp, Ph.D. and Martha A. Nall, Ed.D. Extension Specialists in Program and Staff Development 203 Scovell Hall University of Kentucky
An evaluation of the effectiveness of performance management systems on service delivery in the Zimbabwean civil service
An evaluation of the effectiveness of performance management systems on service delivery in the Zimbabwean civil service ABSTRACT P. Zvavahera National University of Science and Technology, Zimbabwe This
Monitoring, Evaluation and Learning Plan
Monitoring, Evaluation and Learning Plan Cap-Net International Network for Capacity Building in Sustainable Water Management November 2009 The purpose of this document is to improve learning from the Cap-Net
Evaluation Summary. Total Cost 295 mill. JPY (Estimated cost as of the end of the Project) Period of Cooperation
Evaluation Summary. Outline of the Project Country: The Lao People s Democratic Republic Issue/Sector: Healthcare and medical treatment Division in charge: Health Division 3, Human Development Department
Customer Service and Communication. Bringing service to the next level
Customer Service and Communication Bringing service to the next level 1 Park Authority Philosophy & Goals Before focusing on customer service, it is first important to understand and reinforce the Park
Performance Appraisal and it s Effectiveness in Modern Business Scenarios
Performance Appraisal and it s Effectiveness in Modern Business Scenarios Punam Singh* *Assistant Professor, Department of Social Work, Sardar Patel University, Vallabh Vidhyanagar, Anand, Gujarat, INDIA.
Steps to Data Analysis
Data Analysis Introduction Data analysis acts as the construction phase of your evaluation. The process of data analysis includes deciding on the appropriate analysis to conduct for each question, preparing
Good Practices in Survey Design Step-by-Step
From: Measuring Regulatory Performance A Practitioner's Guide to Perception Surveys Access the complete publication at: http://dx.doi.org/10.1787/9789264167179-en Good Practices in Survey Design Step-by-Step
GUIDE TO EFFECTIVE STAFF PERFORMANCE EVALUATIONS
GUIDE TO EFFECTIVE STAFF PERFORMANCE EVALUATIONS The research is clear. The outcome is consistent. We know with certainty that the most powerful leadership tool for improving productivity and increasing
Analyzing Research Articles: A Guide for Readers and Writers 1. Sam Mathews, Ph.D. Department of Psychology The University of West Florida
Analyzing Research Articles: A Guide for Readers and Writers 1 Sam Mathews, Ph.D. Department of Psychology The University of West Florida The critical reader of a research report expects the writer to
Writing Reports BJECTIVES ONTENTS. By the end of this section you should be able to :
Writing Reports By the end of this section you should be able to : O BJECTIVES Understand the purposes of a report Plan a report Understand the structure of a report Collect information for your report
QUAๆASSURANCE IN FINANCIAL AUDITING
Table of contents Subject Page no. A: CHAPTERS Foreword 5 Section 1: Overview of the Handbook 6 Section 2: Quality Control and Quality Assurance 8 2. Quality, quality control and quality assurance 9 2.1
HRM. Human Resource Management Rapid Assessment Tool. A Guide for Strengthening HRM Systems. for Health Organizations. 3rd edition
HRM Human Resource Management Rapid Assessment Tool for Health Organizations A Guide for Strengthening HRM Systems 3rd edition . Human Resource Management Rapid Assessment Tool Copyright 2005, renewed
Measuring ROI in Executive Coaching
Measuring ROI in Executive Coaching Jack J. Phillips, Ph.D. and Patricia P. Phillips, Ph.D.
LECTURE 3 REQUIREMENTS GATHERING
LECTURE 3 REQUIREMENTS GATHERING Key Definitions The As-Is system is the current system and may or may not be computerized The To-Be system is the new system that is based on updated requirements The System
How To Monitor A Project
Module 4: Monitoring and Reporting 4-1 Module 4: Monitoring and Reporting 4-2 Module 4: Monitoring and Reporting TABLE OF CONTENTS 1. MONITORING... 3 1.1. WHY MONITOR?... 3 1.2. OPERATIONAL MONITORING...
Section 2. Stakeholder Analysis Guidelines Kammi Schmeer
Section 2 Stakeholder Analysis Guidelines Kammi Schmeer Section 2 6WDNHKROGHU$QDO\VLV *XLGHOLQHV Table of Contents Introduction................................................................ 2-1 Step
True/False Questions
Chapter 7 Training True/False Questions 7-1. On average, U.S. firms spend less than half of what Japanese firms spend on training. Ans: T Page 208 7-2. Continuous learning does not require employees to
TOR - Consultancy Announcement Final Evaluation of the Cash assistance and recovery support project (CARSP)
TOR - Consultancy Announcement Final Evaluation of the Cash assistance and recovery support project (CARSP) Organization Project Position type Adeso African Development Solutions and ACTED - Agency for
Capstone Suggestions for Survey Development for Research Purposes
Capstone Suggestions for Survey Development for Research Purposes 1. Begin by listing the questions you would like to answer with the survey. These questions will be relatively broad and should be based
Board of Commissioners
Board of Commissioners SELF-STUDY HANDBOOK CHAPTER TWO Guidelines for Conducting an Institutional Self-Study TABLE OF CONTENTS Introduction 1 Purpose of the Self-Study 1 Institutional Evaluation 1 Institutional
Energy Management. System Short Guides. A Supplement to the EPA. Guidebook for Drinking Water and Wastewater Utilities (2008)
Energy Management System Short Guides A Supplement to the EPA Energy Management Guidebook for Drinking Water and Wastewater Utilities (2008) PREPARED BY GLOBAL ENVIRONMENT & TECHNOLOGY FOUNDATION (A 501(C)(3)
Performance Appraisal: Director of Education. Date of Next Review: September 2015 (every 2 years)
POLICY SECTION: SUB-SECTION: POLICY NAME: POLICY NO: Board of Trustees Director Performance Appraisal: Director of Education H.C.06 Date Approved: September 26, 2013 Date of Next Review: September 2015
For more on creating a good title, make sure to carefully review section 3 (Title Page)
General Outline for Business Consulting Reports Management 451 Your group s report will follow a similar format. Sample consulting reports from previous semesters are available in the library on course
The Grant Writing Game
101 Ways to Win The Grant Writing Game by Steve Price, Ed.D. and Stephen Price, M.A. Introduction In this brief ebook, Dr. Steve Price and Stephen Price share 101 strategies for successful grant writing
Module 17: EMS Audits
Module 17: EMS Audits Guidance...17-2 Figure 17-1: Linkages Among EMS Audits, Corrective Action and Management Reviews...17-5 Tools and Forms...17-7 Tool 17-1: EMS Auditing Worksheet...17-7 Tool 17-2:
APPLYING THE STRATEGIC PLANNING PROCESS
MANAGEWARE APPLYING THE STRATEGIC PLANNING PROCESS Requirements Each department and agency of state government must engage in the process of strategic planning and must produce a strategic plan to be used
Haulsey Engineering, Inc. Quality Management System (QMS) Table of Contents
Haulsey Engineering, Inc. Quality Management System (QMS) Table of Contents 1.0 Introduction 1.1 Quality Management Policy and Practices 2.0 Quality System Components 2.1 Quality Management Plans 2.2 Quality
The Focus Group Interview and Other Kinds of Group Activities
The Focus Group Interview and Other Kinds of Group Activities Why being clear about what is and what is not a focus group interview important? With the rise in popularity it*s become in vogue to call many
PERFORMANCE MANAGEMENT TRAINING
PERFORMANCE MANAGEMENT TRAINING Performance management is an ongoing process rather than a once a year occurrence. It s a way for you as the supervisor to motivate and develop your employees, and to assist
Customer Service. 1 Good Practice Guide
Customer Service 1 Good Practice Guide Contents Photography by OzShots Foreword 3 The application of this guide to employees in the public service 4 Core principles of customer service 4 Leading and modelling
The remainder of this handbook is divided into the following sections: Getting Started (p. 2); Beginner Mode (pp. 3 5); Expert Mode (pp. 5 11).
User Guide to LAPOP s System for Online Data Analysis (v1) The following guide is designed to help users navigate through and use LAPOP s System for Online Data Analysis (SODA). The system allows individuals
CHAPTER 4 RESULTS. four research questions. The first section demonstrates the effects of the strategy
CHAPTER 4 RESULTS This chapter presents the statistical analysis of the collected data based on the four research questions. The first section demonstrates the effects of the strategy instruction on the
COURSE DELIVERY METHOD
Public Safety Management Degree Program Syllabus for Community Risk Reduction for the Fire and Emergency Services UST 427, 3 Credit Hour Course, Spring Term 2014 PROFESSOR Bernard W. Becker, III, MS, EFO,
Stages of Instructional Design V. Professional Development
Stages of Instructional Design V. Professional Development Derived from Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of Instructional Design (4th ed.). Fort Worth, TX: Harcourt Brace
CIVIL SERVICE COMMISSION. RECRUITMENT PRINCIPLES: review. Consolidation of previous changes and proposed amendments to the explanatory text
CIVIL SERVICE COMMISSION RECRUITMENT PRINCIPLES: review Consolidation of previous changes and proposed amendments to the explanatory text Issue 1. The drafting of the Recruitment Principles explanatory
*Note: Screen magnification settings may affect document appearance.
Good afternoon and welcome to today s Coffee Break presented by the Evaluation and Program Effectiveness Team in the Division for Heart Disease and Stroke Prevention at the CDC. We are very fortunate today
Terms of Reference Baseline Assessment for the employment intensive project for youth in Lower Juba (Dhobley and Afmadow), Somalia
Terms of Reference Baseline Assessment for the employment intensive project for youth in Lower Juba (Dhobley and Afmadow), Somalia Organization African Development Solutions www.adesoafrica.org Project
Performance Management
Performance Management WORKSHOP HANDOUTS Facilitated by: Tara Kemes, Vantage Point Knowledge Philanthropist June 2013 Page 1 of 16 Handout 1 Performance Management System Overview What is performance management?
Copyright 2004.Pamela Cole. All rights reserved.
Key concepts for working with the Role Behavior Analysis The Role Behavior Analysis (RBA), the companion instrument to the Personal Profile System (PPS), uses specific DiSC behavioral statements for defining,
Tennessee Educator Acceleration Model (TEAM) TEAM Evaluation Supplemental Materials 2014
Tennessee Educator Acceleration Model (TEAM) TEAM Evaluation Supplemental Materials 2014 The contents of this manual were developed under a grant from the U.S. Department of Education. However, those contents
AN INTEGRATED APPROACH TO TEACHING SPREADSHEET SKILLS. Mindell Reiss Nitkin, Simmons College. Abstract
AN INTEGRATED APPROACH TO TEACHING SPREADSHEET SKILLS Mindell Reiss Nitkin, Simmons College Abstract As teachers in management courses we face the question of how to insure that our students acquire the
