Utilization-Focused Evaluation with Michael Quinn Patton September 21, 2010

1st edition 1978, 2nd edition 1986, 3rd edition 1997, 4th edition 2008

Research-based Approach: Original follow-up study of the use of 20 federal health evaluations; 35 years of research on use.

Evaluation and Research: Same or different? And why?

Foundational Premise: Research and evaluation are different, and therefore evaluated by different standards.

Evaluation Standards
- Utility: ensure relevance & use
- Feasibility: realistic, prudent, diplomatic & frugal
- Propriety: ethical, legal, respectful
- Accuracy: technically adequate to determine merit or worth
For the full list of Standards: www.wmich.edu/evalctr/checklists/standardschecklist.htm

Evaluation Models: U-FE is one among many.

Utilization-Focused Evaluation (U-FE): A decision-making framework for enhancing the utility and actual use of evaluations.

U-FE begins with the premise that evaluations should be judged by their utility and actual use. Therefore, evaluators should facilitate the evaluation process and design any evaluation with careful consideration of how everything that will be done, from beginning to end, will affect use.

Use Distinctions: Instrumental Use, Conceptual Use, Misuse

USE
- Take use seriously by evaluating use, the source of our own accountability and ongoing learning/professional development
- Different from dissemination
- Different from producing reports
- Groundwork laid and expectations set at the beginning
- Doesn't happen naturally or automatically

An ancient example: What lessons about making evaluation useful do you extract from this example?

Summary Lessons on Useful Evaluation
- Clearly identify primary intended users
- Clearly identify primary intended uses
- Goal: Intended use by intended users
- Negotiate FOCUS: get agreement on criteria
- Establish a clear ACTION framework
- Distinguish empirical questions from value questions
- Select methods appropriate to the question
- Facilitate actual use of the findings

Goal of U-FE: Intended Use by Intended Users

Intended Evaluation Users: From Audiences to Stakeholders to Primary Intended Users. Connotative differences?

Personal Factor

Critical success factors: There are five key variables that are absolutely critical in evaluation use. They are, in order of importance: People, People, People, People, PEOPLE.

Identify and Involve Primary Intended Users

Intended Use by Intended Users: Facilitating Intended Use Options

Different Evaluation Purposes
- For making judgments: commonly called summative evaluations
- For improving programs: commonly called formative evaluations
- For ongoing development: sometimes called developmental evaluations

Lessons Learned Purpose: Knowledge building; meta-evaluation, lessons learned, effective practices

Additional purpose distinctions: Accountability; Monitoring (M&E)

Tensions: Different intended uses serve different purposes and, typically, different intended users. Thus the need to FOCUS and manage tensions between and among different purposes.

Balancing Different Purposes

Premises of Utilization-Focused Evaluation
- No evaluation should go forward unless and until there are primary intended users who will use the information that can be produced
- Primary intended users are involved in the process
- Evaluation is part of initial program design; the primary intended users want information to help answer a question or questions
- Evaluator's role is to help intended users clarify their purpose and objectives
- Make implications for use part of every decision throughout the evaluation: the driving force of the evaluation process

Important trend: Capacity-building. Evaluation capacity-building as a priority to support use.

Process Use Process use refers to and is indicated by individual changes in thinking and behavior, and program or organizational changes in procedures and culture, that occur among those involved in evaluation as a result of the learning that occurs during the evaluation process. Evidence of process use is represented by the following kind of statement after an evaluation: "The impact on our program came not so much from the findings but from going through the thinking process that the evaluation required."

Process Uses
- Enhancing shared understandings
- Focusing programs: what gets measured gets done
- Supporting and reinforcing the program intervention, e.g., feedback for learning
- Capacity-building for those involved, deepening evaluative thinking
- Program and organizational development, e.g., evaluability assessments

New Direction: Infusing evaluative thinking as a primary type of process use. Capacity-building as an evaluation focus of process use.

Some premises:
- Evaluation is part of initial program design, including conceptualizing the theory of change
- Evaluator's role is to help users clarify their purpose, hoped-for results, and change model
- Evaluators can/should offer conceptual and methodological options
- Evaluators can help by questioning assumptions
- Evaluators can play a key role in facilitating evaluative thinking all along the way

Utilization-Focused Methods Decisions

Shaping of an issue over time: The morphing of the paradigms debate (qualitative vs. quantitative) into the Gold Standard debate (randomized controlled trials as the alleged "gold standard" for impact evaluation).

GOLD STANDARD: METHODOLOGICAL APPROPRIATENESS, not methodological orthodoxy or rigidity.

No single design or method is universally strongest. There are multiple ways of establishing causality. Privileging one method is dangerous and creates perverse incentives.

The Challenge: Matching the evaluation design to the evaluation's purpose, resources, and timeline to optimize use.

And the beat goes on: Evaluation as an ever-evolving field.

Reference: Patton, Michael Quinn. Utilization-Focused Evaluation, 4th ed. Sage Publications, 2008.