3D Group Research

Thank you for your interest in the research conducted by 3D Group consultants. This is a partial report. If you would like a complimentary copy of the full report, please call us at (510) 463-0333.

www.3dgroup.net 510.463.0333

Running Head: CHARACTERISTICS OF 360 PROGRAMS

A Benchmarking Study of North American 360-Degree Feedback Practices

Mark C. Healy, Amanda B. Walsh, and Dale S. Rose
3D Group
Berkeley, California

Healy, M. C., Walsh, A. B., & Rose, D. S. (2003, April). A benchmarking study of North American 360-degree feedback practices. Poster session presented at the 18th annual conference of the Society for Industrial and Organizational Psychology, Orlando, Florida.

Abstract

This study investigates current practices in the implementation and use of 360-degree feedback programs in 53 North American organizations. Practices were found to be quite diverse: results revealed substantial variation in survey development methods, survey format and content, selection of raters, and development support. Implications for research and best practices are discussed.

A Benchmarking Study of North American 360-Degree Feedback Practices

There is no shortage of information concerning how to properly implement a 360-degree feedback program. Numerous books (e.g., Bracken, Timmreck, & Church, 2001; Tornow, London, & Associates, 1998) provide well-articulated advice on developing and implementing 360-degree feedback programs, and performance appraisals and feedback of all varieties have been researched throughout the history of management science and industrial/organizational psychology. Nevertheless, several characteristics of 360-degree feedback remain poorly understood. In particular, several very common aspects of real-world implementation have received surprisingly little rigorous research attention. Byham (2002) commented that 360-degree feedback research at the beginning of the millennium is as advanced as research on the selection interview was in the early 1960s; the individual components are only now beginning to be teased apart and evaluated.

A lack of substantial research does not appear to limit the proffering of helpful advice. Thorough books and academic reports on 360-degree feedback (e.g., Van Velsor, Leslie, & Fleenor, 1997), as well as popular, non-academic articles, contain recommendations on how various aspects of 360-degree feedback should be implemented (often referred to as "best practices"). Most treatises on 360-degree feedback, especially those available via the popular press, offer a great deal of advice and courses of action intended to lead to a widely accepted program, positive behavioral change, and improved organizational performance.

Even if many practices in 360-degree feedback are discussed as "the right way," as "best practices," or as "how I did it when I was at Bloated Enterprises, Inc.," 360-degree feedback programs are implemented in a wide variety of organizations by a wide variety of individuals. Little is known about the extent to which best practices and recommended approaches to program design are embraced by organizations and implemented as recommended. A primary question, then, is: what is really going on in 360-degree feedback programs in organizations, and how do these practices compare with the most common advice given?

Recent studies have lent some insight into the activities organizations undertake when implementing 360-degree feedback. For example, Rogers, Barriere, Kaplan, and Metley (2002) attempted to link specific practices with perceptions of positive outcomes in 42 sample organizations. They found that practitioners who perceived their 360 program as highly beneficial to their organization were more likely than those in low-benefit organizations to utilize coaching, approval of raters, evaluation of the program, development planning, and careful selection of coaches. This represents a solid attempt at linking specific practices with positive outcomes. London and Smither (1995) interviewed 20 providers of 360-degree feedback and detailed practices including frequency of administration, links to development programs, format of the report, and use of feedback. Other surveys of 360 practices include those detailed by Timmreck and Bracken (1997) and Linkage (1999).

This paper utilizes data from a survey-based benchmark study of 360-degree feedback programs to describe common practices among North American organizations and to compare them, where possible, with the typical advice given in current articles and books. As described above, surveys of 360 practices have been conducted; however, the authors felt that certain practices had not been covered in enough depth by previous studies (e.g., rater selection and approval, narrative comments, feedback from users, evaluation). We seek to evaluate current practices in two related ways: (1) compare relevant findings to consistently mentioned recommendations and best practices; and (2) where no best practices, or only less-specific ones, are known, consider differences across implementations of 360-degree feedback programs and document where knowledge of best practices may be helpful. For example, narrative comments have received very little research attention, and yet they are utilized almost universally in 360 programs. Consequently, the goal for this practice and other less-researched areas is to document variations in use and thereby inform future study designs and comparisons.

The specific focus of this paper is on item content, survey and report design, and the delivery and interpretation of feedback. We narrow our focus to topics of primary interest to industrial/organizational psychologists and those receiving little in-depth attention in popular business articles.1 These topics are generally within the purview of researchers in industrial/organizational psychology, human resource management, and organizational development. The nature of the study is iterative, and no hypotheses are offered regarding the variation or extent of specific 360 practices.2 However, it was the authors' expectation that a diverse array of practices was in use and that specific procedures and program designs would vary considerably from organization to organization.

Method

Participants and Procedure

Participants were identified and contacted using information found in the 2001-2002 Society for Industrial and Organizational Psychology Membership Directory and through the authors' industry contacts. Of the 315 organizations contacted, 53 were currently administering a 360-degree feedback program and volunteered to participate in the study. All participants had administered, managed, or been deeply involved in a 360-degree feedback project within their organization in the past year.

Occasionally, the initial contact was not the individual most involved with the process; in these cases, the authors were referred to the more involved individual for participation in the interview. Data were gathered using an interview protocol that took an average of twenty-five minutes to complete and was conducted via telephone. In one case, a participant filled out the survey on paper and emailed it to the authors. As an incentive to take part in the study, a complimentary copy of the completed report was offered to all participants.

Measures

The survey contained forty-two open-ended items. The items were designed to explore the educational background and role of the participant in their organization's 360 program, the development process of the organization's 360-degree feedback tool, characteristics of the surveys in use, the process for administering 360 feedback, the specifics of the feedback reports, and the use of 360 and the types of development support offered to participants in 360-degree feedback. Additional questions assessed the participants' opinion of the success of 360-degree feedback in their organization as well as suggestions for improvements to their program.

The first three items on the survey assessed the participants' education and level of experience in conducting 360-degree feedback. Four items then assessed the history of the organization's use of 360 and its reasons for undertaking this type of program. The next eleven items focused on the specifics of the process for developing the 360 instrument(s). Next, seven questions explored the different ways in which organizations administer the 360 feedback process. The next six questions assessed the method for the scoring, design, and content of the feedback reports. Development support methods and uses of 360 were measured using six