SU311: Why 360-Degree Feedback Doesn't Work and What to Do About It
ASTD 2007 International Conference & Exposition, Atlanta, Georgia

Kenneth M. Nowack, Ph.D.
Envisia Learning
3435 Ocean Park Blvd, Suite 203, Santa Monica, CA 90405
(310) 452-5130 / (310) 450-0548 Fax
www.envisiatools.com / ken@envisialearning.com

Learning Objectives
Use research-based best practices in the administration of multi-rater feedback systems for development and evaluation in your organization
Apply a performance coaching model to ensure successful transfer of learning from multi-rater feedback
Evaluate decision points to ensure the success of a multi-rater feedback intervention

Kenneth M. Nowack, Ph.D. is a licensed psychologist (PSY 13758) with over 20 years of experience in the human resources field as both an internal and external consultant. Dr. Nowack received his doctorate in Counseling Psychology from the University of California, Los Angeles, and has published extensively in the areas of 360-degree feedback, assessment, health psychology, and behavioral medicine. Ken is the author of the Emotional Intelligence View 360, Executive View 360, Manager View 360, Performance View 360, Career Profile Inventory, PeopleIndex, and StressScan assessments. Ken is a guest lecturer at the UCLA Anderson School of Management and serves on Daniel Goleman's Consortium for Research on Emotional Intelligence in Organizations.
Leadership Matters
Results of two company-wide employee engagement surveys conducted by Envisia Learning were analyzed for all corporate staff of a large US food service corporation for 2002 and 2004
Employees rated leadership and management practices using a benchmarked 8-item Leadership Effectiveness Index (alpha = .91)
Employees were asked additional questions about retention (intention to leave within 12 months), job satisfaction, and perceptions of stress
Nowack, K. (2006). Emotional intelligence: Leadership makes a difference. HR Trends, 17, 40-42.

Leadership Matters
[Chart: Leadership Effectiveness and Climate (N=153). Employees reporting high versus low leadership effectiveness differed significantly (all p's < .01) on job stress, retention (intention to leave), and job satisfaction.]
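The reliability figure above (alpha = .91) is Cronbach's alpha, a standard internal-consistency estimate for a multi-item scale such as the Leadership Effectiveness Index. As a rough sketch of how that statistic is computed (the ratings below are made up for illustration, not data from the study):

```python
# Cronbach's alpha: internal consistency of a multi-item scale.
# The ratings below are hypothetical, not from the Envisia study.

def cronbach_alpha(items):
    """items: one inner list of scores per scale item,
    all the same length (one score per respondent)."""
    k = len(items)            # number of items
    n = len(items[0])         # number of respondents

    def variance(xs):         # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = sum(variance(item) for item in items)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - item_vars / variance(totals))

# Three hypothetical 5-point items rated by five respondents
ratings = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 4, 3, 4, 1],
]
alpha = cronbach_alpha(ratings)  # high alpha: items move together
```

Values near .9, as reported for the 8-item index, indicate that the items are measuring largely the same underlying construct.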
Secrets Vendors Won't Tell You About 360 Feedback
Most competencies behind 360 tools lack any theoretical model or methodology explaining where they came from
Many 360 tools lack established psychometric properties (i.e., reliability, validity)
Most competencies within 360 tools are typically very highly correlated with each other
Average scores are typically used in reports, and they can be highly misleading to interpret
Use of normative scores can be distracting or misleading
Most vendors have technological limitations in the output of their reports, limiting support for different learning style preferences
It's harder to sell time-series reports, which do a better job of demonstrating actual behavior change
Common 360 Issues and Questions
Should 360 be used for evaluation or development?
What competencies should be measured?
Who should be asked for feedback? How many raters should be asked?
How should raters be selected?
How should confidentiality/anonymity be handled?
Who should receive feedback? Who should deliver the feedback?
How should open-ended questions be presented?
How should graphic data be presented?
How soon should the 360 be repeated?
How should behavior change be facilitated?
Nowack, K. (1999). 360-degree feedback. In D. G. Langdon, K. S. Whiteside, & M. M. McKenna (Eds.), Intervention: 50 Performance Technology Tools (pp. 34-46). San Francisco: Jossey-Bass.

Who Should Select Raters?
Raters can be selected by participants, by others, or through mutual discussion among the participant and the coach, manager, or HR
In a recent study, selected raters were consistently as accurate as non-selected raters; allowing individuals to select their raters may enhance feedback acceptance without reducing rating accuracy
Nieman-Gonder, J. et al. (2006). The effect of rater selection on rating accuracy. Poster presented at the 21st Annual Conference of the Society for Industrial and Organizational Psychology, Dallas, TX, May 2006.
Does 360-Degree Feedback Result in Improved Performance?
Watson Wyatt's 2001 Human Capital Index, an ongoing study of the linkages between HR practices and shareholder value at 750 publicly traded US companies, found that 360-degree feedback programs were associated with a 10.6 percent decrease in shareholder value
Pfau, B. & Kay, I. (2002). Does 360-degree feedback negatively affect company performance? HR Magazine, 47(6), June 2002.

A meta-analysis of over 3,000 studies of performance feedback (607 effect sizes, 23,633 observations) found that although there was a significant overall effect for feedback interventions (d = .41), one third of all studies showed performance declines
Kluger, A. & DeNisi, A. (1996). The effects of feedback interventions on performance: A historical review, a meta-analysis, and a preliminary feedback intervention theory. Psychological Bulletin, 119, 254-285.
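The effect size cited above (d = .41) is Cohen's d, the standardized difference between two group means expressed in pooled standard-deviation units. A minimal sketch of the calculation, using hypothetical scores rather than data from the meta-analysis:

```python
# Cohen's d: standardized mean difference between two groups,
# the effect-size metric behind the d = .41 figure cited above.
# Scores below are illustrative only, not meta-analysis data.
import math

def cohens_d(group1, group2):
    n1, n2 = len(group1), len(group2)
    m1 = sum(group1) / n1
    m2 = sum(group2) / n2
    v1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    # Pooled standard deviation across both groups
    sp = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / sp

# Hypothetical post-intervention performance ratings
feedback_group = [3.8, 4.1, 3.5, 4.0, 3.9, 3.6]
control_group = [3.5, 3.7, 3.4, 3.8, 3.3, 3.6]
d = cohens_d(feedback_group, control_group)
```

By convention, d of about .2 is considered a small effect, .5 medium, and .8 large, which is why the .41 average across feedback studies is read as a modest effect.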
Does 360-Degree Feedback Result in Improved Performance?
Atwater and colleagues found that improvement following an upward feedback intervention occurred for only 50% of the supervisors who received it
Atwater, L., Waldman, D., & Cartier (2000). An upward feedback field experiment: Supervisors' cynicism, follow-up and commitment to subordinates. Personnel Psychology, 53, 275-297.

A recent meta-analysis of 26 longitudinal studies indicates significant but small effect sizes, suggesting that it is unrealistic to expect large performance improvements after people receive 360-degree feedback
Smither, J., London, M., & Reilly, R. (2005). Does performance improve following multisource feedback? A theoretical model, meta-analysis, and review of empirical findings. Personnel Psychology, 58, 33-66.
Are Raters Providing Unique Information? Within-Rater Agreement Research
A recent meta-analysis found that the average correlation between two supervisors is only .50, between two peers .37, and between two subordinates .30
Conway, J. & Huffcutt, A. (1997). Psychometric properties of multi-source performance ratings: A meta-analysis of subordinate, supervisor, peer and self-ratings. Human Performance, 10, 331-360.

Are Raters Providing Unique Information? Between-Rater Agreement Research
Aggregate comparisons of self and other ratings across 20 managerial competencies (Manager View 360) demonstrated correlations ranging from .12 to .30, with an average correlation of .21 (all p's < .01)
Nowack, K. (1992). Self-assessment and rater-assessment as a dimension of management development. Human Resource Development Quarterly, 3, 141-155.
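The agreement figures above are Pearson correlation coefficients (r) between pairs of rater scores. A minimal sketch of the statistic, using invented ratings rather than data from either study:

```python
# Pearson correlation r, the statistic behind the rater-agreement
# figures cited above. Ratings here are hypothetical, not study data.
import math

def pearson_r(xs, ys):
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Two hypothetical peers rating the same eight colleagues (1-5 scale)
peer_a = [4, 3, 5, 2, 4, 3, 5, 4]
peer_b = [3, 3, 4, 2, 5, 2, 4, 4]
r = pearson_r(peer_a, peer_b)
```

An average r of .37 between two peers means the raters share only about 14% of their rating variance (r squared), which is why each rater source is treated as contributing unique information.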
Are Raters Providing Unique Information? Between-Rater Agreement Research
A recent meta-analytic study found moderately high correlations between peer and supervisor ratings (r = .65), but only modest correlations between self and peer (r = .36) and self and supervisor ratings (r = .36)
Harris, M. & Schaubroeck, J. (1988). A meta-analysis of self-supervisor, self-peer and peer-supervisor ratings. Personnel Psychology, 41, 43-62.

Self-Other Perceptions
[Diagram: different rater groups observe different aspects of the participant. BOSS: technical competence and bottom-line performance ("burr in the saddle" effect); REPORTS: derailment factors (EI); PEERS: leadership potential.]
Nowack, K. (2002). Does 360-degree feedback negatively affect company performance? Feedback varies with your point of view. HR Magazine, 47(6), June 2002.
Positive Illusions and Self-Delusions
In general, self-ratings are inflated relative to others' ratings in US studies
Overestimators tend to be:
Male
Older managers
Less educated
Those with greater tenure
Those who supervise more employees
Ostroff, Atwater & Feinberg (2004). Understanding self-other agreement: A look at rater and ratee characteristics, context, and outcomes. Personnel Psychology, 57, 333-375.
Receptivity of Feedback
Individuals who have higher levels of organizational commitment are more motivated to change, regardless of the nature of the feedback
Individuals who hold more positive attitudes toward the feedback process are also more motivated to change and report more follow-up activities than those who hold less positive attitudes
Atwater & Brett, 2003

Equivalence of Paper Versus Web-Based 360
All 360 studies that have been published to date show evidence of equivalence between paper-and-pencil and electronic versions
Additional support for likely equivalence comes from other testing research (e.g., personality)
Penny, J. (2003). Exploring differential item functioning in a 360-degree assessment: Rater source and method of delivery. Organizational Research Methods, 6, 61-79.
Smither, J., Walker, A. & Yap, M. (2004). An examination of web-based versus paper-and-pencil upward feedback ratings. Educational and Psychological Measurement, 64, 40-61.
Ployhart, R. et al. (2003). Web-based and paper-and-pencil testing of applicants in a proctored setting: Are personality, biodata, and situational judgment tests comparable? Personnel Psychology, 56, 733-752.
Impact of Numeric Feedback
Individuals appear to be significantly less positive, more negative, and less motivated after receiving text feedback than after receiving numeric feedback
Individuals appear to prefer numeric scores and normative feedback in multi-rater interventions
Atwater & Brett, 2003

Impact of Open-Ended Comments
In a study of 176 managers followed for one year, 70% of written comments were generally positive
The number of comments received ranged from 1 to 35
Favorable comments were correlated with improved performance
Managers who received a small number of unfavorable behavioral/task comments improved more than other managers
Managers who received a large number of unfavorable behavioral/task comments declined in performance compared to other managers
Smither, J. & Walker, A. (2004). Are the characteristics of narrative comments related to improvement in multirater feedback ratings over time? Journal of Applied Psychology, 89, 575-581.
Components of the Talent Accelerator
Development Resource Library: the Talent Accelerator resource library provides a comprehensive source of readings, websites, media, and suggestions to facilitate development
Feedback Reports: Talent Accelerator provides an electronic copy of the assessment summary report
Development Suggestions: for each assessment tool, specific developmental suggestions or tips are provided to enhance job effectiveness
Development Planning Wizard: the development wizard provides a structured way for users to focus on the behaviors that are most important
Automated Reminders: Talent Accelerator allows users to select how often the system sends out reminders about due dates on their development plan

Outcomes With 360 Feedback and Coaching
Olivero et al. (1997) found that employee coaching increased productivity over and above the effects of a managerial training program
Thach (2002) found that six weeks of coaching following 360 feedback increased results by 60%
Smither et al. (2003) reported that after receiving 360 feedback, managers who worked with a coach were more likely to set specific goals and solicit ideas for improvement, and subsequently received improved performance ratings
360 Feedback and Manager Involvement
In a recent doctoral study of 360 feedback across industries (Rehbine, 2006), 62% of respondents reported being dissatisfied or highly dissatisfied with the amount of time their manager spent helping with a development plan
The same level of dissatisfaction was reported for the time managers spent providing coaching and feedback after the 360 feedback process was completed
About 65% expressed strong interest in using an online follow-up tool to measure progress toward behavioral change

Development Planning & Talent Accelerator
360 feedback alone: < 5%
360 feedback and Talent Accelerator: 10% to 15%
Coaching, Talent Accelerator, and manager follow-up: > 65%
Envisia 360 Feedback Study
ISSUE: A large telecommunications company was using diverse 360 interventions:
Different competency models
Different 360 tools
Different applications (talent management, executive development)
Different feedback models
HR requested an evaluation of the usefulness of these 360 interventions
Nowack, K., Hartley, J. & Bradley, W. (1999). Evaluating results of your 360-degree feedback intervention. Training and Development, 53, 48-53.

Envisia 360 Feedback Study: Best Practices
Provide individual coaching to assist in interpreting and using the 360 feedback results
Hold the participant and manager accountable for creating and implementing a professional development plan
Track and monitor progress on the completion of the development plan
Link the 360 intervention to a human resources performance management process
Use 360 tools with established psychometric properties
Target competencies for 360 feedback interventions that are related to strategic business needs
Nowack, K. (2005). Longitudinal evaluation of a 360-degree feedback program: Implications for best practices. Paper presented at the 20th Annual Conference of the Society for Industrial and Organizational Psychology, Los Angeles, March 2005.
How to Maximize 360 Feedback
Some evidence that feedback provided within a workshop format facilitates behavior change (Seifert & Yukl, 2003; Nowack, 2005)
Evidence that coaching coupled with 360 feedback can facilitate behavior change (Smither, J. et al. (2003). Can working with an executive coach improve multisource feedback ratings over time? A quasi-experimental field study. Personnel Psychology, 56, 23-44)
Some limited evidence that use of an online development planning system and a competency-based resource center, with managerial involvement, can facilitate behavior change (Nowack, 2006)

360 Feedback: Selected References
Nowack, K. (2005). Longitudinal evaluation of a 360-degree feedback program: Implications for best practices. Paper presented at the 20th Annual Conference of the Society for Industrial and Organizational Psychology, Los Angeles, March 2005.
Nowack, K. (1999). 360-degree feedback. In D. G. Langdon, K. S. Whiteside, & M. M. McKenna (Eds.), Intervention: 50 Performance Technology Tools (pp. 34-46). San Francisco: Jossey-Bass.
Nowack, K., Hartley, J. & Bradley, W. (1999). Evaluating results of your 360-degree feedback intervention. Training and Development, 53, 48-53.
Nowack, K. (1999). Manager View/360. In Fleenor, J. & Leslie, J. (Eds.), Feedback to managers: A review and comparison of sixteen multi-rater feedback instruments (3rd edition). Greensboro, NC: Center for Creative Leadership.
Wimer, S. & Nowack, K. (1998). 13 common mistakes in implementing multi-rater systems. Training and Development, 52, 69-79.
Nowack, K. & Wimer, S. (1997). Coaching for human performance. Training and Development, 51, 28-32.
Nowack, K. (1997). Congruence between self and other ratings and assessment center performance. Journal of Social Behavior & Personality, 12, 145-166.
Nowack, K. (1994). The secrets of succession. Training & Development, 48, 49-54.
Nowack, K. (1993). 360-degree feedback: The whole story. Training & Development, 47, 69-72.
Nowack, K. (1992). Self-assessment and rater-assessment as a dimension of management development. Human Resource Development Quarterly, 3, 141-155.
Job Aid: Six Decision Points for Best Practices in Using 360-Degree Feedback
1. Is the 360 feedback intervention related to a strategic business need?
2. Is the 360 intervention linked to a human resources system?
3. Are you using 360 tools with established psychometric properties?
4. Are you providing coaching to assist in interpreting the 360 feedback results?
5. Are the participant and manager accountable to create and implement a professional development plan?
6. Is there a mechanism to track and monitor progress on the completion of the plan?