UOttawa / Quality PULSE 360 Survey Frequently Asked Questions by TRAINEES

1. Why am I doing this?
a. Receiving candid feedback is one of the most useful learning tools. It's an opportunity to confirm the competencies in which you excel (most trainees don't hear often enough what they are doing right!). It's also an opportunity to identify any areas that could benefit from improvement, since even the best of us can be better. While it's human nature to be apprehensive about almost any kind of feedback (even public praise makes most people uncomfortable), uOttawa's 360-degree program emphasizes the importance of becoming proficient at both giving and receiving constructive, formative feedback in order to enhance learning and development. Multi-level 360-degree feedback from the perceptual vantage point of attendings, peers and healthcare staff is especially effective at giving trainees valuable information about their communicator, collaborator and other non-technical skills related to quality of care. It is a formative, not summative, process designed to reduce errors and increase quality, and it's based on the principle that you can't change what you don't know.

2. What is the uOttawa / PULSE 360 Survey?
a. The PULSE (Physicians' Universal Leadership Skills Education) Survey is a 360-degree educational and developmental tool that provides all trainees with feedback on a variety of CanMEDS competencies. All uOttawa trainees are expected to participate in the PULSE Survey as part of quality improvement.

3. Where else is the PULSE 360 Survey used?
a. The PULSE Survey has been used in numerous hospitals, clinics and healthcare organizations throughout Canada, including McGill University, Montreal Children's Hospital, St. Joseph's (Toronto), CSSS d'Antoine-Labelle, Hotel-Dieu Grace Hospital (Windsor), the Canadian Medical Protective Association and the Ontario Medical Association. Elsewhere in North America, PULSE is used at seven of the Harvard-affiliated hospitals, including Massachusetts General Hospital, Brigham and Women's Hospital, Beth Israel Deaconess Medical Center, and Boston Children's Hospital. Other academic medical centers that use PULSE include, for example, the Universities of Michigan, Miami, California, Tennessee and Utah.

4. Who participates in the PULSE 360 Survey?
a. Over 4,000 healthcare professionals, including over 2,000 physicians, have received PULSE Survey feedback from over 45,000 healthcare providers. Feedback recipients range from medical students, residents and fellows, to nurse practitioners and physician assistants, to hospital administrators and department managers, to attendings, community physicians and physician leaders (such as chairs of departments and chiefs of divisions). In short, almost every kind of healthcare practitioner has participated in the PULSE.

5. What are the types of questions on the PULSE Survey?
a. There are what PULSE calls "Motivating Behaviors" and "Discouraging Behaviors," as well as CanMEDS-type competencies. The questions are adapted to each major uOttawa residency specialty.

6. How long does a PULSE Survey take?
a. The typical PULSE Survey takes less than five minutes to complete, so it's very quick. The Survey is online and, for the self-rating, the trainee simply clicks on a Likert-type scale response for each of the questions. It can take even less time if the rater has completed several surveys and is already familiar with the process.

7. What if the rater doesn't know me that well?
a. Some raters will not be able to answer every single question. For that reason, each question has a "don't know" option.

8. How are the raters selected?
a. Each residency program may be a little different, but it's almost always a shared process. PULSE typically emails the trainee to select a minimum number of raters from each of the three Rater Groups; this is done on the PULSE Selection Page, which includes tabs and drop-downs for 1) Attendings, 2) Peers, and 3) Healthcare Staff. The Program Director then validates the list. The role of the Validator is to either approve the trainee's list as is or to confidentially add any missing names to help provide more meaningful feedback. It is up to the Validator to decide whether or not to share the names of any added raters. If a trainee doesn't select raters by the deadline date, after repeated automated reminders, the PULSE system asks the Program Director to make the selections for the trainee. The entire process is highly automated in order to make it fast and easy. Once the program is set to launch for your department, an email will be sent to each trainee with more detailed instructions.

9. IMPORTANT! Rater selection guidelines
a. Select a wide range of raters! While it's tempting to select raters who are friends or whom you think will provide favorable feedback, it's far more helpful to invite raters who are likely to give you a wider span of observations, from compliments to constructive criticism. Also, your Program Director, as your Validator, is likely to perceive you as more open to feedback and interested in learning if you choose a more bell-curve-shaped spread of raters. Try as best as you can to include the raters you think your Program Director would add.
b. Select raters who are more recent or regular! Try to select raters with whom you have worked more recently, since they will be more familiar with you. Alternatively, for raters who are less recent, choose ones with whom you've had more concentrated and regular contact.
c. Try to avoid selecting the over-selected! On the PULSE Selection Page, you'll be able to see how many times each potential rater has already been selected. Avoid selecting the raters who have already been selected the most, since we want to spread out the time it takes to give feedback and give as many team members as possible the opportunity to provide feedback.
d. Make sure before you save your rater list! When you save your list of raters, the system will ask, "Are you sure?" and, if you confirm that your list is final, the automated software will lock in your selections. Please double-check before finalizing your list.

10. Do trainees complete a Self-Rating?
a. Yes, trainees score themselves as part of the survey process. At the end of the survey, the trainee is invited to write down the self-improvement goals that he/she wants to start, stop and keep on doing. It is an opportunity to reflect on how they do on each of the behaviors and, when trainees receive their PULSE Reports, to compare their self-assessment with the perceptions of others.

11. How are the surveys sent to raters?
a. All surveys are sent by email.

12. How much time do raters have to do their surveys?
a. The surveys are open for 21 days.

13. What happens after the survey feedback is collected?
a. After self and other responses have been collected and analyzed, a PULSE Feedback Report is generated to provide feedback to the trainee. The report is designed to preserve the anonymity of the raters' individual survey responses while, at the same time, providing detailed, meaningful and helpful feedback to the trainee.

14. How does the PULSE Feedback Report present the numerical scores?
a. Graphs: There are graphs that present the rating groups and scales, which might look like this:
b. Lists: The top five questions for some categories are ranked, based on the top five scores from everyone in all groups combined. That is what the middle column below, called "All-but-Self Avg," means, and that page may look like this:
c. Tables: The trainee will be able to see, in general, whether the rating averages for each question are in the mid-range, borderline, outlying or very high range. The numerical ratings are presented as means without standard deviations in order to enhance rater anonymity, since a small number of raters and a large SD would suggest that a few of the raters gave the trainee very low scores. The number of raters for each question is not provided for similar reasons. However, if too few raters responded to a question, suggesting that the mean would not be meaningful, no score will be displayed.
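As a purely hypothetical worked example (the rater counts and scores below are invented for illustration; they are not PULSE data or PULSE's actual report logic), the short Python sketch that follows shows why publishing a standard deviation alongside the mean could compromise anonymity in a small rater group: two groups of four raters can share the same mean while the SD alone betrays that one rater scored the trainee very low.

# Hypothetical illustration only: why a mean plus SD can erode rater anonymity
# in a small rater group (invented scores on a 5-point scale, not PULSE data).
from statistics import mean, stdev

uniform_group = [4, 4, 4, 4]   # four raters who all give a 4
one_low_rater = [5, 5, 5, 1]   # three 5s and one very low score

for label, scores in (("uniform", uniform_group), ("one low rater", one_low_rater)):
    print(f"{label:14s} mean = {mean(scores):.2f}   sd = {stdev(scores):.2f}")

# Output:
#   uniform        mean = 4.00   sd = 0.00
#   one low rater  mean = 4.00   sd = 2.00
# Both groups report the same mean, but the large SD in the second group makes it
# easy to infer that a single rater gave a very low score, which is why the PULSE
# report shows means only and suppresses scores when too few raters respond.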

Here is a snippet of what the feedback generally might look like:

15. How does the PULSE Feedback Report present written comments in an anonymous way?
a. PULSE uses its own proprietary protocols, called Anonymity Editing and Anonymity Clustering, to present comments in a de-identified, yet meaningful, format. This means that PULSE typically enhances anonymity and then rearranges, re-numbers and re-groups the comments into the most frequent themes, which are then bolded. A trainee who has a strong need to improve his/her listening skills could have comments grouped like this:

16. As a trainee receiving feedback, what safeguards are there against raters providing invalid feedback?
a. We strongly encourage raters to provide constructive, balanced and helpful feedback. However, if a rater attempts to skew the feedback with exaggerated scoring in either a favorable or unfavorable direction, the PULSE software can flag response patterns that suggest extreme and unrealistic lenience or harshness, and those raters' responses may be dropped from the feedback report if PULSE determines them to be overly biased (see the generic sketch after question 17).

17. How do trainees review their PULSE Feedback Reports?
a. Trainees are expected to review their feedback with their Program Directors to better understand how their skill sets are perceived and to identify excellence goals.
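The FAQ does not describe how PULSE detects skewed scoring, and its screening logic is proprietary. Purely as a generic, hypothetical sketch of what flagging extreme response patterns can look like, the Python below compares each invented rater's average score against the other raters' averages and flags large deviations for review; the function name, scores and cutoff are all assumptions made for the illustration, not PULSE's actual method.

# Generic illustration only -- not PULSE's proprietary screening method.
# Flags raters whose overall average falls far outside the range set by the
# other raters (a simple leave-one-out z-score check on invented data).
from statistics import mean, stdev

def flag_extreme_raters(ratings_by_rater, z_cutoff=2.0):
    """ratings_by_rater: dict mapping rater id -> list of item scores.
    Returns the ids of raters whose average score is an extreme outlier."""
    averages = {rater: mean(scores) for rater, scores in ratings_by_rater.items()}
    flagged = []
    for rater, avg in averages.items():
        others = [a for other, a in averages.items() if other != rater]
        if len(others) < 2:
            continue                      # too few raters to judge anyone
        m, s = mean(others), stdev(others)
        if s > 0 and abs(avg - m) / s > z_cutoff:
            flagged.append(rater)
    return flagged

# Example: one invented rater gives straight 1s while the others cluster near 4.
ratings = {
    "rater_a": [4, 4, 5, 4],
    "rater_b": [4, 3, 4, 4],
    "rater_c": [5, 4, 4, 5],
    "rater_d": [1, 1, 1, 1],
}
print(flag_extreme_raters(ratings))       # ['rater_d'] is flagged for review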

18. What if I get negative feedback?
a. The vast majority of trainees receive favorable feedback that highlights their strengths and the qualities that others appreciate and value. The goal of PULSE 360 feedback is to help trainees go from good to great. The purpose of being a trainee, of course, is to train, and that means learning how to do things better. One of the best times to receive formative feedback is during your training, when the emphasis is on growth and development. No one is perfect, and we all have areas in which we can improve. The goal of feedback can be summed up in two words: be better.

19. If the survey is anonymous, why does the PULSE Software System keep reminding me to complete my Self-Rating or other surveys?
a. Your survey feedback is absolutely anonymous. In order to ensure that all raters provide feedback, the PULSE software is programmed to automatically remind them that there is still time to complete the survey and give helpful feedback. This includes reminding the trainee to select raters and to complete the self-rating. PULSE reminder emails are automatically generated when a survey has not been completed. Once the survey's Finish button is clicked, the automated reminders stop. The PULSE Survey reminder system is completely automated, and reminders are sent every few days to anyone who has not completed the next step in the PULSE process.

20. If you have any additional questions, please email us at PulseInfo@PdpFlorida.com or call the PULSE Program directly at 416-800-9203.