UOttawa / Quality PULSE 360 Survey
Frequently Asked Questions by TRAINEES

1. Why am I doing this?
a. Receiving candid feedback is one of the most useful learning tools. It's an opportunity to confirm the competencies in which you excel (most trainees don't hear often enough what they are doing right!). It's also an opportunity to identify any areas that could benefit from improvement, since even the best of us can be better. While it's human nature to be apprehensive about almost any kind of feedback (even public praise makes most people uncomfortable), uOttawa's 360-degree program emphasizes the importance of becoming proficient at both giving and receiving constructive, formative feedback in order to enhance learning and development. Multi-level 360-degree feedback from the perceptual vantage points of attendings, peers and healthcare staff is especially effective at giving trainees valuable information about their communicator, collaborator and other non-technical skills related to quality of care. It is a formative, not summative, process designed to reduce errors and increase quality, and it's based on the principle that you can't change what you don't know.

2. What is the uOttawa / PULSE 360 Survey?
a. The PULSE (Physicians Universal Leadership Skills Education) Survey is a 360-degree educational and developmental tool that provides all trainees with feedback on a variety of CanMEDS competencies. All uOttawa trainees are expected to participate in the PULSE Survey as part of quality improvement.

3. Where else is the PULSE 360 Survey used?
a. The PULSE Survey has been used in numerous hospitals, clinics and healthcare organizations throughout Canada, including McGill University, the Montreal Children's Hospital, St. Joseph's (Toronto), CSSS d'Antoine-Labelle, Hôtel-Dieu Grace Hospital (Windsor), the Canadian Medical Protective Association and the Ontario Medical Association. Elsewhere in North America, PULSE is used at seven of the Harvard-affiliated hospitals, including Massachusetts General Hospital, Brigham and Women's Hospital, Beth Israel Deaconess Medical Center, and Boston Children's Hospital. Other academic medical centers that use PULSE include, for example, the Universities of Michigan, Miami, California, Tennessee and Utah.

4. Who participates in the PULSE 360 Survey?
a. Over 4,000 healthcare professionals, including over 2,000 physicians, have received PULSE Survey feedback from over 45,000 healthcare providers. Feedback recipients range from medical students, residents and fellows, to nurse practitioners and physician assistants, to hospital administrators and department managers, to attendings, community physicians and physician-leaders (such as chairs of departments and chiefs of divisions). In short, almost every kind of healthcare practitioner has participated in the PULSE.
5. What are the types of questions on the PULSE Survey?
a. There are what PULSE calls Motivating Behaviors and Discouraging Behaviors, as well as CanMEDS-type competencies. The questions are adapted to each major uOttawa residency specialty.

6. How long does a PULSE Survey take?
a. The typical PULSE Survey takes less than five minutes to complete, so it's very quick. The Survey is online and, for the self-rating, the trainee simply clicks a Likert-type scale response for each question. It can take even less time if the rater has completed several surveys and is already familiar with the process.

7. What if the rater doesn't know me that well?
a. Some raters will not be able to answer every single question. For that reason, each question has a "don't know" option.

8. How are the raters selected?
a. Each residency program may differ slightly, but rater selection is almost always a shared process. PULSE typically emails the trainee to select a minimum number of raters from each of the three Rater Groups. This step is called the PULSE Selection Page and includes tabs and drop-downs for 1) Attendings, 2) Peers and 3) Healthcare Staff, and the Program Director may confidentially add any missing names to validate the list. The role of the Validator is either to approve the trainee's list as is or to confidentially add any additional names to help provide more meaningful feedback. It is up to the Validator to decide whether or not to share the names of any added raters. If a trainee doesn't select raters by the deadline date, after repeated automated reminders, the PULSE system asks the Program Director to make the selections for the trainee. The entire process is highly automated in order to make it fast and easy. Once the program is set to launch for your department, an email will be sent to each trainee with more detailed instructions.

9. IMPORTANT! Rater selection guidelines
a. Select a wide range of raters! While it's tempting to select raters who are friends or who you think will provide favorable feedback, it's far more helpful to invite raters who are likely to give you a wider span of observations, from compliments to constructive criticism. Also, your Program Director, as your Validator, is likely to perceive you as more open to feedback and interested in learning if you choose a more bell-shaped spread of raters. Try as best you can to include the raters you think your Program Director would add.
b. Select raters who are more recent or regular! Try to select raters with whom you have worked more recently, since they will be more familiar with you. Alternatively, for raters who are less recent, choose ones with whom you've had more concentrated and regular contact.
c. Try to avoid selecting the over-selected! On the PULSE Selection Page, you'll be able to see how many times each potential rater has already been selected. Avoid selecting raters who have already been selected the most, since we want to spread out the time it takes to give feedback and give as many team members as possible the opportunity to provide feedback.
d. Make sure before you save your rater list! When you save your list of raters, the system will ask, "Are you sure?" and, if you confirm that your list is final, the automated software will lock in your selections. Please double-check before finalizing your list.

10. Do trainees complete a Self-Rating?
a. Yes, trainees score themselves as part of the survey process. At the end of the survey, the trainee is invited to write down the self-improvement goals that he/she wants to start, stop and keep on doing. It is an opportunity to reflect on how they do on each of the behaviors and, when trainees receive their PULSE Reports, to compare their self-assessment with the perceptions of others.

11. How are the surveys sent to raters?
a. All surveys are sent by email.

12. How much time do raters have to do their surveys?
a. The surveys are open for 21 days.

13. What happens after the survey feedback is collected?
a. After self and other responses have been collected and analyzed, a PULSE Feedback Report is generated to provide feedback to the trainee. The report is designed to protect the anonymity of the raters' individual survey responses while, at the same time, providing detailed, meaningful and helpful feedback to the trainee.
14. How does the PULSE Feedback Report present the numerical scores?
a. Graphs: There are graphs that present the rating groups and scales, which might look like this:
b. Lists: The top five questions for some categories are ranked, based on the top five scores from everyone in all groups combined. That is what the middle column below, called "All-but-Self Avg," means, and that page may look like this:
c. Tables: The trainee will be able to see, in general, whether the rating averages for each question are in the mid-range, borderline, outlying range or very high. The numerical ratings are presented as means without standard deviations in order to enhance rater anonymity, since a small number of raters and a large SD would suggest that a few of the raters gave the trainee very low scores. The number of raters for each question is not provided for similar reasons. However, if too few raters responded to a question, suggesting that the mean would not be meaningful, no score will be displayed.
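Purely as an illustration of how such a minimum-response rule could work (the threshold value, the 1-5 scale and the function name below are assumptions for this sketch, not PULSE's actual implementation), the display logic might resemble the following:

```python
from statistics import mean

# Illustrative only: the threshold value is an assumption, not PULSE's actual rule.
MIN_RATERS = 3  # assumed minimum number of responses needed to display a score

def display_score(ratings):
    """Return the mean rating, or None when too few raters responded.

    Only the mean is reported (no standard deviation, no rater count),
    mirroring the anonymity-preserving presentation described above.
    """
    valid = [r for r in ratings if r is not None]  # drop "don't know" answers
    if len(valid) < MIN_RATERS:
        return None  # suppress the score rather than show an unreliable mean
    return round(mean(valid), 2)

# Example: four raters answered, one chose "don't know"
print(display_score([4, 5, 3, None, 4]))  # -> 4.0
print(display_score([5, None, None]))     # -> None (too few responses)
```

The point of the sketch is simply that only a mean is ever shown, and even the mean is withheld once the number of responses falls below a chosen cutoff.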
Here is a snippet of what the feedback generally might look like:

15. How does the PULSE Feedback Report present written comments in an anonymous way?
a. PULSE uses its own proprietary protocols, called Anonymity Editing and Anonymity Clustering, to present comments in an unidentified yet meaningful format. This means that PULSE typically enhances anonymity and then rearranges, re-numbers and re-groups the comments into the most frequent themes, which are then bolded. A trainee who has a strong need to improve his/her listening skills could have comments grouped like this:

16. As a trainee receiving feedback, what safeguards are there against raters providing invalid feedback?
a. We strongly encourage raters to provide constructive, balanced and helpful feedback. However, if a rater attempts to skew the feedback with exaggerated scoring in either a favorable or unfavorable direction, the PULSE software system can flag response patterns that suggest extreme and unrealistic leniency or harshness, and the responses of these raters may be dropped from the feedback report if PULSE determines them to be overly biased (see the illustrative sketch after question 17).

17. How do trainees review their PULSE Feedback Reports?
a. Trainees are expected to review their feedback with their Program Directors to better understand how their skill sets are perceived and to identify excellence goals.
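As a purely hypothetical illustration of the kind of screen described in question 16 (the 1-5 scale, the 90% cutoff and the function name are assumptions, not PULSE's proprietary criteria), one simple check might flag raters whose answers sit almost entirely at one end of the scale:

```python
# Hypothetical screen for extreme response patterns; the 90% threshold and
# the 1-5 scale are assumptions, not PULSE's proprietary criteria.
SCALE_MIN, SCALE_MAX = 1, 5
EXTREME_SHARE = 0.9  # flag when 90%+ of answered items sit at one scale extreme

def is_suspect(ratings):
    """Flag a rater whose answered items are almost all at one scale extreme."""
    answered = [r for r in ratings if r is not None]  # ignore "don't know"
    if not answered:
        return False
    lowest = sum(r == SCALE_MIN for r in answered) / len(answered)
    highest = sum(r == SCALE_MAX for r in answered) / len(answered)
    return max(lowest, highest) >= EXTREME_SHARE

# Example: a rater who gave the lowest score on every item would be flagged
print(is_suspect([1, 1, 1, 1, None, 1]))  # -> True
print(is_suspect([4, 5, 3, 4, 2]))        # -> False
```

In practice, a flagged pattern would presumably still be reviewed before any responses were actually dropped from a report, consistent with the description in question 16.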
18. What if I get negative feedback?
a. The vast majority of trainees receive favorable feedback that highlights their strengths and the qualities that others appreciate and value. The goal of PULSE 360 feedback is to help trainees go from good to great. The purpose of being a trainee, of course, is to train, and that means learning how to do things better. One of the best times to receive formative feedback is during your training, when the emphasis is on growth and development. No one is perfect, and we all have areas in which we can improve. The goal of feedback can be summed up in two words: be better.

19. If the survey is anonymous, why does the PULSE Software System keep reminding me to complete my Self-Rating or other surveys?
a. Your survey feedback is absolutely anonymous. To ensure that all raters provide feedback, the PULSE software is programmed to automatically remind them that there is still time to complete the survey and give helpful feedback. This includes reminding the trainee to select raters and to complete the self-rating. Reminder emails are generated automatically whenever a survey has not been completed and are sent every few days to anyone who has not completed the next step in the PULSE process. Once the survey's Finish button is clicked, the automated reminders stop.

20. If you have any additional questions, please email us at PulseInfo@PdpFlorida.com or call the PULSE Program directly at 416-800-9203.