Online Supplement to Clinical Peer Review Programs' Impact on Quality and Safety in U.S. Hospitals, by Marc T. Edwards, MD




Online Supplement to Clinical Peer Review Programs' Impact on Quality and Safety in U.S. Hospitals, by Marc T. Edwards, MD
Journal of Healthcare Management 58(5), September/October 2013

Tabulated Survey Results

The following tables present questionnaire items in a manner comparable to the original online survey, along with tallies for the response options. Unless otherwise indicated, results are given as % (n). QI model items (7-19) are marked with an asterisk (*) and show the points assigned to each response level; these points add to form the QI model score.

1) Medical Staff Engagement in Quality & Safety (N = 300)
Which statement best describes the degree to which your medical staff is engaged with efforts to improve quality and safety at the hospital?
  Highly engaged        22 (66)
  Very engaged          47 (142)
  Somewhat engaged      28 (84)
  Somewhat unengaged     1 (4)
  Generally unengaged    1 (3)
  Distinctly unengaged   0 (1)

2) Quality Impact (N = 300)
What is the likelihood that your Peer Review Program makes a significant ongoing contribution to the quality and safety of patient care at the hospital?
  Very likely        40 (121)
  Likely             37 (110)
  Somewhat likely    19 (56)
  Somewhat unlikely   3 (10)
  Unlikely            0 (1)
  Very unlikely       1 (2)

3) Relative Importance (N = 300)
How important is your Peer Review Program in relation to all other quality and safety improvement activity at the hospital?
  6 - Most important   12 (36)
  5                    44 (133)
  4                    31 (92)
  3                     9 (26)
  2                     3 (9)
  1 - Least important   1 (4)

4) Medical Staff Perception (N = 300)
Which statement best describes how your medical staff perceives the Peer Review process?
  Excellent    6 (17)
  Very Good   35 (105)
  Good        42 (125)
  Fair        14 (43)
  Poor         3 (10)
  Very Poor    0 (0)
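The QI model score described above is formed by adding the points attached to each respondent's answers for items 7-19. A minimal sketch of that summation follows; the point values are copied from the tabulated items, but the dictionary layout, item keys, and answer labels are illustrative assumptions, not the authors' implementation:

```python
# Sketch of the QI model score: each starred item (7-19) maps the chosen
# response level to a point value; the composite score is the simple sum.
# Only three items are shown here for brevity; keys and answer labels are
# assumed names, while the point values come from the tabulated items.
QI_ITEM_POINTS = {
    "standardization": {            # item 7
        "highly": 10, "greatly": 6, "standardized": 4,
        "somewhat": 0, "not_standardized": 0,
    },
    "recognition_of_excellence": {  # item 9
        "regular": 10, "occasional": 5, "rare": 0,
    },
    "board_involvement": {          # item 15
        "beyond_adverse": 5, "adverse_only": 0, "unknown": 0,
    },
}

def qi_model_score(responses: dict) -> int:
    """Sum the points earned for each answered QI model item."""
    return sum(QI_ITEM_POINTS[item][answer] for item, answer in responses.items())

score = qi_model_score({
    "standardization": "greatly",
    "recognition_of_excellence": "occasional",
    "board_involvement": "beyond_adverse",
})
print(score)  # 6 + 5 + 5 = 16
```

With all thirteen starred items included, the same pattern yields the full composite used in the article's analysis.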

5) Program Scope (N = 300)
While peer review methods are widely applied, not all such use is locally defined as being within the scope of the medical staff's peer review program. Prior study has shown wide variation in program scope. What types of activities are included within the scope of your Peer Review Program? Check all that apply.
  Retrospective medical record review   97 (292)
  Focused individual review of quality when serious concerns are raised   91 (272)
  Ongoing Professional Practice Evaluation   82 (247)
  Comparative evaluation of performance measures (e.g., complication rates, core measures, patient satisfaction)   82 (245)
  Case-specific, individually-targeted recommendations to improve performance   78 (234)
  Focused Professional Practice Evaluation for new privileges   77 (232)
  Disruptive behavior management   77 (232)
  Root cause analysis   77 (231)
  Proctoring for new privileges   63 (188)
  Comparative evaluation of aggregate data from Peer Review   62 (186)
  Development and/or review of clinical policies, order sets, etc.   62 (187)
  Morbidity & Mortality case conferences   57 (170)
  Benchmarking to normative data (e.g., NSQIP, STS, UHC, Premier, etc.)   57 (170)
  Physician Health Program administration   56 (166)
  Conducting quality improvement studies and/or projects   54 (162)
  Concurrent medical record review   50 (149)
  Producing educational programs for groups of clinicians   43 (130)
  Other forms of direct observation   25 (76)

6) Peer Review vs. Credentialing (N = 298)
Which statement best describes the relationship between peer review and credentialing at your hospital?
  The peer review program is completely separate from credentialing   5 (14)
  The peer review program is independent of credentialing, but the results of peer review may be used in credentialing decisions, including OPPE   70 (209)
  Peer review program activity is partially mixed with credentialing activity   14 (42)
  The peer review program encompasses the credentialing program   4 (11)
  The credentialing program encompasses the peer review program   8 (23)

7) *Standardization of Process* (N = 299; points in brackets)
  Peer review process is highly standardized; the oversight committee approves all variation   28 (84) [10]
  The process is greatly standardized, but there may be some unapproved variation   31 (94) [6]
  The process is standardized, although there may be significant variation   28 (84) [4]
  The process may be somewhat standardized, but variation is substantial   10 (30) [0]
  The process is not really standardized; autonomous behavior predominates   2 (7) [0]

Example A: Commonly Used Overall Rating Method
This assessment, limited to an overall score, classifies the review decision but does not measure clinical performance. It has 10 options, but captures only 3 levels of management quality (Appropriate, Varied from Standard, Unacceptable).

Example B: Sample Methodology and Rating Scale for Measuring Clinical Performance
Multiple elements of clinical performance are rated on a 5-point scale from Best to Worst. Page one of several. cf: Rubin HR et al. Guidelines for Structured Implicit Review of Diverse Medical and Surgical Conditions. RAND; 1989. N-30066-HCFA.

8) *Clinical Performance Measurement During Case Review (see examples above)* (N = 300; points in brackets)
  The findings from case review are essentially undocumented   2 (7) [0]
  Case review may be documented, but no measures of clinical performance are recorded   11 (32) [0]
  Clinical performance is summarized with an overall score, which may capture a few key dimensions (Example A)   54 (161) [2]
  Clinical performance is measured on multiple dimensions using a single template common to all medical care (Example B)   14 (43) [6]
  Clinical performance is measured on multiple dimensions that vary according to the specific type of clinical activity being reviewed (e.g., surgery vs. medicine)   19 (57) [10]

9) *Recognition of Excellence* (N = 299; points in brackets)
  We have a method to identify and regularly provide recognition for outstanding clinical performance   15 (46) [10]
  We occasionally recognize outstanding performance   45 (134) [5]
  Seldom or rarely, if ever, do we recognize outstanding performance   40 (119) [0]

10) *Governance of Process* (N = 300; points in brackets)
  The medical staff leadership gives little or no attention to governance of the peer review process and its aggregate outcomes   8 (25) [0]
  There is regular review of data involving the process and its aggregate outcomes, with little or no discussion   33 (99) [5]
  There is regular review of data involving the peer review process and its outcomes, with meaningful discussion directed toward ongoing improvement of the process (irrespective of discussions about individual performance issues)   59 (176) [10]

11) *Rating Scales (see examples above)* (N = 300; points in brackets)
  We don't use rating scales   12 (37) [0]
  We rate elements of clinical performance primarily on Yes or No type scales (e.g., a check box for documentation issues)   17 (50) [0]
  We rate elements of clinical performance on scales that have at most three or four intervals from best to worst (Example A)   59 (176) [0]
  We rate elements of clinical performance on scales that have five or six intervals from best to worst (Example B)   10 (31) [5]
  We rate elements of clinical performance on scales that have seven or more intervals from best to worst   2 (6) [10]

12) *Reviewer Participation* (N = 300; points in brackets)
Which statement best describes the level of participation by Reviewers in the Peer Review process?
  Excellent   26 (78) [10]
  Very Good   40 (119) [8]
  Good        25 (76) [4]
  Fair         7 (21) [0]
  Poor         1 (4) [0]
  Very Poor    1 (2) [0]

13) *Integration with Performance Improvement Activity* (N = 300; points in brackets)
  Peer review is highly interdependent with the hospital's Performance Improvement (Quality/Safety Improvement) process   29 (88) [10]
  Peer review is at least fairly well-connected to the hospital's PI process   50 (151) [5]
  At best, peer review is only somewhat connected to the hospital's PI process   20 (61) [0]

14) *Identification of Improvement Opportunities* (N = 300; points in brackets)

  In each review, we look for process improvement opportunities, including clinician-to-clinician issues, in addition to evaluating individual clinical performance   83 (248) [5]
  In each review, we do little more than ask, "Was the standard of care met?"   17 (52) [0]

15) *Board Involvement* (N = 300; points in brackets)
  Trustees periodically receive information about peer review activity beyond that which would be reported in relation to an adverse action   60 (181) [5]
  Trustees are only provided information in relation to adverse actions   27 (82) [0]
  Unknown   12 (37) [0]

16) *Performance Feedback* (N = 296; points in brackets)
  Cases are reviewed and opportunities for improvement are communicated on average within three months of an occurrence   77 (229) [5]
  On average, more than three months is required   20 (59) [0]
  Unknown   3 (10) [0]

17) *Case Review Volume* (N = 300; points in brackets)
  The total annual volume of cases reviewed is at least 1% of hospital inpatient volume   43 (128) [5]
  The total annual volume is less than 1% of hospital inpatient volume   28 (84) [0]
  Unknown   29 (88) [0]

18) *Documents Examined During Case Review* (N = 300; points in brackets)
  Pertinent diagnostic images or recordings (e.g., CT, MRI, ultrasound, fetal heart tracings, etc.) are routinely examined along with the medical record   69 (208) [5]
  Only the medical record and the reports of pertinent diagnostic studies are examined   31 (92) [0]

19) *Adverse Events* (N = 298; points in brackets)
  Trends in adverse event rates (either globally or by event type) are monitored in the context of peer review outcomes by committees, departments or governance   77 (230) [5]
  Trends in adverse event rates are not monitored in the context of peer review outcomes   23 (69) [0]

20) Self-Reporting (N = 296)
Medical staff members frequently report adverse events, near misses and/or hazardous conditions affecting their own patients for peer review.
  Strongly Agree      8 (23)
  Agree              22 (65)
  Somewhat Agree     36 (106)
  Somewhat Disagree  17 (52)
  Disagree           14 (42)
  Strongly Disagree   3 (10)

21) Leadership (N = 296)
If we found compelling reasons to change our peer review process, we would not be hampered by a lack of leadership.
  Strongly Agree     40 (119)
  Agree              37 (110)
  Somewhat Agree     13 (40)
  Somewhat Disagree   4 (13)
  Disagree            3 (9)
  Strongly Disagree   2 (7)

22) Resources (N = 294)
If we found compelling reasons to change our peer review process, we would not be hampered by a lack of access to resources (budget, staff, information systems, etc.).
  Strongly Agree     20 (59)
  Agree              33 (98)
  Somewhat Agree     25 (73)
  Somewhat Disagree  12 (36)
  Disagree            6 (19)
  Strongly Disagree   4 (12)

23) Resistance to Change (N = 296)
If we found compelling reasons to change our peer review process, we would not be hampered by general inertia and resistance to change.
  Strongly Agree     13 (38)
  Agree              37 (111)
  Somewhat Agree     26 (78)
  Somewhat Disagree  16 (47)
  Disagree            6 (18)
  Strongly Disagree   2 (6)

24) Physician-Hospital Relations (N = 296)
Please rate the overall quality of physician-hospital relations on the following scale:
  1 - Adversarial/Dysfunctional   1 (4)
  2                               3 (9)
  3                              12 (35)
  4                              27 (80)
  5                              41 (122)
  6 - Harmonious/Cooperative     16 (48)

25) Case Identification Criteria (N = 296)
Rank order the 3 most commonly applied criteria used to identify cases for peer review at your hospital. Assign #1 to the most frequently used method.
  Criteria (Relative Use*; N = 297)
  Generic screens for or triggers suggestive of adverse events   79
  Physician or hospital staff concerns   45
  Unexplained deviation from protocols, pathways or specified clinical standards   21
  Patient complaints   18
  Core Measures variances   12
  Statistical monitoring of process and/or outcomes measures   10
  Review of new privileges (Focused Professional Practice Evaluation)   6
  Quality improvement studies   5
  Random selection   3
  Clinically interesting cases   2
  * Standardized to a scale of 100 points for an item ranked #1 by all respondents

26) Sources of Cases for Review
Rank order the 3 most common sources/methods by which the case identification criteria are applied to identify potential cases for peer review. Assign #1 to the most frequently used method.
  Source/Method (Relative Use*; N = 296)
  Data review: reviewing reports of hospital administrative data to identify cases that might meet peer review criteria, regardless of who does the work   41
  Risk management referrals   36
  Nursing: cases referred by bedside nurses and nurse managers   31
  Committee: referral from any other peer review, medical staff or hospital committee   29
  Case management: when those who routinely review the medical record for other business purposes thereby indirectly identify cases warranting peer review, regardless of the role moniker used in your organization   28
  Medical staff: cases involving other physicians' care practices   23
  Study: incidental to a specific quality improvement study or initiative   4
  Self-reported: cases involving a physician's own care practices or outcomes (excluding FPPE)   3
  Residents: cases referred by medical students, residents and/or fellows   2
  * Standardized to a scale of 100 points for an item ranked #1 by all respondents

27) Secondary Case Screening (N = 292)
What proportion of identified cases receives secondary screening prior to assignment for Peer Review?
  <25%         19 (57)
  25-49%       11 (32)
  50-74%        6 (17)
  75% or more  44 (129)
  Unknown      21 (61)

28) Reviewed Clinician Input (N = 292)

Thinking of case review in general (not a Morbidity & Mortality Case Conference or a Serious Occurrence investigation), how likely is it that one or more clinicians involved in that patient's care will be solicited for input to the review process?
  Rarely or not at all   5 (14)
  Infrequently           5 (14)
  Occasionally          17 (49)
  Frequently            25 (75)
  Very frequently       19 (57)
  Almost every time     29 (87)

29) Committee Discussion of Case Reviews (N = 294)
What proportion of case reviews are presented and discussed in a committee prior to final decision-making?
  <25%         17 (51)
  25-49%       10 (31)
  50-74%        9 (26)
  75% or more  60 (178)
  Unknown       4 (11)

30) Multi-Specialty Committee Discussion of Case Reviews (N = 292)
What proportion of case reviews are presented and discussed in a committee having multi-specialty representation prior to final decision-making?
  <25%         35 (103)
  25-49%        9 (27)
  50-74%       11 (32)
  75% or more  39 (114)
  Unknown       7 (20)
If applicable, please provide any information needed to understand the nuance of how committees and multi-specialty review fit into your program. Also indicate whether nurses or other disciplines participate. (See article text.)

31) Quality of Case Review (N = 294)
Rate the general quality of case reviews on the following scale:
  1 - Relatively superficial and lax    2 (5)
  2                                     2 (7)
  3                                     9 (27)
  4                                    30 (90)
  5                                    41 (122)
  6 - Extremely thorough and rigorous  15 (46)
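The "Relative Use" values footnoted under items 25 and 26 are standardized so that an item ranked #1 by every respondent would score 100. The article does not state the rank weights, so the linearly decreasing weights below (#1 = 1.0, #2 = 2/3, #3 = 1/3) and the tallies in the usage example are assumptions chosen only to illustrate the scaling:

```python
# Sketch of the "Relative Use" standardization for rank-order items.
# Each respondent ranks up to three options; ranked mentions are weighted
# (weights are an assumption, not from the article) and the weighted total
# is scaled so a #1 rank from every respondent yields exactly 100.
RANK_WEIGHTS = {1: 1.0, 2: 2 / 3, 3: 1 / 3}  # assumed weighting scheme

def relative_use(rank_counts: dict, n_respondents: int) -> int:
    """rank_counts maps rank (1-3) to the number of respondents assigning it."""
    weighted = sum(RANK_WEIGHTS[rank] * count for rank, count in rank_counts.items())
    return round(100 * weighted / n_respondents)

# Hypothetical criterion ranked #1 by 150, #2 by 60, #3 by 30 of 297 respondents:
print(relative_use({1: 150, 2: 60, 3: 30}, 297))  # 67
# Ranked #1 by all 297 respondents -> the 100-point ceiling:
print(relative_use({1: 297}, 297))  # 100
```

Whatever the true weights, the defining property is the second call: unanimous #1 ranking maps to 100, and all other profiles fall below it.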

32) Reviewer Qualifications (N = 290)
Are there any criteria to become a reviewer other than being a member in good standing of the medical staff, in whatever categories may be specified and with privileges appropriate to the assigned role?
  Yes  29 (85)
  No   71 (210)
If there are additional criteria, please list the criteria used (e.g., must be a service chief or associate chief; highly rated by peers; etc.):
  Top 5 Additional Criteria (n)
  Service chief/vice-chief or chair                  20
  Endorsement, respect of or high ratings by peers   17
  Selected by department chair or other leadership   15
  Approved by leadership                              9
  Current or prior leadership role                    8

33) Reviewer Appointment (N = 294)
How are most new reviewers identified? Pick the best single answer. If peer review is a responsibility of physician leaders who have other responsibilities, such as service chiefs, answer in terms of how they were identified for that overall role.
  Volunteer   6 (17)
  Recruited: asked, but have a choice   62 (185)
  Drafted: asked, but have little choice   7 (22)
  Required: expected part of clinical role (e.g., all members of a given service get assigned cases to review in rotation)   24 (71)
  Involuntary/Other (e.g., got elected because they weren't present at the meeting)   1 (2)

34) Reviewer Training (N = 290)
Do new reviewers routinely receive orientation and/or training?
  Yes  43 (126)
  No   57 (169)

35) Training Content (N = 126)
If reviewers receive training, what components are routinely included? Check all that apply.
  Use of review forms/documentation of findings & conclusions                        88 (111)
  Role expectations                                                                  85 (107)
  Program policy and procedures                                                      82 (103)
  Chart review methods                                                               58 (73)
  Legal & risk management issues (confidentiality, peer review protections, etc.)    52 (65)
  Interpersonal skills (e.g., communication with reviewees, group dynamics, etc.)    30 (38)
  Quality improvement methods (e.g., root cause analysis, Pareto analysis, etc.)     29 (36)
  Practice reviews                                                                   19 (24)
Other training content: use of an external program or conference was noted by 7 respondents.

36) Reviewer Compensation (N = 290)
Are Reviewers compensated in any way for their activity?
  Yes  22 (65)
  No   78 (230)

37) Data Capture During Case Review (N = 294)
What data is systematically captured and retained in the case review process? Check all that apply.
  Categorization of an event type (e.g., mortality, readmission, etc.)   81 (238)
  Identification of process of care issues involving other disciplines, information systems, organizational policy/procedures, etc.   71 (208)
  Rating of appropriateness or deviation from standard of care   70 (204)
  Identification of clinician-to-clinician issues (gaps in communication, call coverage, supervision, coordination among clinicians, etc.)   69 (203)
  Overall quality of care rating for an individual clinician   67 (197)
  Any recommendations for improved performance of an individual clinician   65 (189)
  Other recommendations or actions for improvement (e.g., group educational program, correction of system or process problem, initiation of a QI study, external review, etc.)   62 (183)
  Identification of contributory factors to an adverse event (e.g., high risk patient or procedure)   62 (182)
  Rating of the degree of any associated patient harm   59 (174)
  Categorization of type of error made (e.g., diagnosis, treatment, performance, etc.)   56 (163)
  Rating of whether an adverse event was preventable   53 (156)
  Overall rating of completeness of medical record or quality of documentation   48 (140)
  Categorization of reason for error (e.g., knowledge, skill, habits, situational factors, etc.)   42 (122)
  Rating of whether an individual clinician could have prevented an adverse event   40 (118)
  Written case analysis   39 (115)
  Identification of excellence in clinical care   34 (99)
  Structured ratings of specific elements of individual performance (legibility, quality of history & physical exam, differential diagnosis, orders, etc.)   24 (71)
  Rating the likelihood that another provider would have handled the case differently   22 (64)
  None of the above   1 (4)

38) History of Program Change (N = 294)
In what Medicare fiscal year did your medical staff last make major changes to peer review program structure, process and/or governance?
  2011 (Oct 10 to present)  20 (60)
  2010 (Oct 09 to Sep 10)   22 (66)
  2009 (Oct 08 to Sep 09)   15 (44)
  Before October 2008       22 (66)
  Unknown                   21 (61)
Please describe what prior changes were made at that time:
  Top 5 Listed Changes (n)
  Multi-specialty process created/enhanced   35
  OPPE/FPPE process                          31
  Standardization of review process          15
  Policy/procedure/charter                   13
  Program structure                          10

39) Future Change Likelihood (N = 286)
What is the likelihood that your medical staff will make significant changes in the Peer Review Program structure, processes or governance in the coming year?
  Very likely        16 (47)
  Likely             22 (65)
  Somewhat likely    20 (59)
  Somewhat unlikely  14 (40)
  Unlikely           19 (56)
  Very unlikely       9 (26)

40) Comments, clarifications and feedback about your responses or any aspect of this survey: