Quality and Peer Review: The Canadian Perspective
David A. Koff, MD, FRCPC, Chief and Chair, Radiology, McMaster University
MIR Barcelona, October 11, 2013

Disclosure: Co-founder and shareholder in Real Time Medical Inc.

Need for Quality Control
Five years ago, adoption of peer review in Canada was poor:
- Incorrect interpretation of studies is the leading cause of malpractice lawsuits against radiologists.
- Regulators expect a definition of acceptable levels of performance among radiologists, but what do "acceptable levels" mean, if anything?
- Radiologists were reluctant to participate: they did not want to expose possible weaknesses, resented peer review as punitive, and were concerned about the impact on their credentials or licence to practice.
- There were only a few scattered initiatives, mainly based on RADPEER.
Health Care in Ontario: The Changing Landscape
Excellent Care for All Act, 2010:
- Mandatory quality committees
- Annual quality improvement plans
- Executive compensation linked to improvement targets

Need for Quality Control: The Perfect Storm
A wave of high-profile reviews has raised awareness among radiologists that they have to engage actively in quality control to avoid situations where large-scale reviews are mandated by local or regional authorities.
Enough Is Enough: The Cochrane Report
After an infamous review of thousands of CTs in BC, which found an exceptionally high number of errors, the BC Minister of Health Services requested in February 2011 an independent two-part investigation into the quality of diagnostic imaging and asked Dr. Doug Cochrane, Provincial Patient Safety and Quality Officer, to lead the review. Among the 35 recommendations pertaining to the provision of diagnostic imaging services, 6 were specific to quality assurance and peer review in diagnostic imaging. Among these recommendations, the report suggested:
- That the College and the health authorities develop a standardized retrospective peer review process designed for quality improvement in the health authorities and private facilities (16);
- That the BCRS, the Canadian Association of Radiologists (CAR) and the Royal College establish modality-specific performance benchmarks for diagnostic radiologists that can be used in concurrent peer review monitoring (20).
And just 2 months ago!!
Health Care in Ontario: The Changing Landscape
New drivers: performance and quality of care.

The CAR Position, March 2012
The Canadian Association of Radiologists published a white paper in September 2011 to outline the requirements for peer review processes and suggest ways to integrate them into practice.

The CAR Recommendations for Peer Review Software
Peer review software:
- Reveals opportunities for quality improvement
- Helps ensure competence
- Helps improve individual outcomes
- Is a fair, unbiased, consistent process
- Allows trends to be identified
- Employs random selection of cases broadly representing the work done in a department (see the sketch after this list)
- Ensures the opinions of both the reviewers and the radiologists being reviewed are recorded
- Is non-punitive
- Has minimal effect on workflow
- Allows easy participation
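As a rough illustration of the random-selection recommendation above, the following Python sketch stratifies sampling by modality so that the reviewed sample mirrors the department's case mix. The Case fields, the 2% sampling fraction and the function name are assumptions made for illustration; they are not taken from the CAR white paper or any specific product.

# Illustrative sketch: stratified random selection of cases for peer review.
# Field names, the sampling fraction and the function name are assumptions.
import random
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Case:
    accession: str
    modality: str      # e.g. "CT", "MR", "US", "XR"
    radiologist: str   # original reporting radiologist

def select_for_review(cases, fraction=0.02, seed=None):
    """Randomly pick roughly `fraction` of the cases within each modality."""
    rng = random.Random(seed)
    by_modality = defaultdict(list)
    for case in cases:
        by_modality[case.modality].append(case)
    selected = []
    for group in by_modality.values():
        n = max(1, round(len(group) * fraction))   # at least one case per modality
        selected.extend(rng.sample(group, min(n, len(group))))
    return selected

Stratifying by modality (and, in practice, by subspecialty) keeps the sample broadly representative of the work actually done, rather than skewed toward whichever studies happen to have follow-up.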
The CAR Recommendations for a Peer Review Program
A peer review program should:
- Include a reactive or proactive double reading, with two physicians interpreting the same study
- Allow for random selection of studies to be reviewed on a regularly scheduled basis
- Review examinations representative of each physician's subspecialty
- Allow assessment of the agreement of the original report with the subsequent review (or with surgical or pathology findings)
- Use an approved classification of peer review findings with regard to the level of quality concern
- Define policies and procedures for action to be taken on significantly discrepant peer review findings, for quality outcomes improvement
- Generate summary statistics and comparisons for each physician by modality, to help the coordinator assess performance standards (a small aggregation sketch follows this list)
- Provide summary data for each facility or practice by modality, to aid the departmental QA program
- Include a planned strategy for remediation and re-education at both the individual and departmental levels when discrepancies arise
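To make the summary-statistics recommendation concrete, here is a minimal sketch that aggregates completed peer reviews into a discrepancy rate per physician and modality. The review record fields and the 4-point scoring scale are assumptions for illustration, not the CAR's or any vendor's actual schema.

# Illustrative sketch: per-physician, per-modality summary of peer review results.
# The fields and the 4-point scale are assumptions, not an actual schema.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class ReviewResult:
    radiologist: str   # physician whose report was reviewed
    modality: str      # e.g. "CT", "MR"
    score: int         # assumed scale: 1 = agree ... 4 = significant discrepancy

def summarize(results, discrepant_threshold=3):
    """Return {(radiologist, modality): (number of reviews, discrepancy rate)}."""
    counts = defaultdict(lambda: [0, 0])   # key -> [total, discrepant]
    for r in results:
        key = (r.radiologist, r.modality)
        counts[key][0] += 1
        if r.score >= discrepant_threshold:
            counts[key][1] += 1
    return {key: (total, discrepant / total) for key, (total, discrepant) in counts.items()}

The same aggregation, keyed on facility instead of radiologist, yields the facility-level summary data mentioned above.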
The McMaster Quality Assurance Solution

Prerequisites for QA Software at McMaster
- Automate a complex QA workflow: peer review with feedback
- Integrate QA activities within the reporting workflow
- Encourage timely QA activities with notifications
- Demonstrate rules-based QA, promoting fairness and radiologist buy-in
- Anonymize radiologists to eliminate bias and promote learning (a minimal anonymization sketch appears at the end of this section)
- Scale from a single site to span an entire network, across PACS/RIS platforms and sites
- Let radiologists monitor their own individual performance anonymously and in real time

Limitations of the Legacy Retrospective Peer Review System at McMaster
- Was used at HHS but abandoned; it was resented as punitive
- No anonymization
- Reviewed cases could be more than 6 months old, too late for corrective action
- No review if the original case had no follow-up at the institution
- No review for new patients, mainly emergency patients
- No cross-institution or cross-platform capability
- No regional deployment in its current format

Our Expectations of a QA Program
The objective is not to police but to learn and improve: assessment, education and improvement, to meaningfully improve the overall service.

Prospective Quality Assurance for Diagnostic Imaging at McMaster
Purpose: demonstrate prospective (pre-report finalization) radiology quality assurance and improvement across multiple:
- Organizations: 2 large hospital systems with 65 radiologists and nuclear medicine physicians in 5 different partnerships
- PACS/RIS infrastructures: 2 different GE PACS and 2 RIS (Meditech and McKesson)
- Modalities: 8 CTs, 6 MRs
Ultimately, a region-wide implementation to provide access to QA for community hospitals and independent health facilities.
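The anonymization prerequisite listed above can be pictured with a small sketch: identifying fields are replaced with opaque tokens before a case leaves the workflow manager, which keeps the token-to-identity mapping so that feedback can still be returned to the original radiologist. The field names and token scheme are assumptions, not the actual platform's implementation.

# Illustrative sketch: replace identifying fields with opaque tokens before a
# case is routed to a reviewer; the mapping stays with the workflow manager.
# Field names and the token scheme are assumptions, not the platform's design.
import secrets

def anonymize_for_review(case: dict) -> tuple[dict, dict]:
    """Return (redacted copy of the case, token -> original value mapping)."""
    redacted, mapping = dict(case), {}
    for field in ("patient_name", "patient_id", "radiologist", "hospital"):
        if field in redacted:
            token = "anon-" + secrets.token_hex(4)
            mapping[token] = redacted[field]
            redacted[field] = token
    return redacted, mapping

Because the reviewer only ever sees the tokens, bias toward or against a particular colleague or site is reduced, while the workflow manager can still route the QA result back to the right person.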
Prospective Quality Assurance for Diagnostic Imaging at McMaster: Mapping of the Prospective Workflow
Process: deployment of an integrated, workflow-driven prospective radiology quality assurance (QA) software platform, piloted with participation from radiologists across four sites at HHS (GE PACS / MAGIC RIS) and SJHH (GE PACS / ITS RIS).

The workflow, reconstructed from the original diagram (a small routing sketch follows the deliverables list below):
- Dr. Mayo submits her report; a notification of image availability and extracted metadata reach the workflow manager.
- The study is selected for a QA workflow: pre-report finalization QA with feedback. The workflow manager anonymizes the report and, using contextual information from the stored radiologist profiles (image producer and image reader profilers), sends it alongside the study to a comparable peer.
- The QA item appears in Dr. Spock's QA worklist. Dr. Spock uses the customizable QA assessment grid to review Dr. Mayo's report and submits his QA result, which is then sent to Dr. Mayo.
- The QA item appears in Dr. Mayo's worklist as a "Feedback" item. She can either amend the report, ignore the feedback, or send it to arbitration.
- The most appropriate report is finalized and distributed automatically (multi-channel report distribution), feeding performance metrics and analytics.
The diagram also labels the platform components (INTEGRATE, FORWARD, CONNECT, COLLABORATE, DISCOVER, REVIEW), VR-enabled reporting, and Dr. Spock's and Dr. Mayo's workstations.

Prospective QA Parameters
- Review of adult CTs, x-rays and ultrasounds; all routine cases
- Types of CT limited to the main subspecialties of the participating radiologists (to ensure peer skill-set matching)
- Reviewing radiologists at a site different from the reporting radiologist (QA best practice: increase the geographic and emotional separation between the anonymized reporter and the reviewer)
- All reports from SJH reviewed at HHS; reports from HHS reviewed at SJH
- Anonymized: patient, radiologist and hospital information partially removed

Deliverables
- A functional, integrated, automated, accepted, and easy-to-use quality assurance system in all four radiology departments
- Defined, customized, agreed-upon quality assurance workflows, methods and criteria
- Measured benefits of integrated versus manual quality assurance methods
- A wrap-up report, including recommendations
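The routing and feedback rules in the workflow above (reviewer at a different site, subspecialty matching, and arbitration when a clinically relevant disagreement persists) can be summarized in a short sketch. The class names, fields and functions are hypothetical; the actual platform's rules engine is more elaborate.

# Illustrative sketch: pick a reviewer at a different site with a matching
# subspecialty, then decide how a discrepant review is handled. All names and
# fields are hypothetical, not the platform's actual API.
import random
from dataclasses import dataclass

@dataclass
class Radiologist:
    name: str
    site: str              # e.g. "HHS" or "SJHH"
    subspecialties: set

@dataclass
class Study:
    accession: str
    subspecialty: str
    reporting_radiologist: Radiologist

def pick_reviewer(study, radiologists, rng=random):
    """Reviewer must sit at a different site and share the study's subspecialty."""
    candidates = [r for r in radiologists
                  if r.site != study.reporting_radiologist.site
                  and study.subspecialty in r.subspecialties]
    return rng.choice(candidates) if candidates else None

def handle_feedback(original_agrees: bool, relevant_to_patient_care: bool) -> str:
    """Mirror the options described in the next section: amend, record, or arbitrate."""
    if original_agrees:
        return "amend report"
    return "send to arbitration" if relevant_to_patient_care else "record disagreement"

Keeping the site check explicit is what provides the "geographic and emotional separation" mentioned in the parameters: HHS studies are only ever offered to SJH reviewers, and vice versa.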
Options to Address Discrepancies
When the original radiologist does not agree with the reviewer and the discrepancy is relevant to patient care, there is arbitration via:
- Quality Committee review of discrepancies
- A radiology specialist designated by the Site QA Lead

Conclusion
- Peer review processes are here to stay and will impact radiologist workflow.
- Proactive solutions are better adapted to practice and in the patient's best interest.
- Individual feedback helps change error-prone habits.
- The objective is not to police, but to encourage further education and improvement.
- The scoring system is there to identify errors relevant to patient care and likely to cause harm, not to point at poor performers.

References
- Excellent Care for All Act. MOHLTC, 2010.
- Kohn LT, Corrigan JM, Donaldson MS. To Err Is Human: Building a Safer Health System. Institute of Medicine. Washington, DC: National Academy Press, 2000.
- Larson D, Nance J. Rethinking Peer Review: What Aviation Can Teach Radiology about Performance Improvement. Radiology, June 2011.
- O'Keeffe MM, Piche SL, Mason AC. The CAR Guide to Peer Review Systems. September 2011.
- Mahgerefteh S, Kruskal JB, Yam CS, Blachar A, Sosna J. Peer Review in Diagnostic Radiology: Current State and a Vision for the Future. Radiographics 2009;29:1221-1231.
- Patient-Based Funding for Hospitals. MOHLTC, March 19, 2012.
- Walton M. The Deming Management Method. The Berkley Publishing Group, 1986.