1 Predictive Analytics and Risk Models to Prevent Sepsis, Patient Falls, and Readmissions
2 Today's Agenda: Provide an overview of Mount Sinai's end-to-end informatics and clinical data analytics approach for quality initiatives (Purpose, Solutions, Approach, Delivery). Within our end-to-end framework we will illustrate how we:
- Leverage risk models and best-practice alerts in the electronic health record to identify patients at risk for events.
- Use phenotypic modeling and informatics-enabled registries to identify and select patient cohorts.
- Use data trusts and data aggregation for unit-level and service-area scorecards, to measure hospital performance for QI initiatives.
- Leverage data for systemized and measurable operational change.
3 Mount Sinai Health System, Who We Are 7 hospital campuses and the Icahn School of Medicine Some facts: 3,535 beds and 135 operating rooms 169,532 inpatient admissions 2,600,000 outpatient visits 489,508 emergency department visits 18,000 babies delivered a year 6,200 physicians, including general practitioners and specialists 2,000+ residents and fellows 36,000 employees
4 Mount Sinai Hospital Office for Excellence in Patient Care: Advancing Quality, Safety and Service across the Mount Sinai Health System. We intend to produce the safest care, the best outcomes, the highest satisfaction, and the best value of any health system or provider in the New York metropolitan area. Chief Medical Officer | Big Data HIT Forum | Clinical Data Analytics Team. Ken McCardle, Senior Director, Data Analytics, Quality Initiatives; Allison Glasser, Director, Operations, Patient Safety
5 Purpose Solutions Approach Delivery: Provide an overview of Mount Sinai's end-to-end informatics and clinical data analytics approach for quality and safety initiatives. Purpose in our context is Quality Initiatives
6 Quality Improvement (QI) Model
Dimensions add further definition to the model: RN & MD dyads, service area, units/wards, disease, audience, etc.
Cycle: Identify Patient Cohorts -> Optimize Care Processes -> Improve Patient Outcomes -> Monitor and Measure -> External Reporting
Take Away: QI is built around the patient experience and patient satisfaction
7 Cohort Identification
What patient population do I want to impact?
- Adult vs. pediatrics
- Disease-specific (e.g. transplant surgery, sepsis, diabetes, etc.)
- Financial class (e.g. Medicare, Medicaid, ACO, etc.)
Who are our high-risk patients?
- Morbidity and mortality
- Readmissions
- Surgical site infections
- Inpatient falls
Where are the patients being treated?
- Emergency department
- Inpatient ICUs
- Outpatient (e.g. clinics, faculty practices, etc.)
Take Away: Quality initiatives require a clear definition of who you want to impact
8 Optimize Care Processes Electronic Health Records (EHR): patient flags, care guidelines, clinical decision support, risk models, predictive analytics. Take Away: An EHR is a tool to support clinical processes and does not replace clinical judgment
9 Improve Patient Outcomes
Decrease morbidity: e.g. falls, hospital-acquired infections, unexpected ICU admissions, etc.
Decrease mortality: in-hospital, 30-day, raw vs. risk-adjusted
Decrease readmissions: 7-day, 14-day, 30-day
Decrease avoidable admissions
Improve patient satisfaction: communications, discharge information, hospital quietness, staff responsiveness, etc.
Take Away: QI involves making systemized operational changes that can be measured, along with their impact on outcomes
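The readmission windows above can be computed directly from discharge and next-admission dates. A minimal sketch; the function and field names are our own, for illustration:

```python
from datetime import date

def readmission_flags(discharge, next_admission, windows=(7, 14, 30)):
    """Flag whether a subsequent admission falls within each window (in days).

    Returns a dict like {7: False, 14: True, 30: True}; next_admission may
    be None when the patient was never readmitted.
    """
    if next_admission is None:
        return {w: False for w in windows}
    gap = (next_admission - discharge).days
    return {w: 0 <= gap <= w for w in windows}

# Discharged March 1, readmitted March 12: an 11-day gap.
flags = readmission_flags(date(2014, 3, 1), date(2014, 3, 12))
print(flags)  # {7: False, 14: True, 30: True}
```

The same record can count toward the 14- and 30-day rates while missing the 7-day rate, which is why each window is reported separately.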
10 Monitor & Measure
Monitoring Quality and Outcomes Assessment Using Clinical Data:
- More timely than hospital billing data (billing data available ~15-30 days after discharge)
- Used for predictive analytics, risk adjustment, decision support systems, and clinical registries
- Useful and more usable to clinicians for monitoring care protocols and adherence to guidelines
- Most often used for evidence-based care and translational research
Measuring Using Administrative / Billing Data:
- Used for reimbursement policies and pay-for-performance metrics; useful for measuring hospitals
- Large, national datasets made available for health services research
Take Away: Monitoring ≠ Measuring
11 External Reporting
Who is measuring our performance? CMS / Core Measures, Department of Health, NQF, US News and World Report, registries & data consortiums (e.g. Premier, UHC, etc.)
What are they measuring? Which patient cohort, and how are they defining the cohort? Which outcomes? What processes?
What data do I need to report? Administrative vs. clinical data; data dictionary requirements
Take Away: External reporting requirements do not typically align with QI initiatives
12 QI Data Challenges
- Alignment of standard definitions across stakeholders
- Reconciliation of clinical data and hospital billing/administrative data
- Data quality assurance, mapping, nomenclatures, data standards, governance
- EHR data: discrete data elements vs. free text
- Getting the right data in the right format at the right time into the right hands
13 Purpose Solutions Approach Delivery: Provide an overview of Mount Sinai's end-to-end informatics and clinical data analytics approach for quality and safety initiatives. Delivery of a portfolio of products for the purposes of Quality Initiatives
14 Putting QI Concepts into Action 1. Readmissions: How can we identify and inform clinicians that a patient is at high risk for readmission? How can we prevent avoidable admissions? 2. Sepsis: What are the adherence rates to our stop sepsis bundle? How do our internal metrics compare to the NYS Department of Health metrics? 3. Patient Falls: Which units have the highest rates of patient falls? Why are the rates higher on these units? 4. Outcomes: What are the post-operative ventilator failure rates in cardiac surgery patients?
15 Sepsis Pathway
16 Readmissions Using predictive analytics, clinical and demographic variables are combined into a risk score that flags patients at risk for readmission: very high risk, high risk, not at risk. Clinical decision support in the EHR calculates the score and flags the patient
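A sketch of how such a score and flag tier might be computed. The coefficients, variable names, and thresholds below are hypothetical; the slide does not disclose the actual model:

```python
import math

# Hypothetical logistic-model coefficients and flag thresholds -- the real
# model's variables and cut points are not given in the presentation.
COEFFS = {"intercept": -3.0, "age_over_65": 0.8, "prior_admissions": 0.5, "chf": 0.9}

def readmission_probability(patient):
    """Logistic score: probability of readmission from binary/count features."""
    z = COEFFS["intercept"] + sum(
        COEFFS[k] * patient.get(k, 0) for k in COEFFS if k != "intercept"
    )
    return 1.0 / (1.0 + math.exp(-z))

def risk_flag(p, high=0.20, very_high=0.40):
    """Map a probability onto the three EHR flag tiers."""
    if p >= very_high:
        return "very high risk"
    if p >= high:
        return "high risk"
    return "not at risk"

p = readmission_probability({"age_over_65": 1, "prior_admissions": 3, "chf": 1})
print(risk_flag(p))  # very high risk
```

The tiering is the operationally useful part: clinicians act on the flag, not on the raw probability, so the cut points are a policy decision about alert volume as much as a statistical one.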
17 Readmissions Targeted interventions are employed throughout the hospital and upon discharge. Upon discharge, patients with very-high- and high-risk flags are monitored by RNs for follow-up on outpatient appointments. Reports are used to track appointments: date & time; with whom and location; appointment status
18 Readmissions Once a patient is determined to be at risk of readmission, a flag appears in the EHR for all clinicians regardless of patient encounter type: Patient Encounter List (outpatient / clinics), Emergency Room Tracking Board (emergency department), header in the patient record (inpatient)
19 Sepsis Nursing Clinical Decision Support in the EHR alerts nurses that a patient is at risk for sepsis Nurse Managers receive daily reports to monitor timely completion of sepsis screenings and identify patients at risk
20 Sepsis - Clinical Case Confirmation For patients identified as at risk for Severe Sepsis, clinical reports and scorecards for components of the sepsis bundle are generated and distributed for clinical review Clinical dyad teams confirm diagnosis and provide feedback on each case via an intranet portal
21 Sepsis Internal Monitoring & Measuring Monthly internal report cards are generated with: Outcomes Process metric adherence rates Report cards include results by: All hospital Service area Patient level Distributed to: Hospital leadership Service area leadership Program leaders Program champions
22 Sepsis - External Reporting About 70 data elements are submitted to the NYS DOH quarterly, and the DOH issues reports back to Mount Sinai, benchmarking us against the entire state
23 Purpose Solutions Approach Delivery: Continue the overview of Mount Sinai's end-to-end informatics and clinical data analytics approach for quality and safety initiatives. Solutions in our context means how to be an effective data support team for the purposes of Quality Initiatives
24 Solution Sets
Business Need: Leveraging clinical data for patient quality initiatives, hospital safety improvement, and cohort outcomes assessment.
Care Process -> Predictive Models -> Study Design
Cohort Identification -> Patient Registry -> Inclusion/Exclusion
Reporting -> Scorecard (metrics) -> Algebra
Outcomes Studies -> Risk Adjustment (measures) -> Statistics
25 Care Process How can we identify and inform clinicians that a patient is at high risk for readmission? How can we prevent avoidable admissions?
26 Predictive Models Solutions for Quality Improvement
Using administrative/claims/EHR data: predictive risk of readmission
Using EHR data from nursing documentation: Morse model for falls prevention
Using EHR/clinical data: improve C-statistics on the Morse model
Sequential, multi-variate modeling (e.g. med-med interactions, labs, ICU monitoring, etc.): CDIFF, MEWS+, unexpected ICU admissions/bouncebacks
Take Away: Clinical phenotyping helps to identify patient cohorts for care processes, care interventions, disease registries, and outcomes studies.
27 Predictive Models Diagnostic characteristics of the Morse score (predictive model for patient falls):
Sensitivity (ability to identify a condition correctly): 76.3%
Specificity (ability to exclude a condition correctly): 69.0%
Positive Predictive Value (precision; poor at confirming the fall): 3.8%
Negative Predictive Value: 99.5%
Accuracy (correct classification rate): 69%
[Chart: % hospitalizations w/ falls by Morse score; ~30% of patients have a high Morse score]
Take Away: Using baseline data (nursing assessment) from the EHR at time of admission is good, but can it be improved using big data?
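The diagnostic characteristics above all derive from a 2×2 confusion matrix. A minimal sketch; the counts below are illustrative placeholders (the study's actual counts were lost from the slide), chosen so the low-prevalence effect on PPV is visible:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard 2x2 diagnostic test characteristics."""
    return {
        "sensitivity": tp / (tp + fn),           # true positive rate
        "specificity": tn / (tn + fp),           # true negative rate
        "ppv": tp / (tp + fp),                   # precision
        "npv": tn / (tn + fn),
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# Illustrative counts only: fall prevalence here is ~1.5% (100 of 6,500),
# which drags PPV down to ~4% despite decent sensitivity and specificity.
m = diagnostic_metrics(tp=76, fp=1924, fn=24, tn=4476)
print({k: round(v, 3) for k, v in m.items()})
```

This is the slide's point in miniature: when the event is rare, a screen can rule falls out well (NPV near 99.5%) while confirming very few actual falls (PPV near 3.8%), so a high Morse score flags many patients who will never fall.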
28 Predictive Models Improving the Morse score with additional clinical data variables:
Hospital characteristics: bed units, room locations
Patient characteristics: nursing assessment at time of admission (age, BMI, previous hospitalizations and outpatient visits, labs and meds at time of admission, etc.)
Using EHR/clinical data that are sensitive to hourly/daily patient assessment: med administration sequencing, med-med combinations and side effects, active monitoring of lab results
Take Away: Using baseline data (nursing assessment) from the EHR at time of admission is good, but can it be improved using big data?
29 Cohort Identification What are the adherence rates to our stop sepsis bundle? How do our internal metrics compare to the New York State Department of Health metrics?
30 Cohort Identification Oh, here's the problem. He's got a doohickey on his thingamabob.
31 Cohort Identification Sepsis Incidence (6 months of data): ,080 cases under the Dombrovskiy model for sepsis and severe sepsis (incidence-rate methodology using ICD-9 codes for septicemia and organ dysfunction); ,501 cases meeting possible inclusion criteria as sepsis and severe sepsis (NQF); cases of possible severe sepsis (Mount Sinai Hospital) using the EHR predictive model. Take Away: Different definitions yield different cohorts. Concurrent patient identification allows opportunity for improvement at the point of care.
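The takeaway that different definitions yield different cohorts can be made concrete with a code-set intersection, which is the general shape of the Dombrovskiy-style ICD-9 methodology. The code sets below are illustrative placeholders, not the published lists:

```python
# Hypothetical ICD-9 code sets for illustration; the Dombrovskiy methodology
# uses specific published lists of septicemia and organ-dysfunction codes.
SEPTICEMIA = {"038.0", "038.9"}
ORGAN_DYSFUNCTION = {"518.81", "785.52", "584.9"}

def severe_sepsis_cohort(encounters):
    """Severe sepsis = a septicemia code plus at least one organ-dysfunction code."""
    return [
        e["id"] for e in encounters
        if SEPTICEMIA & set(e["dx"]) and ORGAN_DYSFUNCTION & set(e["dx"])
    ]

encounters = [
    {"id": "A", "dx": ["038.9", "518.81"]},  # septicemia + resp failure -> included
    {"id": "B", "dx": ["038.9"]},            # septicemia only -> excluded
    {"id": "C", "dx": ["584.9"]},            # organ dysfunction only -> excluded
]
print(severe_sepsis_cohort(encounters))  # ['A']
```

Swapping in a different code set or rule (NQF criteria, or the EHR predictive model) changes which encounters qualify, which is exactly why the three definitions on the slide produce different case counts.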
32 Patient Registries Solutions for Quality Improvement
Coding terminology versus clinical ontology:
Regulatory data registries, meaningful use, and public reporting typically identify patients through admin/claims ICD-9 coding (CMS/JC measures, NQF).
Society-based patient data registries typically use clinical ontologies for case ascertainment (evidence-based practices, P4P measures); this allows opportunity for improvement at the point of care.
Take Away: Clinical data registries include a case adjudication and data reconciliation process. (At MSH, weekly patient lists also double as a data QA process.)
33 Reporting What units have the highest rates of patient falls? Why are the rates higher on these units?
34 Reporting Solutions for Quality Improvement
Data registries allow for conformed dimensions to meet the needs of care monitoring and quality measuring:
- Registries combined with additional data sources to develop data marts (e.g. procedures, episodes)
- Analytics tables for dimensions and calculations
- Risk adjustment: empirical and MSH cohort
Supporting disciplines: Outcomes Studies, Data Analytics, Data Quality, Clinical Data Registries, Data Stewardship, Clinical Informatics, Phenotypic Modeling
Take Away: Clinical data registries become conformed dimensions with data marts built for agile analytics, reporting, and ad-hoc queries.
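The conformed-dimension idea can be sketched as a shared patient dimension joined against registry facts to answer a unit-level question like the one on the previous slide. All table and column names here are hypothetical:

```python
import sqlite3

# Minimal sketch: a conformed patient dimension (shared across marts) joined
# to a falls registry fact table to produce a unit-level scorecard metric.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_patient (patient_id TEXT PRIMARY KEY, unit TEXT);
CREATE TABLE fact_falls (patient_id TEXT, fell INTEGER);
INSERT INTO dim_patient VALUES ('p1', '5-West'), ('p2', '5-West'), ('p3', '6-East');
INSERT INTO fact_falls VALUES ('p1', 1), ('p2', 0), ('p3', 0);
""")

rows = con.execute("""
SELECT d.unit, AVG(f.fell) AS fall_rate
FROM fact_falls f JOIN dim_patient d USING (patient_id)
GROUP BY d.unit
ORDER BY fall_rate DESC
""").fetchall()
print(rows)  # [('5-West', 0.5), ('6-East', 0.0)]
```

Because the dimension is conformed, the same `dim_patient` table can be joined to a sepsis or readmissions fact table, so unit-level and service-area scorecards stay consistent across QI initiatives.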
35 Outcomes Studies
What is the respiratory failure rate in the cardiac surgery patient?
- Is respiratory failure defined by prolonged ventilator time > 24 hours?
- What are ventilator times for the CT ICU, excluding the OR ventilator period?
- Maybe we need to exclude transplant and VAD cases?
- Does this ventilator time include re-intubation periods?
- Are bounce-back CT ICU patients included in these counts?
- Maybe we need to exclude those patients with a trach?
- Do you have a risk adjustment model for prolonged ventilation?
- What is the correlation of surgeon clamp time to CT ICU ventilator time?
- Is there a difference in patients returning from the OR after 5pm?
- Is there a difference in patients extubated on weekends?
- In the CT ICU we changed our attending staff model in October 2013; did ventilator times drop after this with statistical significance?
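Several of the questions above are really cohort and exclusion rules in disguise. A hedged sketch of one possible operationalization of "prolonged ventilation > 24 hours" with exclusions; every field name and rule here is a hypothetical stand-in for decisions the study team would have to make:

```python
PROLONGED_HOURS = 24  # definition under discussion on the slide

def prolonged_ventilation(case):
    """Return True/False for the outcome, or None if the case is excluded."""
    if case.get("transplant") or case.get("vad"):
        return None  # one proposed exclusion: transplant and VAD cases
    # Another proposed rule: count ICU ventilator time, not OR time.
    icu_hours = case["vent_hours_total"] - case.get("vent_hours_or", 0)
    return icu_hours > PROLONGED_HOURS

def prolonged_rate(cases):
    """Rate among included cases only (excluded cases leave the denominator)."""
    flags = [f for f in map(prolonged_ventilation, cases) if f is not None]
    return sum(flags) / len(flags)

cases = [
    {"vent_hours_total": 30, "vent_hours_or": 4},   # 26 ICU hours -> prolonged
    {"vent_hours_total": 20, "vent_hours_or": 2},   # 18 ICU hours -> not
    {"vent_hours_total": 60, "transplant": True},   # excluded from denominator
]
print(prolonged_rate(cases))  # 0.5
```

Each question on the slide (re-intubation, bounce-backs, trach patients) would become another branch like the exclusions above, which is why the reported rate can move substantially before any clinical practice changes.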
36 Outcomes Studies Solutions for Quality Improvement
Electronic Health Record -> Patient Cohort -> Clinical Data Registries -> Data Dictionary -> Enterprise Data Warehouse -> Data Mart
Take Away: Outcomes studies and statistical data analysis are used for quality measures, risk adjustment, and assessment of rules-based clinical pathways. Statisticians use the same data sets for data mining and charting.
37 Purpose Solutions Approach Delivery: Continue the overview of Mount Sinai's end-to-end informatics and clinical data analytics approach for quality and safety initiatives. Approach is the implementation of data management best practices in alignment with the purpose and delivery of Quality Initiatives.
38 Team Approach: Team Roles | Knowledge Sharing | Data Stewardship | Agile Approach
The goal is to provide routine data monitoring and implementation of data management best practices to increase and optimize the value of clinical data.
Solution Sets: Care Process | Cohort Identification | Reporting | Outcomes Studies
39 Team Roles
Result Quality: RN Clinical Data Coordinators, Clinical Data Managers
Data Quality: Data Management Analysts
Information Quality: Data Analysts, Statisticians
Decision Quality: QAPI Managers, Data Scientists
Take Away: The people and work effort behind tools, logic, and databases are the most important assets in making useful and usable clinical data for QAPI initiatives.
40 Knowledge Sharing Solutions for Quality Improvement
Data dictionary | Clinical definitions | Measures catalog | Data stewardship OPs manual
Sample measures catalog entries (STS data dictionary):
Measure_Name | Measure_Description | STS_Section | STS_Version | STS_Short_Name
InHosMort | Indicates whether the patient was alive or dead at discharge from the hospitalization in which surgery occurred. Includes patients who died after transfer to another acute care hospital. | Q. Mortality | 2.73 | MtDCStat
30DMort | Indicates whether the patient was alive or dead at 30 days post surgery (whether in hospital or not). | Q. Mortality | 2.73 | Mt30Stat
OpMort | Operative Mortality includes: (1) all deaths, regardless of cause, occurring during the hospitalization in which the operation was performed, even if after 30 days (including patients transferred to other acute care facilities); and (2) all deaths, regardless of cause, occurring after discharge from the hospital, but before the end of the thirtieth postoperative day. | Q. Mortality | 2.73 | MtOpD
AnyReopNQF | Number of patients undergoing isolated CABG who require return to the operating room for mediastinal bleeding with or without tamponade, graft occlusion, valve dysfunction, or other cardiac reason. | P. Postoperative Events | 2.73 | COpReBld
AnyReopNQF | (same description) | P. Postoperative Events | 2.73 | COpReVlv
AnyReopNQF | (same description) | P. Postoperative Events | 2.73 | COpReGft
AnyReopNQF | (same description) | P. Postoperative Events | 2.73 | COpReOth
DeStWoInf | Indicate whether a deep sternal wound infection or mediastinitis was diagnosed within 30 days of the procedure or any time during the hospitalization for surgery. | P. Postoperative Events | 2.73 | CIStDeep
Take Away: Staff share in the clinical and knowledge discovery process through common data dictionaries and better assist in data wrangling activities.
41 Data Stewardship Solutions for Quality Improvement
The value of clinical data as used for quality improvement and data analytics is defined through key attributes and contributing factors: purposeful, timeliness, reconciliation, data quality, accessible, informative.
Quality chain: Result Quality -> Data Quality -> Information Quality -> Decision Quality
Take Away: Clinical data registries in a data stewardship program have a support model, staff productivity metrics, and data quality assurance measures.
42 Agile Approach Solutions for Quality Improvement
An agile approach promotes an iterative process and team interaction:
- Internal teams work off of JIRA (issue-tracking product)
- Internal documentation through SharePoint and wiki pages
- Collaborative teams share files through Box.com (account through Mount Sinai)
Take Away: "Perfect is the enemy of good" is an aphorism commonly attributed to Voltaire. (It can also be said: "Perfect is the enemy of done.")
43 Concluding Thoughts
Impacting operational change in clinical settings requires systemization, focus, a team approach, and agile responsiveness. Throughout our presentation, we shared a few takeaways with you. In summary:
- Systemization: knowing where/when/why in the care processes
- Focus: working with large clinical datasets is complex and fraught with multiple paths (project creep, changing measures, etc.)
- Team approach: communication across clinical and technical, and front-end and back-end, is essential
- Agile responsiveness: quality improvement methodology requires an iterative design and adaptation to a dynamic environment
44 Ken McCardle Allison Glasser Mount Sinai Health System
46 Delivery: Sepsis - Providers Providers receive an alert on their patient list when a patient meets criteria for risk of sepsis. Guidelines, order sets and documentation templates are in the EHR Service Area Leaders receive daily reports to monitor timely completion of provider documentation