Managing Safety: Lessons from the Titanic and the Air Force Gregg S. Meyer, MD, MSc Senior Vice-President, Quality and Safety, MGH/MGPO Patient Safety CoE October 2008 1
Institute of Medicine Recommendations November 1999 Establish a national focus of research, tools, and protocols to enhance the knowledge base about patient safety Identify and learn from errors through both mandatory and voluntary reporting systems Raise standards and expectations for improvement in safety through the actions of oversight organizations Create safety systems inside health care organizations through implementation of safe practices at the delivery level 2
Be aware (and beware) of history Plus ça change, plus c'est la même chose (the more things change, the more they stay the same) We are just stewards 3
Errors are Useful Information We may learn more from our failures than from our successes "Give me a fruitful error any time, full of seeds, bursting with its own corrections. You can keep your sterile truth for yourself." - Vilfredo Pareto Errors can improve our processes when studied Benign errors may predict disasters or bad outcomes 4
Medical Errors are not Unique Share important causal factors with those in other complex systems Transportation Aviation Railroads Automobiles Nuclear power Petrochemical Industry 5
The Swiss Cheese Model of System Accidents J. Reason, BMJ 2000;320:768-770 6
Types of Errors In April 1912 Titanic was the world's newest, most technologically advanced, and largest liner. Despite all of its innovative technology, the ship sank on a clear night on its maiden voyage with the loss of over 1,500 lives Active errors (pushing the ship hard in uncertain conditions) and latent errors (inadequate contingency planning) produced the final outcome 7
Captain Smith Did not reduce speed when informed of an ice field ahead It was a clear night with good visibility Pressured by owner to set new speed record for Atlantic crossing Went down with the ship 8
Wireless Officer Phillips Only one radio channel Wireless operator Phillips placed priority on sending out personal messages over receiving iceberg warnings Went down with the ship 9
Lookout Fred Fleet First spotted iceberg dead ahead at 500 yards Did not know where to find binoculars No shakedown cruise for training Manned one of the lifeboats as crew 10
Officer of the Deck Murdoch Ordered all engines back full to stop the ship Should have increased speed No training for officers on the handling of large single-rudder ships Commanded one of the lifeboats 11
Lifeboats Designers planned for new regulations Added double davits to accommodate 32 lifeboats 12
Owner Bruce Ismay Decided not to add extra lifeboats since the new regulation had not been enacted Extra lifeboats would cut down on space on the promenade deck Placed the comfort of first class passengers over safety considerations The owner survived the sinking in one of the lifeboats, and was widely vilified for it afterward 13
Reason's Types of Errors Categories based on who initiated the error and how long it takes to have an adverse effect Active errors are committed by those in direct contact with the human-system interface (human error) Latent errors are the delayed consequences of technical and organizational actions and decisions 14
Sharp End - Active Failures Individuals at the sharp end are in direct contact with the human-system interface. They administer care to patients. Their actions and decisions may result in active failures. 15
Rasmussen's Model of Human Error Skill-based behavior Rule-based behavior Knowledge-based behavior 16
Skill-Based Behavior Performing routine tasks, e.g., driving while listening to the radio or holding a conversation 17
Rule-Based Behavior Performing familiar tasks using experience, e.g., approaching a familiar stop sign: access stored information = slow the car down, look in both directions, etc. 18
Knowledge-Based Behavior A novel situation, problem solving at a conscious level, e.g., traffic lights broken at a busy junction: consciously generate a solution - proceed or stop? 19
Blunt End - Latent Failures Individuals at the blunt end take actions and/or make decisions that affect technical and organizational policy and procedures and allocate resources. Their actions and decisions may result in latent failures. 20
Misadventures Happen When: Blunt end actions and decisions create latent underlying conditions Sharp end actions and decisions create active human failures Latent Failure + Active Failure = Misadventure 21
Tragedy #1: TWA Flight 514 Outside of Dulles Airport - 1974 Trans World Airlines (TWA) Flight 514 was inbound to Dulles Airport through cloudy and turbulent skies, when the aircraft descended to 1,800 feet before reaching the approach segment where that minimum altitude applied. Flight 514 collided with a Virginia mountain top, with the loss of all lives on board. United Airlines had a similar incident 6 weeks earlier The Aviation Safety Reporting System at NASA was developed as a response 22
Tragedy #2: Intrathecal Vincristine ALBANY MEDICAL COLLEGE 1986 A mix-up in Albany, N.Y., between vincristine and another cancer drug that was supposed to be injected into the spine killed 21-year-old Lillian Cedeno and forced the premature delivery of her daughter, who died a few weeks later. Similar reports can be found as early as 1968. In March 1989, Michael Sosnoskie of Middletown, Dauphin County, a 3-year-old Down syndrome child with leukemia and heart problems, died at Hershey Medical Center after a mix-up of two cancer-fighting drugs. 23
Be prepared for the long haul - What can you deliver by the end of your residency? [Chart: Naval Aviation Mishap Rate, Class A mishaps per 100,000 flight hours, FY 1950-96. The rate falls steeply from 776 aircraft destroyed in 1954 to 39 destroyed in 1996 (a rate of 2.39), with annotated milestones: angled carrier decks, the Naval Aviation Safety Center, the NAMP established in 1959, the RAG concept, NATOPS initiated in 1961, squadron safety programs, system safety designated aircraft, ACT, and HFCs] 24
Get inspiration wherever you can Catapoultry Clinton Healthcare Reform The importance of process USUHS work on Med Exec MedXellence: Critical Decision Making for Improving Healthcare Delivery 25
The Missing W Question What happened? Why did it happen? What can we do to prevent it from happening again? 26
Someone's Got to Pay - Blame 27
Punishment and Safety Ironically, rather than improving safety, punishment makes reducing errors much more difficult by providing strong incentives for people to hide their mistakes, thus preventing recognition, analysis, and correction of underlying causes. Leape 1998 28
Organizational Culture Pathologic: shoot the messenger Bureaucratic: write a new rule Learning (Generative): understand the broader implications - its generalizability Ron Westrum 29
Blame and Train (the CME approach) 30
Iceberg Model of Accidents and Errors Misadventure: death/severe harm, actual harm occurred No Harm Event: potential for harm is present Near Miss: unwanted consequences were prevented because of recovery 31
Heinrich's Ratio It has been proposed that reporting systems could be evaluated on the proportion of minor to more serious incidents reported: 1 major injury, 29 minor injuries, 300 no-injury accidents, and countless close calls. Highly reliable organizations place most of their focus here, at the base of the pyramid. 1. Heinrich HW, Industrial Accident Prevention, New York and London, 1941 2. An Organisation with a Memory, a report of an expert group on learning from adverse events in the NHS chaired by the Chief Medical Officer, The Stationery Office, London, 2000 32
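As a rough illustration of how a reporting system's mix of events might be checked against this pyramid, the Python sketch below computes the ratio of close calls and no-harm events to events that caused actual harm. The event labels and counts are hypothetical; they simply mirror the iceberg categories above.

```python
from collections import Counter

# Hypothetical report labels mirroring the iceberg categories:
# "near_miss" (close call), "no_harm" (potential for harm only), "harm" (actual injury).
reports = ["near_miss", "near_miss", "no_harm", "harm",
           "near_miss", "no_harm", "near_miss", "no_harm"]

counts = Counter(reports)
harm = counts["harm"]
minor = counts["near_miss"] + counts["no_harm"]

# A reporting system that sees mostly harm events is probably missing close calls;
# a healthier profile reports many minor events for every serious one.
ratio = minor / harm if harm else float("inf")
print(f"{minor} minor reports per {harm} harm event(s) (ratio {ratio:.1f}:1)")
```

If reporting approached Heinrich's full pyramid, that minor-to-major ratio would be on the order of (29 + 300):1, roughly 330 minor or no-injury reports per major injury.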
[Chart: Event report rate from a hospital transfusion service, by month from August through July, with the timing of staff orientation annotated] 33
A Delicate Balancing Act - Accountability vs. Learning in Patient Safety Reporting [Diagram balancing discipline against voluntary reporting; related elements: a feeling of trust, a just system, organizational culture, motivation] 34
Everyone in the Organization is Accountable for their Actions and Decisions If you hold sharp end individuals accountable for their actions and decisions, you must also hold blunt end individuals accountable for their actions and decisions 35
Culpability (Blameworthiness) [Diagram mapping intention (intentionally, knowingly, recklessly, negligently), action, and consequence to responses ranging from consoling and coaching to consequences and a punitive response] 36
High Detection Rate Detection is the first step in error management From an organizational point of view it is important that the error detection rate be high Errors that are not detected can have disastrous consequences (Zapf & Reason, 1994) The goal of error management is to increase error detection and reporting rates 37
Event Severity Level (ESL) Actual or Potential Level of Harm Severity 1: fatal outcome/serious injury Severity 2: minor, transient injury Severity 3: no harm or ill effects 38
Relationship of DSL to ESL [Diagram relating DSL and ESL to risk and information] 39
Consequences of a High DSL High volume of events Events receive different levels of investigation: routine investigation or expanded investigation 40
Risk The possibility of recurrence x the severity of the event Recurrence: probable, possible, unlikely, remote Severity: Level One - death or severe harm; Level Two - moderate or transient harm; Level Three - minimal or no ill effects 41
Risk Assessment: The Key to Priority Setting [Matrix: high recurrence combined with high severity = greatest risk; low recurrence combined with low severity = lower risk] 42
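One way to read this slide and the two before it is as a simple triage rule: score each event by its likelihood of recurrence and its severity level, and route the highest scores to expanded investigation. The sketch below is a minimal Python illustration of that idea; the numeric weights and the routine/expanded threshold are assumptions for demonstration, not the hospital's actual scoring scheme.

```python
# Illustrative risk scoring: likelihood of recurrence x event severity (ESL).
RECURRENCE = {"probable": 4, "possible": 3, "unlikely": 2, "remote": 1}
SEVERITY_WEIGHT = {1: 3, 2: 2, 3: 1}  # ESL 1 (death/severe harm) weighted highest

def risk_score(recurrence: str, severity_level: int) -> int:
    """Risk = possibility of recurrence x severity of the event."""
    return RECURRENCE[recurrence] * SEVERITY_WEIGHT[severity_level]

def investigation_level(recurrence: str, severity_level: int) -> str:
    """Route the highest-risk events to expanded investigation (threshold is illustrative)."""
    return "expanded" if risk_score(recurrence, severity_level) >= 6 else "routine"

print(investigation_level("probable", 1))  # greatest risk -> expanded
print(investigation_level("remote", 3))    # lower risk    -> routine
```

The point of the matrix is the same whatever the exact weights: a frequent, severe event should always sort ahead of a rare, low-severity one when investigation resources are limited.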
Where We Need Your Help Do we want to train a great goalie or reduce shots on goal? How do we know where the puck is going to be? Everyone has two jobs: to do the work and to improve the work 43
What you can do about it Use our systems (avoid workarounds) CPOE DEx ROE Medication Reconciliation Responsible Physician (PEPL) Electronic Medical Records (LMR and EMR) Keep Score Quality and Safety Dashboard Let us know when the systems are failing 44
What you can do about it Yesterday: pen and paper Today: a web-based reporting system - 100% report readability, convenient access to reporting forms from your computer, easy entry (online forms guide you through the reporting process), timely reporting to those who need to know 45
KEEPING SCORE: Bigelow Team E Balanced Scorecard
Safety: At least one safety report was filed (wrong medication) by the team and the subsequent action was reviewed
Medication Reconciliation: Massachusetts General Hospital Medication Reconciliation Project Performance Dashboard, Sub-Service Report
Admissions from December 1, 2007 through February 29, 2008 - Percent of admissions with PAMLs complete at 24 hours after admission
Color key: score equal to/better than benchmark target; score 50-90% of benchmark target; score worse than 50% of benchmark target

Service            December (n, %)    January (n, %)    February (n, %)    3-Month Summary (n, %)
MEDICAL PRIVATE    14    93%          26    92%         18    100%         58    95%
MEDICINE 1         104   96%          162   93%         104   89%          370   93%
MEDICINE 2         145   97%          123   96%         82    93%          350   96%
MEDICINE 3         96    96%          87    97%         62    97%          245   96%
MEDICINE 4         169   92%          200   91%         146   93%          515   92%
MEDICINE 5         3     100%         5     100%        1     100%         9     100%
MEDICINE A         84    87%          89    81%         57    63%          230   79%
MEDICINE B         99    96%          95    92%         49    82%          243   91%
MEDICINE C         89    99%          69    99%         46    83%          204   95%
MEDICINE D         95    94%          93    90%         67    84%          255   90%
MEDICINE E         83    96%          102   93%         84    81%          269   90%
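The dashboard's color key amounts to a three-band threshold against a benchmark target. A minimal Python sketch of that banding logic follows; the 90% benchmark value in the example is hypothetical, and scores between 90% and 100% of the target are folded into the middle band because the key on the slide leaves that range unstated.

```python
def dashboard_band(score: float, benchmark: float) -> str:
    """Classify a completion rate against a benchmark target using the
    dashboard's three color bands."""
    if score >= benchmark:
        return "equal to or better than benchmark target"
    if score >= 0.5 * benchmark:
        return "50-90% of benchmark target"
    return "worse than 50% of benchmark target"

# Example: MEDICINE A's February PAML completion rate (63%) against a
# hypothetical 90% benchmark target.
print(dashboard_band(0.63, 0.90))  # -> "50-90% of benchmark target"
```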
What you can do about it Listen to your patients and their families 47
What you can do about it Disclosure Look for a designated situation manager Taking care 48
Take Care in Goal Setting 49
Where Are We On The Healthcare Quality and Safety Journey? Optimism as a force multiplier 50
Acknowledgements: I would like to acknowledge Dr. James Battles of AHRQ, a former naval officer who taught me the Titanic story 51