NICE Online; a web-based tool for monitoring performance measures in intensive care




Copyright 2011, Nederlandse Vereniging voor Intensive Care. All Rights Reserved. Received October 2010; accepted November 2010.

Review

NF de Keizer 1, L Peute 1, E van der Zwan 1, M Jaspers 1, E de Jonge 2
1 Department of Medical Informatics, Academic Medical Center, Amsterdam, The Netherlands
2 Department of Intensive Care, Leiden University Medical Center, Leiden, The Netherlands

Abstract

Background: Accountability and public interest in the quality of intensive care have increased the need for tools that give care providers insight into their performance and that monitor the effect of changes in the care process.
Aim: This paper describes the development of such a tool in the context of the Dutch intensive care quality assurance registry National Intensive Care Evaluation (NICE).
Methods: Based on usability engineering principles we developed a web-based application, called NICE Online, that provides a quick overview of the current status of performance measures as well as an easy entry point to further analyze the characteristics and outcomes of an ICU's own population.
Results: The tool provides external benchmarks, such as the national average or a group of ICUs with comparable organisational characteristics, which an ICU can use as a reference to investigate whether certain process improvements are needed. To benchmark an ICU's performance internally over time and to monitor the effect of changes in the care process, Statistical Process Control techniques can be used.
Conclusion: Based on the usability evaluation and the usage patterns we conclude that the users of NICE Online appreciate the newly developed tool. NICE Online is a tool under continuous development; based on regular feedback from the users we will further improve and extend the functionality of the system.
Keywords: evaluation, quality of care, information system, usability engineering

Introduction

Attention to quality assessment and improvement in intensive care has grown over recent decades [1,2]. Quality indicators and clinical performance measures are increasingly being used to support and guide the implementation of improvements in the quality of care. Mortality is one of the most commonly used outcome measures in intensive care, but other outcome, process and structure measures are also in use, such as complications, glucose regulation, and the availability of intensive care unit (ICU) physicians. The Plan-Do-Study-Act (PDSA) cycle is often used as a way to assist quality improvement [3]. The systematic gathering of clinical performance measures forms the basis for planning changes in the care process or organisation and for studying the effectiveness of these changes [4,5]. To routinely have performance measures of ICUs available, medical registries within the intensive care domain have been established worldwide. Examples are the Australian and New Zealand Intensive Care Society (ANZICS) [6], Project IMPACT in the USA [7], the Intensive Care National Audit and Research Centre (ICNARC) in the UK [8], and the National Intensive Care Evaluation (NICE) Foundation in the Netherlands [9]. These registries facilitate data collection and send participating ICUs periodic feedback reports on their performance over time and in comparison with other units, including groups of units. Although these feedback reports are potentially effective in improving quality of care [10-12], the fixed content of these reports and the resources needed to process the data may hamper the optimal use of the clinical performance measures for planning and monitoring changes in the care process [11].

Correspondence: NF de Keizer, e-mail: n.f.keizer@amc.uva.nl
One solution is to add to the fixed feedback reports an online analysis tool that enables the monitoring and further analysis of performance measures and population characteristics. Such an application requires a very intuitive and easy-to-use user interface, as different types of users with varying computer and analysis skills will use it. The aim of this paper is to describe a web-based tool, recently developed according to usability engineering principles, that enables the online monitoring and further analysis of performance measures in the context of the Dutch NICE registry.

Methods

Setting

In 2006 the Netherlands Society for Intensive Care (NVIC) developed a set of eleven structure, process and outcome measures for Dutch ICUs: availability of an intensivist (hours per day), patient-to-nurse ratio, strategy to prevent medication errors, measurement of patient/family satisfaction, length of ICU stay, duration of mechanical ventilation, proportion of days with all ICU beds occupied, proportion of hypo- or hyperglycaemic measurements, standardized mortality ratio, incidence of pressure ulcers and, finally, number of unplanned extubations [13]. This development took place in close collaboration with the NICE foundation. NICE was founded in 1996 by the Dutch intensive care profession to set up and maintain a national medical registry. The aim was to assist ICUs in improving their quality of care by systematically and continuously monitoring, assessing and comparing ICU performance based on the outcome indicators of case-mix adjusted hospital mortality and length of ICU stay. The idea behind benchmarking the performance of individual ICUs against the national average, the best performer or comparable ICUs is to provide insight into areas for improvement. Since its start, the NICE dataset has been extended with a registry of complications and daily SOFA scores. Furthermore, from 2008 onwards, NICE has facilitated data collection on all eleven quality indicators developed by the NVIC. In 2010, 76 of the 94 Dutch ICUs were voluntarily participating in the NICE registry, of which 47 also collected the dataset of eleven quality indicators. The NICE registry sends its participants a standard quarterly feedback report on their performance over time (internal benchmark) and in comparison with other units and groups of units (external benchmark).

In 2004, at the request of some participating ICUs, a web-based Query tool was developed based upon the knowledge of two domain experts and NICE's senior software engineer (EvdZ) [14]. The intention of the tool was to provide clinicians with the means to analyze their hospital data online and compare it with national averages. The tool provided them with the opportunity to develop and define queries themselves and, in doing so, to compose a graph or a table displaying the query results.
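The case-mix adjusted mortality benchmark described above reduces, in essence, to a standardized mortality ratio (SMR): observed deaths divided by the deaths expected under a prognostic model such as APACHE IV. A minimal sketch with illustrative numbers (not NICE registry data):

```python
# Hedged sketch of an SMR computation; the function name and all
# figures are illustrative, not taken from the NICE registry.

def smr(observed_deaths: int, predicted_probs: list[float]) -> float:
    """SMR = observed / expected, where expected mortality is the sum
    of the prognostic model's per-patient death probabilities."""
    expected = sum(predicted_probs)
    if expected == 0:
        raise ValueError("expected mortality is zero")
    return observed_deaths / expected

# Hypothetical ICU: 80 admissions, each with a 12.5% model-predicted
# death probability (expected deaths = 10), and 12 observed deaths.
print(smr(12, [0.125] * 80))  # -> 1.2; above 1 means more deaths than predicted
```

An SMR near 1 indicates performance in line with the case-mix adjusted expectation; comparing an ICU's SMR with the national average is the external-benchmark idea that the registry's feedback reports implement.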
A log-file analysis performed in 2008 showed that only 17% of the participants with a user account actually used the query functionality on a regular basis. Telephone feedback indicated that users did acknowledge the usefulness of such a tool, but considered the interface structure for query development too complex. In addition, the application did not yet include the new datasets, such as the quality indicators and performance measures collected since 2008. To accommodate these new requirements and enhance the tool's usability, a redesign of the system based on user evaluation was considered the next step.

Usability evaluation

First, a usability evaluation was performed in 2008 on the Query tool available at that time, in order to study how to simplify the query functionality and improve the general usability of the tool. For this evaluation, a representative set of sixteen end-users was selected with varying levels of computer, statistical and query development expertise [15]. To evaluate the cognitive workload of query development and the general usability of the application, six tasks were provided to the 16 end-users. Two methods from the field of cognitive psychology, concurrent and retrospective think-aloud [16], were applied to observe users while they performed the predefined tasks in the tool. The experiments took place in the actual clinical working area of the end-users. A portable usability laptop with Morae software (v3.2, TechSmith) enabled the recording of all the end-users' verbalizations for the think-aloud verbal protocol analysis, in combination with a screen recording of their (mouse) actions in the system and a video recording of the subjects performing the actual tasks in the Query tool on the usability laptop.
In total 43 usability problems were found, ranging from small graphical user interface problems, such as not recognizing help symbols, to serious terminology problems and mismatches between users' cognitive tasks and the underlying query model of the tool. Table 1 shows examples of these usability problems and their proposed redesign solutions. Based on these results, the senior software engineer (EvdZ) and the usability expert (LP) developed a proposal for redesigning the NICE Online tool. This proposal consisted of two screenshots: the Indicator screenshot and the Analyses screenshot. The Indicator screenshot showed the direct status of the clinical performance measures (quality indicators); as these were considered the perfect gateway for further user analysis in a user-query tool, they were prominently placed on the new NICE Online interface. The Analyses screenshot consisted of an interface based on the query development model extracted from the think-aloud verbal protocol analysis, which was expected to support even the most novice users in analyzing their hospital data. This proposal was then presented to the executive board of the NICE registry and agreed upon for further development. A team was assembled for the tool's redesign project, including two software engineers, the usability expert and six data managers/researchers. The software engineers developed a working prototype with Java and the Google Web Toolkit (GWT). GWT allowed the creation of an interface that could respond directly to user actions. After several rounds of design iteration, the tool was considered ready for a post-redesign usability evaluation. The goal of this evaluation was to see whether the improved user interface met the needs of NICE participants and whether the newly developed underlying query model was easy and pleasant to use. The post-redesign usability evaluation was performed with 16 end-users, eight of whom had also participated in the first usability test. The results showed that all users were highly satisfied with the new tool and could easily access and analyze their ICU data by means of the underlying query model. In total, 35 small usability issues were identified in the redesigned tool; these all concerned minor interface characteristics whose redesign would only further enhance the user-friendliness of the system.

Table 1. Examples of NICE Online usability requirements (analyzed usability problem; redesign solution; new usability requirement)
1. Starting and navigating in the website is too complex; new gateway to NICE Online, starting with the performance indicators and entering the Analysis tool from there; high learnability, ease of use.
2. Developing queries requires complex user actions which might be overloading; redeveloped query model, starting from outcome measures; high learnability, ease of use.
3. Cognitive mismatch between developed query and query result; direct view on the query result; improved performance, user comprehension of the query.
4. Users are unable to accurately define a period in the query model; separate period scaling for figures, static when changes in the query are made; improved ease of use, additional functionality, system flexibility.
5. Users are unable to change figures in the system and define additional wishes; separate options to change or add information in graphs or titles; support for user preferences, system flexibility.
6. Users request overall support in monitoring their quality of care; statistical process control charts; additional functionality.

Figure 1. Indicator window showing the actual state of the performance measures

Results: the redesigned NICE Online tool

In June 2010 the redesigned NICE Online tool was released to the NICE participants. To use NICE Online, a user account has to be requested.
An account consists of a username and a password and is always linked to a single hospital; it is not possible to view or query data from other hospitals. To prevent dormant accounts, users are able to view the active accounts for their own hospital. After a user logs in to NICE Online, the Indicator window automatically shows the trend of all collected performance measures over the last six months for that particular ICU (Figure 1). Next to each performance measure, an info ("toelichting") button can be used to get more information about the presented performance measure, and by clicking on the analysis ("naar analyse") button the user is directed to the Analysis tool for more detailed analyses. The Analysis tool by default takes the performance measure that the user selected in the Indicator window as input, but the user can easily switch to another measure using the navigation menu. The navigation menu is positioned at the upper part of the Analysis tool and consists of five selection columns (Figure 2). Through the first (left) column one can switch to a performance measure ("uitkomstmaat") other than the one selected in the Indicator window. Per measure, multiple types of data can be viewed by selecting preferences in the second ("inzicht in") column of the navigation menu. For example, for the performance measure Mortality-SMR ("uitkomstmaat") the user can select which prognostic model (APACHE II, APACHE IV or SAPS II) should be used to calculate the Standardized Mortality Ratio (SMR), and whether the number and/or percentage of observed mortality should likewise be displayed. The third, middle column ("vergelijk") can be used to define subgroups on which a comparison concerning a certain indicator should be based, e.g. surgical patients versus non-surgical patients, or different age categories. The fourth column ("includeer") can be used to define a specific patient population to be included in the analyses, e.g.
surgical patients over 80 years. One of the selection options in the third and fourth columns is the APACHE IV diagnosis list, which can be used to compare performance measures of two or more diagnostic categories, or to include one or more specific diagnostic categories by excluding all others. The last column ("benchmarks") allows the selection of a benchmark against which the results of the individual ICU are to be compared, e.g. the national average or the average of a group of equally sized ICUs (based on number of admissions). Within three seconds after a selection has been made in the navigation menu, the resulting figure is displayed in the bottom part of the Analysis tool. Next to the resulting figure, the user can change the time period of the analysis and use the graphics options to add or change axis titles and scales. At the left-hand side of the Indicator window and the Analysis tool, a main menu is displayed with additional functionalities, such as a link to the data dictionary that extensively describes the variables in the NICE database, a list of frequently asked questions about NICE Online, and an e-learning course for new users. NICE Online should, in principle, be usable without any specific training, although the 30-minute e-learning course provides good insight into all functionalities.

Statistical Process Control

As reported in Table 1, the usability analyses revealed the need for an additional tool functionality to support users in monitoring their overall quality of care. Improvements in quality of care require changes in the way of working and/or in the organizational processes of a healthcare institution. But such changes do not always lead to improvements in the care delivered. To discriminate between changes that yield improvement and those that do not, outcome and process performance measures need to be assessed (the study phase of the PDSA cycle). In addition, these performance measurements guide decisions about where improvement efforts should be focused in the first place (the plan phase of the PDSA cycle). Statistical Process Control (SPC) may facilitate both the plan and study phases. SPC was originally developed in the 1920s by the physicist Walter Shewhart to improve industrial manufacturing. It migrated to healthcare, first in laboratory settings and then into direct patient care applications [17]. The SPC approach is based on learning through data and has its foundation in the theory of variation, which entails understanding common and special causes. Control charts, central to SPC, are used to visualize and analyze the performance of a process over time, sometimes in real time. Such processes include biological processes, such as blood glucose regulation, as well as organizational processes and hospital outcomes, such as mean ventilation duration or mortality. Statistically derived decision rules help users to determine whether the performance of a process is stable and predictable, or whether it varies over time, making the process unstable and unpredictable.
One source of such variation could, for example, be a successful intervention aimed at improvement that changes performance for the better. If the improvement is maintained, the process will stabilize again at its new level of performance. When a process remains unstable, this indicates that a change is needed. All of this can easily be determined by using SPC. Control charts for the quality indicators mortality, length of stay, duration of ventilation, time between glucose measurements, and duration of hyperglycaemia and hypoglycaemia are included in NICE Online. Figure 3 shows an SPC chart of the mean glucose values of one ICU for 2009. This graph shows that the process is not stable (in March it is below the 3-sigma line and in November above it) and that the mean glucose level is relatively high in this period. Based on this graph, this ICU might consider implementing a glucose regulation guideline. The effect of implementing such a guideline can, again, be monitored on the SPC chart.

Figure 2. NICE Online Analysis tool with the navigation menu at the top and the resulting graph at the bottom. In this mortality example ("uitkomstmaat"), the APACHE IV SMR ("inzicht in") of medical patients ("includeer") of an individual ICU is shown per quarter ("vergelijk") for the year 2009 ("periode"), compared with the national mean SMR ("benchmark") of medical patients.
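The 3-sigma rule behind a chart like the glucose example can be sketched as follows. This is a textbook X-bar style check with invented numbers, not NICE Online's actual chart construction (which, for instance, must also handle case mix):

```python
# Hedged sketch of a 3-sigma control check: estimate centre line and
# control limits from a baseline period, then flag points outside the
# limits as "special cause" variation. All values are invented.
from statistics import mean, stdev

def control_limits(baseline: list[float]) -> tuple[float, float, float]:
    """Lower control limit, centre line, upper control limit."""
    centre = mean(baseline)
    sigma = stdev(baseline)
    return centre - 3 * sigma, centre, centre + 3 * sigma

def out_of_control(values: list[float], baseline: list[float]) -> list[int]:
    """Return the indices of points beyond the 3-sigma limits."""
    lcl, _, ucl = control_limits(baseline)
    return [i for i, v in enumerate(values) if v < lcl or v > ucl]

# Hypothetical monthly mean glucose values (mmol/l): a stable baseline
# period, then a year with one low and one high excursion.
baseline = [8.1, 8.3, 7.9, 8.0, 8.2, 8.1, 7.8, 8.2]
year = [8.0, 8.1, 6.5, 8.2, 8.3, 8.1, 8.0, 8.2, 8.1, 8.0, 9.9, 8.1]
print(out_of_control(year, baseline))  # -> [2, 10] (March and November)
```

Points outside the limits signal special-cause variation worth investigating (a changed protocol, a data error); points within them are consistent with the common-cause variation of a stable process.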

Usage of NICE Online

In the first four months since its launch in June 2010, NICE Online was used 328 times by 64 different users from 50 different hospitals. On average, these users used NICE Online 4.1 times a month. Within each of these 328 sessions, users could analyze multiple performance measures. Table 2 shows the frequency of analyses per performance measure. Most interest was in the performance measure mortality, including SMRs.

Table 2. Frequency of analyses per performance measure (number of times analyzed, %)
Mortality: 1760 (34.3%)
Admissions and readmissions: 926 (18.1%)
Length of stay: 767 (15.0%)
VLAD/CUSUM: 529 (10.3%)
Ventilation duration: 304 (5.9%)
Prognostic scores: 292 (5.7%)
Bed occupancy: 188 (3.7%)
Nurse:patient ratio: 91 (1.8%)
SPC length of stay: 78 (1.5%)
SPC mortality: 66 (1.3%)
SPC ventilation duration: 32 (0.6%)
SPC glucose values: 31 (0.6%)
Stress ulcer: 31 (0.6%)
SPC time to next glucose measurement: 14 (0.2%)
SPC hypoglycaemia: 8 (0.2%)
SPC hyperglycaemia: 7 (0.1%)

Figure 3. SPC chart on mean glucose values

Discussion

Based on extensive usability analyses of a former query tool, we were able to develop NICE Online for monitoring and analyzing performance measures in the context of the Dutch NICE registry. The current version of NICE Online provides a quick overview of the performance measures defined by the Netherlands Society for Intensive Care (NVIC) over the last six months, as well as an easy entry point for further analyzing the characteristics and outcomes of an ICU's own population. The post-redesign round of usability evaluation of the newly developed NICE Online showed that only minor usability problems remained; overall, users are highly satisfied and can easily work with the system. The usage of NICE Online is much higher than that of the former query tool, which indicates that users appreciate the newly developed system.

NICE Online is an important tool in the Plan-Do-Study-Act (PDSA) cycle, as it supports users in gaining insight into areas for improvement and thereby forms the basis for planning changes in the care process. Once these changes have been implemented, NICE Online supports users in studying their effectiveness. An important new functionality of NICE Online is Statistical Process Control to monitor performance measures over time. SPC is a method to support the internal benchmarking process, which is a valuable addition to external benchmarking. With internal benchmarking one can monitor whether processes are stable and whether changes in the care process lead to a rise in performance that stabilizes over time. An external benchmark is needed as a reference for the performance to be achieved. Although SPC is upcoming and promising, there are still many questions on how to use it adequately in the healthcare setting. A literature review by Thor et al. [17] concluded that SPC is a versatile tool to detect changes in healthcare, but they also identified methodological pitfalls in the construction of control charts as barriers to their successful application in quality improvement. Variation in biological and healthcare processes is higher than in most industrial processes, for which SPC was originally developed. The lack of case-mix adjustment and the use of wrong types of control charts are examples of commonly made errors; these probably have large consequences when users try to optimize care processes based on false alerts produced by these charts, or when users believe that their processes are stable while they are actually not. Therefore, the NICE foundation is investigating the sensitivity of different types of SPC charts in their ability to detect changes in performance, in order to provide our participants with validated results. Furthermore, based on the usage patterns (Table 2), we conclude that SPC analyses are not the favourite choice of our users at this time. They probably need more training in how to interpret the charts before using them regularly.

NICE Online is a tool under continuous development. Based on regular feedback from our users, we intend to further improve and extend the functionality of the system. For example, the current list of performance measures will be extended, as will the available benchmarks, e.g. university centres or level of ICU. The current version only includes graphs, but output presented as tables will be available soon.

In summary, the application of usability engineering methods in the redesign of the NICE Online tool has produced a highly usable application for a variety of end-user groups. We will continue to enhance the functionalities of NICE Online and assess new versions on their usability.
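The wrong-chart-type pitfall noted in the discussion can be made concrete: a proportion such as quarterly mortality needs binomial (p-chart) limits that depend on the subgroup size n, rather than the continuous-data limits used for, say, mean glucose. A hedged sketch using the textbook p-chart formula, with illustrative values, not NICE Online's implementation:

```python
# Textbook 3-sigma limits for a proportion (p-chart); illustrative
# sketch only, not the registry's chart construction.
from math import sqrt

def p_chart_limits(p_bar: float, n: int) -> tuple[float, float]:
    """3-sigma limits for average proportion p_bar with subgroup size n."""
    sigma = sqrt(p_bar * (1 - p_bar) / n)
    return max(0.0, p_bar - 3 * sigma), min(1.0, p_bar + 3 * sigma)

# Hypothetical ICU: 15% average mortality, 100 admissions per quarter.
lcl, ucl = p_chart_limits(0.15, 100)
print(round(lcl, 3), round(ucl, 3))  # -> 0.043 0.257
```

Because these limits widen as n shrinks, applying continuous-data limits to small-denominator mortality proportions is exactly the kind of chart-type error that produces the false alerts discussed above.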
Acknowledgement

We would like to thank all participants of the usability evaluations for their openness and valuable remarks regarding NICE Online, and the NICE data managers, researchers and software engineers for their contributions to the content of NICE Online.

References
1. Curtis JR, Cook DJ, Wall RJ, Angus DC, Bion J, Kacmarek R, Kane-Gill SL, Kirchhoff KT, Levy M, Mitchell PH, Moreno R, Pronovost P, Puntillo K. Intensive care unit quality improvement: a how-to guide for the interdisciplinary team. Crit Care Med 2006;34(1):211-18.
2. McMillan TR, Hyzy RC. Bringing quality improvement into the intensive care unit. Crit Care Med 2007;35(2 Suppl):S59-65.
3. Berwick DM. Developing and testing changes in delivery of care. Ann Intern Med 1998;128(8):651-6.
4. Langley GJ, Nolan KM, Nolan TW, Norman CL, Provost LP. The improvement guide: a practical approach to enhancing organizational performance. San Francisco: Jossey-Bass Publishers, 1996.
5. Kritchevsky SB, Simmons BP. Continuous quality improvement. Concepts and applications for physician care. JAMA 1991;266(13):1817-23.
6. Stow PJ, Hart GK, Higlett T, George C, Herkes R, McWilliam D, Bellomo R. Development and implementation of a high-quality clinical database: the Australian and New Zealand Intensive Care Society Adult Patient Database. J Crit Care 2006;21:133-41.
7. Cook SF, Visscher WA, Hobbs CL, Williams RL, and the Project IMPACT Clinical Implementation Committee. Project IMPACT: results from a pilot validity study of a new observational database. Crit Care Med 2002;30(12):2765-70.
8. Harrison DA, Brady AR, Rowan K. Case mix, outcome and length of stay for admissions to adult general critical care units in England, Wales and Northern Ireland: the Intensive Care National Audit & Research Centre Case Mix Programme Database. Crit Care 2004;8(2):R99-111.
9. Bakshi-Raiez F, Peek N, Bosman RJ, De Jonge E, De Keizer NF. The impact of different prognostic models and their customization on institutional comparison of intensive care units. Crit Care Med 2007;35(11):2553-60.
10. Jamtvedt G, Young JM, Kristoffersen DT, O'Brien MA, Oxman AD. Audit and feedback: effects on professional practice and health care outcomes. Cochrane Database Syst Rev 2006;(2):CD000259.
11. Van der Veer SN, De Keizer NF, Ravelli ACJ, Tenkink S, Jager KJ. Improving quality of care. A systematic review on how medical registries provide information feedback to health care providers. Int J Med Inform 2010;79:305-23.
12. De Vos M, Graafmans W, Kooistra M, Meijboom B, Van der Voort P, Westert G. Using quality indicators to improve hospital care: a review of the literature. Int J Qual Health Care 2009;21(2):119-29.
13. De Vos M, Graafmans W, Keesman E, Westert G, Van der Voort P. Quality measurement at intensive care units: which indicators should we use? J Crit Care 2007;22:267-74.
14. Benner PM, de Keizer NF, Arts DGT, Bosman RJ. NICE Online for benchmarking. ICU Management 2005;4:16.
15. Peute LW, de Keizer NF, Jaspers MW. Cognitive evaluation of a physician data query tool for a national ICU registry: comparing two think aloud variants and their application in redesign. Stud Health Technol Inform 2010;160:309-13.
16. Jaspers MWM. A comparison of usability evaluation methods for testing interactive healthcare applications: methodological aspects and empirical evidence. Int J Med Inform 2009;78(5):340-53.
17. Thor J, Lundberg J, Ask J, Olsson J, Carli C, Härenstam KP, Brommels M. Application of statistical process control in healthcare improvement: systematic review. Qual Saf Health Care 2007;16:387-99.