Hospitals of the Future: Ubiquitous Computing Support for Medical Work in Hospitals

Jakob E. Bardram
Centre for Pervasive Healthcare
Department of Computer Science, University of Aarhus
Aabogade 34, 8200 Århus N, Denmark
bardram@daimi.au.dk

Abstract. This paper describes the visions and on-going research on creating ubiquitous computing support for medical work in the hospitals of the future. Today, clinical computer systems seldom play any role in the execution of clinical work as such. Electronic Patient Records (EPR) are more often located in offices at a hospital than at the patient's bedside or in operating theaters. There are a number of challenges in the hardware and software design of contemporary computer systems that make them unsuitable for clinical work; it is, for example, difficult to operate a keyboard and a mouse while operating on a patient. Research within UbiComp provides a range of new conceptual and technological possibilities that enable us to move clinical computer support closer to the clinical work setting. An important branch of the research at the Danish Center for Pervasive Healthcare is to design and develop such new ubicomp technologies for clinical work.

1 Introduction

Studies of the usage of contemporary clinical computer systems in medical work in hospitals have revealed that conventional computer technology designed for office use is inadequate in a hospital setting [3, 7, 4, 6]. A range of challenging properties of medical work makes it fundamentally different from typical office work: extreme mobility, ad hoc collaboration, frequent interruptions, a high degree of communication, etc. In a hospital setting, something as basic to desktop computing as a desk and a chair simply does not exist. Hence, there is a need for fundamentally new concepts for creating computer systems for a hospital environment.
Based on observations like the ones presented above, we argue that research into the design and development of ubicomp technologies for use in hospitals can be particularly beneficial for the work of clinicians. In this paper we present our research into the Hospital of the Future, where clinical computer systems become more tightly interconnected with the medical work of clinicians in hospitals.

2 Vision: The Hospital of the Future

The Hospital of the Future is a vision of a highly interactive hospital, where clinicians can access relevant medical information and can collaborate with colleagues and patients
independent of time, place, and whatever they are doing. Other researchers have used the term "intelligent hospitals" [11]. We prefer, however, the term "interactive hospital", because it better covers the goals we are pursuing, and because we are not embedding any kind of artificial intelligence in the hospital. The vision of an interactive hospital can be divided into different research themes, some of which we consider here. Examples from our vision videos can be seen in figure 1.

2.1 Infrastructure

We believe that in order to create a truly interactive hospital environment we need to devise a new kind of basic infrastructure upon which clinical computer systems can be designed, developed, and deployed. As part of its core execution environment, this infrastructure must support the aspects of medical work introduced above, and it is to work as the computational spinal cord of a hospital. The infrastructure must allow clinicians to move around freely inside (and outside) the hospital while keeping their computational environment intact. Clinicians should be able to move around while they initiate, pause, resume, and suspend their interaction with the clinical computer systems. The infrastructure should also ensure proper interoperability among the many different clinical systems in use at a hospital, providing basic mechanisms for developers of clinical systems to create highly integrated systems. Important themes in such an infrastructure are:

Collection of services. The infrastructure must support the work of clinicians and try to encompass and adapt to the various services needed in a changing, heterogeneous environment. For example, a radiologist arriving at a conference room should be able to transfer his entire radiology conference from his PDA to a wall-sized display.

Secure. Such an infrastructure obviously needs to incorporate security for the clinical applications running on top of it.
Context-aware. We believe that making clinical computer systems context-aware is central to their adoption in clinical work. In a hectic working environment, where many tasks are carried out in parallel, where clinicians move from one setting to another, and where collaborative settings constantly emerge, it is important that clinical applications have knowledge about the user's working context and are able to adapt to it. Support for context-awareness should be part of the basic infrastructure, rather than something each clinical application implements on its own.

Collaboration. We want to build mechanisms for collaboration into the very basic infrastructure of a hospital, because collaboration is so fundamental to clinical work. This can be used for setting up tele-medicine conferences between physicians and patients at home, but it should also support more ad hoc and situated kinds of cooperation happening around public displays, such as at the patient's bedside.
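The idea of context-awareness as an infrastructure service, rather than per-application logic, can be sketched as a small publish/subscribe hub. The following is a minimal illustrative sketch; all names (`ContextService`, the entity identifiers) are our own and not part of any actual system described in this paper:

```python
# Illustrative sketch: context-awareness as an infrastructure-level
# publish/subscribe service (hypothetical API, not a real system).
from collections import defaultdict
from typing import Callable, Dict

class ContextService:
    """A minimal hub: sensors publish context; applications subscribe."""

    def __init__(self) -> None:
        self._state: Dict[str, dict] = {}            # entity -> attributes
        self._subscribers = defaultdict(list)        # entity -> callbacks

    def subscribe(self, entity: str, callback: Callable[[dict], None]) -> None:
        """Register to be notified whenever an entity's context changes."""
        self._subscribers[entity].append(callback)

    def publish(self, entity: str, **attributes) -> None:
        """A sensor reports new context; interested applications are notified."""
        self._state.setdefault(entity, {}).update(attributes)
        for callback in self._subscribers[entity]:
            callback(dict(self._state[entity]))

# A clinical application adapts when a clinician's location changes.
service = ContextService()
shown = []
service.subscribe("nurse:hansen", lambda ctx: shown.append(ctx["location"]))

# A location sensor publishes that the nurse entered the medicine room.
service.publish("nurse:hansen", location="medicine-room", activity="dispensing")
print(shown)  # ['medicine-room']
```

The point of the design is that sensing and notification live in the infrastructure once, so every clinical application gets context events through the same mechanism instead of re-implementing them.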
Fig. 1. Two frames from our vision video. The left-hand side shows the interactive bed; the right-hand side shows a remote radiology conference on a wall-sized interactive display.

2.2 Interactive Hospital Environments and Devices

Personal computers are made for office work and are hence often difficult to deploy and use in a hospital, where there are no desks, chairs, or places for computer equipment. We therefore envision that clinical computer applications are to be embedded in medical equipment and in the hospital itself. We are working on prototypes of interactive walls, ceilings, and floors, as well as on embedding computers in hospital beds, pill containers, surgical tools, etc. We envision a hospital where clinicians can approach interactive surfaces anywhere and carry on their work. Some of these surfaces are small and handheld like PDAs (but are not personal); others are large, like the one used in a radiology conference room, where the whole wall is one big interactive surface.

An important part of this research is to develop and test new ways of modeling, representing, storing, manipulating, displaying, and using medical data and information. Medical records are to a large degree textual, even when computerized. This is a very abstract way of representing medical knowledge, which is tied to very physical aspects of human life. Simple things like using video to document orthopedic rehabilitation and the healing of wounds are desirable, but not easily integrated into existing EPR systems. Another important part of this research is to develop new ways of interacting with medical data and new kinds of interactive medical equipment. The keyboard and mouse are, for example, very hard to use at the bedside. Hence we envision that various kinds of multi-modal interaction need to take place, for example enabling the surgeon to access medical records and x-ray images using voice and gestures while performing an operation.
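The notion of carrying work from one interactive surface to another can be sketched as a small activity model. This is only an illustration under our own assumptions (`Activity`, `Display`, and their methods are hypothetical names, not an actual framework API):

```python
# Illustrative sketch: a work activity that roams between public displays
# (hypothetical model; a real infrastructure would be far more elaborate).
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Activity:
    """A user's work activity: the unit that moves, not files or apps."""
    name: str
    owner: str
    state: dict = field(default_factory=dict)  # open records, images, etc.

class Display:
    """Any public interactive surface: a PDA, a wall, a bed-side screen."""
    def __init__(self, name: str) -> None:
        self.name = name
        self.current: Optional[Activity] = None

    def resume(self, activity: Activity) -> None:
        self.current = activity          # activity appears on this surface

    def suspend(self) -> Activity:
        activity, self.current = self.current, None
        return activity                  # activity leaves this surface intact

# A radiologist prepares a conference on a PDA ...
pda = Display("pda")
wall = Display("conference-wall")
conference = Activity("radiology-conference", owner="radiologist",
                      state={"images": ["xray-0123.dcm"]})
pda.resume(conference)

# ... then walks to the conference room and transfers it to the wall display.
wall.resume(pda.suspend())
print(wall.current.name)  # radiology-conference
print(pda.current)        # None
```

Because the activity carries its own state, the surface it appears on is interchangeable, which is what lets a PDA and a wall-sized display be treated as equally "public".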
3 Current Work

Our work so far has concentrated on creating a basic infrastructure to be used in hospitals, and on creating some examples of clinical applications running on top of this framework.
3.1 The ABC Framework

The ABC Framework is designed to be the computational infrastructure of the future hospital [9]. ABC is an acronym for Activity-Based Computing, an architectural principle also referred to as Task-Based Computing in the Aura project at Carnegie Mellon University [1, 12, 10]. In Activity-Based Computing, the basic computational unit is no longer the file (e.g. a document) or the application (e.g. MS Word) but the work activity of a user. Users can carry the various work activities they are engaged in with them around the hospital and seamlessly transfer them from one computer to another. In fact, we no longer talk about computers at all, but about public displays; even PDAs are considered public, not personal. These public displays are embedded in floors, walls, medicine cabinets, beds, etc. A prototype of a wall-sized public display is shown in figure 2.

The main goal of the ABC Framework is to provide a programming platform for the development and deployment of computer applications that can be used in our activity-based computing concept. This is achieved by having, on the one hand, a runtime infrastructure, which handles the computational complexities of managing distributed and collaborative activities by adapting to the available services and resources in a specific environment, and, on the other hand, a developer's framework, which helps the programmer create ABC-aware applications that can be deployed in the runtime infrastructure.

Fig. 2. The current implementation of the vision shown in figure 1. Left-hand side: the interactive bed. Right-hand side: a prototype on a public wall-display; a nurse is having a real-time conference with a radiologist.

The ABC Framework embeds the following sub-components:

A context-awareness sub-system. A central part of the ABC infrastructure is a context-awareness sub-system, which continuously monitors the user's context and gathers context information.
This context-awareness sub-system can be accessed from clinical applications, or it can be set up to notify applications when appropriate. Our EPR application uses this sub-system to display EPR data for the nurse
in the medicine room based on information about the nurse's current activity, which is, for example, inferred from which patient's pill container she is holding in her hand.

A user authentication sub-system. Another sub-system helps us accomplish what we have termed proximity-based user authentication [5]. This enables a user to be securely authenticated on a public computer simply by approaching it.

A collaboration sub-system. The framework embeds basic support for collaboration [2]. All activities are in principle shared, which means that users participating in an activity can access it and collaborate in it. For example, if a radiologist creates a radiology conference activity for medical department B, s/he can invite all relevant participants to the activity. Everybody can then join the activity and conduct a real-time conference using whatever computer they are logged in to. The session can be recorded for later playback by participants who did not take part in the conference.

A social awareness sub-system. Central to the collaboration support in the ABC Framework is a sub-system that helps clinicians judge with whom, and how, to initiate a collaboration session. This sub-system tries to provide clinicians with a peripheral, social awareness of what their colleagues are doing, using information about the activity and context of users.

3.2 The Interactive Hospital Bed

As an example of an ABC application we have created the Interactive Hospital Bed (see figure 2). The bed has an integrated computer and a touch-sensitive display. Furthermore, the bed is equipped with various sensors that can identify the patient lying in the bed, the clinician standing beside the bed, and various medical items carrying embedded RFID tags. In this way the bed computer can adapt its screen to the users in its vicinity.
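How such sensor-driven adaptation might look can be sketched as a simple event handler on the bed's computer. This is purely illustrative; the class, the tag format, and the screen states are hypothetical names, not the actual prototype's design:

```python
# Illustrative sketch: a bed computer adapting its screen to RFID tags
# detected nearby (all identifiers and tag formats are hypothetical).
from typing import Set

class InteractiveBed:
    def __init__(self, patient_id: str, prescribed: Set[str]) -> None:
        self.patient_id = patient_id
        self.prescribed = prescribed   # medicine tags taken from the EPR
        self.screen = "idle"

    def on_rfid_detected(self, tag: str) -> None:
        """Called by the bed's RFID reader for each tag entering range."""
        kind, ident = tag.split(":", 1)
        if kind == "clinician":
            self.screen = f"logged-in:{ident}"      # proximity-based login
        elif kind == "medicine":
            if ident in self.prescribed:
                self.screen = "medicine-schema"     # show the EPR medicine plan
            else:
                self.screen = "warning:wrong-medicine"

bed = InteractiveBed(patient_id="patient-17", prescribed={"penicillin-250mg"})
bed.on_rfid_detected("clinician:nurse-hansen")
print(bed.screen)  # logged-in:nurse-hansen
bed.on_rfid_detected("medicine:penicillin-250mg")
print(bed.screen)  # medicine-schema
```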
For example, when the nurse arrives with the patient's medicine, the bed is able to log in the nurse, check whether the nurse is carrying the right medicine for the right patient, and display the relevant information on the screen, typically the medicine schema from the EPR system. Furthermore, various medical sensors measuring things like blood pressure and temperature can be attached to the bed and use the onboard computer as a gateway to the basic infrastructure. Every bed is in itself a server containing various information about its patient, and it can be queried from e.g. an EPR system.

4 Conclusions

In this short paper we have outlined our current research into creating new technologies for the hospital. This research is tightly connected to the research field of UbiComp, as well as to related research fields such as CSCW, HCI, and software architecture. The current status of our work is that it has been tested extensively in our labs, but it has not yet been deployed in real hospitals. We hope to begin to incorporate some of the concepts and results from our research into the systems running in hospitals. To this end we are currently engaged in cooperative projects with the EPR and healthcare industry in Denmark.
Short Biography for Jakob E. Bardram

Jakob Bardram's main research areas are pervasive and ubiquitous computing, distributed component-based systems, computer supported cooperative work, human-computer interaction, and medical informatics. Currently, his main focus is pervasive healthcare, and he is conducting research into technologies for future healthcare both at hospitals and in the patient's home. He is managing a large project investigating technologies for The Future Hospital, which includes (among other things) embedding intelligence in everyday artifacts within a hospital, such as in the walls of the radiology conference room, in the patient's bed, in the pill containers, and even in the pills. Additionally, he has done research into the design and development of computerized medical records, especially focusing on support for mobile work and clinical cooperation. Jakob E. Bardram received his Ph.D. in computer science in 1998 from the University of Aarhus, Denmark. He currently directs the Centre for Pervasive Healthcare at Aarhus University [8].

References

1. Aura Project. http://www.cs.cmu.edu/~aura.
2. J. E. Bardram. Supporting Mobility and Collaboration in Ubiquitous Computing. Technical Report CfPC 2003-PB-38, Center for Pervasive Computing, Aarhus, Denmark, 2003. Available from http://www.pervasive.dk/publications.
3. J. E. Bardram. The Trouble with Login: On Usability and Computer Security in Pervasive Computing. Technical Report CfPC 2003-PB-50, Center for Pervasive Computing, Aarhus, Denmark, 2003. Available from http://www.pervasive.dk/publications.
4. J. E. Bardram and C. Bossen. Moving to Get Ahead: Local Mobility and Collaborative Work. In K. Kuutti, E. H. Karsten, G. Fitzpatrick, P. Dourish, and K. Schmidt, editors, Proceedings of the Eighth European Conference on Computer Supported Cooperative Work, pages 355-374, Helsinki, Finland, Sept. 2003. Kluwer Academic Publishers.
5. J. E. Bardram, R. E. Kjær, and M. Ø. Pedersen.
Context-Aware User Authentication: Supporting Proximity-Based Login in Pervasive Computing. In A. Dey, J. McCarthy, and A. Schmidt, editors, Proceedings of Ubicomp 2003: Ubiquitous Computing, Lecture Notes in Computer Science, pages 107-116, Seattle, Washington, USA, Oct. 2003. Springer Verlag.
6. J. E. Bardram, T. K. Kjær, and C. Nielsen. Supporting local mobility in healthcare by application roaming among heterogeneous devices. In Proceedings of the Fifth International Conference on Human Computer Interaction with Mobile Devices and Services. Springer Verlag, 2003.
7. C. Bossen. The parameters of common information spaces: The heterogeneity of cooperative work at a hospital ward. In Proceedings of the 2002 ACM Conference on Computer Supported Cooperative Work, pages 176-185. ACM Press, 2002.
8. Center for Pervasive Healthcare, University of Aarhus, Denmark. http://www.cfph.dk.
9. H. B. Christensen and J. E. Bardram. Supporting human activities: exploring activity-centered computing. In G. Borriello and L. E. Holmquist, editors, Proceedings of Ubicomp 2002: Ubiquitous Computing, volume 2498 of Lecture Notes in Computer Science, pages 107-116, Göteborg, Sweden, Sept. 2002. Springer Verlag.
10. D. Garlan, D. P. Siewiorek, A. Smailagic, and P. Steenkiste. Project Aura: Toward Distraction-Free Pervasive Computing. IEEE Pervasive Computing, 1(2):22-31, Apr. 2002.
11. S. Mitchell, M. D. Spiteri, J. Bates, and G. Coulouris. Context-Aware Multimedia Computing in the Intelligent Hospital. In Proceedings of the 9th ACM SIGOPS European Workshop, pages 13-18. ACM Press, 2000.
12. J. P. Sousa and D. Garlan. Aura: an Architectural Framework for User Mobility in Ubiquitous Computing Environments. In Proceedings of the 3rd Working IEEE/IFIP Conference on Software Architecture, Montreal, 2002.