Study on Cognitive Approach to Human Error and its Application to Reduce the Accidents at Workplace
Ganapathi Bhat Manchi, Sidde Gowda, Jaideep Singh Hanspal

Abstract: To err is built into human nature, and there are no specific countermeasures for error. Human cognition uses processes that allow us to be amazingly fast, to respond flexibly to new situations [1], and to juggle several tasks at once (Flower and Hayes, 1980). Unfortunately, these processes inevitably produce occasional errors. It is now well understood that these errors are the product of limitations in human information processing coupled with design features that are ill matched to human abilities. This is especially true for highly automated environments in which robust perceptual-motor tasks have been largely replaced by more error-prone cognitive tasks. The emerging model of cognition provides at least a partial model of the cognitive mechanisms underlying human thinking. The most effective way to deal with error due to human behavior and an unpredictable environment is through safety culture and favorable system design.

Index Terms: Cognition, Human error, Safety culture, System design

I. INTRODUCTION

To err is built into human nature. Errors themselves are not intrinsically bad; indeed, they are often highly adaptive, as in trial-and-error learning or the serendipitous discoveries that can arise from error [1]. There are no specific countermeasures for error. Rooted as it is in the human condition, fallibility cannot be eliminated, but its adverse consequences can be moderated through targeted error-management techniques [2]. It has been estimated that up to 90% of all workplace accidents have human error as a cause [3]. There are theories that explain why human error is so universal and constant: the human cognitive functions make a person fast and flexible, but the thinking process itself is prone to some inevitable error.
Human cognition uses processes that allow us to be amazingly fast, to respond flexibly to new situations [1], and to juggle several tasks at once (Flower and Hayes, 1980). Unfortunately, these processes inevitably produce occasional errors. Much broad-spectrum research has been carried out on cognitive function. Of this, the most widely accepted account is the emerging model of cognition, drawn from Reason's (1990) Generic Error-Modelling System and Baars' (1992b) Global Workspace theory. This model covers overall cognition, not only error: it is now agreed that correct performance and errors follow from the same underlying cognitive processes [1].

Manuscript received August. Ganapathi Bhat Manchi, Department of Civil Engineering, CMJ University, Shillong, Meghalaya, India. Dr. Sidde Gowda, Department of Civil Engineering, S.J.C. Institute of Technology, Chickballapur, Bangalore, India. Dr. Jaideep Singh Hanspal, Orthopaedic Surgeon, Laing O'Rourke Medical Centre, Dubai, UAE.

II. MATERIALS AND METHOD

The available literature on this topic was searched extensively. Articles were procured and several websites were consulted. The material was scanned thoroughly and the views of different authors were analysed. In addition, an in-depth analysis was carried out of cognitive functions as they apply to human error and of their application in modifying human error.

III. CRITICAL ANALYSIS

The cognitive function of the human mind is a very complex system, and we still do not understand it fully. Much research has been done in this field, but the most accepted theory at present is the emerging model of cognition, built from Reason's (1990) Generic Error-Modelling System and Baars' (1992b) Global Workspace theory. This model explains the interaction between the human mind and the environment, and hence helps in understanding the occurrence of human error.
A. The Emerging Model

Figure 1: Emerging Model of Cognition

Figure 1, which illustrates the basic elements of this emerging model, is drawn from Reason's [1] Generic Error-Modelling System (GEMS) and Baars' [1992b] Global Workspace (GW) theory. Researchers now agree that both correct performance and errors follow from the same underlying cognitive processes [Reason, 1990, p. 36]. Human cognition uses processes that allow us to be amazingly fast [1], to respond flexibly to new situations [1], and to juggle several tasks at once [Flower & Hayes, 1980]. Unfortunately, these processes inevitably produce occasional errors. In most situations, the occasional errors produced by human cognitive processes rarely cause serious problems. In large spreadsheet models with thousands of cells, however, even a minuscule error rate can lead to disaster (Raymond R. Panko).
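The spreadsheet point can be made quantitative: if cell errors are independent, the chance that a model of n cells contains at least one error is 1 − (1 − p)^n. The following is a minimal sketch, not from the source; the 0.5% per-cell rate echoes the mechanical-error figure quoted later in this paper, and independence of errors is a simplifying assumption.

```python
def p_at_least_one_error(cells: int, per_cell_rate: float) -> float:
    """Probability that a model with `cells` independent cells,
    each erroneous with probability `per_cell_rate`, contains
    at least one error."""
    return 1.0 - (1.0 - per_cell_rate) ** cells

# With an illustrative 0.5% per-cell error rate, even modest model
# sizes make at least one error nearly certain.
for n in (10, 100, 1000, 5000):
    print(n, round(p_at_least_one_error(n, 0.005), 3))
```

At 1,000 cells the probability already exceeds 99%, which is the sense in which a "minuscule" rate becomes a disaster at scale.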
a. The Automatic Subsystem

Figure 1 shows that the model has three interacting subsystems. One is the automatic subsystem, which is characterized by cognition below the level of consciousness. A great deal of our everyday cognition occurs in the automatic subsystem, and it is characterized by a great deal of parallel operation. For example, when we prepare to speak, we actually generate several competing plans [Baars, 1980]. We flesh them out in parallel with proper syntax, phonemics, and other aspects of speech, and at some point during this process of development we select a single utterance. Competition among these plans is responsible for several types of error that are incorrect in one or a few particulars but highly lawful in other aspects of language. Because of this parallelism, our automatic subsystem is extremely fast. The automatic subsystem uses schemata [Bartlett, 1932; Neisser, 1976], which are organized collections of information and response patterns. When the proper conditions exist, a schema is activated, and each activated schema leads to a specific response. The automatic subsystem has an enormous pool of schemata to activate: there appear to be no known limits on schemata in long-term memory in terms of either their number or the duration of their retention [1]. Reason [1] argues that there appear to be two core mechanisms for selecting the schemata to be activated. The first is pattern matching: if the situation matches the activating criteria for a schema, that schema is activated. The second mechanism is frequency gambling [1]. This mechanism is required when there is no perfect match between the situation and a single schema, or when the normal process of selection between competing schemata is disrupted. In this case, the schema that has been activated most frequently under similar conditions in the past will execute.
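The two selection mechanisms just described (pattern matching first, with frequency gambling as the fallback) can be sketched as a short procedure. This is an illustrative reading of Reason's mechanism, not an implementation from the source; the schema representation and the driving example are assumptions.

```python
def select_schema(situation: set, schemata: list) -> dict:
    """Pick a schema for `situation`. Each schema is a dict with
    'name', 'triggers' (a set of activating conditions), and
    'frequency' (how often it has fired in similar past conditions)."""
    # 1. Pattern matching: any schema whose triggers are all present
    #    is a candidate; prefer the most specific such match.
    exact = [s for s in schemata if s["triggers"] <= situation]
    if exact:
        return max(exact, key=lambda s: len(s["triggers"]))
    # 2. Frequency gambling: among partial matches, the schema used
    #    most often in the past wins. This is fast and usually right,
    #    but it is the source of "strong but wrong" errors.
    partial = [s for s in schemata if s["triggers"] & situation]
    return max(partial, key=lambda s: s["frequency"])

schemata = [
    {"name": "brake", "triggers": {"red_light", "driving"}, "frequency": 900},
    {"name": "accelerate", "triggers": {"green_light", "driving"}, "frequency": 950},
]
# An ambiguous situation with no full trigger match: the most
# habitual schema fires, whether or not it is appropriate here.
print(select_schema({"driving", "flashing_amber"}, schemata)["name"])
```

In the ambiguous case the habitual "accelerate" schema wins on frequency alone, which is precisely the failure mode the next paragraph names.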
Mostly, frequency gambling produces positive results, but occasionally it results in errors. When frequency gambling fails, we get what Reason [1] called a "strong but wrong" error: we do something we have done many times before, rather than what we should do.

b. The Attentional Subsystem

The second subsystem in Figure 1 is the attentional subsystem [1]. Where the automatic subsystem is simple, relying largely on pattern recognition and frequency gambling, the attentional subsystem has powerful logical capabilities. Unfortunately, attentional processing is also "limited, sequential, slow, effortful, and difficult to sustain for more than brief periods" [1]. Its resources are very limited: if we wish to memorize things, we can remember only about seven items [Miller, 1956], and if we are solving a logic problem, the resources we can allocate to memory are even more limited. These limitations can easily lead to errors in memory or logical analysis (Raymond Panko). Even with logical attentional thinking, there appears to be schematic organization, just as there is in the automatic subsystem. In fact, it is likely that schemata in the attentional subsystem are really schemata from the automatic subsystem that rise to the attentional level [Baars, 1992b]. One aspect of schemata in the attentional subsystem is people's reliance on "lay theories" (habitual informal theories) when they deal with technical areas such as physics [Furnham, 1988; Owen, 1986; Resnick, 1983]. Even after receiving training in specific areas such as physics, people often revert to lay theories afterwards [Resnick, 1983]. Lay theories are schemata that we have developed over many years, and they are very likely to produce errors when we model situations.

c. The Environment

The third subsystem in Figure 1 is the environment. Human cognition is not merely internal; it is situated in the environment surrounding it.
Neisser [1976] argued that schemata direct our exploration of the environment. Exploration samples specific aspects of the vast information surrounding us, and this, in turn, modifies our schemata.

d. Interaction between the subsystems and the environment

The attentional subsystem holds goals. These goals influence the activation of automatic-subsystem nodes and so at least partially control the automatic subsystem; if the attentional subsystem loses its goal, the entire cognitive system is likely to err. Baars [1992b] calls this aspect of the attentional subsystem the Global Workspace, after artificial intelligence theories. The automatic subsystem can also initiate action. If there is an error, for instance, the automatic subsystem can interrupt the attentional subsystem and demand attention [49]. In this way, a schema in the automatic subsystem can influence or even grab control of the Global Workspace, so that it can influence the activation of other schemata. Sellen and Norman [1992] emphasize that interaction with the environment takes place continuously as we plan and execute an action. First, we form a high-level intention. When we execute it, we constantly adjust our action through feedback from the environment, and in most cases this adjustment takes place without burdening our limited attentional system.

e. Error Correction and Error Rates

Given the way human cognition operates, errors are inevitable, which makes the detection and correction of errors critical. Several studies have used protocol analysis, in which the subject voices what he or she is thinking during the process, to observe how people do cognitive work. Wood [37] used this technique to study statistical problem solving; Hayes and Flower [1980] used it to study writing. Both studies found that as subjects worked, they frequently stopped to check for errors. In some cases, error detection happened immediately after an error or the suspicion of an error.
In other cases, it was done systematically. Kellogg [1994] describes several studies that measured time spent in writing; these found that about a fifth of all writing time is spent reviewing what has already been written. Although the emerging model of human error is compelling and fits a broad spectrum of research in a qualitative fashion, it cannot, at this stage, predict error commission rates or error correction rates. Error rates are still a matter for empiricism. Most of the available error rates are for mechanical errors, for which a good general figure appears to be about 0.5%.

f. Limitations of human behaviour

The modern working environment is very different from the settings that humans evolved to deal with, and this exposes the limitations of humans interacting with the working environment. These limitations include:
Attention: the modern workplace can overload human attention with enormous amounts of information, far in excess of that encountered in the natural world. The way in which we learn information can help reduce demands on our attention, but can sometimes create further problems (e.g. the Automatic Warning System on UK trains).

Perception: in order to interact safely with the world, we must correctly perceive it and the dangers it holds. Work environments often challenge human perceptual systems, and information can be misinterpreted.

Memory: our capacity for remembering things, and the methods we impose upon ourselves to access information, often put undue pressure on us. Increasing knowledge about a subject or process allows us to retain more information relating to it.

Logical reasoning: failures in reasoning and decision making can have severe implications for complex systems such as chemical plants, and for tasks like maintenance and planning.

B. Classifying Human Failures

Two kinds of categorization are used: a classification by consequences, and a classification by psychological origins. In this study the psychological classification, which focuses on the mental antecedents of the error, is highlighted. Three distinctions are important:

1) Slips and lapses versus mistakes
2) Errors versus violations
3) Active versus latent failures

1) Slips and lapses versus mistakes

All errors involve some kind of deviation, and there are basically two ways in which this failure can occur. The plan may be adequate while the associated actions do not go as intended; these are failures of execution, commonly termed slips and lapses. Slips relate to observable actions and are associated with attentional failures; lapses are more internal events and relate to failures of memory. Alternatively, the actions may go entirely as planned while the plan itself is inadequate to achieve its intended outcome; these are failures of intention, termed mistakes.
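The two-way distinction just drawn reduces to a small decision rule, which can be sketched as a classifier for incident coding. The category names follow the text; the function itself is an illustrative assumption, not a procedure from the source.

```python
def classify_failure(plan_adequate: bool, executed_as_intended: bool,
                     observable_action: bool = True) -> str:
    """Classify a human failure using the slips/lapses vs mistakes
    distinction: bad plan -> mistake; good plan, bad execution ->
    slip (observable) or lapse (internal, e.g. memory)."""
    if not plan_adequate:
        # Actions went as intended, but the plan was wrong.
        return "mistake"
    if not executed_as_intended:
        return "slip" if observable_action else "lapse"
    return "correct performance"

# Forgetting a step (internal memory failure under an adequate plan):
print(classify_failure(True, False, observable_action=False))
```

The example prints "lapse": the plan was sound, execution failed, and the failure was internal rather than observable.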
Slips and lapses: In the case of slips, lapses, and fumbles, actions deviate from the current intention; the failure occurs at the level of execution. Slips and lapses occur during the largely automatic performance of some routine task, usually in familiar surroundings. They are almost invariably associated with some form of attentional capture, either distraction from the immediate surroundings or preoccupation, and they are also provoked by change, either in the current plan of action or in the immediate surroundings [1].

Mistakes: In the case of mistakes, the actions may go entirely as planned, but the plan itself deviates from some adequate path towards its intended goal. Here the failure lies at a higher level: with the mental processes involved in planning, formulating intentions, judging, and problem solving. Mistakes can begin to occur once a problem has been detected, a problem being anything that requires a change or alteration of the current plan. Mistakes can be further subdivided into two categories: rule-based mistakes and knowledge-based mistakes.

Rule-based mistakes: These occur in relation to familiar or trained-for problems. We are extremely good at making rapid and largely automatic assessments of complex situations by matching features of the world to patterns stored in long-term memory, but this process can go wrong in two ways: we can misapply a good rule (i.e. one that is frequently applicable) because we fail to notice the contraindications, or we can apply a bad rule that has remained uncorrected in our stored repertoire of problem solutions.

Knowledge-based mistakes: These occur when the person encounters a novel situation that lies outside the range of his or her stock of pre-packaged problem-solving routines. Under these conditions, people are forced to resort to slow, effortful, online reasoning. This process is extremely error-prone for several reasons.
First, our capacity for conscious thought is highly resource-limited; we can attend to and manipulate only one or two discrete items at any one time. Second, we have to rely upon a mental model (in the attentional subsystem) of the current situation that is nearly always incomplete and, in parts, incorrect. Third, in these circumstances we have a marked tendency to fixate upon a particular hunch or hypothesis and then select features of the world to support it, while neglecting contradictory evidence. This has been called confirmation bias or cognitive lock-up, and has frequently been observed in nuclear power plant operators and others during attempts to recover from an emergency [37].

2) Errors versus violations

Violations are deviations from safe operating practices, procedures, standards, or rules. Such deviations can be either deliberate or erroneous (e.g. speeding without being aware of either the speed or the restriction). Here, however, we are mostly interested in deliberate violations, where the actions (though not the possible bad consequences) were intended. Deliberate violations differ from errors in a number of important ways. Whereas errors arise primarily from informational problems (forgetting, inattention, incomplete knowledge, etc.), violations are more generally associated with motivational problems (low morale, poor supervisory examples, perceived lack of concern, failure to reward compliance and to sanction non-compliance, etc.). Errors can be explained by what goes on in the mind of an individual, but violations occur in a regulated social context. Errors can be reduced by improving the quality and delivery of the necessary information within the workplace; violations generally require motivational and organisational remedies.

3) Active versus latent failures

Active failures are unsafe acts (errors and violations) committed by those at the sharp end of the system (e.g. anaesthetists, surgeons, nurses).
They are the people whose actions can have immediate adverse consequences. Latent failures are created as the result of decisions taken at the higher echelons of the organization. Their damaging consequences may lie dormant for a long time, becoming evident only when they combine with active failures and local triggering factors to breach the system's many defences. This classification is formalized in the Human Factors Analysis and Classification System (HFACS), which was developed initially as a framework for understanding "human error" as a cause of aviation accidents (Shappell and Wiegmann, 2000; Wiegmann and Shappell, 2003). It is based on James Reason's Swiss cheese model of human error in complex systems. HFACS distinguishes
between the "active failures" of unsafe acts and the "latent failures" of preconditions for unsafe acts, unsafe supervision, and organizational influences. Unsafe acts are performed by the human operator "on the front line" (e.g., the pilot, the air traffic controller, the driver). Unsafe acts can be either errors (in perception, decision making, or skill-based performance) or violations (routine or exceptional). There are two types of precondition for unsafe acts: those that relate to the human operator's internal state and those that relate to the human operator's practices or ways of working. Adverse internal states include those related to physiology (e.g., illness) and mental state (e.g., being mentally fatigued or distracted). A third aspect of internal state is really a mismatch between the operator's ability and the task demands; for example, the operator may be unable to make visual judgements or react quickly enough to support the task at hand. Poor operator practices are another type of precondition for unsafe acts. The four types of unsafe supervision are: inadequate supervision; planned inappropriate operations; failure to correct a known problem; and supervisory violations. Organizational influences include those related to resource management (e.g., inadequate human or financial resources), organizational climate (structures, policies, and culture), and organizational processes (such as procedures, schedules, and oversight).

The Swiss cheese model of accident causation

The figure shows a trajectory of accident opportunity and its penetration through several types of defensive system. The combined chance of an accident occurring is very small, since the holes in the various defence systems must all line up. Some are active failures of human or mechanical performance, and others are latent conditions, such as management factors or poor system design.
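The multiplicative logic behind "the holes must all line up" can be illustrated numerically: if the defensive layers fail independently, an accident requires every layer to be breached, so the overall probability is the product of the per-layer breach probabilities. The layer values below are illustrative assumptions, and independence is a simplification.

```python
from math import prod  # Python 3.8+

def accident_probability(layer_breach_probs):
    """Probability that an accident trajectory penetrates every
    defensive layer, assuming independent per-layer breaches."""
    return prod(layer_breach_probs)

layers = [0.1, 0.05, 0.02, 0.01]   # four imperfect defences
print(accident_probability(layers))  # roughly 1e-06: holes rarely align

# Halving every gap cuts the overall risk sixteen-fold.
print(accident_probability([p / 2 for p in layers]))
```

The second call shows why narrowing each defensive gap, even modestly, compounds into a large reduction in overall accident probability.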
However, it is clear that if steps are taken in each case to reduce the defensive gaps, the overall chance of an accident will be greatly reduced [59].

C. Application of the cognitive approach to accident/incident reduction in the work environment

This is the part of error management that is most difficult to implement: changing the way a person thinks is not a simple exercise. We cannot change the human condition, but we can change the conditions under which humans work. In most safety-critical industries, a number of checks and controls are in place to minimise the chance of errors occurring, and for a disaster to occur there must be a conjunction of oversights and errors across all the different levels within an organisation. From the Swiss cheese model it is clear that the chances of an accident occurring can be made smaller by narrowing the windows of accident opportunity at each stage of the process. Factors such as training and competence assurance, management of fatigue-induced errors, and control of workload can eliminate some errors. But errors caused by human limitations and/or environmental unpredictability are best reduced by improving system interface design and safety culture. Hence safety culture and system design are the most important error-management tools for combating human error and addressing cognitive change in humans.

D. Safety Culture

Attributing accidents to human failures at the 'sharp end' of an industry may not provide a full picture of all the factors involved. The management of the organisation must also take responsibility for decisions that affect the safe functioning of the organisation as a whole (Health and Safety Executive, 1999). Unwise decisions at this level are more difficult to link directly to an accident, as they are often implemented well before an accident occurs and do not make their presence urgently felt.
Good decisions at this level can create a culture of safety, which can remove the precursor conditions for accidents or ameliorate their consequences. Safety culture is a term that was first introduced after the Chernobyl disaster in 1986. The safety culture of an organization is the product of the individual and group values, attitudes, competencies, and patterns of behaviour that determine the style and proficiency of an organization's health and safety programmes. A positive safety culture is one in which shared perceptions of the importance of safety and confidence in preventative measures are experienced at all levels of an organization. According to the Health and Safety Executive (HSE, the statutory body that ensures that risks to health and safety from work activities are properly controlled), factors that create this positive culture include: leadership and the commitment of the chief executive; a good line management system for managing safety; the involvement of all employees; effective communication and understood/agreed goals; good organizational learning and responsiveness to change; manifest attention to workplace safety and health; and a questioning attitude and a rigorous and prudent approach by all individuals. Trust is a key element of a reporting culture and this, in turn, requires the existence of a just culture: one possessing a collective understanding of where the line should be drawn between blameless and blameworthy actions.

E. System design

A good system should not allow people to make mistakes easily. System design is commonly carried out in the absence of feedback from potential users, which increases the chance that users will not be able to interact correctly with the system. A set of design principles has been proposed [21] which can minimize the potential for error. These principles are as follows:

a. Accurate mental models

There is often a discrepancy between the state of a system and the user's mental model of it.
This common cause of erroneous behaviour arises because the user's model of the system and the system itself will differ to some extent, since the user is rarely the designer of the system. Designers need to exploit the natural mappings between the system and the expectations and intentions of the user.
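One common form of this mental-model discrepancy is mode error: the same user action does different things depending on internal state the user cannot see. The following is a hypothetical sketch, not from the source, contrasting a hidden-mode device with one that feeds its state back to the user.

```python
class HiddenModeDevice:
    """Hypothetical device where one button toggles between modes,
    but the current mode is invisible to the user."""
    def __init__(self):
        self.mode = "standby"  # internal state the user cannot see

    def press(self) -> str:
        # The same press means "start" in standby and "stop" in run.
        self.mode = "run" if self.mode == "standby" else "standby"
        return self.mode

class VisibleModeDevice(HiddenModeDevice):
    """Design fix: announce every state change, so the user's
    mental model stays synchronized with the system's state."""
    def press(self) -> str:
        new_mode = super().press()
        print(f"MODE -> {new_mode.upper()}")
        return new_mode

d = VisibleModeDevice()
d.press()   # announces the transition to "run"
d.press()   # announces the transition back to "standby"
```

With the hidden-mode version, a user whose model lags by one press issues a "start" that actually stops the device; making the mode visible closes that gap.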
b. Managing information

As our brains are easily distracted and can overlook necessary tasks, it makes sense to put information in the environment that will help us carry out complex tasks. When under time pressure, technicians are likely to forget to perform tasks such as replacing nuts and bolts. A very simple solution to this problem would be to require technicians to carry a hand-held computer with an interactive maintenance checklist that specifically requires the technician to acknowledge that certain stages of the job have been completed. It could also provide information on task specifications if necessary, and would allow a reduction in paperwork and hence in time pressure.

c. Reducing complexity

Making the structure of tasks as simple as possible avoids overloading the psychological processes outlined previously. The more complex the task specifications, the more chances there are for human error.

d. Visibility

The user must be able to perceive what actions are possible in a system and, furthermore, which actions are desirable. This reduces the demands on mental resources when choosing between a range of possible actions. Perhaps even more important is good-quality feedback, which allows users to judge how effective their actions have been and what new state the system is in as a result of those actions.

e. Constraining behaviour

If a system could prevent a user from performing any action that could be dangerous, no accidents would occur. However, the real world is too complex an environment for such a simplistic solution: in an industrial operation, a procedure that is beneficial at one stage of the process may be disastrous at another. Nevertheless, it is possible to reduce human error by the careful application of forcing functions, setting one step after the other.

f. Design for errors

In safety-critical systems, such as nuclear power plants, numerous safety systems are in place which can mitigate accidents.
One approach is defence in depth (implementing many independent systems simultaneously); another is fail-safe system design. However, designers must assume that mistakes will occur, and so any useful system must make provision for recovery from these errors. A further consideration is that the design should make it difficult to enact non-reversible actions; although this is an underlying principle of design, it needs to be applied carefully.

g. Standardization

When systems are necessarily complex but have been made as accessible and easy to use as possible, and errors are still being made, standardization is sometimes used in an attempt to make the situation predictable. One problem with standardization is that if any advances in design or usage are made, it is a very costly process to re-implement standardization across all departments of an industry. Also, a standardized system may be ideal for one set of tasks but very inefficient for another. Such practical considerations have tended to limit the application of standardization as an approach to reducing human error.

h. User-centred design

Another basic principle of design is that it should be centred on the user at all stages, from initial conception, through evolution and testing, to implementation. In practice, however, system designers are often given a brief, create the system, and impose it upon the users without appropriate feedback. This can result in unexpected system behaviour and over-reliance on manuals which have themselves been written by the system designers from their own perspective. Systems designed in this way will be opaque to the end user, and this can hinder effective interaction.

IV. DISCUSSION

Human error is the causal or contributing factor in the majority of accidents. It is now well understood that these errors are the product of limitations in human information processing coupled with design features that are ill matched to human abilities.
This is especially true for highly automated environments in which robust perceptual-motor tasks have been largely replaced by more error-prone cognitive tasks. Attempts to reduce human error through improved design have met with only partial success, in part because the complexity of human cognitive behaviour has proven an obstacle to a useful theory of how errors are generated. The complexity of human cognition makes it difficult to measure and quantify important aspects of human behaviour, because of the variability inherent in complex performance. An improved understanding of how limitations in cognitive processing contribute to human error would facilitate the design of error-tolerant (or error-resistant) systems. Instances of error often reflect failures of executive control that result from limitations in human information processing. Failures of executive control lead to identifiable classes of memory errors, such as failures to remember intended actions (prospective-memory failures) and habit-capture errors. Failures of executive control are also associated with failures in routine monitoring, where observers will fixate an information source but not apply the executive control needed to process the information. In current theories of human cognition, executive control is associated with limited-capacity, attention-demanding mental processing. Common cognitive acts such as fetching items from memory, reading text, and solving problems require executive control, in contrast to early perceptual processes and low-level motor behaviours whose processing is independent of executive control. A better understanding of the relationship between executive control and the more autonomous information-gathering and motor behaviours would lead to significant advances in our understanding of human error. The cognitive variables of most interest are not directly observable.
This has made it difficult to relate theory to complex application domains where measurement techniques are too crude for the required inferences. The emerging model of cognition provides at least a partial model of the cognitive mechanisms underlying human thinking. Human error has been cited as a cause or contributing factor in many major disasters and accidents. It is also important to stress that "human error" mechanisms are the same as "human performance" mechanisms; performance later categorized as 'error' is so labelled in hindsight (Reason, 1991; Woods, 1990). Human error is part of the ordinary spectrum of behaviour. Recently, human error has been
re-conceptualized in terms of resilience, to emphasize the positive contributions that humans bring to the operation of technical systems [39] (Woods and Leveson, 2006). The most effective way to deal with error due to human behaviour and an unpredictable environment is through safety culture and favourable system design. The definition of safety culture suggested by the Health and Safety Commission is: "The safety culture of an organization is the product of the individual and group values, attitudes, competencies and patterns of behavior that determine the commitment to, and the style and proficiency of, an organization's health and safety programmes. Organizations with a positive safety culture are characterized by communications founded on mutual trust, by shared perceptions of the importance of safety, and by confidence in the efficacy of preventative measures." A positive safety culture implies that the whole is more than the sum of its parts: the different aspects interact to give added effect through a collective commitment. In a negative safety culture the opposite is the case, with the commitment of some individuals strangled by the cynicism of others. From various studies it is clear that certain factors appear to characterise organisations with a positive safety culture. These factors include: the importance of leadership and the commitment of the chief executive; the executive safety role of line management; the involvement of all employees; effective communications and commonly understood and agreed goals; good organisational learning and responsiveness to change; manifest attention to workplace safety and health; and a questioning attitude and a rigorous and prudent approach by all individuals. System design is also very effective in creating an environment where the chances of error are reduced.
The factors considered have already been explained in detail above; they are:
- accurate mental models
- managing information
- reducing complexity
- visibility
- constraining behavior
- design for errors
- standardization
- user-centred design
Designing for human error is a major challenge for developers of safety-critical and mission-critical systems. Most approaches to error-tolerant design either use general design guidelines or treat humans as just another error-prone system component. The cognitive approach to human error therefore highlights the need to recognize the human as a thinking individual rather than as an error-prone system component.
REFERENCES
[1] Reason J. Human error. New York: Cambridge University Press,
[2] Reason J. Combating omission errors through task analysis and good reminders, 2002
[3] Feyer, A.M. & Williamson, A.M. (1998): Human factors in accident modelling. In: Stellman, J.M. (Ed.), Encyclopaedia of Occupational Health and Safety, Fourth Edition. Geneva: International Labour Organisation.
[4] Institute of Medicine (2000): To err is human: Building a safer health system. Washington: National Academy Press.
[5] Reason, J. (1989): Human Error. Cambridge: CUP.
[6] Norman, D. (1988): The Psychology of Everyday Things. New York: Basic Books.
[7] Duncan, K. D. (1987). Fault diagnosis training for advanced continuous process installations. In: Rasmussen, J., Duncan, K., and Leplat, J. (Eds), New Technology and Human Error. Chichester: Wiley.
[8] Rasmussen, J. (1980). The human as a systems component. In: Smith, H.T. and Green, T.R.G. (Eds), Human Interaction with Computers. London: Academic Press.
[9] Health and Safety Executive (1999): Reducing error and influencing behaviour. London: HMSO.
[10] Guest, D.E., Peccei, R. & Thomas, A. (1994): Safety culture and safety performance: British Rail in the aftermath of the Clapham Junction disaster. Paper presented at the Bolton Business School conference on changing perceptions of risk, Bolton, February
[11] Lee, T. & Harrison, K. (2000): Assessing safety culture in nuclear power stations. Safety Science, 34:
[12] Leape, L. (1994): Error in medicine. Journal of the American Medical Association 272:
[13] Clarke, S. (1998): Organisational factors affecting the incident reporting of train drivers. Work & Stress 12:
[14] HSE/C (2001): Proposals for a new duty to investigate accidents, dangerous occurrences and disasters. Visited on 11/05/01.
[15] Davies, J.B., Wright, L., Courtney, E. & Reid, H. (2000): Confidential incident reporting on the UK railways: The CIRAS system. Cognition, Technology & Work 2:
[16] Quality Interagency Coordination Task Force (QuIC). Doing what counts for patient safety. Summary of the Report for the President of the Quality Interagency Coordination Task Force. Washington: QuIC,
[17] Amalberti R, Wioland L. Human error in aviation. In: Soekkha HM, ed. Aviation safety. Utrecht: VSP, 1997:
[18] de Leval M, Carthey J, Wright D, et al. Human factors and cardiac surgery: a multicentre study. J Thorac Cardiovasc Surg 2000; 119:
[19] Reason J. Managing the risks of organisational accidents. Aldershot, UK: Ashgate,
[20] Reason J. How necessary steps in a task get omitted: revising old ideas to combat a persistent problem. Cognitive Technol 1998; 3:
[21] Norman DA. The psychology of everyday things. New York: Basic Books,
[22] Baber C, Stanton NA. Task analysis for error identification: a methodology for designing error-tolerant consumer products. Ergonomics 1994; 11:
[23] Herrmann D. Super memory: a quick action program for memory improvement. Emmaus: Rodale Press,
[24] Herrmann D, Weingartner H, Searleman A, et al. Memory improvement: implications for memory theory. New York: Springer-Verlag,
[25] Hobbs AN. Human errors in context: a study of unsafe acts in aircraft maintenance. PhD Thesis, University of New South Wales,
[26] Institute of Nuclear Power Operations (INPO). An analysis of root causes in 1983 and 1984 significant event reports. Atlanta: Institute of Nuclear Power Operations,
[27] Parliamentary Office of Science and Technology. Note number 156, June 2001.
[28] Reason J. Safety in the operating theatre - Part 2: Human error and organisational failure.
[29] Gaba DM. Human error in dynamic medical domains. In: Bogner MS, ed. Human errors in medicine. Hillsdale, NJ: Erlbaum,
[30] Wilson M. Errors and accidents in anaesthetics. In: Vincent C, Ennis M, Audley R, eds. Medical accidents. Oxford: Oxford University Press, 1993:
[31] Williamson JA, Webb RK, Sellen A, et al. Human failure: an analysis of 2000 incident reports. Anaesth Intens Care 1993; 21:
[32] Weir PM, Wilson ME. Are you getting the message? A look at the communication between the Department of Health, manufacturers and anaesthetists. Anaesthesia 1991; 46:
[33] Mayor AH, Eaton JM. Anaesthetic machine checking practices: a survey. Anaesthesia 1992; 47:
[34] Cooper JB, Cullen DJ, Nemeskal R, et al. Effects of information feedback and pulse oximetry on the incidence of anaesthesia complications. Anesthesiology 1987; 67:
[35] Chopra V, Bovill JG, Spierdijk J, et al. Reported significant observations during anaesthesia: a prospective analysis over an 18-month period. Br J Anaesth 1992; 68:
[36] Senders JW, Moray NP. Human error: cause, prediction and reduction. Hillsdale, NJ: Erlbaum,
[37] Woods DD. Some results on operator performance in emergency events. Institute of Chemical Engineers Symposium Series 1984; 90:
[38] Sheen, Mr Justice. MV Herald of Free Enterprise. Report of Court No. Formal Investigation. London: Department of Transport,
[39] Hollnagel E. Human reliability analysis: context and control. London: Academic Press,
[40] Cook RI, Woods DD. Operating at the sharp end: the complexity of human error. In: Bogner MS, ed. Human errors in medicine. Hillsdale, NJ: Erlbaum,
[41] Reason J. The mariner's guide to human error. London: Shell International Tankers,
[42] Fischhoff B. For those condemned to study the past: heuristics and biases in hindsight. In: Kahneman D, Slovic P, Tversky A, eds. Judgment under uncertainty: heuristics and biases. New York: Cambridge University Press, 1982:
[43] Caplan RA, Posner KL, Cheney FW. Effect of outcome on physician's judgments of appropriateness of care. JAMA 1991; 265:
[44] Bacon, Sir F. The New Organon. In: Anderson F, ed. Indianapolis: Bobbs-Merrill, 1960 (originally published 1620).
[45] Runciman WB, Webb RK, Lee R, et al. System failure: an analysis of 2000 incident reports. Anaesth Intens Care 1993; 21:
[46] Eagle CJ, Davies JM, Reason JT. Accident analysis of large scale technological disasters applied to an anaesthetic complication. Can J Anaesth 1992; 39:
[47] Helmreich RL, Schaefer H-G. Team performance in the operating room. In: Bogner MS, ed. Human errors in medicine. Hillsdale, NJ: Erlbaum,
[48] Woods DD, Johannesen JJ, Cook RI, et al. Behind human error: cognitive systems, computers, and hindsight. CSERIAC State-of-the-Art Report. Wright-Patterson Air Force Base, OH: Crew Systems Ergonomics Information Analysis Center,
[49] Currie M, Mackay P, Morgan C, et al. The wrong drug problem in anaesthesia: an analysis of 2000 incident reports. Anaesth Intens Care 1993; 21:
[50] Stebbing, C., Wong, I. C. K., Kaushal, R., Jaffe, A. (2007). The role of communication in paediatric drug safety. Arch. Dis. Child. 92:
[51] Hirose, M., Regenbogen, S. E., Lipsitz, S., Imanaka, Y., Ishizaki, T., Sekimoto, M., Oh, E.-H., Gawande, A. A. (2007). Lag time in an incident reporting system at a university hospital in Japan. Qual Saf Health Care 16:
[52] Healey, A. N., Primus, C. P., Koutantji, M. (2007). Quantifying distraction and interruption in urological surgery. Qual Saf Health Care 16:
[53] Garbutt, J., Brownstein, D. R., Klein, E. J., Waterman, A., Krauss, M. J., Marcuse, E. K., Hazel, E., Dunagan, Wm. C., Fraser, V., Gallagher, T. H. (2007). Reporting and disclosing medical errors: Pediatricians' attitudes and behaviors. Arch Pediatr Adolesc Med 161:
[54] Waring, J. J. (2007). Doctors' thinking about 'the system' as a threat to patient safety. Health (London) 11:
[55] Galbraith, R. M., Holtman, M. C., Clyman, S. G. (2006). Use of assessment to reinforce patient safety as a habit. Qual Saf Health Care 15: i30-i33
[56] Reiling, J. (2006). Safe design of healthcare facilities. Qual Saf Health Care 15: i34-i40
[57] Lowe, C. M. (2006). Accidents waiting to happen: the contribution of latent conditions to patient safety. Qual Saf Health Care 15: i72-i75
[58] Jackson, C. R., Gibbin, K. P. (2006). 'Per ardua...' Training tomorrow's surgeons using inter alia lessons from aviation. J. R. Soc. Med. 99:
[59] Reason J. Human error: models and management. BMJ March 18; 320(7237):
[60] Shojania, K. G., Fletcher, K. E., Saint, S. (2006). Graduate medical education and patient safety: a busy--and occasionally hazardous--intersection. Ann Intern Med 145:
[61] Langford, N. J. (2006). e-learning and error. Qual Saf Health Care 15:
[62] Johnstone, M.-J., Kanitsaki, O. (2006). Culture, language, and patient safety: making the link. Int J Qual Health Care 18:
[63] Ghaleb, M. A., Barber, N., Franklin, B. D., Yeung, V. W., Khaki, Z. F., Wong, I. C. (2006). Systematic review of medication errors in pediatric patients. The Annals of Pharmacotherapy 40:
[64] McLoughlin, V., Millar, J., Mattke, S., Franca, M., Jonsson, P. M., Somekh, D., Bates, D. (2006). Selecting indicators for patient safety at the health system level in OECD countries. Int J Qual Health Care 18:
[65] Crone, K. G., Muraski, M. B., Skeel, J. D., Love-Gregory, L., Ladenson, J. H., Gronowski, A. M. (2006). Between a rock and a hard place: Disclosing medical errors. Clin. Chem. 52:
[66] Goodyear, M. D. E. (2006). Further lessons from the TGN1412 tragedy. BMJ 333:
Improved Event Logging for Security and Forensics: developing audit management infrastructure requirements
Improved Event Logging for Security and Forensics: developing audit management infrastructure requirements Atif Ahmad & Anthonie Ruighaver University of Melbourne, Australia Abstract The design and implementation
Measurement Information Model
mcgarry02.qxd 9/7/01 1:27 PM Page 13 2 Information Model This chapter describes one of the fundamental measurement concepts of Practical Software, the Information Model. The Information Model provides
Section A: Introduction, Definitions and Principles of Infrastructure Resilience
Section A: Introduction, Definitions and Principles of Infrastructure Resilience A1. This section introduces infrastructure resilience, sets out the background and provides definitions. Introduction Purpose
