COGNITIVE SYSTEMS ENGINEERING LABORATORY

Procrustes' bed and HMI design

Erik Hollnagel
Cognitive Systems Engineering Laboratory
Department of Computer and Information Science
University of Linköping, Sweden
E-mail: eriho@ida.liu.se

One size fits all

Procrustes, whose name means "he who stretches", kept a house by the side of the road where he offered hospitality to passing strangers, who were invited in for a meal and a night's rest. As soon as the guest lay down, Procrustes went to work upon him, stretching him on the rack if he was too short for the bed and chopping off his legs if he was too long.

Taylor, F. V. & Garvey, W. D. (1959). The limitations of a 'Procrustean' approach to the optimization of man-machine systems. Ergonomics, 2, 187-194.
NASA: 612 shuttle incidents (1990 X - 1993 IV)

Human error: 66%
Other: 21% (e.g. lightning strikes, communication breakdown, poor training)
Equipment failure: 8%
Faulty procedures: 5%

Main causes of maritime accidents (1987-1991)

Officer error: 27%
Equipment and mechanical error: 18%
Other: 18%
Crew error: 16%
Structural error: 12%
Shore control error: 10%
Pilot error: 7%
Reactions to accidents / incidents

An unwanted event may be classified as a rare event, an act of God, or traced to an agreed cause. The classification of the cause affects the selection of the response:

Do nothing
Replacement: identical module, improved module
Barriers: soft barriers (rules, procedures, roles), hard barriers (alarms, interlocks)
Redesign: interface design, system design, operational support
Elimination: task design & allocation, fault tolerant system

Dependability

In order for a system to be useful, it must be dependable (reliable). A system is dependable if it correctly performs a required activity in a required time period (if time is a limiting factor) and does nothing that can degrade the system.

Dependability is an issue for single objects and complex systems alike. A system is a set of objects together with relationships between the objects and between their attributes, i.e., anything that consists of parts connected together.

A system is dependable if:
The components (parts and subsystems) are dependable
The interactions/couplings between parts are dependable
Car engines - complex but dependable

A modern car engine is a complex, but highly dependable, system consisting of many, many parts. The components and their interactions are designed and engineered to a high degree of precision.

Technological versus joint systems

Technological systems (user independent):
Design (specification) of components: feasible to a very high degree
Design (specification) of interaction: feasible to a very high degree

Joint cognitive systems (HMS) (user dependent):
Design (specification) of components: feasible for technology, impractical for humans
Design (specification) of interaction: only possible to a limited extent
Coping with complex systems

Unexpected changes in the work environment:
Failures of technical components & systems
Events in the social system (team, other parts of the organisation)
Abrupt changes in the physical environment (acts of nature)

Constraining variability:
Simplify control actions (number, time, magnitude, reversibility)
Impose constraints on others (team members, organisation)
Limit demands by coping (simplification, distribution)
Narrow the scope of control (temporal, system range)
Maintain a good balance between feedforward and feedback

The forced automaton metaphor

If machines are to function properly, users must provide responses that fall within a pre-defined set of allowed or valid responses. Machines have limited powers of perception and interpretation; they are not able to generalise or go beyond the information given.

In order to achieve that, it is necessary that the designer forces the user to function as a finite state automaton. We therefore also have to think of the user in terms of a finite automaton.
The finite state automaton

[Diagram: automaton A with inputs i1, i2, ..., im, internal states s1, s2, ..., and outputs o1, o2, ..., on]

A = (I, O, S, λ, δ)
I is a finite set (of inputs)
O is a finite set (of outputs)
S is a finite set (of internal states)
λ: S × I → S is the next-state function
δ: S × I → O is the next-output function

Control system, process & operator

[Diagram: the process, the control system (shown as a finite state automaton with inputs i1...im, states s1, s2, ..., and outputs o1...on), and the operator, coupled through their respective inputs and outputs.]
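The tuple A = (I, O, S, λ, δ) above can be turned into a few lines of code. The following sketch is purely illustrative (the slides give no implementation); the toggle-switch transition tables are invented for demonstration:

```python
# Illustrative sketch of A = (I, O, S, lambda, delta) as a Mealy machine.
# The class and the toggle example are invented, not from the slides.

class MealyMachine:
    def __init__(self, next_state, output, initial_state):
        self.next_state = next_state   # lambda: S x I -> S
        self.output = output           # delta: S x I -> O
        self.state = initial_state

    def step(self, symbol):
        """Consume one input symbol, emit one output, update the state."""
        out = self.output[(self.state, symbol)]
        self.state = self.next_state[(self.state, symbol)]
        return out

# Toggle automaton: S = {"off", "on"}, I = {"press"}, O = {"lit", "dark"}
next_state = {("off", "press"): "on", ("on", "press"): "off"}
output     = {("off", "press"): "lit", ("on", "press"): "dark"}

m = MealyMachine(next_state, output, "off")
print([m.step("press") for _ in range(3)])   # ['lit', 'dark', 'lit']
```

Note that the machine only accepts symbols it has a table entry for, which is exactly the "forced automaton" point: anything outside the pre-defined input set is simply not a valid move.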
Restricting user interaction

[Diagram: the machine maps the possible range of input and output onto a narrower valid range of input and output.]

Miller (1953): Man-Machine Task Analysis

1. Specify the man-machine system criterion output.
2. Determine the system functions (variables in the machine's output which are necessary and sufficient to control the quality of the overall output).
3. Trace each system function to the machine input or control established for the operator to activate.
4. For each function, determine what information is displayed by the machine whereby the operator is directed to appropriate control activation (or monitoring) for that function.
5. Determine what indications of response adequacy in the control of each function will be fed back to the operator.
6. Determine what information will be available and necessary to the operator from the man-machine "environment."
7. Determine what functions of the system must be modulated by the operator at or about the same time, or in close sequence, or in cycles (tasks).
8. In reviewing the analysis, be sure that each stimulus is linked to a response and that each response is linked to a stimulus.
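The "valid range of input" idea is what input validation implements in practice. A minimal sketch, with invented command names (the slides name no concrete interface):

```python
# Illustrative sketch: the machine accepts only inputs from a
# pre-defined valid set; everything outside it is rejected.
# VALID_COMMANDS and the command names are assumptions.

VALID_COMMANDS = {"start", "stop", "reset"}

def accept(command: str) -> str:
    """Return the command if it lies in the valid range, else reject it."""
    if command not in VALID_COMMANDS:
        raise ValueError(f"invalid input: {command!r}")
    return command

accept("start")      # accepted
# accept("reboot")   # would raise ValueError: outside the valid range
```

The design question the slides raise is whether such restriction should be done by rejection (as here) or by facilitation, i.e. by making only valid actions available in the first place.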
Compliance as a solution?

[Diagram: need of compliance (LO to HI) plotted against level of risk in operation (LO to HI). High risk, high compliance: space missions, power generation, military, transportation. Low risk, low compliance: home, public service, commerce, leisure.]

Compliance techniques: standardisation :: procedures / regulations :: formal methods :: interface / interaction design

Absolute compliance is impossible.

Demand-capacity matching

When there is a mismatch between task demand and user capacity, one can either supplement the user by automation or redesign the tasks to reduce demands.

Tasks are often designed so that the minimum demands require maximum capacity. Tasks should instead be designed so that the maximum demands can be met with normal / minimum capacity.

[Diagram: bars comparing task demand with user capacity in the two cases.]
Automation as a solution?

Technocentric view vs. cognitive engineering view:

Humans are a major source of failure and should therefore be designed out of the system. / Humans are adaptive - and can recover from unexpected situations.

Automatic control systems are more rigid, and therefore more reliable. / Automation relies on software that is often not reliable, even when only moderately complex.

Automation permits a system to function when human capability has been exhausted. / Automation is always incomplete, hence requires humans as back-up when the system fails.

Automation is cost-effective because it reduces the skill requirements to operators. / Only true for routine operations; operators must monitor the automation, as an extra task.

Ironies of automation

L. Bainbridge (1987), Ironies of automation. The basic view is that the human operator is unreliable and inefficient, and therefore should be eliminated from the system.

First irony: Designer errors can be a major source of operating problems.

Second irony: The designer, who tries to eliminate the operator, still leaves the operator to do the tasks which the designer cannot think how to automate.
Design objectives

The objective of system design is not to optimise human-machine interaction per se, but rather to ensure that the joint system can control the situation - and itself.

It is not an objective to eliminate errors at any cost. It is more important to understand the nature of the actions that lead to unwanted system conditions. The requirements must reflect actual user needs.

Focus on human-machine co-operation, rather than on interaction design and interface details. The purpose is to enable the joint system to achieve its goals, not to enhance interface usability.

Maintaining control

Being in control of the situation means: knowing what has happened, knowing what will happen, and having the capacity to evaluate and plan.

What causes the loss of control?
Unexpected events
Acute time pressure
Not knowing what happens
Not knowing what to do
Not having the necessary resources

What can help maintain or regain control?
Sufficient time
Good predictions of future events
Reduced task load
Clear alternatives or procedures
Basic cyclical model

[Diagram: the controller's construct determines the action; the action produces changes in the process / application / controlled system; events and feedback from the process - together with disturbances - in turn modify the construct.]

Human/system control modes

From complete control to no control:
STRATEGIC: Well-planned, highly organised performance, high reliability
TACTICAL: Well-organised performance but limited planning, good reliability
OPPORTUNISTIC: Loosely organised performance, scanty planning, limited chance of success
SCRAMBLED: Disorganised performance, failures very likely
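The cyclical model can be sketched as a minimal simulation loop. This is an illustration only: the slides define no equations, so the process model, the construct update, the control gain (0.5) and the disturbance range are all invented placeholders.

```python
# Minimal sketch of the cyclical model: the construct determines the
# action, the action changes the process, and events/feedback (plus
# disturbances) modify the construct. All numbers are invented.
import random

def run(steps=5, target=10.0, seed=0):
    rng = random.Random(seed)          # fixed seed: reproducible run
    process = 0.0                      # state of the controlled system
    construct = 0.0                    # controller's current understanding
    for _ in range(steps):
        action = 0.5 * (target - construct)   # construct determines action
        disturbance = rng.uniform(-0.5, 0.5)  # disturbances modify process
        process += action + disturbance       # action produces change
        feedback = process                    # events / feedback ...
        construct = feedback                  # ... modify the construct
    return process

print(run(steps=50))   # settles near the target despite disturbances
```

With enough cycles the process settles near the target even under noise, which is the point of the loop: control is maintained by continually revising the construct, not by a one-shot plan.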
Control mode dependencies

[Diagram: control mode as a function of subjectively available time (low to high) and situation regularity (low to high): scrambled at very low available time, then opportunistic, tactical (attended), tactical (unattended), and strategic at high available time and high regularity.]

Time needed to assess situation

[Diagram: the control cycle under time pressure. Events (disturbance, interference) trigger evaluating / assessing the situation, which takes time and is affected by aging of information and delays in feedback / reactions. Forming an intention and choosing what to do takes the time needed to choose an action; carrying out the action must fit within the time available (the time window), which depends on the rate of change of the process (its stability). The activity as a whole is time limited.]
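The dependency on subjectively available time can be caricatured as a simple classifier. This is a sketch only: the model itself prescribes no numeric cut-offs, so the ratio thresholds below are invented for illustration, and regularity is ignored.

```python
# Illustrative sketch: picking a control mode from the ratio of
# subjectively available time to the time needed to evaluate, choose
# and act. The thresholds are invented; the diagram gives no numbers.

def control_mode(time_available: float, time_needed: float) -> str:
    if time_needed <= 0:
        raise ValueError("time_needed must be positive")
    ratio = time_available / time_needed
    if ratio < 0.5:
        return "scrambled"       # no real chance to evaluate or plan
    if ratio < 1.0:
        return "opportunistic"   # scanty planning, driven by salient cues
    if ratio < 2.0:
        return "tactical"        # limited planning, known procedures
    return "strategic"           # ample time for well-planned performance

print(control_mode(30.0, 10.0))   # strategic
```

The useful intuition the sketch preserves is that the same task can be performed in any mode: what changes the mode is the time budget, not the task.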
The pace of work

When humans work with machines, the pace is set by the maximum speed of the machine, and humans must struggle to keep pace. When humans work with humans, a natural pace develops: everyone works equally slowly or fast.

Time to think or time to do?

THINK! Work is carefully planned and monitored. Demands match capacity. Control is kept.
DO! Work is paced by technology and external events. Demands exceed capacity. It is easy to lose control.

Efficient performance requires a balance between thinking and doing.
Maintaining the balance

Too much feedback: lagging behind; shortage of time; decision-making is embedded in compensatory actions.
Too much feedforward: model dependency; uncertainty of outcomes; decision-making is an explicit part of planning & scheduling.

Efficient performance requires a balance between feedback and feedforward. Decisions buy time but also take time.

Ways of making HMI reliable

Limit the possibilities for action. Use facilitators rather than restraints.
Build in barriers and interlocks. Protect against unsafe actions.
Make sure messages are understandable. Provide unambiguous feedback: feedback for actions, clear indications of system state/mode.
Include possibilities to undo actions. Make the system robust and resilient.
Use error-correcting algorithms. Practical only for very constrained environments.
Use plan / intent recognition. Not yet practical.
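The "possibility to undo actions" listed above is commonly implemented with a command stack: every action records how to reverse itself. A minimal sketch, with invented names and an integer state standing in for the real system:

```python
# Illustrative sketch of undoable actions: each applied action pushes
# its own reversal onto a stack, so the interface can always offer
# "undo". The class and the integer state are invented for demonstration.

class UndoableSystem:
    def __init__(self):
        self.value = 0
        self._undo_stack = []

    def apply(self, delta: int):
        """Perform an action and remember how to reverse it."""
        self.value += delta
        self._undo_stack.append(lambda: self._revert(delta))

    def _revert(self, delta: int):
        self.value -= delta

    def undo(self):
        """Reverse the most recent action, if any."""
        if not self._undo_stack:
            raise RuntimeError("nothing to undo")
        self._undo_stack.pop()()

s = UndoableSystem()
s.apply(5)
s.apply(3)
s.undo()
print(s.value)   # 5
```

The design choice worth noting: undo only works for actions whose effects are reversible inside the system; it is a soft barrier, not a substitute for interlocks on genuinely irreversible actions.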
Foundations of efficient joint systems

Users understand the nature of the process:
A good mental model - acquired through training and experience
The actual interface is more powerful than formal training

System observability is high:
Adequate feedback from actions
Easy to navigate, locate, and differentiate information

System predictability is high:
Transparent technology - no automation surprises!
High correspondence between the surface system representation and the real system

Efficient performance requires all of these; inefficient performance results if just one is missing.

Conclusions

Safety is not just a question of fallible humans corrupting perfect computing artefacts. Safety cannot be ensured by enforced compliance or automation.

We need better ways to describe, analyse and assess the safety of joint human-machine systems. The functioning of a system cannot be considered separately from the role of humans (from developers to maintenance): human-and-computer must be seen as a single, joint system.

Efficient system performance requires local optimisation by humans (the Efficiency-Thoroughness Trade-Off) - during design, development, and operation.

No single discipline can provide a complete solution. More collaboration is needed between the cognitive ergonomics community and the safety & reliability community - and others.