Prokrustes säng och HMI-design (Procrustes' bed and HMI design)




COGNITIVE SYSTEMS ENGINEERING LABORATORY

Prokrustes säng och HMI-design (Procrustes' bed and HMI design)
Erik Hollnagel
Cognitive Systems Engineering Laboratory, Department of Computer and Information Science, University of Linköping, Sweden
E-mail: eriho@ida.liu.se

One size fits all
Procrustes, whose name means "he who stretches", kept a house by the side of the road where he offered hospitality to passing strangers, who were invited in for a meal and a night's rest. As soon as the guest lay down, Procrustes went to work upon him, stretching him on the rack if he was too short for the bed and chopping off his legs if he was too long.
Taylor, F. V. and Garvey, W. D. (1959). The limitations of a 'Procrustean' approach to the optimization of man-machine systems. Ergonomics, 2, 187-194.

NASA: 612 shuttle incidents (1990 X - 1993 IV)
Human error: 66%
Other (lightning strikes, communication breakdown, poor training): 21%
Equipment failure: 8%
Faulty procedures: 5%

Main causes of maritime accidents (1987-1991)
Officer error: 27%
Equipment and mechanical error: 18%
Other: 18%
Crew error: 16%
Structural error: 12%
Shore control error: 10%
Pilot error: 7%

Reactions to accidents / incidents
How the cause is classified (rare event, act of God, unwanted event, agreed cause) affects the selection of the response:
Do nothing
Replacement: identical module, improved module
Barriers: soft barriers (rules, procedures, roles, alarms), hard barriers (interlocks)
Redesign: interface design, system design, operational support, task design & allocation
Elimination: fault tolerant system

Dependability
In order for a system to be useful, it must be dependable (reliable). A system is dependable if it correctly performs a required activity in a required time period (if time is a limiting factor) and does nothing that can degrade the system. Dependability is an issue for single objects and complex systems alike. A system is a set of objects together with relationships between the objects and between their attributes, i.e., anything that consists of parts connected together. A system is dependable if:
The components (parts and subsystems) are dependable
The interactions/couplings between parts are dependable

Car engines - complex but dependable
A modern car engine is a complex, but highly dependable, system consisting of many, many parts. The components and their interactions are designed and engineered to a high degree of precision.

Technological versus joint systems
Technological systems (user independent): design (specification) of components is feasible to a very high degree; design (specification) of interaction is feasible to a very high degree.
Joint cognitive systems (HMS) (user dependent): design of components is feasible for the technology but impractical for humans; design of interaction is only possible to a limited extent.

Coping with complex systems
Sources of unexpected variability: failures of technical components & systems; events in the social system (team, other parts of the organisation); abrupt changes in the physical environment ("acts of nature"); unexpected changes in the work environment.
Ways of coping: maintaining a good balance between feedforward and feedback; constraining variability; simplifying control actions (number, time, magnitude, reversibility); imposing constraints on others (team members, organisation); limiting demands by coping (simplification, distribution); narrowing the scope of control (temporal, system range).

The forced automaton metaphor
If machines are to function properly, users must provide responses that fall within a pre-defined set of allowed or valid responses. Machines have limited powers of perception and interpretation; they are not able to generalise or go beyond the information given. In order to achieve that, it is necessary that the designer forces the user to function as a finite state automaton. We therefore also have to think of the user in terms of a finite automaton.

The finite state automaton
A = (I, O, S, λ, δ)
I is a finite set (of inputs)
O is a finite set (of outputs)
S is a finite set (of internal states)
λ: S × I → S is the next-state function
δ: S × I → O is the next-output function

Control system, process & operator
[Diagram: the process, the control system (modelled as a finite state automaton), and the operator form nested loops: the process output is the control system's input, the control system's output is the operator's input, and the operator's output feeds back through the control system to the process input.]
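The quintuple A = (I, O, S, λ, δ) maps directly onto code. A minimal sketch in Python; the two-state toggle machine at the end is an invented example, not from the slides:

```python
# Minimal finite state automaton A = (I, O, S, lambda, delta).
class FiniteStateAutomaton:
    def __init__(self, inputs, outputs, states, next_state, next_output, initial):
        self.I, self.O, self.S = inputs, outputs, states
        self.next_state = next_state    # lambda: S x I -> S
        self.next_output = next_output  # delta: S x I -> O
        self.state = initial

    def step(self, symbol):
        # Input outside the valid set I is rejected - the machine cannot
        # interpret anything beyond its pre-defined range.
        assert symbol in self.I, "input outside the valid range"
        out = self.next_output[(self.state, symbol)]
        self.state = self.next_state[(self.state, symbol)]
        return out

# Invented example: a two-state toggle that reports its new state.
toggle = FiniteStateAutomaton(
    inputs={"press"}, outputs={"on", "off"}, states={"on", "off"},
    next_state={("off", "press"): "on", ("on", "press"): "off"},
    next_output={("off", "press"): "on", ("on", "press"): "off"},
    initial="off",
)
print(toggle.step("press"))  # on
print(toggle.step("press"))  # off
```

Note that both λ and δ are total over S × I only; anything outside I is simply invalid, which is exactly the restriction the "forced automaton" metaphor imposes on the user.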

Restricting user interaction
[Diagram: the machine's valid range of input and output is a subset of the possible range of input and output.]

Miller (1953): Man-Machine Task Analysis
1. Specify the man-machine system criterion output.
2. Determine the system functions (variables in the machine's output which are necessary and sufficient to control the quality of the overall output).
3. Trace each system function to the machine input or control established for the operator to activate.
4. For each function, determine what information is displayed by the machine whereby the operator is directed to appropriate control activation (or monitoring) for that function.
5. Determine what indications of response adequacy in the control of each function will be fed back to the operator.
6. Determine what information will be available and necessary to the operator from the man-machine "environment."
7. Determine what functions of the system must be modulated by the operator at or about the same time, or in close sequence, or in cycles (tasks).
8. In reviewing the analysis, be sure that each stimulus is linked to a response and that each response is linked to a stimulus.
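In practice, restricting the user's possible range of input to the machine's valid range is plain input validation with unambiguous feedback. A hedged sketch; the command vocabulary is invented for illustration:

```python
# The machine's valid input range - a small subset of everything a user
# could possibly type. (Invented example vocabulary.)
VALID_COMMANDS = {"start", "stop", "increase", "decrease"}

def accept(command: str) -> str:
    """Map possible user input onto the valid range, or reject it clearly."""
    normalised = command.strip().lower()
    if normalised not in VALID_COMMANDS:
        # Unambiguous feedback rather than silent failure.
        raise ValueError(
            f"'{command}' is outside the valid input range {sorted(VALID_COMMANDS)}"
        )
    return normalised

print(accept("  START "))  # start
```

The normalisation step is a small "facilitator": it widens what the machine tolerates without widening what it must interpret.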

Compliance as a solution?
Compliance techniques: standardisation, procedures / regulations, formal methods, interface / interaction design.
[Diagram: the need for compliance grows with the level of risk of the operation - from leisure, commerce, public service and the home at the low end, through transportation, the military and power generation, to space missions at the high end.]
Absolute compliance is impossible.

Demand-capacity matching
There is a mismatch between task demand and user capacity. Tasks are often designed so that even the minimum demands require maximum capacity. Two remedies: supplement the user by automation, or redesign tasks to reduce demands. Tasks should be designed so that the maximum demands can be met with normal / minimum capacity.

Automation as a solution?
Technocentric view vs. cognitive engineering view:
Humans are a major source of failure and should therefore be designed out of the system. / Humans are adaptive - and can recover from unexpected situations.
Automatic control systems are more rigid, and therefore more reliable. / Automation relies on software that is often not reliable, even when only moderately complex.
Automation permits a system to function when human capability has been exhausted. / Automation is always incomplete, hence requires humans as back-up when the system fails.
Automation is cost-effective because it reduces skill requirements for operators. / Only true for routine operations; operators must monitor the automation, as an extra task.

Ironies of automation
L. Bainbridge (1987), Ironies of automation. The basic view is that the human operator is unreliable and inefficient, and therefore should be eliminated from the system.
First irony: designer errors can be a major source of operating problems.
Second irony: the designer, who tries to eliminate the operator, still leaves the operator to do the tasks which the designer cannot think how to automate.

Design objectives
The objective of system design is not to optimise human-machine interaction per se, but rather to ensure that the joint system can control the situation - and itself. It is not an objective to eliminate "errors" at any cost; it is more important to understand the nature of actions that lead to unwanted system conditions. The requirements must reflect actual user needs. Focus on human-machine co-operation rather than on interaction design and interface details. The purpose is to enable the joint system to achieve its goals, not to enhance interface usability.

Maintaining control
What causes the loss of control? Unexpected events; acute time pressure; not knowing what happens; not knowing what to do; not having the necessary resources.
What can help maintain or regain control? Sufficient time; good predictions of future events; reduced task load; clear alternatives or procedures.
Being in control of the situation means: knowing what has happened, knowing what will happen, and having the capacity to evaluate and plan.

Basic cyclical model
[Diagram: events / feedback from the process / application / controlled system modify the controller's construct; the construct determines the action; the action, together with disturbances, produces the next events in the controlled system.]

Human/system control modes
STRATEGIC (complete control): well-planned, highly organised performance, high reliability.
TACTICAL: well-organised performance but limited planning, good reliability.
OPPORTUNISTIC: loosely organised performance, scanty planning, limited chance of success.
SCRAMBLED (no control): disorganised performance, failures very likely.

Control mode dependencies
[Diagram: control mode as a function of situation regularity and subjectively available time. With very little available time, performance is scrambled; with somewhat more time it is opportunistic; with ample time it is tactical (attended or unattended, depending on situation regularity) or, when both time and regularity are high, strategic.]

Time needed to assess situation
[Diagram: the controller's cycle - evaluating / assessing the situation, choosing what to do (forming an intention), and carrying out the action - must fit within the time available (the time window). The timing is affected by delays in feedback / reactions, aging of information, disturbances and interference, and the rate of change of the process (its stability). Activity is thus time limited.]
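The dependency of control mode on subjectively available time and situation regularity can be sketched as a simple lookup. The slide gives only a qualitative picture, so the normalised scales and cut-off values below are invented for illustration, not part of the model:

```python
def control_mode(available_time: float, regularity: float) -> str:
    """Rough sketch of the control-mode dependency.

    available_time and regularity are normalised to 0..1.
    The thresholds are illustrative guesses only.
    """
    if available_time < 0.2:
        return "scrambled"       # no control, disorganised performance
    if available_time < 0.5:
        return "opportunistic"   # scanty planning, limited chance of success
    if regularity < 0.5:
        return "tactical"        # limited planning, good reliability
    return "strategic"           # well-planned, high reliability

print(control_mode(0.9, 0.9))  # strategic
print(control_mode(0.1, 0.9))  # scrambled
```

The point of the sketch is the ordering, not the numbers: losing time degrades control through tactical and opportunistic toward scrambled, and regaining time (plus a regular situation) restores it.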

The pace of work
When humans work with machines, the pace is set by the maximum speed of the machine; humans must therefore struggle to keep pace. When humans work with humans, a natural pace develops: everyone works equally slow or fast.

Time to think or time to do?
THINK! Work is carefully planned and monitored; demands match capacity; control is kept.
DO! Work is paced by technology and external events; demands exceed capacity; it is easy to lose control.
Efficient performance requires a balance between thinking and doing.

Maintaining the balance
Lagging behind, shortage of time: decision-making is embedded in compensatory actions. Model dependency, uncertainty of outcomes: decision-making is an explicit part of planning & scheduling. Efficient performance requires a balance between feedback and feedforward. Decisions buy time but also take time.

Ways of making HMI reliable
Limit the possibilities for actions: use facilitators rather than restraints.
Build in barriers and interlocks: protect against unsafe actions.
Make sure messages are understandable; provide unambiguous feedback: feedback for actions, clear indications of system state/mode.
Include possibilities to undo actions: make the system robust and resilient.
Use error-correcting algorithms: practical only for very constrained environments.
Use plan / intent recognition: not yet practical.
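Of the measures above, "include possibilities to undo actions" is the simplest to illustrate: keep a history of state changes so that an unintended action can be reversed. A minimal sketch; the setpoint example and names are invented:

```python
class UndoableSetpoint:
    """Keeps a history of setpoint changes so an unintended action
    can be reversed - a small contribution to robustness/resilience."""

    def __init__(self, value: float):
        self.value = value
        self._history = []  # stack of previous values

    def set(self, new_value: float) -> None:
        self._history.append(self.value)
        self.value = new_value

    def undo(self) -> None:
        if self._history:  # undoing with no history is a harmless no-op
            self.value = self._history.pop()

sp = UndoableSetpoint(50.0)
sp.set(75.0)   # the "unsafe" action
sp.undo()      # reversed
print(sp.value)  # 50.0
```

The same stack pattern generalises: as long as each action records enough state to restore, the interface never forces the user into an irreversible step.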

Foundations of efficient joint systems
Users understand the nature of the process: a good mental model, acquired through training and experience; the actual interface is more powerful than formal training.
System observability is high: adequate feedback from actions; easy to navigate, locate, and differentiate information.
System predictability is high: transparent technology - no automation surprises; high correspondence between the "surface" system representation and the "real" system.
Efficient performance requires all of these; inefficient performance results if just one is missing.

Conclusions
Safety is not just a question of fallible humans corrupting perfect computing artefacts. Safety cannot be ensured by enforced compliance or automation. We need better ways to describe, analyse and assess the safety of joint human-machine systems. The functioning of a system cannot be considered separately from the role of humans (from developers to maintenance). Human-and-computer must be seen as a single, joint system. Efficient system performance requires local optimisation by humans (Efficiency-Thoroughness Trade-Off) during design, development, and operation. No single discipline can provide a complete solution; more collaboration is needed between the cognitive ergonomics community and the safety & reliability community - and others.