Time to think and time to do? I can fail, and so can you! (Erik Hollnagel, 2003)



Page 1: _Time to Think and Time to Do Hollnagel 013


© Erik Hollnagel, 2003

CSELAB

COGNITIVE SYSTEMS ENGINEERING LABORATORY

Time to think and time to do?

I can fail, and so can you!

Erik Hollnagel

Dept. of Computer and Information Science

University of Linköping, SE-581 83 Linköping, Sweden
E-mail: eriho@ida.liu.se

Dedale, 15 Place de la Nation, F-75011 Paris, France
E-mail: ehollnagel@dedale.net


“To err is human, to forgive divine.”
An Essay on Criticism (1711) by Alexander Pope (1688-1744)

What is this thing called “error”?

It is one thing to show people they are in an error, and another to put them in possession of truth.
John Locke (1632-1704), An Essay Concerning Human Understanding, Bk. IV, Ch. 7

Page 2: _Time to Think and Time to Do Hollnagel 013


When things go wrong … there is a rush for explanations:
Organisational failure
“Act of god”
Human “causes”
Technical failure


Why do we look for “errors”?
Fundamental attribution error: actions are the result of dispositions.
Illusion of free will: people are free to choose their actions.
Magnitude bias: causes and consequences are of similar size.
Assumption: the source of error is the human factor. Analyse to find where a person is involved, and stop the analysis when one is found. This is a “safe bet”: all systems involve humans somewhere.

Page 3: _Time to Think and Time to Do Hollnagel 013


“Human error” – or what?
Infusion pump and parenteral delivery problems, Harvard Adverse Drug Event Study (Leape et al., 1995): 4% of the attributed errors were due to device use.
Human error in setting pumps, or an error in setting pumps induced by design (no feedback)?
Accidental tubing disconnections, or spontaneous tubing disconnections?
Confusion between central and peripheral lines, or no means to visually differentiate between central and peripheral lines?
“… error is the result of an alignment of conditions and occurrences each of which is necessary, but none alone sufficient to cause the error.” (Bogner, 1998)
An incident arises when an error-provoking condition meets an alignment of factors.


From reasoning to actions
“In all demonstrative sciences the rules are certain and infallible; but when we apply them, our fallible and uncertain faculties are very apt to depart from them, and fall into error.”
David Hume (1711-1776), A Treatise of Human Nature, Part IV, Section I.
Domino Theory of Accidents (Heinrich, 1931):
Industrial injuries result only from accidents.
Faults of persons are created by environment or acquired by inheritance.
Unsafe actions and conditions are caused only by faults of persons.
Accidents are caused directly only by (a) the unsafe acts of persons or (b) exposure to unsafe mechanical conditions.

Page 4: _Time to Think and Time to Do Hollnagel 013


The Domino theory - outside view
(Figure: the domino sequence: social environment / ancestry → fault of person → unsafe act → mechanical & physical hazards → accident → injury.)


The Domino theory - inside view

Page 5: _Time to Think and Time to Do Hollnagel 013


When things go right!
Road traffic: accident rate 0.80 per million vehicle-miles of travel per year on freeways, versus 2.9 per million vehicle-miles of travel per year on two-lane highways.
Aviation: (chart) major and serious accidents per million hours flown, 1983-2001.
Health care: admissions 36,500,000; “medical error” deaths 44,000.

Hypotheses about accidents and actions

Hypothesis #1: the actions leading to failure are different from the actions leading to success; accidents and normal outcomes stem from different kinds of actions. On this view, the study of failures cannot benefit from the study of successes.
Hypothesis #2: accidents and normal outcomes both stem from actions in general. On this view, the study of failures must be based on the study of successes.
“Knowledge and error flow from the same mental sources, only success can tell one from the other.” (Ernst Mach, 1838-1916)

Page 6: _Time to Think and Time to Do Hollnagel 013


What should we be looking for?
Human factors has tended to look for the negative aspects of performance: deviations or “errors”. But performance variations can be positive as well as negative!
(Figure: performance variability over time. On the negative side: unsafe act, near miss, incident, accident. On the positive side: shortcut, smart move, improvement, invention. In each case the question is: why?)


Why things work!
Optimistic assumption: things go right because systems are well designed and scrupulously maintained, procedures are complete and correct, people behave as they are expected to, and designers can foresee and anticipate every contingency. On this assumption, systems can be improved by restraining human variability.
Realistic assumption: things go right because people learn to overcome design flaws and functional glitches, interpret and apply procedures to match the conditions, adapt their performance to meet demands, and can detect and correct when things go wrong. On this assumption, systems can be improved by accommodating human variability.

Page 7: _Time to Think and Time to Do Hollnagel 013


TTT - things take time
(Figure: a timeline. The need to do something (the intention) arises; the time to think, made up of the time to evaluate the event and the time to select an action, is followed by the time to do. The time needed must fit within the time available, between the time when the action can be done and the time when it must be done, which defines the earliest and latest starting times and the earliest and latest finishing times.)


Everything happens in time
TE = time to evaluate the event
TS = time to select an action
TP = estimated performance time
TA = available time (context dependent)
TD = feedback time delay
(Figure: a cyclical model of control: an external event or disturbance produces events/feedback, delayed by TD, which update the current construct and lead to action.)
If (TE + TS) exceeds TA, then the operator will lag behind the process and may gradually lose control.
If (TE + TS) is less than TA, then the operator will be able to, e.g., refine the current understanding.
Is TP < TA? The level of control will vary depending on the performance conditions.
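To make these timing relations concrete, here is a minimal sketch in Python (my own illustration, not part of the original slides); the event names, numbers, and the helper assess_timing are assumptions made up for the example.

from dataclasses import dataclass

@dataclass
class Event:
    name: str
    t_evaluate: float   # TE: time to evaluate the event
    t_select: float     # TS: time to select an action
    t_perform: float    # TP: estimated performance time
    t_available: float  # TA: available time (context dependent)

def assess_timing(ev: Event) -> str:
    """Check the 'things take time' budget for a single event."""
    time_to_think = ev.t_evaluate + ev.t_select
    if time_to_think >= ev.t_available:
        return f"{ev.name}: TE + TS exceeds TA -> operator lags behind and may lose control"
    if time_to_think + ev.t_perform > ev.t_available:
        return f"{ev.name}: thinking fits, but TP does not -> action will be late or cut short"
    slack = ev.t_available - time_to_think - ev.t_perform
    return f"{ev.name}: fits with {slack:.1f} time units to spare -> can refine understanding"

if __name__ == "__main__":
    for ev in (Event("alarm A", 4, 3, 2, 12), Event("alarm B", 6, 5, 4, 10)):
        print(assess_timing(ev))

Running it prints which of the two invented alarms fits the time budget and which would leave the operator falling behind the process.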

Page 8: _Time to Think and Time to Do Hollnagel 013

Working to rule - design assumptions

System input is regular and predictable.
Demands and resources are compatible.
Working conditions fall within normal limits.
Other people behave as prescribed.
Output (actions) will comply with norms.
… so there is no need to make adjustments.


… but in reality
System input may be irregular and unpredictable.
Demands vary and resources may be inadequate.
Working conditions may at times be sub-optimal.
Other people behave egocentrically.
Output (actions) will vary considerably.
… so it is necessary to make local adjustments: the Efficiency-Thoroughness Trade-Off (ETTO).

Page 9: _Time to Think and Time to Do Hollnagel 013

ETTO: Efficiency-Thoroughness Trade-Off

People work between, on the one hand, mandatory checklists and procedures, rules, good practice and experience, and, on the other, conflicting demands, incomplete information and time pressure.
People invariably make performance adjustments, which are seen as effective and “intelligent”. Deviations are normally detected and recovered in time. Successful adjustments are used even when they should not be; in hindsight, this is called “error”.


Some ETTO heuristics
Individual (cognitive): judgment under uncertainty; cognitive primitives (SM – FG).
Individual (work related): looks fine; not really important; normally OK, no need to check; will be checked by someone else; has been checked by someone else; can’t remember how to do it; no time, no resources, do it later; worked last time.
Organisational: negative reporting; reduce redundancy; double-bind.
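Purely as an illustration of the trade-off (not something in the original presentation), the sketch below caricatures ETTO as deciding which prescribed checks to perform when the available time shrinks; the check names, durations, and “perceived risk” scores are invented for the example.

# Toy ETTO model: choose which prescribed checks to perform under time pressure.
# All names and numbers are illustrative assumptions, not data from the slides.

def plan_checks(checks, time_available):
    """checks: list of (name, duration, perceived_risk); returns (kept, skipped)."""
    # Thoroughness would do them all; efficiency skips those judged least risky.
    ordered = sorted(checks, key=lambda c: c[2], reverse=True)  # most "risky" first
    kept, skipped, elapsed = [], [], 0.0
    for name, duration, risk in ordered:
        if elapsed + duration <= time_available:
            kept.append(name)
            elapsed += duration
        else:
            skipped.append(name)   # "normally OK, no need to check"
    return kept, skipped

checks = [("doors closed", 2.0, 0.9), ("ballast pumped out", 3.0, 0.7),
          ("trim within limits", 2.0, 0.5), ("paperwork signed", 1.0, 0.1)]
print(plan_checks(checks, time_available=5.0))

The point of the toy model is only that the same selection rule that usually works (skip the checks that are “normally OK”) is what, in hindsight, gets labelled an “error” when conditions differ.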

Page 10: _Time to Think and Time to Do Hollnagel 013


Herald of Free Enterprise
The Herald was a modern Ro-Ro passenger/vehicle ferry with two main vehicle decks. At Dover and Calais double-deck ramps connected to the ferry; Zeebrugge only had a single-level access ramp, which could not quite reach the upper vehicle deck, so water ballast was pumped into the bow tanks to facilitate loading. When leaving Zeebrugge on March 6, 1987, not all the water had been pumped out of the ballast tanks, leaving her some 3 feet down at the bow.
The assistant bosun, who was directly responsible for closing the doors, was asleep in his cabin, having just been relieved from maintenance and cleaning duties. The bosun did not see door closing as part of his duties. The captain apparently assumed that the doors were safely closed unless told otherwise. The chief officer, responsible for ensuring door closure, testified that he thought he saw the assistant bosun going to close the door. Each of these was an Efficiency-Thoroughness Trade-Off (ETTO).
The Herald had clamshell doors which opened and closed horizontally, which made it impossible for the ship’s master to see from the bridge whether the doors were closed. The Herald backed out of the berth stern first. As the ship rapidly accelerated to its 22-knot service speed, a bow wave began to build up under her prow. At 15 knots, with the bow 3 feet lower than normal, water began to break over the main car deck through the open doors …


Why do actions sometimes fail?
Inefficient or deficient organisation
Lack of adjustment to time of day (circadian rhythm)
Inadequate training and experience
Inefficient crew collaboration
Inefficient communication
Shortage of resources (both human and technological)
Incompatible working conditions
Inappropriate HMI and operational support
Lack of adequate procedures / plans
Too many simultaneous goals and too little available time
Sharp end? Blunt end?

Page 11: _Time to Think and Time to Do Hollnagel 013


The same ETTO heuristics (no time, no resources, will do it later; has been checked by someone else; will be checked by someone else; normally OK, no need to check; can’t remember how to do it; not really important; worked last time; looks fine) appear throughout the system, from morals and social norms, government, authority, company and management down to the workplace and the activity itself. At every level the Efficiency-Thoroughness Trade-Off is driven by conflicting demands, incomplete information, and time pressure.


Sources of success
On the level of individual human performance, local optimisation (shortcuts, heuristics, and expectation-driven actions) is the norm rather than the exception.
Normal performance is not what is prescribed by rules and regulations but rather what takes place as a result of the adjustments: the equilibrium that reflects the regularity of the work environment. It is therefore a mistake to look for the cause of failures in the normal actions since they, by definition, are not wrong.
Normal actions are successful because people adjust to the local conditions and correctly anticipate developments. Failures occur when this adjustment goes awry, even though both the actions and the principles of adjustment are as such correct.

Page 12: _Time to Think and Time to Do Hollnagel 013


Four postulates
1. Both normal performance and failure are emergent phenomena, and neither can be attributed to or explained by specific components or parts.
2. When the outcome of actions differs from what was intended or required, it is due to the variability of context and conditions rather than to failures of actions.
3. The adaptability and flexibility of human work is the reason for its efficiency. At the same time it is also the reason for the failures that occur, although it is rarely the cause of the failures.
4. People are expected to be both efficient and thorough at the same time, or rather to be thorough when, with hindsight, it was wrong to be efficient.


The Road to Wisdom
“The road to wisdom? Well, it’s plain
and simple to express:
Err
and err
and err again
but less
and less
and less.”
(Piet Hein, Grooks, 1966)

Page 13: _Time to Think and Time to Do Hollnagel 013


Counterfactual reasoning
Going back through a sequence, investigators often wonder why opportunities to avoid the bad outcome were missed (“Why didn’t they do A?”, “Why didn’t they do B?”). This, however, does not explain the failure.
(Figure: the actual outcome alongside possible outcomes 1 and 2 that the counterfactual questions point to.)


The Devil’s Dictionary
LOGIC, n. The art of thinking and reasoning in strict accordance with the limitations and incapacities of the human misunderstanding.
PROOF, n. Evidence having a shade more of plausibility than of unlikelihood. The testimony of two credible witnesses as opposed to that of only one.
RATIONAL, adj. Devoid of all delusions save those of observation, experience and reflection.
REASON, v.i. To weigh probabilities in the scale of desire.

Page 14: _Time to Think and Time to Do Hollnagel 013


(Chart repeated: major and serious accidents per million hours flown, 1983-2001.)


(Figure repeated: the “time to think and time to do” timeline: the need to do something (intention), the time to evaluate the event, the time to select the action, the time to do, the time needed versus the time available, the latest starting time and the latest finishing time.)

Page 15: _Time to Think and Time to Do Hollnagel 013


Freudian slip
I watched Bush’s address to the nation’s teachers on NBC with some friends, and this error was glaring. He said, and I quote, “First I’d like to spank all the teachers...” There was a short pause, with a definite change of facial expression on his part as he realized his mistake, and my friends and I all glanced at each other, laughing. Then he continued his speech.


We can build reliable systems
An airplane as a closed technical system, even including the pilot, is highly reliable as long as the boundaries are well-defined and the disturbances are not unexpected.
When it becomes part of the environment (the transportation system), reliability goes down: there are too many constraints, influences, and disturbances, such as weather, ATC, schedules, regulations, maintenance, routes, and gate capacity.

Page 16: _Time to Think and Time to Do Hollnagel 013


Needed: A model of normal performance
We should understand how things go right (successes) as well as how they go wrong (failures).
Risk analysis must be based on a model of normal performance, and not just on a model of “error”.
Context is the main determinant of normal performance, and therefore also of action failures.
(Figure repeated: performance variability over time, with negative outcomes (unsafe act, near miss, incident, accident) and positive outcomes (shortcut, smart move, improvement, invention), all shaped by CONTEXT.)


Actions and “errors”
1. The action is chosen based on the event history and the current situation.
2. If the action leads to the expected outcome, then it is considered a correct action.
3. If the action leads to an unexpected outcome, then it is classified as an “error”.
4. In hindsight, the alternative “correct” action is identified.
The difference between correct and incorrect outcomes may be vague rather than crisp.

Page 17: _Time to Think and Time to Do Hollnagel 013


What is a “human error”?
Correctly performed actions: actual outcomes = intended outcomes.
When actual outcomes differ from intended outcomes, there are four possibilities: the failure is not detected; the failure is detected but not recovered; the failure is detected but tolerated; or the failure is detected and recovered. The effects may be immediate or latent.
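As a small illustration of this outcome-based view (my own sketch, not from the slides), the function below classifies a single action by whether the actual outcome matched the intended one and, if not, whether the failure was detected, recovered, or tolerated; the names and the example values are assumptions.

from enum import Enum, auto

class Outcome(Enum):
    CORRECT = auto()        # actual outcome = intended outcome
    RECOVERED = auto()      # failure detected and recovered
    TOLERATED = auto()      # failure detected but tolerated
    NOT_RECOVERED = auto()  # failure detected but not recovered
    NOT_DETECTED = auto()   # failure not detected

def classify(intended, actual, detected=False, recovered=False, tolerated=False) -> Outcome:
    """Classify one action by its outcome, following the slide's distinctions."""
    if actual == intended:
        return Outcome.CORRECT
    if not detected:
        return Outcome.NOT_DETECTED
    if recovered:
        return Outcome.RECOVERED
    if tolerated:
        return Outcome.TOLERATED
    return Outcome.NOT_RECOVERED

print(classify(intended="valve closed", actual="valve open", detected=True, recovered=True))

Only the last two branches, a failure that is neither recovered nor tolerated, or one that is never detected, correspond to what is usually singled out as “human error”.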

“Error” rate, detection, and performance

(Chart: “error” rate per hour (roughly 2 to 14) plotted against the “error” recovery rate (40% to 100%), distinguishing relaxed (inattentive) performance, standard performance, maximum performance, and loss of control. Wioland & Amalberti, 1994.)