
Page 1: The Normalization of Deviance

The Normalization of Deviance

Robert Rosen, 8/4/2016

Page 2: The Normalization of Deviance


Original source
• The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA, by Diane Vaughan, Professor of Sociology at Boston College, 1996

Page 3: The Normalization of Deviance


Definition: “The gradual process through which unacceptable practice or standards become [treated as] acceptable. As the deviant behavior is repeated without catastrophic results, it becomes the social norm for the organization.”

Page 4: The Normalization of Deviance


“Over time, if we take risks and get the false feedback that we can get away with the behaviour, we learn to believe that it’s okay to deviate from a standard.” -- Alan D. Quilley, President of Safety Results, Ltd., Alberta

“Managers’ response when some aspect of operations skews from the norm is often to recalibrate what they consider acceptable risk.” -- Harvard Business Review, April 2011

Page 5: The Normalization of Deviance


Once it seems acceptable to deviate from one standard, it becomes easier to keep deviating from it further, or to start deviating from other standards.

This can lead to…

Page 6: The Normalization of Deviance

6

The “Deviation Spiral”

[Diagram: Original Normal → Deviation 1 → no failure → New Normal 1 → Deviation 2 → no failure → New Normal 2 → Deviation 3 → no failure → New Normal 3 → Deviation 4 → …]
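The spiral can be read as a simple ratchet: every deviation that happens to produce no failure is absorbed as the new baseline, even though the underlying risk keeps growing. A minimal, purely illustrative Python sketch of that ratchet (the step size and failure probabilities below are hypothetical, chosen only to make the dynamic visible):

import random

# Toy model of the "Deviation Spiral": each attempt operates one step beyond
# the currently accepted norm. If nothing fails, that deviation is ratcheted
# in as the new normal ("false feedback"), even though the per-attempt risk
# keeps growing with the accumulated deviation. All numbers are hypothetical.
random.seed(1)

accepted_norm = 0.0        # accumulated deviation treated as "normal" (0 = original standard)
step = 1.0                 # size of each new deviation beyond the accepted norm
base_failure_prob = 0.01   # per-attempt failure probability at the original standard
risk_growth = 0.02         # extra failure probability per unit of accumulated deviation

for attempt in range(1, 101):
    deviation = accepted_norm + step
    p_fail = min(1.0, base_failure_prob + risk_growth * deviation)
    if random.random() < p_fail:
        print(f"Attempt {attempt}: failure at deviation {deviation:.0f} "
              f"(per-attempt risk had grown to {p_fail:.2f})")
        break
    accepted_norm = deviation   # no failure, so the deviation becomes the new normal
else:
    print(f"No failure in 100 attempts; the accepted norm drifted to {accepted_norm:.0f}")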

Page 7: The Normalization of Deviance


Challenger Space Shuttle Disaster, 1986

Engineers continually observed defects in the rocket booster O-rings, but after repeated successful launches the defects came to be treated as an “acceptable risk”, due largely to schedule pressure.

Launch day was especially cold. Engineers initially issued an unprecedented “no-launch” recommendation, but were unable to persuade NASA to cancel the launch

One component suffered a failure of both primary and backup O-rings, which led to the disintegration of the booster rocket and then of the shuttle itself.

Page 8: The Normalization of Deviance


And as if that wasn’t bad enough…

NASA came to accept foam strikes on the shuttle’s heat shield as “normalized deviance” as well; that normalization contributed to the loss of Columbia in 2003, when a foam strike damaged the orbiter during launch.

Page 9: The Normalization of Deviance


Gulfstream Business Jet crash, 2014

Jet failed to achieve liftoff and went off the end of the runway; the gust lock was engaged.

“the pilots had neglected to perform complete flight control checks before 98% of their previous 175 takeoffs in the airplane… it is likely that they decided to skip the [flight control] check at some point in the past and that doing so had become their accepted practice.” -- NTSB accident report

One source concluded the pilots likely had adopted a pattern of neglecting more and more checks over time. None of the standard checks had been performed prior to takeoff.


Page 10: The Normalization of Deviance


Carbide Industries, 2011

• Manufacturing furnace explosion at the Louisville, KY plant; fatalities resulted
• US Chemical Safety Board incident report included an entire section on “Normalization of Deviance” as a cause
• “…because Carbide did not thoroughly determine the root causes of the blows [over-pressure incidents that occurred in 1991 and 2004] and eliminate them, the occurrence became normalized in the day-to-day operations of the facility… CSB interviews verified that furnace blows were considered normal”

Page 11: The Normalization of Deviance


Causes (of Normalization of Deviant Practices)

• A belief that “rules are stupid and inefficient”
• Belief that work goals are best met by breaking rule(s)
• Imperfect knowledge of standards
• Fear of speaking up

Source: Banja, J., “The normalization of deviance in healthcare delivery,” 2010

Page 12: The Normalization of Deviance


What Can We Do About It? (Mullane)

• Recognize your vulnerability -- “If it can happen to NASA, it can happen to anyone.”
• “Plan the work and work the plan.”
• Listen to people closest to the issue.
• Archive and periodically review near-misses and disasters so the corporate “safety” memory never fades.