TRANSCRIPT
Cognitive Engineering
The difference between 120 alarms and 3 periscopes
Cato A. Bjørkli, Psychologist
06-02-01 Cognitive Engineering PSY2403 Cato A. Bjørkli
From the news:
Adresseavisen 2005.03.03: “Visibility is limited. When driving in so-called tactical mode, the driver sits down inside the vehicle with the hatch closed. Steering then happens only through 3 periscopes. Driving in this way places extremely high demands on the driver’s vigilance.”
A police officer commenting on the incident where a tank, during tactical maneuvering, accidentally drove over a civilian vehicle with two passengers still inside.
And now for something completely different:
Cognitive Engineering
Aim of this lecture
• Basics of Cognitive Engineering
• CE and Design
• Data Availability Paradox
• Info Soup: Typical Features
• Info Soup: How to deal with it and not (?)
• Nine Steps: Understanding Accidents / Incidents
• “One sentence says it all”
Finally: Ask important and brilliant questions at the end
Background for this lecture
Explicit:
• Woods & Patterson (2002) Can we ever escape ...
• Woods & Sarter (2000) Learning from automation ...
• Dekker & Woods (2002) MABA-MABA or ...
• Woods & Cook (2002) Nine steps to move from ...

Implicit:
• Hoff (2004) Comments on Ecology of ...
• Casey (1998) Set Phasers on Stun ...
-- scientific method --

a way of doing something, especially a systematic way; implies an orderly, logical arrangement (usually in fixed steps)

-- applied science --

ability to produce solutions in some problem domain

(approximate definitions)
The Science of Man-Machine Interaction (MMI)
What characterizes a good product?
What characterizes a safe system?
What characterizes an efficient system?
How do we know that something is what we designed it to be?
(examples?)
-- cognitive engineering --

Cognitive Engineering (CE) is concerned with the analysis, design, and evaluation of complex sociotechnical systems.

The aim of cognitive engineering is to facilitate safe, productive and healthy work in complex sociotechnical systems.

(Vicente, 1999; Rasmussen et al., 1994)
(approximate definition)
CE vs Design vs Usability Engineering?
Field of Study

[Diagram: field of study spanning two axes — Individual vs. Social, Product vs. Systems]
Cognitive Engineering
• Perspective: How do we understand the field?
• Theories: How do we assume things work?
• Concepts: What are the common concepts?
• Issues: What kind of problems do we solve?
• Method: How do we solve problems?
Cognitive Engineering
• Perspective: Complex Dynamic Systems (Non-linear!)
• Theories: Human cognition and behavior (Info Pro?)
• Concepts: Representation, automation, workload
• Issues: Overload, bottlenecks, safety, interfaces
• Method: Interviews, observations, experiments
This is the Age of Informational Soup
“Information is not a scarce resource. Attention is.”Herbert Simon, 1981
What does this imply?
A common feature of complex systems (nuclear plants, aerospace, petrochemical) is the vast amount of information available.

(big systems = a lot of information)

Further, complex systems behave in a non-linear fashion (surprises! Ref: Casey; Perrow)
The Informational Soup
“Although all of the data was physically available, it was not operationally effective” Joyce & Lapinski, 1983

(Observability is more than mere data availability - Woods et al., 2002)

Practitioners are bombarded with information at their workspaces. Imagine the view for the operator:
...control rooms, cockpits, TV production, stock markets ...
The Data Paradox

(for operators:)
The more data that is available, the harder it is to find the significance and relevance of it.

(general trend:)
More and more data is available, but our capacity to process it is the same. ... What does this mean for operators?
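The paradox can be made concrete with a toy calculation (a hypothetical sketch; the capacity figure is an invented illustrative value, not an empirical estimate): if the data stream grows while the operator's processing rate stays fixed, the fraction of data the operator can attend to shrinks toward zero.

```python
# Toy illustration of the data availability paradox:
# data volume grows, human processing capacity stays constant.

HUMAN_CAPACITY = 30  # items an operator can process per minute (assumed value)

def attended_fraction(items_per_minute: int) -> float:
    """Fraction of incoming data the operator can actually attend to."""
    return min(1.0, HUMAN_CAPACITY / items_per_minute)

for volume in [10, 30, 120, 1000]:
    print(f"{volume:4d} items/min -> {attended_fraction(volume):.0%} attended")
```

More data never raises the numerator; it only raises the denominator, which is the general trend the slide describes.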
Problem formulation
“(CE) is concerned with the analysis, design, and evaluation of complex sociotechnical systems and to facilitate safe, productive and healthy work”

Case: 27.12.1991 SAS Airlines Accident
Gottröra, Sweden
Crashed 3 min after take-off
No casualties
Aircraft written off
120 alarms in 30 seconds
Features of overload
• Overload is often imminent when anomalies occur, when critical incidents happen (ref: data paradox)

• Overload may also occur in aspects of normal running of the system in question

CLAIM: Clutter and confusion are failures of design, not attributes of information
Features of overload
1. Amount: There is too much information present for the operator to handle

2. Relevance: There is trouble deciding the relevance of a given data set

3. Bottleneck/time: There is too little time to respond and relate to the data available

Ref: Woods et al., 2002
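The bottleneck dimension can be sketched with the figures from the SAS case (120 alarms in 30 seconds); the per-alarm handling time below is an assumed illustrative value, not a measured one:

```python
# Bottleneck sketch: alarms arrive faster than they can be handled.
# 120 alarms in 30 s comes from the SAS case on the earlier slide;
# 2 s handling time per alarm is an assumed illustrative value.

def unhandled_alarms(alarms: int, window_s: float, per_alarm_s: float) -> int:
    """Alarms still pending when the time window closes."""
    handled = int(window_s / per_alarm_s)
    return max(0, alarms - handled)

# With 2 s per alarm, only 15 of 120 alarms get handled in 30 s.
print(unhandled_alarms(120, 30, 2.0))  # -> 105 alarms left unhandled
```

Even a generous handling rate leaves most of the data unprocessed inside the decision window, which is what distinguishes the bottleneck/time feature from sheer amount.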
1) Amount: “Too Much!”
1) Amount: “Too Much!”
Breaking down datasets into smaller chunks brings forth the issue of navigation and criteria for chunking

Hint: Hoff (2004) Ecology of Computerized Systems
2. Relevance: ? versus !
“Given the enormous amount of stuff, and some task to be done using some of the stuff ... What is the relevant stuff for the task?” (Glymour, 1987)

What information is relevant?
(Or: What constitutes relevance of information?)

How do humans know what’s relevant?
(How many nuances do you see right now?)

(perceptual organization, control of intention, nose for anomalies)
Examples
Driver Support System: Adaptive Front Lights
Nuclear Power Plants: Cooling Fluids (DURESS)
High Speed Crafts: Navigation
2. Relevance: ? vs !
Examples
3. Automation
• When there is too much information, and attention is scarce ... let the machines do it ...

• “Automate everything you technically can” Chapanis, 1970

We, as man-technology specialists, know:

... the positive sides?
... the negative sides?
3. Automation Myths: Substitution
3. Automation Myths: Substitution
[Figure, BEFORE vs AFTER: before automation, Task Demands (100 %) are met by Human Capacity alone; after, Automation Capacity covers 70 % and Human Capacity the remaining 30 % of the Task Demands, together forming the Man-Technology System.]

3. Automation Myths: Qualitative Effects

“.. just add man to machine!”
06-02-01 Cognitive Engineering PSY2403 Cato A. Bjørkli
Automation transforms practice, and humans adapt to novelties

[Diagram: cycle linking Practice and Artefacts through Requirements and Constraints]

Design represents beliefs about how to do things, how humans think and act.
How to deal with ‘Too Much!’?

‘Too Much’ concerns the difference between functionality and availability

(What is the cure? CE?)
Tanks! revisited
Detour Question: Is there, in principle, a difference between too much information and too little?

(Is there a difference between 120 alarms and 3 periscopes?)
Info Soup with Humans
• Info Soup and the capacity of humans
– Too much! (Amount)
– Relevance? (Context-sensitivity)
– Bottlenecks (Automation)

• What are our assets as humans?
– Perceptual Capabilities (Gestalt, Ecology, particular)
– Attentional Control (Physiology, Genetic Make-Up, Attunement)
– Nose for Anomalies (Functional, Bateson, 1972, depart from ref)
– Natural Teamplayers (Social Beings, action is not alone)

• Capabilities: Organize, Prioritize, Synthesize, Adapt
Effective Solutions to Info Overload

... ‘Meaning’ is not IN data, but in the relationship between data and expertise and surroundings

1. Organisation precedes selectivity (support attention control)

2. Positive selection enhances structures (facilitate processing, don’t inhibit)

3. Context sensitivity (maintain relationships)

4. Observability, not availability (provide contextualized info)

5. Conceptual spaces (represent frame of reference)
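The contrast between positive selection (principle 2) and plain filtering can be sketched in a few lines. The alarm names below are hypothetical examples invented for illustration:

```python
# Two ways to handle an overloaded display (hypothetical alarm list):
# negative filtering discards context; positive selection keeps everything
# visible and marks what is relevant, preserving relationships in the data.

alarms = ["HYD PRESS LOW", "ENG 1 EGT HIGH", "CABIN TEMP", "ENG 1 OIL PRESS"]
relevant = {"ENG 1 EGT HIGH", "ENG 1 OIL PRESS"}

# Negative filtering: only relevant items survive; surrounding context is lost.
filtered = [a for a in alarms if a in relevant]

# Positive selection: all items stay visible, relevance is highlighted.
highlighted = [f">>> {a}" if a in relevant else f"    {a}" for a in alarms]

print(filtered)
print("\n".join(highlighted))
```

The highlighted view supports attention control without severing the relationships between the relevant items and their neighbours, which is what principles 2 and 3 ask of a design.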
Failure by Default
Universal Law: Failure is certain
(In the end, things will always fail!)

Systems will malfunction and fall apart, despite our effort and expertise

To understand failure is similar to understanding success. (Heavily biased!)
Nine Steps ...
1. Pursue Second Stories (Wide scope, details, insignificance)
2. Avoid Hindsight (Now vs Then: The Info Available)
3. Work is the sharp end! (Work is particular, contextual, temp)
4. System Weaknesses (Safety lies in systems, not components)
5. Look for safety in praxis (Habits and skills are also barriers)
6. Underlying patterns (Situation vs General Factors)
7. Changes have multiple consequences (Systems are dynamic)
8. How does technology support performance (Task / Artefact)
9. Complexity and feedback (What is the technology doing?)
Existence is not caused, it is conditioned
Questions?
Cognitive Engineering
Cato A. Bjørkli, Psychologist