
Cockpit Automation Factsheet: Automation Bias and Surprise

Hiraki J. & Warnink M. (2016)

Introduction

In aviation, modern aircraft have become increasingly reliant on cockpit automation to provide a safe and efficient flight. It has given aircraft increased passenger comfort, improved flight path control, reduced workload and many other advantages. However, cockpit automation has also given rise to problems concerning human-computer interaction that have led to fatal air accidents. Two main phenomena can be identified that cause these problems and influence aviation safety: automation bias and automation surprise. To address them, two main solutions have been proposed: revise pilot training and redefine the role of the pilot in the cockpit. To introduce and substantiate this topic, two examples are given of flight accidents that happened due to human-cockpit automation error: Turkish Airlines flight TK1951 en route from Istanbul Atatürk to Amsterdam Schiphol Airport in 2009, and Air France flight AF447 from Rio de Janeiro Galeão to Paris Charles de Gaulle, also in 2009.

Turkish Airlines crashed during approach

A Boeing 737-800 (flight TK1951) operated by Turkish Airlines was flying from Istanbul Atatürk Airport in Turkey to Amsterdam Schiphol Airport on 25 February 2009. During the approach to runway 18 Right (18R) at Schiphol Airport, the aircraft crashed into a field at a distance of about 1.5 kilometres from the threshold of the runway. The accident cost the lives of four crew members, including three pilots, and five passengers; a further three crew members and 117 passengers sustained injuries¹.

What caused the aircraft to crash?

The Boeing 737-800 can be flown either manually or automatically. This also applies to the management of the engines: the auto-throttle (which manages throttle settings in order to maintain a certain speed) regulates the thrust of the engines. The aircraft is fitted with two radio altimeter systems, one on the left and one on the right. In principle, the auto-throttle uses the altitude measurements provided by the left radio altimeter system. Only if there is an error in the left system that is recognised as such by the system do the auto-throttle and various other aircraft systems use the right-hand radio altimeter¹.

The aircraft was being flown by the first officer, who was sitting on the right-hand side. His primary flight display showed the readings measured by the right radio altimeter system. During the approach, the left radio altimeter system displayed an incorrect height of -8 feet. This could be seen on the captain's (left-hand) primary flight display. The first officer's (right-hand) primary flight display, by contrast, indicated the correct height, as provided by the right-hand system. The left-hand radio altimeter system, however, categorised the erroneous altitude reading as a correct one and did not record any error. This is why there was no transfer to the right-hand altimeter system. In turn, this meant that the erroneous altitude reading was used by various aircraft systems, including the auto-throttle. Unfortunately, the crew was unaware of the incorrect altitude reading of the left-hand radio altimeter and therefore did not take any corrective action¹.

When the aircraft was on the glide path, the auto-throttle moved into the 'retard flare' mode, a mode which is normally only activated in the final phase of the landing below 27 feet, as a result of the incorrect altitude reading by the left-hand radio altimeter system. The thrust from both engines was accordingly reduced to a minimum value (approach idle). This mode was shown on the primary flight display as 'RETARD'. However, the right-hand autopilot, which was activated, was receiving the correct altitude from the right-hand radio altimeter system. Therefore, the autopilot attempted to keep the aircraft flying on the glide path for as long as possible. This meant that the aircraft's nose continued to rise, creating an increasing Angle of Attack (AoA) on the wings, which was necessary in order to maintain the required lift as the airspeed reduced¹.


At a height of 750 feet, the crew did not notice that the airspeed had fallen below the pre-selected speed of 144 knots. When the airspeed subsequently reached 126 knots, the frame of the airspeed indicator changed colour and started to flash. The crew also did not respond to several other warning indications. The reduction in speed and the excessively high pitch attitude of the aircraft were not recognised until the approach-to-stall warning (stick shaker) went off at an altitude of 460 feet. This warning is activated shortly before the aircraft reaches a stall situation, a situation in which the wings of the aircraft are no longer providing sufficient lift and the aircraft is no longer able to fly¹.

The first officer responded immediately to the stick shaker by pushing the control column forward and also pushing the throttle levers forward. The captain, however, also responded to the stick shaker by taking over control. Presumably as a result, the first officer's selection of thrust was interrupted. Consequently, the auto-throttle, which had not yet been switched off, immediately pulled the throttle levers back to the position where the engines were not providing any significant thrust. Once the captain had taken over control, the auto-throttle was disconnected, but no thrust was selected at that point. Nine seconds after the commencement of the first approach-to-stall warning, the throttle levers were pushed fully forward, but at that point the aircraft had already stalled and the height of about 350 feet was insufficient for a recovery¹ (Figure 1).

Figure 1: Side view of the Boeing 737-800
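To make the mode logic described above more tangible, the sketch below shows, in heavily simplified and hypothetical form, how a single altimeter that fails without flagging an error can drive an automatic thrust-mode change. The function and constant names (select_radio_altitude, autothrottle_mode, RETARD_FLARE_HEIGHT_FT) are invented for this illustration and do not represent actual Boeing 737 software; only the 27-foot retard threshold and the preference for the left altimeter come from the accident description above.

```python
# Hypothetical, highly simplified sketch of the altimeter-selection and
# auto-throttle mode behaviour described above. Not actual Boeing 737 software.

RETARD_FLARE_HEIGHT_FT = 27  # retard flare is normally only activated below this height

def select_radio_altitude(left_ra_ft, left_ra_failed, right_ra_ft):
    """Use the left radio altimeter unless it has flagged itself as failed.

    On flight TK1951 the left altimeter output -8 ft but did NOT flag an
    error, so the faulty value was passed on to the auto-throttle.
    """
    return right_ra_ft if left_ra_failed else left_ra_ft

def autothrottle_mode(radio_altitude_ft, landing_configuration):
    """Return the auto-throttle mode based on the selected radio altitude."""
    if landing_configuration and radio_altitude_ft < RETARD_FLARE_HEIGHT_FT:
        return "RETARD"   # thrust reduced to approach idle for the flare
    return "SPEED"        # maintain the selected approach speed

# TK1951 approach: the left altimeter reads -8 ft but reports itself healthy,
# so 'RETARD' is commanded even though the aircraft is still far above the runway.
selected = select_radio_altitude(left_ra_ft=-8, left_ra_failed=False, right_ra_ft=1950)
print(autothrottle_mode(selected, landing_configuration=True))  # -> "RETARD"
```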

Air France crashed into the Atlantic Ocean

On 31 May 2009, an Airbus A330-203 (flight AF447) operated by Air France took off from Rio de Janeiro Galeão Airport bound for Paris Charles de Gaulle. The aircraft never reached its destination and crashed into the Atlantic Ocean. The accident cost the lives of 12 crew members and 216 passengers².

What caused the aircraft to crash?

After take-off from Rio de Janeiro Galeão Airport, the aircraft was in cruise flight at FL350 with a cruising speed of Mach 0.8. During the cruise the aircraft entered severe weather conditions, causing the pitot tubes to become obstructed with ice crystals, which led to an automatic autopilot disconnection and inconsistent airspeed indications. As a result of the inconsistent airspeed indications, the Airbus A330 automatically switches to alternate law. When in alternate law, the aircraft no longer has angle of attack (AoA) protection. In aerodynamics, the angle of attack is the angle between the chord line of the wing of an aircraft and the vector representing the relative motion between the aircraft and the atmosphere. When the AoA is increased too much, the wings will no longer be able to provide sufficient lift to keep the aircraft flying (Figure 2).


Therefore, an AoA protection is built into the aircraft, which limits the input of the pilots so that they cannot increase the AoA too much².
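The dependence of lift on airspeed and angle of attack that underlies both accidents can be summarised with the standard lift equation. This is a generic aerodynamics relation added here for clarity; it is not taken from the accident reports.

```latex
% Standard lift equation: lift L depends on air density rho, true airspeed V,
% wing area S and the lift coefficient C_L, which grows with angle of attack
% alpha up to the critical (stall) angle.
\[
  L = \tfrac{1}{2}\,\rho\,V^{2}\,S\,C_L(\alpha)
\]
% In level flight L must equal the aircraft weight W. If V decreases (TK1951)
% or the nose is held up (AF447), C_L(\alpha) must rise to compensate:
\[
  C_L(\alpha) = \frac{2W}{\rho\,V^{2}\,S}
\]
% Beyond the critical angle of attack, C_L collapses instead of increasing
% further, and the wing stalls.
```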

Figure 2: Angle of Attack schematic

The loss of AoA protection enabled the aircraft to fly at an enormous, unconventional degree of pitch. The crew did not succeed in diagnosing the inconsistent airspeed and managing it with precautionary measures on the pitch attitude and the thrust. The occurrence of the failure in the context of cruise flight completely surprised the pilots of flight AF447. The difficulties with aircraft control at high altitude in turbulence led to excessive control inputs in roll and a sharp nose-up input by the pilots. The evolution of the pitch attitude and vertical speed came on top of the erroneous speed indications and ECAM messages, which complicated the diagnosis of the actual problem. The Electronic Centralised Aircraft Monitor (ECAM) is a system, developed by Airbus, that monitors aircraft functions and relays them to the pilots. It also produces messages detailing failures and, in certain cases, lists procedures to undertake to correct them. Unfortunately, the large number of ECAM messages made it even more difficult for the pilots to understand what problem they were facing. The crew likely never understood that it was faced with a simple loss of airspeed information².
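As a rough illustration of the degradation chain described above (inconsistent airspeed, autopilot disconnection, reversion to alternate law, loss of AoA protection), the hypothetical sketch below models it as a simple state change. The names used (ControlLaw, airspeed_consistent, clamp_aoa_demand) and the numeric thresholds are invented for this example and do not reflect the actual Airbus fly-by-wire implementation.

```python
# Hypothetical sketch of the control-law reversion described for AF447.
# Real Airbus flight control laws are far more complex than this.
from enum import Enum

class ControlLaw(Enum):
    NORMAL = "normal law"        # AoA protection available
    ALTERNATE = "alternate law"  # AoA protection lost

def airspeed_consistent(sensed_speeds_kt, tolerance_kt=20):
    """Crude consistency check across redundant airspeed sources."""
    return max(sensed_speeds_kt) - min(sensed_speeds_kt) <= tolerance_kt

def select_control_law(sensed_speeds_kt):
    """Revert to alternate law when the airspeed sources disagree
    (e.g. pitot tubes obstructed by ice crystals)."""
    return ControlLaw.NORMAL if airspeed_consistent(sensed_speeds_kt) else ControlLaw.ALTERNATE

def clamp_aoa_demand(demanded_aoa_deg, law, max_protected_aoa_deg=15.0):
    """In normal law the nose-up demand is limited; in alternate law it is not."""
    if law is ControlLaw.NORMAL:
        return min(demanded_aoa_deg, max_protected_aoa_deg)
    return demanded_aoa_deg  # no AoA protection: a sustained pull can reach a stalling AoA

# Iced pitot tubes: one source drops out of agreement, so the law degrades and
# a large nose-up demand is no longer clamped.
law = select_control_law([270, 275, 180])
print(law, clamp_aoa_demand(25.0, law))  # -> ControlLaw.ALTERNATE 25.0
```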

Due to the unreliable airspeed indications and inappropriate control inputs that destabilised the flight path, the crew seemed confused as to which type of buffet they were experiencing (high Mach number or high AoA). When experiencing high Mach number buffet, the aircraft is no longer able to produce lift because the airflow separates from the aerofoil due to the existence of shock waves on the wings³. When experiencing high AoA buffet the aircraft is, as opposed to high Mach number buffet, flying too slowly, and as a result the wings cannot produce sufficient lift. As the pilots of AF447 did not know which type of buffet they were experiencing, the aircraft went into a sustained stall. Despite the persistent symptoms, such as stall warnings and erroneous speed indications, the crew never understood that they were stalling and consequently never applied a recovery manoeuvre, causing the aircraft to crash into the Atlantic Ocean² (Figure 3).

Figure 3: Tail section of Air France flight AF447


Cockpit automation: providing a safe and efficient flight

In general, automation is a process or a task that is executed by a machine or a computer instead of a human⁴. In aviation, modern aircraft have become increasingly reliant on cockpit automation to provide a safe and efficient flight and to increase passenger comfort. Furthermore, cockpit automation relieves the pilot of repetitive tasks and reduces pilot workload. The reduction in workload frees attentional resources to focus on more important tasks. Cockpit automation also keeps the aircraft balanced during the flight, which leads to stable operation and a smooth flight trajectory. Both of these improvements lead to a significant reduction in fuel consumption⁴.

Examples of currently automated tasks are: vertical and lateral navigation, fuel and balance optimisation, throttle settings, critical speed calculation, and execution of take-off and landing.

The evolution of cockpit automation

Three generations of cockpit automation can be distinguished: mechanical, electrical and electronic. In the beginning of commercial flight there were no instrument aids to help pilots fly. It took some years for the first instruments to be introduced that could indicate the aircraft's airspeed and attitude to the pilot⁵.

The first signs of automation were introduced on board aircraft during the 1920s and 1930s, in the form of a mechanical pilot that was designed to keep the aircraft flying straight. As airplanes became bigger, the aerodynamic forces increased and it became necessary to amplify the pilots' physical force by means of pneumatic or hydraulic actuators. Instead of direct control, with the flight yoke mechanically attached to the flight control surfaces, mechanisms were constructed that intervened between the pilot's input and the expected output. At this stage, automation aided pilots mainly in their flying skills.

In the 1960s, plenty of electrical automation was introduced on board aircraft that enhanced flight safety: electric autopilots, auto-throttle, flight directors (used to show pilots how to fly the aircraft to achieve a pre-selected target such as speed and flight path), airborne weather radars, navigation instruments, and improved alarm and warning systems capable of monitoring several parameters of engines and other equipment⁵.

The third generation of cockpit automation involved electronics, and was mainly driven by the availability of cheap, accessible, reliable and usable technology that invaded the market. The electronic revolution from the mid-1980s (the introduction of the personal computer) also helped to shape a new generation of pilots who were accustomed to the pervasive presence of technology. Furthermore, electronics helped to diminish the clutter of instruments in the cockpit and allowed old indicators (round-dial, black-and-white mechanical indicators for every monitored parameter) to be replaced with integrated coloured electronic displays, on which an overview of multiple parameters could be shown⁶.

Boeing and Airbus: different philosophies

Each airplane manufacturer has a different philosophy on the implementation and use of cockpit automation, as do Airbus and Boeing. Above all, the general agreement is that the flight crew is and will remain ultimately responsible for the safety of the airplane. The significant difference between the design philosophies can be found in envelope protection. Airbus' philosophy has led to the implementation of "hard" limits, where the flight crew can provide whatever control input desired but can never exceed the physical limitations of the aircraft. Boeing has "soft" limits, where the flight crew will meet increasing resistance to control inputs when the airplane is steered out of the normal flight envelope⁵.


In the literature, Airbus uses the word 'operator' for flight crew, in contrast with Boeing, which uses the word 'pilot' to designate flight crew. The manufacturers' different views on cockpit automation can be summarised as follows⁵:

Airbus: Automation should lead the aircraft through a normal and safe flight envelope and must not work against the operator's inputs, except when absolutely necessary for safety.

Boeing: The pilot is the final authority for the operation of the airplane. Only apply automation as a tool to aid the pilot, not to replace the pilot.
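The "hard" versus "soft" limit distinction can be illustrated with a short, hypothetical sketch; the function names and the pitch limit below are invented for this example and do not represent either manufacturer's actual control laws.

```python
# Hypothetical sketch contrasting "hard" and "soft" envelope limits.
# Not actual Airbus or Boeing flight control logic.

MAX_PITCH_DEG = 30.0  # illustrative envelope boundary, not a real certified limit

def hard_limit_pitch_command(pilot_demand_deg):
    """'Hard' limit (Airbus-style philosophy): the pilot may demand anything,
    but the output is clamped so the envelope can never be exceeded."""
    return max(-MAX_PITCH_DEG, min(MAX_PITCH_DEG, pilot_demand_deg))

def soft_limit_pitch_command(pilot_demand_deg):
    """'Soft' limit (Boeing-style philosophy): demands beyond the envelope are
    still passed through, but the pilot feels steeply increasing resistance."""
    excess = abs(pilot_demand_deg) - MAX_PITCH_DEG
    resistance = 0.0 if excess <= 0 else excess ** 2  # arbitrary resistance model
    return pilot_demand_deg, resistance

print(hard_limit_pitch_command(45.0))  # -> 30.0 (demand clipped at the limit)
print(soft_limit_pitch_command(45.0))  # -> (45.0, 225.0): allowed, but with heavy resistance
```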

Automation Bias

Automated decision-making aids

Within aircraft, automated decision aids are meant to support the human cognitive processes of information analysis and/or response selection, in order to help the human user correctly assess a given situation or system state and respond appropriately. Two different functions of automated decision aids can be distinguished: alerts and recommendations. The alert function makes the user aware of a situational change that might require action. The recommendation function involves advice on choice and action. Within aircraft, the Ground Proximity Warning System (GPWS) is a good example of an automated decision aid. The GPWS is designed to alert pilots when the aircraft is in immediate danger of flying into the ground or an obstacle. In such a case the system alerts pilots by giving an aural warning (e.g. "caution terrain, caution terrain") and in some aircraft it also gives a recommendation (e.g. "Pull up! Pull up! Pull up!").

Two types of errors can occur with automated decision aids: an error of omission, or a commission error. An error of omission refers to a mistake that consists of not doing something you should have done, or not including something, such as an amount or fact, that should be included⁷. A commission error refers to a mistake that consists of doing something wrong, such as including a wrong amount in the wrong place⁸. In terms of automated decision aids, an error of omission refers to the user not responding to a critical situation, relating to the alert function. A commission error is related to the specific recommendations or directives provided by an aid; in this case, users follow the advice of the aid even though it is incorrect⁹.

Definition of automation bias

Quite often, users of automated systems have a tendency to ascribe greater power and authority to automated systems than to other sources of advice⁹. In other words, users of automated systems prefer the outcome and recommendations of an automated system over other sources of information (for example their own knowledge and expertise). According to Mosier and Skitka (1996)¹⁰, automation bias can be defined as phenomena resulting from people using the outcome of a decision aid "as a heuristic replacement for a more effortful process of information analysis and evaluation". Automation bias refers to phenomena caused by the increased trust that people place in automated systems.

Phenomena that contribute to the occurrence of automation bias

Three main factors have been assumed to contribute to the occurrence of automation bias:



1. The first is the tendency to use automated recommendations and directives as a replacement for a more effortful process of information analysis and evaluation.

2. A second factor is the perceived trust of humans in automated systems. A side effect of this factor is the phenomenon referred to as complacency. In the aviation community, complacency refers to situations in which pilots, air traffic controllers or other operators purportedly did not conduct sufficient checks of system states and assumed "all was well" when in fact a dangerous condition was developing that led to the accident⁹.

3. The third factor is diffusion of responsibility, whereby social loafing can occur. Social loafing refers to the tendency of humans to reduce their own effort when working redundantly within a group (or with an automated system) compared with when they work individually⁹.

These factors mainly arise when the user has often experienced reliable automation, and they are inconsequential as long as the automation remains reliable. However, the performance of automation users can deteriorate considerably in case of automation failures, that is, if the system does not alert the user to become active, if the system provides a false recommendation or directive, or if the automated system does not operate properly.

The duration of automation bias

De Boer, Heems and Hurts (2014)¹¹ performed a study aimed at quantifying the duration of automation bias. The objective of their study was to find out how long automation bias phenomena persist and which factors influence their duration, by simulating malfunctions in a simulator. They found that the duration of failure detection for licensed pilots varied from 18 seconds to over 720 seconds, with only 11% of the participants meeting the standard for the detection of visible alarms of 23 to 45 seconds. In addition, they found a significant difference in the sensitivity to automation bias between junior and experienced pilots: less experienced pilots are on average less sensitive to automation bias but show more variation in performance than more experienced pilots.

Automation bias involved in Turkish Airlines flight TK1951

Turkish Airlines flight TK1951 intercepted the localizer signal at 5.5 NM from the runway threshold at an altitude of 2000 feet. Therefore, the glideslope had to be intercepted from above rather than the preferred intercept from below. As a result, the crew was forced to carry out a number of additional procedures, resulting in a greater workload. This also caused the landing checklist to be completed at a later moment in the approach than standard operational procedures prescribe.

The cockpit crew did not have information readily available regarding the interrelationship between the (failure of the) left radio altimeter system and the operation of the auto-throttle. Of all the available indications and warning signals, only a single indication referred to the incorrect auto-throttle mode, namely the 'RETARD' annunciation on the primary flight displays. As a result of the greater workload, the incorrect operation of the auto-throttle was obscured for the crew. Despite the indications in the cockpit, the cockpit crew did not notice the large decrease in airspeed until the approach-to-stall warning.

With the cockpit crew, including the safety pilot, working to complete the landing checklist, no one was focusing on the primary task: monitoring the flight path and the airspeed of the aircraft. This task was performed by the automated system, and it is assumed that the flight crew believed that all was well when in fact a dangerous situation was developing. This can be attributed to the second factor of automation bias: the perceived trust of humans in automated systems.

Automation bias involved in Air France flight AF447

The obstruction of the pitot tubes with ice resulted in an inconsistent airspeed indication and a loss of AoA protection.


The fact that the high AoA situation was not identified by the Pilot Flying (PF) while he was continuously pulling on the control stick seems to indicate that he may have embraced the common belief that the aircraft could not stall, and that in this context a stall warning was inconsistent. This is again an example of the perceived trust of human users in automation, which led to a fatal accident.

A second bias involved in this flight is an error of omission, with the crew not responding to the alert function of the aircraft (i.e. stall warnings and speed warnings).

Automation Surprise

Automation technology was originally developed to increase the precision and economy of operations while, at the same time, reducing operator workload and training requirements. It was considered possible to create an autonomous system that required little human involvement and therefore reduced or eliminated the opportunity for human error. Within this type of automated system, the computer performs more and more human-related functions and operations, with the possible consequence that humans have less interaction with the system¹². The human user can become less aware of the system's operations and eventually be "out of the loop". When the human is out of the loop, engagement with the process is significantly reduced, causing the human user to experience, due to automation, limited access to relevant information in a given situation⁸. The ability to detect problems, determine the current state of the system, understand what has happened and what courses of action are needed, and react to the situation will decrease. This phenomenon can be described as loss of situational awareness¹³.

The result can be situations where the operator is surprised by the behaviour of the automation, asking questions like: what is it doing now, why did it do that, or what is it going to do next¹²? Thus, automation has created surprises for human operators who are confronted with unpredictable and difficult-to-understand system behaviour.

Automation surprise involved in Turkish Airlines flight TK1951

During the approach the aircraft was already in landing configuration, with the gear down and flaps extended more than 15 degrees. This configuration, along with the wrong height indication of -8 feet by the leading left radio altimeter system, caused the auto-throttle to be set to 'retard flare' mode. On the one hand, a primary cause that led to the fatal accident was the failure of the crew to monitor the flight path and the airspeed of the aircraft. On the other hand, the automatic switch to 'retard flare' mode at a height far greater than 27 feet, as a result of the failure of the left radio altimeter system, might have surprised the pilots and put additional pressure on the already high workload.

The aviation industry was already familiar with the possible failure of the radio altimeter system and its subsequent actions, but it was assumed that the pilots would easily manage and recover from this situation. Several airlines, including Turkish Airlines, regarded the problems with the radio altimeter as a technical problem rather than a hazard to flight safety. As a result, the pilots were not informed of this issue.

Automation surprise involved in Air France flight AF447

The Airbus A330's fly-by-wire (FBW) system is designed such that in an abnormal condition, the system will automatically switch to a so-called 'alternate law' mode. In this mode the AoA protection is disengaged and the aircraft is able to fly at a high degree of pitch. The occurrence of automation surprise is exemplified in this accident for two reasons. The first reason is that the pilot failed to correctly link the stall warning with the airplane's attitude under alternate law. The second reason is that the pilot monitoring did not understand why the aircraft was seemingly not responding to his input, as the pilot flying was also giving input to the controls.


Improving human-cockpit automation interaction

Rethinking pilot training

One of the conclusions that can be drawn from the report on flight AF447 is that flight training simulators should be improved in such a way that they can reproduce realistic scenarios of abnormal situations. Furthermore, the training scenarios should include the effects of automation surprise in order to train pilots to face these phenomena¹⁴.

Several publications provide recommendations on how to improve pilot training. For example, Gray (2007) refers to evidence that pilots are not being trained to a level that enables them to use automated systems in all circumstances. He states that pilots should be trained to understand when automated systems should be discarded and when they should revert to manual control¹⁵.

Although the daily routine of flying is driven by automation, the laws of physics have not changed. Basic flying skills need to be trained to build a strong foundation that every pilot can fall back on when necessary¹⁶. Antonovic (2008) recommends that pilot training should teach pilots to analyse the influences of automation, so that they are better prepared to deal with the problems associated with automation and maintain greater situational awareness during the flight¹⁷.

The European Aviation Safety Agency (EASA) conducted a questionnaire among more than 150 respondents, mostly airline pilots from Europe, that substantiates these views. EASA concluded from its questionnaire that basic and manual flying skills tend to decline because of a lack of practice. Furthermore, the pilot's interaction with automation, such as the selection of different flight modes, can distract the pilot from the actual flying. The overall recommendation in the report is to improve the basic airmanship and manual flying skills of pilots. It is also stated that recurrent training and testing practices with regard to automation management should be improved¹⁸.

Redefining the role of the pilot

Another option that can lead to significant improvements in human-cockpit automation interaction is improving and simplifying current cockpit design. Two different approaches to cockpit design have been proposed by Letsu-Dake et al. (2012): pilot as pilot and pilot as manager.

Pilot as pilot

In the pilot-as-pilot approach, the cockpit design intends to support the flight crew in their conventional role as pilots. The flight crew is actively engaged in flight control, with the final authority and responsibility over the aircraft. In this design the pilot stands above the cockpit automation, can authorise and delegate tasks to the cockpit automation, and can take back manual control of the aircraft at any moment during flight and ground operations. The cockpit layout, functions and formats are designed in such a way that the pilot is able to make tactical decisions in the planning of the flight. The interfaces of the functions are simplified and provide the pilot with the necessary information and decision proposals²⁰.

In the pilot-as-pilot cockpit design, key positive and negative implications have been identified. One of the key positive implications of this design is that it is based on traditional human-centred design principles: it appears that humans perform better in emergency situations when they have been actively involved in performing relevant tasks during normal flight conditions. A negative aspect of this design relates to the higher workload resulting from this active involvement in relevant tasks. The higher workload might negatively influence the ability of pilots to perform in an increasingly complex environment with a great demand for precision and efficiency and small safety margins¹⁹.

Pilot as manager

In the design approach where the pilot acts as a manager, the flight crew shares authority and responsibility with automated systems for factors such as safety, efficiency and passenger comfort.


Most flight tasks are executed by cockpit automation and managed by the flight crew. Cockpit automation is in this case responsible for most of the aircraft control and information processing. What is necessary for this approach is extremely reliable and highly capable cockpit automation that is able to perform current cockpit functions. Situation assessment and tactical flight planning are shared responsibilities between the pilot and the automation¹⁹.

Just like the pilot-as-pilot approach, the pilot-as-manager approach has some key positive and negative implications. One of the main positive implications is that the flight crew will have greater bandwidth to manage all aspects of the flight, because they are not spending time, effort and attention on performing lower-level manual tasks. A negative side of the pilot in the role of manager is keeping the flight crew engaged in the flying process: actively performing tasks is known to be more engaging than managing tasks. The challenge is to make the flight crew behave actively instead of passively and be fully involved¹⁹.

Does automation improve or reduce aviation safety?

In general, cockpit automation improves overall aviation safety. Modern aircraft are increasingly reliant on automation for safe and efficient operations, and automation has brought significant advantages for flight safety and operations. However, automation can be challenging, for instance for senior pilots who may be less comfortable with automation, while the new generation of pilots may lack basic flying skills when the automation disconnects or fails, or when there is a need to revert to a lower level of automation, including hand-flying the aircraft¹⁴. Improved pilot training and strict role definition could improve aviation safety even further.

Glossary

• Angle of Attack: in aerodynamics, the angle between the chord line of the wing of a fixed-wing aircraft and the vector representing the relative motion between the aircraft and the atmosphere.

• Aerodynamics: the dynamics of bodies moving relative to gases, especially the interaction of moving objects with the atmosphere.

• Automation bias: phenomena resulting from people using the outcome of a decision aid "as a heuristic replacement for a more effortful process of information analysis and evaluation".

• Automation surprise: situations in which the operator is surprised by the behaviour of the automation, asking questions like "what is it doing now?", "why did it do that?" or "what is it going to do next?".

• Commission error: a mistake that consists of doing something wrong, such as including a wrong amount in the wrong place.

• Complacency: refers to accidents or incidents in which pilots, air traffic controllers or other operators purportedly did not conduct sufficient checks of system state and assumed "all was well" when in fact a dangerous condition was developing that led to the accident.

• Error of omission: a mistake that consists of not doing something you should have done, or not including something, such as an amount or fact, that should be included.

• Flight trajectory: the precise route taken, or due to be taken, through the air by an aircraft.


• Out-of-the-loop: reduced human engagement in a process.

• Situational awareness: the ability to detect problems, determine the current state of the system, understand what has happened and what courses of action are needed, and react to the situation.

• Social loafing: the tendency of humans to reduce their own effort when working redundantly within a group (or with an automated system) compared with when they work individually.

• Stall: what happens when an aerofoil cannot provide sufficient lift to keep the aircraft in level flight.

References

1. The Dutch Safety Board (2010). Crashed during approach, Boeing 737-800, near Amsterdam Schiphol Airport (project number: M2009LV0225_01). Retrieved from http://www.onderzoeksraad.nl/uploads/items-docs/1748/Rapport_TA_ENG_web.pdf

2. BEA (2012). Final Report on the accident [...] flight AF447 Rio de Janeiro – Paris. Retrieved from http://www.bea.aero/docspa/2009/f-cp090601.en/pdf/f-cp090601.en.pdf

3. Federal Aviation Administration (2003-01-02). AC 61-107B – Aircraft Operations at Altitudes Above 25,000 Feet Mean Sea Level or Mach Numbers Greater Than .75. Retrieved 2015-12-14.

4. Oxford Dictionaries (2015). Definition of Automate. Retrieved from http://www.oxforddictionaries.com/definition/learner/automate

5. Chialastri, A. (2012). Automation in Aviation.

6. Skybrary (2015). Cockpit Automation – Advantages and Safety Challenges. Retrieved from http://www.skybrary.aero/index.php/Cockpit_Automation_-_Advantages_and_Safety_Challenges

7. Oxford Dictionaries (2015). Definition of Error of Omission. Retrieved from http://dictionary.cambridge.org/dictionary/english/error-of-omission

8. Oxford Dictionaries (2015). Definition of Commission Error. Retrieved from http://dictionary.cambridge.org/dictionary/english/error-of-commission?q=commission+error

9. Parasuraman, R., & Manzey, D.H. (2010). Complacency and Bias in Human Use of Automation: An Attentional Integration. Human Factors: The Journal of the Human Factors and Ergonomics Society, 52, pp. 381-410.

10. Mosier, K.L., & Skitka, L.J. (1996). Human decision makers and automated decision aids: Made for each other? In: Parasuraman, R., & Manzey, D.H. (2010).

11. De Boer, R.J., Heems, W., & Hurts, K. (2014). The Duration of Automation Bias in a Realistic Setting. The International Journal of Aviation Psychology, 24(4), 287-299. DOI: 10.1080/10508414.2014.949205

12. Sarter, N., Woods, D.D., & Billings, C. (1997). Automation Surprises. In G. Salvendy (Ed.), Handbook of Human Factors & Ergonomics, second edition. Wiley.

13. Kaber, D.B., Perry, C.M., Segall, N., McClernon, C.K., & Prinzel III, L.J. (2006). Situation awareness implications of adaptive automation for information processing in an air traffic control-related task. International Journal of Industrial Ergonomics, 36, pp. 447-462.

14. BEA (2012). Final Report – AF447.

15. Gray, G.A.C. (2007). Pilot Handling of Highly Automated Aircraft.

16. Wiener, E.L. (1989). Human factors of advanced technology ("glass cockpit") transport aircraft (NASA Contractor Report No. 177528). Moffett Field, CA: NASA Ames Research Center.

17. Antonovic, B. (2008). The Flight Crew and Automation.


18. EASA (2013). EASA Automation Policy: Bridging Design and Training Principles.

19. Letsu-Dake, E., Rogers, W., Dorneich, M.C., & DeMers, R. (2012). Innovative Flight Deck Function Allocation Concepts for NextGen. pp. 301-310.

Image references (top to bottom, left to right)

1. Julian Hiraki (2014)

2. The Dutch Safety Board (2010). Crashed during approach, Boeing 737-800, near Amsterdam Schiphol Airport (project number: M2009LV0225_01). Retrieved from http://www.onderzoeksraad.nl/uploads/items-docs/1748/Rapport_TA_ENG_web.pdf

3. FAA Aviation Handbook (2013). Principles of Flight. Retrieved from https://www.faa.gov/regulations_policies/handbooks_manuals/aircraft/airplane_handbook/

4. BEA (2012). Final Report on the accident [...] flight AF447 Rio de Janeiro – Paris. Retrieved from http://www.bea.aero/docspa/2009/f-cp090601.en/pdf/f-cp090601.en.pdf

Summary

Within aviation, cockpit automation is used to create a safe and efficient flight and to increase passenger comfort. With the rise of ever more highly reliable automation in the cockpit, safety is safeguarded. Unfortunately, the rise of automation has also brought a number of problems concerning human-computer interaction. As a result of these problems, several fatal accidents have already occurred, including Turkish Airlines flight TK1951 and Air France flight AF447. Two phenomena in particular play a major role: automation bias and automation surprise. Better crew training and redefining the responsibilities of pilots are recommended to solve these problems.

This is a Luchtvaartfeiten.nl / AviationFacts.eu publication.

Authors: Julian Hiraki & Mike Warnink

Editorial staff: R.J. de Boer PhD MSc, G. Boosten MSc, A.L.C. Roelen & G.J.S. Vlaming MSc

Copying texts is allowed. Please cite:

'Luchtvaartfeiten.nl (2016), Cockpit Automation Factsheet, www.luchtvaartfeiten.nl'

Luchtvaartfeiten.nl is an initiative by the Aviation Academy at the Amsterdam University of Applied Sciences (HvA). Students and teachers share knowledge with politicians and the general public to ensure discussions are based on facts.

February 2016