
Review: Development and trends in vehicle safety automation

Dominic Portain

04/25/10

Abstract

Objective: This review covers the recent 50 years of driving-related research and offers an outlook into future developments of vehicle safety automation from the perspective of human factors and ergonomic design. Background: Safety features in automotive design and engineering include an increasing number of automated processes, creating the need to evaluate driver behavior and interaction. Structure: A brief comparison of the contemporary basic approaches to automation is presented to provide a solid foundation for the following topics. After a brief historical overview of related findings in automotive safety, the focus shifts to present issues and trends: the example of intelligent seatbelt indicators serves as an introduction to the discussion between hard and soft automation. Finally, the combined remarks are compiled into an outlook on the future development of safety automation in automotive design.

Introduction

Ever since the introduction of the personal automobile, considerable effort has been expended to increase safety for the passengers while driving. The safety devices of early cars were directly adopted from horse carriages, as both travelled at approximately the same speed and weight. However, as automobile parts became more powerful and electric systems increasingly portable, new features - such as the brake light - became necessary to prevent frequent accidents. Along with increasingly miniaturized electronics, essential features as well as comfort and safety features were subject to a strong trend towards automation. Soon, indicator lights were blinking in a steady rhythm and the engine no longer required a crank for startup.

Approximately in the early 1970s, a paradigm that would later be described as performance-oriented by Parasuraman & Riley (1997) peaked in popularity among automotive engineers and designers. Traffic accidents were seen as a consequence of deficient mechanical properties of vehicle parts. As the historic solutions often exhibited undesired behavior in extreme situations (such as self-oscillating suspensions that cause a barrel roll after abrupt changes of direction), this explanation followed sound reasoning. Only when the major technical issues were resolved - traffic accidents dropping by almost 80% - did another contributing factor for accidents become increasingly evident. The human in the driver's seat - and his often inexplicably low performance - had to be incorporated into future safety measures as well.

Performance versus Behavior

The major paradigm shift in those decades was led by the increasingly important distinction between performance and behavior (Lee, 2008b). While performance in the traditional sense could be (and, in military contexts, was) described as the limit of physical and mental capabilities, behavior was a better description of the processes in daily traffic. In contrast to the military definition, the possible limits of man and machine were found to be rarely reached even in extreme situations: casual drivers possess neither the extensive training nor the clear objectives that make the performance paradigm a valid assumption in combat situations. Several other factors prove to be influential on drivers' behavior:

• limits of perception and attention (especially in heavy traffic) decrease the information basis on which reactions are based;


• the three most common impairments - fatigue, distraction and alcohol - decrease reaction speed and accuracy;

• priorities, safety margins and trade-offs (such as the cost of an accident versus the time of arrival) effectively decrease the general limit of performance;

• personal goals, needs and motivations lead to subjectively biased - and, consequently, unpredictable - decisions.

In addition to these factors - which are more or less valid for every human driver - individual differences play an essential role in modelling the causes of an accident. For example, the risk of having a fatal accident is increased fivefold for elderly people (80 and beyond) compared to the age group between 40 and 50 years. Although the risk is similarly increased tenfold when comparing drivers under the age of 20 to the reference group, the underlying reason is completely different. In order to understand why pure risk percentages are insufficient when planning a future safety feature, one must first acknowledge the inherent complexity and inhomogeneity within the user group. Made possible by advances in the young fields of cognitive psychology and cognitive ergonomics, several studies were conducted that introduced a certain amount of transparency into the field of complex interactions between man and machine.

Adoption of Technology

Parasuraman & Riley (1997) introduced four rudimentary but robust categories that describe the basic forms of interaction: use, abuse, misuse and disuse. The distinction is reasonably robust because it is based on the interaction of underlying long-term factors, such as personal skills or design usability. First, the concept of use describes the ideal type of interaction: the system provides valuable support to the user's task, and the user decides to engage the automation in adequate situations (e.g., the rain-intensity-dependent windscreen wiper that saves the user from constantly switching the wiping speed). In automotive design, solutions that promote use are highly desired because they decrease cognitive workload (CWL; see Patten et al., 2004) by enabling cognitive outsourcing. In cases of abuse, the user is willing to accept the system's intervention; however, most likely due to an unforeseen combination of circumstances, the automation either does not provide adequate support or even counteracts the user. A prominent example is the safety system in modern aircraft that prevents full reverse throttle when no weight is measured on the landing wheels. This principle also holds when the responsible sensor has failed, preventing a safe landing. Unfortunately, this design flaw only becomes apparent at the moment when the catastrophe is already inevitable.
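The four categories can be summarized as a small decision rule. The following sketch is purely illustrative - the boolean inputs and the function name are assumptions, not part of Parasuraman & Riley's framework, which treats abuse as a design-level rather than an episode-level failure:

```python
from enum import Enum

class Interaction(Enum):
    USE = "use"        # operator engages automation where it genuinely helps
    ABUSE = "abuse"    # the design applies automation against the operator's interest
    MISUSE = "misuse"  # operator engages automation beyond its capabilities
    DISUSE = "disuse"  # operator ignores automation that could have helped

def classify(operator_engages: bool, automation_adequate: bool,
             design_overrides_operator: bool = False) -> Interaction:
    """Classify one man-machine episode along the four categories (sketch)."""
    if design_overrides_operator:
        # e.g., reverse-throttle lockout persisting after a sensor failure
        return Interaction.ABUSE
    if operator_engages:
        return Interaction.USE if automation_adequate else Interaction.MISUSE
    if automation_adequate:
        return Interaction.DISUSE
    # declining automation that could not have helped is also appropriate use
    return Interaction.USE

# Example: trusting a parking sensor to detect light bushes (it cannot)
print(classify(operator_engages=True, automation_adequate=False))
```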

To understand the remaining two aspects of flawed man-machine interaction, grasping the concept of operator confidence, or trust, is imperative. Due to a series of psychological factors - including several misattributed heuristics from social psychology - the operator of a complex system develops a decision bias concerning whether to trust the automated process. As a general rule, trust increases with the regularity of expected behavior and is severely decreased after an unexpected event. An operator confidence level that is too high (overestimation of reliability) is described by the term misuse, while an overly low confidence level creates disuse. Both processes lead to flawed operator decisions concerning the use of existing automation - either through trusting a system that is not adequately equipped to cope with the situation (e.g., a parking sensor that is expected to detect light bushes) or through not engaging the automation even when it could legitimately support the user (e.g., turning the parking sensor off before reversing into one's garage).
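The asymmetry described above - slow trust growth under regular behavior, a sharp drop after one surprise - can be sketched as a one-step update rule. The functional form and the `gain`/`penalty` parameters are assumptions chosen only to illustrate the asymmetry, not a model from the cited literature:

```python
def update_trust(trust: float, behaved_as_expected: bool,
                 gain: float = 0.05, penalty: float = 0.5) -> float:
    """One-step trust update: gradual growth towards full confidence when the
    automation behaves as expected, a sharp multiplicative drop otherwise.
    All parameter values are illustrative."""
    if behaved_as_expected:
        return min(1.0, trust + gain * (1.0 - trust))
    return trust * (1.0 - penalty)

trust = 0.5
for _ in range(30):                  # a month of uneventful trips
    trust = update_trust(trust, True)
# trust has climbed close to 0.9 ...
trust = update_trust(trust, False)   # ... and one unexpected event halves it
```

A single unexpected intervention thus undoes weeks of accumulated confidence, which matches the asymmetry the text attributes to operator trust.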

For these aspects, it is imperative not only to estimate the amount of perceived confidence early in the design, but also to consider the method of enabling or disabling a specific automation. In addition to the operator's evaluation of whether or not to trust the automation (decision bias), the action can also influence perceived control and risk normalization: current risk psychology assumes that operators behave in an equilibrium of (perceived) risk and control. The activation of an additional safety mechanism can lead to an inadequate increase in perceived safety, causing a greater tolerance towards other risks. For example, this mechanism leads to more careless driving behavior (higher speeds, smaller safety distances) when the adaptive cruise control is perceived as capable of executing the necessary braking maneuvers.


If, additionally, either road or traffic conditions fall outside the implemented parameters, the situation imperceptibly degrades to misuse. When the safety automation is not consciously perceived - as is the case, for example, with seatbelts or airbags - the negative influence of risk normalization is far smaller than the actual increase in safety, causing an overall positive effect.

To prevent the occurrence of each of the several biases (confidence, perceived risk and control pose only a small, although significant, sample), current ergonomic design strives to reach a balance of authority (Lee, 2008a): the operator is in complete control as long as he is capable of resolving the situation, yet the automation takes the initiative when it is adequate to provide support. This balance requires an excellent evaluation of capabilities by both the user and the automation - and careful consideration from the designer when including the interface to enable or disable certain automatic safety systems. As these issues require increasingly detailed psychological insights much more than physical performance data, the view on automation has generally shifted towards an operator-centric perspective.

Hard or Soft Driving?

The balance of authority is a rather recent prospect for automotive engineers. The issue has, however, already received considerable attention in aviation over the past two decades. The two opposing approaches, soft and hard automation, are prominently represented by the design paradigms of Boeing and Airbus. If the process of pulling up a Boeing 777 exceeds the plane's structural safety limit, the yoke becomes stiffer - increasing the force required to perform the critical action (soft automation). In contrast, the Airbus A380 limits the actions of its pilots to the predefined safety margins (hard automation). In an incident involving a critical engine failure on a China Airlines 747 in 1985 (see Young, Stanton and Harris, 2007), the airplane suffered significant structural damage while losing almost 30,000 feet in an uncontrollable dive. Because the control inputs that led to the recovery of control exceeded the safety limits, the plane would have crashed under hard automation. Overall, the Airbus paradigm is designed to overrule small control mistakes or human overreactions that could result in larger accidents, in turn taking control from the pilots when an unforeseen situation occurs (bias towards abuse: designers ignore operators' skills). The Boeing paradigm, in contrast, is designed to refrain from taking control even over apparent errors (bias towards disuse: operators ignore automation capabilities) and to provide adequate feedback instead.
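The two paradigms can be contrasted in a minimal control sketch, assuming a normalized control input where 1.0 marks the safety envelope. Both functions and their parameters are illustrative simplifications, not actual Boeing or Airbus control laws:

```python
def hard_limit(commanded: float, limit: float) -> float:
    """Hard automation: the commanded input is clamped to the safety
    envelope; inputs beyond the limit are simply denied."""
    return max(-limit, min(limit, commanded))

def soft_resistance(commanded: float, limit: float, stiffness: float = 3.0) -> float:
    """Soft automation: the input passes through unchanged, but the
    relative control force the pilot must apply grows steeply once the
    envelope is exceeded (returns the force multiplier, illustrative)."""
    excess = max(0.0, abs(commanded) - limit)
    return 1.0 + stiffness * excess

# A recovery maneuver requiring 1.4x the nominal load limit:
hard_limit(1.4, 1.0)       # clamped to 1.0 - the maneuver is denied
soft_resistance(1.4, 1.0)  # allowed, at roughly 2.2x the normal yoke force
```

Under the hard rule, the out-of-envelope recovery inputs of the 1985 incident would have been clipped; under the soft rule they remain available, at the cost of deliberate extra effort.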

Young et al. performed a meta-study to provide an indication of which paradigm yields the more desirable results. The indicator in air traffic can be directly compared to automotive traffic: the number of accidents caused by automation errors in either of the two aircraft types. The data reveal that more than twice as many major faults were caused by flawed automation systems following the hard approach. Additionally, many of the Boeing-related accidents were only tenuously related to automation, being primarily caused by a lack of situational awareness among the crew. In the aviation industry, conclusions were drawn from these tendencies. Overly automated systems seem to develop animacy: they are perceived to have a will of their own that has to be defeated when the intentions of humans and automation do not match. This problem was condensed into a lack of mode awareness: the crew could not understand the procedure currently being executed by the automation, provoking a blind wrestling for control. For example, when an automated flight level change was overridden by a manual increase of the climb rate, an "invisible" mode was activated - resulting in surprising behavior when this mode caused an error during the landing procedure (several hours later).

As a resolution to the conflict between soft and hard automation, the authors introduce a third philosophy - again based on the concept of shared authority. To improve communication between operator and machine and to prevent errors (e.g., of mode awareness), an intuitive feedback system is proposed, to be constructed using usability engineering.

Application

One of the earliest technologies in vehicle safety to incorporate the principle of shared authority is the intelligent seatbelt indicator (Lie et al., 2008). This implementation, although technically following the hard paradigm, refrains from taking control of any driving functions. As an early example of successfully shared authority, the safety feature only provides constant (visible and audible) feedback to the operator while his seatbelt is not yet engaged. The Swedish researchers compared seatbelt use in automobiles with and without seatbelt indicators within urban areas across Europe. The results showed that the lack of this device is associated with a fivefold increased rate of driving without a seatbelt. The seatbelt indicator design uses risk normalization to its advantage: actual seatbelt use is perceived as almost neutral where risk prevention is concerned (see Lee, 2008b). By bringing the lack of a seatbelt to the driver's attention, the perceived risk is increased - causing either more careful driving behavior or the engagement of the seatbelt. An interesting factor concerns perceived control, as drivers' risk perception is also influenced by their actual choice of whether to wear the seatbelt. The system is currently designed to stop after at least 90 seconds of signaling; the rate of seatbelt use is expected to decrease slightly when this is not the case (and drivers feel forced into engaging the seatbelt).
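The shared-authority character of the indicator lies in its cutoff: the reminder signals persistently but eventually yields, leaving the final choice with the driver. A minimal sketch of that behavior, where only the 90-second cutoff comes from the text and the function interface is an assumption:

```python
def reminder_active(belt_engaged: bool, seconds_since_start: float,
                    cutoff: float = 90.0) -> bool:
    """Intelligent seatbelt indicator sketch: signal (visibly and audibly)
    while the belt is unbuckled, but stop after the cutoff so that the
    driver retains the final authority over the decision."""
    return (not belt_engaged) and seconds_since_start < cutoff

reminder_active(belt_engaged=False, seconds_since_start=10.0)   # signaling
reminder_active(belt_engaged=False, seconds_since_start=120.0)  # yields
reminder_active(belt_engaged=True, seconds_since_start=10.0)    # silent
```

A version without the cutoff would cross into hard automation of the decision itself - precisely the variant the text expects to slightly depress seatbelt use.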

Currently, automobile automation can be classified into one of two categories: vehicle automation and driving automation (Bishop, 2000). The former (e.g., automatic transmissions, ABS, ESP, classic cruise control) affects only the lowest level of driving - brakes, gears, accelerator - controlled by the direct proximity (traction, engine revolutions or wheel speed). Driving automation techniques (e.g., adaptive cruise control, collision avoidance systems, parking aids) are, in contrast, context-dependent - influenced and controlled by cues from a complex environment. As a result of this inherent complexity, errors and design flaws become much more apparent and influential when driving automation is implemented under the paradigm of hard automation: the user has no means, in contrast to the soft paradigm, to override an erroneous intention in a critical situation. Generally, driving automation is therefore better suited to soft paradigms. Context-independent vehicle control, in comparison, has a far smaller spectrum of possibilities - which, additionally, are not as well perceived by the driver. Hard paradigms therefore tend to be better suited to automation systems that increase the safety and stability of vehicle control. Even if this categorization already seems to crystallize from current implementations, a sharp separation of the two paradigms is not only unnecessary, but prevents the incorporation of essential feedback that could prove useful against human mode errors and cognitive biases.

The consequence of these findings not only affects future automation in automotive and aviation design, but also coincides with the paradigm shift that can simultaneously be observed in computer science: the trend towards steadily increasing automation has come to a standstill. The increasingly important focus on usability, together with the retreat from clustering multiple services into one bundle (e.g., the Yahoo search portal, the Adobe Creative Suite), provides the user with a perception of greater relevance. Software and automotive systems are becoming increasingly aware of their limitations, and - as the quality of "intelligent" behavior did formerly - the balance of authority will become the next most relevant design consideration. As a result of this subtle paradigm shift, interaction with technical devices will become much more effortless in the near future - allowing the intuitive incorporation of several automotive solutions into our daily routines. As the "all-in-one" paradigm further decreases in relevance, we will be using (and depending on) several different devices throughout the day - each interface optimized to present only the most necessary choices and information.

Finally, the increasingly confident prognosis about the development of fully automated automobiles (proclaimed univocally by both futurologists and engineers) will probably not become reality in the next few decades - not until the automation can retain vehicle control without any need for human intervention. Ironically, once the automation is more capable than a human driver in all situations, the driver becomes utterly superfluous.

Conclusions

The view on interface design and safety automation has undergone a shift towards a user-centric perspective. One of the essential aspects concerns the balance of authority between driver and automated system, which can additionally be influenced by a variety of psychological factors. The choice of one or another type of design (e.g., hard or soft automation) can strongly influence the actual increase in safety in daily usage.

Designers are strongly advised to incorporate users and their cognitive biases as early as the first stages of design. Not only does the cost of usability optimization increase exponentially with elapsed progress; close cooperation also prevents all involved parties from falling victim to common engineering fallacies.


References

[Bishop(2000)] R. Bishop. A survey of intelligent vehicle applications worldwide. In Proceedings of the IEEE Intelligent Vehicles Symposium, volume 2000, 2000.

[Lee(2008a)] John D. Lee. Fifty years of driving safety research. Human Factors: The Journal of the Human Factors and Ergonomics Society, 50(3):521-528, June 2008a. doi: 10.1518/001872008X288376. URL http://hfs.sagepub.com/cgi/content/abstract/50/3/521.

[Lee(2008b)] John D. Lee. Review of a pivotal human factors article: "Humans and automation: Use, misuse, disuse, abuse". Human Factors: The Journal of the Human Factors and Ergonomics Society, 50(3):404-410, June 2008b. doi: 10.1518/001872008X288547. URL http://hfs.sagepub.com/cgi/content/abstract/50/3/404.

[Lie et al.(2008)] A. Lie, M. Krafft, A. Kullgren, and C. Tingvall. Intelligent seat belt reminders - do they change driver seat belt use in Europe? Traffic Injury Prevention, 9(5):446-449, 2008.

[Parasuraman and Riley(1997)] R. Parasuraman and V. Riley. Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39(2), 1997.

[Patten et al.(2004)] C. J. D. Patten, A. Kircher, J. Östlund, and L. Nilsson. Using mobile telephones: cognitive workload and attention resource allocation. Accident Analysis & Prevention, 36(3):341-350, 2004.

[Young et al.(2007)] M. S. Young, N. A. Stanton, and D. Harris. Driving automation: learning from aviation about design philosophies. International Journal of Vehicle Design, 45(3):323-338, 2007.
