
Automobile sensors may usher in self-driving cars

By Margery Conner, Technical Editor
EDN | May 26, 2011 (Image: iStock)

Google last year demonstrated the results of its research-and-development efforts to create an autonomous vehicle. The small fleet of specially equipped cars (six Toyota Priuses and one Audi TT) has logged more than 140,000 miles of daytime and nighttime driving in California, including traversing San Francisco's famously crooked Lombard Street and the Los Angeles freeways (Figure 1). In all cases, an engineer was in the driver's seat, monitoring each car's performance and ready to take over if necessary.

A robocar of the future would be so intelligent that its driver would be able to read, play, or work rather than piloting the car. The benefits would include safety, freeing up the driver for other tasks or recreation, and more effective use of the traffic infrastructure due to more efficient traffic regulation and better fuel efficiency.

Motor-vehicle accidents are the leading cause of death of 13- to 29-year-olds in the United States. According to Sebastian Thrun, an engineer at Google and the director of the Stanford Artificial Intelligence Laboratory, which created the Google robocar, almost all of these accidents are the result of human error rather than machine error, and he believes that machines can prevent some of these accidents. "We could change the capacity of highways by a factor of two or three if we didn't rely on human precision for staying in the lane and [instead] depended on robotic precision," says Thrun. "[We could] thereby drive a little bit closer together in a little bit narrower lanes and do away with all traffic jams on highways."

Increasing highway capacity by a factor of two or three with no added infrastructure costs and freeing an hour or two a day for productive or relaxing pursuits seem like worthy goals, but how close is the auto industry to achieving a practical self-driving car? Google is not in the car-production business and has no business plan for monetizing its research (Reference 1). In Google's approach, autonomous vehicles will not require a government mandate to become reality. The Google fleet uses LIDAR (light-detection-and-ranging) technology, such as that in a system


[Self-driving cars offer many advantages over often-distracted human drivers. Although the emergence of a truly autonomous car is several years in the future, currently available radar, LIDAR, and vision-based systems are coalescing into an automotive-sensor platform that points the way toward tomorrow's robocar.]


available from Velodyne's HDL-64D (high-definition-LIDAR) laser-sensor system, which uses 64 spinning lasers and gathers 1.3 million points/sec to create a virtual model of its surroundings. One reason to use LIDAR rather than radar is that the laser's higher-energy, shorter-wavelength light reflects better off nonmetallic surfaces, such as humans and wooden power poles. Google combines the LIDAR system with vision cameras and algorithmic vision-processing systems to construct, and react to, a 3-D view of the world through which it is driving (Reference 2).
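To make concrete how a spinning-laser sensor builds that virtual model, the sketch below converts a single LIDAR return (range plus the laser's pointing angles) into a Cartesian point. The function name and angle convention are illustrative, not Velodyne's API:

```python
import math

def lidar_return_to_xyz(range_m, azimuth_deg, elevation_deg):
    """Convert one LIDAR return into a point in the sensor's frame.
    Here x points forward, y to the left, and z up."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    horiz = range_m * math.cos(el)      # projection onto the horizontal plane
    return (horiz * math.cos(az),       # forward
            horiz * math.sin(az),       # sideways
            range_m * math.sin(el))     # vertical

# 64 lasers at fixed elevations, each sampled as the head spins through
# 360 degrees of azimuth, yield the roughly 1.3 million points/sec cloud.
point = lidar_return_to_xyz(10.0, 0.0, 0.0)  # a level return 10 m straight ahead
```

Accumulating these points over one head revolution gives the 3-D snapshot that the vision-processing software reasons about.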

The sensor hardware enables the cars to see everything around them and to make decisions about every aspect of driving, according to Thrun. Although we are not yet close to a fully autonomous vehicle, the technology, including the sensor platform of radar, ultrasonic sensors, and cameras, is available in today's intelligent vehicle. It remains only to standardize the car's hardware platform and develop the software. Cars are approaching the point that smartphone platforms had reached just before the introduction of the Apple iPhone and the first Motorola Android phones.

As sensors decrease in price and increase in integration, they will become ubiquitous in all cars. Once users accept them as normal parts of a car, automotive-OEM companies can integrate more intelligence into them until they achieve the goal of an autonomous car. Today's intelligent automobile can perform many driver-assistance tasks, such as avoiding accidents and reducing their severity. To perform these tasks, the vehicles have passive safety systems, such as air bags and seat belts; active safety systems, such as electronic stability control, adaptive suspension, and yaw and roll control; and driver-assistance systems, including adaptive cruise control, blind-spot detection, lane-departure warning, drowsy-driver alert, and parking assistance. These systems require many of the same sensors that the autonomous car requires: ultrasonic sensors, radar, LIDAR systems, and vision-imaging cameras.

Cars now use ultrasonic sensors to provide proximity detection for low-speed events, such as parallel parking and low-speed collision avoidance. Ultrasonic detection works only at low speeds because it senses acoustic waves; when the car is moving faster than a person can walk, the ultrasonic sensor is blind.
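The parking-range math behind those ultrasonic sensors is simple time-of-flight: the sensor emits a pulse and measures how long the echo takes to return. A minimal sketch, assuming room-temperature air:

```python
SPEED_OF_SOUND_M_S = 343.0  # in air at about 20 degrees C; varies with temperature

def echo_distance_m(round_trip_s):
    """Distance to an obstacle from an ultrasonic echo's round-trip time.
    The pulse travels out and back, so halve the total path."""
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0

# An echo arriving after about 5.8 ms puts the obstacle roughly 1 m away,
# which is typical parallel-parking range.
```

The slow propagation of sound is also why the technique fails at speed: by the time a distant echo returns, a fast-moving car has already covered much of the gap.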

Although ultrasonic-sensor technology is more mature and less expensive than radar, car designers who care about the aesthetics of the car's appearance are reluctant to have too many sensor apertures visible on the car's exterior. As a more powerful and more flexible technology, radar should begin to replace ultrasonic sensors in future designs (Figure 2).

At a Glance
↘ Although self-driving cars are still five to 10 years in the future, they offer many benefits in traffic and fuel efficiency, safety, and time saving.
↘ Self-driving cars will require relatively few infrastructure changes, with cars relying on sensors and processors to "see" and react to their surroundings.
↘ Some of the key sensors will be radar or LIDAR (light-detection-and-ranging) and vision systems, such as visible-light and IR (infrared) cameras.

Figure 1 Google's fleet of specially equipped cars has logged more than 140,000 miles of daytime and nighttime driving in California, including traversing San Francisco's famously crooked Lombard Street and the Los Angeles freeways.

Radar works in any type of weather and has short-, medium-, and long-range characteristics. For example, adaptive cruise control works in the long range, looking 200 m in front of the car, tracking the car ahead, and accelerating or braking to maintain a certain distance. Radar also provides blind-spot detection and lane-departure warning. Early versions of these systems audibly warned the driver of an impending problem, but some implementations now take control of the car to avoid the problem. For example, the 2011 Infiniti M56 has an optional blind-spot-warning/intervention system that relies on radar scans of the left and right rear quadrants of the car. If the radar system detects a car in the driver's blind spot, a light comes on. If the driver activates the turn signal, an audible beep sounds. If the driver persists and starts to move into the other lane, the car gently applies the brakes on the opposite side of the car, moving the car back into the center of the lane (Reference 3).
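The escalation the Infiniti system follows (light, then beep, then a corrective brake nudge) can be sketched as a small decision function. The input and action names here are hypothetical, not Infiniti's implementation:

```python
def blind_spot_response(car_in_blind_spot, turn_signal_on, drifting_into_lane):
    """Pick the strongest warranted response, mirroring the behavior
    described in the text: visual warning, then audible warning, then
    one-sided braking to steer the car back toward lane center."""
    if not car_in_blind_spot:
        return "none"
    if drifting_into_lane:
        return "brake_opposite_side"
    if turn_signal_on:
        return "audible_beep"
    return "warning_light"

# A driver who signals and then drifts while a car sits in the blind spot
# triggers the strongest intervention.
action = blind_spot_response(True, True, True)
```

The point of the layering is that the car intervenes physically only after the two warnings have failed to change the driver's behavior.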

Most automotive radar systems currently are not highly integrated; they take up significant space and are costly. Analog Devices' recently introduced AD8283 integrated automotive-radar-receiver analog front end represents the increasing integration that decreases the size and cost of automotive radar (Figure 3). It will sell for about 50% less than a discrete automotive analog-front-end design and fits into a 10×10-mm package. "The market is moving toward putting radar into nonluxury vehicles, cars for the rest of us," says Sam Weinstein, product manager for the Precision Linear Group at Analog Devices. The sample price for a six-channel AD8283 is $12.44 in 1000-unit quantities.

IR (infrared) LEDs and photosensors find use in automotive applications, such as rain sensing/wiper activation on

[Figure 2 diagram: radar and ultrasonic sensing zones around the car, labeled lane-change assistance, blind-spot detection, cross-traffic alert, self-parking, lane-departure warning, side impact, parking assistance/vision, adaptive cruise control, and brake assistance/collision avoidance.]

Figure 2 Several driver-assistance systems are currently using radar technology to provide blind-spot detection, parking assistance, collision avoidance, and other driver aids (courtesy Analog Devices).


 

[Figure 3 block diagram: a power-supply unit; transmitter-channel signal generation, in which a ramp generator drives a VCO (voltage-controlled oscillator), power amp, and antenna on a monolithic microwave IC; receiver-channel signal processing, in which LNA (low-noise-amp), PGA (programmable-gain-amp), and AAF (anti-aliasing-filter) chains feed a multiplexer and ADC; and a DSP/microcontroller that connects to the host CPU over a controller-area network or FlexRay.]

Figure 3 The Analog Devices AD8283 radar analog front end comprises transmitter-channel-signal-generation, power-supply, host-CPU, receiver-channel-signal-processing, and monolithic-microwave-IC blocks.
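The ramp generator and VCO in Figure 3 are the signature of an FMCW (frequency-modulated-continuous-wave) radar: the receiver mixes the returning chirp with the outgoing one, and the resulting beat frequency encodes target range. A sketch with illustrative chirp numbers, not AD8283 specifics:

```python
C_M_S = 3.0e8  # speed of light, m/s

def fmcw_range_m(beat_freq_hz, chirp_duration_s, chirp_bandwidth_hz):
    """Target range from the mixer's beat frequency. The echo's round-trip
    delay shifts the received chirp by f_beat = (2R/c) * (B/T), so
    R = c * f_beat * T / (2 * B)."""
    return C_M_S * beat_freq_hz * chirp_duration_s / (2.0 * chirp_bandwidth_hz)

# With a 200-MHz sweep over 1 ms, a beat of about 267 kHz corresponds to a
# car roughly 200 m ahead, the adaptive-cruise-control range quoted earlier.
```

In a chain like Figure 3's, the beat signal would be what the LNA/PGA/AAF path conditions before the ADC digitizes it and the DSP extracts the frequency.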

the BMW 7 series and the Ford Edge. Sophisticated IR cameras enable safety applications, such as drowsy-driver sensing, which is also an option in the Mercedes E550 sedan. Drowsy-driver sensing uses an IR camera to watch the driver's eyelids to tell whether they are blinking rapidly, indicating that the driver is alert, or blinking slowly or even closing. The car then emits an audible warning or vibrates the driver's seat.

Out-of-position sensing similarly uses IR cameras. Today's passenger seats must have pressure sensors to determine the weight of the passenger and use the


information to deploy the passenger's air bags. The air bags deploy at different speeds, depending on the weight of the passenger. This sensor does not know, however, whether the passenger is leaning on the dashboard, reclining in the seat, or moving to the left or the right. The closer the passenger is to the deploying air bag, the greater the impact. The camera monitors the passenger's position and, upon impact, deploys the air bag appropriately to the passenger's size and position.
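The drowsy-driver sensing described above boils down to classifying eyelid behavior over time. One common formulation is a PERCLOS-style measure: the fraction of recent camera frames in which the eyes are closed. The threshold below is illustrative, not a production calibration:

```python
def drowsiness_alert(eyes_closed_frames, threshold=0.3):
    """Return True when the fraction of recent IR-camera frames showing
    closed eyes exceeds the threshold, which would be the cue to sound
    an audible warning or vibrate the driver's seat."""
    if not eyes_closed_frames:
        return False  # no data yet; do not alert
    closed_fraction = sum(eyes_closed_frames) / len(eyes_closed_frames)
    return closed_fraction > threshold

# Slow blinks and closures dominate this window, so the car would alert.
drowsy = drowsiness_alert([True, True, False, True, False])
```

Rapid, short blinks keep the closed fraction low; slow blinks and lid droop push it over the threshold.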

These cameras use IR LEDs rather than LEDs in the visible spectrum because they must be able to work at night; it would be distracting to illuminate the driver or the passenger with visible light for the camera to sense. The human eye detects light at wavelengths as long as approximately 700 nm, whereas these IR cameras work with 850- to 900-nm light.

IR imaging also has a place outside the car for crash avoidance, and these applications require IR illumination. According to Sevugan Nagappan, marketing manager of the infrared business unit at Osram Opto Semiconductors, IR cameras can help in collision avoidance by seeing beyond what the high beams illuminate. "IR-LED illumination allows you to see when you can't have your high beams on to see past your headlamps, for example, allowing the system to see beyond the headlights to see and avoid a deer entering the road," he says.

IR LEDs' primary use so far has been in remote controls. However, these inexpensive LEDs use 10 mW or less of power, whereas automotive applications require output power greater than 1W to illuminate the subject. In addition, the IR LED must be small enough to fit next to the IR camera and be inconspicuous. Nagappan estimates that the camera needs to measure less than 10 mm² and that the illuminating IR LED requires 5 mm². He says that manufacturers can now make LEDs in small packages that provide 3.5W and that these devices are enabling new applications. Osram's 3.5W SFH 4236 IR LED has an integral lens with a narrow beam angle to focus the IR light, increase the beam intensity, and aim the beam into the eye box to watch the driver's eyes.

Innovation is also driving down the cost of the cameras. The Fraunhofer Institute expects to bring to market a camera as small as a grain of salt and costing only a few euros; its resolution currently is 250×250 pixels. These cameras could replace side-view mirrors, reducing airflow drag (Figure 4).

References
1. Markoff, John, "Smarter Than You Think: Google Cars Drive Themselves, in Traffic," The New York Times, Oct 9, 2010, http://nyti.ms/jZUkLX.
2. Schwarz, Brent, "LIDAR: Mapping the world in 3D," Nature Photonics, July 2010, pg 429, http://bit.ly/ktNDmr.
3. "Blind Spot Warning (BSW) System/Blind Spot Intervention (BSI) System," YouTube, http://bit.ly/jmGVzE.

Figure 4 The Fraunhofer Institute expects to bring to market a camera as small as a grain of salt, which could replace side-view mirrors, reducing airflow drag.

You can reach Technical Editor Margery Conner at 1-805-461-8242 and [email protected].

For More Information
Analog Devices: www.analog.com
Apple Computer: www.apple.com
BMW: www.bmwusa.com
Ford: www.ford.com
Fraunhofer Institute: www.fraunhofer.de/en
Google: www.google.com
Infiniti: www.infinitiusa.com
Mercedes-Benz: www.mbusa.com
Motorola: www.motorola.com
Osram Opto Semiconductors: www.osram-os.com
Stanford Artificial Intelligence Laboratory: http://robotics.stanford.edu
Toyota: www.toyota.com
Velodyne: www.velodyne.com