The Self-Driving Car: An exercise in technology assessment. Fred Phillips, fred.phillips@stonybrook.edu

Posted on 24-Jan-2017

Page 1: The Self-Driving Car

The Self-Driving Car: An exercise in technology assessment

Fred Phillips, fred.phillips@stonybrook.edu

Page 2: The Self-Driving Car

A word about words...

• In the past, “self-drive car” meant one driven by the owner, rather than by a hired driver.

• Today we will use SDC to mean driverless car.
– Or AV for “autonomous vehicle.”

• Which raises an interesting initial question:
– In countries where upper-middle-class car owners can afford to hire a driver, will owners see any point at all in buying an SDC?

Page 3: The Self-Driving Car

Why SDCs?

• “The overriding reason for self-driving cars is to save lives and reduce injuries.
– “About 90% of all crashes are caused at least in part by human error.”
• “The[y] will also slash the costs of car accidents, estimated at $300 billion annually in the U.S.
– “That cost is a direct result of the fact that we humans are, frankly, terrible drivers.”
• “... improved fuel consumption.”

http://e360.yale.edu/feature/self-driving_cars_coming_soon_to_a_highway_near_you/2554/

Page 4: The Self-Driving Car

However, traffic deaths are already declining.

Page 5: The Self-Driving Car

Perhaps more to the point...

http://en.wikipedia.org/wiki/List_of_motor_vehicle_deaths_in_U.S._by_year

Page 6: The Self-Driving Car

Failure engineering: Google SDC expected to contain up to 100M LOC

• The Chevy Volt contains ten million lines of code. http://www.wired.co.uk/magazine/archive/2012/01/features/this-car-drives-itself/viewall
• An SDC will easily involve many more LOC.
• Let’s assume 50M LOC.

Page 7: The Self-Driving Car

How many errors to expect per thousand lines of code?

• “Industry Average: about 15-50 errors per KLOC, over a mix of coding techniques.

• “Microsoft Applications: ~ 10-20 defects/KLOC during in-house testing, and 0.5/KLOC in released product.

• “A few projects - for example, the space-shuttle - have achieved a level of 0 defects in 500,000 LOC using a system of formal development methods, peer reviews, and statistical testing.”– And leisurely development times.

http://amartester.blogspot.kr/2007/04/bugs-per-lines-of-code.html

Page 8: The Self-Driving Car

Estimated bugs in SDCs

50,000,000 LOC x 0.5 bugs/1,000 LOC

= 50,000 x 0.5 = 25,000 bugs

Assuming 5% of these bugs are potentially lethal,

25k x 0.05 = 1250 potentially lethal bugs
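The arithmetic above can be written out as a short script. Note that every input is the deck's own assumption (50M LOC from Page 6, the 0.5 defects/KLOC released-product rate from Page 7, and the 5% lethal share), not measured self-driving-car data:

```python
# Bug estimate from Pages 6-8. All inputs are the deck's assumptions,
# not measured self-driving-car data.
loc = 50_000_000          # assumed lines of code in an SDC (Page 6)
defects_per_kloc = 0.5    # released-product defect rate (Page 7)
lethal_share = 0.05       # assumed fraction of bugs that are potentially lethal

total_bugs = loc / 1_000 * defects_per_kloc      # 50,000 KLOC x 0.5
lethal_bugs = total_bugs * lethal_share

print(f"Estimated bugs: {total_bugs:,.0f}")      # Estimated bugs: 25,000
print(f"Potentially lethal: {lethal_bugs:,.0f}") # Potentially lethal: 1,250
```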

Page 9: The Self-Driving Car

Is this estimate defensible?

• Time-to-market pressure will not allow endless NASA-type testing.
• Testing can be done only in simulation.
– Road testing of small numbers of SDCs; no road testing of massive numbers of SDCs.
– In projects of over 1M LOC, open-sourcing does not reduce bugs.
• What about “driver feel” that reveals subtle mechanical difficulties?
• Google is moving outside its core competence of desktop and smartphone applications.

Page 10: The Self-Driving Car

However,

• 1250 lethal bugs is a conservative estimate, because
– A bug can “crash”
• not just one car,
• but the whole network of cars with which it communicates.

• Also,
– A bad human driver can kill a few people, but
– The same bug in every SDC can kill massive numbers of people.
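The fleet-wide risk can be made concrete with a toy comparison; the fleet size and trigger probability below are purely hypothetical:

```python
# Toy model (hypothetical numbers) of why a shared software defect is worse
# than independent human error: every car ships with the same code, so one
# triggering condition can reach the entire fleet at once.
fleet_size = 1_000_000   # assumed number of SDCs on the road
p_trigger = 1e-6         # assumed per-car, per-day chance of hitting the bug

# Independent failures (human-error style): incidents are scattered.
expected_independent = fleet_size * p_trigger    # on average ~1 car per day

# Correlated failure: a common event (e.g., a malformed map update or
# car-to-car message) triggers the identical defect in every car.
cars_exposed = fleet_size                        # the whole fleet shares the bug

print(round(expected_independent), "car(s) affected if failures are independent")
print(cars_exposed, "cars exposed if the trigger is fleet-wide")
```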

Page 11: The Self-Driving Car

Other price-performance issues

• Estimate: Up to half the miles driven will be passengerless.
– E.g., the car drops you off and goes to park itself.
– Thus biasing “fatalities/vehicle-mile” stats.

• Car-to-car communications are being built into human-driven cars.
– Not unique to SDCs.
– Reduces SDCs’ increment of benefit.
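The statistical bias from passengerless miles is easy to demonstrate; the figures below are hypothetical, and only the "up to half the miles are empty" ratio comes from the slide:

```python
# How passengerless miles deflate "fatalities per vehicle-mile".
# Hypothetical figures; only the 50% empty-mile share is from the slide.
fatalities = 100
occupied_miles = 1_000_000_000
passengerless_miles = occupied_miles   # slide: up to half of all miles are empty

rate_occupied = fatalities / occupied_miles
rate_all = fatalities / (occupied_miles + passengerless_miles)

# Counting empty repositioning miles halves the apparent fatality rate,
# even though the risk per occupied trip is unchanged.
print(rate_all / rate_occupied)   # 0.5
```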

Page 12: The Self-Driving Car

Other assessment issues

• Entrenched opposition: Whose ox will be gored?
– End of the motel industry?

• Consumer acceptance– Privacy concerns

• “What If Your Autonomous Car Keeps Routing You Past Krispy Kreme?” www.theatlantic.com

Page 13: The Self-Driving Car

Consumer Watchdog Will Expose Real-Life Situations Robot Cars Can’t Handle, Then Testify In Support Of Proposed DMV Safety Regulations

• MEDIA ADVISORY — NEWS CONFERENCE. Thursday, Jan. 28, 2016 @ 9:30 am Pacific. CONTACT: John M. Simpson, 310-292-1902

• Visual:  Blow-up of pie chart showing reasons, including pedestrians, bicycles, other cars and bad weather, that caused Google’s robot car technology to fail 341 times.

• Google’s own numbers show a self-driving car needs a driver behind the wheel who can take control. Its self-driving robot cars failed and the human driver took control 341 times.  The robot technology sensed there was a problem it could not handle and turned control over to the human 272 times. The test driver was worried enough to intervene 69 times.  Reasons range from bad weather, to other reckless drivers, to a failure to correctly perceive objects like overhanging branches, to making an unwanted maneuver, as well as software and hardware failures.

• The California Department of Motor Vehicles has proposed regulations that require a driver behind the wheel capable of taking control of a self-driving vehicle. Consumer Watchdog’s John M. Simpson will support the regulations at the workshop, noting that Google’s own numbers demonstrate that robot cars aren’t ready for deployment without a human driver.

• View DMV’s background on autonomous vehicle public workshop and proposed regulations here: https://www.dmv.ca.gov/portal/dmv/detail/vr/autonomous/auto

• View DMV’s news release about the workshop here: https://www.dmv.ca.gov/portal/dmv/detail/pubs/newsrel/newsrel16/2016_02
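The disengagement counts quoted above split as follows (272 technology-initiated plus 69 driver-initiated):

```python
# Breakdown of the 341 disengagements cited by Consumer Watchdog from
# Google's report: car-initiated vs. driver-initiated hand-backs.
tech_initiated = 272     # car sensed a problem and handed over control
driver_initiated = 69    # test driver was worried enough to intervene

total = tech_initiated + driver_initiated
print(total)                                     # 341
print(round(tech_initiated / total * 100, 1))    # 79.8 (percent car-initiated)
print(round(driver_initiated / total * 100, 1))  # 20.2 (percent driver-initiated)
```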

Page 14: The Self-Driving Car

Discussion questions

1. What other qualitative considerations should supplement this quantitative “failure analysis,” to make a complete assessment?

2. Where might the biggest markets for SDCs first develop? Where will SDCs remain irrelevant for the near future?

3. What other social, industrial, and policy changes will follow the introduction of SDCs? (Other than a hit to motels!)

4. Would you buy a car that is programmed to kill you? http://qz.com/536738/should-driverless-cars-kill-their-own-passengers-to-save-a-pedestrian/

Page 15: The Self-Driving Car

More discussion questions

5. How do passengerless miles bias the fatalities-per-vehicle-mile statistics?

6. Try to separate the ‘safety’ and ‘car-to-car communications’ aspects from the ‘driverless’ aspect. How does this affect your assessment?

7. Read the short article on Ford’s futures department.

– What is the relationship between cycle time and tech forecasting & assessment?

Page 16: The Self-Driving Car

Alexis Madrigal reports in The Atlantic:

Daimler's Car2Go, which offers on-demand, one-way rentals to its users, crashed. Not physically, but in the code that powers the ridesharing service and controls the cars. Would-be drivers in Washington, Los Angeles, Vancouver, Portland, and other cities couldn't access the fleet of vehicles, leaving the service's customer-service crews scrambling on social media to explain what was going on. Starting at 2pm, the service's city-level Twitter accounts started warning people that they were experiencing "a partial interruption and are quickly working to resolve the issue." For about 12 hours, the service appears to have been completely down.

For those who remember Twitter's fail whale, it was a familiar scene. But the difference between not being able to send tweets and not being able to drive home from work or pick up your kids is huge. As with the hackable toilet we reported on last week, when we make pieces of our infrastructure "smart" with computers, we also give them the other characteristics of computers, like bugs, crashes, hackability, and downtime. These tradeoffs might be worth it -- after all, trains and cars break down for all sorts of reasons already -- but the ways that things don't work will be novel.

In this case, Car2Go's Vancouver branch responded to a tweet asking if they'd gotten hacked by saying, "We are still identifying root causes but are taking this very seriously." Whether it was a bug or an attack, this is also part of the future of mobility, along with the gee-whizness of picking up a car off the street with your phone.

Page 17: The Self-Driving Car

Research Reveals Most Dangerous Threat Posed by AVs

Bianca Bosker reports in the Huffington Post: Posted: 08 Oct 2013 09:30 AM PDT

• The data suggest that in some ways they may be much safer than the current approach.
• But there are, of course, certain risks.
– Most people believe the scariest thing is that some onboard computer will go haywire and the car will swerve into oncoming traffic. Or that terrorists will program multiple vehicles to cause murder and mayhem. Or that they'll break down in the high-speed lane during rush hour.

• The most serious menace posed by this new technology may be the moment when human drivers attempt to take over from the computer.
– In that instant, the human must quickly rouse herself from whatever else she might have been doing while the computer handled the car and focus her attention on the road.

• Thrust back into control while going full-speed on the freeway, the driver might be unable to take stock of all the obstacles on the road, or she might still be expecting her computer to do something it can't. Her reaction speed might be slower than if she'd been driving all along, she might be distracted by the email she was writing, or she might choose not to take over at all, leaving a confused car in command. There's also the worry that people's driving skills will rapidly deteriorate as they come to rely on their robo-chauffeurs.

• Psychologists, engineers and cognitive scientists are now probing how humans interact with such cars, cognizant that these realities must shape how the systems operate.
– Inside a dark room at Stanford University's automotive research lab sits a four-week-old, $600,000 driving simulator that will be one of the first used to study how drivers trade duties with their self-driving cars and how the cars should be designed to ensure the trade-off is done safely.
– The lab's findings will help inform the design of future driverless cars -- from the layout of their dashboards and infotainment systems, to how they deliver alerts and ask drivers to take control. Do people drive more safely if their cars speak to them, flash messages or, say, vibrate the steering wheel? Should cars give an update on road conditions just before the human driver takes over at the wheel, or are such details distracting? And how does a driverless car clearly outline what it can and can't do?

Page 18: The Self-Driving Car

These car manufacturers, along with Google, have assured the public that driverless cars will make our commutes safer and conserve fuel. Machines don't drink and drive or doze off at the wheel. Drivers will be able to read, text and work while their intelligent vehicles handle four-way stops.

Yet despite these rosy predictions, carmakers won't immediately deliver robo-taxis. The first generation of self-driving cars are more likely to be capable co-pilots that pass driving duties back to a human when complex situations arise, much as planes' autopilot systems ask pilots for help in emergencies. As one report authored by researchers at the Massachusetts Institute of Technology recently noted, "driverless is really driver-optional.”

Stanford’s solution: driverless cars should eventually be capable of acting as our "wingmen," proactive and aware of our faults so they can assist us in the best possible way.

"You can start to think about a radical new way of designing cars that starts from the premise that [the car] and I are a team."

Page 19: The Self-Driving Car

In one of Nass' first studies, he will try to determine how long it takes drivers to "get their act together" after the autonomous car hands back control. Google's self-driving Lexus SUV offers one current template for the hand-off: When the car knows it needs human help -- often when approaching a construction zone or merging onto a freeway -- an icon or message will flash on a custom-made screen mounted on the car's dash, and drivers usually have 30 seconds' notice before they need to take over. But is that just enough time, too much or too little?

“One of the critical issues with autonomous cars is trust. Because if you don't trust the car, it won't work.”

Page 20: The Self-Driving Car

If automation can cause skill degradation among an elite group of [pilots] who train for years, imagine what it may do to drivers, who are tested only once (when they get their driver's license) and have a much broader range of driving abilities. Drivers will get rusty, making them ill-equipped to take over for their cars. Autonomous vehicles are likely to need assistance with the most challenging driving scenarios -- think slippery streets -- that out-of-practice drivers would likely be poorly prepared to handle.

"It's ironic: We have all these automated planes, but what we need is to go back to flying without automation," observes Raja Parasuraman, a psych professor at GMU and director of the graduate program in human factors and applied cognition. "I could envision a similar situation in driving."

Operating driverless cars will ultimately be extremely boring. When required to monitor autonomous systems for long periods of time, human babysitters frequently get distracted and tune out, which can lead to accidents, slowed reaction times and delays in recognizing critical issues. In 2009, two pilots operating a flight to Minneapolis from San Diego entrusted the autopilot with control of the plane, and eventually turned their attention to their laptops. They became so engrossed in their computer screens that they failed to realize they'd overshot the airport by about 110 miles.

In the recent MIT report: "[A]t precisely the time when the automation needs assistance, the operator could not provide it and may actually have made the situation worse.”

Engineers at Toyota, Ford and Mercedes-Benz are already looking ahead to creating cars that monitor both road and driver, and could behave differently depending on the driver's mood or mental state. The self-driving car could one day map its drivers as well as it maps the roads. "From a business standpoint, this is the dream of the century," Nass says.

Page 21: The Self-Driving Car

Andrew Hawkins reports in The Verge:

“Carmakers, technology firms, and ride-sharing startups join forces to pressure the federal government. Ford, Google, Uber, Lyft, and Volvo announced Tuesday the formation of the Self-Driving Coalition for Safer Streets, a lobbying group with the express purpose of advocating for autonomous driving. It's a power move by some of the most high-profile names behind the still-nascent technology, made at a time when regulators and policymakers in Washington, DC are still wrapping their heads around the concept of self-driving cars.”

The coalition will be headed up by David Strickland, a former administrator of the National Highway Traffic Safety Administration (NHTSA). He will serve as the group's counsel and spokesperson. In essence, Strickland will be lobbying his former agency, which has been tasked by Department of Transportation Secretary Anthony Foxx to come up with a set of rules for self-driving cars by early summer.

"Self-driving vehicle technology will make America's roadways safer and less congested," Strickland said in a statement. "The best path for this innovation is to have one clear set of federal standards, and the Coalition will work with policymakers to find the right solutions that will facilitate the deployment of self-driving vehicles.”

Google’s self-driving Lexus SUVs and Google-designed prototypes have racked up over a million miles of autonomous driving in three US cities. Ford has been testing its own technology on its Dearborn, Michigan, campus; meanwhile Uber is building its own research facility devoted to self-driving cars in Pittsburgh. Lyft recently teamed up with General Motors (which is noticeably not a member of the coalition) to create a fleet of self-driving, for-hire vehicles. And Volvo announced its plan to test 100 autonomous vehicles in China.

Page 22: The Self-Driving Car

And from the draft blog on AVs

• Optimism bias
• Agency

• Will the manufacturers rent, not sell?

• No-human-drivers-allowed traffic zones?

Page 23: The Self-Driving Car

Discussion

• List the considerations raised here that help us assess AV technology, from:
– Engineering
– Psychology
– Human Factors Design
– Sociology
– Philosophy
– Public Health

Page 24: The Self-Driving Car

谢谢 (Thank you)

감사합니다 (Thank you)

Thank you

fred.phillips@stonybrook.edu