
With the increase of automated driving in the automotive industry, the driver-car team dynamic is

evolving from a master-slave relationship, where the user simply controls the machine, to a teammate

dynamic, where the car and driver are working together. As cars become driverless, fully autonomous

vehicles, the driving experience will realistically be more of a passenger experience.

Renault, the French car manufacturer, is interested in the future of autonomous technology. They have

provided us with a project prompt that pertains to understanding and improving the future

communication between an autonomous car and the person ‘driving’ it. They feel ambient

communication can play a key role in this new team dynamic.

Last quarter, the focus of ME310 was to better understand the design space and project prompt. This

process involved need-finding, benchmarking, and initial prototyping to build the foundation for future

designs.

The goal for this winter quarter was to define requirements for a final system and envision a final

product that will be delivered in the spring, accomplished by testing a wide variety of potential

solutions. The main tools for developing these requirements are prototyping and user feedback.

Throughout the winter quarter, Team Renault designed, built, and tested a large number of widely different prototypes, diverging from one design to the next in order to ensure all avenues of the design space were explored. These designs engaged nearly every sense: smell, sight, hearing, and

touch. Examples include a Dark Horse Prototype that paired smell with navigational cues (Figure 1A) and

an LED display for changes in acceleration (Figure 1B).

Figure 1: Images of Smell (A) and LED (B) prototypes

These prototypes led to the development of a series of design requirements, which were broken down

into two categories: functional and physical. Functional requirements describe what the system must

do. Key functional requirements include improving awareness of the vehicle's knowledge and increasing

trust in vehicle safety, which are crucial to improving the team dynamic between a driver and an

autonomous car. Physical requirements describe what the system must be. These physical requirements

include that the final system must work regardless of ambient conditions and for a wide variety of passengers.


With requirements established, a “nugget,” or key finding, was still missing. Rounds of need-finding,

benchmarking, and discussions with the Renault liaison directed Team Renault towards a final vision. A

user survey was conducted to gauge what types of information future users would find useful. One of the key findings was that passengers would like to ensure their drivers know where obstacles are around them.

NTNU built and tested a prototype that tried to convey the intentions of a car. This prototype moved a

pedal at the passenger's feet when the car was about to brake. It was met with positive

results, as users were comforted by knowing what the car was going to do; the device allowed them to

prepare for the car’s actions.

A study by Wendy Ju of Stanford’s CDR also helped with the design development. In this study, an

autonomous car simulator highlighted obstacles in the car’s field of view. The car did not act on these

obstacles, but just highlighted them, as if to say, “don’t worry, I see that.” Ju determined that this

greatly improved trust between the passenger and the car.

This design development led to Team Renault’s nugget—improving the team dynamic between the

passenger and an autonomous car by improving a passenger’s trust in autonomous technology.

Improving trust involves providing transparency to autonomous vehicles, allowing the user to know

more about what the car knows and intends to do. A specific system that implements this vision uses

an ambient LED string that displays obstacles in the car’s field of view. An ambient foot pedal will

simultaneously convey the car’s intentions. An explicit display will accompany the ambient technology

to reinforce the car's message. An illustration of this system can be seen in Figure 2.

Figure 2: Illustration of Design Vision. The yellow lights indicate the location of objects ambiently, while the red display shows the car’s planned reaction to the objects.
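To make the ambient LED concept concrete, the sketch below shows one possible way obstacle bearings reported by the car's perception system could be mapped to positions and brightness levels on an LED string, with nearer objects shown brighter. This is a minimal illustration only, not the team's implementation; the strip length, field of view, and Obstacle structure are assumptions made for the example.

```python
# Minimal sketch (illustrative assumptions, not the actual prototype):
# map obstacle bearings from a perception system onto an ambient LED strip.
from dataclasses import dataclass

NUM_LEDS = 60          # assumed length of the ambient LED string
FIELD_OF_VIEW = 180.0  # assumed horizontal field of view, in degrees

@dataclass
class Obstacle:
    bearing_deg: float   # 0 = far left of the field of view, 180 = far right
    distance_m: float    # distance from the car in meters

def led_frame(obstacles):
    """Return one brightness value (0.0-1.0) per LED.

    Each obstacle lights the LED closest to its bearing; nearer obstacles
    are rendered brighter, so the passenger can sense both where the car is
    "looking" and roughly how close the object is.
    """
    frame = [0.0] * NUM_LEDS
    for obs in obstacles:
        index = min(NUM_LEDS - 1, int(obs.bearing_deg / FIELD_OF_VIEW * NUM_LEDS))
        brightness = max(0.1, min(1.0, 10.0 / obs.distance_m))
        frame[index] = max(frame[index], brightness)
    return frame

if __name__ == "__main__":
    # A cyclist slightly left of center, 8 m away; a parked car far right, 25 m away.
    print(led_frame([Obstacle(80.0, 8.0), Obstacle(170.0, 25.0)]))
```

In the envisioned system, a frame like this would presumably drive the yellow ambient lights, while the explicit display and the brake pedal would separately convey the car's planned reaction.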

In order to implement this vision, a strong project management plan is required. A timeline has been

created that outlines deliverables for the spring quarter leading up to EXPE. A budget has also been

developed to ensure Team Renault does not run out of funds. The current estimate of cost for the final

system is $4400 plus $1000 for EXPE materials, which will allow the Renault team to remain under

budget. Improved coordination with NTNU and well-established roles should help in the development of

the final system.


This winter quarter has flown by, and reflecting on the past three months provides some key

lessons to carry into the spring quarter. First, designing for a problem of the future is not easy.

Forecasting who the users will be and what needs they might have can be frustrating. Understanding

this difficulty will help guide Team Renault’s spring design phase. Also, leveraging the ME310 network

should be a priority. ME310 projects are difficult, but with insights from the teaching team, global

partners, coaches, and corporate liaisons, the projects become more manageable.


1.0 Executive Summary
Contents
List of Tables
List of Figures
1.1 Glossary
2.0 Context and Front Matter
2.1 Need Statement
2.2 Problem Statement
2.3 Corporate Partner
2.4 The Design Team
2.4.1 Stanford Team
2.4.2 International Team
2.5 Team Coaches
2.6 Corporate Liaison
3.0 Design Requirements
3.1 Functional
3.1.1 Requirements
3.1.2 Opportunities
3.1.3 Assumptions
3.1.4 Constraints
3.2 Physical
3.2.1 Requirements
3.2.2 Opportunities
3.2.3 Assumptions
3.2.4 Constraints
4.0 Design Development
4.1 Overview
4.2 Motivation
4.3 Fall Need-finding and Benchmarking
4.3.1 Need-finding


4.3.2 Benchmarking
4.4 Winter Need-finding and Benchmarking - Stanford
4.4.1 New Persona
4.4.2 Matrix
4.4.3 Autonomous Passenger Survey
4.5 Research Studies
4.6 Critical Experience Prototype (CEP)
4.6.1 Goal
4.6.2 System Overview
4.6.3 Testing Results
4.6.4 Requirements Learned
4.7 Critical Functional Prototype
4.7.1 Goal
4.7.2 System Overview
4.7.3 Test
4.7.4 Testing Results
4.7.5 Requirements Learned
4.8 Dark Horse Prototype
4.8.1 Goal
4.8.2 System Overview
4.8.3 Test
4.8.4 Test Results
4.8.5 Requirements Learned
4.9 Funktional Prototype
4.9.1 Goal
4.9.2 System Overview
4.9.3 Testing Results
4.9.4 Requirements Learned
5.0 NTNU Winter Quarter Progress
5.1 Need-finding and Benchmarking
5.2 Prototyping
5.3 Reflections on the process
6.0 Design Specifications


6.1 Functional Prototype
6.1.1 Goal
6.1.2 System Overview
6.1.3 Testing Results
6.2 Final System Vision
6.2.1 Ambient Visual Display
6.2.2 Ambient Brake Pedal
6.2.3 Explicit Visual Display
6.2.4 Full System Requirements
6.2.5 Total System
7.0 Project Management
7.1 Deliverables and Milestones
7.1.1 Winter Quarter Review
7.2 Project Timeline
7.3 Distributed Team Management
7.4 Project Budget
7.5 Reflections and Goals
7.5.1 Matilde Bisballe
7.5.2 Adam Craig
7.5.3 Jørgen Erichsen
7.5.4 Caroline Flowers
7.5.5 Achim Gerstenberg
7.5.6 Jacob Gowan
7.5.7 Kornel Niedziela
7.5.8 Carl C. Sole Semb
8.0 Resources
8.1 Components
8.2 Material Suppliers
8.3 Other Systems
8.4 Experts
9.0 References
Appendix A
Appendix B


Appendix C
Critical Experience Prototype
Critical Functional Prototype
Dark Horse Prototype
Funktional Prototype
Functional Prototype
Appendix D
Appendix E
Appendix F


Table 1: Functional Requirements
Table 2: Physical Requirements
Table 3: NTNU Prototype Requirements
Table 4: Spring Missions
Table 5: Distributed Team Roles
Table 6: Winter Quarter Budget
Table 7: Estimated Spring Budget


Figure 1: Images of Smell (A) and LED (B) prototypes
Figure 2: Illustration of Design Vision. The yellow lights indicate the location of objects ambiently, while the red display shows the car's planned reaction to the objects.
Figure 3: Brainstorming of Being a Teammate
Figure 4: Brainstorming Ways to Promote Trust, Comfort, and Confidence in Autonomous Cars
Figure 5: Our new persona Lucas, all ready for his morning commute.
Figure 6: Design matrix of possible ambient communication and information pairings.
Figure 7: Results from Autonomous Car Survey of asking what passengers most wanted to know during a ride (left), and what passengers most wanted the driver to know
Figure 8: Block Diagram of CEP
Figure 9: CFP In-Car Surround Sound System Diagram
Figure 10: Block Diagram of CFP
Figure 11: CFP Sound Direction Detection Accuracy
Figure 12: Dark Horse Wizard of Oz Block Diagram
Figure 13: Block Diagram of Funktional Prototype 1
Figure 14: Block Diagram of Funktional Prototype 2
Figure 15: The pictures show the Kinect system in the lab
Figure 16: When focusing on business men we created the persona Thor
Figure 17: Jørgen in an old suit simulating the physical disabilities of an 80-year-old
Figure 18: Workshop with people from TrollLABS. Leading the way through smell.
Figure 19: Footrest communication intention of braking, acceleration and actual braking
Figure 20: Photo of LED Strip for Functional Prototype
Figure 21: Block Diagram of Functional System
Figure 22: Final System Vision
Figure 23: Final System Block Diagram
Figure 24: (Left) Demonstrating the motion tracking of the Kinect [5]
Figure 25: (Right) User gesturing to accept call with Navdy [6]
Figure 26: An example of clmtrackr, analyzing a pleasantly surprised facial expression [7]
Figure 27: Gazetracker eye monitoring software [8]
Figure 28: An alarm system autonomously monitoring the safety of a house [10]
Figure 29: A Roomba making its rounds, autonomously cleaning [11]
Figure 30: The cockpit of a commercial airliner showing the complexity of the controls [13]
Figure 31: Wendy Ju of Stanford CDR [14]
Figure 32: Shad Laws, our liaison extraordinaire [15]
Figure 33: The project path of trust as more autonomy is introduced.
Figure 34: Nancy, our ideal user for design purposes
Figure 35: Images of Critical Functional Prototype
Figure 36: Image of Dark Horse Wax Prototype
Figure 37: Image of Dark Horse Liquid Scent Release Prototype
Figure 38: Photos of Funktional Prototype
Figure 39: Photos of Functional Prototype including Kinect Data and User Testing


Ambient Communication: Ambient communication is communication that is not explicit, and therefore

can be interpreted by the user without requiring all of their focus. Ambient communication takes the

form of a feeling more than an explicit message.

Automation: Automation refers to the implementation of a routine by a robot to improve efficiency. An

automated task is one that involves no decisions to be made by the robot; this is different from autonomy, where the robot must make decisions on its own.

Car-Driver Team Dynamic: The Car-Driver team dynamic is the relationship between a car and a driver.

As autonomous cars become more prevalent, the car-driver team dynamic will change drastically

over time.

Critical Experience Prototype (CEP): A CEP, or critical experience prototype, is a prototype that is

created to observe how the experience of the user is affected by the prototype.

Critical Function Prototype (CFP): A CFP, or critical function prototype, is a prototype that is created to

test the functionality of a given technology. This is basically a feasibility prototype.

Device: In the requirements section of this report, any and all methods of communicating with the user.

Driver: The definition of driver changes based on the context of its use. In fully autonomous vehicles, the “driver” can be any passenger of the car. In current technology, the driver is the passenger controlling the vehicle.

Explicit Communication: Explicit communication is direct communication where the message being

expressed is clear, e.g., a visual display with a flashing warning or a vocal cue that conveys a specific

piece of information.

Fully-Autonomous Vehicle: A fully autonomous vehicle is defined as a vehicle that requires no human

input to complete the act of driving. There is no transition of control between a fully autonomous car

and a driver, because a fully autonomous car completes the entire driving task.

Funktional Prototype: A funktional prototype is a team's first attempt to build an integrated system; while it is expected to comprise several working subsystems, it is not expected to be pretty—the

archetype is the hummer created on the Red Green Show by duct-taping two cars together.

Haptics: Haptics refers to tactile or touch feedback to the user. This is an area of HMI research that is

currently on the rise.

Human-Machine-Interface (HMI): HMI refers to the area of research that is looking to improve how

machines and humans are able to interact. Ambient communication could be a key technology that can

improve HMIs.

Master-Slave Topology: A two-being dynamic in which one party gives commands (in the case of a car, the user, through steering, pedals, etc.), which are blindly followed by the second (in this case, the car itself) without feedback or input.

Nugget: A nugget is a key finding that leads towards the vision of a final system. This nugget is

something to build off of and adds value to your team’s work.


Olfactory Fatigue: When humans smell a strong scent, their olfactory system becomes fatigued and

cannot identify other strong scents as distinctly—instead, the new smells seem duller and are harder to distinguish than they would normally be.

Perception System: All of the autonomous car's sensors and detection systems. The perception system does not

make any decisions or act on any of the data, but rather simply measures, filters and processes what is

around it.

Semi-Autonomous Vehicle: In a semi-autonomous vehicle, the car cannot fully make decisions on its

own. Therefore, there needs to be a transition of power from the car to the user in order to complete a

drive.

Users: The definition of users changes throughout this report depending on the context. Users for fully-

autonomous cars can be any passenger of the vehicle. The users of current cars are the drivers.


The growth of autonomous vehicles has exploded; all automotive leaders have autonomous car projects (for example, Mercedes-Benz recently had a driverless car navigate San Francisco [1]), and a future with autonomous cars seems almost inevitable. Autonomous vehicles will revolutionize how humans use the automobile. Instead of requiring the driver to remain focused on the task of driving, an autonomous

vehicle can drive itself. For example, commutes to work can be spent answering emails, long road trips

can occur through the night as the “driver” is able to sleep, and being stuck in traffic can turn into a

chance to relax.

As the use of cars drastically changes, so too will the communication between the car and driver.

Currently, the car-driver team dynamic has not changed significantly since the advent of the automobile;

the car and driver travel together from point A to point B, with the driver in control at all times.

However, as the car is able to drive itself, the car-driver dynamic will, by necessity, change; instead of

the common master-slave relationship, the car and driver will become a team, working together to

accomplish a common goal. As this dynamic changes, the communication between the car and driver

must change as well. While current communication between a car and driver is mainly explicit, other

potential methods exist; one of these is the idea of ambient communication.

Ambient communication is, essentially, communication that is not explicit. Explicit communication could

include oral warnings or a flashing sign—anything that directly states what is being communicated.

Ambient communication takes a more subtle form and usually accompanies the explicit message. For

example, in human-to-human communication, the explicit message could be communicated through

words, but the tone of voice, body language, and facial expressions provide more information about the

main message in an ambient way. Ambient communication also usually manifests itself as a feeling;

instead of being something you simply hear or see, it is something you feel.

With the automotive industry moving towards fully autonomous automobiles, there exists a need for

new methods of communication between the car and driver. The car-driver team dynamic will

drastically change, and the way vehicles convey information must change as well. One new method that can be explored to improve the driver-car team dynamic is ambient communication.

In order to fulfill the need for new methods of communication that improve the car-driver team dynamic, there are two main questions that need to be answered:

1) What information/emotions/knowledge should be conveyed to the user?

2) How should that information be conveyed into a feeling?

Not all information is effectively communicated in an ambient way. Also, not all information needs to be

communicated to the user if he or she is not involved in every decision. Determining what information a

fully autonomous vehicle user wants to know, and ensuring this information can be communicated


effectively in an ambient way, is a major issue that needs to be addressed in the completion of this

project.

The information provided to the user also has a goal: improve the team dynamic between a driver and

an autonomous car. One way to improve this team dynamic is to allow the driver to have more trust in

the car. Without trust between team members, it is impossible to have an effective partnership.

Once it is known what information can improve trust between a driver and a car, there are many ways

this information can be conveyed into a feeling. Ambient communication can occur through a variety of forms; haptic vibrations, light intensity, and smell are all examples of potential ambient

feedback. Determining what method of ambient communication will be most effective is a second major

task of this project.

The corporate partner for this project is Renault, an automotive manufacturer that is mostly based in

Europe. Headquartered in France, Renault is the third biggest automaker in Europe and the ninth

biggest in the world. Renault is also partnered with Nissan as part of the Renault-Nissan Alliance.

Together, the Renault-Nissan Alliance is interested in the development of, among other automotive advances, autonomous vehicles. They are currently researching aspects related to semi-autonomous vehicles, since these will be first to market, preceding their fully autonomous counterparts. Therefore, while the area of ambient communication between a user and a fully autonomous vehicle is not currently being researched by Renault, they are highly interested in what we will learn from this project [2].


The enthusiastic members of the design team from the ME310 class at Stanford University can be seen

below.

Adam Craig

Status: Mechanical Engineering Graduate Student

Skills: Mechatronics, Electrical Engineering, System Integration, Embedded

Programming

Undergrad: Mechatronics Engineering at the University of Waterloo, Canada

Interests: Wearable technology, Biomechatronics, Robotics

I come to Stanford after completing five years of Mechatronics Engineering at the

University of Waterloo. I have learned a lot about electronic devices and system

integration, through both work and school. However, the only design I have ever known is a very

structured engineering approach. I am taking my Master’s at Stanford in order to approach design more

creatively. My interests include mechatronic and biomechanical systems, and I love building and

tinkering with new things.

Caroline Flowers

Status: Mechanical Engineering Graduate Student

Skills: Mechanical Engineering, Biomedical Engineering, Product Design

Undergrad: Mechanical Engineering at MIT

Interests: Designs to improve everyday life, medical devices

I came to Stanford after completing a Bachelor’s degree in Mechanical Engineering at

MIT. As a Master’s (and hopefully one day PhD) student at Stanford, I’m focusing on

incorporating a more user-centric design process along with the more analytical one I

learned as an undergraduate, with current design interest focusing on medical devices, specifically

orthopedics. When not working, I can usually be found baking and then feeding everyone around me.

Jacob Gowan

Status: Mechanical Engineering Coterm Student

Skills: Mechanical Engineering, Product Realization

Undergrad: Mechanical Engineering at Stanford University

Interests: Manufacturing and design process, motorcycles

I am a first year coterminal student specializing in manufacturing engineering. At

Stanford I played football for 4 years before deciding to come back to further explore

my love of engineering. I spend my free time outside of ME 310 working on personal

projects and TAing in the PRL. On weekends I can be found napping in the sun (if

there’s no assignment due the next day), riding my motorcycle (if it is running), and making cider (if I am

feeling particularly ambitious).


Kornel Niedziela

Status: Mechanical Engineering Graduate Student

Skills: Mechatronics, Mechanical Engineering, Product Realization, Product Design,

Embedded Programming

Undergrad: Mechatronics Engineering at the University of Waterloo, Canada

Interests: Automotive, Space, Consumer Electronics

I am a Mechanical Engineering Master's student at Stanford University, but deep down I am into robotics and mechatronics systems. I completed my Bachelor of Applied Science in Mechatronics Engineering at the University of Waterloo. My goal is to further expand my creative thinking and to explore more user-centric designs over technical designs. I have two and a

half years of industry experience, but feel like there is still much to learn. I am enjoying ME310 as I am

learning about design methodology while being able to prototype and test innovative new technologies

without worrying so much about the bureaucracy involved in larger corporations.

The international team for this project attends the Norwegian University of Science and Technology

(NTNU). The awesome team members can be seen below:

Jørgen Erichsen

Status: Mechanical Engineering Graduate Student

Skills: CAD, Mechanics, Medical Devices

Undergrad: Mechanical Engineering from NTNU

Interests: Product development, Music, Tinkering with Gadgets

I was born outside the Norwegian capital, Oslo, and I am studying Mechanical

Engineering at NTNU. My specialization is in Product Development and Materials.

Previous experience includes classical mechanical engineering (statics, dynamics,

material science, etc.), and also Design Thinking, CAD, FE-analysis and some Mechatronics. Apart from

studying, I enjoy photography, biking and music.

Matilde Bisballe

Status: Engineering Design and Prototyping PhD Student

Skills: User centered design, product development

Undergrad: Engineering Design & Innovation at Technical University of Denmark

Interests: Running, Fiction Books, Interaction between people and technology

I’m a Design Engineer from the Technical University of Denmark, currently writing my PhD on the solving of engineering problems in the early stages of product

development. My interests are to investigate creative processes and how we can help people come up

with the best and most radical solutions, for example by better understanding the various roles and functions of prototypes.


Achim Gerstenberg

Status: Engineering Design and Prototyping PhD Student

Skills: Physics, Product Realization, Mechatronics

Undergrad: Experimental Physics at Bonn University and University of Duisburg-Essen

Interests: Hiking, Saxophone

I have a Master’s in experimental physics from Duisburg University in Germany.

Currently, I am a PhD candidate at TrollLABS at the Norwegian University of Science

and Technology in Trondheim. There I try to find and understand the underlying

principles for bottom-up, network-like project architectures applied to large-scale, complex, and highly

uncertain engineering projects.

Carl C. Sole Semb

Status: Mechanical Engineering Graduate Student

Skills: Metal Workshop Experience, CAD, Arduino

Undergrad: Mechanical Engineering at NTNU

Interests: Outdoors, Engineering Design

I received my Bachelor’s degree from NTNU in Mechanical Engineering and am

currently pursuing my Master’s. I love the outdoors and being away from technology,

which allows me to have an interesting design perspective. I love having adventures

and am extremely spontaneous. My goal for this course is to apply my creativity to the design process to

help the team think outside of the box.

There are two ME310 alumni coaches for this project. Information about them can be seen below.

Bill Lukens

Occupation: Product Design Engineer at Apple

Education:

Bachelor of Science (Mechanical Engineering), Carnegie Mellon University

Master of Science (Mechanical Engineering), Stanford University

ME310: ME310 alumnus, 2013-2014


Scott Steber

Occupation: Associate Director at Radicand Inc.

Education:

Bachelor of Science (Mechanical Engineering), Boston University

Master of Science (Mechanical Engineering), Stanford University

ME310: ME310 alumnus, 2011-2012

The corporate liaison for this project is Shad Laws; details about him can be found below.

Shad Laws

Occupation: Innovation Projects Manager at Renault Innovation

Education:

Bachelor of Science (Mechanical Engineering), Northwestern University

Master of Science (Mechanical Engineering), Stanford University

PhD (Mechanical Engineering), Stanford University

ME310: ME310 alumnus, 2003-2004


Throughout the winter quarter, the project requirements have been discovered and refined. These

requirements can be classified as functional or physical requirements.

Functional requirements describe what the system must do and how it should behave. In this section

functional requirements as well as opportunities, assumptions, and constraints are described.

Table 1: Functional Requirements

Requirement: Improves awareness of vehicle's knowledge
Justification: The purpose of the system is to create a teammate dynamic by making the user understand why the vehicle makes decisions.
Criteria/Test: When asked why the car will make certain decisions, the user can articulate why based on communicated information.

Requirement: Increases trust in vehicle safety
Justification: Trust in the vehicle's decision-making and driving abilities is the primary force behind the project; if the user trusts the vehicle, they are more likely to feel comfortable in the car.
Criteria/Test: When asked how safe the user feels with the technology compared to a blind test, the former rates higher than the latter.

Requirement: Does not overwhelm senses
Justification: If senses get saturated (for example, if too many smells are put into the cabin without clearing the air), the user might lose the ability to understand communicated information, or might no longer feel comfortable in the system.
Criteria/Test: There is some kind of 'scent-clearing' capability in the system, or the majority of users tested without one respond that they do not feel overwhelmed.

Requirement: Non-invasive/unobtrusive
Justification: If the user feels the system is too much in control, or they are too distracted to be able to focus on anything else, they are less likely to use the system.
Criteria/Test: When given a task or allowed to focus on something else, the majority of users rank the distraction from the device as minimal to non-existent on a scale of options; users do not have to stop the test because of how the device makes them feel.

Requirement: Clearly indicative
Justification: The user is less likely to use the system if it is overly complicated.
Criteria/Test: The majority of tested users can pick up on what the device is telling them and answer questions on it within five minutes, given only a broad category.


Requirement: Communicates useful knowledge
Justification: If too much information is communicated, or the information could be better communicated in another way, users will be less likely to use the device, or more easily confused.
Criteria/Test: The majority of survey respondents and tested users, when asked if they want that information communicated, respond in the affirmative.

Requirement: Improves passenger-car team dynamic
Justification: Creating a team dynamic between user and vehicle should allow the user to feel more comfortable despite not being in control.
Criteria/Test: When asked after a test, the majority of tested users feel they understand the system and that they are part of the driving process.

Related functional opportunities, assumptions, and constraints:
- Will use both ambient and explicit aspects
- Multiple forms of communication can be used
- Multiple types of information can be conveyed
- Driver can contribute to car's functionality/overall driving experience
- 'Useful knowledge' is global—drivers in the US would be interested in the same things as drivers in Norway or India, for example
- Experience of driving or being a passenger won't change drastically—roads/highways/intersections will still be essentially the same in the future
- People respond differently to the same stimuli
- The ambiently-communicated information must convey a feeling

Physical requirements describe what the system must be and the shape it should take. In this section

physical requirements as well as opportunities, assumptions, and constraints are described.

Table 2: Physical Requirements

Requirement: Works regardless of passenger's physical condition (notably height)
Justification: Potential users could be of any size (for example, both children and adults could be in the vehicle), and height even among adults varies widely. If the system is not comfortable, the user is less likely to use it.
Criteria/Test: When testing a wide variety of physical body types, the majority of users are able to use the system without feeling uncomfortable.


Requirement: Comfortable for user
Justification: If the system is not comfortable, the user is less likely to use it—for example, LEDs being too bright.
Criteria/Test: The majority of users do not have complaints about the system and are capable of completing a 30-minute test without changing anything.

Requirement: Detection works regardless of outdoor conditions
Justification: Detection software often does not work as well in sunny areas as in shady or dark ones.
Criteria/Test: 90% of objects are seen in the least optimal conditions (90% was chosen as the amount well-calibrated systems can see in sunny conditions for the technology we're currently using) [3]. A short sketch of how this criterion might be checked follows the list below.

Requirement: Notification works regardless of outdoor conditions
Justification: It can be harder to see lights when the ambient light is bright (i.e., at midday).
Criteria/Test: 90% of display changes can be seen in the least optimal conditions.

Related physical opportunities, assumptions, and constraints:
- Will use multiple senses (at the moment, sight and haptic feedback)
- Object detection technology will be included as part of autonomous car technology
- Object detection technology will improve along with autonomous car technology
- Limited to current technology for prototyping purposes
- Cannot use more power than car battery can supply
- Must be completed by June 2015
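The 90% detection and notification criteria above are measurable, so they can be checked directly against logged test data. The sketch below illustrates one way such a check might look; the log format (one boolean per staged object observed in the least optimal conditions) is an assumption for illustration, not the team's actual test procedure.

```python
# Minimal sketch (illustrative assumptions): checking the 90% detection
# criterion against a log of staged objects in the least optimal conditions.
def detection_rate(detections):
    """Fraction of known objects that the perception system reported."""
    return sum(detections) / len(detections) if detections else 0.0

def meets_requirement(detections, threshold=0.90):
    """True if the observed detection rate meets the requirement threshold."""
    return detection_rate(detections) >= threshold

if __name__ == "__main__":
    # Example: 47 of 50 staged objects detected in direct midday sunlight.
    log = [True] * 47 + [False] * 3
    print(detection_rate(log), meets_requirement(log))  # 0.94 True
```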


The design requirements were developed throughout the winter quarter. Some of the requirements are

stated in the project prompt provided by Renault; these requirements can be found in Appendix A.

Other requirements were learned through building prototypes and testing with users. The following

sections describe how the prototypes completed this quarter relate to the development of the design

requirements. This section also highlights how the need-finding, benchmarking, and ideation tasks

contributed to the design development process.

The logic behind this project is not necessarily readily apparent when examining our project's prompt: use ambient communication in fully autonomous vehicles to change the relationship from a master-slave topology to a teammate topology. Focusing on fully autonomous vehicles allows the team the

ability to work in a space that is not nearly as crowded as that of the semi-autonomous world.

Researchers are looking heavily into the semi-autonomous space, while the fully autonomous space is less explored. In addition, one can develop things well before they become necessary, gaining a head start on potential competitors.

The use of ambient communication serves the purpose of making the communication of information

less invasive. The final system will not be exclusively ambient, but the goal is to have ambient

components appropriately implemented to communicate more effectively with the passengers.

Ambient communication is important because most explicit communication systems (e.g., warning lights, sounds, or

displays) often feel invasive and eventually are turned off, completely negating their effectiveness.

The goal of the teammate topology part of this project is to make the car and driver more efficient and effective while working together. As the car is given more responsibility and agency in the relationship, the master-slave dynamic that most manufacturers have designed into their cars will change. As cars become capable of more and responsible for more, the relationship will either change out of necessity or stay the same and become a barrier to forming trust with the fully autonomous vehicle.

To flesh out these issues more fully, Team Renault spent an extensive amount of time early in the quarter revisiting the prompt to better understand the project goals. Definitions for being in a team with a car were created (Figure 3). To start, the team brainstormed what it meant to be on a team, settling on things like a division of labor and trust in one's teammates. From that, it was concluded that an effective team member is, among other things, reliable, does its part, and watches out for the other team members.


Figure 3: Brainstorming of Being a Teammate

When applied to a car, these team member attributes can be incorporated in a variety of ways: driving in a safe, comfortable, and predictable manner; being responsive to the passengers' needs; explaining what is happening and why certain decisions are being made; etc. It was decided that increasing users' trust in, comfort with, and confidence in the system would be the best way to accomplish the project goal. Ways in which a car might damage trust, comfort, and confidence were also brainstormed (Figure 4).


Figure 4: Brainstorming Ways to Promote Trust, Comfort, and Confidence in Autonomous Cars

During the winter quarter, the team was much more oriented towards building and testing prototypes

than in the fall quarter. This was necessary both to explore how one could fill the needs of the space while

leveraging the skill set of the team, and to test a wide variety of potential solutions to the project

prompt. However, while it was feasible to use the results of the fall quarter’s need-finding and the user

testing of our prototypes, additional need-finding proved necessary.

The team spent a significant portion of last quarter exploring the space through need-finding,

benchmarking, and brainstorming activities. We found that these activities were particularly necessary

for the project for multiple reasons. First, the space that the team is exploring, that of technology that


acts as a teammate toward the user, is in its infancy. Second, the fully realized autonomous vehicles described in the project prompt do not yet exist, so it was necessary to find comparable situations and technologies to study.

The team started by considering what could be communicated and sorted these into five categories

based upon how the information changed over time.

Constant: The constant category consists of those things which are relatively constant for any car or

driver, and will not change based on current changeable conditions.

Examples: Cardinal direction of car/orientation with regards to North, time of sunrise/sunset, current

location, etc.

Car Conditions: Car conditions consist of things about the car that directly relate to its function, but could require input/assistance from the user, or could depend on factors the car has taken into account but has no way of avoiding or controlling.

Examples: Car diagnostics (tire pressure, low fuel, etc), car confidence in a current situation (such as

driving along a route with a large number of bars around closing time, or a route with a large number of

blind turns), etc.

Directly Controlled: Directly controlled aspects consist of those things that, in a non-fully-autonomous

car, would be in the control of the driver.

Examples: Car speed, sudden changes such as braking, route choices, etc.

Indirectly Controlled: Indirectly controlled aspects are those which the car has some control over, but which also involve interaction with other cars or people—for example, while the car might be able to control the location of traffic around the vehicle relative to it by slowing down or speeding up, or by picking an alternative route with less traffic, the decisions of the other cars/drivers must also be taken into consideration.

Examples: Traffic around vehicle, time to destination, entering/leaving specific zones (such as school

zones), etc.

Uncontrolled: Uncontrolled aspects are those that, while subject to change, do not accept any input

from the car about that change.

Examples: weather conditions, road conditions, etc.

To flesh out how ambient communication might interact with these needs in the car, the team considered different forms of ambient communication. These modes were broadly broken down by the sense or system targeted.

Vision: Lights that change color or brightness depending on state of car or displays that indicate certain

information.

Hearing: Changes in ambient noise type, volume, or location of speakers (sound vectoring).


Smell: Pre-defined smells to indicate location along an often-traveled route, type of environment

(urban, rural, suburb), or weather conditions.

Touch/Haptics: Possible solutions include a vibrating belt to indicate states or changes in the physical

state of car to show a change in the situation.

Changes in the Car: The car would modify itself based on either the user or ongoing changes (darkening the windows if the user appears to be sleeping, the chair stabilizing itself during turns or sudden speed changes, etc.).

This need-finding cycle formed the initial impressions of what could be possible in fully autonomous

vehicles and set the stage for further need-finding in winter quarter. A full description of our fall need-

finding can be found in Appendix B.

In the fall, the team explored technologies currently being developed or used in various industries, ranging from aviation to home robotics and, of course, the automotive industry. Some of these technologies appeared in our prototypes and provided us with directions for potential future developments. A full summary of our fall benchmarking can be found in Appendix B.

The team realized partway through the quarter that our technology-distrusting persona from the fall

quarter, Nancy, was no longer necessarily the use case that fit the problems we were most interested in

testing. While trust between the user and car was still key, someone who would feel the car was more

of a teammate than something that had to be tolerated was important. That is where the idea for

“Lucas” came from. The team sought to pursue a persona who would view the autonomous technology

not as a machine to be controlled and not trusted, but as a tool to be used and acclimated to.

Lucas is a 28-year-old up-and-coming white-collar worker who spends his weekends driving to outdoor

adventures with his girlfriend of two years, with whom he lives. He views his time in all situations as a

valuable asset that is being wasted on, among other things, commutes. So while he knows he shouldn’t,

he texts and emails while driving in straight stretches or stopped at lights during his 30-minute commute

on the 101 freeway (Figure 5).


Figure 5: New persona Lucas, all ready for his morning commute.

Continuing the prototyping process through the quarter, and settling on what space to explore for the

final vision, the team drew from aspects of both of our personas—designing for someone who will not

automatically trust autonomous technology, but who is willing to take the chance with it to get some of

the advantages it would provide.


Although several different modes of communication and possible sources of valuable information to communicate were outlined, it was realized that the list was by no means exhaustive, and the team wanted to consider every possible solution to the project prompt. So, under advice from the Renault corporate liaison, the team sat down as a group and, with both teams, ideated as many separate ambient communication methods and desirable pieces of information as possible. Everyone then had time to rank each pairing on feasibility, adaptability, and usefulness (for example, using the temperature of the car to communicate outside temperature might satisfy the first two, but not the last) (Figure 6). In the figure, red indicates a pairing that received more votes and is therefore a more viable option. Through this exercise, while there were some differences in opinion, the team was able to lay out a smaller number of options to consider in a final solution.


Figure 6: Design matrix of possible ambient communication and information pairings.


The team also realized that while prototyping provided a lot of feedback on how users felt about distinct solutions, we had not yet interviewed a large number of potential users about what they would want in an optimal solution. It was decided to remedy this by creating a survey on SurveyMonkey covering a wide variety of car-related topics, from car-sickness to what they would do in an autonomous vehicle that they cannot do now. After analyzing the approximately one hundred responses, two particular questions stood out as valuable to the teams (Figure 7).

Figure 7: Results from the autonomous car survey asking what passengers most wanted to know during a ride (left) and what passengers most wanted the driver to know (right)

The first was "What would you like to know as a passenger?" This question seemed to carry a good deal of implicit trust in the driver. We recognized that answers like arrival time and traffic conditions are things you would ask about if you knew that the driver was trustworthy and safe—these were ways to optimize your time spent in the car, not reassure you.

The more valuable input was the answers to the question “What would you like the driver to know?”

While some people answered that they would like the driver to know all of these things, the team felt

safe to interpret these results as things a responsible driver should know. The answers to this part

largely consisted of wanting the driver to know where other cars, imminent threats, and dangerous conditions were. This section provided the valuable insight that the most important thing


for cars to impart to their passengers would be the feeling that the situation is under control or, put more plainly, that the car can tell them "I got this."

The team thinks that the sentiment expressed by that saying encapsulates the combination of

situational awareness and competency that the vehicle will need to communicate to gain the trust of its

passengers.

Though a number of studies provide interesting insights into the future of autonomous vehicles, one particular study was formative for our vision: Wendy Ju's study on how communicating an autonomous vehicle's perception of the world promotes trust between the passenger and the vehicle in simulations.

In this study, people were placed in a simulated "autonomous" vehicle that behaved like a normal autonomous vehicle, with one exception: periodically, the vehicle would point out things that it saw as it was driving. These observations were not tied to items the vehicle was acting on; it was simply acknowledging what it could see. Even though this communication gave no indication of the car's intended action, the vehicle accurately pointing out what it could see helped to build trust with the passenger [4]. Note that this was done in a simulator, not an actual car.

The timing of this study was very fortuitous as the findings support what the team has come to believe is

the nugget, or key insight to build a valuable prototype off of.

The following sections describe how the prototypes built and tested this winter quarter have aided the design development process.

The goal of the CEP was to test a critical aspect of the user experience. For this prototype, the goal was to determine how users respond to ambient vibration/haptic feedback on the torso. This ambient feedback was used to provide information about upcoming turns, allowing the user to know when a turn is about to occur, how severe it will be, and in which direction.

The system was mounted into a car, with three main components: a potentiometer, an Arduino, and a

vibrating belt. The potentiometer was controlled by a Stanford team member and provided input to the

system. The Arduino took this input and controlled where the vibrations would occur. The vibrating belt

would be placed on the user’s torso and provide ambient feedback. The system can be seen in the block

diagram in Figure 8.

Figure 8: Block Diagram of CEP


More details regarding the implementation of this prototype, including the Arduino code, can be found in Appendix C.
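As an illustration of the control loop, a minimal Arduino sketch is given below. It is a hypothetical reconstruction: the pin assignments, the two-channel belt layout, the dead band, and the intensity mapping are assumptions for illustration, not the implementation documented in Appendix C.

```cpp
// Hypothetical sketch of the CEP control loop: a potentiometer operated by a team
// member (standing in for the "autonomous car") sets the direction and severity of
// an upcoming turn, which is rendered as vibration on the left or right side of the
// belt. Pin numbers, dead band, and intensity mapping are illustrative only.

const int POT_PIN = A0;       // Wizard-of-Oz input from the operator
const int LEFT_MOTOR = 5;     // PWM pin driving the left-side vibration motors
const int RIGHT_MOTOR = 6;    // PWM pin driving the right-side vibration motors

void setup() {
  pinMode(LEFT_MOTOR, OUTPUT);
  pinMode(RIGHT_MOTOR, OUTPUT);
}

void loop() {
  int reading = analogRead(POT_PIN);               // 0..1023, ~512 means "no turn"
  int offset = reading - 512;                      // negative = left turn, positive = right turn
  int strength = map(abs(offset), 0, 512, 0, 255); // turn severity -> vibration intensity
  analogWrite(LEFT_MOTOR,  offset < -30 ? strength : 0);  // small dead band around center
  analogWrite(RIGHT_MOTOR, offset >  30 ? strength : 0);
  delay(50);
}
```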

The system was tested with five users with mixed results. Users wanted to know the information, but

the feedback was too invasive. The constant vibrations against the torso were annoying and irritating.

This prototype provided insights into how information needs to be provided to the user. Specifically, a non-invasive form of communication is required. If the delivery is too invasive, it can detract from the information being conveyed and be detrimental to the user experience.

The goal of the CFP was to test a critical function. After testing tactile feedback during the CEP, it was decided that sound could be a useful medium for conveying information. Aside from voice systems, sound is quite ambient, as humans tend to process its location and tone subconsciously.

Studies have been done on how accurately humans can detect sound location and motion [5], but the

critical question for this prototype was how well passengers can detect the sound location in a car; in

everyday driving, there are plenty of internal and external sources of noise that may throw off the

passenger. The goal was to determine if the passenger could detect the direction of sound in a real

driving environment while doing a task.

Since the nature of the design question required testing users in a driving environment, this prototype

also doubled as another CEP. The user-experience test consisted of using the same 3D sound system,

which projected a sound from a direction, and pairing it to an electronic compass so that the sound

would always come from magnetic north. The secondary goal was to test if users gained a better idea of

their absolute orientation, and if that gave them a better sense of location over time.


The system consisted of a five-speaker surround sound set mounted directly around the passenger's head, as shown in Figure 9.

Figure 9: CFP In-Car Surround Sound System Diagram

The compass was connected to an Arduino, which sent angle values to a computer through a serial COM

port. The data was read into a basic C# program that used the irrKlang audio engine to play a song, and

then change the 3D song location according to the current car angle (Figure 10).

Figure 10: Block Diagram of CFP
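The key piece of logic in that program is mapping the car's heading to a 3D position for the sound source so that the music always appears to come from magnetic north. The sketch below illustrates that idea using irrKlang's C++ API (the actual prototype was written in C#); the serial read is stubbed out, and the file name, update rate, and loop are assumptions for illustration.

```cpp
// Illustrative C++ sketch of the CFP host logic (the actual prototype was a C# program
// using the same irrKlang engine). Idea: read the car's compass heading and move a
// looping 3D sound so that it always appears to come from magnetic north.
#include <irrKlang.h>
#include <chrono>
#include <cmath>
#include <thread>
using namespace irrklang;

// Stub standing in for the serial read from the compass/Arduino; in the prototype the
// heading arrived over a COM port as degrees clockwise from north.
float readHeadingDegrees() { return 90.0f; }

int main() {
  ISoundEngine* engine = createIrrKlangDevice();
  if (!engine) return 1;

  // "music.mp3" is a placeholder file name; the prototype simply played a song on loop.
  ISound* music = engine->play3D("music.mp3", vec3df(0, 0, 1), true, false, true);
  if (!music) { engine->drop(); return 1; }

  const float radius = 1.0f;  // virtual distance of the source from the listener's head
  for (int i = 0; i < 10000; ++i) {
    float heading = readHeadingDegrees() * 3.14159265f / 180.0f;
    // With the listener facing +Z, a source at angle a (clockwise from straight ahead)
    // sits at (sin a, 0, cos a). North is at -heading relative to the car's nose.
    music->setPosition(vec3df(radius * std::sin(-heading), 0.0f, radius * std::cos(-heading)));
    engine->setListenerPosition(vec3df(0, 0, 0), vec3df(0, 0, 1));
    std::this_thread::sleep_for(std::chrono::milliseconds(50));
  }
  engine->drop();
  return 0;
}
```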

The test consisted of two parts: testing a sense of cardinal direction, and testing the accuracy of determining the direction of a sound. First, sound was played from various speakers, and the user was told

which speaker was being used. This introduction was done so the user could become accustomed to the

different sound directions, in accordance with the study mentioned above which found that training

users before testing increased the accuracy with which they can detect sound direction [5].

After this training period, the users were driven on a pre-selected route with varying curvature, turns

and straights. At first, they were driven around with music direction coming from magnetic north

without doing any tasks. After about five minutes of acclimation, the users were given an article to read,



to mimic a more authentic situation of someone working while riding in an autonomous car. Users were

then driven around for another 5-10 minutes to test the system.

The final stage of testing consisted of setting the sound to come from a fixed angle relative to the user and having the passengers say which angle they heard it coming from, all while still driving around. Users were again given an article to read, for the same reasons as above.

Functionally speaking, the test results were relatively decent. While reading, the users could detect the

sound direction within +/- 45 degrees, which meant that they could tell their orientation within one of

the 4 quadrants (N, S, E, W). When the users were not reading, most of the results were within +/- 20

degrees with ambient car noise and motion. The empirical results are shown in Figure 11.

Figure 11: CFP Sound Direction Detection Accuracy

The correlation between perceived and true sound direction was reasonably strong. It was noted that

there were a few trouble areas in the setup, where the acoustics caused users to misjudge sound

location--with the setup used, the sound direction directly to the user’s side was generated by a

combination of the front-side and rear-side speakers. If the users were closer to one or the other, they


tended to claim the sound was coming from that speaker rather than the side, even though both

speakers had equal volume.

The user experience testing had somewhat better results. For one, an unexpected result was users

finding the movement of sound soothing or relaxing; it was unclear if it was due to the link between

sound motion and the motion of the car or simply due to sound motion in general.

Another result was that it was not distracting, which was critical; however, it was brought up that if the

sound constantly came from a single direction (such as a long drive down a straight country road), then

it could become fatiguing. In general, users could see themselves using the system.

Additionally, some users did find that having the sound come from a direction helped them develop

better understanding of their orientation. They also expressed that the motion of the sound during the

turn was a better indicator than the absolute sound direction itself, most likely due to the dead spots in

the system mentioned above. This problem could be solved using a setup with more speakers.

This prototype strengthened the requirement that the method of conveying information needs to be

non-invasive. People found changing sound direction to be non-distracting, and therefore found it far

more pleasant than the belt from our CEP.

Another critical learning for future research and testing was that user training is very important. Training significantly helped users understand the system, and much of the testing to date was too short for the user to become acclimated. Even after including training time in the CFP, testing over longer periods and testing with the same person on multiple runs would have led to better results.

The goal of the Dark Horse prototype was for the team to get completely out of the norm and try

something different and crazy. After searching for a sufficiently unorthodox form of communication and

technically challenging design space, smell was chosen--more specifically, could smell be used to provide

unobtrusive navigational information?

The motivation for this area of testing came from the strong tie between memory and smell; a human’s

sense of smell is the strongest sense to induce a memory. The goal was to train a user to correlate

certain smells to percentage of route completion.

The idea is that if passengers smell the same scent halfway through their routes every single trip, then

over time they will train themselves to associate that scent with being halfway to their destination.

Afterward, every time they smell it they will instantly know they are halfway done. The motivation

behind providing distance along the route is that it was a common need for many people. People always

seem concerned with how much time is left in their journeys, especially if it is a longer ride or a boring

commute. Smell could be an unobtrusive method to convey this information.

Other unexplored options of using smell included:

- Different scents for different locations (e.g. Palo Alto always smells like grass, Berkeley like lemons, etc.)


- Short scent burst when entering a new area (to alert user that scenery/location has changed)

There are many considerations to take into account when dealing with smell. The ones focused on were:

- Controlling the intensity and release of the scent
- Preventing cross-contamination between scents
- The best form in which to store the scent (e.g. solid wax, liquid, gas canister, etc.)
- Handling olfactory fatigue (the reduced ability to perceive a smell after prolonged exposure)

A couple of functional prototypes were developed to determine whether such a system would be technically achievable. The first one consisted of resistors that melted different scented waxes to release scent; it was quickly discovered that this would take either a significant amount of time or a significant amount of power. Additionally, from a thermodynamics standpoint, repeatedly melting a solid wastes a lot of energy.

After that, a second prototype was made that had multiple compartments filled with scented oils. These compartments had valves that opened and closed based on the desired scent, which was channelled into the main flow towards the user.

While these prototypes showed potential for a working product, at their current level of quality they were inadequate for use in a proper in-car test. Therefore, a Wizard of Oz setup was utilized for user experience testing. A person in the back of the car would open a vial of scented oil and put it behind a fan that blew at the user (Figure 12); the air entrained the scent as it passed over the vial. This approach was preferred to pumping the scent directly at the user, as the scent was less overwhelming: with the fan setup the scent was more dispersed and therefore more pleasant.

Figure 12: Dark Horse Wizard of Oz Block Diagram

As stated earlier, the CFP showed that training periods and longer tests were critical for ambient communication. Therefore, the Dark Horse user experience testing consisted of two twenty-five-minute drives conducted on each of three different days, using six routes in total.

The first two days were training days, where the users were allowed to look around and correlate the

smells to progress. For additional reinforcement, an LED progress bar was shown to the users to give

them a general idea of route completion. Throughout the drive, the users were instructed to note

whenever they noticed a new smell. Different scents were released at 50%, 75%, and 85% of route



completion, as well as at three minutes before arrival at the destination. The non-linear schedule was

chosen to focus on when people may care more about how far they are (having a three minute warning

to their destination is good for informing users to start wrapping up whatever they are working on,

while being notified before reaching halfway may be unnecessary). The scents released at each

checkpoint were always in the same order. In between each scent, coffee grounds were used to cleanse

the olfactory palate and prevent olfactory fatigue.
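For reference, a minimal sketch of how this release schedule could be automated with the valve-based functional prototype is shown below; in the actual tests the scents were released by hand. The valve pins, the burst duration, and the stubbed route-progress and time-to-arrival inputs are all assumptions.

```cpp
// Hypothetical Arduino sketch of the Dark Horse release schedule: open a different
// scent valve at 50%, 75%, and 85% of route completion, plus a final one three minutes
// before arrival. Route progress and time-to-arrival are stubbed out here; in the
// actual tests the scents were released by hand from the back seat.

const int VALVE_PINS[4] = {4, 5, 6, 7};          // one solenoid valve per scent (illustrative)
const float CHECKPOINTS[3] = {0.50, 0.75, 0.85}; // fractions of the route
bool released[4] = {false, false, false, false};

float routeProgress()    { return 0.0; }   // stub: fraction of route completed, 0..1
float minutesToArrival() { return 99.0; }  // stub: estimated minutes to the destination

void pulseValve(int index) {
  digitalWrite(VALVE_PINS[index], HIGH);   // release a short burst of scent
  delay(2000);
  digitalWrite(VALVE_PINS[index], LOW);
}

void setup() {
  for (int i = 0; i < 4; i++) pinMode(VALVE_PINS[i], OUTPUT);
}

void loop() {
  float progress = routeProgress();
  for (int i = 0; i < 3; i++) {
    if (!released[i] && progress >= CHECKPOINTS[i]) { released[i] = true; pulseValve(i); }
  }
  if (!released[3] && minutesToArrival() <= 3.0) { released[3] = true; pulseValve(3); }
  delay(1000);
}
```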

On the third day, the users were told to do a task for the whole drive and not report back when they

noticed a new scent, instead just focusing on their tasks. At the end the users were asked about their

experience. Did they like the scents? Were they distracting? Did they even notice them?

There was some interesting feedback from the users:

- Some users did not associate particular smells with a given time to their destination. Instead, they counted scents--when they had detected four different ones, they knew they were close to their destination.
- They felt the experience was relaxing, and having the different smells change while reading was soothing.
- It definitely helped them understand how close they were to their destination, though they believed there was a more effective way of conveying the information.

There were also a few unexpected observations:

- Both users had an initial reaction to look up when they detected a smell. This was not intended but could lead to using smells as a warning system.
- Smells were detected more when the user was working silently. If the user was having a conversation, some smells were missed.

There are a few key things to take away from these results. First, as anticipated, smell was a particularly

ambient form of communication. Users did have a better idea of their route completion without being

distracted by the information or having to focus all of their attention on it.

Additionally, it is important that users detected the information when they were busy working or bored, yet were able to completely ignore it when they were entertained and did not care how far along they were. That means the communication is not too intrusive, and that it is very context dependent.

The Dark Horse prototype further confirmed the requirement that the system be non-invasive and unobtrusive. Users said they enjoyed the experience precisely because the system was those two things.

It also helped refine the team’s definition of ambient communication. It showed that training is an

integral part, and that to assess a form of ambient communication properly it takes time for the user to

become acclimated to a system. The feedback and data received were more meaningful than in previous, shorter experience prototype tests.


The goal of the Funktional prototype was to gain experience putting an entire system together—it did

not have to be polished, but it had to be complete. This stage was necessary because previous

prototypes had tested individual subsystems, instead of designing a closed loop system. The funktional

prototype combined visual and vibrational feedback in order to convey turn and speed information. A

second funktional prototype was also made that combined a visual display with an accelerometer to

have a closed system, instead of a system that requires user input.

Two different systems were used for completing a funktional prototype. The first system used three

potentiometers as input to the system. An Arduino was used to control two LED strips as well as

vibrating motors placed on the back of a seat, shown as a block diagram in Figure 13.

Figure 13: Block Diagram of Funktional Prototype 1

This first prototype provided turn information via three LEDs that would move across the strip; for

example, moving right if the car was turning right. A second LED strip was designed to provide both

current and upcoming speed information. The vibrating motors on the back of the user’s chair provided

upcoming turn information as well as large changes in acceleration.
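A hedged sketch of that turn display is shown below: a block of three lit LEDs slides along an addressable strip in proportion to a potentiometer operated by a team member. The NeoPixel library, strip length, pins, and color are assumptions rather than the exact hardware used.

```cpp
// Hypothetical sketch of Funktional prototype 1's turn display: a block of three lit
// LEDs slides along an addressable strip according to a potentiometer that a team
// member uses to mimic the car's steering. Library, pins, strip length, and color
// are assumptions for illustration.
#include <Adafruit_NeoPixel.h>

const int STRIP_PIN = 6;
const int NUM_LEDS  = 30;
const int POT_PIN   = A0;

Adafruit_NeoPixel strip(NUM_LEDS, STRIP_PIN, NEO_GRB + NEO_KHZ800);

void setup() {
  strip.begin();
  strip.show();                                           // start with all LEDs off
}

void loop() {
  int reading = analogRead(POT_PIN);                      // 0..1023, center = straight
  int center  = map(reading, 0, 1023, 1, NUM_LEDS - 2);   // center of the 3-LED block
  strip.clear();
  for (int i = center - 1; i <= center + 1; i++) {
    strip.setPixelColor(i, strip.Color(0, 0, 255));       // the block drifts right for a right turn
  }
  strip.show();
  delay(30);
}
```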

The second prototype tried to move away from a Wizard-of-Oz type implementation. Instead of relying

on user input, this prototype was controlled via an accelerometer. The accelerometer provided data that

would be conveyed to the user via an LED strip. Once again this prototype was controlled via an Arduino,

shown as a block diagram in Figure 14.

Figure 14: Block Diagram of Funktional Prototype 2

As various accelerations were measured, the LED strip would change color among red (negative forward acceleration), blue (no acceleration), and green (positive forward acceleration). More details about the system implementation can be found in Appendix C.
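A minimal sketch of that acceleration-to-color mapping is shown below; the analog accelerometer, pins, and thresholds are assumptions made for illustration, and the working code is in Appendix C.

```cpp
// Hypothetical sketch of Funktional prototype 2: the forward axis of an analog
// accelerometer drives the color of the whole strip (red = braking, blue = steady,
// green = accelerating). Sensor wiring, calibration, and thresholds are illustrative.
#include <Adafruit_NeoPixel.h>

const int STRIP_PIN = 6;
const int NUM_LEDS  = 30;
const int ACCEL_PIN = A1;     // forward axis of an analog accelerometer (e.g. ADXL335)
const int ZERO_G    = 512;    // raw reading at 0 g; would be calibrated in practice
const int DEAD_BAND = 20;     // ignore small accelerations

Adafruit_NeoPixel strip(NUM_LEDS, STRIP_PIN, NEO_GRB + NEO_KHZ800);

void setup() {
  strip.begin();
  strip.show();
}

void loop() {
  int offset = analogRead(ACCEL_PIN) - ZERO_G;   // positive = speeding up, negative = braking
  uint32_t color;
  if (offset > DEAD_BAND)       color = strip.Color(0, 255, 0);   // green: accelerating
  else if (offset < -DEAD_BAND) color = strip.Color(255, 0, 0);   // red: braking
  else                          color = strip.Color(0, 0, 255);   // blue: no acceleration
  strip.fill(color);
  strip.show();
  delay(50);
}
```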

Because this prototype was mostly used to test system functionality, user testing was not a priority.

Therefore, each prototype was only tested with two users. The results for the first prototype were poor.

Test subjects found the system confusing and were unsure what information the LEDs were trying to


convey. However, users responded more positively to the more discrete vibrational feedback. Another

useful insight was how intuitive users found the LED strip that was used to convey turn information.

With the second prototype, most of the feedback was directed towards the functionality of the system.

Because this test was the first prototype where the control was not input by a user, there were issues

with the accuracy of the information being conveyed. However, the information from the single LED

strip was easier to process than the combined LED strip display.

There were many requirements learned from this prototype. For example, it was learned that there is a

limit to how much information one can encode in a single sense. If too much information gets encoded,

the ambient nature of the information is lost, as the user must pay more attention to the LED strip to

understand the car’s message.

Another learned requirement is that the information must be easy to learn. This was demonstrated by the LED strip used to indicate upcoming turns. Users found this information easy to understand because it leveraged sensations they already associate with turning: a flashing yellow light indicated severe turns, which is very similar to a blinker. Therefore, if the final system can build on sensations that users are already familiar with, it will be easier to adopt.


This semester has been a continuously shifting journey towards finding a nugget to use in the final solution. It has been challenging, since the initial challenge is so broad and futuristic that it did not help us narrow down to a certain focus area or understand an actual use case. Therefore, the team allowed themselves to explore many different topics, from motion sickness, to working situations, and even elderly citizens, with each exploration starting from a previous prototype. All of these explorations have provided insights that, in the end, will become valuable knowledge and background for decision making when designing and finishing the final solution. The following sections describe the activities that turned out to be particularly useful.

Understanding motion sickness

As we were rounding off the previous quarter with "context awareness" and "trust issues", we were keen on exploring whether or not motion sickness had an impact on the user's trust towards the vehicle. We started building a setup where we could simulate driving in our workshop and monitor users using a Microsoft Kinect 2.0. We made a small simulator using a spare widescreen television, along with some plywood sheets and a chair (Figure 15). The simulator had room for one person, who would watch a pre-recorded video of a car driving on a racetrack. The simulator gave us some indication that it is possible to make someone motion sick by having them watch a pre-recorded video, although there were great individual differences.

Figure 15: The pictures show the Kinect system in the lab

Communication through fans and air-flow changes

We thought (and still think) that having a way to deal with motion sickness would have a huge impact on

the current car industry. From benchmarking and testing, we learnt that motion sickness develops much

quicker when the users’ internal sense of motion does not match what they observe visually. We started

prototyping how to replace the visual reference by using other senses, and thought wind would be a

good place to start. By using two fans in front of the user, we made a wizard-of-Oz prototype that used

airflow for pre-cueing the upcoming turns and accelerations.

Although we made the fans work quite well, our prototype was not comfortable to use, and gave a slight

claustrophobic feeling to some users. We had also tried making some artificial horizons by using viscous

liquids in transparent containers, but did not get any results that were conclusive enough to produce a

“nugget”.


User Studies

Throughout this semester we have continuously related the final solution to a particular use context. This context has kept changing, which has created the need for many different insights from users. Hence, we have carried out several user-involved activities; some of them and their findings are described below.

Prototype conversations at the Central Station

At the central station in Trondheim we brought several prototypes to initiate a discussion about the future autonomous car. It was interesting to see the wide differences between people: some were afraid of the technology and did not consider an autonomous car a product for them, while others said they would trust a car more than an unknown human driver.

Observations on the train to Værnes Airport, Trondheim

In this session we observed people taking the train back and forth to Værnes Airport. We quietly

observed what people were doing while being transported and recorded the sound in the train.

People were mostly:

- Looking out of the window
- Sleeping
- Talking to each other
- Working on a laptop

It seemed like people who had chosen the quiet zone were more introverted (either working or sleeping) than the people in the "normal" zones.

The user studies at the central station of Trondheim, the observations on the train to Værnes, and additional information on Norwegian businessmen led to the development of the persona Thor (Figure 16).


Figure 16: When focusing on businessmen we created the persona Thor


User studies with elderly people

The user studies with elderly people involved interviews and observations, as well as trying out a full-body age-simulation suit ("old suit") owned by the Institute of Product Design at NTNU (see Figure 17).

Focusing on elderly people we became aware of certain needs that could be expanded to address

younger users as well. These needs concerned:

- removing complexity
- keeping control
- providing information on the intentions of braking and acceleration
- providing information on what the car has "seen"

Figure 17: Jørgen in an old suit simulating the physical disabilities of an 80-year-old

Workshop with TrollLabs

Through the workshop with people from our research group, TrollLabs, we became aware of how easy it is for humans to learn new rules when it comes to understanding information. For example, the stimulation through smell was easily understood by the different participants (Figure 18).

When it comes to improving the car experience, the following topics were mentioned:
- getting feedback without disturbing the eyesight
- removing less useful information from the surroundings
- controlling the mode of the car regarding speed and air environment


Figure 18: Workshop with people from TrollLABS. Leading the way through smell.

What should the final solution do - requirements?

When observing an intersection close to our office for several hours, we identified the need for clear interaction between autonomous cars and other participants in traffic, such as drivers of human-controlled vehicles (cars, bikes, motorcycles, trucks, busses, and so on), but also pedestrians. The intersection has clear right-of-way rules, but nevertheless the traffic flow during rush hour was not determined by those rules; it was determined by hesitant and decisive drivers and pedestrians negotiating the order at the intersection. This led to a high degree of uncertainty, resulting either in hesitation and no one acting or, worse, in many participants acting at the same time, sometimes creating dangerous traffic situations. We also observed many stop-and-go movements as drivers changed their decisions based on the other drivers' behavior, with both vehicles stopping and driving at the same time. Overall, these observations, combined with previous findings, led us to focus on how to communicate the intentions of the future autonomous vehicle.

Communicating intentions outside the car

To eliminate this uncertainty we tried to communicate the intention of the autonomous vehicle to its

surroundings. We wanted to communicate the presence of the autonomous vehicle and whom it had

seen as well as the intention of braking and waiting or accelerating and maintaining speed.

So far, we tested the use of the headlights of the car as indicators for stopping at a pedestrian crossing.

We turned them off as soon as it was obvious that a pedestrian wanted to cross the road. This signal

caused irritation as pedestrians are not used to this. However, they described the signal as intuitive and

helpful when using it a second time. In the same context, we tried signs with text indicating that the car

will stop and wait. This text was far less intuitive, sometimes overlooked from a distance, and even

though it was a short text, only fully recognized too late, usually while already walking in front of the car.

Communicating intentions inside the car

The footplate idea was the result of a wish to communicate the intentions of the car in a non-intrusive but still intuitive way. Further ideating led us to an early prototype consisting of a wooden plate, two wires, and an operator with a handle sitting in the back seat, black-boxing the angle adjustment. After three external test subjects gave surprisingly good feedback, we saw the need for a more automatic system.


This second version will be tested next quarter and will hopefully give the passenger an increased feeling of the car "coming alive". In order to do this we need to mimic the intention of the autonomous car. Since we do not have an autonomous car, we use an ultrasonic range meter attached to the brake pedal to measure whether the driver's foot is hovering above the pedal. If that is the case, we interpret it as a braking intention before the driver actually steps onto the brake and decelerates the car.

Figure 19: Footrest communicating intention of braking, acceleration, and actual braking
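A minimal sketch of that detection logic is shown below; the HC-SR04-style sensor, the servo-driven plate, the pins, and the hover threshold are assumptions made for illustration rather than the exact NTNU implementation.

```cpp
// Hypothetical sketch of the footplate logic: an ultrasonic range finder watches the
// driver's foot above the brake pedal; if the foot hovers close to the pedal, a servo
// tilts the passenger's footplate to signal a braking intention. Sensor type (HC-SR04
// style), pins, threshold, and servo angles are illustrative assumptions.
#include <Servo.h>

const int TRIG_PIN  = 9;
const int ECHO_PIN  = 10;
const int SERVO_PIN = 3;
const float HOVER_THRESHOLD_CM = 8.0;   // a foot closer than this counts as "hovering"

Servo footplate;

float readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long duration = pulseIn(ECHO_PIN, HIGH, 30000);   // echo time in microseconds, 30 ms timeout
  return duration * 0.0343 / 2.0;                   // convert to centimeters (speed of sound)
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  footplate.attach(SERVO_PIN);
  footplate.write(90);                              // neutral plate angle
}

void loop() {
  float distance = readDistanceCm();
  bool brakingIntention = (distance > 0 && distance < HOVER_THRESHOLD_CM);
  footplate.write(brakingIntention ? 70 : 90);      // tilt the plate when braking is expected
  delay(100);
}
```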

Our solution compared with the initial requirements from Renault and those presented in the earlier report

Keywords from our initial functional requirements were unobtrusive, clearly indicative, communicates useful information, and easy to learn. By evaluating the first prototype of the footplate against these requirements, we got the following relevant feedback from test subjects.

Table 3: NTNU Prototype Requirements

Requirement: Unobtrusive
Feedback: Distraction while reading was minor. Should not be required to keep feet on the plate at all times.
Reflection: By keeping the plate as an optional position for the feet, you give the passenger a choice whether or not to receive the information.

Requirement: Clearly indicative
Feedback: Movement of the plate when the driver was just hovering over the gas pedal was confusing. One subject figured it out by himself, and another needed a one-minute explanation.
Reflection: Intuitively, the movement of the plate corresponds to the movement of the gas/brake pedal.

Requirement: Communicate useful information
Feedback: All subjects commented in some way that the car came alive with this additional information.
Reflection: This feeling, shared by all our subjects, was surprising and interesting enough to look into further.

Requirement: Easy to learn
Feedback: The first prototype had wires that needed to go on the inside of the seatbelt. The new one will presumably be perceived as a footrest, without raising questions about how to use it.
Reflection: We could draw the outline of two feet on the footplate. At the same time, less is probably more in this matter.

For our group it has been very interesting and challenging to work in such a diverse group. Normally, when you work with people of the same background, you mostly spend time discussing analysis, where to go from a certain finding, or other "after-process" decisions. We have spent a lot of time discussing how to approach the task, which methods to use, and how long to work in a certain area. By that we mean that when you work in a multidisciplinary team, an extra dimension of discussion is added when deciding how to approach the task. On the other hand, the diversity comes in handy when ideating and dealing with new problems: the different backgrounds provide knowledge and inspiration for new solutions.

Still, for a multidisciplinary team to work fruitfully, we believe there must be mutual acceptance of different backgrounds, but also certain rules to follow when decision sessions get stuck and no one is willing to give way to approaches other than their own. This could be mediated via rules decided on before starting a project, or a group leader making the final decisions on where to go next.

Long-distance group work is even more challenging than the multidisciplinary part of the teamwork, since ownership of ideas, knowledge sharing, and the time difference create many barriers to the flow of teamwork.

Working with an explorative and futuristic problem brief from Renault has been challenging. The statement "We heard about ambient communication…how can we use it in cars…and maybe increase the trust relation at the same time" is so broad that we have had many difficulties defining what problem we were actually looking to solve.

At the same time, the "prototyping mantra" of the course forced us to start prototyping from a very early stage, when we did not yet know what to look for. In hindsight, we could have used our prototypes in a more explorative, future-insight way, as is also suggested by Carleton [6].


With the various design requirements, user surveys, and research studies, a nugget was found:

Improving the car-driver team dynamic via trust, gained through the user knowing more about what the

car knows and intends to do. In order to test the validity of this nugget a functional prototype was

developed.

The goal of the functional prototype was to test how well the design vision would be received. This

vision was the pairing of an ambient and explicit display conveying the perception and intention of the

autonomous car. For this prototype, the perception aspect was the main focus. This system would find

obstacles around the car and illustrate where the obstacles were to the passenger.

This prototype involved using a Kinect 2.0 sensor. The Kinect sensor has a plethora of functions, including object detection, object tracking, and facial detection. One feature of the Kinect is a point cloud depth image, which provides a depth value for every pixel within the Kinect's resolution. This depth frame was filtered and windowed in order to obtain depth data for obstacles in front of the car.

Once the Kinect data was acquired, it was mapped to an angle around the user. A 2 m long NeoPixel LED strip was used to provide 180° coverage around the passenger. The LED strip only output a single color and did not differentiate between objects: if an object was in the windowed frame of the Kinect, it was displayed on the LED strip (Figure 20).

Figure 20: Photo of LED Strip for Functional Prototype


The Kinect sensor was attached to a laptop that then communicated to an Arduino via a COM port. The

Arduino drove the LED strip, shown as a block diagram in Figure 21.

Figure 21: Block Diagram of Functional System

More details about this prototype can be found in Appendix C.
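On the display side, the main job is mapping each obstacle bearing received from the laptop to a position on the 180° strip. A hedged sketch of that mapping is shown below; the one-byte-per-bearing serial protocol, pins, and strip length are assumptions, and the actual implementation is in Appendix C.

```cpp
// Hypothetical sketch of the Functional prototype's display side: the laptop sends
// obstacle bearings (0-180 degrees, one byte each, with 255 marking the end of a
// frame) over serial, and the Arduino lights the corresponding pixels on a NeoPixel
// strip wrapped 180 degrees around the passenger. Protocol, pins, and strip length
// are assumptions; the actual code is in Appendix C.
#include <Adafruit_NeoPixel.h>

const int STRIP_PIN = 6;
const int NUM_LEDS  = 60;   // 2 m strip covering 180 degrees

Adafruit_NeoPixel strip(NUM_LEDS, STRIP_PIN, NEO_GRB + NEO_KHZ800);

void setup() {
  Serial.begin(115200);
  strip.begin();
  strip.show();
}

void loop() {
  if (Serial.available() == 0) return;              // nothing received yet this pass
  strip.clear();
  while (true) {
    while (Serial.available() == 0) {}              // wait for the next byte of the frame
    int angle = Serial.read();
    if (angle == 255) break;                        // end-of-frame marker
    int led = map(constrain(angle, 0, 180), 0, 180, 0, NUM_LEDS - 1);
    strip.setPixelColor(led, strip.Color(255, 255, 255));  // single color, as in the prototype
  }
  strip.show();
}
```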

This prototype was tested with three users at different times of the day--testing at different times of day

was important because daylight greatly affects both the visibility of the LED strip, and the accuracy of

the Kinect sensor. Fortunately, all users were able to effectively see the LED feedback, and the Kinect

sensor functioned in both day and night.

In terms of user feedback, the results were very positive. Users enjoyed the experience of the LEDs and commented that seeing what the car sees allowed them to trust the technology more, which was the precise goal of the system. These results suggest that adding transparency to autonomous car technology can improve the user's feeling of trust.


Our vision for the final system incorporates parts of both the Stanford team's Functional system and the NTNU team's braking prototype. The end goal is to provide the passenger with information about what the car's perception system sees, as well as specific information about the car's intentions.

The expected system will present three forms of communication to the user: a visual ambient display, an

ambient foot pedal, and an explicit visual display (Figure 22).

Figure 22: Final System Vision

The ambient visual display will extend around the passenger’s field of view. While the passengers can

look out the window and clearly see what is around them, this will assure them that the car’s perception

system sees nearby obstacles as well. This setup meets the requirement to increase the passenger’s

awareness of the vehicle’s knowledge, as well as increase their trust in vehicle safety.

The exact method of display is still to be determined (a diffused LED strip, a projected image, etc.). Regardless of the final method, the system will need to detect the ambient lighting conditions and adjust the brightness of the display accordingly, so that it is not overpowering at night and remains visible in bright daylight; this meets the physical requirement that the system be comfortable for the user.
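As a sketch of how that brightness adaptation could work, the fragment below reads a simple light sensor and scales the strip brightness accordingly; the photoresistor input, pins, and mapping range are assumptions, since the display hardware itself is still undecided.

```cpp
// Hypothetical sketch of the ambient-brightness requirement: a photoresistor on an
// analog pin estimates the cabin light level, and the display brightness follows it
// so that the strip is dim at night and bright in daylight. Sensor, pins, and the
// mapping range are illustrative assumptions.
#include <Adafruit_NeoPixel.h>

const int STRIP_PIN = 6;
const int NUM_LEDS  = 60;
const int LIGHT_PIN = A2;        // photoresistor in a voltage divider

Adafruit_NeoPixel strip(NUM_LEDS, STRIP_PIN, NEO_GRB + NEO_KHZ800);

void setup() {
  strip.begin();
  strip.show();
}

void loop() {
  int ambient = analogRead(LIGHT_PIN);                  // 0 (dark) .. 1023 (bright)
  uint8_t brightness = map(ambient, 0, 1023, 10, 255);  // never fully off, never blinding
  strip.setBrightness(brightness);
  strip.fill(strip.Color(255, 255, 255));               // placeholder display content
  strip.show();
  delay(200);
}
```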

At the passenger’s feet there will be a moving pedal or floor board that tilts depending on upcoming

braking or acceleration, based off of NTNU’s prototype that explored conveying vehicle intentions to the

user. From early testing this already had positive results in increasing the passenger’s trust towards the

vehicle. It also meets the requirement of increasing the passenger’s awareness of the vehicle’s

knowledge.


It was never expected that all forms of communication in the system would be ambient—as with

people, explicit communication is sometimes necessary. For that reason, we chose to strengthen the

passenger’s understanding of the vehicle’s view and intentions by presenting an explicit display. This

display will show very similar information to the ambient visual display as well as the moving floor board,

but in a purely explicit and more detailed way. That way the passenger can look at a more detailed

representation if they wish, and it is a good way to train the user on what the system is doing.

It is important to note that this is not a replacement for the other two methods. Realistically a passenger

will not be constantly watching the display, so the ambient visual display will always be seen in the

passenger’s peripherals and the foot pedal will always move.

While some fulfilled requirements have been touched upon, the communication system as a whole

meets some of the other requirements too.

Since each form of communication (visual vs. motion) only conveys one piece of information, it does not

overwhelm the user’s senses. Neither is invasive or obtrusive as the light brightness will be dynamically

adjusted, and the moving floor board motion will be smooth. Users did not find the motion to be

distracting or invasive in testing.

They are also clearly indicative. In both systems, users were able to understand what the system was

telling them with ease.

Users expressed that the communicated information in both systems was useful, especially in the area

of increasing trust in the vehicle’s ability. This will improve the passenger-car team dynamic as the

passenger will henceforth feel like they can see part of the vehicle’s mind.

The communication system is, to us, the most interesting part of the project, as the other systems will

have better technology by the time autonomous cars come to market. Still, the final system will require

a few extra components to simulate information the system would receive from the autonomous driving

algorithm (Figure 23).


Figure 23: Final System Block Diagram

(Figure 23 block diagram: the perception system consists of a Kinect array or LIDAR and an ultrasonic sensor on the brake pedal feeding a computer running the Kinect SDK; the communication system consists of an Arduino, connected over a COM port, driving the ambient visual display and the ambient brake pedal actuator, plus a display output for the explicit visual display.)


A great deal was learned by the massive amount of prototyping done during the winter quarter (which

averaged out to more than one a week), and by some of the additional need-finding. While the large

number of prototypes meant we did not focus as much on any single idea, the wide variety of sensory

outputs used (hearing, smell, touch, and sight) gave us a strong perspective for how users might react to

each case.

Likewise, since we are trying to focus on what could be considered a more acquired ‘intuition’, we chose

to test fewer users, with more time devoted for each test (for example, with smell during Dark Horse,

two users underwent six tests, with only the last two involving fully ambient conditions). This change in

testing procedure allowed for what we felt were more reasonable results, and will likely be how we

continue testing during the spring quarter.

There are two major things we will have to work on during the spring quarter that we did not do as well

as during the winter: the refining process, and creating a full system instead of a ‘Wizard of Oz’ style

simulation. While we needed the wide range of prototypes to get an idea of how we wanted the final

solution to be, we missed out on the chance to practice refining one idea until it is completely solid. The

same rapid changes in plan caused us to mainly use simulations instead of fully integrated systems—

while this gave us the user feedback we needed, it will not work for a final product. However, we have

confidence in our abilities to use the resources around us to fix these gaps.

While the winter quarter was focused mainly on prototyping a wide variety of potential solutions to the

project prompt, the spring quarter will be focused on delivering one completed product by the

beginning of June at EXPE. While we have a relatively firm vision, finalizing that vision will be occurring

while the two teams meet at NTNU during spring break; the following is simply a first-round

approximation.

Table 4: Spring Missions

Mission: Travel/NTNU Collaboration
Description: Stanford team travels to Norway to do in-person prototyping and testing with the NTNU team. Two members of the Stanford team will also be traveling to Paris to consult with the Renault Human-Machine-Interface lab.
Due Date: March 20th-April 5th

Mission: Spring Hunting Plan
Description: Create a comprehensive plan for completing the project by EXPE.
Due Date: April 2nd

Mission: Part X is Finished
Description: Have one (non-trivial) part of the final product completely finished. Probable part: since our project is based so heavily on using detected information, Part X will likely involve either the object detector or a mechanism to detect the pressure on the brake and acceleration pedals.
Due Date: April 23rd

Mission: Manufacturing Plans
Description: Lists of all materials, processes, schedules, quotes from outside retailers, draft of final requirements, etc.
Due Date: April 23rd

Mission: Penultimate Review
Description: Largely completed hardware, with a product that should functionally stand on its own at EXPE, and completed design requirements.
Due Date: May 21st

Mission: EXPE Brochure and Poster
Description: Finished brochure and poster for the EXPE presentations.
Due Date: Draft: May 26th; Final: May 29th

Mission: EXPE Presentations
Due Date: June 4th

Mission: Final Documentation
Due Date: June 9th

When working in teams, particularly ones that are wide-spread geographically, clearly defined roles are

important to guarantee task completion, so that no aspect falls through the cracks. Therefore, both the

Stanford and NTNU teams assigned four roles: Chief Communications Officer, Chief Documentation

Officer, Chief Financial Officer, and Chief Planning Officer.

Table 5: Distributed Team Roles

Chief Communications Officer: Adam Craig (Stanford), Matilde Bisballe (NTNU)
Chief Documentation Officer: Kornel Niedziela (Stanford), Jørgen Erichsen (NTNU)
Chief Financial Officer: Jacob Gowan (Stanford), Carl Semb (NTNU)
Chief Planning Officer: Caroline Flowers (Stanford), Achim Gerstenberg (NTNU)

Communication and collaboration occurred regularly and frequently, using several methods. First, there

was a weekly Skype meeting between the NTNU and Stanford teams, usually lasting about an hour.


Second, there was a weekly or bi-weekly meeting (depending on the amount of pivoting occurring and our corporate liaison's availability) between NTNU, Stanford, and our corporate liaison, also held via Skype.

Periodic check-ins about the state of projects/sharing of interesting news and developments also

occurred via a private Facebook group, limited to the eight members of the combined NTNU and

Stanford teams. Weekly emails to our coaches, with updates and the documents we had worked on during the previous week, kept them in the loop and gave them a forum for providing advice.

Document collaboration occurred through two methods. First, for things that might require

simultaneous input from multiple members of Team Renault (transcripts of team meetings,

brainstorming sessions, survey suggestions, etc.), Microsoft OneDrive was used. For sharing results and

final documentation, DropBox was used.

While in the fall, most of the need-finding and benchmarking was done cooperatively, the winter

quarter saw a divergence in areas of research and testing, with the final results being two subsystems (a

braking system and an object detection system) that are both planned to be included in the final

product, with the NTNU and Stanford teams focusing on each, respectively.

Our project continues to come in well under budget, using less than half of our allocated winter budget.

Table 6: Winter Quarter Budget

Activity Cost

CFP $40.23

Dark Horse $489.74

FUNKtional $89.15

Functional $348.87

SUDS $374

Obtaining Car for Testing $150.91

Personalizing Team’s Loft Area $75

Total Expenditures $1490.90

Allocated Winter Budget $3000.00

Rollover Fall Budget $233.81

Rollover Winter Budget $1,742.91

Most of the budget was spent on prototyping materials: for example, during Dark Horse we needed various kinds of scents, and during Funktional and Functional we needed several kinds of electronic display equipment. This category will likely be the biggest source of expenses during the spring quarter as well.

The biggest concern budget-wise at the moment is obtaining a car to use for testing; while we have been able to use a privately owned car, or rent one, this setup will not work when a car needs to be used for weeks in a row.


Table 7: Estimated Spring Budget

Expenditure Estimated Cost

Display Components $900

Sensing Equipment (up to 5 Kinects) $1000

EXPE $1000

Obtaining a Car $2500

SUDS $300

Total $5400

Available Budget $6250

For our group it has been very interesting and challenging to work in such a diverse team. Normally, when you work with people of the same background, you mostly spend time discussing analysis, where to go from a certain finding, or other “after-process” decisions.

We have found ourselves spending much time discussing how to approach the task, which methods to use, and how long to work in a certain area. By that we mean that working on a multidisciplinary team seems to add an extra dimension of discussion when deciding how to approach the task. On the other hand, the diversity comes in handy when ideating and dealing with newly occurring problems; the different backgrounds provide knowledge and inspiration for new solutions.

Still, for a multidisciplinary team to work fruitfully, we believe there must be mutual acceptance of different backgrounds, but also certain rules to follow when decision sessions get stuck and no one is willing to give way to approaches other than their own. These could be rules decided within the group before starting a project, or a leader role responsible for making the final decisions on where to go next.

This quarter has been a rollercoaster. There were times when I was loving ME310 and the project, and

others where I was frustrated with the design process and the problem we have been given. This quarter I

learned how hard it can be to design around a problem where there are no users. Autonomous cars are

not being driven today, therefore, trying to forecast who our users are and what their problems might

be is insanely difficult. This has been the most frustrating task for me this winter quarter.

This quarter I have also learned how cyclical the ideation process can be. At the start of the fall quarter

we were presented with the issue of trust. We began to focus elsewhere at the start of the winter

quarter. Facilitating task completion was our key need. As we continued to ideate and prototype we

circled back to focusing on trust once again. Now that we have diverged and explored as many spaces as

possible, this spring I am ready to start converging and refining a final system that will result in some

“WOWS!”.

I think working with ME310 is both very interesting and very challenging. Collaborating across such a

geographical distance is hard, although I feel that our team is both willing and able to communicate

despite different time zones and distance. For me, this quarter has been all about divergence, where we


have not been sure what to do, and what our “next big thing” will be. I think we have had a need for

stronger leadership within our team, although I feel this is getting better. Therefore, I am very excited to

see where our project is headed.

While at times it felt like we were completely lost vis-à-vis a final vision as we frolicked hither and

thither between different senses on our prototyping journey, at the same time I feel like we got a lot out

of this quarter. This quarter was also more reliant on the knowledge of those with a mechatronics

background on our team, but that will hopefully change as we move away from rapidly changing

prototyping (while they were more than willing to do the heavy lifting, the other two of us look forward

to being able to contribute as significantly again).

It’s been a process, without a doubt. We moved all over the place, had several, um, very intense

discussions, completely redid our persona (then completely redid our plan so this persona also didn’t

fit), prototyped more than we did anything else, and were accurate if not precise. But that feels like it’s

the point of winter quarter—to be out there and test everything. Yes, at times we may have scared the

teaching team with the wide directional changes, but with as large a design space as we were given, we

definitely needed the amount of divergence we did. However, despite our lack of experience refining

things during the winter quarter, I feel that it is something we can transition to without too much

difficulty. The team dynamic, both among the Stanford team and the combined Team Renault, remains

strong, and I’m excited to see where we’ll go now that we have a relatively solid vision.

About two months ago we had many ideas about how to communicate ambiently. However, it was uncertain what information is most valuable to communicate. This resulted in an aimless trying-out of different ideas with no clear vision. Therefore, we went back to needfinding, and the group decided to focus on elderly people and on finding their pain points. At the same time, I was staring out of my office window during rush hour, noticing a lot of uncertainty between drivers at an intersection. The idea came to mind to use the ambient communication technology we had developed earlier in other drivers’ cars, to communicate intentions of braking/waiting and driving between vehicles. Within the group it felt like we had consensus on this. We quickly developed the tools to measure the intentions of the driver. The hard part, at least with the resources we have in the lab, is how to send that information to the other vehicles, so we instead used it to communicate the driver’s intention to the passenger. Testing showed that passengers like the system a lot, so we spent a lot of time making it work. However, I see this side project as a nice gimmick, and we shifted away from the original, agreed-upon idea, which for me has a lot more “nugget potential”. I have now figured out how to technically solve the wireless communication problem between the cars quickly and with enough range, and I hope to bring the others back to that idea.

This quarter was pretty rough, not only because everyone’s schedules were significantly busier than last quarter, but also because of the troubles of not having an existing user base. Though we tried to avoid conjecture as much as possible, we found it hard to design for anything without making certain assumptions about the future technology and user. I like how we took the bold rather than the cautious

approach where this is concerned. We decided to assume what we could and apologize later if we were


too far off. We found that this made it easier to iterate more quickly and did not diminish the quality of

our work significantly.

I think that the biggest challenge for us in the coming quarter is to more effectively and efficiently justify our reasons for pursuing a particular path, as well as to refine our nugget from rough ore into a highly polished product. I have great faith in my team’s ability to undertake the challenge ahead of us and to produce the “wow” factor that we are seeking.

This quarter was definitely much busier than last, both in this course and in others. I feel like we have lost and found our way time and time again, trying things we may not have expected, or being shot down by our liaison. It’s been a rough journey, but I think we may finally be onto something. There are still some details to iron out, and an international team to convince, but I think we are getting pretty close to the point where we can commit to a goal for the end of spring quarter.

Winter quarter has been very heavy on prototyping. With the prototypes being much more electronic than mechanical in nature, Adam and I found it hard to delegate tasks, as the added overhead of splitting up writing code or handling the electronics didn’t make sense, even when Caroline and Jacob asked how they could help. As a result, Adam and I took most of the prototyping load. I have to thank Adam for covering for me during a week when I had two midterms, as I can only imagine how much work it was. Next quarter I hope that we can better handle delegation, which is something I will focus on, and I think we will be able to better leverage our team’s skills to refine and manufacture the design in spring quarter.

Working with our NTNU team has also been pretty good. There were some weeks where we did not have much to talk about in our meetings, either because they were busy with other things or we were. As a whole, though, I think we have been pretty decent at keeping up to date and bouncing concepts and findings off each other. We have been fortunate that the interdependence between us in winter quarter was not heavy, as we were diverging and trying different things. I think spring quarter will be much more challenging in communicating with NTNU, as we will have to actively work together to converge, and not just have two separate, unrelated systems at EXPE.

This quarter has provided several challenges. Because there is no physical interaction between the two teams, it was hard to give proper feedback and bounce ideas off each other. I think the project would have benefited greatly from an in-person meeting this quarter as well. It has been really fun and challenging to explore and diverge in so many directions. The fact that we are heading back in the direction of what we were doing before Christmas comes with mixed feelings, but it is not necessarily a bad thing.


We were lucky to have a vast wealth of sources for materials and manufacturing in our area.

Jameco Electronics – Electrical Components

1355 Shoreway Road

Belmont, CA 94002

Jameco.com

SparkFun.com – Electrical Components

6333 Dry Creek Parkway

Niwot, CO 80503

Sparkfun.com

Alan Steel – Steel, Aluminum

505 E Bayshore Road

Redwood City, CA 94063

Aluminum REM – Recycled Aluminum

3185 De La Cruz Boulevard

Santa Clara, CA 95054

Stanford PRL – Assorted Materials

447 Santa Teresa Street

Stanford, CA 94305

productrealization.stanford.edu

TAP Plastics – Plastics

312 Castro Street

Mountain View, CA 94041

tapplastics.com

Amazon – Various Other Items

amazon.com

Microsoft Store – Kinect

microsoftstore.com

SurveyMonkey – Used to create and track user surveys


Surveymonkey.com

Walmart – Various Other Items

600 Showers Drive

Mountain View, CA 94040

Shad Laws – Renault

Work: Engineer

[email protected]

Wendy Ju – CDR

Work: Executive Director, Interaction Design Research & Associate Professor

[email protected]

Chris Gerdes – DDL

Work: Lab Director, Dynamic Design Lab & Professor in Mechanical Engineering

[email protected]

Sven Beiker - CARS

Work: Executive Director, Center for Automotive Research at Stanford

[email protected]


[1] S. Mlot, "Mercedes' Driverless Concept Car Turns Heads in SF," PC Magazine, 10 March 2015.

[Online]. Available: http://www.pcmag.com/article2/0,2817,2478085,00.asp. [Accessed 3 March

2015].

[2] R. Group, "Renault Group - About Us," Renault Group, 2015. [Online]. Available:

http://group.renault.com/en/our-company/a-group-an-alliance-and-partnerships/. [Accessed 3

March 2015].

[3] Y. Wang, X. Chen and K. Henrickson, "Pedestrian Detection With Microsoft Kinect," 2014. [Online].

Available: http://onlinepubs.trb.org/onlinepubs/conferences/2014/NATMEC/Wang-Kinect-Pres.pdf.

[4] W. Ju, Interviewee, Car/Driver Relationship Design Research. [Interview]. 17 February 2015.

[5] F. Bellott, R. Berta, A. De Gloria and M. Margarone, "Using 3D Sound to Improve the Effectiveness of

the Advanced Driver Assistance Systems," D.I.B.E., Department of Electronics and Biophysical

Engineering, University of Genova, Genova, Italy, pp. 155-163.

[6] T. Carleton and W. Cockayne, "The Power of Prototypes in Foresight Engineering," in Proceedings of

the 17th International Conference on Engineering Design, Stanford, CA, 2009.

[7] G. Gasior, 4 5 2014. [Online]. Available: http://techreport.com/news/24858/updated-kinect-motion-

sensor-coming-to-the-pc-next-year. [Accessed 2 12 2014].

[8] Navdy, "Navdy," [Online]. Available: http://www.navdy.com/. [Accessed 22 11 2014].

[9] "Face Tracker," [Online]. Available:

http://auduno.github.io/clmtrackr/examples/clm_emotiondetection.html. [Accessed 10 2014].

[10] "Gaze Tracking Library," [Online]. Available: http://sourceforge.net/projects/gazetrackinglib/.

[Accessed 10 2014].

[11] [Online]. Available: http://feelspace.cogsci.uni-osnabrueck.de/. [Accessed 10 2014].

[12] M. Brown. [Online]. Available:

http://www.pcworld.com/article/259997/protect_your_assets_a_buying_guide_to_office_security_

systems.html. [Accessed 12 2014].

[13] "The Style," 5 2009. [Online]. Available: http://yuppiecrap.com/irobot-560/. [Accessed 12 2014].

[14] A. Waytz, J. Heafner and N. Epley, "The Mind in the Machine: Anthropomorphism Increases Trust,"

Journal of Experimental Social Psychology, p. 21, 2014.


[15] [Online]. Available: http://www.gearthhacks.com/downloads/related.php?file=31263. [Accessed 12

2014].

[16] "Wendy Ju Design at Large," [Online]. Available:

https://www.flickr.com/photos/calit2/15471345782/. [Accessed 12 2014].

[17] S. Laws. [Online]. Available: https://www.linkedin.com/in/shadlaws. [Accessed 11 2014].


The project prompt submitted to ME310 by Shad Laws on behalf of Renault can be seen below.

[Renault project prompt, four pages]

As fully autonomous cars for commercial use do not exist yet, we chose to explore a wide variety of sources to gain as much information as possible about the potential future space. Choosing to focus on the theoretical aspects of fully autonomous driving allows us to concentrate on enhancing the dynamic of the car and driver as a relationship of teammates, rather than on the logistics of creating a semi- or fully autonomous vehicle.

As mentioned above, our current definition of ambient is non-explicit communication, more akin to a sense than to a conscious intake and analysis of information (e.g. sensing approximately what time

of day it is by the color of the sky or angle of the sun versus reading a clock). In our search for examples

of ambient communication, we found that current technology exists in one of two categories: user-to-

machine communication and machine-to-user communication.

User-to-machine communication focuses on the machine interpreting non-explicit human

communication to provide a better user experience. While difficult for machines, this ability is almost

universal in the human experience, and therefore sometimes subconsciously expected by humans in

things outside of human-to-human interaction. Giving the machine ways to respond to the user’s

physical outputs allows the creation of a more natural-feeling team dynamic.

Gesture Tracking

Gesture tracking pertains to tracking the movements of appendages in order to quantify them. Currently most gesture technology is used only explicitly, but it could be of use for ambient communication. This type of tracking is already being done with both wearable and optical systems; examples include the Kinect video game sensor and Navdy (explicit communication), a new method of communicating with a smart device while driving using gestures.

Figure 24: (Left) Demonstrating the motion tracking of the Kinect [7]

Figure 25: (Right) User gesturing to accept call with Navdy [8]
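To make the idea of quantifying appendage movement concrete, the minimal sketch below shows how a short window of tracked hand positions (whether they come from a wearable or an optical system) could be reduced to a simple left/right swipe classification. The HandSample structure, the threshold, and the sample values are illustrative assumptions rather than part of any particular tracking SDK.

#include <iostream>
#include <string>
#include <vector>

// Hypothetical tracked sample: a hand position in meters at a timestamp in seconds.
struct HandSample {
    double t;
    double x, y, z;
};

// Classify a short window of samples as a swipe by looking at the net lateral motion.
std::string classifySwipe(const std::vector<HandSample>& window, double threshold = 0.30) {
    if (window.size() < 2) return "none";
    double dx = window.back().x - window.front().x; // net left/right displacement in meters
    if (dx > threshold) return "swipe right";
    if (dx < -threshold) return "swipe left";
    return "none";
}

int main() {
    // Half a second of made-up samples in which the hand moves about 0.4 m to the right.
    std::vector<HandSample> window = {
        {0.00, 0.00, 1.2, 0.5},
        {0.25, 0.20, 1.2, 0.5},
        {0.50, 0.40, 1.2, 0.5}
    };
    std::cout << classifySwipe(window) << std::endl; // prints "swipe right"
    return 0;
}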


Emotion Tracking

Emotion tracking software tracks facial expressions and body language with the intention of quantifying and translating these movements to give the machine the current emotional state of the user. Algorithms are currently being researched that use the tracking of key features to provide an approximation of emotion. An extremely basic version of this technology is clmtrackr, which uses recognition of facial features like eye or mouth shape to hypothesize the displayed emotion.

Figure 26: An example of clmtrackr, analyzing a pleasantly surprised facial expression [9]

Eye Tracking

Using vision and EMG (electromyography) systems, human eye movement can be quantified and analyzed. The data may then be used to determine where the user is focusing and the frequency of eye movements. This data can help the machine react to the user’s behavior in a favorable way.

Figure 27: Gazetracker eye monitoring software [10]

Ambient communication from the machine to the user is a less prevalent field of research, but we believe that there are significant opportunities to explore in this area.

Haptic Feedback

Haptic feedback uses tactile sensations to communicate with the user. Current research in this area focuses not only on the hardware technology (motors, magnets, and more) but also on the applications of haptic feedback (remote communication, virtual reality). An example of ambient haptic technology is a belt, worn by participants in a study, with one point always vibrating toward magnetic north [11]. After several weeks the participants developed an extra, innate sense of where north was without having to be constantly aware of the vibration.
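As a rough illustration of how such a belt could work, the sketch below follows the same approach as our own vibration prototypes: read a compass heading and turn on whichever of eight evenly spaced motors currently points closest to magnetic north. The pin assignments, the eight-motor layout, and the stubbed-out compass read are assumptions made purely for illustration.

// Hypothetical north-pointing belt: 8 vibration motors spaced 45 degrees apart around the waist.
const int NUM_MOTORS = 8;
const int motorPins[NUM_MOTORS] = {2, 3, 4, 5, 6, 7, 8, 9}; // assumed wiring

// Stub for a real compass read (e.g. a tilt-compensated heading in degrees, 0-359).
float readHeading() {
  return 0.0;
}

void setup() {
  for (int i = 0; i < NUM_MOTORS; i++) {
    pinMode(motorPins[i], OUTPUT);
    digitalWrite(motorPins[i], LOW);
  }
}

void loop() {
  float heading = readHeading();
  // The motor pointing toward magnetic north sits at -heading relative to the wearer's
  // front, so pick the closest of the eight 45-degree sectors.
  int northMotor = ((int)((360.0 - heading) / 45.0 + 0.5)) % NUM_MOTORS;
  for (int i = 0; i < NUM_MOTORS; i++) {
    digitalWrite(motorPins[i], i == northMotor ? HIGH : LOW);
  }
  delay(100);
}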


Benchmarking non-automotive ambient and autonomous technologies provided a good opportunity for

us to learn how these spaces develop; this is especially useful as semi-autonomous vehicles are in their infancy, but other spaces have already been through an autonomous adoption period.

Autonomous Consumer Products

There are countless autonomous and semi-autonomous products that are readily available to the

average consumer. Products like the Roomba vacuum cleaner are trusted to adjust to changes in their

environment without directly needing to be informed of them by the user. Automated security systems,

particularly those that can be armed through mobile apps, are likewise trusted to respond to remote

commands. However, the trust implicit in both of these examples comes from the situation being low-risk, in the case of the Roomba, or easy to verify as working correctly, in the case of security systems.

Figure 28: An alarm system autonomously monitoring the safety of a house [12]

Figure 29: A Roomba making its rounds, autonomously cleaning [13]

Human-Machine Interactions (HMI)

Studies [14] show that users appreciate interacting with a machine that has human-like traits, such as a

voice, a name, or something that gives it personality; it is easier for people to understand and justify the

behavior of a machine if it is slightly anthropomorphized. In general, being able to see a machine as

comparable to a human makes the user experience better and the machine more trustworthy.


Travel

Although ubiquitous autonomous travel is a very long way off, there are current examples of

technologies that could be applied or be considered analogous to autonomous vehicles. A popular

analogous technology that many automotive companies are focusing on is autopilot for airplanes,

though this example may in some ways be a false analogy, as the time frames and error margins for

autonomous driving are much smaller than those for commercial planes; if there were an error with the

controls in a plane, there could be up to several minutes before the plane would hit the ground, while in

a car there might only be a matter of seconds. Also, while the results are often much more catastrophic

with commercial airline complications, those events are far less common. Likewise, with the training

pilots receive, if the autopilot failed the human pilots could retake control without disaster. Last, most

commercial passengers have no experience operating a plane, nor much comprehension of the logistics.

As a result, there is a high level of trust in these technologies.

Figure 30: The cockpit of a commercial airliner showing the complexity of the controls [15]

We were lucky to have very knowledgeable experts at our fingertips to talk with early in our research.

These experts helped us explore some technologies that were previously unknown to us, as well as enlightening us about what they have witnessed in the space.

Wendy Ju – Stanford Center for Design Research (CDR)

A professor of mechanical engineering, Wendy Ju has focused her research on the intricacies of the human experience in automobiles. One of her projects focuses on providing pre-haptic feedback for turning, using a mock-up of an autonomous car. Another of her projects measures how people respond in a semi-autonomous vehicle simulator. One of the insights that she has shared so far is that people mistrust a vehicle that disobeys them, even if it is doing so to be safer or to obey the law. Her group has also seen that one of the most common responses for someone who is being semi-autonomously driven in the simulator for more than about eight minutes is to fall asleep. However, while this

automatic reaction could be a major problem for semi-autonomous driving, it would not necessarily be

so for the user of a fully autonomous car.


Figure 31: Wendy Ju of Stanford CDR [16]

We have plans to return and consult with Wendy periodically to get her guidance on our progress.

Shad Laws – Renault

In addition to being our corporate liaison, Shad has provided us with indispensable input in our project;

for example, he steered us towards exploring extreme users of ambient communication between two

living creatures, such as horse riders and people with service dogs. Shad also cautioned us about the “hey dummy” effect; as described by him, this phenomenon is the tendency for systems to function more as warnings at an almost insulting level, rather than as active, positive contributors to the user/machine relationship, which leads to people choosing not to use them. Knowing this tendency, and how to avoid it, has helped to drive our design choices. Shad also provides extra sources of expert opinion, such as a visit to the Nissan research facility in Sunnyvale to listen to a talk from a professional pilot who works on the Google self-driving project.

Figure 32: Shad Laws, our liaison extraordinaire [17]

Although there are no commercial self-driving cars available to the public yet, there are several precursor technologies that are being used in semi-autonomous cars.

Warning Technologies

There are technologies currently on the market that, while helpful to semi-autonomous drivers, are not

in the spirit of what we are seeking to create; they are either overly explicit or employ the ‘hey dummy’

effect that Shad has described. Examples of these types of technology are blind spot detection and lane departure warning, which function more as reminders that you are doing a poor job of driving rather than helping the driver work in harmony with the car.


Teammate Technologies

There are some technologies, however, that are already embodying the teammate-style relationship.

These technologies take a few good first steps towards a teammate-style relationship by giving the car

some agency to impact the drive, which gives the driver confidence in the car and in themselves while

driving. Examples of this technology are self-adjusting cruise control and automatic braking. [7]
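As a sketch of the kind of agency these teammate technologies exercise, the snippet below shows a toy distance-keeping step for self-adjusting cruise control: it trims the commanded speed so that the gap to the car ahead approaches a desired time gap. The gains, the two-second time gap, and the example numbers are illustrative assumptions, not any manufacturer's implementation.

#include <algorithm>
#include <iostream>

// One control step of a toy adaptive cruise controller. Inputs are the car's own speed (m/s),
// the driver's set speed (m/s), the measured gap to the lead vehicle (m), and the rate at
// which that gap is changing (m/s, positive when the gap is opening).
double accStep(double ownSpeed, double setSpeed, double gap, double gapRate) {
    const double desiredTimeGap = 2.0;             // seconds behind the lead vehicle
    const double kGap = 0.2, kRate = 0.4;          // illustrative proportional gains
    double desiredGap = desiredTimeGap * ownSpeed; // meters
    // A positive correction slows the car to open the gap; a negative one lets it close.
    double correction = kGap * (desiredGap - gap) - kRate * gapRate;
    double command = std::min(setSpeed, ownSpeed - correction);
    return std::max(0.0, command);                 // never command a negative speed
}

int main() {
    // Driving at 25 m/s with the set point at 30 m/s, but only 30 m behind a car that is
    // slowly pulling away: the controller holds the speed well below the set point.
    std::cout << accStep(25.0, 30.0, 30.0, 1.0) << " m/s" << std::endl;
    return 0;
}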

Through our benchmarking we learned that the critical part of any autonomous device (whether it be a car, plane, or vacuum) is trust. Trusting machines is something that is hard for people to do, even when it can be empirically shown that machines are often better at their tasks than people are. The path that automotive manufacturers would like to see is displayed in Figure 33. In fact, they would like to minimize generation 3, where the public’s level of trust in their cars is projected to decrease.

Figure 33: The projected path of trust as more autonomy is introduced.

To learn more about this all-important trust question, we interviewed a wide range of users, both around Stanford and in Norway. We divided our users into three major groups: the general public, current users, and extreme users.

Extreme users give us a new perspective. We chose to interview extreme users including the visually impaired, persons with extreme motion sickness, horse riders, and technology junkies. We found that the people in this group had vastly different interactions, both with cars and with the people that drive them. For instance, those who are very susceptible to motion sickness need constant awareness of their motion, as well as a carefully controlled atmosphere in the car, to avoid being sick.


Conversely, automobile enthusiasts actually enjoy the feeling of driving, and as a result would not enjoy passively sitting in a car.

Severe commuters would enjoy a more connected car that provides more opportunity for productivity on their long drives, whereas the most important thing for a control freak would be to be completely aware of the situation and to have a high level of agency even in an autonomous vehicle. These users show that, especially among fringe groups, the idea of an autonomous vehicle means vastly different things: for some it is an opportunity and a convenience, but for others it is terrifying or detrimental to the experience.

The current users that we interviewed are people who are regularly around autonomous machines, and this was an important area for us to explore. These people have extensive experience with either automation or with situations where autonomous vehicles may one day be functioning. An example is someone who is routinely driven around in India; the traffic there is such a scary experience that the person has to actively distract themselves from observing the road to keep calm. Other examples include pilots and train conductors. We learned from this group that with increased automation you tend to have less situational awareness, and as a result you are less engaged in the activity. This has been a major cause of many of the recent airline crashes: the disconnect between the autopilot system and the pilots leaves holes where accidents can happen. But even with those accidents, the introduction of autopilot has drastically reduced the risk of flying. We hope that this holds true for autonomous vehicles as well.

In general, the public has very little conception of autonomous vehicles, as they do not exist yet. Most assume that for them the perfect ride will be the smoothest ride. Most general users also desire increased productivity while driving, while simultaneously not being surprised by the car’s actions.

We found that our instincts with regard to trust were really close to the mark. Some users do not trust autonomous technology at all, such as control freaks or most general users. Some users, however, trust autonomous technology too much, such as pilots and horse riders. We think that this gap in trust is the place where ambient communication can most effectively improve the user and machine interaction.

We also established the needs of our users. To summarize:

1) The smooth ride is the perfect ride: Passengers trust the driver more and feel safer when the ride is

smooth. If it is rough then they feel the conditions are less safe and have less trust in the situation as a

whole (driver, road conditions, etc).

2) Know the motion of the vehicle: For those who suffer from motion sickness it is important to feel the

car’s motion. As they lose these cues they become more sick due to the disconnect between body

motion and visual motion.


3) Minimal distraction when doing tasks: When passengers are able to do a task in a car (e.g. sleep,

read or browse the internet) they want to be able to do so with minimal distraction. That means they

want fewer signals, such as warnings or rough driving. This need is motivation for a context-aware system

that can detect the context of the passenger and adjust accordingly.

4) Minimal Surprises: As a passenger, nobody likes surprises such as the driver pulling out of the lane

randomly. While the action may be necessary it is still unsettling. This is motivation to evaluate

increasing situational awareness of the passenger, potentially giving them foresight about the vehicle’s

imminent actions or surroundings. This may induce better trust in the system with fewer surprises.

5) Don’t be too machine-like or pretend to be too human-like: People don’t like their technology being

cold and impersonal like a machine, but they don’t believe the technology is there yet to create a

convincing human-like AI. They feel that current attempts at giving technology human characteristics are

poor and feel unnatural. This means that unless a truly realistic human personality can be achieved, we

should not attempt to make the car act like a person, such as with a voice or personality.

A multitude of potential users we could be designing for exists, divided into two main categories: those

looking for convenience and comfort, and those needing help trusting the technology. Examples of the

convenience and comfort user include, among others, the serial commuter who spends all of his or her

time in the car, the technology geek who must always have the latest and greatest toy, and the car

fanatic. However, while all potentially interesting aspects to build a persona around, we didn’t find them

truly compelling; after all, they are all the types to buy a fully autonomous car no matter how it did or

did not make them feel. Therefore, we chose to focus on the potential user who has a hard time trusting

a fully autonomous car, which was also one of the first things Shad informed us was a major potential

concern. We therefore chose the most, on occasion, irrationally nervous group of which we could think

(mothers) and came up with Nancy, our ideal user (Figure 34), using characteristics we gained from our

interviewees and other real-life sources. If we could make a vehicle that could make her comfortable,

then we presume anyone else would be comfortable in it too.


Figure 34: Nancy, our ideal user for design purposes


This appendix provides more details about the various prototypes completed each quarter. For each prototype, a component list is provided, as well as any code that was used to complete the prototype.

Component List:

7 Vibrating Motors

1 Arduino Mega

1 10k Potentiometer

5V Power Supply

1 Solder Protoboard

7 Transmitters

Assorted Wires

Images of the prototype can be seen below:

Figure 35: Images of Critical Functional Prototype


The microcontroller used for completing this prototype was an Arduino. The code for the control of this

Arduino can be seen below:

// These constants won't change. They're used to give names

// to the pins used:

const int analogInPin = A0; // Analog input pin that the potentiometer is attached to

int vib7 = 2;

int vib6 = 3;

int vib5 = 4;

int vib4 = 5;

int vib3 = 6;

int vib2 = 7;

int vib1 = 8;

int sensorValue = 0; // value read from the pot

double angle = 0;

void setup() {

// initialize serial communications at 9600 bps:

Serial.begin(9600);

pinMode(vib7, OUTPUT);

pinMode(vib6, OUTPUT);

pinMode(vib5, OUTPUT);

pinMode(vib4, OUTPUT);

pinMode(vib3, OUTPUT);

pinMode(vib2, OUTPUT);

pinMode(vib1, OUTPUT);

digitalWrite(vib7,LOW);

digitalWrite(vib6,LOW);

digitalWrite(vib5,LOW);

digitalWrite(vib4,LOW);

digitalWrite(vib3,LOW);

digitalWrite(vib2,LOW);

digitalWrite(vib1,LOW);

}

void loop() {

// read the analog in value:

sensorValue = analogRead(analogInPin);

angle = 0.2538*(sensorValue - 540); // map the potentiometer reading to an approximate angle in degrees


if (angle < -75)

{

digitalWrite(vib7,HIGH);

digitalWrite(vib6,LOW);

digitalWrite(vib5,LOW);

digitalWrite(vib4,LOW);

digitalWrite(vib3,LOW);

digitalWrite(vib2,LOW);

digitalWrite(vib1,LOW);

} else if (angle < -45)

{

digitalWrite(vib7,LOW);

digitalWrite(vib6,HIGH);

digitalWrite(vib5,LOW);

digitalWrite(vib4,LOW);

digitalWrite(vib3,LOW);

digitalWrite(vib2,LOW);

digitalWrite(vib1,LOW);

} else if (angle < -15)

{

digitalWrite(vib7,LOW);

digitalWrite(vib6,LOW);

digitalWrite(vib5,HIGH);

digitalWrite(vib4,LOW);

digitalWrite(vib3,LOW);

digitalWrite(vib2,LOW);

digitalWrite(vib1,LOW);

} else if (angle < 15)

{

digitalWrite(vib7,LOW);

digitalWrite(vib6,LOW);

digitalWrite(vib5,LOW);

digitalWrite(vib4,HIGH);

digitalWrite(vib3,LOW);

digitalWrite(vib2,LOW);

digitalWrite(vib1,LOW);

} else if (angle < 45)

{

digitalWrite(vib7,LOW);

digitalWrite(vib6,LOW);

digitalWrite(vib5,LOW);

digitalWrite(vib4,LOW);

digitalWrite(vib3,HIGH);

digitalWrite(vib2,LOW);

digitalWrite(vib1,LOW);

} else if (angle < 75)

{


digitalWrite(vib7,LOW);

digitalWrite(vib6,LOW);

digitalWrite(vib5,LOW);

digitalWrite(vib4,LOW);

digitalWrite(vib3,LOW);

digitalWrite(vib2,HIGH);

digitalWrite(vib1,LOW);

} else {

digitalWrite(vib7,LOW);

digitalWrite(vib6,LOW);

digitalWrite(vib5,LOW);

digitalWrite(vib4,LOW);

digitalWrite(vib3,LOW);

digitalWrite(vib2,LOW);

digitalWrite(vib1,HIGH);

}

// print the results to the serial monitor:

Serial.print("sensor = " );

Serial.println(sensorValue);

delay(25);

}

Component List:

1 5.1 Surround Sound System

1 Arduino Uno

1 LSM303DLM Compass Breakout Module

5V Power Supply

1 Car 12V DC to 120V AC Inverter

Assorted Wires

Code

The following is the Arduino sketch that reads compass data and sends it to the computer on the serial

port.

#include <Wire.h>

#include <LSM303.h>

LSM303 compass;

void setup() {


Serial.begin(9600);

Wire.begin();

compass.init();

compass.enableDefault();

//Calibration Values

compass.m_min = (LSM303::vector<int16_t>){-344, -573, -379};

compass.m_max = (LSM303::vector<int16_t>){+136, -33, -202};

}

void loop() {

compass.read();

float heading = compass.heading();

Serial.println((int)heading);

delay(100);

}

The C# code that takes the heading data and outputs the 3D audio is below.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using IrrKlang;
using System.IO.Ports;
using System.Threading;
using System.Timers;

namespace ConsoleApplication2
{
    class Program
    {
        static SerialPort _serialPort = new SerialPort();
        static bool _continue = true;
        static double angle = -1;
        static double rockingPeriod = 5; // s
        static bool useSet = false;
        static double setAngle = 0; // assumed declaration: angle used when useSet is true
        static System.Timers.Timer timer;
        static double timeInterval = 1;
        static double distance = 0.5;
        const float radius = 5;

        static void Main(string[] args)
        {
            string name;
            string message;
            StringComparer stringComparer = StringComparer.OrdinalIgnoreCase;
            Thread readThread = new Thread(Read);

            // Allow the user to set the appropriate properties.
            _serialPort.PortName = "COM11";
            _serialPort.BaudRate = 9600;

            // Set the read/write timeouts
            _serialPort.ReadTimeout = 500;
            _serialPort.WriteTimeout = 500;

            _serialPort.Open();
            readThread.Start();

            Console.WriteLine("Type QUIT to exit");

            while (_continue)
            {
                message = Console.ReadLine();

                if (stringComparer.Equals("quit", message)) { _continue = false; }
                else if (stringComparer.Equals("s", message)) { useSet = true; }
                else if (stringComparer.Equals("c", message)) { useSet = false; }
                else { }

                if (useSet == true)
                {
                    try { rockingPeriod = Convert.ToDouble(message); }
                    catch (FormatException) { Console.WriteLine("Format 2"); }
                }
            }

            readThread.Join();
            _serialPort.Close();
        }

        public static void Read()
        {
            int angleOffset = 167;

            // start the sound engine with default parameters
            ISoundEngine engine = new ISoundEngine();

            // Now play some sound stream as music in 3d space, looped.
            // We play it at position (0,0,0) in 3d space
            ISound music = engine.Play3D("ctd.mp3", 0, 0, 0, true);
            if (music != null)
                music.MinDistance = 5.0f;

            // Print some help text and start the display loop
            Console.Out.Write("\nPlaying streamed sound in 3D.");
            Console.Out.Write("\nPress ESCAPE to quit, any other key to play sound at random position.\n\n");

            double posOnCircle = 0;

            while (_continue)
            {
                try
                {
                    string message = _serialPort.ReadLine();
                    angle = Convert.ToInt32(message);
                    if (useSet) angle = setAngle;
                    angle = 180 - (angleOffset - angle);
                }
                catch (TimeoutException)
                {
                    Console.WriteLine("Timeout");
                    angle = 0;
                }
                catch (FormatException)
                {
                    angle = 0;
                }

                // Place the sound on a circle of the given radius around the listener.
                posOnCircle = (double)angle * Math.PI / 180.0;
                Vector3D pos3d = new Vector3D(radius * (float)Math.Cos(posOnCircle), 0,
                                              radius * (float)Math.Sin(posOnCircle));

                engine.SetListenerPosition(0, 0, 0, 0, 0, 1);
                if (music != null)
                    music.Position = pos3d;
            }
        }
    }
}

Prototype 1

Component List:


1 Ice Cube Tray

1 Arduino Uno

4 Assorted Scented Wax Cubes

12V Power Supply

6 Alligator Clips

4 10W High Power 10 ohm Resistors

4 RFP50N06 High Power Logic Level Switching MOSFET

1 Low Noise PC Fan

Assorted Wires

An image of the prototype:

Figure 36: Image of Dark Horse Wax Prototype

Prototype Code

// give it a name:

int scent1 = A0;

int scent2 = A1;

int current_scent = 0;

int num_scents = 2;

// the setup routine runs once when you press reset:

void setup() {


// initialize the digital pin as an output.

pinMode(scent1, OUTPUT);

pinMode(scent2, OUTPUT);

}

// the loop routine runs over and over again forever:

void loop() {

current_scent = current_scent + 1;

if (current_scent == num_scents+1)

{

current_scent = 1;

}

if (current_scent == 1)

{

digitalWrite(scent1,HIGH);

digitalWrite(scent2,LOW);

} else if (current_scent == 2)

{

digitalWrite(scent1,LOW);

digitalWrite(scent2,HIGH);

}

for (int i = 1; i < 60*20; i++)

{

delay(1000);

}

}

Prototype 2

Component List:

1 3 x 3 Muffin Tray

1 Arduino Uno

12V Power Supply

18 Alligator Clips

9 Solenoid Valves

9 RFP50N06 High Power Logic Level Switching MOSFET

9 Assorted Scented Oils

1 Air Mattress Pump

3ft, 5mm ID, 8mm OD PVC Tubing

10”x10”x1/4” Acrylic Sheet

Assorted Wires


An image of the prototype:

Figure 37: Image of Dark Horse Liquid Scent Release Prototype

Prototype Code

#include "Timers.h"

#define SCENT1 2

#define SCENT2 3

#define SCENT3 4

#define SCENT4 5

#define SCENT5 6

#define SCENT6 7

#define SCENT7 8

#define SCENT8 9

int scent_number = -1;

int time_open = -1;

int scent_to_close = 0;

// the setup routine runs once when you press reset:

void setup() {

// initialize the digital pin as an output.


pinMode(SCENT1,OUTPUT);

pinMode(SCENT2,OUTPUT);

pinMode(SCENT3,OUTPUT);

pinMode(SCENT4,OUTPUT);

pinMode(SCENT5,OUTPUT);

pinMode(SCENT6,OUTPUT);

pinMode(SCENT7,OUTPUT);

pinMode(SCENT8,OUTPUT);

Serial.begin(9600);

Serial.print("Enter Scent #: ");

}

// the loop routine runs over and over again forever:

void loop() {

if (scent_number == -1)

{

if (Serial.available())

{

scent_number = Serial.parseInt();

Serial.println(scent_number);

Serial.print("Open for how long [mins] (0 = no time): ");

}

} else if (time_open == -1)

{

if (Serial.available())

{

time_open = Serial.parseInt();

Serial.println(time_open);

}

}

if (time_open != -1 && time_open != -2 && time_open != -3)

{

digitalWrite(scent_number+1,HIGH);

if (time_open > 0)

{

TMRArd_InitTimer(0, (long)time_open*60*1000);

scent_to_close = scent_number;

time_open = -2;

} else {

Serial.print("Enter 1 to close scent: ");

scent_to_close = scent_number;

time_open = -3;

}


}

if (time_open == -3)

{

if (Serial.available())

{

if (Serial.parseInt() == 1)

{

Serial.println("Closing");

digitalWrite(scent_to_close+1,LOW);

Serial.print("Enter Scent #: ");

scent_number = -1;

scent_to_close = 0;

time_open = -1;

}

}

} else if (TMRArd_IsTimerExpired(0) == 1 && scent_to_close != 0)

{

digitalWrite(scent_to_close+1,LOW);

Serial.print("Closing Scent ");

Serial.println(scent_to_close);

scent_to_close = 0;

Serial.print("Enter Scent #: ");

scent_number = -1;

time_open = -1;

}

}

Wizard of Oz Prototype

Component List:

12V Power Supply

1 PC Fan

4 Assorted Scented Oils

1 Car 12V DC to 120V AC Inverter

PC to track map and results

GPS to track location

Progress Bar Code

/*
 Progress Bar: lights a row of six LEDs to ambiently show progress toward the destination.
 Zones are stepped forward or backward via simple commands on the serial port.
 (Adapted from the public-domain Arduino Blink example.)
*/

#define RED 7

#define YELLOW 6

#define GREEN1 5

#define GREEN2 4

#define BLUE1 3

#define BLUE2 2

int current_zone;

// the setup routine runs once when you press reset:

void setup() {

// initialize the digital pin as an output.

pinMode(RED,OUTPUT);

pinMode(YELLOW,OUTPUT);

pinMode(GREEN1,OUTPUT);

pinMode(GREEN2,OUTPUT);

pinMode(BLUE1,OUTPUT);

pinMode(BLUE2,OUTPUT);

digitalWrite(RED,HIGH);

digitalWrite(YELLOW,HIGH);

digitalWrite(GREEN1,HIGH);

digitalWrite(GREEN2,HIGH);

digitalWrite(BLUE1,HIGH);

digitalWrite(BLUE2,HIGH);

Serial.begin(9600);

Serial.print("Enter Command [0 - back, 1 - forward]: ");

current_zone = 0;

}

// the loop routine runs over and over again forever:

void loop() {

if(Serial.available())

{

int input = Serial.parseInt();

Serial.println(input);

Serial.print("Enter Command [0 - back, 1 - forward]: ");

if (input == 0)

{

current_zone = current_zone - 1;


if (current_zone == -1)

{

current_zone = 0;

}

} else {

current_zone = current_zone + 1;

if (current_zone == 5)

{

current_zone = 4;

}

}

if (current_zone == 0)

{

digitalWrite(RED,HIGH);

digitalWrite(YELLOW,HIGH);

digitalWrite(GREEN1,HIGH);

digitalWrite(GREEN2,HIGH);

digitalWrite(BLUE1,HIGH);

digitalWrite(BLUE2,HIGH);

} else if (current_zone == 1)

{

digitalWrite(RED,HIGH);

digitalWrite(YELLOW,HIGH);

digitalWrite(GREEN1,HIGH);

digitalWrite(GREEN2,HIGH);

digitalWrite(BLUE1,LOW);

digitalWrite(BLUE2,LOW);

} else if (current_zone == 2)

{

digitalWrite(RED,HIGH);

digitalWrite(YELLOW,HIGH);

digitalWrite(GREEN1,LOW);

digitalWrite(GREEN2,LOW);

digitalWrite(BLUE1,LOW);

digitalWrite(BLUE2,LOW);

} else if (current_zone == 3)

{

digitalWrite(RED,HIGH);

digitalWrite(YELLOW,LOW);

digitalWrite(GREEN1,LOW);

digitalWrite(GREEN2,LOW);

digitalWrite(BLUE1,LOW);

digitalWrite(BLUE2,LOW);

} else if (current_zone == 4)

{

digitalWrite(RED,LOW);


digitalWrite(YELLOW,LOW);

digitalWrite(GREEN1,LOW);

digitalWrite(GREEN2,LOW);

digitalWrite(BLUE1,LOW);

digitalWrite(BLUE2,LOW);

}

}

}

Component List:

0.5m Neopixel LED strip x 2

Accelerometer - MPU6050

Vibrating Motors x 3

5V – 5A Power Supply

Arduino Uno

Potentiometer x3

Assorted Connectors

Assorted Wires

Power MOSFETS x3

470uF Bulk Capacitor

Images of this prototype can be seen below:

Figure 38: Photos of Funktional Prototype

Once again an Arduino was used to control this system; the code for both prototypes can be seen below.

Potentiometer Controlled Prototype:

// NeoPixel Ring simple sketch (c) 2013 Shae Erisson

// released under the GPLv3 license to match the rest of the AdaFruit NeoPixel library

#include <Adafruit_NeoPixel.h>

#include <avr/power.h>


// Which pin on the Arduino is connected to the NeoPixels?

// On a Trinket or Gemma we suggest changing this to 1

#define SPEED_PIN 7

#define TURN_PIN 6

#define L_VIB 5

#define C_VIB 4

#define R_VIB 3

#define SPEED_START 8

#define SPEED_END 22

#define ACCEL_END 7

#define ACCEL_START 23

#define TURN_NUM_PIXELS 3

// How many NeoPixels are attached to the Arduino?

#define NUMPIXELS 30

#define LED_MAXVAL 150

#define VIB_TIME_ON 100

#define VIB_TIME_OFF 200

Adafruit_NeoPixel pixelsSpeed = Adafruit_NeoPixel(NUMPIXELS,

SPEED_PIN, NEO_GRB + NEO_KHZ800);

Adafruit_NeoPixel pixelsTurn = Adafruit_NeoPixel(NUMPIXELS, TURN_PIN,

NEO_GRB + NEO_KHZ800);

int flasher = 0;

int flasher_sub = 1;

int counter = 0;

int counter_turn = 0;

int counter_accel = 0;

#define TURN_BUF_SIZE 5

int turn_buffer[TURN_BUF_SIZE];

int turn_buffer_index = 0;

int turn_flasher = 0;

int turn_flasher_sub = 1;

void setup() {

// End of trinket special code

Serial.begin(9600);

pixelsSpeed.begin(); // This initializes the NeoPixel library.

pixelsTurn.begin();


for (int i=0;i<NUMPIXELS;i++)

{

pixelsSpeed.setPixelColor(i,pixelsSpeed.Color(0,0,LED_MAXVAL));

pixelsTurn.setPixelColor(i,pixelsTurn.Color(0,0,0));

}

pixelsSpeed.show();

pixelsTurn.show();

pinMode(5,OUTPUT);

pinMode(4,OUTPUT);

pinMode(3,OUTPUT);

for (int i = 0; i < TURN_BUF_SIZE; i++)

{

turn_buffer[i] = analogRead(A2);

}

}

void loop() {

///// LED CODE ////

int speed_val = analogRead(A1);

int turn_val = analogRead(A2);

int accel_val = analogRead(A3);

turn_buffer[turn_buffer_index] = turn_val;

turn_buffer_index++;

if (turn_buffer_index == TURN_BUF_SIZE)

{

turn_buffer_index = 0;

}

turn_val = 0;

for (int i = 0; i<TURN_BUF_SIZE;i++)

{

turn_val = turn_val + turn_buffer[i];

}

turn_val = turn_val / float(TURN_BUF_SIZE);

speed_val = 1023 - speed_val;

turn_val = 1023 - turn_val;

accel_val = 1023 - accel_val;

//////// TURNING CODE //////////


int centre_pixel = turn_val/1023.0*30.0;

int yellow = 0;

if (turn_val > 923)

{

digitalWrite(L_VIB,HIGH);

if (counter_turn > VIB_TIME_ON + VIB_TIME_OFF)

{

counter_turn = 0;

}

if (turn_flasher_sub == 1)

{

yellow = LED_MAXVAL - turn_flasher;

turn_flasher++;

if (turn_flasher == LED_MAXVAL-15)

{

turn_flasher_sub = 0;

}

} else {

yellow = LED_MAXVAL - turn_flasher;

turn_flasher--;

if (turn_flasher == 15)

{

turn_flasher_sub = 1;

}

}

} else if (turn_val < 100)

{

digitalWrite(R_VIB,HIGH);

if (counter_turn > VIB_TIME_ON + VIB_TIME_OFF)

{

counter_turn = 0;

}

if (turn_flasher_sub == 1)

{

yellow = LED_MAXVAL - turn_flasher;

turn_flasher++;

if (turn_flasher == LED_MAXVAL-15)

{

turn_flasher_sub = 0;

}

} else {

yellow = LED_MAXVAL - turn_flasher;

turn_flasher--;

if (turn_flasher == 15)

{

turn_flasher_sub = 1;


}

}

} else {

yellow = LED_MAXVAL;

}

counter_turn++;

if (counter_turn > VIB_TIME_ON)

{

digitalWrite(L_VIB,LOW);

digitalWrite(R_VIB,LOW);

}

if (centre_pixel < TURN_NUM_PIXELS/2)

{

centre_pixel = TURN_NUM_PIXELS/2;

} else if(centre_pixel > NUMPIXELS - TURN_NUM_PIXELS/2-1)

{

centre_pixel = NUMPIXELS - TURN_NUM_PIXELS/2 - 1;

}

for (int i=0;i<NUMPIXELS;i++)

{

pixelsTurn.setPixelColor(i,pixelsTurn.Color(0,0,0));

if ((i-1) == centre_pixel || (i+1) == centre_pixel || i ==

centre_pixel)

{

pixelsTurn.setPixelColor(i,pixelsTurn.Color(yellow, yellow, 0));

}

}

///// CURRENT SPEED CODE ////////

int red;

int blue;

int green;

if (speed_val < 256){

red = 255;

blue = speed_val;

green = 0;

} else if (speed_val < 512) {

red = 512 - speed_val;

blue = 255;

green = 0;

} else if (speed_val < 768)

{

red = 0;


blue = 255;

green = speed_val - 512;

} else {

red = 0;

blue = 1023-speed_val;

green = 255;

}

for (int i=SPEED_START; i<SPEED_END+1;i++)

{

pixelsSpeed.setPixelColor(i,pixelsSpeed.Color(red,green,blue));

}

//// ACCEL CODE //////

int a_red;

int a_blue;

int a_green;

if (accel_val < 170) {

digitalWrite(C_VIB,HIGH);

if (counter_accel > VIB_TIME_ON + VIB_TIME_OFF)

{

counter_accel = 0;

}

if (flasher_sub == 1)

{

a_red = 255 - flasher;

flasher++;

if (flasher == 240)

{

flasher_sub = 0;

}

} else {

a_red = 255 - flasher;

flasher--;

if (flasher == 15)

{

flasher_sub = 1;

}

}

a_green = 0;

a_blue = 0;

} else if (accel_val < 340){

a_red = 255;

a_blue = (accel_val-170)/170.0*256.0;

a_green = 0;

} else if (accel_val < 510) {


a_red = 255 - (accel_val-340)/170.0*256.0-1;

a_blue = 255;

a_green = 0;

} else if (accel_val < 680)

{

a_red = 0;

a_blue = 255;

a_green = (accel_val-510)/170.0*256.0;

} else if (accel_val < 850){

a_red = 0;

a_blue = 255-(accel_val-850)/170.0*256.0-1;

a_green = 255;

} else {

digitalWrite(C_VIB,HIGH);

if (counter_accel > VIB_TIME_ON + VIB_TIME_OFF)

{

counter_accel = 0;

}

if (flasher_sub == 1)

{

a_green = 255 - flasher;

flasher++;

if (flasher == 240)

{

flasher_sub = 0;

}

} else {

a_green = 255 - flasher;

flasher--;

if (flasher == 15)

{

flasher_sub = 1;

}

}

a_red = 0;

a_blue = 0;

}

counter_accel++;

if (counter_accel > VIB_TIME_ON)

{

digitalWrite(C_VIB,LOW);

}

for (int i=0; i<ACCEL_END+1;i++)

{

pixelsSpeed.setPixelColor(i,pixelsSpeed.Color(a_red,a_green,a_blue));


}

for (int i=ACCEL_START; i < NUMPIXELS; i++)

{

pixelsSpeed.setPixelColor(i,pixelsSpeed.Color(a_red,a_green,a_blue));

}

//// Show Pixels ////

pixelsSpeed.show();

pixelsTurn.show();

delay(5);

}

Accelerometer Prototype:

// NeoPixel Ring simple sketch (c) 2013 Shae Erisson

// released under the GPLv3 license to match the rest of the AdaFruit NeoPixel library

#include <Adafruit_NeoPixel.h>

#include <avr/power.h>

#include "I2Cdev.h"

#include "MPU6050.h"

// Arduino Wire library is required if I2Cdev I2CDEV_ARDUINO_WIRE implementation
// is used in I2Cdev.h

#if I2CDEV_IMPLEMENTATION == I2CDEV_ARDUINO_WIRE

#include "Wire.h"

#endif

// class default I2C address is 0x68

// specific I2C addresses may be passed as a parameter here

// AD0 low = 0x68 (default for InvenSense evaluation board)

// AD0 high = 0x69

MPU6050 accelgyro;

int16_t ax1, ay1, az1;

int16_t gx1, gy1, gz1;

#define OUTPUT_READABLE_ACCELGYRO


// Which pin on the Arduino is connected to the NeoPixels?

// On a Trinket or Gemma we suggest changing this to 1

#define ACCEL_PIN 7

#define GYRO_PIN 6

#define L_VIB 5

#define C_VIB 4

#define R_VIB 3

#define TURN_NUM_PIXELS 3

// How many NeoPixels are attached to the Arduino?

#define NUMPIXELS 30

#define LED_MAXVAL 100

#define VIB_TIME_ON 1000

#define VIB_TIME_OFF 2000

#define MAX_ACCEL_VAL 100

#define MAX_GYRO_VAL 100

#define BUF_SIZE 50

#define P_ACCEL 3000

#define N_ACCEL -4000

#define A_OFFSET -3600
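// Interpretation of the tuning constants above (an assumption based on how
// they are used below): BUF_SIZE raw MPU6050 samples are averaged on every
// loop pass, A_OFFSET removes a constant bias in the raw x-axis reading
// (presumably from how the sensor was mounted), and P_ACCEL / N_ACCEL are
// the positive and negative acceleration spans used both as vibration
// thresholds and to stretch the averaged reading over the 0-1023 colour range.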

Adafruit_NeoPixel pixelsAccel = Adafruit_NeoPixel(NUMPIXELS, ACCEL_PIN, NEO_GRB + NEO_KHZ800);
Adafruit_NeoPixel pixelsGyro = Adafruit_NeoPixel(NUMPIXELS, GYRO_PIN, NEO_GRB + NEO_KHZ800);

int accel_vib_state = 0;

long prev_accel_vib_time = 0;

int gravity = 15515;

void setup() {

#if I2CDEV_IMPLEMENTATION == I2CDEV_ARDUINO_WIRE

Wire.begin();

#elif I2CDEV_IMPLEMENTATION == I2CDEV_BUILTIN_FASTWIRE

Fastwire::setup(400, true);

#endif

// End of trinket special code

Serial.begin(9600);

Serial.println("Initializing I2C devices...");

accelgyro.initialize();


Serial.println("Testing device connections...");

Serial.println(accelgyro.testConnection() ? "MPU6050_1 connection successful" : "MPU6050_1 connection failed");

pixelsAccel.begin(); // This initializes the NeoPixel library.

pixelsGyro.begin();

pinMode(L_VIB,OUTPUT);

pinMode(R_VIB,OUTPUT);

pinMode(C_VIB,OUTPUT);

Serial.println("a_avg,led_val,a_red,a_green,a_blue");

}

void loop() {

// Get average accel and gyro data //

float a_avg = 0;

float g_avg = 0;

for (int i = 0; i < BUF_SIZE; i++)

{

accelgyro.getMotion6(&ax1, &ay1, &az1, &gx1, &gy1, &gz1);

delay(5);

a_avg = a_avg + ax1;

g_avg = g_avg + sqrt(gx1*gx1+gy1*gy1+gz1*gz1);

}

a_avg = a_avg/float(BUF_SIZE);

g_avg = g_avg/float(BUF_SIZE);

// LED ACCEL CODE //

int a_led_val = 0;

if (a_avg > 0)

{

a_led_val = 512.0*(a_avg-A_OFFSET)/float(P_ACCEL)+511;

} else if (a_avg < 0)

{

a_led_val = 512-(a_avg-A_OFFSET)/float(N_ACCEL)*512.0;

}

if (a_led_val > 1023)


{

a_led_val = 1023;

} else if (a_led_val < 0)

{

a_led_val = 0;

}

int a_red;

int a_blue;

int a_green;

if (a_led_val < 256){

a_red = 255;

a_blue = a_led_val;

a_green = 0;

} else if (a_led_val < 512) {

a_red = 512 - a_led_val;

a_blue = 255;

a_green = 0;

} else if (a_led_val < 768)

{

a_red = 0;

a_blue = 255;

a_green = a_led_val - 512;

} else {

a_red = 0;

a_blue = 1023-a_led_val;

a_green = 255;

}

for (int i=0; i<NUMPIXELS;i++)

{

pixelsAccel.setPixelColor(i,pixelsAccel.Color(a_red,a_green,a_blue));

}

pixelsAccel.show();

pixelsGyro.show();

// ACCEL VIB CODE //
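// When the offset-corrected average acceleration leaves the
// [N_ACCEL, P_ACCEL] band, the loop pauses (delay(3000)) and toggles the
// centre vibration motor using millis() against VIB_TIME_ON / VIB_TIME_OFF,
// giving a slow on/off pulse for as long as the threshold stays exceeded.
// Inside the band the motor is switched off and the pulse state is reset.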

if ((a_avg-A_OFFSET) > P_ACCEL || (a_avg-A_OFFSET) < N_ACCEL)

{

delay(3000);

if (accel_vib_state == 0 && millis() - prev_accel_vib_time > VIB_TIME_OFF)

{

digitalWrite(C_VIB,HIGH);

prev_accel_vib_time = millis();

accel_vib_state = 1;


} else if (accel_vib_state == 1 && millis() - prev_accel_vib_time > VIB_TIME_ON)

{

digitalWrite(C_VIB,LOW);

prev_accel_vib_time = millis();

accel_vib_state = 0;

}

} else {

digitalWrite(C_VIB,LOW);

prev_accel_vib_time = 0;

accel_vib_state = 0;

}

// SERIAL DEBUG OUTPUT (CSV: a_avg, led_val, a_red, a_green, a_blue) //

Serial.print(a_avg);Serial.print(", ");

Serial.print(a_led_val);Serial.print(", ");

Serial.print(a_red);Serial.print(", ");

Serial.print(a_green);Serial.print(", ");

Serial.println(a_blue);

}

Component List:

Kinect v2 Sensor

Cables to allow for Windows Utility

2m Neopixel LED strip

Arduino Uno

5V-5A Power Supply

Laptop running Windows and Kinect SDK

Mounting Materials – Cardboard, Tape

LED Diffuser – Paper Towel

USB Cable

Assorted Wires

Assorted Connectors

Some images of this prototype can be found earlier in this report. More photos can be seen below.


Figure 39: Photos of Functional Prototype including Kinect Data and User Testing

Once again, an Arduino was used to control the LED strips. The Arduino code, shown below, both reads the incoming serial commands and drives the LEDs.
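The PC drives the strip through a very small serial protocol: the character 'R' marks the start of a new frame, and 'f' followed by an LED index flags that LED as occupied in the current frame. The snippet below is a minimal, hypothetical host-side illustration of that protocol; it is not part of the original prototype code, and it simply reuses the Serial wrapper (SP->WriteData) that appears in the Kinect code later in this appendix. The full Arduino sketch then follows.

// Hypothetical host-side example (illustration only): start a new frame
// and flag LEDs 10-12 as occupied. "SP" is assumed to be the same Serial
// wrapper instance used in the Kinect code below.
char outgoing[8] = "R";
SP->WriteData(outgoing, 8);        // 'R' starts a new frame
for (int led = 10; led <= 12; led++)
{
    sprintf(outgoing, "f%d", led); // 'f<n>' flags LED n in this frame
    SP->WriteData(outgoing, 8);
}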

#include <Adafruit_NeoPixel.h>

#include <avr/power.h>

#define PIN 8

// How many NeoPixels are attached to the Arduino?

#define NUMPIXELS 120

#define MAX_LED 100

#define NUMFRAMES 5


Adafruit_NeoPixel pixels = Adafruit_NeoPixel(NUMPIXELS, PIN, NEO_GRB + NEO_KHZ800);

char pixel_array[NUMPIXELS][NUMFRAMES];

char pixel_sum[NUMPIXELS];

int pixel_index = -1;

// the setup routine runs once when you press reset:

void setup() {

// initialize serial communication at 115200 bits per second:

Serial.begin(115200);

// initialize the NeoPixel library:

pixels.begin();

for(int i=0;i<NUMPIXELS;i++){

// pixels.Color takes RGB values, from 0,0,0 up to 255,255,255

pixels.setPixelColor(i, pixels.Color(0,0,0)); // start with all pixels off

}

pixels.show(); // This sends the updated pixel color to the hardware.
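// The remainder of setup() blocks while it reads serial input: each 'R'
// starts a new frame and each 'f<n>' flags LED n in that frame. Once
// NUMFRAMES frames have been captured, pixel_sum is computed from them and
// loop() takes over.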

while (1)

{

char input;

if (Serial.available())

{

input = Serial.read();

if (input == 'R' || input == 82)

{

pixel_index++;

if (pixel_index >= NUMFRAMES)

{

break;

}

for (int i = 0; i < NUMPIXELS; i++)

{

pixel_array[i][pixel_index] = 0;

}

} else if (input == 'f' || input == 102)

{

while (!Serial.available()) {}

int led_num = Serial.parseInt();

if (pixel_index != -1 && led_num >= 0 && led_num < NUMPIXELS)

{


pixel_array[led_num][pixel_index] = 1;

}

}

}

}

for (int i = 0; i < NUMPIXELS; i++)

{

pixel_sum[i] = 0;

for (int j = 0; j < NUMFRAMES; j++)

{

pixel_sum[i] = pixel_sum[i] + pixel_array[i][j];

}

}

pixel_index = -1;

}

// the loop routine runs over and over again forever:

void loop() {

// read the next byte from the serial port:

char input;

if (Serial.available())

{

input = Serial.read();

if (input == 'R' || input == 82)

{

pixel_index++;

if (pixel_index >= NUMFRAMES)

{

pixel_index = 0;

}

for (int i = 0; i < NUMPIXELS; i++)

{

if (pixel_sum[i] > 3)

{

pixels.setPixelColor(i,pixels.Color(MAX_LED/2,MAX_LED/2,0));

} else {

pixels.setPixelColor(i,pixels.Color(0,0,0));

}

pixel_sum[i] = pixel_sum[i] - pixel_array[i][pixel_index];

pixel_array[i][pixel_index] = 0;

}


pixels.show();

} else if (input == 'f' || input == 102)

{

while(!Serial.available()) {}

int led_num = Serial.parseInt();

if (pixel_index != -1 && led_num >= 0 && led_num < NUMPIXELS)

{

pixel_array[led_num][pixel_index] = 1;

pixel_sum[led_num] = pixel_sum[led_num] + 1;

}

}

}

}

Completing this prototype also required modifying the Kinect SDK's DepthBasics sample so that the system could send information to the Arduino over a COM port. The code that accomplished this can be seen below.
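In outline, the modified ProcessDepth() routine counts, for each of the 512 depth-image columns, how many pixels in a 200-row horizontal band fall between roughly 1 m and 6 m; columns with enough hits are converted to an LED index and sent to the Arduino as 'f<index>' commands. The helper below restates that column-to-LED mapping in isolation; it is an illustrative sketch (the function name is hypothetical, not part of the original source), assuming the Kinect v2 depth width of 512 pixels and the 120-pixel strip defined in the Arduino sketch above.

// Illustrative helper (hypothetical): convert a depth-image column index
// into the LED index sent over serial. cDepthWidth is 512 for the Kinect
// v2 depth stream and 120 matches NUMPIXELS on the Arduino side, so the
// image is mirrored left-to-right across the strip.
int ColumnToLedIndex(int column, int depthWidth)
{
    return (int)(120.0 - 120.0 * column / (float)depthWidth);
}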

//------------------------------------------------------------------------------
// <copyright file="DepthBasics.cpp" company="Microsoft">
// Copyright (c) Microsoft Corporation. All rights reserved.
// </copyright>
//------------------------------------------------------------------------------

#include "stdafx.h"

#include <strsafe.h>

#include "resource.h"

#include "DepthBasics.h"

#include "SerialClass.h"

#include <string>

#include <stdio.h>

Serial* SP;

/// <summary>

/// Entry point for the application

/// </summary>

/// <param name="hInstance">handle to the application instance</param>

/// <param name="hPrevInstance">always 0</param>


/// <param name="lpCmdLine">command line arguments</param>

/// <param name="nCmdShow">whether to display minimized, maximized, or

normally</param>

/// <returns>status</returns>

int APIENTRY wWinMain(

_In_ HINSTANCE hInstance,

_In_opt_ HINSTANCE hPrevInstance,

_In_ LPWSTR lpCmdLine,

_In_ int nShowCmd

)

{

UNREFERENCED_PARAMETER(hPrevInstance);

UNREFERENCED_PARAMETER(lpCmdLine);

SP = new Serial("\\\\.\\COM15");

while (!SP->IsConnected())

{

SP = NULL;

SP = new Serial("\\\\.\\COM15");

OutputDebugStringA("We're Not Connected\n");

}

OutputDebugStringA("We're Connected");

CDepthBasics application;

application.Run(hInstance, nShowCmd);

}

/// <summary>

/// Constructor

/// </summary>

CDepthBasics::CDepthBasics() :

m_hWnd(NULL),

m_nStartTime(0),

m_nLastCounter(0),

m_nFramesSinceUpdate(0),

m_fFreq(0),

m_nNextStatusTime(0LL),

m_bSaveScreenshot(false),

m_pKinectSensor(NULL),

m_pDepthFrameReader(NULL),

m_pD2DFactory(NULL),

m_pDrawDepth(NULL),

m_pDepthRGBX(NULL)

{

LARGE_INTEGER qpf = {0};

if (QueryPerformanceFrequency(&qpf))

{


m_fFreq = double(qpf.QuadPart);

}

// create heap storage for depth pixel data in RGBX format

m_pDepthRGBX = new RGBQUAD[cDepthWidth * cDepthHeight];

}

/// <summary>

/// Destructor

/// </summary>

CDepthBasics::~CDepthBasics()

{

// clean up Direct2D renderer

if (m_pDrawDepth)

{

delete m_pDrawDepth;

m_pDrawDepth = NULL;

}

if (m_pDepthRGBX)

{

delete [] m_pDepthRGBX;

m_pDepthRGBX = NULL;

}

// clean up Direct2D

SafeRelease(m_pD2DFactory);

// done with depth frame reader

SafeRelease(m_pDepthFrameReader);

// close the Kinect Sensor

if (m_pKinectSensor)

{

m_pKinectSensor->Close();

}

SafeRelease(m_pKinectSensor);

}

/// <summary>

/// Creates the main window and begins processing

/// </summary>

/// <param name="hInstance">handle to the application instance</param>

/// <param name="nCmdShow">whether to display minimized, maximized, or

normally</param>


int CDepthBasics::Run(HINSTANCE hInstance, int nCmdShow)

{

MSG msg = {0};

WNDCLASS wc;

// Dialog custom window class

ZeroMemory(&wc, sizeof(wc));

wc.style = CS_HREDRAW | CS_VREDRAW;

wc.cbWndExtra = DLGWINDOWEXTRA;

wc.hCursor = LoadCursorW(NULL, IDC_ARROW);

wc.hIcon = LoadIconW(hInstance, MAKEINTRESOURCE(IDI_APP));

wc.lpfnWndProc = DefDlgProcW;

wc.lpszClassName = L"DepthBasicsAppDlgWndClass";

if (!RegisterClassW(&wc))

{

return 0;

}

// Create main application window

HWND hWndApp = CreateDialogParamW(

NULL,

MAKEINTRESOURCE(IDD_APP),

NULL,

(DLGPROC)CDepthBasics::MessageRouter,

reinterpret_cast<LPARAM>(this));

// Show window

ShowWindow(hWndApp, nCmdShow);

// Main message loop

while (WM_QUIT != msg.message)

{

Update();

while (PeekMessageW(&msg, NULL, 0, 0, PM_REMOVE))

{

// If a dialog message will be taken care of by the dialog proc

if (hWndApp && IsDialogMessageW(hWndApp, &msg))

{

continue;

}

TranslateMessage(&msg);

DispatchMessageW(&msg);

}


}

return static_cast<int>(msg.wParam);

}

/// <summary>

/// Main processing function

/// </summary>

void CDepthBasics::Update()

{

if (!m_pDepthFrameReader)

{

return;

}

IDepthFrame* pDepthFrame = NULL;

HRESULT hr = m_pDepthFrameReader->AcquireLatestFrame(&pDepthFrame);

if (SUCCEEDED(hr))

{

INT64 nTime = 0;

IFrameDescription* pFrameDescription = NULL;

int nWidth = 0;

int nHeight = 0;

USHORT nDepthMinReliableDistance = 0;

USHORT nDepthMaxDistance = 0;

UINT nBufferSize = 0;

UINT16 *pBuffer = NULL;

hr = pDepthFrame->get_RelativeTime(&nTime);

if (SUCCEEDED(hr))

{

hr = pDepthFrame->get_FrameDescription(&pFrameDescription);

}

if (SUCCEEDED(hr))

{

hr = pFrameDescription->get_Width(&nWidth);

}

if (SUCCEEDED(hr))

{

hr = pFrameDescription->get_Height(&nHeight);


}

if (SUCCEEDED(hr))

{

hr = pDepthFrame->get_DepthMinReliableDistance(&nDepthMinReliableDistance);

}

if (SUCCEEDED(hr))

{

// In order to see the full range of depth (including the less reliable far field depth)
// we are setting nDepthMaxDistance to the extreme potential depth threshold

nDepthMaxDistance = USHRT_MAX;

// Note: If you wish to filter by reliable depth distance, uncomment the following line.
//// hr = pDepthFrame->get_DepthMaxReliableDistance(&nDepthMaxDistance);

}

if (SUCCEEDED(hr))

{

hr = pDepthFrame->AccessUnderlyingBuffer(&nBufferSize,

&pBuffer);

}

if (SUCCEEDED(hr))

{

ProcessDepth(nTime, pBuffer, nWidth, nHeight,

nDepthMinReliableDistance, nDepthMaxDistance);

}

SafeRelease(pFrameDescription);

}

SafeRelease(pDepthFrame);

}

/// <summary>

/// Handles window messages, passes most to the class instance to handle

/// </summary>

/// <param name="hWnd">window message is for</param>

/// <param name="uMsg">message</param>

/// <param name="wParam">message data</param>


/// <param name="lParam">additional message data</param>

/// <returns>result of message processing</returns>

LRESULT CALLBACK CDepthBasics::MessageRouter(HWND hWnd, UINT uMsg,

WPARAM wParam, LPARAM lParam)

{

CDepthBasics* pThis = NULL;

if (WM_INITDIALOG == uMsg)

{

pThis = reinterpret_cast<CDepthBasics*>(lParam);

SetWindowLongPtr(hWnd, GWLP_USERDATA,

reinterpret_cast<LONG_PTR>(pThis));

}

else

{

pThis =

reinterpret_cast<CDepthBasics*>(::GetWindowLongPtr(hWnd,

GWLP_USERDATA));

}

if (pThis)

{

return pThis->DlgProc(hWnd, uMsg, wParam, lParam);

}

return 0;

}

/// <summary>

/// Handle windows messages for the class instance

/// </summary>

/// <param name="hWnd">window message is for</param>

/// <param name="uMsg">message</param>

/// <param name="wParam">message data</param>

/// <param name="lParam">additional message data</param>

/// <returns>result of message processing</returns>

LRESULT CALLBACK CDepthBasics::DlgProc(HWND hWnd, UINT message, WPARAM

wParam, LPARAM lParam)

{

UNREFERENCED_PARAMETER(wParam);

UNREFERENCED_PARAMETER(lParam);

switch (message)

{

case WM_INITDIALOG:

{

// Bind application window handle


m_hWnd = hWnd;

// Init Direct2D

D2D1CreateFactory(D2D1_FACTORY_TYPE_SINGLE_THREADED,

&m_pD2DFactory);

// Create and initialize a new Direct2D image renderer (take a look at ImageRenderer.h)
// We'll use this to draw the data we receive from the Kinect to the screen

m_pDrawDepth = new ImageRenderer();

HRESULT hr = m_pDrawDepth->Initialize(GetDlgItem(m_hWnd,

IDC_VIDEOVIEW), m_pD2DFactory, cDepthWidth, cDepthHeight, cDepthWidth

* sizeof(RGBQUAD));

if (FAILED(hr))

{

SetStatusMessage(L"Failed to initialize the Direct2D

draw device.", 10000, true);

}

// Get and initialize the default Kinect sensor

InitializeDefaultSensor();

}

break;

// If the titlebar X is clicked, destroy app

case WM_CLOSE:

DestroyWindow(hWnd);

break;

case WM_DESTROY:

// Quit the main message pump

PostQuitMessage(0);

break;

// Handle button press

case WM_COMMAND:

// If it was for the screenshot control and a button clicked event, save a screenshot next frame

if (IDC_BUTTON_SCREENSHOT == LOWORD(wParam) && BN_CLICKED

== HIWORD(wParam))

{

m_bSaveScreenshot = true;

}

break;

}


return FALSE;

}

/// <summary>

/// Initializes the default Kinect sensor

/// </summary>

/// <returns>indicates success or failure</returns>

HRESULT CDepthBasics::InitializeDefaultSensor()

{

HRESULT hr;

hr = GetDefaultKinectSensor(&m_pKinectSensor);

if (FAILED(hr))

{

return hr;

}

if (m_pKinectSensor)

{

// Initialize the Kinect and get the depth reader

IDepthFrameSource* pDepthFrameSource = NULL;

hr = m_pKinectSensor->Open();

if (SUCCEEDED(hr))

{

hr = m_pKinectSensor->get_DepthFrameSource(&pDepthFrameSource);

}

if (SUCCEEDED(hr))

{

hr = pDepthFrameSource->OpenReader(&m_pDepthFrameReader);

}

SafeRelease(pDepthFrameSource);

}

if (!m_pKinectSensor || FAILED(hr))

{

SetStatusMessage(L"No ready Kinect found!", 10000, true);

return E_FAIL;

}

return hr;

}


/// <summary>

/// Handle new depth data

/// <param name="nTime">timestamp of frame</param>

/// <param name="pBuffer">pointer to frame data</param>

/// <param name="nWidth">width (in pixels) of input image data</param>

/// <param name="nHeight">height (in pixels) of input image

data</param>

/// <param name="nMinDepth">minimum reliable depth</param>

/// <param name="nMaxDepth">maximum reliable depth</param>

/// </summary>

void CDepthBasics::ProcessDepth(INT64 nTime, const UINT16* pBuffer,

int nWidth, int nHeight, USHORT nMinDepth, USHORT nMaxDepth)

{
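// After the FPS readout is updated, this function makes two passes over the
// depth buffer: the first pass counts, per image column, how many pixels in
// the row band [start_row, end_row] fall within the distance thresholds and
// then streams the occupied columns to the Arduino; the second pass
// re-colours those columns red in the on-screen depth preview.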

if (m_hWnd)

{

if (!m_nStartTime)

{

m_nStartTime = nTime;

}

double fps = 0.0;

LARGE_INTEGER qpcNow = {0};

if (m_fFreq)

{

if (QueryPerformanceCounter(&qpcNow))

{

if (m_nLastCounter)

{

m_nFramesSinceUpdate++;

fps = m_fFreq * m_nFramesSinceUpdate /

double(qpcNow.QuadPart - m_nLastCounter);

}

}

}

WCHAR szStatusMessage[64];

StringCchPrintf(szStatusMessage, _countof(szStatusMessage), L" FPS = %0.2f Time = %I64d", fps, (nTime - m_nStartTime));

if (SetStatusMessage(szStatusMessage, 1000, false))

{

m_nLastCounter = qpcNow.QuadPart;

m_nFramesSinceUpdate = 0;

}

}


// Make sure we've received valid data

if (m_pDepthRGBX && pBuffer && (nWidth == cDepthWidth) && (nHeight

== cDepthHeight))

{

RGBQUAD* pRGBX = m_pDepthRGBX;

// end pixel is start + width*height - 1

const UINT16* pBufferEnd = pBuffer + (nWidth * nHeight);

UINT16 range = 100;

//UINT16 start_row = cDepthHeight/2-range+1;

//UINT16 end_row = cDepthHeight/2+range;

UINT16 start_row = 180-range+1;

UINT16 end_row = 180+range;

const UINT16* pBufferStartIndex = pBuffer + (nWidth *

(start_row-1));

const UINT16* pBufferEndIndex = pBuffer + (nWidth *

(end_row))-1;

const UINT16* pBufferStart = pBuffer;

char debugout[256] = "";

UINT col[cDepthWidth] = { 0 };

int index_counter = 0;

int minThres = 1000;

int maxThres = 6000;
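// minThres and maxThres are in millimetres (the unit of the Kinect depth
// buffer), so a column is treated as occupied when enough of its sampled
// rows report an obstacle between roughly 1 m and 6 m away.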

while (pBuffer < pBufferEnd)

{

USHORT depth = *pBuffer;

//sprintf(debugout, "Depth: %d\n",depth);

//OutputDebugStringA(debugout);

// To convert to a byte, we're discarding the most-significant
// rather than least-significant bits.
// We're preserving detail, although the intensity will "wrap."
// Values outside the reliable depth range are mapped to 0 (black).
// Note: Using conditionals in this loop could degrade performance.


// Consider using a lookup table instead when writing production code.

BYTE intensity = static_cast<BYTE>((depth >= nMinDepth) &&

(depth <= nMaxDepth) ? (depth % 256) : 0);

if (pBuffer > pBufferStartIndex && pBuffer <=

pBufferEndIndex)

{

pRGBX->rgbRed = intensity;

pRGBX->rgbGreen = 0;

pRGBX->rgbBlue = 0;

}

else {

pRGBX->rgbRed = intensity;

pRGBX->rgbGreen = intensity;

pRGBX->rgbBlue = intensity;

}

if (pBuffer > pBufferStartIndex && pBuffer <=

pBufferEndIndex)

{

/*col[index_counter] += depth;

index_counter++;*/

/*if (depth < col[index_counter])

{

col[index_counter] = depth;

}

index_counter++;*/

if (depth > minThres && depth < maxThres)

{

col[index_counter]++;

}

index_counter++;

if (!(index_counter < cDepthWidth))

{

index_counter = 0;

}

}


++pRGBX;

++pBuffer;

}

char outgoing[8] = "R";

int dataLengthOut = 8;

//char incoming[256];

//int dataLengthIn = 256;

//int readResult = 0;

int prev_pin = 0;

SP->WriteData(outgoing, dataLengthOut);

OutputDebugStringA("\nR");

sprintf(outgoing,"");

for (int i = 0; i < cDepthWidth; i++)

{

//col[i] = col[i] / (range * 2);

if (/*col[i] > minThres && col[i] < maxThres*/ col[i]

>= 10)

{

col[i] = 1;

int angle = (120.0-120.0*i / float(cDepthWidth));

if (angle == prev_pin)

{

}

else {

sprintf(outgoing, "f%d", angle);

OutputDebugStringA(outgoing);

SP->WriteData(outgoing, dataLengthOut);

sprintf(outgoing, "");

prev_pin = angle;

}

}

else {

col[i] = 0;

}


//SP->ReadData(incoming, dataLengthIn);

//std::string test(incoming);

//sprintf(incoming,"%s\n", incoming);

//OutputDebugStringA(incoming);

}

pRGBX = m_pDepthRGBX;

pBuffer = pBufferStart;

index_counter = 0;

while (pBuffer < pBufferEnd)

{

USHORT depth = *pBuffer;

//sprintf(debugout, "Depth: %d\n",depth);

//OutputDebugStringA(debugout);

// To convert to a byte, we're discarding the most-significant
// rather than least-significant bits.
// We're preserving detail, although the intensity will "wrap."
// Values outside the reliable depth range are mapped to 0 (black).
// Note: Using conditionals in this loop could degrade performance.
// Consider using a lookup table instead when writing production code.

BYTE intensity = static_cast<BYTE>((depth >=

nMinDepth) && (depth <= nMaxDepth) ? (depth % 256) : 0);

if (pBuffer > pBufferStartIndex && pBuffer <=

pBufferEndIndex)

{

pRGBX->rgbRed = 255 * col[index_counter];

pRGBX->rgbGreen = 0;

pRGBX->rgbBlue = 0;

index_counter++;

if (!(index_counter < cDepthWidth))

{

index_counter = 0;

}

}

else {


pRGBX->rgbRed = intensity;

pRGBX->rgbGreen = intensity;

pRGBX->rgbBlue = intensity;

}

++pRGBX;

++pBuffer;

}

// Draw the data with Direct2D

m_pDrawDepth->Draw(reinterpret_cast<BYTE*>(m_pDepthRGBX),

cDepthWidth * cDepthHeight * sizeof(RGBQUAD));

if (m_bSaveScreenshot)

{

WCHAR szScreenshotPath[MAX_PATH];

// Retrieve the path to My Photos

GetScreenshotFileName(szScreenshotPath,

_countof(szScreenshotPath));

// Write out the bitmap to disk

HRESULT hr =

SaveBitmapToFile(reinterpret_cast<BYTE*>(m_pDepthRGBX), nWidth,

nHeight, sizeof(RGBQUAD) * 8, szScreenshotPath);

WCHAR szStatusMessage[64 + MAX_PATH];

if (SUCCEEDED(hr))

{

// Set the status bar to show where the screenshot was saved

StringCchPrintf(szStatusMessage,

_countof(szStatusMessage), L"Screenshot saved to %s",

szScreenshotPath);

}

else

{

StringCchPrintf(szStatusMessage,

_countof(szStatusMessage), L"Failed to write screenshot to %s",

szScreenshotPath);

}

SetStatusMessage(szStatusMessage, 5000, true);

// toggle off so we don't save a screenshot again next frame


m_bSaveScreenshot = false;

}

}

}

/// <summary>

/// Set the status bar message

/// </summary>

/// <param name="szMessage">message to display</param>

/// <param name="showTimeMsec">time in milliseconds to ignore future

status messages</param>

/// <param name="bForce">force status update</param>

bool CDepthBasics::SetStatusMessage(_In_z_ WCHAR* szMessage, DWORD

nShowTimeMsec, bool bForce)

{

INT64 now = GetTickCount64();

if (m_hWnd && (bForce || (m_nNextStatusTime <= now)))

{

SetDlgItemText(m_hWnd, IDC_STATUS, szMessage);

m_nNextStatusTime = now + nShowTimeMsec;

return true;

}

return false;

}

/// <summary>

/// Get the name of the file where screenshot will be stored.

/// </summary>

/// <param name="lpszFilePath">string buffer that will receive

screenshot file name.</param>

/// <param name="nFilePathSize">number of characters in lpszFilePath

string buffer.</param>

/// <returns>

/// S_OK on success, otherwise failure code.

/// </returns>

HRESULT

CDepthBasics::GetScreenshotFileName(_Out_writes_z_(nFilePathSize)

LPWSTR lpszFilePath, UINT nFilePathSize)

{

WCHAR* pszKnownPath = NULL;

HRESULT hr = SHGetKnownFolderPath(FOLDERID_Pictures, 0, NULL,

&pszKnownPath);

if (SUCCEEDED(hr))


{

// Get the time

WCHAR szTimeString[MAX_PATH];

GetTimeFormatEx(NULL, 0, NULL, L"hh'-'mm'-'ss", szTimeString,

_countof(szTimeString));

// File name will be KinectScreenshotDepth-HH-MM-SS.bmp

StringCchPrintfW(lpszFilePath, nFilePathSize,

L"%s\\KinectScreenshot-Depth-%s.bmp", pszKnownPath, szTimeString);

}

if (pszKnownPath)

{

CoTaskMemFree(pszKnownPath);

}

return hr;

}

/// <summary>

/// Save passed in image data to disk as a bitmap

/// </summary>

/// <param name="pBitmapBits">image data to save</param>

/// <param name="lWidth">width (in pixels) of input image data</param>

/// <param name="lHeight">height (in pixels) of input image

data</param>

/// <param name="wBitsPerPixel">bits per pixel of image data</param>

/// <param name="lpszFilePath">full file path to output bitmap

to</param>

/// <returns>indicates success or failure</returns>

HRESULT CDepthBasics::SaveBitmapToFile(BYTE* pBitmapBits, LONG lWidth,

LONG lHeight, WORD wBitsPerPixel, LPCWSTR lpszFilePath)

{

DWORD dwByteCount = lWidth * lHeight * (wBitsPerPixel / 8);

BITMAPINFOHEADER bmpInfoHeader = {0};

bmpInfoHeader.biSize = sizeof(BITMAPINFOHEADER); // Size of the header
bmpInfoHeader.biBitCount = wBitsPerPixel; // Bit count
bmpInfoHeader.biCompression = BI_RGB; // Standard RGB, no compression
bmpInfoHeader.biWidth = lWidth; // Width in pixels
bmpInfoHeader.biHeight = -lHeight; // Height in pixels, negative indicates it's stored right-side-up


bmpInfoHeader.biPlanes = 1; // Default
bmpInfoHeader.biSizeImage = dwByteCount; // Image size in bytes

BITMAPFILEHEADER bfh = {0};

bfh.bfType = 0x4D42;

// 'M''B', indicates bitmap

bfh.bfOffBits = bmpInfoHeader.biSize + sizeof(BITMAPFILEHEADER);

// Offset to the start of pixel data

bfh.bfSize = bfh.bfOffBits + bmpInfoHeader.biSizeImage;

// Size of image + headers

// Create the file on disk to write to

HANDLE hFile = CreateFileW(lpszFilePath, GENERIC_WRITE, 0, NULL,

CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, NULL);

// Return if error opening file

if (NULL == hFile)

{

return E_ACCESSDENIED;

}

DWORD dwBytesWritten = 0;

// Write the bitmap file header

if (!WriteFile(hFile, &bfh, sizeof(bfh), &dwBytesWritten, NULL))

{

CloseHandle(hFile);

return E_FAIL;

}

// Write the bitmap info header

if (!WriteFile(hFile, &bmpInfoHeader, sizeof(bmpInfoHeader),

&dwBytesWritten, NULL))

{

CloseHandle(hFile);

return E_FAIL;

}

// Write the RGB Data

if (!WriteFile(hFile, pBitmapBits, bmpInfoHeader.biSizeImage,

&dwBytesWritten, NULL))

{

CloseHandle(hFile);

return E_FAIL;


}

// Close the file

CloseHandle(hFile);

return S_OK;

}


During Small Group Meetings (SGMs), handouts were created to illustrate weekly developments and key findings. The handouts for this quarter can be seen below.

[Weekly SGM handouts, included as full-page images in the original report.]

On March 12th, 2015, presentations for ME310 Winter Quarter took place. The presentation slides for

Team Renault can be seen below.

[Winter Quarter presentation slides, included as full-page images in the original report.]

Team Renault’s Winter Brochure can be seen below.

[Winter Quarter brochure, included as a full-page image in the original report.]