
WHITE PAPER: CHALLENGES TO HIGHLY AUTOMATED VEHICLE (HAV) PRODUCTION

SEPTEMBER, 2017


TABLE OF CONTENTS

Introduction

Challenge 1: Scenario development

Challenge 2: Enabling technologies - sensing, perception, computation, decision and control for L4+ systems
    Sensors for L4+ vehicles
    AI applications for perception and decision
    Computation

Challenge 3: Safety

Challenge 4: Testing & validation
    Testing and validation of AI

Summary and conclusion

INTRODUCTION

A few automotive OEMs are announcing their plans to produce Highly Automated Vehicles (HAVs)

corresponding to SAE level 4+ (L4+) by early next decade. However, developers of automated systems

today are facing many challenges in realizing such promises in the given timeframe. Some are purely

technical challenges, others are legal and/or ethical ones, and then there is the issue of lack of guidelines

regarding both areas.

In this whitepaper, Vision Systems Intelligence (VSI) will mainly focus on the technical challenges in

realizing L4¹ production vehicles and the inherent difficulties with highly automated driving.

At issue today is how to design, validate, and certify these systems: when is a system "good enough"? Which scenarios should be tested before a system can be released? How many kilometers need to be driven to prove that a system is robust and fail-safe? And should all scenarios be re-tested after a hardware or software update?

¹ There are distinctive attributes that distinguish an L3 (Conditional Automation), L4 (High Automation), and L5 (Full Automation) system. Both L3 and L4 systems must automatically execute vehicle controls (steering, acceleration/deceleration) in most cases, but when the vehicle has to perform fallback from the dynamic driving task (DDT), an L3 system hands control back to the human driver, while an L4 system must be able to perform the fallback by itself within its given operational design domain (ODD). An L5 system has no such conditional requirements and must be able to handle all driving modes regardless of ODD. L4 system designs therefore require human-out-of-the-loop fallback performance within the ODD, so that the system always achieves a safe state in those designated areas.

Even though there is no consensus yet on the answers to these questions, it is clear that relying on traditional test methods (i.e., road testing and test-track testing) will not fulfil the testing needs of HAVs.

Any company that plans to test or deploy highly automated vehicles on public roads in the United States is required to submit a Safety Assessment Letter to NHTSA's Office of the Chief Counsel according to the Federal Automated Vehicles Policy, published in September 2016. NHTSA's guidelines for automated vehicle development call for many items to be detailed in the letter. Although the policy will be updated later this year, it is important to note that the guidelines mainly address the development phase rather than the production of HAV systems.

Figure 1. Coverage of Safety Assessment Letter to NHTSA (Source: NHTSA)

This means that we still do not have detailed performance standards or guidelines for the building blocks of autonomy, especially at higher levels of automation. The voluntary participation of industry players will collectively provide real-world insights into L4+ production systems, as the policy establishes a comprehensive framework within which they can outline the challenges.

Systems for autonomous driving still require the combined experience gained from prototypes and testing, and from hardware and software solutions. This means that a stable set of requirements for the necessary algorithms cannot be created yet, and system architectures are still subject to research. Design changes to the system should be expected in the course of development, and the software development process should lean on agile methodologies.

In general, the missing elements in HAV system development are use cases, scenarios, and the validation of autonomous features against scenarios. It is extremely important to create a library of scenarios and models to validate features before testing takes place on the road. Scenarios and scenario models contribute not only to performance validation but also to requirement validation and articulation. Furthermore, several non-functional and performance requirements are derived from use cases, which brings significant value to autonomous vehicle development.

In partnership with AImotive (Budapest, Hungary), VSI has deeply investigated the challenges of, and solutions to, developing and producing highly automated vehicles. AImotive provides an artificial intelligence-based full-stack software suite (aiDrive) for higher-level automated system development, as well as internal development and validation tools (aiKit), providing an end-to-end tool chain for scenario-based HAV system development and validation. It has also developed Neural Network (NN) acceleration hardware IP (aiWare) for Deep Neural Network (DNN) processing. By developing AI software, development tools and hardware IP, AImotive verifies and validates higher-level automated systems against HAV testing scenarios.

CHALLENGE 1: SCENARIO DEVELOPMENT

Defining driving scenarios is a critical first step for OEMs, Tier 1s and other technology companies that want their production HAVs on the road. Where (roadway types, roadway speeds, etc.) and when (under what conditions, such as day/night or normal versus work zone) an HAV is designed to operate must be described in detail in the letter to NHTSA.

Scenario-based system development or testing is nothing new. Many ADAS systems are the result of

numerous test cases and scenarios to ensure the intended performance of safety systems.

For example, AImotive has detailed step-by-step approaches and a set of tools for setting up Highly

Automated Driving (HAD) scenarios:

1. Verification and validation plan: AImotive has developed new procedures to test and improve machine learning-based code and software. Focusing on AI with deep learning, all Neural Network (NN) related testing methods are included in a verification and validation procedure plan.

2. Definition of driving functionalities, environmental limitations and objects: AImotive separates test

preparation into 3 main parts:

• Driving functions for use-cases such as speed, longitudinal and lateral control


• Environmental conditions for all driving functions

• Objects to be considered in all environmental conditions

3. Creation of condition matrix: All of the test preparation data above can be stored in a condition

matrix which contains all the base data to create test cases.

4. Test case generation: Test cases are first generated for best-case scenarios; AImotive considers conditions such as daylight, clear weather and a dry asphalt road surface to be best-case.

5. Test scenario generation – generic/common test scenario table: After test cases are created for driving functions with the necessary objects, these test cases can be permuted and multiplied with environmental conditions. The permuted test cases form the complete test set to run in simulation (a minimal sketch of this permutation step follows this list).

6. Create official test scenarios: Based on the generic test scenario table, it is necessary to create

and store official test scenarios. These should be included in an application lifecycle management

(ALM) tool.
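The condition matrix and permutation steps above (steps 3 to 5) lend themselves to simple tooling. The following is a minimal sketch of how such a permutation could be generated; the field names and condition values are illustrative assumptions, not AImotive's actual matrix or implementation.

```python
from itertools import product

# Illustrative condition matrix: driving functions, objects, and
# environmental conditions (names are hypothetical, not AImotive's).
condition_matrix = {
    "driving_function": ["lane_keeping", "lane_change", "adaptive_cruise"],
    "object": ["target_vehicle", "pedestrian", "bicycle"],
    "lighting": ["daylight", "night"],
    "weather": ["clear", "rain", "fog"],
    "road_surface": ["dry_asphalt", "wet_asphalt"],
}

def generate_test_cases(matrix):
    """Permute every driving function and object with every
    environmental condition to form the complete simulation test set."""
    keys = list(matrix.keys())
    for values in product(*(matrix[k] for k in keys)):
        yield dict(zip(keys, values))

# Best-case scenarios are the subset with ideal conditions.
best_case = [
    tc for tc in generate_test_cases(condition_matrix)
    if tc["lighting"] == "daylight"
    and tc["weather"] == "clear"
    and tc["road_surface"] == "dry_asphalt"
]
print(len(best_case), "best-case test cases out of",
      sum(1 for _ in generate_test_cases(condition_matrix)))
```

The permuted test cases would then be promoted into official test scenarios and stored in the ALM tool, as described in step 6.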

Figure 2. Main Steps of Test Scenario Creation (Source: AImotive)


Level 4 autonomous vehicles are not required to handle every scenario and/or every condition. However, they are required to detect the scenarios they cannot handle and to transition to a safe state. The system needs to detect these scenarios reliably, and with enough lead time to reach that safe state safely.

VSI recently developed several L4 scenarios based on unexpected or exceptional events that HAV developers may face during development, which, according to AImotive, are harder to define than "normal" driving situations. Scenarios involving exceptional events cannot be produced as systematically as "normal" driving situations; sometimes creative testing methods and brainstorming techniques are needed to define a driving situation.

VSI believes that higher-level automated driving scenarios should focus on these unexpected/exceptional

events rather than normal driving situations, so the HAV developers can find more “edge” cases and the

required enabling technologies to overcome such situations.

Therefore, VSI has structured an L4 scenario generation protocol, shown below, based on unexpected/exceptional events, defined as anomalies in scenario component(s) and/or condition(s).

Figure 3. VSI L4 Scenario Structure (source: VSI)

All of the ODD elements defined in the NHTSA guidelines are included in either the Scenario Components

or Conditions.
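To make this structure concrete, the following is a minimal sketch of how such a scenario record might be represented in code. The class and field names are hypothetical and not part of VSI's actual protocol; the example values only illustrate the anomaly-driven, state-by-state narrative described above.

```python
from dataclasses import dataclass

@dataclass
class L4Scenario:
    """Hypothetical representation of a VSI-style L4 scenario: components and
    conditions with an anomaly injected into one of them, described across
    Pre-State, Present-State and Future-State/Resolution narratives."""
    scenario_components: dict  # e.g. roadway type, ego/target vehicles, ODD actors
    conditions: dict           # e.g. lighting, weather, work zone
    anomaly: str               # the unexpected/exceptional event
    pre_state: str             # setup before the event (PreScan-style experiment components)
    present_state: str         # the event itself, in SAE J3016 terms (DDT, fallback, failure)
    future_state: str          # expected resolution, e.g. reaching a minimal risk condition

example = L4Scenario(
    scenario_components={"roadway": "divided highway", "actors": ["target_vehicle"]},
    conditions={"lighting": "night", "weather": "heavy_fog"},
    anomaly="target vehicle brakes hard inside dense fog",
    pre_state="ego follows the target at highway speed within its ODD",
    present_state="perception degraded; system must detect it cannot handle the scene",
    future_state="DDT fallback: ego decelerates and reaches a safe stop on the shoulder",
)
```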


In the state-by-state scenario narratives, VSI utilized existing literature and tools. For the Pre-State, VSI referred to Tass International's "PreScan" simulation software, which has a long list of "Experiment Components." For the Present-State and Future-State/Resolution, VSI's structure was based on SAE J3016 "Surface Vehicle Recommended Practice," revised from the original in September 2016. It provides a taxonomy describing the full range of levels of driving automation and includes functional definitions for advanced levels of driving automation and related terms and definitions such as DDT (Dynamic Driving Task), DDT Fallback, System Failure, Use Case Sequence, etc.

The automotive industry already has a variety of driving scenarios for testing system performance via diverse development and testing tools; however, these are mostly for ADAS (low-level safety) systems rather than highly automated (L4+) systems. Therefore, VSI believes that the VSI L4 scenario structure and protocols will help identify the most challenging driving scenarios, from which developers can determine the technologies required to overcome such situations.

CHALLENGE 2: ENABLING TECHNOLOGIES - SENSING, PERCEPTION,

COMPUTATION, DECISION AND CONTROL FOR L4+ SYSTEMS

Based on this structure, VSI created a small set of L4 scenarios that help identify the technical challenges in L4 system functional domains and the enabling technologies needed to overcome them.

Such technical challenges include:

• Challenges in sensing/detection: There can be situations where sensors and sensing software have difficulty detecting objects due to challenging weather or environments such as thick fog, snowstorms, low light, etc.

• Challenges in perception (understanding): L4 vehicles will have to perceive not only stationary objects but also dynamic objects (moving target vehicles, hand gestures, eye contact, negotiation between ego and target vehicle drivers, etc.), and there will often be times when the vehicle has to handle aggressive behavior by ODD (Operational Design Domain) actors (target vehicles, pedestrians, bicycles).

• Challenges in making decisions: Even after understanding the scene, there will be situations

where the ego vehicle has to decide its driving path when there are multiple threats and many

moving objects in the scene. Sometimes the ego vehicle needs to be aggressive enough to go

through a very complex traffic flow, for example.

• Challenges from Automated Driving System (ADS) or vehicle failure: Unlike the scenarios above, where most challenges come from external factors, these scenarios come from internal vehicle factors, for example when the ADS fails or the vehicle itself breaks down. At Levels 4 and 5, in case of Dynamic Driving Task (DDT) performance-relevant system failure(s) or upon ODD exit, vehicles are required to perform the DDT Fallback to achieve a minimal risk condition (fail safe); an L5 vehicle will, in most cases, even have a fail-operational system in place (a minimal state-machine sketch of this fallback behavior follows this list).
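As a rough illustration of the fallback behavior described in the last bullet, here is a minimal state-machine sketch. The states and triggers are hypothetical simplifications for illustration, not a production ADS design.

```python
from enum import Enum, auto

class ADSState(Enum):
    NORMAL_OPERATION = auto()
    DDT_FALLBACK = auto()            # degraded operation toward a safe stop
    MINIMAL_RISK_CONDITION = auto()  # fail safe achieved

def next_state(state, ads_failure, odd_exit, safe_stop_reached):
    """Hypothetical L4 fallback logic: on a DDT performance-relevant failure
    or ODD exit, perform the fallback without a human driver until a
    minimal risk condition is achieved."""
    if state is ADSState.NORMAL_OPERATION and (ads_failure or odd_exit):
        return ADSState.DDT_FALLBACK
    if state is ADSState.DDT_FALLBACK and safe_stop_reached:
        return ADSState.MINIMAL_RISK_CONDITION
    return state

# Example: a system failure inside the ODD triggers the fallback.
s = ADSState.NORMAL_OPERATION
s = next_state(s, ads_failure=True, odd_exit=False, safe_stop_reached=False)
s = next_state(s, ads_failure=True, odd_exit=False, safe_stop_reached=True)
assert s is ADSState.MINIMAL_RISK_CONDITION
```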


A different set of enabling technologies is needed to tackle the challenges outlined above and successfully perform the driving tasks within the ODD of each scenario. To get L4 vehicles into production in a few years, not only are

advances in sensors and computing technology needed, but also development of the software stack and

various algorithms, training of optimal AI models, engineering the platform as a whole (integrating all the

various components together), and massive fleet testing to work out the edge cases.

SENSORS FOR L4+ VEHICLES:

The major sensors used for environmental sensing are camera, radar and LiDAR. Each of the sensors used in autonomous driving helps solve a different part of the sensing challenge. By gradually figuring out which data from which sensor can be used to correctly deal with particular edge cases, either in simulation or in actual road testing, the cars can learn to deal with more complex situations or scenarios.

Sensor redundancy (including the availability of an HD map) is a requirement for L4 production vehicles. No single sensing modality covers all situations; at least two different sensor types among camera, radar and LiDAR must be used for comprehensive, safety-critical environment perception.

While cameras are an essential part of the sensor setup, LiDAR and radar are competing sensors for enabling L4. In general, radar is cheaper but lower performing, and LiDAR is higher performing but more expensive. However, due to millimeter-wave / 77 GHz CMOS radar advances coming from major radar suppliers, these high-resolution radars have the potential to do what LiDAR can do in terms of classification, mapping and hidden-object detection.

HD maps are commonly made with LiDAR, although there are some camera-based mapping solutions such as Mobileye REM. Despite the potential of LiDAR, it will take three years or more until LiDAR suppliers can deliver automotive-ready production. The LiDAR type closest to production is mostly "scanning" LiDAR, while very few automotive "flash" LiDAR companies are ready for automotive production.

In terms of scanning methods, mechanical (spinner or mirror) approaches are declining due to issues with cost and design. Instead, electro-optic methods (Quanergy's Optical Phased Arrays, Innoluce's MEMS or ADI's Liquid Crystal Optical Waveguide) are getting more attention due to the cost reduction enabled by their beam-steering approaches.

AI APPLICATIONS FOR PERCEPTION AND DECISION:

AI-based perception applications are also required for L4 production vehicles, since the complexity of the possible scenarios reaches beyond the current capabilities of traditional CV-based algorithms, primarily in terms of scalability, generalization and context-based recognition. AI-based object detection and classification, lane detection, traffic sign recognition and free-space detection are being developed by many companies and have a good chance of being adopted in L4 production vehicles in time.


However, more sophisticated perception applications like “Pixel-level Semantic Segmentation” need

further development as such methods are computationally more expensive than the other perception

inference models.

In transitioning from L4 to L5, more sophisticated AI technologies will be needed. Basic path planning can be done by non-AI algorithms in most L4 ODDs, such as highway, suburban and urban (without heavy traffic) environments, but not in heavy or chaotic traffic, roundabouts and/or double-lane merges. No current AI-based path/trajectory planning application is suitable for L4 automation.

Likewise, AI for decision making and/or driver negotiation based on object trajectories and prediction is required for L4 vehicles to operate efficiently in ODDs where these conditions exist. No current AI-based decision making/driver negotiation application is suitable for L4 automation.

AImotive is tackling these difficult problems with its Motion Engine, which handles the decision chain of self-driving with AI:

1. Tracking of objects over time

a. Detection, location, and association of objects over multiple time instances

b. High level fusion of classical and AI-based methods for tracking

2. Prediction of the next state

a. Behavior prediction for relevant object classes

b. State transition prediction and the search for causality in the actions of objects

3. Decision of action

a. Identification and selection of operation mode and the exploration of feasible actions

b. Identification of alert level (normal, alert or emergency)

4. Trajectory planning

a. Dynamic adaptation to the current driving scenario

b. Efficient, safe and feasible trajectory planning based on heuristics, hard rules and AI

According to AImotive, AI aids decision making from a functionality point of view and fills the gaps that were not addressed by heuristic planning methods; AI can also extend the limits of traditional methods. Its motion filtering and refinement, learned by observation, creates better decision-making AI algorithms, which enables a smooth transition between machine- and human-in-the-loop states and significantly improves the user experience.
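The four stages above can be pictured as a pipeline executed once per planning cycle. The sketch below is a toy, one-dimensional illustration of the prediction, decision and trajectory-planning stages, not AImotive's Motion Engine code; all names, thresholds and the constant-velocity prediction are assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class Track:
    obj_id: int
    position: float   # longitudinal gap to ego in metres (1-D toy world)
    speed: float      # relative speed in m/s (negative = closing in)

def predict_next(track, dt=0.1):
    """Stage 2: constant-velocity behavior prediction (a toy stand-in for
    the learned prediction models described above)."""
    return Track(track.obj_id, track.position + track.speed * dt, track.speed)

def decide(ego_speed, tracks, dt=0.1):
    """Stages 3-4: pick an alert level and a simple longitudinal action,
    then emit a target speed as a one-dimensional 'trajectory'."""
    predicted = [predict_next(t, dt) for t in tracks]            # Stage 2
    closest_gap = min((t.position for t in predicted), default=float("inf"))
    if closest_gap < 10.0:
        alert, target_speed = "emergency", 0.0                   # brake hard
    elif closest_gap < 30.0:
        alert, target_speed = "alert", min(ego_speed, 10.0)      # slow down
    else:
        alert, target_speed = "normal", ego_speed                # keep going
    return alert, target_speed

# One planning cycle: ego at 20 m/s, a tracked vehicle 25 m ahead, closing in.
print(decide(ego_speed=20.0, tracks=[Track(1, 25.0, -2.0)]))
```

Stage 1 (tracking) is assumed to have already associated detections into the `Track` objects; in a real system that association and the trajectory output would of course be far richer than a single target speed.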

COMPUTATION:

There is no universally agreed reference design for optimal L4 computing platforms. There are different

architectures being tried out that combine different instruction sets. However, there are trends that

point to certain architectural patterns:

• Host processor(s) coupled with various accelerators

• Processor watchdogs (processor level safety management – safety monitor)

• Move toward a centralized domain control


• Move toward the fusion of “raw” (as opposed to “object”) sensor data

• Creating the right balance of performance, power, thermal limits, and cost

Nvidia, Intel, NXP and Renesas are the SoC companies that have come up with L4 reference platforms (multiple domain controllers), which is essentially why these SoC companies are leading the AV development ecosystem. Overall, Nvidia has gained the most attention from developers of AV systems, as it provides a very comprehensive AV development stack, while the others offer system hardware only, along with software tooling partners.

VSI also sees (usually within SoCs) at least one co-processor that is optimized/designed specifically to compute AI inference models efficiently. Autonomous cars will implement a system with both high-performing CPUs and a separate hardware acceleration unit (whether GPU, FPGA or dedicated hardware) to run artificial intelligence.

According to AImotive, ASIC/ASSP is most suitable for autonomous car AI tasks given several criteria such

as power consumption, parallelism capabilities, tool availabilities, memory bandwidth, cost, and

scalability.

AImotive also develops aiWare, a hardware-independent, general neural network accelerator IP, which helps chip companies build low-power, high-performance AI-optimized hardware. Neural networks in the automotive field require such features along with ASIL-level safety certification. According to AImotive, current architectures (GPUs and DSPs) are not suited to meeting such requirements: GPUs are designed for computer graphics and thus have too much redundant precision (even in FP16), consume too much power, and are too complex to adhere to ASIL requirements, while DSPs lack the software SDKs necessary to move from floating-point to fixed-point calculations. Thus, aiWare provides a neural network-specific architecture designed to meet such automotive requirements, with software support for neural network tools. In this way, processing large amounts of sensor data with a robust AI-based recognition system is addressed, opening up new possibilities for productizing inexpensive, vision-based self-driving functionality as an alternative to the contested LiDAR-based solutions.
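To illustrate why moving from floating point to fixed point matters for NN accelerators of the kind discussed above, here is a minimal, textbook-style sketch of symmetric 8-bit quantization of a weight tensor. It is a generic illustration under our own assumptions, not aiWare's actual quantization scheme.

```python
import numpy as np

def quantize_symmetric_int8(weights):
    """Map FP32 weights onto signed 8-bit integers with a single scale
    factor, the basic step when targeting fixed-point NN hardware."""
    scale = np.max(np.abs(weights)) / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Reconstruct approximate FP32 values to measure quantization error."""
    return q.astype(np.float32) * scale

w = np.random.randn(64, 64).astype(np.float32) * 0.1
q, s = quantize_symmetric_int8(w)
err = np.max(np.abs(w - dequantize(q, s)))
print(f"max reconstruction error: {err:.5f} (scale={s:.5f})")
```

Integer arithmetic of this kind is what lets dedicated NN hardware trade the redundant precision of graphics-oriented FP pipelines for lower power and simpler, more certifiable datapaths.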

In summary, the enabling technologies required for L4+ AVs in each functional domain are listed below:

Functional domain: Sensing/Detection
• Hardware: a choice of high-resolution primary sensors (camera, LiDAR or high-resolution radar); secondary sensors for redundancy (relying on different physics than the primary sensor)
• Algorithms: synchronization algorithms; raw sensor data fusion
• Tools: calibration and recording tools; modular software framework
• Others: HD map data

Functional domain: Perception
• Hardware: strong computational power to process sensor fusion and create the environmental model; V2X communication and CaaS (Communication as a Sensor)
• Algorithms: sensor fusion-based 360° sensing; AI-based static/dynamic object detection and classification; AI-based free space (drivable road) detection; AI-based hand-gesture perception; AI-based driver-actor negotiation understanding
• Tools: verification tools and generalized performance metrics

Functional domain: Decision/Control
• Hardware: a scalable drive-by-wire solution; ISO 26262 certified hardware for task execution
• Algorithms: high-performance decision software for a variety of situations; AI-based dynamic object movement/behavior prediction; traditional state representation and decision modules; AI-aided decision making and trajectory planning
• Tools: training and testing tools; simulation environment

Table 1. Enabling technologies for L4+ AVs (source: VSI)

CHALLENGE 3: SAFETY

While other embedded automotive systems, such as those in the multimedia and entertainment domain, are mostly detached from safety-relevant subsystems, the controllers (ECUs) that implement autonomous driving are safety-relevant. They are directly linked to actuators intended to affect or manipulate the driving process and can therefore cause harm to occupants and the environment.

One of the most important roles of the enabling technologies in L4 scenarios is to ensure the safety of the vehicle and its system whether or not it successfully completes the driving task in the scenario. As noted in the challenges above, in the case of an ODD exit or an ADS/vehicle failure, L4 vehicles have to transition to a minimal risk condition using a fail-safe system in the car.

L4 commercial deployments will also require dual or triple system-level redundancy at the control level. This means a vehicle could have two complete systems running in parallel, so that one takes over if the other loses power or crashes. In other words, L4+ production vehicles will have to have fail-safe and/or fail-operational systems built on redundant system architectures. Such architectures will be applied across various system levels, from ASIL D certified MCUs, SoCs, ECUs (domain controllers) and x-by-wire controls to actuation systems.

VSI commonly sees a separate safety-dedicated chip (MCU), often called a safety monitor, that has the highest ASIL rating and monitors/filters the other SoC, which may have a lower ASIL rating. The high-ASIL chip thus serves as a safety net while still allowing for efficiency on the heterogeneous SoC. Processors used for safety are often dual-core lockstep designs, which is a requirement for an ASIL D rating.
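Conceptually, the safety monitor's job can be reduced to a watchdog loop: the high-ASIL MCU expects periodic, plausibility-checked heartbeats from the performance SoC and forces the fallback path when they stop or look wrong. The sketch below is only a behavioral illustration in Python with invented message fields and timeouts, not firmware for a real lockstep MCU.

```python
import time

HEARTBEAT_TIMEOUT_S = 0.1   # hypothetical deadline for the SoC heartbeat

class SafetyMonitor:
    """Behavioral sketch of a safety-monitor MCU supervising a
    lower-ASIL SoC via heartbeats and simple plausibility checks."""
    def __init__(self):
        self.last_heartbeat = time.monotonic()
        self.fallback_requested = False

    def on_heartbeat(self, status):
        # Plausibility check on the SoC's self-reported state
        # (fields are illustrative, not a real protocol).
        if status.get("compute_ok") and status.get("perception_ok"):
            self.last_heartbeat = time.monotonic()
        else:
            self.fallback_requested = True

    def tick(self):
        # Called periodically by the MCU's own timer: if the SoC went
        # silent, request the fail-safe path (e.g. a controlled stop).
        if time.monotonic() - self.last_heartbeat > HEARTBEAT_TIMEOUT_S:
            self.fallback_requested = True
        return self.fallback_requested

monitor = SafetyMonitor()
monitor.on_heartbeat({"compute_ok": True, "perception_ok": True})
time.sleep(0.15)                 # simulate a missed heartbeat
print(monitor.tick())            # True -> trigger fallback
```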

At the ECU level, 2oo2DFS (2 out of 2 Diagnosis Failure System) architectures, currently used in the latest aircraft, are also being considered for autonomous-car fail-operational systems, where each computing channel (ECU) uses a lockstep MCU and each channel fails silent.

Figure 4. 2oo2DFS system (source: Elektrobit)

X-by-wire technologies refer to the replacement of traditional mechanical linkages with electronic systems that no longer have a direct mechanical connection. They come in many forms and are applied to major vehicle subsystems such as throttle, steering and braking. Active safety systems benefit from X-by-wire technologies, and autonomous control systems more or less require them, since they are more precise, faster and save weight.

CHALLENGE 4: TESTING & VALIDATION

Developers struggle to determine a diverse suite of test scenarios against which they can test their L4-L5 designs. In addition, some testing may be better suited to simulation than to the real world, on the premise that real-world testing may not be feasible or practical on public roads. Nevertheless, some tests, especially for vehicle handling, performance and durability, should be done at a proving ground. In any event, L4 or L5 vehicles will have gone through various testing venues to some extent. Below are the pros and cons of each venue:

• Field tests: It is important to drive as many miles as possible, and any level of processed data can be stored. Unfortunately, field tests cannot provide enough trend-setting scenarios to fully qualify a complex system. Ground truth data is (generally) not available, so accurate performance measurement or training is difficult, and closed-loop evaluation of algorithm changes/calibration is not possible. However, data from field tests can provide valuable probabilistic models for use elsewhere and hence enhance the testing of complex systems.

• Test tracks: These allow for the creation of trend-setting scenarios as well as dangerous scenarios. Any level of processed data can be obtained, and ground truth data is generally available. Closed-loop testing can be done but can be time-consuming to change. Appropriate test tracks for automated and connected cars are still in their infancy, and even with excellent test track design and availability it is impossible to fully test complex systems with test tracks alone, because of the multitude of scenarios as well as the randomness of the system/data.

• Simulation: This allows for thorough testing of complex systems because of its speed and cost. Generally, it is more difficult to provide raw data and early processed data because of complexity and processing time. Ground truth data is available (including segmentation) and closed-loop testing is simple, allowing for the testing and validation of individual functionalities. The best simulation programs get as close to raw data as possible and can add probabilistic models to create random variations (a small sampling sketch follows this list).
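The point about feeding probabilistic models from field data into simulation can be illustrated with a small sampling sketch: instead of enumerating every condition, each simulated run draws its conditions from distributions estimated from field-test logs. The distributions, names and values below are invented for illustration only.

```python
import random

# Hypothetical condition distributions, e.g. estimated from field-test logs.
FIELD_DISTRIBUTIONS = {
    "weather": [("clear", 0.7), ("rain", 0.2), ("fog", 0.1)],
    "lighting": [("daylight", 0.6), ("dusk", 0.2), ("night", 0.2)],
}

def sample_conditions(rng):
    """Draw one set of environmental conditions for a simulated run."""
    out = {}
    for key, dist in FIELD_DISTRIBUTIONS.items():
        values, weights = zip(*dist)
        out[key] = rng.choices(values, weights=weights, k=1)[0]
    return out

rng = random.Random(42)          # fixed seed so failing runs can be replayed
for _ in range(5):
    print(sample_conditions(rng))
```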

In summary, it is recommended to use detailed simulations to develop the systems, test-track tests to validate components and full vehicles, and field tests to verify real-life system robustness, whose results can be used to train a neural network for further testing and simulation of a specific system.

Meanwhile, in the development phase of L4+ systems/vehicles, developers with different skill sets have to interact with one another while verifying and validating AV features against scenarios. These challenges can be overcome by integrating toolchains (simulation platforms, HW/SW kits, physical test facilities), where simulation results are used to define the physical experiments, which in turn are used to validate the simulations.

Among HAV development tools, the perception side is fairly well understood, and many players supply very good tools from modeling to validation. But continuous verification and validation throughout the V-model development process, from scenario generation and algorithm training for perception/behavior modules, through validation via simulation (SiL, HiL and VeHiL), to road testing of integrated systems for solid actuation, is very rarely found within a single company.

At the same time, VSI has found many partnership efforts (via plug-ins or participation in independent phases of development) between development tool companies as well.

TESTING AND VALIDATION OF AI:

One of the biggest challenges in ensuring safety for HAV systems is the fact that malfunctions caused by AI misbehavior are not in the scope of functional safety. It is therefore hard to trace where errors occurred in the chain of AI-based software and/or to validate whether it performs better than the original algorithms or traditional program code.

Therefore, AImotive has embedded Neural Network (NN) related testing methods in its test management V-cycle, and it initiated the Neural Network Exchange Format (NNEF) standard. AImotive verifies a new NN's NNEF compatibility, validates NN algorithms with a benchmark, checks NN functionality, visualizes convolution layers, and integrates and tests the verified and validated NN at the system level.


Since AImotive is focusing on artificial intelligence with deep learning, new procedures had to be introduced alongside traditional testing methods to make sure neural network (NN) algorithms are well trained and that performance is above the defined minimum threshold. All NN-related testing methods are included in a verification and validation procedure plan.

AImotive initiated the Neural Network Exchange Format (NNEF) standard within the Khronos Group. This makes it possible to keep common constraints on NN implementation across all group members, and each new NN solution needs to comply with the standard.

After verifying NNEF compatibility, a benchmark is needed to confirm that the new NN algorithm has better metric results than its predecessors. Besides benchmarking, it is also necessary to check NN functionality on each required object. Visualization of the convolution layers with a deconvolution process can show how the NN recognizes different objects, to prove that they are detected based on the objects' common attributes and not on environmental conditions. After the visualization technique, a verification method tests NN precision; this is done with proper verification material and test parameters such as required false positive and false negative rates.
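The benchmark-and-threshold step described above amounts to a regression gate on the candidate network's metrics. Here is a minimal sketch of such a gate; the metric names, thresholds and dictionary layout are invented for illustration and are not AImotive's actual criteria.

```python
# Hypothetical acceptance gate for a new NN candidate: it must beat its
# predecessor on the benchmark and stay within FP/FN rate limits.
MAX_FALSE_POSITIVE_RATE = 0.01
MAX_FALSE_NEGATIVE_RATE = 0.005

def accept_candidate(candidate, predecessor):
    """candidate/predecessor are dicts of benchmark metrics, e.g.
    {'mean_ap': 0.81, 'fp_rate': 0.008, 'fn_rate': 0.004}."""
    better_than_previous = candidate["mean_ap"] > predecessor["mean_ap"]
    within_error_budget = (candidate["fp_rate"] <= MAX_FALSE_POSITIVE_RATE
                           and candidate["fn_rate"] <= MAX_FALSE_NEGATIVE_RATE)
    return better_than_previous and within_error_budget

print(accept_candidate(
    {"mean_ap": 0.83, "fp_rate": 0.008, "fn_rate": 0.004},
    {"mean_ap": 0.81, "fp_rate": 0.009, "fn_rate": 0.005},
))
```

Only a network that passes such a gate would then move on to system-level integration and testing, as described next.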

After NN verification and validation, the network can be integrated into the HAV system to continue testing at the system level. For validation of the HAV system, simulations and vehicle tests are recommended to prove system functionality before public road testing.

SUMMARY AND CONCLUSION

By creating efficient L4 driving scenarios based on the most challenging situations created by exceptional events, VSI was able to identify a list of technical challenges in realizing L4 performance.

Each challenge can be handled by a different set of enabling technologies across the autonomous driving functional domains and safety systems. Many of the enabling technologies, including sensor hardware and software and AI applications for perception, are being developed to meet automotive performance requirements. However, most of the redundancy-based safety architectures are still at the conceptual stage, and the industry needs more sophisticated AI-based decision technologies. Computational platforms are coming out with reference designs and development platforms, but actual production of L4 systems will need the right balance of performance, power, thermal limits and cost.

Finally, it is important to utilize simulation, test tracks and real road tests based on various scenarios in the development phase, and all of those test results should be fed back to train a neural network for further testing and simulation.


In case of inquiries on this report, please contact:

Cami Zimmer
Senior Industry Relations Analyst, Vision Systems Intelligence, LLC. (VSI)
Email: [email protected]
Phone: +1-952-239-9822

Or,

Daniella Rédei
Communications, AImotive Kft.
Email: [email protected]
Phone: +36-3069-67651