Page 1

Teradyne Users Group Conference | April 28 – 30, 2014 | Anaheim, California

TEST QUALITY IMPROVEMENT FOR AUTOMOTIVE PRODUCTS

Davide Appello – STMicroelectronics
[email protected]

Tamas Kerekes – NplusT
[email protected]

Page 2

MOTIVATIONS

• Quality is a key performance indicator for products targeting automotive applications

• Test quality has a strong impact on product quality

• Test complexity is growing in parallel with product complexity

• Thousands of tests are applied

• Tests are split across several test insertions

• Recent data show that quality issues are dominated by “errors” in test coverage implementation rather than by intrinsic test methods and approaches

Page 3

GOALS OF OUR WORK

• Automation of tedious test development and industrialization tasks, such as:
  • Verification of the expected test coverage
  • Coherency with the expected test limits
  • Guard-banding and Cpk analysis

• Creation of the basis of a new toolset providing additional advantages:
  • Scalability of the test program structure
  • Reuse of test IPs
  • Support for concurrent engineering
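The Cpk analysis and guard-banding mentioned above can be sketched in a few lines. This is an illustrative Python sketch, not the tool's actual implementation; the function names and the fixed-fraction guard-banding policy are assumptions.

```python
import statistics

def cpk(measures, lsl, usl):
    """Process capability index: distance of the mean from the nearer
    spec limit, in units of three sample standard deviations."""
    mu = statistics.mean(measures)
    sigma = statistics.stdev(measures)
    return min((usl - mu) / (3 * sigma), (mu - lsl) / (3 * sigma))

def guard_band(lsl, usl, fraction=0.05):
    """Tighten the test limits by a fixed fraction of the spec window.
    (One simple guard-banding policy, assumed for illustration.)"""
    window = usl - lsl
    return lsl + fraction * window, usl - fraction * window
```

A process centered in a wide spec window yields a high Cpk; the automated check would compare this value against the Cpk target stored with each parametric requirement.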

Page 4

CONTENT / AGENDA

• Challenges in ensuring test program quality

• Concepts of a technical solution

• Implementation and deployment aspects

Page 5

CHALLENGES IN ENSURING TEST PROGRAM QUALITY

Page 6

CHALLENGES

Requirements (what and how to test)

Design/Test Engineer → DFT/Test Program

Does the test program
- include all the tests defined in the requirements?
- implement these tests correctly?

Page 7

DREAM (IDEAL WORLD)

Requirements (what to test)

Test Results

Test Program

Tool

Page 8

DREAM (“LESS IDEAL” WORLD)

Requirements (what to test)

Test Results

Test Program

Tool

List of requirements not covered

List of tests not correctly implemented

List of tests correctly implemented

Page 9

CURRENT “METHODOLOGY”

Unstructured documentation describing the test requirements

Ad-hoc information pick-up

Manual coding

Manual verification

Page 10

THIS PROCESS IS NOT SUITABLE TO ENSURE TEST COVERAGE AND, AS A CONSEQUENCE, THE PRODUCT QUALITY ESSENTIAL IN AUTOMOTIVE APPLICATIONS

Page 11

“THE IDEAL WORLD”

CONCEPTS OF A TECHNICAL SOLUTION

Page 12

TARGET PROCESS

Modeling and Formal Description of the Requirements
• Test requirements described based on a predefined model and implemented in specific format files and/or databases.
• Support for importing external databases and file formats.
• Possibility of manual insertion and editing.

Built-in Traceability
• Test specifications described following a predefined model and stored in a database.
• Automated connection of requirements/specification/test program/datalog.
• Manual editing.
• Possibility of import and export in files.

Automated Generation of Test Program Structure
• The test program skeleton (not the flow) is generated automatically from the test specification (which is ATE independent) by an ATE-specific code generator.

Automated Validation of the Results
• A reference datalog is created by running the test program on a golden setup and a golden device lot.

Page 13

TOOL ARCHITECTURE

[Architecture diagram]
• Databases (MINT Data Model): requirements, test specifications, test results
• Business Logic
• GUI
• Custom Importers (requirements, test programs)
• Custom Generators (test program skeletons, documentation, verification/validation reports)

Page 14

REQUIREMENT MODELING

• Formal requirement description is fundamental

• Sources:
  • User (datasheet): device functionality and parameters. Ex.: “ADC linearity error < 1%”
  • Design: verifications linked to the implementation. Ex.: ATPG test
  • Process: verifications needed for correct execution. Ex.: continuity test
  • Quality: verifications ensuring reliability and quality. Ex.: need of burn-in

Page 15

REQUIREMENT MODEL

Type
• Parametric or functional test
• Single or multiple measures

Limits (only for parametric)
• Min limit
• Max limit

Cpk Target (only for parametric)

Tag
• Unique identifier. Ex.: U1234
• Imported or generated automatically

Description
• Free-text description. Ex.: “ADC linearity test”

Mnemonic
• Short text reference. Ex.: ADC_LIN
• Can be the name of the covering test
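As a sketch, the requirement model on this slide maps naturally onto a record type. The Python dataclass below is illustrative: only the field names come from the slide; the types, defaults, and example values for the optional fields are assumptions.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Requirement:
    """One entry of the requirement model (field names from the slide;
    types and defaults are assumptions for illustration)."""
    tag: str                            # unique identifier, e.g. "U1234"
    mnemonic: str                       # short reference, e.g. "ADC_LIN"
    description: str                    # free text
    parametric: bool                    # parametric vs. functional test
    min_limit: Optional[float] = None   # only for parametric tests
    max_limit: Optional[float] = None   # only for parametric tests
    cpk_target: Optional[float] = None  # only for parametric tests

r = Requirement(tag="U1234", mnemonic="ADC_LIN",
                description="ADC linearity test",
                parametric=True, min_limit=-1.0, max_limit=1.0,
                cpk_target=1.33)
```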

Page 16

TEST CONDITIONS

• A device feature might need to be tested under different conditions:
  • ADC linearity test at -40°C and at +125°C
  • ADC linearity test at Vccmin and at Vccmax
  • Combinations of temperature and Vcc

• The requirement model has been extended with:
  • Definition of test conditions
  • Manual assignment of test conditions to requirements
  • Possibility to verify matching test conditions using formulas
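A minimal sketch of condition matching, assuming a test condition is a set of named parameter values and the “formula” is a tolerance predicate (both assumptions; the tool's actual condition model is not shown in the slides):

```python
# Hypothetical corner definitions for illustration only.
COND_COLD_VMIN = {"temp_c": -40, "vcc": 3.0}
COND_HOT_VMAX  = {"temp_c": 125, "vcc": 3.6}

def matches(required, actual, tol=0.01):
    """True if every parameter of the required condition is met by the
    actual condition within a relative tolerance; a parameter missing
    from the actual condition fails the match."""
    return all(abs(actual.get(k, float("inf")) - v) <= abs(v) * tol
               for k, v in required.items())
```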

Page 17

TEST SPECIFICATION MODELING

• We model the test specification as a hierarchic structure of test blocks:
  • Test strategy (“production test”)
  • Test insertion (“EWS1”)
  • Module test (“analog”)
  • Operation (“ADC linearity”)

• Support for reuse:
  • Parametrizable subflows
  • Test IP libraries

• Dealing with project variants:
  • Different package, memory size, …
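The hierarchic structure above can be sketched as a recursive block type. Illustrative Python; the class layout and the parameter dictionary (for parametrizable subflows) are assumptions, while the level names follow the slide's example.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TestBlock:
    """A node of the hierarchic test specification: strategy,
    insertion, module test, or operation (layout is an assumption)."""
    name: str
    params: dict = field(default_factory=dict)      # parametrizable subflows
    children: List["TestBlock"] = field(default_factory=list)

spec = TestBlock("production test", children=[
    TestBlock("EWS1", children=[
        TestBlock("analog", children=[
            TestBlock("ADC linearity", params={"samples": 1024}),
        ]),
    ]),
])
```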

Page 18

TEST SPECIFICATION EXAMPLE (1)

Page 19

TEST SPECIFICATION EXAMPLE (2)

Page 20

TEST SPECIFICATION EXAMPLE (3)

Page 21

BACKBONE FOR TRACEABILITY

CONCEPTS OF A TECHNICAL SOLUTION

Page 22

REQUIREMENT TRACEABILITY THROUGHOUT THE FLOW

Requirements & Test Conditions → Test Specification (Test Blocks) → Test Program(s) → Datalog(s)

Requirement Tagging → Test Block Tagging → Test Names → Test Numbers/Names

Page 23

COVERAGE ANALYSIS SUMMARY

• Requirements vs. Test Specification
  • Extraction of the tags of the test blocks
  • Verification of the presence of each requirement tag

• Requirements vs. Test Program
  • Analysis of multiple test programs, per test insertion
  • Extraction of the test names and retrieval of the tags
  • Verification of the presence of each requirement tag

• Requirements vs. Datalog
  • Analysis of multiple datalogs, per test insertion, choosing a golden device
  • Extraction of the test names and retrieval of the tags
  • Verification of the presence of each requirement tag

• Datalog Quality
  • Analysis of multiple datalogs, per test insertion, on all good devices
  • Link to the requirements via tagging
  • Check of the limits
  • Cpk calculation and check
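Each of the first three checks above reduces to the same set comparison between the requirement tags and the tags recovered from the artifact under analysis (test blocks, test names, or datalog entries). A minimal illustrative sketch; the function name and report format are assumptions:

```python
def coverage_report(requirement_tags, covered_tags):
    """Compare the requirement tags against the tags extracted from a
    test specification, test program, or datalog; report which
    requirements are covered and which are missing."""
    required = set(requirement_tags)
    covered = set(covered_tags)
    return {
        "covered": sorted(required & covered),
        "missing": sorted(required - covered),
    }

report = coverage_report(["U1234", "U1235", "U1236"],
                         ["U1234", "U1236", "U9999"])
```

Tags found in the artifact but not in the requirements (like "U9999" here) are simply ignored; only requirement coverage is checked.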

Page 24

COVERAGE REPORT EXAMPLE

Page 25

TEST SKELETON GENERATION

CONCEPTS OF A TECHNICAL SOLUTION

Page 26

TEST PROGRAM GENERATION

• “Flattening”: puts the test blocks in sequence, as they will be executed

• Resolves the parameter expressions

• Automatically generates unique test names from the names of the test blocks
  • These names are used later for coverage analysis of the test program and of the datalog

• Creates an output file
  • Customizable format
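Flattening and unique-name generation can be sketched as a depth-first walk of the test specification. Illustrative Python; the naming scheme (joining block names with underscores) is one plausible choice, since the real generator is ATE-specific:

```python
def flatten(block, path=()):
    """Depth-first walk of the test specification that emits one
    unique test name per leaf block, built from the block names
    along the path (an assumed naming scheme)."""
    path = path + (block["name"],)
    if not block.get("children"):
        yield "_".join(path)
    else:
        for child in block["children"]:
            yield from flatten(child, path)

spec = {"name": "EWS1", "children": [
    {"name": "Analog", "children": [
        {"name": "ADC_Linearity"},
        {"name": "ADC_Accuracy"},
    ]},
]}
names = list(flatten(spec))
```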

Page 27

TEST PROGRAM GENERATION EXAMPLE

void main() {
    Insertion_EWS1();
    ModuleTest_Digital_EWS();
    ModuleTest_Analog_EWS();
    Operation_ADC_Linearity();
    Operation_Power_On();
    SubFlow_ADC_LIN();
    Operation_Power_Off();
    Operation_ADC_Accuracy();
}

void Insertion_EWS1() {}
void ModuleTest_Digital_EWS() {}
void ModuleTest_Analog_EWS() {}
void Operation_ADC_Linearity() {}
void Operation_Power_On() {}

Page 28

“THE REAL WORLD”

IMPLEMENTATION AND DEPLOYMENT

Page 29

IN THEORY…

• The “ideal process” might be too big a step for an already consolidated workflow:
  • Experience of the engineering team
  • Code reuse in new test programs
  • Interfacing with the environment

• Requirement input/import:
  • Automated import is difficult due to the not (or not fully) codified requirement database

• Legacy cases:
  • Existing test program pool

• Our proposed process should allow a gradual introduction

“In theory, theory and practice are the same. In practice, they are not.” (Albert Einstein)

Page 30

REQUIREMENT INPUT

• Challenges:
  • Manual input is too big a burden
  • Several sources and formats (PDF, database, …)

• Solution:
  • Import procedure using customizable plug-ins for the various formats
  • Manual adjustment and/or input where needed

Page 31

TEST PROGRAM GENERATION

• Option 1: fully automated
  • Test engineer inputs all information in the tool
  • Tool generates the final output format using an ATE-dependent plug-in module

• Option 2: skeleton generation
  • Tool generates the test program frame with test names (ATE-dependent plug-in)
  • Test engineer extends it manually

• Option 3: manual
  • Test engineer uses the test names suggested by the tool (specific output file with instructions)

Page 32

LEGACY TEST PROGRAMS

• Legacy test programs have not been prepared to enable fully automated analysis

• A manually supported procedure has been implemented

• The same procedure can be used for datalog analysis

1. Parsing of the test program or STDF (ATE-specific plug-in)
2. Generation of the test specification (flat rather than hierarchic)
3. Manual assignment of the test blocks to the requirements
4. Automated coverage analysis and reporting
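The parsing step of this procedure can be sketched as a small plug-in that pulls test names out of a textual datalog so a flat test specification can be built from them. Illustrative Python with an invented line format (real ATE datalogs and STDF records differ; only the name-extraction idea is shown):

```python
import re

# Assumed line format: test number, test name, measured value.
LINE = re.compile(r"^\s*\d+\s+(?P<name>\w+)\s+(?P<value>-?\d+\.?\d*)")

def extract_test_names(datalog_lines):
    """Return the test names found in a textual datalog, in order;
    lines that do not match the assumed format are skipped."""
    names = []
    for line in datalog_lines:
        m = LINE.match(line)
        if m:
            names.append(m.group("name"))
    return names

log = ["  100 ADC_LIN  0.42",
       "  101 ADC_ACC  0.10",
       "# comment line"]
```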

Page 33

EXPERIMENTS AND PILOT PROJECT

IMPLEMENTATION AND DEPLOYMENT

Page 34

PILOT PROJECT BACKGROUND

• Background:
  • Reuse of the experience of custom test program generation and analysis tools
  • Developed as joint ST – NplusT projects over the past years:
    • Skeleton generation for embedded non-volatile memory testing
    • Test vector generation from high-level descriptions
    • Test program change workflow management
    • Test program difference detection and analysis for J750
  • Built on NplusT’s new MINT technology

• Target:
  • Automotive microcontrollers
  • J750 environment
  • Limited to the test of the flash modules

• Requirement input:
  • User requirements imported from tagged PDF
  • Manual adjustment for test conditions
  • Other requirements input manually

Page 35

PILOT PROJECT TEST PROGRAM MANAGEMENT

• Step 1:
  • Existing test program
  • Automated load and creation of the test specification
  • Manual connection to the requirements

• Step 2:
  • New test program
  • Test specification input via the tool
  • Automated test program frame generation
  • Manual final test program implementation

Page 36

CONCLUSIONS

• The automated verification process reduces the possibility of human error in test program implementation – quantification is in progress

• Automated generation tools support the test engineer during development, increasing efficiency and quality

• The concept is applicable to legacy cases as well – with more manual operations and thus more possibility of human error

[Closing diagram]
Requirements → Test Specification → Test Program → Datalog → Check & Certification
(Supported Generation between the stages)