Yves Le Traon 2003 OO System Testing Behavioral test patterns Automatic test synthesis from UML models


Page 1:

OO System Testing: Behavioral test patterns

Automatic test synthesis from UML models

Page 2:

Outline

- System testing
- Behavioral test patterns
- Generating behavioral test patterns

Page 3:

Testing product lines

Benefiting from the PL specificities:
- Testing commonalities
- Deriving tests according to the variants
- Specific tests

Reusing tests:
- Building test assets

Defining tests independently from the products:
- Using generic scenarios
- Deriving product-specific test cases from those generic scenarios

Page 4:

System testing and UML

[Use case diagram of the distributed meeting (Meeting distribué). Actors: Participant, Animateur (moderator), Organisateur (organizer). Use cases: Planifier (plan), Ouvrir (open), Clôturer (close), Consulter (consult), Entrer (enter), Sortir (leave), Parler (speak), Demander la Parole (ask for the floor), Donner la Parole (give the floor).]

Page 5:

The use case scenarios

- High level, simple, incomplete
- Wildcards for genericity

Example: the Enter use case scenario (x is a scenario parameter).

[Two sequence diagrams. (a) Nominal case: x:user sends enter(*, x) to :Server, which answers ok. (b) Exceptional case: x:user sends enter(*, x) to :Server, which answers nok.]

Page 6:

System testing and UML

Use case | Nominal scenarios | Rare exceptional scenarios | Failure scenarios
A Planifier (plan) | NA1, NA2 | | EA1, EA2
B Ouvrir (open) | NB1 | | EB1, EB2
I Clôturer (close) | NI1 | RI1 |
C Consulter (consult) | NC1 | | EC1
D Entrer (enter) | NC1 | RD1 | ED1, ED2
E Demander la Parole (ask for the floor) | NE1 | | EE1
G Parler (speak) | NG1, NG2 | RG1 | EG1, EG2
H Sortir (leave) | NH1 | | EH1
F Donner la Parole (give the floor) | NF1 | | EF1, EF2

Page 7:

System testing and UML

Minimum criterion: cover each scenario with one test datum. Here, 27 test cases.

Use-case combination coverage criterion. Prerequisite: an activity diagram of the use cases.
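The count of 27 can be cross-checked mechanically against the scenario table on page 6; a minimal sketch, with the scenario names copied verbatim from that table:

```python
# Minimum coverage criterion: one test datum per scenario.
# Scenario names per use case, as listed in the page 6 table.
scenarios = {
    "A Planifier": ["NA1", "NA2", "EA1", "EA2"],
    "B Ouvrir":    ["NB1", "EB1", "EB2"],
    "I Cloturer":  ["NI1", "RI1"],
    "C Consulter": ["NC1", "EC1"],
    "D Entrer":    ["NC1", "RD1", "ED1", "ED2"],
    "E Demander la Parole": ["NE1", "EE1"],
    "G Parler":    ["NG1", "NG2", "RG1", "EG1", "EG2"],
    "H Sortir":    ["NH1", "EH1"],
    "F Donner la Parole":   ["NF1", "EF1", "EF2"],
}

# One test case per scenario.
test_cases = sum(len(v) for v in scenarios.values())
print(test_cases)  # 27
```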

Page 8:

Activity diagrams

Pros:
- A swimlane per actor
- Visually significant
- Within UML notations
- Suitable to apply algorithms

Cons:
- Difficult to build
- Hard or impossible to express certain behaviors
- Not suitable for use cases shared by actors

[Example activity diagram: swimlanes Actor_1, Actor_2, Actor_3; use cases UC_1, UC_2, UC_3, UC_4.]

Page 9:

System testing and UML

[Activity diagram of the use cases, one swimlane per actor (Organisateur, Participant, Animateur): A Planifier (plan), B Ouvrir (open), <<*>> C Consulter (consult), D Entrer (enter), E Demander la Parole (ask for the floor), G Parler (speak), H Sortir (leave), I Clôturer (close), F Donner la Parole (give the floor).]

Page 10:

System testing and UML

Criterion: test each scenario of each use case in each elementary nominal sequence (one pass per loop).

Page 11:

System testing and UML

Test data to generate for the sequence A.B.I.H: 4 + 2x3 + 2x1x2 + 2x1x2x2 = 22 test cases must be generated via this "pattern".

Test target | Test-case combination
A | {NA1, NA2, EA1, EA2} (4 cases)
B | {NB1, EB1, EB2} x {NA1, NA2} (6 cases)
I | {NI1, RI1} x NB1 x {NA1, NA2} (4 cases)
H | {NH1, EH1} x {NI1, RI1} x NB1 x {NA1, NA2} (8 cases)
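The arithmetic 4 + 2x3 + 2x1x2 + 2x1x2x2 = 22 can be reproduced from the scenario sets; a sketch, with scenario names from the page 6 table and the combination rules as reconstructed in the table above:

```python
# Scenarios along the nominal sequence A.B.I.H (N = nominal,
# R = rare exceptional, E = failure exceptional), from the page 6 table.
A  = ["NA1", "NA2", "EA1", "EA2"]
B  = ["NB1", "EB1", "EB2"]
I_ = ["NI1", "RI1"]   # trailing underscore only to avoid an ambiguous name
H  = ["NH1", "EH1"]

def nominal(scs):
    """Scenarios that let the nominal sequence continue."""
    return [s for s in scs if s.startswith("N")]

# Each use case in the sequence is a test target once: the target runs all
# of its scenarios, while earlier use cases contribute nominal prefixes
# (for target I, both NI1 and the rare RI1 are exercised).
cases  = len(A)                                        # target A: 4
cases += len(nominal(A)) * len(B)                      # target B: 2*3 = 6
cases += len(nominal(A)) * len(nominal(B)) * len(I_)   # target I: 2*1*2 = 4
cases += len(nominal(A)) * len(nominal(B)) * len(I_) * len(H)  # target H: 8
print(cases)  # 22
```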

Page 12:

System testing and UML

Avoiding redundancies by ordering the "test targets": 10 test cases must be generated via this "pattern".

Test target | Test-case combination
H | {NH1, EH1} x NI1 x NB1 x {NA1, NA2} (4 cases)
I | RI1 x NB1 x {NA1, NA2} (2 cases)
B | {EB1, EB2}, each after one nominal A scenario (2 cases)
A | {EA1, EA2} (2 cases)
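Listing the ten cases of the ordered table explicitly makes the redundancy argument checkable: together they still cover every scenario of the A.B.I.H sequence. A sketch; the exact pairing of prefixes with target scenarios is illustrative, only the counts follow the table above:

```python
# The ordered-targets table, one tuple of exercised scenarios per test case.
ordered_cases = [
    # target H (4): {NH1, EH1} x {NA1, NA2}, prefix NB1 . NI1
    ("NA1", "NB1", "NI1", "NH1"), ("NA2", "NB1", "NI1", "NH1"),
    ("NA1", "NB1", "NI1", "EH1"), ("NA2", "NB1", "NI1", "EH1"),
    # target I (2): the remaining RI1, with prefix NB1 and each nominal A
    ("NA1", "NB1", "RI1"), ("NA2", "NB1", "RI1"),
    # target B (2): the remaining EB1, EB2, each after one nominal A
    ("NA1", "EB1"), ("NA2", "EB2"),
    # target A (2): the remaining exceptional A scenarios
    ("EA1",), ("EA2",),
]

covered  = {s for case in ordered_cases for s in case}
required = {"NA1", "NA2", "EA1", "EA2", "NB1", "EB1", "EB2",
            "NI1", "RI1", "NH1", "EH1"}
print(len(ordered_cases), covered == required)  # 10 True
```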

Page 13:

Behavioral Test Patterns

Based on the use case scenarios:
- high level
- generic (use of wildcards)
- incomplete
- nominal or exceptional

A selection from among the scenarios:
- an accept scenario (the test objective)
- reject scenarios (optional)
- prefix scenarios (initialisation, optional)

Page 14:

Benefits from test patterns

Generation of product-specific test cases from product-independent test patterns.

But test patterns are tedious to build, especially for "basis tests".

Idea: automatically build significant sets of test patterns.

Page 15:

How to exploit use case ordering?

Generate pertinent paths of use cases in order to reach a test criterion.

Issues:
- an algorithm to assemble the use cases, taking the pre- and postconditions into account
- defining pertinent test criteria
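One way such an assembly algorithm can be sketched: treat each use case as a contract (precondition set, postcondition set) and extend a path only when the accumulated state enables the next use case. The contracts below are hypothetical simplifications of the virtual-meeting behavior, not taken from the slides:

```python
# Use case -> (preconditions, postconditions), as sets of state facts.
use_cases = {
    "plan":  (set(),       {"planned"}),
    "open":  ({"planned"}, {"opened"}),
    "enter": ({"opened"},  {"entered"}),
    "leave": ({"entered"}, set()),
    "close": ({"opened"},  {"closed"}),
}

def extend(path, state, depth):
    """Enumerate contract-respecting use-case paths of bounded length."""
    paths = [path]
    if depth == 0:
        return paths
    for uc, (pre, post) in use_cases.items():
        if pre <= state and uc not in path:   # one pass per loop
            paths += extend(path + [uc], state | post, depth - 1)
    return paths

paths = extend([], set(), 4)
print(["plan", "open", "enter", "leave"] in paths)  # True
```

Paths such as ["open"] alone are never produced, because open's precondition (a planned meeting) is not enabled by the empty initial state.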

Page 16:

Conclusion

From early modeling to test cases: from reusable and generic test patterns to concrete test cases, specific to each product.

Two ways of selecting test patterns:
- manually (qualitative approach)
- driven by use case sequential dependencies (quantitative approach)

Page 17:

From system-level test patterns to specific test cases: application to product-line architectures

Page 18:

Product Line architectures

A product line: a set of systems which share a common software architecture and a set of reusable components. Building a product line aims at developing the common core of a set of products once, and reusing it for all the products.

- Defining a product family
- Variants and commonalities
- Reuse of assets

For our purpose: specify behavioural test patterns, which become reusable "test assets" of the product line.

Page 19:

Product Line architectures: a key challenge

Use case scenarios cannot be used directly for testing:
- They are generic and incomplete.
- Parameters are not known, nor object instances (scenarios concern roles).
- They specify the general system functionality without knowing, at that stage, the exact sequence of calls/answers.

Generating test cases from such test patterns for a given UML specification is thus one of the key challenges in software testing today.

Page 20:

PL variants

Variants are:
- optional, when a component can be present or not,
- alternative, when, at a variation point, one and only one component can be chosen among a set of components,
- multiple, when, at a variation point, several components can be chosen among a set of components.

All the variants must appear in the architecture, but not all the possible combinations of variants. Extracting a product from the global product-line architecture: product instantiation.
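The three variant kinds map directly onto small enumeration rules; a sketch of product instantiation, where the variant names are illustrative and loosely follow the virtual-meeting example later in the deck:

```python
from itertools import chain, combinations

def instantiations(variant):
    """Expand one variant into its possible instantiations.

    optional    -> present or absent
    alternative -> exactly one of the choices
    multiple    -> any non-empty subset of the choices
    """
    kind, choices = variant
    if kind == "optional":
        return [(), (choices,)]
    if kind == "alternative":
        return [(c,) for c in choices]
    if kind == "multiple":
        return list(chain.from_iterable(
            combinations(choices, n) for n in range(1, len(choices) + 1)))
    raise ValueError(kind)

meeting_types = instantiations(("multiple", ["standard", "private", "democratic"]))
limitation    = instantiations(("optional", "participant_limit"))

# Every combination of variant instantiations is a possible product.
products = [(m, l) for m in meeting_types for l in limitation]
print(len(meeting_types), len(limitation), len(products))  # 7 2 14
```

The product line then ships only a chosen subset of these 14 instantiations (three editions in the example that follows), which is exactly the "not all the possible combinations" remark above.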

Page 21:

Product Line architectures: example

The Virtual Meeting Server PL offers simplified web conference services: it aims at permitting several kinds of work meetings on a distributed platform (a general case of a 'chat' software). When connected to the server, a client can enter or exit a meeting, speak, or plan new meetings.

Three types of meetings:
- standard meetings, where the client who has the floor is designated by a moderator (nominated by the organizer of the meeting)
- democratic meetings, which are standard meetings where the moderator is a FIFO robot (the first client to ask for permission to speak is the first to speak)
- private meetings, which are standard meetings with access limited to a defined set of clients.

Page 22:

The Virtual Meeting Example

- Connection to the server
- Planning of meetings
- Participation in meetings
- Moderation of meetings

[Virtual meeting use case diagram for the VirtualMtg system. Actors: user, manager, moderator. Use cases: connect, enter, leave, plan, open, close, consult, speak, hand over.]

Page 23:

Product Line architectures: example

Due to marketing constraints, the Virtual Meeting PL is derivable into three products:
- a demonstration edition: standard and limited
- a personal edition: any type, but limited
- an enterprise edition: any type, no limitations

Two variants: type (multiple) and participant limitation (optional); there are also OS, language, interface variants, etc.

Page 24:

The Virtual Meeting Example

Two main variants:
- the kinds of meetings available
- the limitation of the number of participants

Three products:
- Demonstration edition
- Personal edition
- Enterprise edition

Edition | Variant 1 {multiple}: available meetings | Variant 2 {optional}: meeting limitation
Demonstration edition | Standard | true
Personal edition | Standard, private, democratic | true
Enterprise edition | Standard, private, democratic | false

Page 25:

Testing product lines

Benefiting from the PL specificities:
- Testing commonalities
- Deriving tests according to the variants
- Specific tests

Reusing tests:
- Building test assets

Defining tests independently from the products:
- Using generic scenarios
- Deriving product-specific test cases from those generic scenarios

Page 26:

A contradiction

Test scenarios must be expressed at a very high level:
- to be reusable
- to be independent from the variants and the products

Generic scenarios are too vague and incomplete: they cannot be directly used on a specific product.

Impossible to reuse generic test scenarios?

Page 27:

Behavioral Test Patterns

Based on the use case scenarios:
- high level
- generic
- product independent
- nominal or exceptional

A selection from among the scenarios:
- an accept scenario
- reject scenarios
- prefix scenarios

Page 28:

Testing a PL

A Behavioral Test Pattern (or Test Objective) consists of:
- an accept scenario: it expresses the behavior that has to be tested, e.g. the successful exit of a participant from a meeting (the "leave" use case),
- one or several (optional) reject scenarios: they express the behaviors that are not significant for the tester, e.g. consulting a meeting's state does not interact with entering a meeting,
- one or several (optional) preamble (or prefix) scenarios that must precede the accept scenario; for example, a meeting must be opened before any participant can enter the virtual meeting.
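As a data structure, such a pattern is just one mandatory scenario plus two optional lists; a minimal sketch, with scenarios abbreviated to strings rather than UML sequence diagrams:

```python
from dataclasses import dataclass, field

@dataclass
class TestPattern:
    """A behavioral test pattern (illustrative shape)."""
    accept: str                                    # the test objective
    rejects: list = field(default_factory=list)    # optional
    prefixes: list = field(default_factory=list)   # optional

# The "leave" objective from the slide, with illustrative scenario strings.
leave_pattern = TestPattern(
    accept="leave(*, y) -> ok",
    rejects=["consult(*, z)"],   # irrelevant for this objective
    prefixes=["connect(x)", "plan(*, x)", "open(*, x)", "enter(*, y)"],
)
print(leave_pattern.accept)  # leave(*, y) -> ok
```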

Page 29:

An Example

[A behavioral test pattern as labelled sequence diagrams (S− reject, Prefix, S+ accept): x:user sends enter(*, x) to :Server, answered nok; x:user sends connect(x), plan(*, x) and open(*, x) to :Server, each answered ok (the prefix); y:user sends close(*, y) to :Server; y:user sends leave(*, y) to :Server.]

Page 30:

The reject scenarios

- Optional
- Reduce the "noise": avoid calls irrelevant for the test
- Exclude interfering calls

Page 31:

The prefix

- Describes the preamble part of the test case
- Guides the synthesis
- A composition of use-case scenarios
- Scenarios versus object diagram?

[Prefix sequence diagram: x:user sends connect(x), plan(*, x) and open(*, x) to :Server, each answered ok. Object diagram: server:demoServer linked to user1:user, user2:user, user3:user, user4:user.]

Page 32:

Typical reject scenarios

Some scenarios can be added automatically, using a boolean dependency matrix.

[Boolean dependency matrix over the use cases Plan, Open, Close, Consult, Enter, Speak, Leave; an X marks a dependency between two use cases. The scenarios independent from the Enter use case are added as reject scenarios.]
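The automatic addition can be sketched as a lookup in that matrix: every use case unrelated to the targeted one becomes a reject scenario. The dependency entries below are illustrative, since the exact X placements of the slide's matrix did not survive extraction:

```python
use_cases = ["plan", "open", "close", "consult", "enter", "speak", "leave"]

# Illustrative dependency matrix, as an adjacency map: which use cases a
# given use case interacts with.
depends = {
    "enter": {"open", "close", "speak", "leave"},
    "consult": {"plan"},
}

def rejects_for(target):
    """Use cases independent from `target` become reject scenarios."""
    related = depends.get(target, set()) | {target}
    # The matrix is symmetric in spirit: also follow reverse dependencies.
    related |= {uc for uc, deps in depends.items() if target in deps}
    return [uc for uc in use_cases if uc not in related]

print(rejects_for("enter"))  # ['plan', 'consult']
```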

Page 33:

Typical reject / prefix scenarios

Use of the activity diagram:
- Accept scenario = the targeted scenario in a use case
- Prefix = the previous scenarios in the path
- Reject = all the scenarios of the use cases that are not involved in the path
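This derivation rule is mechanical enough to sketch in a few lines, using the use-case letters of the page 9 activity diagram:

```python
# An elementary nominal path through the use-case activity diagram,
# e.g. A.B.I.H (Plan . Open . Close . Leave).
path = ["A", "B", "I", "H"]
all_use_cases = ["A", "B", "C", "D", "E", "F", "G", "H", "I"]

def pattern_for(target, path, all_use_cases):
    """Derive accept / prefix / reject sets for one targeted use case."""
    i = path.index(target)
    return {
        "accept": target,                 # the targeted scenario
        "prefix": path[:i],               # the previous scenarios in the path
        "reject": [uc for uc in all_use_cases if uc not in path],
    }

p = pattern_for("I", path, all_use_cases)
print(p["prefix"], p["reject"])  # ['A', 'B'] ['C', 'D', 'E', 'F', 'G']
```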

Page 34:

Generating test patterns

[Process diagram. The use cases (UC1, UC2, each with nominal and exceptional scenarios) feed, through a manual or automated selection, the test pattern specification (test objective): an accept scenario, optional reject scenarios, optional prefix scenarios. In parallel, the general design (main classes, interfaces, …) is instantiated into products P1, P2, P3 (product instantiation, then detailed design), and test cases are synthesized from the test patterns (TP1, TP2) for each product. The whole supports evolution.]

Page 35: Yves Le Traon 2003 OO System Testing Behavioral test patterns Automatic test synthesis from UML models

Yves Le Traon 2003

U M L S p ec ifica tio n (c la ss d iag ram , o b jec t

d iag ram , s ta tech a rts ,… )

M o d e llin g

T est p a ttern

2 . F o rm a l o b jec tiv e d e riv a tio n

L T S te s t o b jec tiv e

1 . F o rm a l sp ec if ica tio n d e riv a tio n

L T S sp ec ifica tio n m o d e l v ia s im u la to r A P I

U M L A U T

T G V

v is ib le ac tio n s ( .h id e ) in p u ts /o u tp u ts ( .io )

X M I

X M I

Y o u r fa v o u r ite C A S E to o l

U M L A U T

D isp lay

3 . T es t S y n th es is (o n th e f ly )

IO L T S te s t ca se

4 . U M L te s t c a se d e riv a tio n

T est ca se

X M I

Y o u r fa v o u r ite C A S E to o l

Y o u r fa v o u r ite

C A S E to o l

Page 36:

Compiling the Test Pattern

Inputs coming from UML:
- the detailed class diagram, with (as far as possible) a statechart per active class of the system
- an initial object diagram
- the test pattern

The dynamic aspects are provided to TGV in the form of an API.

Output: a detailed UML scenario describing all the precise calls to perform on the system, and the expected verdicts, in order to observe the behavior specified in the pattern.

Page 37:

Compiling the Test Pattern

accept+ = sequential composition of the prefix and the accept scenario.

The scenarios making up the test case are those accepted by accept+ and rejected by none of the reject scenarios:
- accept+ is mapped to an LTS S+
- the reject scenarios {seqj-}, j in J, are mapped to LTSs {Sj-}, j in J
- the test pattern is mapped to the LTS combining S+ with the {Sj-}, j in J
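At the level of plain event traces (standing in for the LTS machinery), the composition reads: a trace belongs to the pattern iff it extends accept+ and triggers no reject scenario. A sketch, with illustrative event names:

```python
# Illustrative scenarios as event lists.
prefix  = ["connect", "plan", "open"]
accept  = ["enter_nok"]
rejects = [["close"], ["leave"]]

# accept+ = sequential composition of the prefix and the accept scenario.
accept_plus = prefix + accept

def in_pattern(trace):
    """Accepted by accept+ and rejected by none of the reject scenarios."""
    starts_with_accept_plus = trace[:len(accept_plus)] == accept_plus
    triggers_reject = any(set(r) <= set(trace) for r in rejects)
    return starts_with_accept_plus and not triggers_reject

print(in_pattern(["connect", "plan", "open", "enter_nok"]),
      in_pattern(["connect", "plan", "open", "enter_nok", "close"]))
# True False
```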

Page 38:

Synthesis of the test case

Inputs of TGV:
- the simulation API
- the LTS representing the Test Pattern
- which actions are internal? which actions are inputs? which are outputs?

Output of TGV: an IOLTS representing a test case, followed by UML test case derivation.

Page 39:

Product Line architectures: example

[Meeting statecharts.
(a) Non-limited meetings: create leads to planned, open to opened, close to closed; in planned, close/nok (not yet opened).
(b) Limited meetings: the same states plus saturated; enter [usersEntered.card >= max] moves from opened to saturated, leave moves back to opened; in saturated, enter/nok (meeting saturated); close also leads from saturated to closed.]
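The limited-meeting statechart can be sketched as a small state machine; the guard and the nok messages follow the slide's labels, the rest is an illustrative simplification:

```python
class LimitedMeeting:
    """Limited-meeting statechart (sketch)."""

    def __init__(self, max_users):
        self.state = "planned"
        self.max = max_users
        self.users = 0

    def open(self):
        if self.state == "planned":
            self.state = "opened"

    def enter(self):
        if self.state == "saturated":
            return "nok(meeting saturated)"
        if self.state != "opened":
            return "nok(not yet opened)"   # illustrative guard
        self.users += 1
        if self.users >= self.max:         # enter[usersEntered.card >= max]
            self.state = "saturated"
        return "ok"

m = LimitedMeeting(max_users=3)
m.open()
print([m.enter() for _ in range(4)])
# ['ok', 'ok', 'ok', 'nok(meeting saturated)']
```

With max_users=3 this reproduces the synthesized test case shown on page 41: three enters succeed and the fourth is refused.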

Page 40:

An Example

[The same behavioral test pattern as on page 29 (S− reject, Prefix, S+ accept): x:user sends enter(*, x) to :Server, answered nok; x:user sends connect(x), plan(*, x) and open(*, x) to :Server, each answered ok (the prefix); y:user sends close(*, y) to :Server; y:user sends leave(*, y) to :Server.]

Page 41:

Test patterns and test cases

[Synthesized test case on the object diagram server:demoServer with user1:user, user2:user, user3:user, user4:user.
Preamble: connect(user1)/ok; plan(aMtg, user1)/ok; open(aMtg, user1)/ok; enter(aMtg, user1)/ok; enter(aMtg, user2)/ok; enter(aMtg, user3)/ok.
Test Objective: enter(aMtg, user4)/nok.]

Page 42:

Conclusion

From early modeling to test cases: from reusable and generic test patterns to concrete test cases, specific to each product.

The methodology is "not fully automated" …