model-based testing with labelled transition systems - tarot2010.ist.tugraz.at/slides/tretmans.pdf
TRANSCRIPT
1
Model-Based Testing
with Labelled Transition Systems
There is Nothing More Practical than a Good Theory
Jan Tretmans
Embedded Systems Institute, Eindhoven, NL
Radboud University, Nijmegen, NL
2
Overview
• Software testing
• Model-based testing
• Model-based testing with labelled transition systems
• Applications:
♦ electronic passport
♦ wireless sensor network
4© Jan Tretmans
Testing
Testing:
checking or measuring some quality characteristics
of an executing object
by performing experiments
in a controlled way
w.r.t. a specification
IUT
tester
specification
5© Jan Tretmans
The Triangle Program [Myers]
“A program reads three integer values. The three values
are interpreted as representing the lengths of the sides
of a triangle. The program prints a message that states
whether the triangle is scalene, isosceles, or equilateral.”
Write a set of test cases to test this program
6© Jan Tretmans
1. valid scalene triangle ?
2. valid equilateral triangle ?
3. valid isosceles triangle ?
4. 3 permutations of previous ?
5. side = 0 ?
6. negative side ?
7. one side is sum of others ?
8. 3 permutations of previous ?
9. one side larger than sum of others ?
10. 3 permutations of previous ?
11. all sides = 0 ?
12. non-integer input ?
13. wrong number of values ?
14. for each test case: is expected output specified ?
15. check behaviour after output was produced ?
Test cases for:
The Triangle Program [Myers]
7© Jan Tretmans
1. is it fast enough ?
2. doesn’t it use too much memory ?
3. is it learnable ?
4. is it usable for intended users ?
5. is it secure ?
6. does it run on different platforms ?
7. is it portable ?
8. is it easily modifiable ?
9. is the availability sufficient ?
10. is it reliable ?
11. does it comply with relevant laws ?
12. doesn‘t it do harm to other applications ?
13. . . . . .
Test cases for:
The Triangle Program [Myers]
8© Jan Tretmans
Testing: A Definition
Software testing is:
• a technical process,
• performed by executing / experimenting with a product,
• in a controlled environment, following a specified procedure,
• with the intent of measuring one or more characteristics /
quality of the software product
• by demonstrating the deviation of the actual status of the
product from the required status / specification.
9© Jan Tretmans
Paradox of Software Testing
Testing is:
• important
• much practiced
• 30-50% of project effort
• expensive
• time critical
• not constructive (but sadistic?)
But also:
• ad-hoc, manual, error-prone
• hardly theory / research
• no attention in curricula
• not cool: “if you’re a bad programmer you might be a tester”
Attitude is changing:
• more awareness
• more professional
10© Jan Tretmans
Sorts of Testing
Many different types and sorts of testing:
♦ functional testing, acceptance testing, duration testing,
♦ performance testing, interoperability testing, unit testing,
♦ black-box testing, white-box testing,
♦ regression testing, reliability testing, usability testing,
♦ portability testing, security testing, compliance testing,
♦ recovery testing, integration testing, factory test,
♦ robustness testing, stress testing, conformance testing,
♦ developer testing, acceptance, production testing,
♦ module testing, system testing, alpha test, beta test
♦ third-party testing, specification-based testing, ………
11© Jan Tretmans
Sorts of Testing
(figure: three dimensions of testing)
Level of detail: unit, module, integration, system
Accessibility: white box, black box
Characteristics: functionality, efficiency, usability, reliability, maintainability, portability
Other dimensions: phases in development, who does it, goal of testing, . . . . .
12© Jan Tretmans
Quality: There is more than Testing
(figure: QUALITY surrounded by testing, verification, static analysis, reviewing, inspection, walk-through, debugging, certification, CMM, quality control, requirement management, software process)
14© Jan Tretmans
Automated Model-Based Testing
(diagram: model → test generation tool → TTCN test cases → test execution tool against IUT → pass / fail; IUT conf-to model)
IUT passes tests ⇔ IUT conf-to model (⇐ sound, ⇒ exhaustive)
15© Jan Tretmans
Model-Based Testing
• Model-based testing has the potential to combine
♦ practice - testing
♦ theory - formal methods
• Model-based testing:
♦ testing with respect to a (formal) model / specification:
state model, pre/post, CSP, Promela, UML, Spec#, . . . .
♦ promises better, faster, cheaper testing:
• algorithmic generation of tests and test oracles: tools
• formal and unambiguous basis for testing
• measuring the completeness of tests
• maintenance of tests through model modification
16© Jan Tretmans
Types of Testing
(figure: the same three dimensions: level of detail (unit, module, integration, system), accessibility (white box, black box), characteristics (functionality, efficiency, usability, reliability, maintainability, portability))
17© Jan Tretmans
A Model-Based Development Process
(diagram: informal ideas → specification → design → code; the informal world is linked to the world of models by formalizable validation and formal verification, and to the real world by model-based testing)
18© Jan Tretmans
Approaches to Model-Based Testing
Several modelling paradigms:
• Finite State Machine
• Pre/post-conditions
• Labelled Transition Systems
• Timed Automata
• Programs as Functions
• Abstract Data Type testing
• . . . . . . .
Labelled Transition Systems
20© Jan Tretmans
Model Based Testing
(diagram: model → test generation tool → test cases → test execution tool against IUT → pass / fail; IUT conf-to model)
IUT passes tests ⇔ IUT conf-to model (⇐ sound, ⇒ exhaustive)
21© Jan Tretmans
Model Based Testing with Transition Systems
s ∈ LTS
i ∈ IOTS ⊆ LTS
implementation relation: i ioco s
test generation: gen : LTS → ℘(TTS)
test execution: t || i
i || gen(s) → pass ⇔ i ioco s (⇐ sound, ⇒ exhaustive)
23© Jan Tretmans
Labelled Transition Systems
Labelled Transition System ⟨ S, L, T, s0 ⟩
states: S
actions: L (example: Coin, Button, Alarm, Coffee)
transitions: T ⊆ S × (L∪{τ}) × S
initial state: s0 ∈ S
24© Jan Tretmans
Labelled Transition Systems
L = { dub, kwart, coffee, tea, soup }
LTS(L) = all transition systems over L
transition: S0 –dub→ S1
transition composition: S0 –dub·coffee→ S3
executable sequence: S0 =kwart·tea⇒
non-executable sequence: S0 =kwart·soup⇒ (does not exist)
(figure: example system s with states S0…S5, a τ-transition, and branches dub·coffee and kwart·tea)
25© Jan Tretmans
Labelled Transition Systems
traces(s) = { σ ∈ L* | s =σ⇒ }  (sequences of observable actions)
s after σ = { s’ | s =σ⇒ s’ }  (reachable states)
(figure: s = S0 with two dub-branches to S1 and S2, S1 –coffee→ S3, S2 –tea→ S4)
traces(s) = { ε, dub, dub·coffee, dub·tea }
s after dub = { S1, S2 }
s after dub·tea = { S4 }
26© Jan Tretmans
• Explicit:
⟨ {S0, S1, S2, S3}, {dub, coffee, tea}, { (S0, dub, S1), (S1, coffee, S2), (S1, tea, S3) }, S0 ⟩
• Transition tree / graph
• Language / behaviour expression:
S := dub ; ( coffee ; stop [] tea ; stop )
(figure: the corresponding transition tree with states S0…S3)
Representation of LTS
27© Jan Tretmans
a ; b ; stop ||| c ; d ; stop
(figure: the interleavings of a·b and c·d)
a ; b ; stop [] a ; c ; stop ≠ a ; ( b ; stop [] c ; stop )
(figures: two a-branches, one to b and one to c, versus a single a followed by a choice between b and c)
Representation of LTS
28© Jan Tretmans
P, where P := a ; P
(figure: a single a-loop)
Q, where Q := a ; ( b ; stop ||| Q )
(figure: an infinite tree of a's and b's)
Representation of LTS
30© Jan Tretmans
Observable Behaviour
(figures: systems differing only in internal τ-moves, e.g. a versus τ·a; which of them are equivalent, ≈?)
“Some systems are more equal than others”
31© Jan Tretmans
S1 S2
(figure: both systems interacting with an environment)
• Suppose an environment interacts with the systems:
♦ the environment tests the system as a black box by observing and actively controlling it;
♦ the environment acts as a tester;
• Two systems are equivalent if they pass the same tests.
Comparing Transition Systems
32© Jan Tretmans
Comparing Transition Systems
(figure: systems S1 and S2, each observed by an environment e)
S1 ≈ S2 ⇔ ∀ e ∈ E . obs( e, S1 ) = obs( e, S2 )
33© Jan Tretmans
Trace Equivalence
(figure: S1 and S2 in the same environment)
s1 ≈tr s2 ⇔ traces( s1 ) = traces( s2 )
Traces: traces(s) = { σ ∈ L* | s =σ⇒ }
34© Jan Tretmans
(figures: three systems over a and b with internal τ-steps: all pairs are trace-equivalent ≈tr, but not all pairs are completed-trace-equivalent ≈ctr)
35© Jan Tretmans
Equivalences on Transition Systems (from strong to weak):
• isomorphism: now you need to observe τ's ……
• ( weak ) bisimulation: test an LTS with another LTS, and undo, copy, repeat as often as you like
• failure trace = refusal: test an LTS with another LTS, and try again (continue) after failure
• failures = testing: test an LTS with another LTS
• completed trace: observing sequences of actions and their end
• trace: observing sequences of actions
38© Jan Tretmans
Input-Output Transition Systems
(figure: machine with inputs ?dub, ?kwart and outputs !coffee, !tea)
LI = { ?dub, ?kwart }  inputs: from user to machine, initiative with user, machine cannot refuse
LU = { !coffee, !tea }  outputs: from machine to user, initiative with machine, user cannot refuse
LI ∩ LU = ∅, LI ∪ LU = L
39© Jan Tretmans
LI = { ?dub, ?kwart }
LU = { !coffee, !tea }
Input-Output Transition Systems
IOTS( LI , LU ) ⊆ LTS( LI ∪ LU )
IOTS is LTS with Input-Output
and always enabled inputs:
for all states s, for all inputs ?a ∈ LI : s =?a⇒
(figure: the vending machine made input-enabled with ?dub / ?kwart self-loops in every state)
Input-Output Transition Systems
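The input-enabledness condition is easy to check on an explicit transition relation. A hypothetical sketch (our own encoding; it checks strong input-enabledness only, not the weak =?a⇒ version via τ-steps):

```python
# Hypothetical encoding of an input-enabled vending machine:
# (state, action) -> successors; '?'-prefixed actions are inputs.
TRANS = {
    ("S0", "?dub"): {"S1"}, ("S0", "?kwart"): {"S0"},
    ("S1", "?dub"): {"S1"}, ("S1", "?kwart"): {"S1"},
    ("S1", "!coffee"): {"S0"},
}
STATES = {"S0", "S1"}
INPUTS = {"?dub", "?kwart"}

def is_input_enabled(states, inputs, trans):
    """IOTS condition: every input has a transition in every state."""
    return all((s, a) in trans for s in states for a in inputs)
```

Dropping any of the self-loop entries makes the check fail, i.e. the model is then an LTS but not an IOTS.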
40© Jan Tretmans
Example Models
(figures: several example IOTS over inputs ?dub, ?kwart and outputs !coffee, !tea, !choc, each with all inputs enabled in every state)
Input-Output Transition Systems
41© Jan Tretmans
“Preorders” on Input-Output Transition Systems
(figure: implementation i and specification s, each interacting with an environment e, related by imp)
implementation i ∈ IOTS( LI , LU )
specification s ∈ LTS( LI , LU )
imp ⊆ IOTS( LI , LU ) × LTS( LI , LU )
Observing IOTS, where system inputs interact with environment outputs, and vice versa.
42© Jan Tretmans
Correctness: Implementation Relation ioco
i ioco s =def ∀σ ∈ Straces (s) : out (i after σ) ⊆ out (s after σ)
quiescence: p –δ→ p ⇔ ∀ !x ∈ LU ∪ {τ} : p –!x↛
out ( P ) = { !x ∈ LU | p –!x→ , p ∈ P } ∪ { δ | p –δ→ p , p ∈ P }
Straces ( s ) = { σ ∈ (L∪{δ})* | s =σ⇒ }
p after σ = { p’ | p =σ⇒ p’ }
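These definitions can be made executable for small finite models. A bounded sketch in Python (our own encoding, not a tool API: quiescence δ is modelled as the absence of enabled outputs, τ-steps are assumed absent, and input-enabledness of i is not enforced):

```python
# Hypothetical sketch of out(), suspension traces, and a bounded ioco check.
DELTA = "δ"

def enabled(state, trans):
    """Actions enabled in `state` (trans: (state, action) -> successors)."""
    return {a for (s, a) in trans if s == state}

def out_of(states, trans, lu):
    """out(states): enabled outputs, plus δ for each quiescent state."""
    result = set()
    for s in states:
        outs = enabled(s, trans) & lu
        result |= outs if outs else {DELTA}
    return result

def after(states, sigma, trans, lu):
    """states after sigma; a δ-step keeps only the quiescent states."""
    for a in sigma:
        if a == DELTA:
            states = {s for s in states if not (enabled(s, trans) & lu)}
        else:
            states = {s2 for s in states for s2 in trans.get((s, a), set())}
    return states

def straces(s0, trans, lu, max_len):
    """Suspension traces of s0 up to length max_len."""
    acts = {a for (_, a) in trans} | {DELTA}
    result, frontier = {()}, {()}
    for _ in range(max_len):
        frontier = {sig + (a,) for sig in frontier for a in acts
                    if after({s0}, sig + (a,), trans, lu)}
        result |= frontier
    return result

def ioco(i0, i_trans, s0, s_trans, lu, max_len=4):
    """Bounded check of: ∀σ ∈ Straces(s): out(i after σ) ⊆ out(s after σ)."""
    for sigma in straces(s0, s_trans, lu, max_len):
        i_states = after({i0}, sigma, i_trans, lu)
        if not i_states:          # σ not executable in i: nothing to check
            continue
        s_states = after({s0}, sigma, s_trans, lu)
        if not out_of(i_states, i_trans, lu) <= out_of(s_states, s_trans, lu):
            return False
    return True
```

With the spec offering !coffee or !tea after ?dub and an implementation producing only !coffee, the check succeeds in one direction and fails with the roles swapped, matching the slides' examples.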
43© Jan Tretmans
i ioco s =def ∀σ ∈ Straces (s) : out (i after σ) ⊆ out (s after σ)
Intuition:
i ioco-conforms to s, iff
• if i produces output x after trace σ, then s can produce x after σ
• if i cannot produce any output after trace σ, then s cannot produce any output after σ ( quiescence δ )
Correctness: Implementation Relation ioco
44© Jan Tretmans
Implementation Relation ioco
(figure: implementation i with ?dub → !coffee and ?dub / ?kwart self-loops elsewhere)
out ( i after ε ) = { δ }
out ( i after ?dub ) = { !coffee }
out ( i after ?dub.?dub ) = { !coffee }
out ( i after ?dub.!coffee ) = { δ }
out ( i after ?kwart ) = { δ }
out ( i after !coffee ) = ∅
out ( i after ?dub.!tea ) = ∅
out ( i after δ ) = { δ }
i ioco s =def ∀σ ∈ Straces (s) : out (i after σ) ⊆ out (s after σ)
45© Jan Tretmans
(figures: i with ?dub → !coffee plus an extra ?dub-loop; s with ?dub → !coffee)
out (i after ε) = { δ }   out (s after ε) = { δ }
out (i after ?dub) = { !coffee }   out (s after ?dub) = { !coffee }
out (i after ?dub.!coffee) = { δ }   out (s after ?dub.!coffee) = { δ }
⇒ i ioco s
Implementation Relation ioco
i ioco s =def ∀σ ∈ Straces (s) : out (i after σ) ⊆ out (s after σ)
46© Jan Tretmans
(figures: i offers only !coffee after ?dub; s offers !coffee or !tea after ?dub)
out (i after ?dub) = { !coffee } ⊆ out (s after ?dub) = { !coffee, !tea }
⇒ i ioco s
Implementation Relation ioco
i ioco s =def ∀σ ∈ Straces (s) : out (i after σ) ⊆ out (s after σ)
47© Jan Tretmans
(figures: i offers !coffee or !tea after ?dub; s offers only !coffee)
out (i after ?dub) = { !coffee, !tea } ⊄ out (s after ?dub) = { !coffee }
⇒ i ioco s does not hold
Implementation Relation ioco
i ioco s =def ∀σ ∈ Straces (s) : out (i after σ) ⊆ out (s after σ)
48© Jan Tretmans
(figures: i with ?dub → !coffee and ?kwart → !tea; s with only ?dub → !coffee)
out (i after ?dub) = { !coffee }   out (s after ?dub) = { !coffee }
out (i after ?kwart) = { !tea }   out (s after ?kwart) = ∅
But ?kwart ∉ Straces ( s ), so i ioco s
Implementation Relation ioco
i ioco s =def ∀σ ∈ Straces (s) : out (i after σ) ⊆ out (s after σ)
49© Jan Tretmans
(figures: i ignores ?kwart and stays quiescent; s offers ?dub → !coffee and ?kwart → !tea)
out (i after ?kwart) = { δ } ⊄ out (s after ?kwart) = { !tea }
⇒ i ioco s does not hold
Implementation Relation ioco
i ioco s =def ∀σ ∈ Straces (s) : out (i after σ) ⊆ out (s after σ)
50© Jan Tretmans
(figures: i may stay quiescent after ?dub; s always produces !coffee after ?dub)
out (i after ?dub) = { δ, !coffee } ⊄ out (s after ?dub) = { !coffee }
⇒ i ioco s does not hold
Implementation Relation ioco
i ioco s =def ∀σ ∈ Straces (s) : out (i after σ) ⊆ out (s after σ)
51© Jan Tretmans
out (i after ?dub.?dub) = out (s after ?dub.?dub) = { !tea, !coffee }
⇒ i ioco s
Implementation Relation ioco
i ioco s =def ∀σ ∈ Straces (s) : out (i after σ) ⊆ out (s after σ)
(figures: nondeterministic machines i and s over ?dub, !tea, !coffee)
out (i after ?dub.δ.?dub) = { !coffee } ≠ out (s after ?dub.δ.?dub) = { !tea, !coffee }
⇒ s ioco i does not hold
52© Jan Tretmans
specification: ? x (x >= 0) ; ! y ( |y² – x| < 0.001 )
implementation models:
? x (x >= 0) ; ! √x    ? x (x < 0) ; ? z
? x (x >= 0) ; ! -√x   ? x (x < 0) ; ! error
LTS and ioco allow:
• non-determinism
• under-specification
• the specification of properties rather than construction
Implementation Relation ioco
53© Jan Tretmans
Genealogy of ioco
Labelled Transition Systems    IOTS (IOA, IOSM, IOLTS)
Testing Equivalences (Preorders)
Refusal Equivalence (Preorder)
Canonical Tester conf    Quiescent Trace Preorder
Repetitive Quiescent Trace Preorder (Suspension Preorder)
ioconf    ioco
55© Jan Tretmans
Test Cases
Model of a test case = transition system:
♦ labels in L ∪ { θ } (‘quiescence’ label θ)
♦ tree-structured
♦ ‘finite’, deterministic
♦ sink states pass and fail
♦ from each state:
• either one input !a and all outputs ?x
• or all outputs ?x and θ
(figure: a test tree with stimuli !dub, !kwart, observations ?coffee, ?tea, θ, and pass / fail leaves; in the sink states pass and fail, all of LU ∪ {θ} loops)
56© Jan Tretmans
Test Generation
i ioco s =def ∀σ ∈ Straces (s) : out (i after σ) ⊆ out (s after σ)
(figures: s with out (s after σ) = { !x, !y }; i with out (i after σ) = { !x, !z }; a test observing all outputs after σ: out (test after σ) = LU)
allowed outputs ?x, ?y lead to pass / further testing; forbidden output ?z leads to fail
57© Jan Tretmans
Test Generation
i ioco s =def ∀σ ∈ Straces (s) : out (i after σ) ⊆ out (s after σ)
(figures: s with out (s after σ) = { !x, !y, δ }; i with out (i after σ) = { !x, !z, δ }; a test observing all outputs and quiescence after σ: out (test after σ) = LU ∪ { θ })
allowed outputs ?x, ?y and θ lead to pass / further testing; forbidden output ?z leads to fail
58© Jan Tretmans
Test Generation Algorithm
To generate a test case t(S) from a transition system specification S, where S ≠ ∅ is a set of states (initially S = s0 after ε), apply the following steps recursively, non-deterministically:
1. end test case:
pass
2. supply input !a (with S after ?a ≠ ∅):
continue with t( S after ?a );
meanwhile observe: allowed outputs ?x (!x ∈ out(S)) lead to t( S after !x ), forbidden outputs ?y (!y ∉ out(S)) lead to fail
3. observe all outputs (and quiescence θ):
allowed outputs (or δ): ?x with !x ∈ out(S) lead to t( S after !x );
forbidden outputs (or δ): ?y with !y ∉ out(S) lead to fail
59© Jan Tretmans
Test Generation Example
(figures: specification s with ?dub → !tea [] !coffee, and a generated test: stimulate !dub, then observe; the allowed observations ?coffee and ?tea lead to pass, the forbidden ?choc and θ lead to fail)
60© Jan Tretmans
Test Generation Example
(figures: specification s with a ?dub-loop and ?dub → !coffee, where δ-steps are allowed; a generated test using θ: stimulate !dub, observe ?coffee / ?tea / θ, with pass where the specification allows the observation and fail otherwise)
61© Jan Tretmans
Test Execution
Test execution = all possible parallel executions (test runs) of test t with implementation i, going to state pass or fail.
Test run: t⌉|i =σ⇒ pass⌉|i’ or t⌉|i =σ⇒ fail⌉|i’
t –a→ t’ and i –a→ i’ imply t⌉|i –a→ t’⌉|i’
i –τ→ i’ implies t⌉|i –τ→ t⌉|i’
t –θ→ t’ and i –δ→ i’ imply t⌉|i –θ→ t’⌉|i’
62© Jan Tretmans
Test Execution Example
(figures: implementation i with ?dub → !tea [] !choc; the test from the previous slides)
Two test runs:
t⌉|i =dub·tea⇒ pass⌉|i’
t⌉|i =dub·choc⇒ fail⌉|i’’
⇒ i fails t
63© Jan Tretmans
Test Execution Example
(figures: implementation i with ?dub-loops and ?dub → !coffee; a test with θ-observations)
Two test runs:
t⌉|i =dub·θ⇒ pass⌉|i’
t⌉|i =dub·coffee·θ⇒ pass⌉|i’
⇒ i passes t
fail
65© Jan Tretmans
Validity of Test Generation
For every test t generated with the algorithm we have:
• Soundness: t will never fail with a correct implementation:
i ioco s implies i passes t
• Exhaustiveness: each incorrect implementation can be detected with some generated test t:
¬( i ioco s ) implies ∃ t : i fails t
66© Jan Tretmans
(diagram: specification s ∈ LTS, test tool with gen : LTS → ℘(TTS), tests t run against the IUT as t⌉|i, verdicts pass / fail)
IUT passes gen(s) ⇔ IUT ioco s (⇐ sound, ⇒ exhaustive)
Proof of soundness and exhaustiveness:
∀ i ∈ IOTS . ( ∀ t ∈ gen(s) . i passes t ) ⇔ i ioco s
Test assumption:
∀ IUT ∈ IMP . ∃ i_IUT ∈ IOTS . ∀ t ∈ TTS . IUT passes t ⇔ i_IUT passes t
Formal Testing with Transition Systems
67© Jan Tretmans
Model-Based Testing: There is Nothing More Practical than a Good Theory
A well-defined and sound testing theory brings:
• arguing about validity of test cases and correctness of test generation algorithms
• explicit insight in what has been tested, and what not
• use of complementary validation techniques: model checking, theorem proving, static analysis, runtime verification, . . . . .
• implementation relations for nondeterministic, concurrent, partially specified, loose specifications
• comparison of MBT approaches and error detection capabilities
69© Jan Tretmans
Some Model-Based Testing Approaches and Tools
• AETG • Agatha • Agedis • Autolink • Conformiq Qtronic • Cooper
• Uppaal-Cover • G∀st • Gotcha • JTorX • TestGen (Stirling) • TestGen (INT)
• TestComposer • TestManager • TGV • TorX • TorXakis • T-Vec • Uppaal-Tron
• Tveda • MaTeLo • Phact/The Kit • QuickCheck • Reactis • RT-Tester
• SaMsTaG • Smartesting Test Designer • SpecExplorer • Statemate • STG • . . . . . .
70© Jan Tretmans
A Tool for Transition Systems Testing: (J)TorX(akis)
• on-the-fly test generation and test execution
• implementation relation: ioco
• mainly applicable to reactive systems / state-based systems
(figure: TorX between specification and IUT: check output, compute next input, offer input, observe output; verdicts pass / fail / inconclusive; user guidance: manual or automatic)
71© Jan Tretmans
(J)TorX(akis) Case Studies
• Conference Protocol - academic
• EasyLink TV-VCR protocol - Philips
• Cell Broadcast Centre component - LogicaCMG
• ‘’Rekeningrijden’’ Payment Box protocol - Interpay
• V5.1 Access Network protocol - Lucent
• Easy Mail Melder - LogicaCMG
• FTP Client - academic
• “Oosterschelde” storm surge barrier-control - LogicaCMG
• DO/DG dose control - ASML/Tangram
• Laser interface - ASML/Tangram
• The new Dutch electronic passport - Min. of Int. Aff.
• Copier control software - OCE
• Wireless Sensor Network node - Chess
72© Jan Tretmans
(J)TorX(akis) Case Study Lessons
• model construction
♦ difficult: missing or bad specs
♦ leads to detection of design errors
♦ not yet supported by an integrated environment
♦ research on test-based modelling / model learning
• adapter / test environment
♦ specific for each system
♦ laborious, but not only for MBT
• longer and more flexible tests
♦ full automation: test generation + execution + analysis
• no notion of test selection or specification coverage yet
♦ only random coverage or user-guided test purposes
74© Jan Tretmans
Variations on a Theme
i ioco s ⇔ ∀ σ ∈ Straces(s) : out ( i after σ ) ⊆ out ( s after σ )
i ≤ior s ⇔ ∀ σ ∈ ( L ∪ {δ} )* : out ( i after σ ) ⊆ out ( s after σ )
i ioconf s ⇔ ∀ σ ∈ traces(s) : out ( i after σ ) ⊆ out ( s after σ )
i iocoF s ⇔ ∀ σ ∈ F : out ( i after σ ) ⊆ out ( s after σ )
i uioco s ⇔ ∀ σ ∈ Utraces(s) : out ( i after σ ) ⊆ out ( s after σ )
i mioco s — multi-channel ioco
i wioco s — non-input-enabled ioco
i eco e — environmental conformance
i sioco s — symbolic ioco
i (r)tioco s — (real) timed tioco (Aalborg, Twente, Grenoble, Bordeaux, . . . .)
i iocor s — refinement ioco
i hioco s — hybrid ioco
i qioco s — quantified ioco
i poco s — partially observable game-based ioco
. . . . . .
Model-Based Testing
applied to an Electronic Passport
Jan Tretmans
ESI
& Radboud University Nijmegen
Wojciech Mostowski
Erik Poll
Ronny Wichers Schreur
Radboud University Nijmegen
Julien Schmaltz
ESI & OU
77
Electronic Passport
New Passport
• Machine Readable Passport ( MRP, E-passport )
• with chip (JavaCard), contact-less
• storage of picture, fingerprints, iris scan, .......
• access to this data protected by encryption and a new protocol
• just released in EU
Our job: testing of e-passports
• emphasis on access protocol
= exchange of request-response messages between passport and reader (terminal)
78
INIT → BAC → EAC
Basic Access Control: initial protocol; read photo; read personal data
Extended Access Control: requires successful BAC; read photo; read personal data; read fingerprints
E-Passports : Basic Model
80
Model-Based Testing of E-Passports
(diagram: system model → model-based test generation → TTCN test cases → test execution against the SUT → pass / fail)
81
Model-Based Testing of E-Passports
(diagram: English specifications → state-based model → model-based test tool TorXakis → java drivers / adapter → e-passport and wireless reader → test runs → pass / fail)
82
MBT for E-Passports : Process
understanding ICAO specs → make model (and refine) → make test environment → testing: generation, execution, analysis → SUT: passport → pass / fail
83
• Tested:
– Basic Access Control (BAC)
– Extended Access Control (EAC)
– Active Authentication (AA)
– Data Reading: requires data testing
• Tests up to about 1,000,000 test events
– complemented with manual tests
• No error found yet ......
MBT for E-Passports : Results
Model-Based Testing
of a Wireless Sensor Network Node
Marcel Verhoef Chess
Axel Belinfante University of Twente
Jan Tretmans ESI / Radboud University
Feng Zhu Radboud University
86
Model-Based Testing
(diagram: system model → model-based test generation → test cases → test execution against the SUT → pass / fail)
SUT passes tests ⇔ SUT conforms to model
87
MBT of WSN
(diagram: system model → model-based test generation → test cases → test execution against the SUT → pass / fail)
SUT passes tests ⇔ SUT conforms to model
88
Testing of Wireless Sensor Networks
• Different possibilities:
– test of application
– test of communication
– test of single node
89
Testing of Wireless Sensor Networks
• Different possibilities:
– test of application
– test of communication
– test of single node
• Node test:
– protocol conformance test:
gMAC protocol
– according to ISO 9646
– local test method
90
WSN Node in Protocol Layers
(figure: nodes 1-3, each with an Application Layer, gMAC Layer, and Radio Layer, on top of a shared Medium Layer; the Upper Tester sits at the application interface, the Lower Tester at the radio interface)
91
Local Test Method
(figure: the gMAC layer with the Upper Tester at the application interface and the Lower Tester at the radio interface; hardware below, software above)
First approach:
• only software, on host
• simulated time
if (gMacFrame.currentSlotNumber > gMacFrame.lastFrameSlot) {
    gMacFrame.currentSlotNumber = 0;
    gMacFrame.syncState = gMacNextFrame.syncState;
    if (semaphore.appBusy == 1) {
        mcTimerSet(SLOT_TIME);
        mcPanic(MC_FAULT_APPLICATION);
        return;
    }
}
92
Initial Test Architecture
(figure: model-based test tools Uppaal-Tron, JTorX, TorXakis connected through an Adapter in Software to the gMAC layer software, at the application interface (Upper Tester) and radio interface (Lower Tester))
93
Test Interfaces / PCOs
• application interface: 3 function calls
modelling: actions for function call and function return
implemented: software function calls in adapter
• radio interface: send and receive of messages via air interface
modelling: send and receive actions in model
implemented: software emulation of send and receive in adapter
• timer interface: clock ticks
modelling: clock ticks as input actions
implemented: emulation of oscillator ticks with external software trigger
94
Test Architecture
(figure: model-based test tools Uppaal-Tron, JTorX, TorXakis connected via a socket interface to the Adapter in Software, which drives the gMAC layer software at the application, radio, and clock interfaces)
95
Model-Based Testing of a WSN Node
(diagram: state-based model → model-based test tool TorXakis → C test adapter → WSN software on a PC → test runs → pass / fail; the model comes from (vague) descriptions, a ‘guru’, and ad-hoc model learning)
96
A First Model for MBT of a WSN Node
(figure: parallel processes synchronising on Tick: application-interface processes ! appInitCall ; ? appInitReturn<n>, ! appEvalCall<string> ; ? appEvalReturn, ! appPrepareCall<n> ; ? appPrepareReturn<string>, composed with ||| and |[ Tick ]|, and a radio process with ? tick, ! radioTxMsg<m>, ? radioRxMsg<m>, counter updates n := n+1, [ n == framelength ] n := 0, n := N)
First Model:
• abstract, underspecified, loose model: all behaviour is allowed
• all output allowed, few inputs specified (but there must be some inputs for synchronization)
• refinement in subsequent steps
97
Implementation Relation
i ioco s =def ∀ σ ∈ Straces (s) : out (i after σ) ⊆ out (s after σ)
i rtioco s =def ∀ σ ∈ Ttraces (s) : out_t (i after σ) ⊆ out_t (s after σ)
rtioco: progress of time is an output observation
WSN model: time == tick action == input action
(figure: interleavings of ! TxMsg and ? tick)
i ioco_tick s =def ∀ σ ∈ Straces (s) :
out_U (i after σ) ⊆ out_U (s after σ)
and δ ∈ out (i after σ) implies δ ∈ out (s after σ) or (s after σ) =tick⇒
98
MBT for WSN : Process
understanding descriptions and ‘guru’ knowledge → make model (refine, improve, learning) → test environment / adapter → testing: generation, execution, analysis → SUT → pass / fail … eventually
99
MBT for WSN: Challenges
Test adapter / environment:
• connect Tron / JTorX / TorXakis to the WSN node
• low-level programming: interrupts, parallel threads, priorities
• correct emulation of hardware in software, such as clock and radio
Modelling:
• no clear specification, only a ‘guru’
• choice of abstraction level: what are actions in the model, especially for the radio interface, clock interface, interrupts, and level of parallelism
• modelling the timer clock in an untimed formalism; interference with quiescence
100
Future
Done:
• testing of software in simulated time with JTorX, Tron, and TorXakis
• generic WSN test adapter
Further:
• further refining models, learning, agile MBT ?
• include more real hardware
• add more real real-time
• comparison of MBT tools: JTorX, Tron, and TorXakis
Investigate testability:
• how to design and develop so that future testing
(and adapter development) is easier
103© Jan Tretmans
Concluding
• Testing can be formal, too (M.-C. Gaudel, TACAS'95)
♦ Testing shall be formal, too
• A test generation algorithm is not just another algorithm:
♦ proof of soundness and exhaustiveness
♦ definition of test assumption and implementation relation
• For labelled transition systems:
♦ ioco for expressing conformance between implementation and specification
♦ a sound and exhaustive test generation algorithm
♦ tools generating and executing tests: TGV, TestGen, Agedis, (J)TorX(akis), . . . .
104© Jan Tretmans
Model based formal testing can improve the testing process :
☺ model is precise and unambiguous basis for testing
♦ design errors found during validation of model
☺ longer, cheaper, more flexible, and provably correct tests
♦ easier test maintenance and regression testing
☺ automatic test generation and execution
♦ full automation : test generation + execution + analysis
☺ extra effort of modelling compensated by better tests
Perspectives