1
Model-Based Testing
and Test-Based Modelling
Jan Tretmans
Embedded Systems Institute, Eindhoven, NL
and Radboud University, Nijmegen, NL
Quasimodo
2
Overview
1. Model-Based Testing
2. Model-Based Testing with Labelled Transition Systems
3. Model-Based Testing: A Wireless Sensor Network Node
4. Test-Based Modelling
3
Software Testing
4
checking or measuring
some quality characteristics
of an executing object
by performing experiments
in a controlled way
w.r.t. a specification
tester – specification – SUT
System Under Test
(Software) Testing
5
Sorts of Testing
phases: unit – module – integration – system
aspects: functionality, usability, reliability, efficiency, maintainability, portability, accessibility
white box – black box
6
But also:
• ad-hoc, manual, error-prone
• hardly theory / research
• no attention in curricula
• not cool :
“if you’re a bad programmer
you might be a tester”
Testing is:
• important
• much practiced
• 30-50% of project effort
• expensive
• time critical
• not constructive (but sadistic?)
Attitude is changing:
• more awareness
• more professional
Paradox of Software Testing
7
Trends in Software Development
• Increasing complexity
– more functions, more interactions, more options and parameters
• Increasing size
– building new systems from scratch is not possible anymore
– integration of legacy, outsourced, off-the-shelf components
• Blurring boundaries between systems
– more, and more complex, interactions between systems
– systems dynamically depend on other systems: systems of systems
• Blurring boundaries in time
– requirements analysis, specification, implementation, testing, installation, and maintenance overlap
– more different versions and configurations
• What is a failure ?
Testing Challenges
8
Models
9
Formal Models
?coin
?button
!alarm ?button
!coffee
modelSelected
workingConfiguration
noModelSelected
validConfiguration
addComponent(slot, component) / send modelDB: findComponent(); send slot: bind()
removeComponent(slot) / send slot: unbind()
addComponent(slot, component) / send Component_DB: get_component(); send slot: bind()
deselectModel()  |  selectModel(model) / send modelDB: getModel(modelID, this)
removeComponent(slot) / send slot: unbind()
isLegalConfiguration() [legalConfig = true]
(Klaas Smit)
10
Model-Based Testing
11
SUT
System Under Test
pass fail
Developments in Testing 1
1. Manual testing
12
SUT
pass fail
test execution
TTCN test cases
1. Manual testing
2. Scripted testing
Developments in Testing 2
13
SUT
pass fail
test execution
1. Manual testing
2. Scripted testing
3. High-level
scripted testing
Developments in Testing 3
high-level
test notation
14
system
model
SUT
TTCN test cases
pass fail
model-based test
generation
test execution
1. Manual testing
2. Scripted testing
3. High-level
scripted testing
4. Model-based
testing
Developments in Testing 4
15
Model-Based . . . . .
Verification, Validation, Testing, . . . . .
16
Validation, Verification, and Testing
model properties
SUT
ideas
concrete realizations
ideas, wishes
abstract models, math
validation
testing
verification
testing
validation
17
formal world
concrete world
Verification is only as good as the validity of the model on
which it is based
Verification and Testing
Model-based verification:
• formal manipulation
• prove properties
• performed on model
Model-based testing:
• experimentation
• show error
• concrete system
Testing can only show the presence of errors, not their
absence
18
Code Generation from a Model
A model is more (less)
than code generation:
• views
• abstraction
• testing of aspects
• verification and validation
of aspects
19
Model-Based Testing
with Labelled Transition Systems
20
system
model
SUT
TTCN test cases
pass fail
model-based test
generation
test execution
Model-Based Testing
21
MBT with Labelled Transition Systems
LTS
model
SUT behaving as
input-enabled LTS
TTCN test cases
pass fail
LTS test execution
ioco test generation
input/output conformance
ioco
set of LTS tests
22
Models: Labelled Transition Systems
states
output actions
transitions
initial state
? = input, ! = output
?coin
?button
!alarm ?button
!coffee
Labelled Transition System: ⟨ S, LI, LU, T, s0 ⟩
input actions
23
test case
model ! coin
! button
?alarm
?coffee ---
pass
specificationmodel
Models: Generation of Test Cases
?coin
?button
!alarm ?button
!coffee
fail fail
24
specificationmodel
Models: Generation of Test Cases
?coin
?button
!alarm ?button
!coffee
test case
model ! button
! coin
? alarm
? coffee ---
fail pass fail
25
δ(p)  ⇔  ∀ !x ∈ LU ∪ {τ} : p −!x↛          ( p is quiescent )
out(P) = { !x ∈ LU | p −!x→ , p ∈ P } ∪ { δ | δ(p), p ∈ P }
Straces(s) = { σ ∈ ( L ∪ {δ} )* | s =σ⇒ }
p after σ = { p' | p =σ⇒ p' }
Conformance: ioco
i ioco s  ⇔def  ∀ σ ∈ Straces(s) : out(i after σ) ⊆ out(s after σ)
26
i ioco s  ⇔def  ∀ σ ∈ Straces(s) : out(i after σ) ⊆ out(s after σ)
Intuition:
i ioco-conforms to s, iff
• if i produces output x after trace σ, then s can produce x after σ
• if i cannot produce any output after trace σ,
then s cannot produce any output after σ ( quiescence δ )
Conformance: ioco
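The definitions above (quiescence δ, out, after, and the ioco relation) can be sketched executably. A minimal Python sketch, assuming an LTS is encoded as a dict from state to a list of (label, next-state) pairs, with '?'-labels as inputs and '!'-labels as outputs; all names are illustrative and the suspension-trace exploration is depth-bounded:

```python
DELTA = "delta"  # the quiescence observation

def quiescent(trans, state):
    # a state is quiescent iff it enables no output action
    return not any(l.startswith("!") for l, _ in trans.get(state, []))

def after(trans, states, label):
    # 'states after label': delta keeps exactly the quiescent states
    if label == DELTA:
        return {s for s in states if quiescent(trans, s)}
    return {s2 for s in states for l, s2 in trans.get(s, []) if l == label}

def out(trans, states):
    # out(P) = enabled outputs, plus delta if some state is quiescent
    outs = {l for s in states for l, _ in trans.get(s, []) if l.startswith("!")}
    if any(quiescent(trans, s) for s in states):
        outs.add(DELTA)
    return outs

def ioco(i_trans, i0, s_trans, s0, depth=6):
    """Bounded check of: for all sigma in Straces(s),
    out(i after sigma) is a subset of out(s after sigma)."""
    frontier = [({s0}, {i0})]
    for _ in range(depth):
        nxt = []
        for ss, si in frontier:
            if not out(i_trans, si) <= out(s_trans, ss):
                return False
            labels = {l for st in ss for l, _ in s_trans.get(st, [])} | {DELTA}
            for l in labels:
                ss2 = after(s_trans, ss, l)
                if ss2:  # extend only along suspension traces of s
                    nxt.append((ss2, after(i_trans, si, l)))
        frontier = nxt
    return True
```

For the dime/coffee example, a correct implementation passes, while implementations that output !tea or stay quiescent after ?dime are caught.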
27
!coffee
?dime
?quart
?dime?quart
?dime?quart
?dime
!choc
?quart
!tea
!coffee
?dime
!tea
specificationmodel
Example: ioco
ioco
ioco
ioco
ioco
?dime
!coffee
?dime
!choc
?dime
!tea
28
? x (x >= 0)
! y
( | y×y − x | < ε )
specificationmodel
! x
? x (x < 0)
? x (x >= 0)
SUT models
? x
LTS and ioco allow:
• non-determinism
• under-specification
• the specification of properties
rather than construction
Example: ioco
! -x
? x (x < 0)
? x (x >= 0)
? x
!error
29
out(i after ?dub·?dub) = out(s after ?dub·?dub) = { !tea, !coffee }
i ioco s holds
s ioco i does not hold
out(i after ?dub·δ·?dub) = { !coffee }  ⊆  out(s after ?dub·δ·?dub) = { !tea, !coffee }
i ioco s  ⇔def  ∀ σ ∈ Straces(s) : out(i after σ) ⊆ out(s after σ)
i
?dub
?dub
?dub ?dub
!tea
?dub
?dub
!coffee
?dub
s
!coffee
?dub
?dub
?dub ?dub
!tea
?dub
?dub
?dub
?dub
!tea
30
Test Case
– θ: ‘quiescence’ label
– tree-structured
– finite, deterministic
– final states pass and fail
– from each state ≠ pass, fail:
• either one input !a
• or all outputs ?x and θ
!dub
!kwart
?tea
?coffee ?tea
!dub
fail fail
test case = labelled transition system
fail fail
?coffee ?tea
fail pass
?coffee ?tea
fail fail
?coffee ?tea
?coffee
pass fail pass
31
Algorithm to generate a test case t(S)
from a specification state set S, with S ≠ ∅ (initially S = s0 after ε).
Apply the following steps recursively, non-deterministically:
1 end test case
pass
2 supply input !a
!a
t ( S after ?a )
Test Generation Algorithm: ioco
allowed outputs (or δ): !x ∈ out(S)
forbidden outputs (or δ): !y ∉ out(S)
3 observe all outputs
• allowed outputs ?x (or θ): continue with t( S after !x )
• forbidden outputs ?y: fail
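The three steps can be sketched as a small randomized generator. A hedged Python sketch, assuming an LTS-as-dict encoding of the specification (states map to (label, next-state) pairs, '?' inputs, '!' outputs); generated test cases are nested dicts whose leaves are the verdicts 'pass'/'fail', and all names are illustrative:

```python
import random

DELTA = "delta"  # quiescence observation (theta in the test case)

def _after(trans, states, label):
    if label == DELTA:  # quiescent states only
        return {s for s in states
                if not any(l.startswith("!") for l, _ in trans.get(s, []))}
    return {s2 for s in states for l, s2 in trans.get(s, []) if l == label}

def _out(trans, states):
    outs = {l for s in states for l, _ in trans.get(s, []) if l.startswith("!")}
    if any(not any(l.startswith("!") for l, _ in trans.get(s, []))
           for s in states):
        outs.add(DELTA)
    return outs

def gen_test(trans, states, outputs, depth, rng=random):
    """Generate a test case as a nested dict; leaves are 'pass' / 'fail'."""
    if depth == 0 or not states:
        return "pass"
    step = rng.choice(["stop", "input", "observe"])
    if step == "stop":                                  # step 1: end test case
        return "pass"
    inputs = sorted({l for s in states for l, _ in trans.get(s, [])
                     if l.startswith("?")})
    if step == "input" and inputs:                      # step 2: supply an input
        a = rng.choice(inputs)
        return {a: gen_test(trans, _after(trans, states, a),
                            outputs, depth - 1, rng)}
    allowed = _out(trans, states)                       # step 3: observe outputs
    return {x: (gen_test(trans, _after(trans, states, x), outputs, depth - 1, rng)
                if x in allowed else "fail")            # forbidden output: fail
            for x in outputs | {DELTA}}
```

Note the non-determinism of the algorithm is modelled by the random choice; in an observation node every forbidden output of the specification leads directly to fail.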
32
Example: ioco Test Generation
specification
?dime
!coffee
?dime
test
33
Example: ioco Test Generation
specification
?dime
!coffee
?dime
test
34
Example: ioco Test Generation
specification
?dime
!coffee
?dime
test
35
Example: ioco Test Generation
specification
?dime
!coffee
?dime
test
!dime
36
Example: ioco Test Generation
specification
?dime
!coffee
?dime
test
!dime ?coffee
?tea
fail fail
37
?coffee
fail
?tea
Example: ioco Test Generation
specification
?dime
!coffee
?dime
test
!dime ?coffee
?tea
fail fail
pass
38
Example: ioco Test Generation
specification
?dime
!coffee
?dime
test
!dime
?coffee ?tea
pass fail fail
?coffee
?tea
fail fail ?coffee
?tea
pass fail
39
Test Result Analysis: Completeness
For every test t
generated with the ioco test generation algorithm we have:
• Soundness:
t will never fail with a correct implementation:
i ioco s  implies  i passes t
• Exhaustiveness:
each incorrect implementation can be detected
with some generated test t:
not( i ioco s )  implies  ∃ t : i fails t
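Operationally, 'i passes t' / 'i fails t' amounts to walking the test-case tree against an input-enabled SUT until a pass or fail leaf is reached. A hedged Python sketch with an illustrative encoding (test case as a nested dict whose leaves are 'pass'/'fail', SUT as a transition dict):

```python
import random

DELTA = "delta"

def run_test(test, sut_trans, sut_state, rng=random):
    """Walk the test case: a single '?a' key means the tester supplies input a;
    otherwise the tester observes whichever output (or quiescence) the SUT does."""
    while isinstance(test, dict):
        keys = list(test)
        if len(keys) == 1 and keys[0].startswith("?"):
            a = keys[0]                                   # supply input a
            nxt = [s2 for l, s2 in sut_trans.get(sut_state, []) if l == a]
            if nxt:                                       # input-enabled SUT
                sut_state = rng.choice(nxt)
            test = test[a]
        else:                                             # observe the SUT
            outs = [(l, s2) for l, s2 in sut_trans.get(sut_state, [])
                    if l.startswith("!")]
            if outs:
                x, sut_state = rng.choice(outs)
            else:
                x = DELTA                                  # no output: quiescence
            test = test.get(x, "fail")
    return test
```

On the dime/coffee example a correct SUT reaches pass, while a tea-dispensing mutant reaches fail, matching the soundness and exhaustiveness claims above.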
40
LTS
model
SUT behaving as
input-enabled LTS
TTCN test cases
pass fail
LTS test execution
ioco test generation
input/output conformance
ioco
set of LTS tests
SUT passes tests  ⇔  SUT ioco model
( ⇐ sound, ⇒ exhaustive )
Completeness of MBT with ioco
41
Model-Based Testing
More Theory
42
? ?
S1 ≈ S2  ⇔  ∀ e ∈ E : obs(e, S1) = obs(e, S2)
Testing Equivalences
S1 ≈ S2 ?
environment e environment e
43
Test assumption:
∀ SUT . ∃ mSUT ∈ MODELS .
∀ t ∈ TEST . SUT passes t  ⇔  mSUT passes t
SUT mSUT
test t test t
MBT: Test Assumption
44
s LTS
SUT
i ioco s
test tool
gen : LTS → ℘(TTS)
t    SUT
SUT passes gen(s)
⇔  SUT conforms to s
sound / exhaustive
Prove soundness and exhaustiveness:
∀ m ∈ IOTS .
( ∀ t ∈ gen(s) . m passes t )  ⇔  m ioco s
Test assumption:
∀ SUT ∈ IMP . ∃ mSUT ∈ IOTS . ∀ t ∈ TTS .
SUT passes t  ⇔  mSUT passes t
Soundness and Completeness
pass fail
45
SUT passes Ts  ⇔def  ∀ t ∈ Ts . SUT passes t
test assumption:  ∀ t ∈ TEST . SUT passes t ⇔ mSUT passes t
prove:  ∀ m ∈ MOD . ( ∀ t ∈ Ts . m passes t ) ⇔ m imp s
SUT passes Ts  ⇔  SUT conforms to s ?
define:  SUT conforms to s  iff  mSUT imp s
SUT conforms to s
⇔  mSUT imp s
⇔  ∀ t ∈ Ts . mSUT passes t
⇔  ∀ t ∈ Ts . SUT passes t
⇔  SUT passes Ts
MBT : Completeness
46
Genealogy of ioco
Labelled Transition Systems
IOTS ( IOA, IA, IOLTS )
Testing Equivalences (Preorders)
Refusal Equivalence (Preorder)
Canonical Tester conf
Quiescent Trace Preorder
Repetitive Quiescent Trace Preorder (Suspension Preorder)
ioco
Trace Preorder
47
Variations on a Theme
• i ioco s    ⇔  ∀ σ ∈ Straces(s) : out(i after σ) ⊆ out(s after σ)
• i ior s     ⇔  ∀ σ ∈ (L ∪ {δ})* : out(i after σ) ⊆ out(s after σ)
• i ioconf s  ⇔  ∀ σ ∈ traces(s) : out(i after σ) ⊆ out(s after σ)
• i iocoF s   ⇔  ∀ σ ∈ F : out(i after σ) ⊆ out(s after σ)
• i uioco s   ⇔  ∀ σ ∈ Utraces(s) : out(i after σ) ⊆ out(s after σ)
• i mioco s     multi-channel ioco
• i wioco s     non-input-enabled ioco
• i eco e       environmental conformance
• i sioco s     symbolic ioco
• i (r)tioco s  (real) timed tioco (Aalborg, Twente, Grenoble, Bordeaux, ..... )
• i rioco s     refinement ioco
• i hioco s     hybrid ioco
• i qioco s     quantified ioco
• i poco s      partially observable game ioco
• i stiocoD s   real time and symbolic data
• . . . . . .
48
Model-Based Testing :
There is Nothing More Practical than a Good Theory
• Arguing about validity of test cases
and correctness of test generation algorithms
• Explicit insight in what has been tested, and what not
• Use of complementary validation techniques: model checking, theorem
proving, static analysis, runtime verification, . . . . .
• Implementation relations for nondeterministic, concurrent,
partially specified, loose specifications
• Comparison of MBT approaches and error detection capabilities
49
Test Selection
in Model-Based Testing
Test Selection
• Exhaustiveness never achieved in practice
• Test selection to achieve confidence in quality of tested product
– select best test cases capable of detecting failures
– measure to what extent testing was exhaustive
• Optimization problem
best possible testing within cost/time constraints
50
Test Selection: Approaches
1. random
2. domain / application specific: test purposes, test goals, …
3. model / code based: coverage
– usually structure based
51
test: a! x?
a? x!
a?x!
a? x!
transition coverage: 100%  vs.  50%
Test Selection: Semantic Coverage
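The structure-based metric in the figure above can be restated as a tiny computation. A sketch with an illustrative encoding of model transitions as (state, label, next-state) triples:

```python
def transition_coverage(model_transitions, executed_steps):
    """model_transitions: set of (state, label, next_state) triples;
    executed_steps: iterable of such triples observed while running the tests.
    Returns the fraction of model transitions exercised."""
    covered = set(executed_steps) & set(model_transitions)
    return len(covered) / len(model_transitions)
```

A test suite exercising both transitions of a two-transition model scores 1.0; exercising only one scores 0.5, matching the 100% / 50% of the figure.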
52
implementations
correct implementations:
CS = { i | i ioco s }
passing implementations
PT = { i | i passes T }
measure for test quality:
area PT \ CS
FT
PT
CS
S
S’
weaker model s':
{ i | i ioco s }  ⊆  { i | i ioco s' }
implementations
Test Selection: Lattice of Specifications
53
s1 is stronger than s2
s1 ⊑ s2
{ i | i ioco s1 }  ⊆  { i | i ioco s2 }
CS1
S1
S2
if specs are input-enabled, then ioco is a preorder
S3
ST
chaos: top element – allows any implementation
LI ?
Lu !
Test Selection by Weaker Specification
54
but? on!
but? off!
but? on!  off!
?x (x >= 0)
!y ( | y×y − x | < ε )
?x (0 < x < 10)
!y ( | y×y − x | < 2ε )
LI ?
Lu !
chaos
55
Model-Based Testing
A Wireless Sensor Network Node
56
warehouses: sense & control
tracing products: active labels
trains: seat reservation
health care: on-body networks
Wireless Sensor Networks
Myrianed: Wireless Sensor Network
RF ANTENNA
RF TRANSCEIVER (NORDIC SEMI)
CPU (ATMEL XMEGA128)
I/O INTERFACES
58
• Communication inspired by biology and human interaction
• Epidemic communication
• Analogy: spreading a rumor or a virus
• MYRIANED “GOSSIP” protocol
• RF broadcast (2.4 GHz ISM)
Myrianed: a WSN with Gossiping
59
WSN: Typical Test
Put WSN nodes together:
end-to-end testing
Test all WSN nodes together,
i.e. test the Network
• provide inputs to network
• observe outputs from network
60
WSN: Model-Based Conformance Test
Model-based testing of a single node:
• protocol conformance test
of the gMAC protocol
• according to ISO 9646
• local test method
• time is important in gMAC:
real-time model-based testing
61
WSN Node in Protocol Layers
node 1
Application Layer
gMAC Layer
Radio Layer
Application Layer
gMAC Layer
Radio Layer
node 2
Application Layer
gMAC Layer
Radio Layer
node 3
Medium Layer
application interface
radio interface
Upper Tester
Lower Tester
62
Local Test Method
application interface
radio interface
Upper Tester
Lower Tester
clocktick
GMAC layer
Approach:
• only software, on host
• simulated, discrete time

if (gMacFrame.currentSlotNumber > gMacFrame.lastFrameSlot) {
    gMacFrame.currentSlotNumber = 0;
    gMacFrame.syncState = gMacNextFrame.syncState;
    if (semaphore.appBusy == 1) {
        mcTimerSet(SLOT_TIME);
        mcPanic(MC_FAULT_APPLICATION);
        return;
    }
}
Hardware
Software
63
WSN: Test Architecture
Upper Tester
Lower Tester
GMAC layer: Software
Model-Based Test Tool:
• TorXakis
• JTorX
• Uppaal-Tron
sockets
Adapter
• transform messages
• synchronize simulated time
sockets
64
Models
modelSelected
workingConfiguration
noModelSelected
validConfiguration
addComponent(slot, component) / send modelDB: findComponent(); send slot: bind()
removeComponent(slot) / send slot: unbind()
addComponent(slot, component) / send Component_DB: get_component(); send slot: bind()
deselectModel()  |  selectModel(model) / send modelDB: getModel(modelID, this)
removeComponent(slot) / send slot: unbind()
isLegalConfiguration() [legalConfig = true]
(Klaas Smit)
65
system
model
SUT
TTCN test cases
pass fail
model-based test
generation
test execution
Model-Based Testing
SUT
66
TTCN test cases
model-based test
generation
WSN: Model-Based Testing
system
model
MBT
tool
C test
adapter
pass fail
test runs
WSN software on
PC
(vague) descriptions
guru
ad-hoc model learning
SUT
67
TTCN test cases
model-based test
generation
WSN: Test-Based Modeling
system
model
Uppaal-Tron
TorXakis
JTorX
adapter
pass fail
test runs
WSN software on
PC
???
Make a model
from observations
made during testing
68
Test-Based Modelling
Model-Based Testing
• IF there exists a model
THEN automatic generation of tests
69
TRUE
FALSE
• No models, or models difficult to obtain
– complex, designers make no models, evolving
– legacy, third party, outsourced, reuse, no documentation
test execution
70
system
model
SUT
TTCN test cases
model-based test
generation algorithm
Model-Based Testing
71
system
model
SUT
TTCN test cases
learningalgorithm
test execution
Test-Based Modelling
Learning a model of SUT behaviour from observations made during test
• black-box reverse engineering
• test-based modelling
• automata learning
• observation-based modelling
• behaviour capture and test
active – passive
72
Validation, Verification, and Testing
model properties
SUT
ideas
concrete realizations
ideas, wishes
abstract models, math
validation
learning
73
Empirical Cycle
model
theoryphenomenon
predict
validate
model world physical world
MBT and TBM
74
improve SUT
SUT – model
MBT
conforming
yes
no
improvemodel
no
satisfied
more tests
yes
no
refinemodel
no
model world physical world
Basic MBT process:
1. make model
2. take SUT
3. generate tests
4. execute tests
5. assign verdict
is only decision process
75
MBT and TBM
But MBT is also:
1. SUT repair
2. model repair
3. improve MBT: more tests
4. improve MBT: better model
until satisfied
iterative and incremental process
agile MBT
starting point:
1. SUT
2. initial model
3. or no model at all:
learning a model from scratch by improving and refining
until satisfied
Satisfaction:
• MBT: sufficient confidence in correctness of SUT
• TBM: sufficient confidence in preciseness of model
LI ?
Lu !
chaos
Learning: Precision
76
measure for learning precision: ( A \ B ) / A
S: precise, expensive    S'': not precise, cheap
implementations
A
B
S’’
S’
S
SUT
implementations
Learning: Lattice of Models
77
s1 is stronger than s2
s1 ⊑ s2
{ i | i ioco s1 }  ⊆  { i | i ioco s2 }
CS1
S1
S2
if specs are input-enabled, then ioco is a preorder
S3
ST
ST
chaos: a model for any SUT
most precise model is testing equivalent to SUT
Refining Learned Models
78
but? on!
but? off!
but? on!  off!
?x (x >= 0)
!y ( | y×y − x | < ε )
?x (0 < x < 10)
!y ( | y×y − x | < 2ε )
LI ?
Lu !
chaos
Refining Learned Models
79
but? on!
but? off!
but? on!  off!
?x (x >= 0)
!y ( | y×y − x | < ε )
?x (0 < x < 10)
!y ( | y×y − x | < 2ε )
LI ?
Lu !
chaos
MBT and TBM:
two sides of same coin
Test-Based Modelling
• Algorithms, Active:
– D. Angluin (1987): Learning regular sets from queries and counter-examples
– LearnLib: adaptation of Angluin for FSMs – tool
• searching for unique, precise model
– T. Willemse (2007): TBM – ioco-based Test-Based Modelling
• approximation via n-bounded ioco
• Algorithms, Passive:
– process mining: ProM – many algorithms and tool
• model generation from set of traces
– use as complete model, or as starting point for active testing
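The passive, trace-based direction can be illustrated with the simplest possible learner: folding observed traces into a prefix-tree model. This is only a sketch in the spirit of the approaches above; real miners such as ProM go much further (e.g. by merging states), and the encoding is illustrative:

```python
def learn_prefix_tree(traces):
    """traces: iterable of label sequences observed during testing.
    Returns a nested dict: each node maps an observed label to its subtree."""
    root = {}
    for trace in traces:
        node = root
        for label in trace:
            # create the child on first observation, reuse it afterwards
            node = node.setdefault(label, {})
    return root
```

Two observed runs ?dime·!coffee and ?dime·!tea fold into one model with a shared ?dime edge and two output branches.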
80
Wireless Sensor Network: Passive Learning with ProM
81
TBM: Use of Learned Models
• No use for testing of SUT from which it was learned
• But for
– understanding, communication
– analysis, simulation, model-checking
– regression testing
– testing of re-implemented or re-factored system
– legacy replacement
– testing w.r.t. a reference implementation
82
83
Thank You !