Testing and Monitoring at Penn
An Integrated Framework for Validating Model-based Embedded Software
Li Tan, University of Pennsylvania
September 2003
Outline
1. Motivations
2. Overview of our approach
3. Model-based monitoring
   1. Monitoring hybrid automata
   2. From logic specification to model-based monitor
   3. Model instrumentation
4. Model-based testing
   1. Creating and refining model-based testers
5. Case study on the SONY AIBO dog
   1. Design-level validation
   2. "On-board" validation
6. Conclusion
Motivations
1. Implementing state-of-the-art validation techniques is a painful and costly process.
   1. Validation requires special instruments/programs.
   2. The details/interfaces of the targeted tools are not always available.
2. There is no integrated solution covering both design-level and implementation-level validation.
   1. It is difficult to relate the results of design-level validation to those of implementation-level validation.
3. Bringing formal-methods techniques down to the implementation level is challenging.
   1. Validation directly on the target hardware platform is much needed when designing model-based embedded software.
The outline of our approach
Goal: introduce state-of-the-art validation techniques into model-based software design using existing tools and techniques.
Solution: a model-based validation technique for model-based software design.
A four-step plan:
1. Synthesize a model-based monitor from the logic specification.
2. Create a model-based tester from the testing criteria.
3. Design-level validation: simulate the self-monitoring and self-testing model.
4. Implementation-level validation: generate self-monitoring and self-testing code for the target hardware from the composed model.
An overview
[Toolchain diagram: the MEDL specification feeds the monitor synthesizer, which produces a monitor model; the system model passes through model instrumentation to become an instrumented model; coverage criteria and environment constraints define a tester model, checked by the coverage checker. The composed model is run on the simulator for design-level validation, and the code generator turns it into self-testing and self-monitoring code for implementation-level validation.]
The details of our plan
1. Modeling language: Charon for hybrid systems. The Charon toolkit has
   1. a simulator;
   2. a code generator targeting C++.
2. Model-based testing and monitoring.
   1. Testing works well at the implementation level; it needs to be extended to the model level.
   2. Runtime verification checks the execution of a software system; it needs to be extended to hybrid systems.
3. SONY AIBO robot dog: a hands-on example.
   1. The controller for its head is generated from a Charon model.
Runtime verification
Runtime verification (monitoring) checks an execution of a program against its temporal-logic specification.
[Diagram: the MEDL compiler turns the MEDL specification into a monitoring script for the monitor; program instrumentation turns Java programs into Java programs + filter, which send events to the monitor.]
Monitoring hybrid automata
[Diagram: the Medl2Charon monitor synthesizer turns the MEDL specification into a monitoring automaton; model instrumentation attaches a filter to the system automaton; the instrumented model (system automaton + filter + monitoring automaton) is run on the simulator.]
Hybrid automata
A hybrid automaton A = {S, V, T, G, W, D, A, I, s0} extends an EFSM {S, V, T, G, W, s0} with continuous behaviors:
1. S is the set of modes.
2. V is the set of variables.
3. T ⊆ S × S is the set of transitions.
4. G assigns each t ∈ T a guard, a predicate over V.
5. W assigns each t ∈ T an assignment to a subset of V.
6. D assigns each s ∈ S a set of differential equations over V.
7. A assigns each s ∈ S a set of algebraic equations over V.
8. I assigns each s ∈ S an invariant, a predicate over V.
9. s0 ∈ S is the initial mode.
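To make the structure concrete, here is a minimal C++ sketch of the definition above; the type and field names are illustrative and do not reflect Charon's internal representation.

```cpp
#include <functional>
#include <map>
#include <string>
#include <vector>

using Valuation = std::map<std::string, double>;        // the variables V
using Predicate = std::function<bool(const Valuation&)>;
using Update    = std::function<void(Valuation&)>;

struct Mode {
    Predicate invariant;                                 // I(s)
    std::function<Valuation(const Valuation&)> flow;     // D(s) and A(s)
};

struct Transition {
    int src, dst;                                        // t ∈ T ⊆ S × S
    Predicate guard;                                     // G(t)
    Update assignment;                                   // W(t)
};

struct HybridAutomaton {
    std::vector<Mode> modes;                             // S
    std::vector<Transition> transitions;                 // T
    Valuation vars;                                      // V
    int initialMode = 0;                                 // s0
};
```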
An example: the dog-head controller
[Hybrid automaton diagram: the controller switches between scanning and tracking. Guards include x ≥ 45, x ≤ −45, ν ≤ 10, ν > 10, and |θ − x| ≤ 10 vs. |θ − x| > 10; the tracking dynamics follow ẋ = k(θ − x). Legend: ν is the visibility of the ball, x is the angle of the head, θ is the angle of the ball.]
Hybrid automata can be composed concurrently and hierarchically:
1. A location can be a collection of sub-locations.
MEDL: expressing your properties
1. MEDL (Meta Event Definition Language) is a linear interval temporal logic for specifying safety properties.
   1. MEDL was originally introduced for monitoring Java programs in the MaC (Monitoring and Checking) system [KKL01].
2. Syntax: defined over conditions, events, and expressions.
   C := defined(C) | [E, E) | ¬C | C && C | C || C | Q ~ Q
   E := e | start(C) | end(C) | E || E | E && E | E when C
   Q := time(E) | c | Q ⊕ Q
   where e is a primitive event, c is a constant, ~ ∈ {>, <, =}, and ⊕ ∈ {*, /, +, −}.
(Informal) MEDL semantics
Interpreting MEDL on runs of hybrid automata:
1. A condition C maps each time period to true, false, or undefined. If C is
   1. [E1, E2): C is true from event E1 until E2 (exclusive);
   2. C1 && C2: C is true when C1 and C2 are both true;
   3. C1 || C2: C is true when either C1 or C2 is true;
   4. ¬C1: C is the dual of C1.
2. An event E maps each time instant to true or false. If E is
   1. start(C): E occurs at the time C becomes true;
   2. end(C): E occurs at the time C ceases to be true;
   3. E1 || E2: E occurs when either E1 or E2 occurs;
   4. E1 && E2: E occurs when both E1 and E2 occur;
   5. E1 when C: E occurs when E1 occurs and C is true.
3. Q is an expression. If Q is
   1. time(E): Q's value is the latest time E occurred;
   2. Q1 ⊕ Q2: Q's value is Q1 ⊕ Q2.
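As a concrete reading of these rules, the following C++ sketch updates a [E1, E2) condition and a start(C) event. It is a minimal illustration of the semantics, not the MaC implementation.

```cpp
// Conditions are three-valued: undefined until the first defining event.
enum class Cond { True, False, Undef };

struct IntervalCond {                      // C = [E1, E2)
    Cond value = Cond::Undef;
    void onE1() { value = Cond::True; }    // C holds from E1 ...
    void onE2() { value = Cond::False; }   // ... until E2 (exclusive)
};

struct StartEvent {                        // E = start(C)
    Cond prev = Cond::Undef;
    bool step(Cond current) {              // fires exactly when C becomes true
        bool fired = (current == Cond::True && prev != Cond::True);
        prev = current;
        return fired;
    }
};
```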
An example
If the dog loses the ball more than 50 seconds after the ball becomes visible, an alarm should be raised.

Begin
  import event isVisible, isInvisible, track, lost;
  condition visible = [isVisible, isInvisible);
  event becameTruelost = lost when visible;
  alarm lostTrack = start(time(becameTruelost) - time(isVisible) > 50);
End

isVisible, isInvisible, track, and lost are primitive events: isVisible (isInvisible) occurs when ν > 10 becomes true (false), and lost (track) occurs when |θ − x| > 10 becomes true (false). An alarm is an event that indicates a violation of the safety requirement.
From spec to monitor
The monitoring hybrid automaton A is synthesized from its MEDL specification S:
1. For each event E in S,
   1. a variable VE in A records the latest time E occurred.
2. For each condition C in S,
   1. a variable VC in A records the current value of C;
   2. a variable VC⁻ in A records the previous value of C;
   3. a variable VCl in A records the latest time C changed.
3. For each expression Q in S,
   1. a variable VQ in A records the value of Q.
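Applied to the lostTrack example above, the variable scheme might look like the following C++ sketch. The field names are hypothetical; the actual synthesizer emits a Charon automaton, not C++.

```cpp
struct MonitorState {
    double V_isVisible = -1, V_becameTruelost = -1; // latest occurrence times
    bool   V_visible = false, V_visible_prev = false;
    double V_visible_change = -1;                   // latest time C changed
    double V_lostTrack = -1;                        // alarm occurrence time

    void step(bool isVisible, bool isInvisible, bool lost, double t) {
        V_visible_prev = V_visible;
        if (isVisible)   { V_isVisible = t; V_visible = true; }
        if (isInvisible) { V_visible = false; }
        if (V_visible != V_visible_prev) V_visible_change = t;
        if (lost && V_visible) V_becameTruelost = t;    // lost when visible
        // alarm lostTrack = start(time(becameTruelost) - time(isVisible) > 50)
        if (V_becameTruelost >= 0 && V_isVisible >= 0 &&
            V_becameTruelost - V_isVisible > 50 && V_lostTrack < 0)
            V_lostTrack = t;
    }
};
```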
The monitor-synthesizing algorithm (1)
The translation is highly modularized:
- Each condition, expression, and event is translated into its own automaton.
- Each automaton has a token ID Pc reflecting its syntactic order in the MEDL script.
- Each automaton is enabled only when the token is passed to it (P = Pc).
- t records the occurrence time of the primitive event being processed.
[(a) The automaton for C = ¬C1]
The monitor-synthesizing algorithm (2)
[(b) The automaton for C = [E1, E2); (c) the automaton for E = start(C); (d) the automaton for Q = time(E)]
The monitor-synthesizing algorithm (3)
The monitor is the composition of the engine automaton and the automata for events, conditions, and expressions. The engine automaton checks for an incoming event and initializes the token:
- Engine: P = 0 ∧ newEvent = true? newEvent := false, P := 1
- Last automaton (Cn): P = n? P := 0
[Diagram: the token flows P0 → P1 → … → Pn from the engine through automaton e1 … automaton Cn and back to the engine (P = 0) on newEvent.]
Alarm/event detection is indicated by a value change on the event variable: VE records the time E occurs in the model.
[Diagram: engine automaton ∥ event, condition, and expression automata = monitoring automaton]
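In software terms, the token-passing scheme amounts to running the per-construct updates in syntactic order whenever a new primitive event arrives. A minimal C++ sketch of that control flow (the structure is assumed; the real monitor is a Charon automaton):

```cpp
#include <cstddef>
#include <functional>
#include <vector>

struct MonitorEngine {
    // One update function per condition/event/expression automaton,
    // stored in syntactic order (index = token ID Pc).
    std::vector<std::function<void(double)>> automata;

    // Engine: on a new primitive event, pass the token P = 0..n-1 in turn;
    // after the last automaton the token returns to the engine (P := 0).
    void onNewEvent(double t) {            // t: occurrence time of the event
        for (std::size_t P = 0; P < automata.size(); ++P)
            automata[P](t);                // automaton Pc runs only when P = Pc
    }
};
```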
Model Instrumentation (1)
The monitor observes primitive events emitted by the system model; events are emitted via shared variables.
(Option I) Model modification: event assignments are attached directly to the model's transitions.
[Instrumented controller diagram: the guards are augmented with event emissions, e.g.
  ν > 10? VisVisible := t, newEvent := true
  ν ≤ 10? VisInvisible := t, newEvent := true
  |θ − x| ≤ 10? Vtrack := t, newEvent := true
  |θ − x| > 10? Vlost := t, newEvent := true
alongside the original guards x ≥ 45 and x ≤ −45 and the dynamics ẋ = k(θ − x).]
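A C++ rendering of one instrumented transition may help; this is a hand-written sketch of the pattern (the variable names mirror the slide, but the generated code will differ):

```cpp
#include <cmath>

// Shared variables read by the monitor.
struct SharedVars {
    double VisVisible = -1, VisInvisible = -1, Vtrack = -1, Vlost = -1;
    bool   newEvent = false;
};

// The instrumented transition: guard |θ − x| > 10 augmented with the
// event emission Vlost := t, newEvent := true.
void onLost(SharedVars& s, double theta, double x, double t) {
    if (std::fabs(theta - x) > 10) {
        s.Vlost = t;        // record the occurrence time of `lost`
        s.newEvent = true;  // wake the monitor's engine automaton
    }
}
```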
Model Instrumentation (2)
(Option 2) Model augmentation: an observer automaton (a filter) is concurrently composed with the model, so the structure of the model is not changed.
[Filter diagram: the filter watches ν, θ, and x and emits the same events as Option I:
  ν > 10? VisVisible := t, newEvent := true
  ν ≤ 10? VisInvisible := t, newEvent := true
  |θ − x| ≤ 10? Vtrack := t, newEvent := true
  |θ − x| > 10? Vlost := t, newEvent := true
while the system automaton keeps its original guards and dynamics.]
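For contrast with Option I, here is a sketch of the filter as a separate observer, reusing the SharedVars type from the previous sketch (all names hypothetical):

```cpp
#include <cmath>

// The filter polls the model's variables and emits the same events as
// Option I; the system automaton itself is left untouched.
struct Filter {
    bool wasVisible = false, wasTracking = false;
    void step(SharedVars& s, double nu, double theta, double x, double t) {
        bool visible  = nu > 10;                     // ν > 10
        bool tracking = std::fabs(theta - x) <= 10;  // |θ − x| ≤ 10
        if (visible != wasVisible) {
            (visible ? s.VisVisible : s.VisInvisible) = t;
            s.newEvent = true;
        }
        if (tracking != wasTracking) {
            (tracking ? s.Vtrack : s.Vlost) = t;
            s.newEvent = true;
        }
        wasVisible = visible; wasTracking = tracking;
    }
};
```

The trade-off: Option 2 keeps the model's structure intact, at the cost of an extra concurrent component that must observe every relevant state change.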
Modeling the testing task
The tester model plays the role of a virtual environment that supplies the test trace.
[Diagram: coverage criteria and environment/hardware constraints drive the construction of the tester model; the environment model is composed with the system model and run on the simulator; the coverage checker either accepts the resulting test trace (yes) or refines the tester model further (no).]
Modeling the tester: an example (1)
Testing requirements:
1. Testing should cover all locations in the system model.
2. Testing should check the dog's behavior when it loses the ball.
Step I: model the environment as a non-deterministic hybrid automaton.
[Environment automaton diagram (partially reconstructed): initialization true? θ := 20, t := 0; the ball trajectory follows d + a·t² + b·sin(20·t) with invariant t ≤ 1; non-deterministic self-loops perturb the parameters (true? d := d ± 10, true? b := b ± 0.1, true? a := a ± 0.005) and reset t (true? t := 0).]
Modeling the tester: an example (2)
Step II: select a simulation trace as the test case.
[Plot: the trajectory d + a·t² + b·sin(20·t) against time.]
Step III: determinize the environment model for the test case.
[Determinized automaton (partially reconstructed): u = 10? θ := 20, t := 0, a := 0.025, b := 0.2, d := 90; the trajectory dynamics as above, with a clock u (u̇ = 1, 0 ≤ u ≤ 10) scheduling the jump.]
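The two steps can be pictured with a small C++ sketch. The trajectory formula and the ± perturbations are reconstructed from the garbled slide and should be read as assumptions:

```cpp
#include <cmath>
#include <random>

// Step I: non-deterministic environment; the ball trajectory is assumed
// to be d + a*t*t + b*sin(20*t) on each segment t ∈ [0, 1].
struct Environment {
    double a = 0.025, b = 0.2, d = 90;   // the values fixed in Step III
    std::mt19937 rng{std::random_device{}()};
    void perturb() {                     // one non-deterministic jump
        std::bernoulli_distribution coin;
        d += coin(rng) ? 10 : -10;
        b += coin(rng) ? 0.1 : -0.1;
        a += coin(rng) ? 0.005 : -0.005;
    }
    double ball(double t) const { return d + a*t*t + b*std::sin(20*t); }
};

// Step III: the determinized tester never calls perturb(), so it replays
// exactly one trace with the fixed parameters a = 0.025, b = 0.2, d = 90.
```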
Design-level validation
The composition of the instrumented model, the tester, and the monitor forms a self-validating Charon model. The tester supplies the test trace during simulation, and the occurrence of an event is indicated by the values of the event variables during simulation.
[(a) Alarm detection]
Design-level validation (continued)
[(b) Primitive events emitted by the instrumented model; (c) the simulation trace of the monitor]
Implementation-level validation
[Diagram: the monitoring automaton, the system model, and the testing automaton are compiled modularly; the resulting monitor, generated code, and tester are then linked as needed.]
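One way to picture "modular compilation, link as needed" is the following C++ harness sketch. It is entirely hypothetical; the slides do not show the Charon code generator's actual interfaces.

```cpp
// Each compiled unit exposes one step function; the harness links only
// the components needed for a given validation run.
struct Component { virtual void step(double t) = 0; virtual ~Component() {} };

struct SystemCode  : Component { void step(double) override { /* generated from the system model  */ } };
struct MonitorCode : Component { void step(double) override { /* generated from the monitor model */ } };
struct TesterCode  : Component { void step(double) override { /* generated from the tester model  */ } };

int main() {
    SystemCode sys; TesterCode tst; MonitorCode mon;
    Component* linked[] = { &tst, &sys, &mon };  // "link as needed"
    for (double t = 0; t < 50; t += 0.01)        // fixed-step execution loop
        for (Component* c : linked) c->step(t);
}
```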
Implementation-level validation (continued)
The alarm is used to call external functions that report the error:
1. A "play" function is called when the alarm lostTrack is detected.
[Table: the space overhead of the tester and monitor.]
Conclusion
We proposed a framework for testing and monitoring model-based embedded systems.
1. The approach works directly on models.
   1. Monitoring and testing tasks are specified in the high-level modeling language.
   2. It requires no changes to the simulator, etc.
2. The framework serves both design-level and implementation-level validation.
   1. The results of implementation-level validation can be linked back to design-level validation.
3. It produces self-testing and self-monitoring code for the embedded system.
   1. The monitor and tester are executable on the target hardware platform.
   2. Validation is done directly "on board".
Thank you!