For decades, a variety of validation methods have been developed [14]. In 1980, Balci and Sargent reviewed the literature on model validation and counted 125 references [2]; by 1984 the number had increased to 308 [3]. These model validation methods can generally be classified as objective or subjective. At the same time, the idea of utilizing expert systems for the validation of simulation models was proposed, and several expert systems were applied in particular areas. In these expert systems, researchers mainly focused on implementing classical objective validation methods, such as statistical analysis, and validation tasks were carried out as comparison and goodness-of-fit testing procedures. Birta and Ozmizrak's system was a typical rule-based system; it was designed around a validation knowledge base (VKB), a collection of credible propositions about simulation outputs that served as the validity criterion [5]. Findler and Mazur's system, in contrast, was essentially case-based. The authors pointed out that five typical categories of errors occur in simulation models (SM) of large, complex systems, and their verification and validation system was designed mainly around these error cases [6]. Hopkinson and Sepulveda's system provided a method for real-time evaluation of a trainee's performance; it used case-based reasoning to evaluate the trainee's decisions during simulation execution, with the advantage that the evaluation could be performed automatically [7].
One disadvantage of these validation systems was their heavy dependence on the availability of measured data from the real system. This prerequisite cannot always be satisfied for the validation of complex simulation systems, which makes direct comparison between simulation outputs and real system behavior impossible [8].
Since the 1990s, simulation models in research have become more and more complex in both structure and behavior, and heavily dependent on simulation conditions. Traditional validation methods, such as statistical analysis and simulation error analysis, cannot be applied directly.
At the same time, the validation of complex simulation models can be exhausting and time-consuming. For these reasons, research on utilizing expert systems for the validation of simulation models attracted attention again. In 1999, SIMVAL reviewed most of the simulation model verification and validation methods that had been proposed, analyzed the requirements of V&V methods and tools, and concluded that developing knowledge-based validation systems is a promising scheme [9]. In 2001, Goalwin et al. discussed the possibility of automated support tools for verification, validation, and accreditation; they pointed out that most VV&A tasks were still accomplished manually and that it is necessary to develop automatic validation tools [10].
In this paper, a more sophisticated knowledge-based system is developed for the validation of complex simulation models. First, complicated simulation behavior, which is hard to validate with traditional methods, is abstracted and classified into five types, and simulation analysis and validation methods are developed for each. Second, besides measured data, domain knowledge about the real system is utilized for the validation task; this domain knowledge ranges from classical theorems to experience about the dynamic behavior of the real system. Third, the knowledge system is designed independent of the content of the domain knowledge, which means that it can be applied to different kinds of validation tasks given a proper domain knowledge base.
The structure of this paper is as follows. In Section 2, an overview of the knowledge-based validation method is introduced. In Section 3, the output behavior of complex simulation models is classified into five categories, and knowledge-based simulation analysis and validation methods are given for each kind of simulation behavior. In Section 4, the implementation of a knowledge-based validation system is introduced. Finally, the application of this method, together with an example, is described.
2. Overview of knowledge-based validation method
In practice, the experience of domain experts is often used for model validation when there is not enough measured data about the real system. The idea of knowledge-based validation is rooted in this fact.
Besides measured data and experiential knowledge, classical theorems and formulas about the dynamics of the real system can also be utilized for the validation of simulation models. All of this information, termed domain knowledge, serves as the reference for validity judgment. In other words, domain knowledge describes the valid characterization of the simulation output, i.e. what kind of dynamic features it should exhibit.
Domain knowledge is central to this method. Forrester categorized domain knowledge in simulation as numerical, written and mental types [11]. We adopt this taxonomy; the content and source of each kind of domain knowledge are listed in Table 1.
Of the three kinds of reference information, mental information is often the most abundant, especially for complex simulation model validation. More details about domain knowledge and its acquisition can be found in Wang and Min [12].
The principle of the knowledge-based validation method is displayed in Fig. 1.
First, besides domain knowledge, two other kinds of knowledge are used: inference knowledge and task knowledge. The content and characteristics of each kind of knowledge are listed in Fig. 1. The three kinds of knowledge are abstracted and described in a knowledge model.
Second, the knowledge-based validation system is developed on the basis of the inference and task knowledge, while the knowledge base is designed from the domain knowledge.
Third, the knowledge system carries out each validation task automatically after the proper domain knowledge is loaded. This method addresses the difficulties in the validation of complex simulation models in several aspects:
F.-Y. Min et al. / Simulation Modelling Practice and Theory 18 (2010) 500–515
(1) Experiential curves and data about the real system are acquired from domain experts, and this kind of knowledge can be used for model validation in place of measured data.
(2) Traditional validation methods mainly concentrate on the validity of continuous dynamics and discrete events. In complex simulation models, the relationships among different behaviors and the aggregates of multiple behaviors can be very complicated and can affect the credibility of the simulation result. We propose a domain knowledge-based method for the analysis and validation of these behavioral characterizations in complex simulation models; see Sections 3.3 and 3.4.
(3) The knowledge system is designed independent of the application domain, which means that it can be utilized for different validation tasks with a suitable domain knowledge base; see Section 4.
(4) The knowledge system can accomplish most of the validation task automatically, which removes, to a certain extent, the exhausting and resource-consuming nature of the validation of complex simulation models.
3. Validation methods for complex simulation models based on domain knowledge
R.G. Sargent observed that operational validity methods can be classified as comparisons of output behavior and explorations of model behavior [4]. Most of the methods mentioned there, such as mathematical or statistical analysis, the Turing test and sensitivity analysis, are in essence based on behavioral consistency analysis of simulation outputs.
In traditional research, the behavior of simulation models ranges from continuous dynamics to discrete events. In practice, we find that there are other types of output behavior unique to complex simulation models, such as the logic and timing dependence between behaviors, and aggregative behavior comprising several behaviors. These kinds of behavior can heavily influence the credibility of the simulation result. In this research, we classify the behavior of complex simulation models into five categories, as listed in Table 2.
Table 1
Category of domain knowledge in simulation model validation.

Type      | Content                                                                                                                              | Knowledge source
Numerical | Measured data about state variables; measured curves of continuous dynamics; statistical probabilities of random discrete events     | Precise descriptions in the form of data, trajectories, charts, etc., found in various special and general databases and in observed data and curves of the real system
Written   | Descriptions of behavioral relationships; descriptions of hybrid dynamic sequences; descriptions of conditional discrete events      | Explicit descriptions of the physical properties of the real system and the simulation scenario, drawn from classical literature, usually in the form of theories, principles, theorems and formulas
Mental    | Experiential curves of continuous dynamics; experiential data about key indices/variables; experiential probabilities of random discrete events | Ambiguous descriptions from the experience and mental impressions of experts, often incomplete, informal and biased
Fig. 1. Principle of knowledge-based validation method of complex simulation models.
3.1. Classical validation methods
3.1.1. Continuous dynamic and its validation
Among the intricate behaviors of complex systems, discrete events and continuous dynamics are basic: all the behavior of complex simulation models can be decomposed into aggregates of discrete events and continuous dynamics.
When a continuous dynamic takes place, some variables change gradually as time evolves, and no discrete event happens during the time interval.
If there is enough measured data about the real system, objective analysis methods can be utilized. The most common objective methods are based on mathematical error norms, such as the correlation coefficient and the Theil inequality coefficient.
The correlation coefficient method can be used to detect a linear relationship between the simulated and the measured data [13]. The correlation coefficient is defined as:

$$cc = \frac{n\sum_{i=1}^{n} x_i x'_i - \sum_{i=1}^{n} x_i \sum_{i=1}^{n} x'_i}{\sqrt{n\sum_{i=1}^{n} x_i^2 - \left(\sum_{i=1}^{n} x_i\right)^2}\,\sqrt{n\sum_{i=1}^{n} {x'_i}^2 - \left(\sum_{i=1}^{n} x'_i\right)^2}} \quad (1)$$
The Theil inequality coefficient (TIC) method can be used to detect the degree of fit between the simulated and measured data [14]. TIC is defined as:

$$TIC = \frac{\sqrt{\sum_{i=1}^{n}\left(x_i - x'_i\right)^2}}{\sqrt{\sum_{i=1}^{n} x_i^2} + \sqrt{\sum_{i=1}^{n} {x'_i}^2}} \quad (2)$$
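As an illustration, Eqs. (1) and (2) can be computed directly from paired samples. The following pure-Python sketch (the function names are ours, not the authors') assumes the simulated and measured series have equal length:

```python
import math

def correlation_coefficient(x, x_prime):
    """Pearson correlation between measured and simulated series, Eq. (1)."""
    n = len(x)
    sx, sxp = sum(x), sum(x_prime)
    sxx = sum(v * v for v in x)
    sxpxp = sum(v * v for v in x_prime)
    sxxp = sum(a * b for a, b in zip(x, x_prime))
    num = n * sxxp - sx * sxp
    den = math.sqrt(n * sxx - sx ** 2) * math.sqrt(n * sxpxp - sxp ** 2)
    return num / den

def theil_inequality(x, x_prime):
    """Theil inequality coefficient, Eq. (2): 0 means perfect fit."""
    num = math.sqrt(sum((a - b) ** 2 for a, b in zip(x, x_prime)))
    den = math.sqrt(sum(a * a for a in x)) + math.sqrt(sum(b * b for b in x_prime))
    return num / den
```

For identical series the correlation coefficient is 1 and TIC is 0; TIC grows toward 1 as the fit degrades.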
Some other error norms can also be utilized, such as the expectation of error $I_1 = \frac{1}{N}\sum_{t=1}^{N} e_t^2$, the sum of squared errors $I_2 = \sum_{t=1}^{N} e_t^2$, the relative error norm $I_3 = \sum_{t=1}^{N} \left(e_t/x_t\right)^2$ with the associated fitness $f(e_t) = 1/(1+I_1)$, the maximum error $I_4 = \max(e_t)$, and time integrals of the error.
Besides these, there are other objective analysis methods, such as frequency-domain coherence analysis [15] and time series methods [16].
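A sketch of these error norms, with $e_t$ taken as the pointwise difference between simulated and measured values (the helper name and the normalization of $I_3$ by the measured value are our reading, not necessarily the authors' exact definitions):

```python
def error_norms(x_sim, x_meas):
    """Compute I1 (mean squared error), I2 (sum of squared errors),
    I3 (sum of squared relative errors, measured values must be nonzero)
    and I4 (maximum absolute error) for two equal-length series."""
    e = [a - b for a, b in zip(x_sim, x_meas)]
    n = len(e)
    I2 = sum(v * v for v in e)
    I1 = I2 / n
    I3 = sum((v / m) ** 2 for v, m in zip(e, x_meas))
    I4 = max(abs(v) for v in e)
    return I1, I2, I3, I4
```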
3.1.2. Key index/variable and its statistical analysis method
In traditional validation, statistical methods mainly concentrate on the values of special variables at certain time points in terminating simulations [10]. In this research, we term these special variables key variables and indices.
Statistical methods are the most common validation techniques for key variables and indices in the literature [1,2,10]. Most of these statistical analysis methods operate on multiple replicas of specific variables, and require both real measured and simulated data. Hypothesis testing and confidence intervals are of this category.
The application of statistical methods depends on the availability of measured data [10]. If there is enough real data, the classical two-sample t-test can be used. If there is only experiential probability, Bayesian statistics can be applied, though more work remains to be done in this field [17].
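A minimal sketch of the classical two-sample t test on replicated key-variable observations (pooled-variance form; names are ours, and comparing the returned statistic against a critical value for the chosen significance level is left to the analyst):

```python
import math

def two_sample_t(sample_a, sample_b):
    """Pooled-variance two-sample t statistic for comparing the mean of a
    key variable across simulation replicas and measured data.
    Returns (t, degrees_of_freedom)."""
    na, nb = len(sample_a), len(sample_b)
    ma = sum(sample_a) / na
    mb = sum(sample_b) / nb
    va = sum((v - ma) ** 2 for v in sample_a) / (na - 1)
    vb = sum((v - mb) ** 2 for v in sample_b) / (nb - 1)
    # Pooled variance estimate, then the standard two-sample t statistic.
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    t = (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))
    return t, na + nb - 2
```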
3.1.3. Discrete event and its validation
Discrete events are another kind of basic behavior of complex simulation systems. When a discrete event takes place, some variables may jump to new values instantaneously while other variables remain unchanged. Discrete events can be categorized as external events, random events and conditional events according to the type of precondition and the change of state variables, as listed in Table 3.
The validation of a discrete event is to ensure that it behaves as described in the domain knowledge: its precondition must be satisfied when it takes place, and all the variables in the system should change to new values as described.
Table 2
Output behavior of complex simulation systems.

Simulation behavior    | Description
Continuous dynamic     | All related variables vary continuously or keep constant
Discrete event         | Some variables jump to new values instantaneously
Behavioral relationship | Causal, logic and timing relationships between continuous dynamics and discrete events
Aggregative behavior   | Dynamic comprised of both continuous dynamics and discrete events
Key variable and index | Variables and metrics selected by considering the simulation objective
Table 3
Taxonomy of discrete events in complex engineering systems.

Discrete event type | Precondition              | Change of related state variable
External event      | None                      | Given value or random value
Random event        | Given random distribution | Given value or random value
Conditional event   | Some logical conditions   | Given value
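The taxonomy of Table 3 can be mirrored in a small data structure; the class and field names below are illustrative, not part of the authors' system:

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class DiscreteEvent:
    """One row of Table 3: an event with a kind, an optional precondition
    and the change it applies to related state variables."""
    name: str
    kind: str  # "external", "random" or "conditional"
    precondition: Optional[Callable[[dict], bool]] = None  # None for external events
    new_value: Optional[dict] = None  # change of related state variables

# A conditional event: it may fire only when its logical precondition holds.
firing = DiscreteEvent("weapon_firing", "conditional",
                       precondition=lambda state: state["armed"],
                       new_value={"fired": True})
```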
3.2. Experiential knowledge-based validation method
In this section, a validation method based on the experience of domain experts is introduced. This method can be integrated into the knowledge system and used when there is not enough measured data about the real system for model validation.
For decades, plenty of expert-based validation methods have been proposed, such as animation and the Turing test [1,4].
Our experience-based validation essentially automates these validation methods within a computer program. First, the experience of domain experts is elicited and abstracted into experiential curves and parameters. Second, simulation output analysis and validation functions are designed with these curves and parameters. Third, these functions are integrated into the knowledge system for the validation task.
3.2.1. Acquisition and abstraction of experiential curves
In Fig. 2, experiential curves of the dynamics of an exponential decay and a second-order inertia system are displayed. The validation of these two kinds of continuous dynamics is introduced below.
As a first step, the ambiguous impression of the dynamic is elicited from domain experts, and this mental experience is then transformed into a mathematical description.
As the domain experts suggested, the behavior of $s_{ai}$ should follow a curve of exponential decay. The mental description of the dynamic behavior of $s_{ai}$ is then abstracted into mathematical form:

$$F(s_{ai}) = h_1 e^{-h_2 t} + h_3 \quad (3)$$

where $h_1$ is the decay coefficient, $h_2$ is the decay time constant and $h_3$ is the offset of the exponential decay (so that the initial value is $h_1 + h_3$). As illustrated by the expert curve, the characterization of the continuous dynamic can be described by these three parameters. In the same way, we can acquire the mental description and characteristic parameters of the dynamic behavior of $s_{bi}$, the output of a second-order inertia link with step input; its ordinary characteristic parameters are the overshoot, the steady-state time and the number of oscillations, as shown in the figure.
The characteristic parameters of the behavior in Fig. 2 are listed in Table 4.
In a word, the dynamic characterization of an expert curve can be described by certain parameters, and our experience-based behavioral consistency analysis concentrates on these characteristic parameters.
3.2.2. Acquisition and description of characteristic parameters
In this step, the value of each characteristic parameter of sai and sbi is acquired from domain experts.
Fig. 2. Experiential curve of exponential decay and second-order inertia system.
Table 4
The characteristic parameters of the example in Fig. 2.

Behavior                               | Characteristic parameter  | Description
sai, continuous dynamic of variable a  | Decay coefficient         | The gain of the exponential decay, i.e. h1
                                       | Decay time constant       | The coefficient describing the length of decay time, i.e. h2
                                       | Initial value of decay    | The initial state of the decay, h1 + h3 in this case
sbi, continuous dynamic of variable b  | Overshoot of inertia link | The value by which the peak overshoots the unit step signal, rp
                                       | Steady-state time         | The length of time for the system to reach steady state, i.e. ts
                                       | Oscillating times         | The number of oscillations before the system reaches steady state
Domain experts have impressions, from their experience, of the possible value and range of each characteristic parameter, and these experiential data can be acquired with knowledge acquisition techniques [18].
Such experiential data are often obscure and approximate, and are usually denoted with a fuzzy membership function [19]. Fig. 3 shows a normal-distribution-type membership function. For characteristic parameter $h_i$, $i = 1, 2, 3$, its reference value is $\bar{h}_i$. When $h_i = \bar{h}_i$, its membership degree is $\mu(h_i) = 1$; as $h_i$ deviates from $\bar{h}_i$, $\mu(h_i)$ declines. We can therefore analyze the parameter consistency of simulation outputs with the fuzzy membership function:

$$\mu(h'_i) = e^{-\left(\frac{h'_i - \bar{h}_i}{\sigma_i}\right)^2} \quad (4)$$

where $h'_i$ is the value of $h_i$ in the simulation results, $\bar{h}_i$ is the ideal reference value of characteristic parameter $h_i$, and $\sigma_i^2$ denotes the precision requirement of the simulation application. Both $\bar{h}_i$ and $\sigma_i^2$ are acquired from domain experts.
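Eq. (4) translates directly into code; a sketch with hypothetical argument names:

```python
import math

def membership(h_sim, h_ref, sigma):
    """Gaussian membership degree of Eq. (4): 1 when the simulated
    parameter equals the reference value, declining as it deviates."""
    return math.exp(-((h_sim - h_ref) / sigma) ** 2)
```

For example, with a reference value of 2.0 and sigma 0.5, a simulated value of 2.0 yields membership 1, and a deviation of one sigma yields e^(-1).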
3.2.3. Analysis of characteristic parameters in simulation results
Finally, the characteristic parameters in the simulation result are analyzed. Taking the analysis of $F(s_{ai})$ as an example, a least squares estimation algorithm can be utilized.
Run the simulation model and obtain $m$ sampled values of variable $a$ during the continuous dynamic $s_{ai}$: $(t_j, y_j)$, $j = 1, 2, \ldots, m$. The sampled values are then used for the numerical fitting of the behavioral function $F(s_{ai})$, and the deviation function of the fitting is:

$$J(h'_1, h'_2, h'_3) = \sum_{j=1}^{m} \left[y_j - F(s_{ai})(t_j)\right]^2 \quad (5)$$

To get the best fitting value of each parameter, we consider the derivative condition:

$$\frac{\partial J}{\partial h'_k} = 0, \quad k = 1, 2, 3 \quad (6)$$

As $F(s_{ai})$ is nonlinear, it is hard to obtain an analytical solution. Least squares functions provided by MATLAB, such as lsqcurvefit(), nlinfit() and lsqnonlin(), can be utilized.
Finally, the characteristic parameters $h'_i$ in the simulation output are compared with the reference values, and the simulation consistency of $s_{ai}$ is decided. For each parameter $h'_i$, its simulation validity can be obtained with function (4). The analysis of the continuous dynamic $s_{bi}$ can be done in a similar way.
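As a restricted sketch of this fitting step (not the general nonlinear solution the text delegates to MATLAB's lsqcurvefit()): if the offset $h_3$ is known, e.g. taken from the settled tail of the curve, then taking logarithms linearizes Eq. (3) and ordinary linear least squares recovers $h_1$ and $h_2$:

```python
import math

def fit_exponential_decay(t, y, h3):
    """Fit F = h1*exp(-h2*t) + h3 assuming h3 is known.
    ln(y - h3) = ln(h1) - h2*t, so a linear least-squares fit of
    z = ln(y - h3) against t gives slope -h2 and intercept ln(h1)."""
    z = [math.log(v - h3) for v in y]
    n = len(t)
    st, sz = sum(t), sum(z)
    stt = sum(v * v for v in t)
    stz = sum(a * b for a, b in zip(t, z))
    h2 = -(n * stz - st * sz) / (n * stt - st ** 2)  # slope is -h2
    h1 = math.exp((sz + h2 * st) / n)                # intercept is ln(h1)
    return h1, h2
```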
3.3. Knowledge-based validation method of behavioral relationship
Classical validation methods pay much attention to the dynamic characteristics of continuous dynamics and the statistical characteristics of key variables. However, little research has concentrated on simulation validity with respect to the dependence among different behavior segments. We use the term behavioral relationship to denote the logic and timing dependence among continuous dynamics and discrete events.
In complex simulation models, behavioral relationships can be numerous and heavily affect the credibility of the simulation results. For example, in DIS/HLA based simulation, the monitor unit may collect simulation data about an exploding event before the weapon firing. This results from packet loss or network delay, and it is a typical error of timing relationship in this research.
We propose a knowledge-based method for the validation of behavioral relationships and integrate it into the knowledge system. First, the common behavioral relationships involved in complex simulation models are classified and represented with standard expressions. Second, for each kind of behavioral relationship, we develop its validation function in the knowledge system. Third, for each simulation model, domain knowledge about its correct or valid behavioral relationships is collected by analyzing the simulation context. These knowledge segments are then described with the formal expressions and coded into the knowledge base. For each validation task, the corresponding knowledge base can then be used for validation.
Fig. 3. Description of experiential data in fuzzy distribution.
3.3.1. Knowledge representation of behavioral relationships
For a complex system, we denote the sets of its output continuous dynamics and discrete events by $V$ and $E$, respectively.
For each continuous dynamic $s \in V$, its four behavioral factors are the initial value $init(s)$, the behavior function $F(s)$, the restricted space $res(s)$ and the time domain $dom(s)$. $F(s, x)$ is the dynamic property of state variable $x \in X$; it can be a dynamic equation, measured data about the real system, a logic proposition or another kind of knowledge. $res(s)$ is the state space in which the state variable must remain during $s$; it is usually a description of the restrictions imposed on the system by the environment or the simulation scenario.
Take the continuous dynamic $s_{a1}$ in Fig. 2 as an example: we only have some experiential description of its behavior, so its behavioral factors are noted as:

$$F(s_{a1}, a) = \{\text{curve of exponential decay}\}$$
$$res(s_{a1}) = \{a \in A\}$$

As for a discrete event $e \in E$, its three behavioral factors are the change of state variables $new(e)$, the enabling condition $enable(e)$ and the time domain $dom(e)$. $new(e, x)$ is the new value of state variable $x$ after the occurrence of discrete event $e$; it can be a fixed value, an analytical expression, or a statistical distribution. $enable(e)$ states the precondition that triggers discrete event $e$, and is usually described by an appropriate space of related variables.
With the behavioral factors defined above, we can give formal representations for most of the ordinary behavioral relationships involved in complex simulation models, as shown in Table 5.
Generally, there are two kinds of behavioral relationship in complex simulation systems: logic and timing relationships. The former denotes dependence between the values of related variables, and the latter the timing dependence between behavior components. Most behavioral relationships can be abstracted and described with these expressions. Moreover, these expressions can be generalized to relationships among more than two behavioral components.
3.3.2. Implementation of analysis and validation function
For each kind of behavioral relationship, we can develop a simulation output analysis function with a standard calling format. Take "continuous dynamic s enables discrete event e" as an example; its formal expression is

$$F(s, x)(t) \in enable(e), \quad t \in dom(s) \quad (7)$$

which means that the trajectory of variable $x$ enters the enabling space of discrete event $e$ at some time $t$ during continuous dynamic $s$. Its behavioral consistency analysis algorithm is as follows:
Step 1: SET tstep; t0 = s.ftime; tf = s.ltime; flag = 0; th = 0;
Step 2: IF (t0 >= tf) EXIT;
        IF (x(t0) in enable(e)) { th = t0;
            IF (e happens)
                { IF (x(t0 + tstep) in new(e)) flag = 0;
                  ELSE flag = 1; }
            ELSE flag = 2; }
        ELSE IF (e happens) { th = t0; flag = 3; }
        t0 = t0 + tstep;
Step 3: RETURN to Step 2;
Table 5
Ordinary behavioral relationships in complex simulation models.

Behavioral relationship                                          | Knowledge representation
Continuous dynamic s enables discrete event e                    | F(s, x)(t) ∈ enable(e), t ∈ dom(s)
Discrete event e1 enables discrete event e2                      | new(e1) ∈ enable(e2)
Discrete event e disables continuous dynamic s                   | new(e) ∉ res(s)
Discrete event e causes continuous dynamic s                     | new(e) ∈ init(s), dom(e) = s.ftime
Discrete event e1 conflicts with discrete event e2               | enable(e1) ∩ enable(e2) = ∅
Continuous dynamic s1 conflicts with continuous dynamic s2       | res(s1) ∩ res(s2) = ∅
Discrete event e1 synchronizes with discrete event e2            | dom(e1) = dom(e2)
Continuous dynamic s1 synchronizes with continuous dynamic s2    | dom(s1) = dom(s2)
Discrete event e1 is t time later than discrete event e2         | dom(e1) = dom(e2) + t
Continuous dynamic s1 is t time later than continuous dynamic s2 | s1.ftime = s2.ftime + t
e is a periodic discrete event with cycle T                      | dom(e) = {t | t = t0 + kT, k ∈ N}
e is a random discrete event with distribution Pk                | dom(e) = t0 + t, t ~ Pk
There are four kinds of analysis result: flag = 0 means the simulation result is consistent with the domain knowledge; flag = 1 means the value of variable x is wrong after discrete event e; flag = 2 means e does not happen while it is enabled; and flag = 3 means e happens while it is not enabled.
In this way, we can develop simulation analysis and validation functions for all the behavioral relationships in Table 5.
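The stepwise procedure for the "s enables e" relationship can be sketched in Python. All argument names are hypothetical stand-ins: x(t) samples the trajectory, enable(v) and new(v) test membership of the enabling and post-event state spaces, and event_time is the instant at which e was observed to fire (None if it never fired):

```python
def check_enable_relation(x, enable, new, event_time, t0, tf, tstep):
    """Check 'continuous dynamic s enables discrete event e'.
    Returns (flag, th): 0 consistent, 1 wrong post-event value,
    2 enabled but the event never fired, 3 fired while not enabled."""
    t, was_enabled, th = t0, False, None
    while t < tf:
        fired = event_time is not None and abs(t - event_time) < tstep / 2
        if enable(x(t)):
            was_enabled, th = True, t
            if fired:
                # Event fired while enabled: check the post-event value.
                return (0 if new(x(t + tstep)) else 1), th
        elif fired:
            return 3, t  # event fired outside its enabling space
        t += tstep
    # Trajectory entered the enabling space but the event never fired.
    return (2, th) if was_enabled else (0, None)
```

This deviates from the stepwise pseudocode above in one respect: it stops at the first conclusive observation rather than overwriting flag on every step.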
3.3.3. Collection and coding of domain knowledge
For each complex system, we can collect the domain knowledge about all valid behavioral relationships during simulation operation by analyzing the simulation scenario and the physical properties of the real system.
3.4. Knowledge-based validation of aggregative behavior
In this section, we discuss the validation of aggregative behavior with written or numerical knowledge. The term aggregative behavior is used to characterize the macroscopic property of behavior comprising several continuous dynamics and discrete events in complex simulation models.
During a simulation run, some simulation agents should follow a given behavior procedure. The behavior procedure is often defined in the simulation scenario or prescribed by the agent's destination. For example, there are usually plenty of activities during the flight simulation of an aircraft, such as taking off, preparing for combat, attacking and evading. These activities should be executed in a certain order according to the battle situation, and the correctness of the behavior order can have a significant influence on the simulation result.
On the other hand, there are often plenty of uncertainties and randomness during the operation of a simulation, and different runs may produce very different behavior procedures in the output. The traditional methods cannot be applied directly to analyze the validity of simulation results with different behavior paths.
We propose to utilize the hybrid dynamic sequence and the discrete event tree for the analysis of these two aspects of aggregative behavior.
3.4.1. Hybrid dynamic sequence and its validation
A hybrid dynamic sequence is a kind of aggregative behavior comprising given discrete events and continuous dynamics. For example, the hybrid dynamic sequence of variable $a$ in Fig. 2 is denoted as:

$$h_a = s_{a1} e_1 s_{a2} e_2 s_{a3} \ldots \quad (8)$$

For the analysis and validation of $h_a$, we should collect the domain knowledge about the behavioral factors of both the related continuous dynamics and discrete events. This knowledge can be described with the formal expressions of Section 3.3.
Intuitively, we can compare the behavioral characterization of both the continuous dynamics and the discrete events with the corresponding domain knowledge. The domain knowledge about the hybrid dynamic $h_a$ can be represented as:

$$a(t) \in res(s_{ai}) \wedge a(t) \in F(s_{ai}) \quad \text{when } t \in dom(s_{ai})$$
$$a(t) \in enable(e_i) \wedge a(t^+) \in new(e_i) \quad \text{when } t \in dom(e_i) \quad (9)$$

where $res(s_{ai})$, $F(s_{ai})$, $enable(e_i)$ and $new(e_i)$ are the formal expressions of the domain knowledge about the related behavior. The condition $a(t) \in F(s_{ai})$ should be analyzed separately as an individual continuous dynamic; the classical methods discussed in Section 3.1 can be used if there is enough measured data, otherwise the experience-based method should be utilized.
For the analysis of $a(t) \in res(s_{ai})$, $a(t) \in enable(e_i)$ and $a(t^+) \in new(e_i)$, we should check whether the value of the related variable enters the corresponding state space in the domain knowledge. This can be accomplished in the same way as in Section 3.3, with the following analysis algorithm:
Step 1: SET tstep; t0 = s.ftime; tf = s.ltime; i = 0; flag[] = 0; th[] = 0;
Step 2: IF (t0 >= tf) EXIT;
        IF (a(t0) in enable(ei))
            { IF (ei happens)
                { IF (a(t0 + tstep) in new(ei)) flag[i] = 0;
                  ELSE { flag[i] = 1; th[i] = t0; } }
              ELSE { flag[i] = 2; th[i] = t0; }
              IF (flag[i] = 0) i += 1; }
        ELSE
            { IF (a(t0) in res(sai)) flag[i] = 3;
              ELSE { flag[i] = 4; th[i] = t0; } }
        t0 = t0 + tstep;
Step 3: RETURN to Step 2;
There are five kinds of analysis result: flag[i] = 0 means the ith discrete event is consistent with its domain knowledge; flag[i] = 1 means the value of variable a is wrong after the event; flag[i] = 2 means the discrete event does not happen while it is enabled; flag[i] = 3 means the continuous dynamic after the ith discrete event is consistent with the domain knowledge, while flag[i] = 4 means it is not.
3.4.2. Discrete event tree and its validation method
The term discrete event tree is used to describe all the possible behavior paths that a simulation model may pass through. Let $E_I = \{e_i \in E,\ i = 1, 2, \ldots, n\}$ be the set of discrete events that may take place; the discrete event tree can then be denoted as:

$$Path_I = \{A_1 A_2 \ldots A_n,\ A_i \in \{S, F, O\}\} \quad (10)$$

where $A_i = S$ if $e_i$ occurs, $A_i = F$ if $e_i$ fails, and $A_i = O$ means $e_i$ is not involved in the current path.
For example, there are three possible discrete events $e_1$, $e_2$ and $e_3$ in Fig. 4. If $e_1$ occurs, the possible following discrete event is $e_2$; otherwise, $e_3$ may happen. So there are four possible discrete event paths, i.e. SSO, SFO, FOS and FOF.
For the validation of the discrete event tree, we concentrate on the probability of each event path, and take the same idea as classical statistical validation: for each possible event path, the statistical characteristics of its occurrence should be consistent with the real system.
For each path $P_i \in Path_I$, its probability can be computed iteratively from the conditional probabilities of the related events along the path:

$$pro(P_{i,j}) = pro(e_j \mid P_{i,j-1}) \cdot pro(P_{i,j-1}), \quad j = 1, 2, \ldots, n \quad (11)$$

where $pro(e_j \mid P_{i,j-1})$ is the conditional probability of $e_j$ when $P_{i,j-1}$ occurs, and $pro(P_i) = pro(P_{i,n})$. These conditional probabilities may come from measured data or from experiential probabilities about the real system.
As the probability of each possible event path is computed, the statistical analysis methods of confidence intervals or hypothesis testing can be utilized, as introduced in Section 3.1.
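Eq. (11) is simply a product of conditional probabilities along each path. A sketch for a tree like Fig. 4, where every numeric value is invented purely for illustration (e1 occurs with probability 0.7; given e1, e2 occurs with 0.6; given not-e1, e3 occurs with 0.5):

```python
def path_probability(cond_probs):
    """Eq. (11): multiply the conditional probabilities
    pro(e_j | P_{i,j-1}) along one event path."""
    p = 1.0
    for pj in cond_probs:
        p *= pj
    return p

# The four paths of the example tree; probabilities are illustrative only.
paths = {
    "SSO": path_probability([0.7, 0.6]),  # e1 occurs, then e2 occurs
    "SFO": path_probability([0.7, 0.4]),  # e1 occurs, then e2 fails
    "FOS": path_probability([0.3, 0.5]),  # e1 fails, then e3 occurs
    "FOF": path_probability([0.3, 0.5]),  # e1 fails, then e3 fails
}
```

Because the four paths are exhaustive and mutually exclusive, their probabilities sum to 1, which is a useful sanity check on the knowledge base.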
4. Design and implementation of the knowledge-based validation system
In this part, the implementation of the knowledge-based validation system is introduced. First, an inference structure suitable for the validation of complex simulation models is discussed. Then, the knowledge system is designed based on this discussion.
4.1. Design of inference machine for model validation
Inference machine describes the inferring strategies and knowledge transformation in simulation model validation. By
analysing the context of model validation, we design the proper inference machine, as it shows in Fig. 5. All the validation
methods discussed in part 3 can be implemented in this machine.
Validation inference steps focus on generating new information with static knowledge, i.e. transforming validation tech-
nique into automatic inference. In general, the validation process of complex simulation models is decomposed into five
steps, i.e.Determine,Design,Select,Analyze, andJudge. These inference steps should be executed in given strategy and there
is specific input and output knowledge segment for each inference step.
The function of each inference step is listed below:
Determine: Choose the proper simulation behavior for the current validation task, by analyzing the simulation context, the simulation objective, and the validity weight of each candidate task.
Design: Decide the validation schema and choose the proper analysis and validation method, by analyzing the characterization of the behavior related to the determined validation task.
Select: Select the simulation output data corresponding to the simulation behavior to be analyzed; the related variables, the simulation logic time interval, and the number of simulation replicas are decided here.
Analyze: Analyze the consistency between the determined simulation behavior and the domain knowledge. The related techniques and algorithms are detailed in Section 3.
Judge: Compute the validity index and make a decision, by comparing the behavior consistency with the range of acceptability.
Fig. 4. Example of discrete event tree.
F.-Y. Min et al. / Simulation Modelling Practice and Theory 18 (2010) 500–515
4.2. Implementation of the knowledge-based system
The central issue in the implementation of the knowledge system is the structure of the software. The validation system implements three levels of functions: user interface, inference machine, and validation infrastructure, as shown in Fig. 6.
User interface: Provides the interface between the knowledge system and the validation undertaker.
Inference machine: Executes the inference steps under a specific strategy and responds to user instructions. The inference machine automates the simulation output analysis and validation process; it is the most important part of the knowledge-based validation system. Its main functions include: validation task decision, validation schema design, simulation output selection, behaviour consistency analysis, and validity judgment.
Validation infrastructure: All the supporting data and knowledge for validation are stored here. This information contains the simulation output data, the domain knowledge base, and the specific behaviour consistency analysis algorithms.
This system can be utilized for simulation data analysis and validation; its functions are listed here:
Fig. 5. Inference machine for the validation of simulation models.
Fig. 6. Software structure of knowledge-based validation system.
(1) Automate the validation inference process and analyze the behavior consistency between simulation and real system.
(2) Compute and synthesize validity indices for all the validation nodes among different levels of the hierarchical index
tree.
(3) Manage and maintain domain knowledge; this provides an interface for validation engineers and domain experts.
(4) Manage, view, edit and update information about each validation node in the task tree.
(5) Trace the whole validation process, collect information generated in each inference step, and display validation information for every validation node.
(6) Generate validation conclusion and document validation information.
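Function (2) — synthesizing validity indices across the hierarchical index tree — can be sketched as a recursive weighted aggregation. The tree shape, weights and leaf indices below are illustrative assumptions; the paper does not prescribe this particular aggregation rule:

```python
# Hedged sketch: a parent node's validity index is the weighted average of its
# children's indices, computed recursively from the leaves (whose indices come
# from the Judge inference step) up to the root of the task tree.
def synthesize(node):
    """Return the validity index of `node`, aggregating children if present."""
    if "index" in node:                     # leaf node
        return node["index"]
    total_w = sum(w for w, _ in node["children"])
    return sum(w * synthesize(child) for w, child in node["children"]) / total_w

tree = {"children": [
    (0.6, {"children": [(0.5, {"index": 0.9}), (0.5, {"index": 0.8})]}),
    (0.4, {"index": 1.0}),
]}
print(synthesize(tree))  # 0.6 * 0.85 + 0.4 * 1.0 = 0.91
```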
5. An example problem
5.1. Introduction of simulation system
In this part, we take the simulation model of an electromagnetic rail gun as an example. This model has long been used in our research; it has a multi-entity structure and hybrid dynamic behavior, offering an excellent case for demonstrating the proposed method, see Fig. 7.
The electromagnetic rail gun is a new concept in launching equipment [20]. It consists of two parallel rails connected to a source of DC current, and a projectile sliding between the rails propelled by the Lorentz force. During the acceleration, there are mainly three kinds of forces against the projectile motion: mechanical friction, ablation drag, and air resistance. A key component is the pulse forming network for optimizing the peak current. It contains sets of capacitors, and the scheduling of the switches on the capacitors has an obvious influence on the launching efficiency.
The introduction will mainly concern the validation tasks of the projectile dynamics and the behavioural relationship of the currents in the capacitors. As applications of statistical and quantitative validation methods can be found in many publications, our discussion will mainly focus on the experience and logic knowledge-based validation method.
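The Lorentz propulsion of Table 6 (row B05), F = (1/2) L′ I², can be integrated over the acceleration phase with a simple explicit Euler step. The inductance gradient and the (constant) drive current below are illustrative assumptions chosen only to give a plausible magnitude, not the paper's values:

```python
# Minimal sketch of the inner-ballistic dynamics under F = 0.5 * L' * I^2.
L_PRIME = 0.4e-6          # rail inductance gradient (H/m), assumed
CURRENT = 0.93e6          # drive current (A), assumed constant here
MASS = 0.1                # projectile mass (kg), as in the experiment setup
DT = 1e-6                 # integration step (s)

force = 0.5 * L_PRIME * CURRENT ** 2      # Lorentz force, ~1.7e5 N
v = x = t = 0.0
while t < 1.1e-3:                          # acceleration phase, ~1.1 ms
    v += (force / MASS) * DT               # explicit Euler update of velocity
    x += v * DT                            # and of position
    t += DT
print(f"v = {v:.0f} m/s, x = {x:.2f} m")
```

With these assumed parameters the muzzle velocity comes out on the order of 1.9 km/s, i.e. the same magnitude as the simulation experiment reported in Section 5.4.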
Fig. 7. Sketch and pulse forming network of electromagnetic rail gun.
Table 6
Validation behavior of electromagnetic rail gun simulation system.

Simulation behavior         | Related variable                        | Description of domain knowledge                                                                                          | Index
Switch on                   | Current of rail                         | Conditional event arranged by launching logic                                                                            | A01
Launching efficiency        | Gunpoint velocity, energy in capacitors | Measured data about velocity, measured energy expenditure                                                                | A02
Launching process           | Status of weapon system                 | Target → charge → acceleration → departure → external flight → end                                                       | A03
Start acceleration          | Acceleration                            | Conditional event dependent on Lorentz force                                                                             | B01
Power operational process   | Status of capacitors                    | Recharge → launching instruction → discharge → departure → feedback                                                      | B02
Acceleration process        | Status of rail, projectile velocity     | Initial acceleration → mechanical friction → heat congregating → ablation drag → air resistance → departure              | B03
Inner acceleration dynamic  | Acceleration                            | The acceleration is influenced by many factors                                                                           | B04
Lorentz force dynamic       | Lorentz force                           | F = (1/2) L′ I²                                                                                                          | B05
Mechanical friction dynamic | Friction                                | Proportional to velocity                                                                                                 | B06
Air resistance dynamic      | Air resistance                          | Influenced by velocity                                                                                                   | B07
Ablation dynamic            | Ablation                                | Influenced by inner temperature                                                                                          | B08
Arc discharge               | Rail voltage                            | Conditional event dependent on port voltage                                                                              | B09
Target                      | Position of target                      | External event defined in scenario                                                                                       | C01
Air disturbance             | Disturbance                             | Random event during external flight                                                                                      | C02
Port velocity               | Projectile velocity                     | Measured data about projectile velocity                                                                                  | C03
Miss distance               | Projectile position                     | Measured projectile position                                                                                             | C04
dominant resistance of the projectile. The friction is proportional to the pressure between the projectile and the rail, and it increases as the Lorentz force increases, as shown by the green curve.
The collected domain knowledge, both the measured data about the operation of the real system and the experience of experts, is then coded into the knowledge base.
There are typically two kinds of simulation output analysis and validation functions involved: general analysis functions and experience-based analysis functions. The former are implemented independently of the validation domain and can be used for the analysis of general model behavior; the regular statistical analysis and TIC methods are of this kind. The latter are mainly used for the analysis of experiential knowledge, such as the expert curve-based fitting.
Take the experiential curve of the air resistance (red curve in Fig. 9) as an example. It can be considered as two linear dynamics connected by the discrete event that the velocity of the projectile exceeds the value Vth1. The experiential curve can be denoted as:

F_fp = k1 · t,                          v ≤ Vth1
F_fp = k2 · (t − Vth1/k1) + Vth1,       v > Vth1        (12)

where k1 and k2 are experiential parameters that can be acquired from domain experts. The method for their acquisition and description is discussed in Section 3.2.
All the experience about the inner ballistic dynamics is abstracted in the same way, and the behavior consistency analysis functions can be implemented based on these experiential curves and data.
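A behavior consistency analysis function for the piecewise curve of Eq. (12) might look like the following sketch. The parameter values, tolerance, and sample data are illustrative assumptions:

```python
# Evaluate the experiential piecewise-linear curve of Eq. (12) and score how
# closely simulated (t, v, F) samples follow it. k1, k2, Vth1 are assumed.
K1, K2, VTH1 = 8.0e5, 3.0e5, 400.0      # experiential parameters (illustrative)

def f_fp(t, v):
    """Eq. (12): two linear dynamics joined when velocity exceeds Vth1."""
    if v <= VTH1:
        return K1 * t
    return K2 * (t - VTH1 / K1) + VTH1

def consistency(samples, tol=0.1):
    """Fraction of (t, v, F) samples within tol relative error of the curve."""
    ok = sum(abs(F - f_fp(t, v)) <= tol * max(abs(f_fp(t, v)), 1.0)
             for t, v, F in samples)
    return ok / len(samples)

# Illustrative simulated samples: (time, velocity, air resistance)
samples = [(1e-4, 80.0, 80.0), (5e-4, 390.0, 400.0), (1e-3, 900.0, 560.0)]
print(consistency(samples))  # all samples lie close to the curve
```

The returned fraction plays the role of a behavior consistency score that the Judge step can compare against an acceptability threshold.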
5.4. Step 3: simulation experiment
In this step, simulation experiments are done with the electromagnetic rail gun model, and the simulation outputs of interest are collected. In these experiments, the length of the rail is 3 m, the mass of the projectile is 100 g, and four sets of capacitors are utilized. Some of the simulation output behavior is shown in Figs. 10 and 11.
Fig. 10. Projectile position in inner ballistic (position in m versus time in ms).
Fig. 11. Inner ballistic resistance during acceleration (Fd, Ff and Fp in N versus time in ms).
The simulation outputs for all the validation tasks in Table 7 are collected. As the figures show, the inner acceleration ends at 1.1 ms, and the velocity is about 1899.6 m/s. The launching efficiency is η = Ep/Ec = 180,424 J / 1,903,000 J ≈ 9.5%.
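The reported efficiency is consistent with the projectile kinetic energy at the stated muzzle velocity, Ep = (1/2) m v², using the 100 g mass from the experiment setup:

```python
# Reproduce the launching efficiency from the reported muzzle velocity.
m = 0.1            # projectile mass (kg), from the experiment setup
v = 1899.6         # muzzle velocity (m/s), from the simulation output
E_c = 1_903_000.0  # energy drawn from the capacitors (J)

E_p = 0.5 * m * v ** 2          # projectile kinetic energy
eta = E_p / E_c                 # launching efficiency
print(f"E_p = {E_p:.0f} J, efficiency = {eta:.1%}")  # → E_p = 180424 J, efficiency = 9.5%
```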
5.5. Step 4: validation of electromagnetic rail gun simulation system
After all the required measured data, domain knowledge and simulation outputs listed in Table 6 are collected and loaded into the knowledge system, the simulation analysis and validation tasks are accomplished automatically.
Fig. 12 shows the simulation analysis of the air resistance in the inner ballistic. The simulation behavior of the air resistance is fitted as two linear continuous dynamics connected by the discrete event that the velocity of the projectile exceeds the value Vth1. The analysis of all the simulation behavior in Table 6 can be accomplished in the same way.
The main validation result is the set of validity indices of the tasks in the validation task tree, as shown in Table 8.
Table 7
Part of the domain knowledge about inner resistance for electromagnetic rail gun simulation models.

Validation task           | Simulation behavior                       | Related variable       | Validation method                  | Knowledge description
B06 (mechanical friction) | Continuous dynamic of mechanical friction | Friction/Lorentz force | Behavioral relationship analysis   | The mechanical friction is approximately proportional to the Lorentz force
B07 (air resistance)      | Continuous dynamic of initial air resistance | Air resistance      | Continuous dynamic fitting         | It should keep almost constant at some low value if the velocity is under threshold Vth1
B07 (air resistance)      | Continuous dynamic of latter air resistance  | Air resistance      | Continuous dynamic fitting         | It should increase linearly with slope kar if the velocity is above Vth1
B08 (ablation drag)       | Discrete event of beginning ablation      | Inner rail temperature | Discrete event match               | Projectile begins to ablate as inner rail temperature exceeds threshold T1
B08 (ablation drag)       | Hybrid dynamic of ablation drag           | Ablation drag          | Hybrid dynamic sequence validation | Ablation drag dynamic depends on the state of the ablating event tree
B08 (ablation drag)       | Event tree of ablation                    | Inner rail temperature | Discrete event tree analysis       | The state of the ablating event tree changes as the temperature exceeds two thresholds
Fig. 12. Simulation output analysis of inner air resistance.
[5] L.G. Birta, F.N. Ozmizrak, A knowledge-based approach for the validation of simulation models: the foundation, ACM Transactions on Modeling and Computer Simulation 6 (1996) 76–98.
[6] Nicholas V. Findler, Neal M. Mazur, A system for automatic model verification and validation, Transactions of the Society for Computer Simulation 6 (1990) 153–172.
[7] W.C. Hopkinson, J.A. Sepulveda, Real time validation of man-in-the-loop simulations, in: Proceedings of the Winter Simulation Conference, 1995, pp. 1250–1256.
[8] J.P.C. Kleijnen, Validation of models: statistical techniques and data availability, in: Proceedings of the Winter Simulation Conference, 1999, pp. 647–654.
[9] P. Glasqow, C.M. Parnell, et al., Simulation Validation (SIMVAL), in: Making VV&A Effective and Affordable Mini-Symposium and Workshop, 1999, pp. 34–46.
[10] Patrick W. Goalwin, Jerry M. Feinberg, Pamela L. Mayne, A detailed look at verification, validation, and accreditation (VV&A) automated support tools, in: Proceedings of the 2001 Fall Simulation Interoperability Workshop, Orlando, FL, 2001, 01F-SIW-041.
[11] David N. Ford, John D. Sterman, Expert knowledge elicitation to improve formal and mental models, System Dynamics Review 4 (1998) 309–340.
[12] J.T. Wang, F.Y. Min, Knowledge elicitation and acquisition for simulation validation, in: International Conference on Computational Intelligence and Security, 2007, pp. 85–88.
[13] A.S. White, R. Sinclair, Quantitative validation techniques: a data base (I). Simple example, Simulation Modelling Practice and Theory 12 (2004) 451–473.
[14] D.J. Murray-Smith, Methods for the external validation of continuous system simulation models: a review, Mathematical and Computer Modelling of Dynamical Systems 4 (1998) 5–31.
[15] John C. Morris, Matthew P. Newlin, Model validation in the frequency domain, in: Proceedings of the 34th Conference on Decision and Control, New Orleans, 1995.
[16] Der-Ann Hsu, J. Stuart Hunter, Validation of computer simulation models using parametric time series analysis, in: Proceedings of the 1974 Winter Simulation Conference, 1974, pp. 14–16.
[17] Sigrún Andradóttir, Vicki M. Bier, Applying Bayesian ideas in simulation, Simulation Practice and Theory 8 (2000) 253–280.
[18] G. Schreiber, H. Akkermans, et al., Knowledge Engineering and Management: The CommonKADS Methodology, MIT Press, 2000.
[19] L.A. Zadeh, K.S. Fu, et al., Fuzzy sets and their application to cognitive and decision processes, IEEE Transactions on Systems, Man, and Cybernetics 7 (2) (1977) 122–123.
[20] H.D. Fair, Electric launch science and technology in the United States, in: Proceedings of the 11th Symposium on Electromagnetic Launch Technology, Saint-Louis, 2002.