Linköping Studies in Science and Technology
Licentiate Thesis No. 1588
Towards an Approach for Efficiency Evaluation of
Enterprise Modeling Methods
by
Banafsheh Khademhosseinieh
Department of Computer and Information Science
Linköpings universitet
SE-581 83 Linköping, Sweden
Linköping 2013
This is a Swedish Licentiate's Thesis
Swedish postgraduate education leads to a Doctor's degree and/or a Licentiate's degree.
A Doctor's degree comprises 240 ECTS credits (4 years of full-time studies).
A Licentiate's degree comprises 120 ECTS credits.
Copyright © 2013 Banafsheh Khademhosseinieh
ISBN 978-91-7519-639-8
ISSN 0280-7971
Printed by LiU-Tryck 2013
URL: http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-89883
Department of Computer and Information Science
Linköpings universitet
SE-581 83 Linköping, Sweden
Towards an Approach for Efficiency Evaluation of
Enterprise Modeling Methods
by
Banafsheh Khademhosseinieh
April 2013
ISBN 978-91-7519-639-8
Linköping Studies in Science and Technology
Licentiate Thesis No. 1588
ISSN 0280-7971
LiU-Tek-Lic-2013:22
ABSTRACT
Nowadays, there is a belief that organizations should keep improving different aspects of their
enterprise to remain competitive in their business segment. For this purpose, it is necessary to
understand the current state of the enterprise and to analyze and evaluate it in order to figure
out suitable change measures. To perform such a process in a systematic and structured way,
support from powerful tools is indispensable. Enterprise Modeling is a field that can
support improvement processes by developing models that show different aspects of an
enterprise. An Enterprise Modeling Method is an important support for Enterprise
Modeling. A method comprises different conceptual parts: Perspective, Framework,
Method Component (which itself contains Procedure, Notation and Concepts), and
Cooperation Principles. In an ideal modeling process, both the process and the results are of
high quality. One dimension of quality which is in focus in this thesis is efficiency. The issue
of efficiency evaluation in Enterprise Modeling still seems to be a rather unexploited research
area.
The thesis investigates three aspects of Enterprise Modeling Methods: what is the meaning of
efficiency in this context, how can efficiency be evaluated and in what phases of a modeling
process could efficiency be evaluated. The contribution of the thesis is an approach for
evaluating the efficiency of Enterprise Modeling Methods, which is also grounded in several
case studies. The evaluation approach consists of efficiency criteria that should be met by
(different parts of) a method. While a subset of these criteria always needs to be fulfilled in a
congruent way, fulfillment of the remaining criteria depends on the application case. To help
the user in an initial evaluation of a method, a structure of driving questions is presented.
This work has been supported by School of Engineering, Jönköping University.
Acknowledgements
The work in this thesis was performed within the infoFLOW-2 project, which was funded by
the KK-foundation. infoFLOW-2 was a research project with a relatively large number of
partners: four industrial partners (SYSteam, C-Business, Proton Finishing, CIL Ljungby), one
research institute (Fraunhofer ISST) and one academic partner (Jönköping Tekniska
Högskolan: JTH).
Here I would like to thank the people who helped me throughout my research process. I am
very thankful to my main supervisor Kurt Sandkuhl, and my co-supervisors Ulf Seigerroth
and Sture Hägglund, for their positive attitudes as well as all their valuable comments and
guidance. I am also thankful to the participants of infoFLOW-2. Without their cooperation, it
would not have been possible to receive support from this project for the purpose of this thesis.
My special thanks go to my family for all their encouragement: to my parents for their
patience when I had to spend most of my time in the office, sitting in front of my computer or
lost under books and papers; and to my sister Bahar, who kept motivating me and was
supportive all the way, despite living on another continent.
Thank you all,
Banafsheh Khademhosseinieh
March 2013, Jönköping
Table of Contents
1. Introduction ............................................................................................................................ 1
1.1 Background and Motivation ............................................................................................. 1
1.2 Related Publications by the Author .................................................................................. 4
1.3 Thesis Outline ................................................................................................................... 4
2. Research Method .................................................................................................................... 8
2.1 The Followed Research Approach, Discipline and Method ............................................. 8
2.1.1 Abductive Approach .................................................................................................. 8
2.1.2 Design Science ........................................................................................................... 9
2.1.3 Case Studies ............................................................................................................. 15
2.2 Schematic Overview of the Followed Research Path ..................................................... 16
2.2.1 Background Formulation of Research Questions .................................................... 17
2.2.2 Contribution Evolvement ......................................................................................... 19
3. Theoretical Background & Frame of Reference .................................................................. 21
3.1 Foundational Concepts ................................................................................................... 21
3.1.1 Enterprise Modeling & Enterprise Models .............................................................. 21
3.1.2 From Method to Enterprise Modeling Method ........................................................ 25
3.1.3 Quality & Efficiency ................................................................................................ 30
3.2. Approaches for Quality Evaluation in Enterprise Modeling ......................................... 33
3.2.1 Quality Evaluation of Models .................................................................................. 35
3.2.2 Quality Evaluation of Modeling Processes, Languages, Methods and
Methodologies ................................................................................................................... 41
3.3 Conclusions of the Chapter............................................................................................. 43
4. Enterprise Modeling Case Studies ....................................................................................... 44
4.1 Rationales behind Selection of the Cases ....................................................................... 44
4.2 infoFLOW-2 ................................................................................................................... 46
4.2.1 Introduction to infoFLOW-2 .................................................................................... 46
4.2.2 Observed Modeling Case/Workshop in infoFLOW-2 ............................................. 48
4.3 “Enterprise Modeling (EM)” Course .............................................................................. 50
4.3.1 Introduction to “Enterprise Modeling (EM)” Course, MSc Level .......................... 50
4.3.2 Observed Modeling Case (and Its Workshops) in EM Course ................................ 52
4.4 Identified Problematic States in EM Projects ................................................................. 55
4.5 Summary of the Identified Problems in the EM Projects ............................................... 59
5. Efficiency Evaluation in Enterprise Modeling ..................................................................... 61
5.1 Efficiency in EM ............................................................................................................ 61
5.2 An Approach for Evaluating Efficiency of Enterprise Modeling Methods .................... 62
5.2.1 Structure of the Approach for Efficiency Evaluation of Enterprise Modeling
Methods ............................................................................................................................. 63
5.2.2 How to Follow the Efficiency Evaluation Approach ............................................... 83
6. Empirical Validation of the Efficiency Evaluation Approach ............................................. 85
6.1 Introduction to the Validation Cases .............................................................................. 85
6.1.1 Marketing Department at Jönköping University-School of Engineering ................ 86
6.1.2 The Manager (Head) of Information Engineering & Management Program .......... 88
6.2 Reflections on the Efficiency Evaluation Approach ...................................................... 89
6.2.1 Reflections on the Preparatory Phase ...................................................................... 90
6.2.2 Reflection on the Structure of A3E2M .................................................................... 91
6.3 Refinements to A3E2M based on the Reflections .......................................................... 93
6.3.1 Implications regarding the Preparatory Phase ......................................................... 94
6.3.2 Additional Efficiency Criteria for Different Parts of EMM .................................... 96
6.4 Future Work: Support of Different Phases of an EM Process ........................................ 97
6.5 Discussion on the Applicability of A3E2M ................................................................. 100
7. Discussion .......................................................................................................................... 102
7.1 Answering the Research Questions .............................................................................. 102
7.2 Reflection on the Followed Research Discipline ......................................................... 104
7.3 Lessons Learned on Conducting EM Projects .............................................................. 105
7.3.1 Access to the Relevant Information Sources ......................................................... 105
7.3.2 Following the Same Language by Members of the Modeling Team regarding the
Case ................................................................................................................................. 106
7.3.3 Following a Concrete Action Plan ......................................................................... 107
8. Conclusions & Future Work .............................................................................................. 109
8.1 Conclusions .................................................................................................................. 109
8.2 Future Work .................................................................................................................. 110
8.2.1 Expanding the Efficiency Evaluation Approach .................................................... 110
8.2.2 Moving from an Evaluation Approach to an Improvement Approach .................. 111
8.2.3 Developing Guidelines for Conducting an Efficient Evaluation Process .............. 111
References .............................................................................................................................. 112
List of Figures
Figure 1: Logical relations between the thesis chapters ............................................................. 7
Figure 2: Schematic overview of the followed research path .................................................. 17
Figure 3: Support for action (Seigerroth, 2011) ....................................................................... 27
Figure 4: The notion of method (Goldkuhl et al., 1998) .......................................................... 28
Figure 5: Finalized notion of method ....................................................................................... 29
Figure 6: SEQUAL framework for discussing quality of models (Krogstie, 2012a) .............. 36
Figure 7: The six Guidelines of Modeling (GoM) (Becker et al., 2000) ................................. 38
Figure 8: The final framework (Kesh, 1995) ........................................................................... 39
Figure 9: Draft of information demand model for the application case order planning
(Carstensen et al., 2012c) ......................................................................................................... 49
Figure 10: Final information demand model for application case order planning (translated
from Swedish) (Carstensen et al., 2012c) ................................................................................ 50
Figure 11: A sample “Business Process Model” developed in the EM Course. The model is
developed according to the EKD syntax for BPM (Business Process Models); the given labels
are understandable. ................................................................................................................... 53
Figure 12: A sample “Goals Model” developed in the EM Course. The model components are
properly placed. ........................................................................................................................ 54
Figure 13: A sample "Goals Model" developed in the EM Course. The components of the model
are placed in a disordered way. ................................................................................................. 54
Figure 14: A sample "Business Process Model" developed in the EM Course. The typed labels
are difficult to understand and the syntax of BPM is not followed correctly. ......................... 55
Figure 15: An overview of A3E2M structure .......................................................................... 63
Figure 16: Overview of information demand analysis process (translated from Swedish)
(Lundqvist et al., 2012) ............................................................................................................ 69
Figure 17: An overview of how to follow A3E2M .................................................................. 84
Figure 18: Information demand model for “JTH Marketing Department” (Carstensen et al.,
2012b) ....................................................................................................................................... 87
Figure 19: Information demand model for “Information Engineering” program manager
(Carstensen et al., 2011a) ......................................................................................................... 89
List of Tables
Table 1: Guide relevant to Figure 1 ........................................................................................... 6
Table 2: Design science research guidelines and their coverage in the thesis ......................... 11
Table 3: Quality criteria presented in different models ............................................................ 32
Table 4: Examples of approaches for quality evaluation in modeling ..................................... 34
Table 5: Comparison between infoFLOW-2 and EM Course ................................................... 45
Table 6: Modeling participants of the background cases ......................................................... 62
Table 7: Evaluation results of IDA in “Marketing Department” and “Program Manager” ..... 90
Table 8: Modeling participants of the validation cases ............................................................ 94
1. Introduction
This thesis concentrates on investigating the meaning and implications of efficiency in the
context of Enterprise Modeling (EM), as well as how to evaluate efficiency in an EM process.
For this purpose, this first chapter presents an introduction to the problem domain and
motivation for the pursued problems in section 1.1. This is followed by a list of the relevant
publications in section 1.2. Section 1.3 outlines the chapters that comprise the thesis.
1.1 Background and Motivation
Today there is a general agreement that organizations continuously need to improve
themselves in order to stay competitive (Abdiu et al., 2005). These continuous improvements
are usually related to different aspects of an enterprise for instance structural, behavioral and
informational (Fox & Gruninger, 1999). In order to obtain sustainable improvements, it is
usually argued that it is beneficial to have a clear understanding of the current state and the
future state. As Harmon (2010) has put it, improvement is a transition process that entails
actions of taking a business from a current state (AS-IS) into an improved state, which is
regarded as something better (TO-BE). Going through such an improvement process usually
means going through a couple of generic phases such as understanding the current situation,
evaluation of the current situation, formulating suitable and relevant change actions, and
implementing these actions (Hayes, 2007). Such an improvement process also needs to be
performed in a structured and rational way and in order to achieve this we usually rely on and
seek guidance in different theories, methods, tools etc. In the context of EM this is usually
done through different types of activities supported by modeling methods where different
aspects of the enterprise are elucidated through different types of conceptual models (Dietz,
2008), e.g. process models, information demand models, goal models, problem models,
competence models etc. EM with its different modeling components can therefore be used as
a powerful tool in the transformation process where an enterprise wants to improve
(Seigerroth, 2011).
Another motive for enterprises to seek support in EM is that they need to deal with
complexity, both within and between enterprises (Carstensen, 2011). “Enterprise Modeling
tries to capture into models the knowledge about the objectives, processes, organization, roles,
resources and concepts that are of interest for solving particular problems within an enterprise
or a network of collaborating enterprises” (ibid).
In the paragraphs above, EM has been argued for as a suitable support for improvement processes.
Vernadat (2002) has broken this down into more details in a number of motivations for the
usefulness of EM:
“managing system complexity by understanding how the enterprise works,
capitalization of enterprise knowledge and know-how,
enterprise engineering and continuous process improvement,
better management of all types of processes, and enterprise integration”.
In an enterprise, all or just a subset of the listed examples might be helpful. Each item in the
list has a generic meaning that can be translated and adapted differently in each application
case. In other words, it is the potential applier of EM that has to ensure that it will support the
goals of the enterprise.
As touched upon earlier, there are a number of different types of artifacts that can be used as
support for EM. Examples of such artifacts are computerized tools, methods, frameworks etc.
(Ghidini et al., 2008; Rolstadås & Andersen, 2000; Kaidalova, 2011). One issue in this context
is to choose the most suitable support for the specific situation at hand. Methods that are
implemented in a computerized tool are usually a powerful support for EM. The main reason
for this is that methods by nature provide guidelines and useful instructions for ways of
working that are understandable for a broad EM audience (Tissot & Crump, 2006).
EM and the models that are produced can be used for different purposes. One common
purpose is to use them as instruments in a change process to address different aspects such as:
improvement in interoperable business processes (Bernus, 2003), ensuring quality of
operations and business development (Persson & Stirna, 2001), decision making,
communication and learning (Bernus, 2001). Methods and computerized tools are in different
ways expected to ensure quality both in the work process and in the results that are produced
during the work process (cf. Seigerroth, 2011).
Quality is a broad concept that can be perceived and interpreted in different ways. In a
general sense, quality is understood by everyone, but if we elaborate on the concept we must
acknowledge a variety of meanings. According to Sallis (2002), some of the confusion over
the meaning of quality arises because it can be used both as an absolute and a relative
concept. One way of dealing with this confusion is to boil down the quality notion into a set
of sub-criteria for a specific context; in this thesis, the context is the Enterprise Modeling
Method (EMM).
In a review of the current literature different criteria sets for quality can be identified, for
example efficiency (Shah et al., 2011; Ortega et al., 2003; Kim et al., 2006), reliability
(Wolfinbarger & Gilly, 2003; Madu & Madu, 2002), understandability (Cox & Dale, 2001;
Ortega et al., 2003), performance (Al-Tarawneh, 2012; Yang et al., 2003), durability (Garvin,
1978), etc. The efficiency dimension of EMM seems to be partly neglected in the literature
and therefore it is relevant and worthwhile to explore the concept in this study. Efficiency is
mostly defined as the ratio between input and output (Priem & Butler, 2001). This emphasizes
that, in carrying out a process, the focus is not only on obtaining the intended results but also
on the usage of resources. For a process to be efficient, it should be possible to complete it by
following predefined work procedures without waste of resources or unexpected side effects.
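The ratio view of efficiency mentioned above is often written with the produced output in the numerator; a schematic rendering (not a formula taken from the cited work) is:

```latex
\mathrm{Efficiency} \;=\; \frac{\text{obtained (useful) output}}{\text{consumed input (resources)}}
```

For an EM process, the input covers resources such as the time of the modeling participants, while the output covers the produced enterprise models and their usefulness as a basis for change.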
An EM process, which is about applying an EMM to produce enterprise models, is expected
to be of high quality where efficiency is an important dimension of quality. Efficiency in
application of an EMM means that, on the one hand, the generated models should be useful as
a basis for change, and, on the other hand, that resources have not been wasted but used in a
purposeful way. The issue of efficiency is therefore relevant in all stages of an EM process,
i.e. from the start of the process until all models are completed and used for their designated
and intended purpose.
To elaborate a bit more on the quality dimension we can recognize different scopes of
modeling quality that have been addressed by different researchers. These different modeling
scopes focus, for example, on the quality of modeling languages, modeling processes and
models by themselves. A subset of these modeling scopes focuses only on defining the quality
criteria, while others include suggestions regarding how to evaluate fulfillment of the criteria
and even how to improve them (e.g. cf. Maier, 1999; Moody et al., 2002; Moody, 2005;
Krogstie et al., 2006; Frank, 2007). Despite considerable efforts dedicated to quality in
different aspects and scopes of modeling and the fact that efficiency is an important
dimension of quality, hardly any investigation regarding the efficiency evaluation of EMMs
has yet been done. This thesis focuses on the efficiency dimension of EMM quality based on
the following research
questions:
RQ 1. What is the meaning of efficiency in the context of EMM?
RQ 2. How can the efficiency of an EMM be evaluated?
RQ 3. In what phases of an EM process could method efficiency be evaluated?
1.2 Related Publications by the Author
This thesis is authored as a monograph. However, parts of it have been published previously.
A list of relevant publications is presented below:
Khademhosseinieh, B. (2010). Towards an Evaluation of Enterprise Modeling
Methods for Developing Efficient Models. In Proceedings of the 9th International
Conference on Perspectives in Business Informatics Research (BIR 2010) Doctoral
Consortium, pp. 97-100. ISBN 978-3-86009-092-3.
Khademhosseinieh, B., Seigerroth, U. (2011). An Evaluation of Enterprise Modelling
Methods in the Light of Business and IT Alignment. In R. Zhang, J. Cordeiro, X. Li,
Zh. Zhang, J. Zhang (Eds.), Proceedings of the 13th International Conference on
Enterprise Information Systems (ICEIS 2011), vol. 4, pp. 479-484. ISBN 978-989-
8425-65-2.
Khademhosseinieh, B. (2012). Efficiency as an Aspect of Quality: In the Way of
Developing a Method for Evaluating Efficiency of Enterprise Modeling Methods. In
N. Aseeva, E. Babkin, O. Kozyrev (Eds.), Proceedings of the 11th International
Conference on Perspectives in Business Informatics Research (BIR 2012) Satellite
Workshops & Doctoral Consortium, pp. 200-210. ISBN 978-5-502-00042-0.
Khademhosseinieh, B., Seigerroth, U. (2012). Towards Evaluating Efficiency of
Enterprise Modeling Methods. In T. Skersys, R. Butleris, R. Butkiene (Eds.),
Proceedings of the 18th International Conference on Information and Software
Technologies (ICIST 2012), CCIS 319, pp. 74-86. Springer, Heidelberg.
1.3 Thesis Outline
This thesis consists of eight chapters. Chapter 1 (the current chapter), provides a brief
motivation for conducting this research. The chapter states the issue of efficiency evaluation
in EM as the core of the research and clarifies what research questions should be answered.
Chapter 2 elucidates why abductive reasoning was a suitable choice for this work, and how
design science (as a research discipline) and case studies (as a research method) were
applied in investigating the research questions. It is also clarified what research path was
followed. All of this is presented in order to show that the research questions are addressed
in a scientifically valid manner. Chapter 3 fulfills several aims. First, it sheds light on the
foundational concepts of the thesis, as it is necessary to have a clear understanding of what
the core concepts exactly mean. Second, chapter 3 supports the understanding of how the topic of
quality evaluation in modeling has been approached. These two chapters provide the required
(theoretical) background to the audience for following the rest of the thesis, i.e. the research
contributions, the evaluation outcomes, discussions, conclusions and future work. Third,
chapter 3 presents conclusions underlining that the topic of efficiency evaluation has not yet
been addressed by other researchers. Chapter 4 contains introductions to the research
projects and EM cases that were selected for performing case studies. The EM case studies
motivated the research described in this thesis and were used for developing the research
results. It is explained what problems were identified in the EM case studies and how they
motivated this research. The research contributions are presented in chapter
5. It first clarifies the phenomenon of efficiency in EM and EMM. Following this, the main
part of the contribution, which is an approach for efficiency evaluation of EMMs, is
presented. It is also explained how this approach should be used in practice. Chapter 6
presents results from validation of the research contributions, i.e. the efficiency evaluation
approach that is proposed in chapter 5. The validation process was conducted using two EM
cases (different from those used in developing the results). Chapter 6 starts with an
introduction to the EM cases used for validation. This is followed by reflections on
the efficiency evaluation approach. To resolve issues identified in the reflections, some
refinements that were found necessary for the developed approach are presented at the end
of the chapter. Chapter 7 contains discussions on different topics: answers to the research
questions posed in chapter 1, a reflection on the followed research discipline (design science),
and some lessons learned while conducting the EM cases. Both the case studies that helped
in developing the contributions and the validation cases contributed to these lessons. The
thesis ends with chapter 8 that contains conclusions about the research contributions and
possibilities for future work.
Figure 1 shows how different chapters are related to each other and how understanding a
chapter aids in understanding the rest. Table 1 contains a guide to Figure 1 and
clarifies how different chapters support each other. The table has two columns. Column
“Sign” contains different possible combinations of elements and relationship types that are
used in Figure 1. Column “Elucidation about the Sign” contains explanations on the presented
signs.
Table 1: Guide relevant to Figure 1

Sign: Elucidation about the Sign

[Connection from Chapter A to Chapter B, labeled with information/material]:
Clarifies how Chapter A contributed in providing information/material needed for performing
tasks that result in gaining the contents of Chapter B.

[Connection from Chapter C to Chapter D]:
Clarifies how Chapter C aids in understanding Chapter D.

[Connection from Chapter E to Chapter F]:
Clarifies how Chapter E provides complementary support for following Chapter F (parts of
Chapter E were used in developing Chapter F; one aim of authoring Chapter F was developing
contents relevant and complementary to Chapter E).
[Figure 1 is a diagram connecting the eight thesis chapters (1. Introduction through 8.
Conclusions & Future Work) with labeled relations, e.g. "Clarifying the RQs", "Forming the
basis for developing the results", "Input for evaluating the research results" and "Answering
the RQs".]
Figure 1: Logical relations between the thesis chapters
2. Research Method
In all research, scholars attempt to conduct their work in a systematic and scientifically valid
way. The current chapter clarifies how the research in this thesis was conducted. The chapter
starts by giving explanations to the research approach, discipline and method followed in this
thesis (see section 2.1). In section 2.2 the research path followed for carrying out the research
is elaborated schematically.
2.1 The Followed Research Approach, Discipline and Method
To perform scientific work, it is necessary to be aware of different research approaches and
research disciplines. In section 2.1.1 a brief introduction is given about inductive, deductive
and abductive approaches. Following this, it is explained what approach was followed for the
purpose of this thesis. Section 2.1.2 contains an introduction to design science, which is a
commonly followed discipline in Information Systems science, and explains how it is
positioned in this research. Section 2.1.3 contains an explanation of the case study method
and its application in the thesis.
2.1.1 Abductive Approach
According to the Oxford Dictionaries1, an approach is “a way of dealing with a situation or
problem”. A research approach, or as Elalfi et al. (2009) state, a reasoning style, is “the
process of using existing knowledge to draw conclusions, make predictions, or construct
explanations. Three methods of reasoning are the deductive, inductive, and abductive
approaches. Deductive reasoning starts with the assertion of a general rule and proceeds from
there to a guaranteed specific conclusion. Inductive reasoning begins with observations that
are specific and limited in scope, and proceeds to a generalized conclusion that is likely, but
not certain, in light of accumulated evidence. One could say that inductive reasoning moves
from the specific to the general. Abductive reasoning typically begins with an incomplete set
of observations and proceeds to the likeliest possible explanation for the set”. Abductive
reasoning is also defined as the combination of inductive and deductive reasoning. This
means that an approach is developed using deductive reasoning, followed by testing with the
use of inductive reasoning (Samuels, 2000). While inductive reasoning supports development
of new theories and deductive reasoning supports explaining specific cases based on the
existing theories, abductive reasoning supports delivering new things (Lindström &
Polyakova, 2010). Each of the mentioned research approaches supports a different purpose.
Researchers should have comprehensive knowledge about the weaknesses and strengths of
different approaches as well as the needs and possibilities in the intended research to be able
to decide which approach is most suitable.
In conducting the research in this thesis, the abductive approach was followed. This approach
was selected based on the goal of the research. As described above, the inductive approach is
suitable for cases where a new theory (or “thing”) is to be developed, and the deductive
approach supports explaining real-life cases with the help of existing theories. The abductive
approach, in turn, is applicable for cases that entail conditions for both inductive and
deductive reasoning. Since abductive reasoning helps in developing new things (ibid), both
initial development and evaluation can be covered by the abductive reasoning style. In this
thesis the aim was to develop a new thing (artifact) for efficiency evaluation of EMMs
(presented in chapter 5). The initial results then had to be evaluated (validated); the results of
this validation are presented in chapter 6. Accordingly, the abductive approach was found to
be a suitable choice for the purpose of this research.
2.1.2 Design Science
The purpose of research is “to advance knowledge and the scientific process” (Dennis &
Valacich, 2001). Such advancement can be achieved by answering questions, which results in
obtaining new knowledge (Marczyk et al., 2010). This means the contributed results have to
be novel (Ghauri, 1995).
In the context of Information System (IS) research, behavioral science and design science are
examples of foundational research paradigms (Hevner et al., 2004). As stated in the Oxford
Dictionaries (www.oxforddictionaries.com), a discipline is “a system of rules of conduct”. In comparison to
approaches, disciplines provide more concrete and more precise ways of performing work.
Behavioral science and design science support addressing two key issues in IS: the central
role of the IT artifact in IS research (Weber, 1987; Orlikowski & Iacono, 2001; Benbasat &
Zmud, 1999) and addressing the perceived lack of relevance of IS research to the business
community (Benbasat & Zmud, 1999). The design science paradigm has its roots in
engineering and the sciences of the artificial (Simon, 1996). It is fundamentally a problem-
solving paradigm. Design science seeks to create innovations that define the ideas, practices,
technical capabilities and products through which the analysis, design, implementation and
use of information systems can be effectively and efficiently accomplished (Denning 1997;
Anderson & Donnellan, 2012). “The design-science paradigm seeks to extend the boundaries
of human and organizational capabilities by creating new and innovative artifacts” (ibid). This
discipline has been used widely by researchers in the IS field, which lies at the confluence of
people, organizations and technology. Following this research discipline, “knowledge
and understanding of a problem domain and its solution are achieved in the building and
application of the designed artifact” (ibid). Different opinions are given on what an IS artifact
is: while Hevner et al. (2004) define IS artifacts as constructs, models, methods and
instantiations, van Aken (2004) sees an IS artifact as a social innovation.
Design science is comprised of two main sets of activities, which are construction and
evaluation (Cole et al., 2005). Different authors, each providing a different level of detail,
present models and frameworks to demonstrate how an artifact is developed by following
design research. While Owen (1998) presents a simple general model, other researchers’
contributions such as Takeda et al.’s (1990) design cycle, Stempfle and Badke-Schaub’s
(2002) generic step model, McKay’s (2005) ideal process, Hevner et al.’s (2004) IS research
framework and the design research cycle by vom Brocke and Buddendick (2006) are more
detailed. Although different authors have followed different styles for representing their
proposed model, all emphasize that design science is an iterative process. They clarify that
design science-based research is done by carrying out development/design and
justification/evaluation cycles iteratively. This supports improving an artifact’s maturity to a
satisfactory state. In this chapter, for clarifying how design science was followed, the terms
“evaluate” and “validate” are used interchangeably to refer to the “justify/evaluate” activity.
Hevner et al. (2004) propose seven guidelines to support design science-based research.
Table 2 contains a summary of this proposition; a detailed description of each guideline can
be found in the original literature, i.e. Hevner et al. (2004). The table also summarizes how
design science was applied in performing the research described in this thesis. The research
followed the design science discipline, since the aim was to
develop an artifact aiding in efficiency evaluation of EMMs. The intended artifact is
manifested as an approach built of efficiency criteria for each EMM part and driving
questions that support starting an efficiency evaluation process (see chapter 5 and chapter 6).
In the following, we explain how Hevner et al.’s (2004) guidelines about design science were
pursued in this thesis.
Table 2: Design science research guidelines and their coverage in the thesis
Design Science Research Guidelines (Hevner et al., 2004):

Guideline 1: Design as an Artifact
Description: Design-science research must produce a viable artifact in the form of a construct, a model, a method, or an instantiation.
Coverage in the thesis: Following design science in this research resulted in producing an artifact in the form of an approach for evaluating fulfillment of efficiency criteria in EMMs.

Guideline 2: Problem Relevance
Description: The objective of design-science research is to develop technology-based solutions to important and relevant business problems.
Coverage in the thesis: The developed artifact (approach) helps in addressing the unattended problem of efficiency evaluation in EM. The relevance of the artifact to EM, and consequently to IS, makes it a technology-based solution.

Guideline 3: Design Evaluation
Description: The utility, quality, and efficacy of a design artifact must be rigorously demonstrated via well-executed evaluation methods.
Coverage in the thesis: The developed artifact (approach) was evaluated using the case study method to find out what shortcomings it had (and even to suggest solutions).

Guideline 4: Research Contributions
Description: Effective design-science research must provide clear and verifiable contributions in the areas of the design artifact, design foundations, and/or design methodologies.
Coverage in the thesis: The thesis aims at presenting a novel design artifact (approach) applicable for efficiency evaluation of EMMs. The developed artifact was later evaluated to suggest possibilities for improvement; this demonstrates its viability.

Guideline 5: Research Rigor
Description: Design-science research relies upon the application of rigorous methods in both the construction and evaluation of the design artifact.
Coverage in the thesis: Development and evaluation of the intended artifact were carried out using the case study method, which is widely used in IS research.

Guideline 6: Design as a Search Process
Description: The search for an effective artifact requires utilizing available means to reach desired ends while satisfying laws in the problem environment.
Coverage in the thesis: The search resulted in an iterative process of developing and evaluating the intended artifact (approach). The artifact is adjustable to the environment, i.e. the state of the enterprise and its goal.

Guideline 7: Communication of Research
Description: Design-science research must be presented effectively both to technology-oriented as well as management-oriented audiences.
Coverage in the thesis: The contributed artifact is packaged as an approach for efficiency evaluation of EMMs on an overall level. The approach is usable by different users: those who perform the evaluation and those who are managers.
Guideline 1; Design as an Artifact: According to Hevner et al. (2004) “the result of design
science research in IS is by definition, a purposeful IT artifact created to address an important
organizational problem. It must be described effectively, enabling its implementation and
application in an appropriate domain”. This research was done with the purpose of developing
an artifact supporting the evaluation of efficiency in EM and more specifically an EMM
application process. The topic of efficiency is a notable problem, as discussed in section 3.1.3.
It has however not yet been addressed in EM (see section 3.3). The developed artifact in this
thesis is an approach that helps in evaluating efficiency of EMMs, presented in chapter 5.
This approach is documented as criteria clarifying how each part of an EMM (see Figure 5)
should be, together with a list of suggested driving questions.
Another aspect of an IT artifact is its interdependency and coequality with the people and the
organization it is used in (Hevner et al., 2004); this was taken into account in the design and
development of the mentioned artifact. The efficiency evaluation approach is designed in
a way that supports evaluating EMMs based on a set of suggested criteria. These criteria are
proposed to drive the efficiency evaluation process. Following the efficiency evaluation
approach, members of the EM team can unify their understanding about the notion of
efficiency in EM. In short, application of the developed artifact provides the basis for
reaching efficiency as an objective of using (IT) artifacts (Denning, 1997; Tsichritzis, 1998).
Guideline 2; Problem Relevance: “The objective of research in information systems is to
acquire knowledge and understanding that enable the development and implementation of
technology-based solutions to heretofore unsolved and important business problems …
Design science approaches this goal through the construction of innovative artifacts aimed at
changing the phenomena that occur” (Hevner et al., 2004). As stated above, the important
problem of efficiency evaluation in EM has been left unattended and unsolved. This
motivated conducting the research with the purpose of investigating it. The research was done
with the aim of addressing the identified problem, and it even yielded additional findings.
The attainment of these extra findings was kept under control and did not lead to deviation
from the intended path. A solution is the result of eliminating or reducing
a problem (Simon, 1996) or the difference between the goal and the current state (Hevner et
al., 2004). Assuming that the relevant problem in this research was inefficiency in an EM
process (which had not yet been addressed) and the goal was unveiling it, the developed
artifact (an approach for evaluating efficiency) is the relevant solution for fulfilling this need.
In addition, EM helps in the development of IS (Brinkkemper et al., 1999). Thus, a solution
developed in EM supports the IS field and is consequently technology-based. Accordingly,
the developed artifact can be considered a technology-based solution.
Guideline 3; Design Evaluation: A design artifact should be evaluated in terms of quality,
utility and efficacy using well-executed methods. This should be done respecting
requirements that are established by the business environment (Hevner et al., 2004). The
designed artifact in this research, i.e. the approach for efficiency evaluation of EMMs, had
shortcomings and inconsistencies, like any other newly developed product. This imposed
the need for performing design evaluation. The principle and underlying criterion for
evaluating the intended artifact was the extent to which it is usable for evaluating efficiency of
an EMM. The design evaluation method pursued in this research was case studies. For this
purpose, two EM cases were selected to carry out the design evaluation (see chapter 6). In this
way, feedback on the strengths and weaknesses of the artifact was gained and used for
writing reflections on the approach (presented in section 6.2) and for suggesting refinements
(see section 6.3) to the artifact.
Guideline 4; Research Contributions: The expectation from design research is a novel
contribution that can be categorized into at least one of the areas of the design artifact, design
construction knowledge (i.e., foundations) and design evaluation knowledge (i.e.,
methodologies) (Hevner et al., 2004). According to this, the focus of the thesis was on
developing an artifact (a design artifact) for evaluating efficiency of EMMs. Use of this novel
artifact results in reaching conclusions about whether an EMM supports efficiency and if not,
what the shortcomings are. The developed artifact was evaluated in terms of representational
fidelity and implementability, as emphasized in (ibid). According to the evaluation
(validation) results, the developed artifact is relevant and contributes to EM. It demonstrates
that the artifact was developed in environments where EMMs were followed. This is evidence
showing the artifact is applicable and implementable in the business environment, which
makes it a clear contribution. It is clear in the sense that it is explicit what can be supported
using it (efficiency evaluation of EMMs) and how it can be done (by checking whether the
defined criteria are fulfilled). The contribution is verifiable, too. The proof for this claim is
chapter 6, which contains the results of evaluating (validating) the artifact. This shows the proposed
artifact can be verified with the use of case studies.
Guideline 5; Research Rigor: As Hevner et al. (2004) mention, following rigorous
construction and evaluation processes is necessary in design science-based research. This
was satisfied in this thesis, too. The research started with a literature review. Following this,
the artifact was developed, which entailed construction and evaluation. In the literature
review, a large number of sources about EM, EMMs, quality evaluation in IS as
from two different projects were selected as case studies (see chapter 4). These cases provided
motives for conducting this thesis. The same cases were used for developing the intended
artifact (presented in chapter 5). This was done by taking the state of the art and the method
notion (see Figure 5) into account. The developed artifact had to be evaluated to find out
whether it was capable of covering the specified needs. Evaluation was done using two other
EM cases. In short, both development and evaluation were done using case studies. Following
the chosen research method, while not hesitating to embrace spontaneous changes along the
path, helped in pursuing a concrete but flexible approach.
Guideline 6; Design as a Search Process: Being iterative is the nature of design science and
makes it “essentially a search process to discover an effective solution to a problem”, which is
in fact seeking for a satisfactory solution (ibid). This fact was considered and met in writing
this thesis. As stated, the current research was performed with the purpose of developing an
artifact. The process was however not completed in a single round. It was done iteratively and
resulted in a gradual but controlled development process. The reason for following an
iterative work process was the nature of a search process, which requires an ongoing process
until reaching a satisfactory state. In conducting this research, reviewing the two EM case
studies iteratively was the key means to search for the intended results. Not only the
development process but also the evaluation process was done iteratively. For the evaluation,
the artifact was checked several times against EM cases that were selected specifically for
this purpose, to ensure a rich evaluation result.
Due to the iterative nature of design science and the search process, the research results were
gained gradually and along the way. The search process supported learning new things,
especially about the relevant study field, i.e. EM, and the relevant EMMs, namely EKD
(Bubenko et al., 1998) and IDA (Lundqvist, 2011). Besides these, the search process required
shifting between results development (and evaluation) and the relevant EM cases. In each
iteration, the cases were reviewed to extract relevant data and apply them in the development
(or evaluation) process.
Guideline 7; Communication of Research: The design science research in this thesis is
presented in a way that provides the details necessary for the audience. This thesis packages
an approach (artifact) supporting efficiency evaluation of EMMs. The
contribution is manifested in the form of efficiency criteria for different EMM parts plus
suggestions for driving questions that aid in evaluating fulfillment of the defined criteria.
According to Hevner et al. (2004), a technology-oriented and a management-oriented
audience need sufficient detail to enable construction of artifacts and to determine resource
allocation, respectively. The contribution of the thesis is usable by people who are directly responsible
for performing efficiency evaluation as well as managerial people. The need for this
contribution is motivated in detail in “Theoretical Background & Frame of Reference”
(chapter 3) and “EM Case Studies” (chapter 4). The significance of EM is motivated by
elaborating its support for business improvement. It is also explained how receiving support
from EM requires applying a relevant EMM and this application process has to result in
gaining the expected results, while resources must be used in a worthwhile way. Evaluation
(and improvement) of efficiency has, however, not yet been addressed by other researchers.
Thus, a management audience needs to take this contribution into account. After accepting
this need, the potential applier of the artifact (approach for efficiency evaluation of EMMs)
can refer to the presented details to realize how to conduct the efficiency evaluation process.
The contribution is presented as efficiency criteria for various EMM parts. This presentation
is done on an overall level, which makes the contribution comprehensible to a broad range of
audiences, varying from technology-oriented (here: those who are going to perform an
efficiency evaluation) to management-oriented (here: those who manage and observe the
process). Each person may review the defined criteria to interpret and tailor them according to
her/his needs. After referring to the presentation of the developed artifact, while being aware
of the state of the art, the audience can understand the artifact’s novelty as well.
2.1.3 Case Studies
The case study is the most commonly used research method in qualitative Information
Systems research (Darke et al., 1998). Yin (1994) defines a case study as “an empirical enquiry that
investigates a contemporary phenomenon within its real-life context, especially when the
boundaries between phenomenon and context are not clearly evident”. A case study can be
used in both quantitative and qualitative research approaches (ibid) and also within positivist
and interpretivist traditions (Cavaye, 1996; Doolin, 1996). It encompasses various data
collection techniques such as interview, observation, questionnaire, data and text analysis
(Yin, 1994). This research method is applicable for fulfilling various aims such as providing
descriptions about phenomena, developing theories and testing theories (Cavaye, 1996). Case
studies support any of these needs by providing a basis for raising research questions, for
data collection and analysis, and for presentation of the obtained results.
Case studies are similar to field studies in the sense that both support examining phenomena in
their natural context. They are different in the sense that in case studies, the researcher has
less prior knowledge about constructs and variables (Benbasat et al., 1987; Cavaye, 1996).
Case studies can also be compared with experimental studies, where both need several studies
for gaining comprehension about a particular phenomenon. Nonetheless, their difference is
that a case study does not support studying relations between cause and effect. In other
words, manipulation of variables is not possible in this research method (Cavaye, 1996; Lee,
1989).
For this thesis work, the design science discipline was followed, and the case study method
was used for developing and evaluating the results. According to Remenyi and Money (2004), case studies support
providing a multi-dimensional view of a situation. This feature of case studies made the
process of the current research more flexible. Using this method, the author had the possibility
of interpreting moments, expressions and actions in the EM case studies (presented in chapter
4) and validation cases (presented in 6.1) in different ways. As Perry et al. (2004) mention,
when a case study is seen as a research method, it should help in defining research questions
as well as collecting and analyzing data to answer the research questions. According to
section 2.2 (and Figure 2), EM case studies helped in defining research questions and
developing results (presented in chapter 5). Also, validating the research results was done
using case studies, which are called validation cases. In short, case study provided a flexible
means for performing this research.
2.2 Schematic Overview of the Followed Research Path
In this section it is elaborated in schematic form what phases were performed to complete the
research. A challenge in writing this chapter was grouping different tasks into activities and
giving each activity a relevant name. It might be required to investigate the traversed path at
either a high or a low level of detail. On a high level, the research can be divided into the two
phases of Background Formulation of Research Questions and Contribution Evolvement. On
the lower level, each phase is comprised of a series of activities. An overview of the followed
research path as well as its phases and activities is presented in Figure 2. In this section, each
research phase is elaborated. This is done by explaining the activities within each phase in
sections 2.2.1 and 2.2.2. Figure 2 shows a schematic and straightforward working path. In
reality, deviation from the decided path was inevitable and happened now and then. The
reason for this was that at some points in time it was necessary to repeat a prior activity. In
other words, the figure is not presented to show the exact time frame, rather it is dedicated to
the logical structure of the research path.
As explained in section 2.1.2, this thesis work was done by following the design science
discipline, an abductive approach and the case study method. To decide what research
discipline, approach and method should be followed, literature relevant to research methods
was reviewed. This is, however, not shown in the figure. In fact, this elaboration concerns how
the research process was carried out after the selection of discipline, approach and method.
Figure 2: Schematic overview of the followed research path
2.2.1 Background Formulation of Research Questions
Similar to other dissertations, in this work it was required to find out what had already been
done and what research gap(s) existed in the research field. To shed more light on this phase and its
details, we break it into a series of activities: Literature Review, Case Observation i and
Problem Identification and Formulation of RQs.
Literature Review: This activity is the starting point in any research work and requires
spending a considerable amount of effort and time. It was started by reviewing the Initial
Description of the PhD position that was suggested by the PhD supervisors and reviewing the
Existing Literature. The Existing Literature included different types of scientific publications,
such as technical reports, dissertations and papers. Literature Review aided mainly in
identifying the State of the Art in EM, and shed light on what had been developed in
different parts of the EM research field, especially contributions regarding quality (and
efficiency) evaluation. Therefore, various topics in EM were reviewed. The review process
was continuous.
Case Observation i: Besides reviewing literature and working on identifying the State of the
Art, the author reviewed Case Material i coming from the EM case studies (see chapter 4).
The focus of this activity was on reviewing the obtained experiences, trying to identify the
implicit knowledge and transform it into explicit knowledge. The output of the activity was
called Empirical Data i, which was the motive for pursuing the research. This activity itself
can be broken down into two sub-activities: Case Selection and Empirical Data Collection
(for simplicity the sub-activities are not shown in Figure 2).
Case Selection: In an empirical study, the aim is to go through one or several cases to
assess them from a specific viewpoint. For this purpose, the author started selecting
suitable cases and deciding about how to observe and assess them. The set of target
cases comprised an EM session from the infoFLOW-2 research project and one project
group from the EM course, Spring 2012, at Jönköping University (in the remainder of
this chapter: EM Course), explained in sections 4.2 and 4.3.
Empirical Data Collection: After selecting the target cases, the empirical data collection
started. An EM session involves different people cooperating with each
other. There might be cases where all members of a modeling team focus on the same
task. In such a state the observer is able to follow the work process conveniently. In an
EM session, however, team members can be divided into groups, where each group works on a
fragment of the current task. In such a situation, following all groups at the same time
becomes a problem. To solve this issue, it was decided to record video of the modeling
sessions and use the recordings as Case Material i. Besides this, notes that were
made during the modeling sessions were considered for review and data collection.
This facilitated the work in different ways. Through this, it became possible to assess
parallel work divisions without missing them. Moreover, the recordings could be
reviewed several times, any time after finishing the EM sessions. This lessened the
risk of missing details and made it possible to review the findings iteratively.
Problem Identification and Formulation of RQs: The two above activities were in fact
prerequisites to this activity, and the information material gained as their output was used
here. As stated above, during the Literature Review a wide range of topics were touched
upon, which resulted in finding the State of the Art. This was reviewed and revisited
iteratively to define the problem precisely. By doing this revision process and considering
Empirical Data i, the research problem was identified and the relevant research questions
were defined. This activity, as the last activity of the Background Formulation of Research
Questions phase, supported conducting the Contribution Evolvement phase by specifying the
Knowledge Gap (Problem & RQs). In the following, the second phase of the research path,
i.e. Contribution Evolvement is elaborated.
2.2.2 Contribution Evolvement
After identifying the Knowledge Gap, it was time to contribute to solving it. For this purpose,
it was necessary to conduct activities that converge in supporting this phase. These activities
in general included collecting relevant data, developing results and making refinements to the
results. These are elaborated in more detail in the following:
Case Observation ii: An EMM, as the relevant tool for receiving support from EM, is usually
of interest when it is in use. What strengths or deficiencies it has can be clarified during
usage. Therefore, it was decided to observe EM cases with the purpose of data
collection. The result of this activity was Empirical Data ii, which were required for
preparing Results Development. The same EM cases and case material as in Case Observation
i were used here, though the approach was different. In this activity, the problems were
already known. Having this in mind, the focus of the activity was on specifying statements
required for Results Development.
Results Development and Results Validation: Results Development and Results Validation
support developing an artifact that addresses the research questions. Due to the close relevance
and relation between these two activities, they are explained in the same paragraph. The State
of the Art (as an indirect input), the identified Knowledge Gap and Empirical Data ii were
used in Results Development for developing Results. This was done iteratively to reach
satisfactory Results. As a continuation of this, Empirical Data iii (Case Material ii) and Results
were used in Results Validation to attain Validated Results. This was also carried out iteratively,
which required reviewing the relevant inputs for the activity several times. Case Material ii
was used directly as an input (Empirical Data iii) to Results Validation. Thus, we do not
differentiate them. As the main intention of this research was to contribute to the efficiency of
EMMs, the Results Development and Results Validation activities were directed towards
developing outcomes that establish an understanding of how an efficient EMM should be
and how to evaluate its fulfillment. Both activities were done
respecting the research questions and with the purpose of addressing them. From one side
Results Development and Results Validation helped in answering the research questions and
even in modifying them. On the other hand, the modified research questions ensured
staying on the right track.
Making Conclusions &amp; Answering RQs: As stated above, the research questions had a direct
effect on the research work, and vice versa. Answers to the research questions were dependent
on the Results and the Validated Results. This activity received Results and Validated Results
to assist in answering the research questions. These answers are helpful in finding out how it
is possible to apply this work and what can be supported by it. Besides answering the research
questions, this resulted in Making Conclusions on the gained attainments.
In both phases mentioned, it was necessary to move back and forth between activities within
each phase every now and then. It was even required to move between activities of one phase
and activities of the other phase. For example, whilst Results Development and Results
Validation were in progress, it was necessary to review the State of the Art, or even to do extra
Literature Review, to check details of the Knowledge Gap and to make decisions about the
rest of the development. However, for the sake of simplicity, Figure 2 is presented as a
straightforward path.
3. Theoretical Background & Frame of Reference
The focus of this thesis is on studying efficiency (as an aspect of quality) in EM. Thus, it is
necessary to gain an understanding of the relevant theoretical background, which is
addressed in this chapter. The chapter starts by describing the foundational concepts relevant to
EM and quality (see section 3.1). Since efficiency is an aspect of quality and this thesis is
written to contribute to the area of EM and efficiency evaluation in EM, section 3.2 clarifies
how quality evaluation in EM has been approached by other researchers. The chapter ends
with section 3.3, which contains conclusions of the whole chapter and a hint regarding how the
presented theoretical background motivated the current research.
3.1 Foundational Concepts
This section aids in understanding the relevant and foundational concepts. In section 3.1.1 we
explain what EM and enterprise models mean. Following this, clarifications about the method
notion, and how it supports EM, are given in section 3.1.2. Lastly, it is elucidated what
efficiency, as an aspect of quality in the IS research field, stands for (see section 3.1.3).
3.1.1 Enterprise Modeling & Enterprise Models
This section intends to elaborate the notion of EM and enterprise models. This aids in
unifying the author's and the audience's perception about these notions. Also, clarifying these
two notions provides a foundation for the tentative research contribution developed in this thesis.
3.1.1.1 Enterprise Modeling
EM, Enterprise Architecture (EA) and Business Process Management (BPM) are three areas
that have for a long time been part of a tradition where the mission is making improvement in
enterprises (Harmon, 2010). According to (Degbelo et al., 2010) “since more than two
decades, the contribution of EM to solving problems in organizational development, process
improvement or system integration has been acknowledged”. But why EM? “EM, or Business
Modeling, has for some years been a central theme in Information Systems (IS) engineering
research and a number of different methods for this purpose have been proposed” (Bubenko
Jr. et al., 2010). All organizations desire to make progress in order to remain competitive in
their business segment. Therefore, they need to know how to make such progress. For this
purpose, they need to know about the current (AS-IS) situation as well as
the desired (TO-BE) situation. Indeed stakeholders need to gain an understanding about the
reality of the enterprise. But the reality is often complicated and confusing, and any insight is
rarely achieved without considerable simplification. Modeling is a means that helps in
simplifying facts without losing elements that are essential to representation and reasoning.
Also, since EM takes time and costs money, a model must be developed with a justifiable
purpose in mind. In other words, enterprise models should give a complete and correct
description with respect to the purpose they are developed for (Christensen et al., 1996).
“Professionals in various disciplines feel the need to describe an enterprise according to
prescribed rules in order to be able to pursue specific goals through the modeling” (Kassem et
al., 2011). EM is a field that has arisen and developed to fill this gap. Indeed,
according to (Persson & Stirna, 2010), EM is now applicable for a variety of purposes related
to organizational development and helps in various ways such as designing or redesigning the
business, eliciting requirements for information systems, capturing and reasoning about
organizational knowledge. EM helps in visualizing an enterprise from specific viewpoints for
the purpose of understanding the enterprise. This understanding is the basis for further
activities such as design, evaluation, improvement, etc.
Persson and Stirna (2010) state two reasons for applying EM:
“Developing the business: this entails developing business vision, strategies,
redesigning the way the business operate, developing the supporting information
systems, capturing IS requirements, etc.
Ensuring the quality of the business: here the focus is on two issues: 1) sharing the
knowledge about the business, its vision, the way it operates, and 2) ensuring the
acceptance of business decisions through committing the stakeholders to the decisions
made”.
To explain this in a more detailed way, one can refer to (Kassem et al., 2011) that expresses
the following reasons for using EM:
Development of information systems: EM is of high importance in IT projects and
supporting companies. According to (Shen et al., 2004), EM is an initial and essential
task for an IT project and this is carried out at the stage of system analysis and user
requirements gathering.
EM as the backbone element in enterprise integration projects: EM helps in
increasing synergy and interoperation among people, systems and applications
throughout the enterprise, including integration in manufacturing or Computer
Integrated Manufacturing (CIM) and workflow management dealing with automation
of paper and document flows as well as control of business processes (Kosanke & Nell
1997; Vernadat, 2001).
Shift from organizing companies in separated departments to process
orientation: Nowadays, there is an intention in companies to shift from organizing
companies in the form of departments (silos) to processes. To support this “EM can
provide a better understanding of existing processes and help companies in the
migration from departmentalized organization to process orientation” (Kassem et al.,
2011).
In order to start a discussion about EM, first we need to have a definition for it. By going
through the existing literature, it can be seen that several definitions are presented for the
term. Some of these definitions are developed for a specific field, whereas some others are
general and can be specialized to a case. Regardless of a definition belonging to any of these
two groups, it enumerates some characteristics of EM. By reviewing these definitions, we
have extracted the main characteristics presented below. Each item in this list is covered by
a subset of the definitions, not by all of them.
Enterprise Knowledge Representation: The aim of EM is capturing the knowledge about an
enterprise and representing it in an abstract form. Indeed, the intention is to help
stakeholders extract the knowledge of the domain of discourse under assessment, which
is usually a part of an enterprise, and then present it in the shape of models. This
characteristic has obviously been taken into account in all definitions presented for EM.
Focal Areas: “A focal area means that certain aspects are focused in that investigation”
(Goldkuhl & Röstlinger, 2003). Capturing and representing the enterprise knowledge can be
done from different focal areas, such as business processes, resources and organizational
divisions. It is important that, when working on more than one focal area, the results of the
modeling processes, i.e. the models, are consistent with each other. In other words, modeling from
different focal areas, i.e. modeling different facets of the enterprise, each focusing on different
constructs and concepts, in a specific domain of discourse should result in models that
complement and match each other. (Ngwenyama & Grant, 1993), (Zhao & Fan, 2003) and
(Nurcan, 2008) are examples of contributions that have mentioned “Focal Area” in their
definition of EM.
Structure of the Enterprise: EM supports the stakeholders in developing models to show
what the structure of the enterprise is in each focal area. This is done by finding special types
of elements (according to the focal area) in a domain of discourse. Realizing the relations
between the specified domains of discourse is another necessary phase, which has to be done
as a complement to show the structure of the domain of discourse. (Fox & Gruninger, 1998)
and (Vernadat, 2002) are examples of contributions that have mentioned “Structure of the
Enterprise” in their definition of an EM.
Further Use: An enterprise is always in need of improving its business to be able to compete
with the rest of the organizations in the same field. The purpose of EM is helping stakeholders
in developing models that will be used for understanding the state of the enterprise. Also, EM
helps stakeholders to show the future (desired) state in the form of models. Stakeholders need
to gain understanding of these issues on both business and managerial levels. The reason is
that they should act towards developing improvement plans. (Frank, 2002), (Vernadat, 2002)
and (Delen & Benjamin, 2003) are examples of contributions that have mentioned “Further
Use” in their definition of an EM.
From the presented definitions, Vernadat’s (2002) is accepted as the basis for this thesis, since
it covers all the marked points:
“enterprise modeling is the art of externalizing enterprise knowledge which adds value to the
enterprise or needs to be shared. It consists in making models of the structure, behavior and
organization of the enterprise”.
The definition explicitly mentions “structure of enterprise” as well as “enterprise knowledge”
representation as features of EM. Bringing up the issue of “adding value” to the enterprise can
be accepted as a variation of “further use”. Finally, Vernadat states that EM supports modeling
the “structure, behavior and organization” of the enterprise. Accordingly, the definition has
taken the issue of “focal areas” into account, too.
3.1.1.2 Enterprise Models
As EM is about developing enterprise models, we need to have a concrete understanding
about what an enterprise model is. For this purpose, we should know what characteristics such
a model has. Although at first glance it seems that different authors have presented very
different specifications for enterprise models, it is still possible to underline the main
characteristics enumerated by various researchers.
“Enterprise model is used as a semantic unification mechanism, or knowledge-sharing
mechanism” (Petrie, 1992).
It is a symbolic representation (Liles, 1996; Whitman & Huff, 2001) of an
organization from a specific focal area.
Enterprise models show what elements (element types) from the real world in the
enterprise exist and the relations (relationship types) between them (Koubarakis &
Plexousakis, 2002) from the specified focal area.
Such a model identifies the basic elements and their decomposition to any necessary
degree (ANSI/NEMA, 1994), that should be comprehensive (Taveter & Wagner,
2000) and at the same time simplified and explicit (Gandon, 2001). Making decisions
about simplicity and details in an enterprise model depends ultimately on what the
stakeholders require (ibid).
Although the above points could be retrieved from various references, we have mentioned
only the references whose presentation (in the sense of meaning and wording) is most
understandable. Each of these references indeed presents its own definition of enterprise
models. As can be seen, each point is covered by a subset of the reviewed literature, whereas
all the mentioned characteristics are essential for enterprise models. This imposes the need for
a definition of enterprise model, which is presented by the author for this thesis:
Enterprise model is a symbolic and comprehensive representation of one or several aspects of
an organizational structure, following the notational rules for the purpose of knowledge
sharing and semantic unification.
At first glance, it looks as if the presented definition does not cover all the enumerated
characteristics. First, it does not mention that an enterprise model is a “simplified”
representation. Besides, no indication about “decomposition to any necessary degree” is
given. As both points are about providing details to a sufficient level, stating
“comprehensive enough” covers both. Second, the definition does not refer to the fact that an
enterprise model contains “elements and relations between them”. Nevertheless, as a model
itself means a set of elements and relations, it is not necessary to emphasize this again, and the
point can be neglected. Apart from these two points, which are covered indirectly in the
definition, the rest are directly emphasized.
3.1.2 From Method to Enterprise Modeling Method
The term “method” is introduced and used in many references, ranging from dictionaries to
scientific articles, in different fields. In general, a method is “a particular form of procedure for
accomplishing or approaching something, especially a systematic or established one” (Oxford
Dictionaries3). It is a broad term and can be explained and defined in different ways. By
reviewing different references and checking how different people understand this term we see
that method as a particular form of procedure is indeed a “concrete” (Oliga, 1998),
“systematic and orderly” (Baskerville, 1991) procedure. Such a process aims at performing a
task (March & Smith, 1995) such as “attaining something” (Odell, 2005) or “dealing with a
problem situation” (Mingers, 2000).
From what different authors emphasize as the key feature of a method, it can be concluded
that a method clarifies “what to do”, “how to do” and “why to do”:
“What to do” elaborates the ultimate goal that is to be gained, together with
intermediate goals. Indeed, achievement of intermediate goals guarantees
obtaining the ultimate goal.
“How to do” has to clarify the phases that have to be traversed and their
comprising steps. Each step is a guideline for completing a specific intermediate
task. It might also specify the capabilities that involved people should hold in
order to be able to follow the phases and guidelines.
“Why to do” makes it clear for what purpose the method is applicable. This
clarification aids stakeholders in understanding the purpose of the available
methods and selecting a suitable one.
To complete a task we need to take into account various issues that might influence the work
process, including guides and inspirations about how to carry out a work (Seigerroth, 2011).
According to Seigerroth (2011), sources of guidance and influence can be found in
various forms, such as methods, theories, experiences, patterns and tools. In addition to these,
action guidelines can be found in the solution space in the form of best practices, and
computerized tools, in which methods are implemented (see Figure 3).
Among all the enumerated ways, methods are the ones which are extensively used. According
to (Goldkuhl et al., 1998), the characteristic of a method is its Perspective. Method
Perspective is the conceptual and value basis of the method and its rationality. All methods
build on some implicit or explicit Perspective, which includes values, principles and
categories. A method involves procedural guidelines, in short: Procedure, which show how to
work and what questions to ask. In a method there exist representational guidelines; what is
often called modeling techniques or Notations. Method Notation supports the need for
documenting different aspects as well as how answers to the procedural questions should be
3 www.oxforddictionaries.com
documented. Procedure and Notation are tightly coupled to each other. The Procedure
involves some meta concepts, such as process, activity, information, object. Such general
concepts are used when the questions are asked, i.e. they are part of the prescribed Procedure.
They are also part of the semantics of the Notation. The Concepts are the cement between
Procedure and Notation. When there is a close link between Procedure, Notation and
Concepts, this is referred to as a Method Component.
[Figure: method, theory, pattern, best practice, knowledge & experiences and tool depicted as
sources of support for action; through interpretation, choices and configuration they support
action, which generates results to fulfill a goal; methods can be implemented in tools]
Figure 3: Support for action (Seigerroth, 2011)
A method is often a compound of several Method Components, giving what is often called a
methodology (Avison & Fitzgerald, 2006). As stated in (Goldkuhl et al., 1998) Method
Component tells us how to do a specific step. Different Method Components together form a
structure, called Framework that includes the phase structure of the method. This phase
structure tells us what to do, in what order to do, and what results to produce. Another aspect
of method is Cooperation & Collection Principles. Cooperation Principles are about who
asks and answers questions. To elaborate this more, we can say that Cooperation Principles
are about what roles should be involved in the work and how the work should be divided, i.e.
how different persons interact and cooperate. Collection Principles entail principles about
how to put questions and collect answers. A Method Component can be used within several
Collection Principles, such as seminars, brainstorming sessions, interviews and modeling
sessions. The notion of Method Component in (ibid) is similar to Method Chunk in (Ralyté et
al., 2006), Method Block in (Prakash, 1999) and Method Fragment in (Brinkkemper, 1999).
The method notion introduced by Goldkuhl et al. (1998) can be seen in Figure 4.
[Figure: the method parts Perspective (What is important?), Framework (How are questions
related?), Procedure (What questions to ask?), Notation (How to express answers?), Concepts
(What to talk about?) and Cooperation & Collection Principles (Who asks? Who answers?)]
Figure 4: The notion of method (Goldkuhl et al., 1998)
Although Goldkuhl et al.’s method notion is supportive for understanding what a method is,
for the purpose of this thesis it is necessary to define a more precise interpretation by adding
some aspects to it. First, we underline that a Method might contain only one Method
Component, or a composition of several Method Components. A Method Component supports
investigating a specific focal domain of the enterprise, and a composition of two or more
Method Components might be a Vertical Composition, a Horizontal Composition, or a
combination of the two. As Cronholm and Ågerfalk (1999) state, Vertical Composition means
integration of methods between different levels of development, whereas Horizontal
Composition means integration of methods within the same level of abstraction. Regardless of
which composition form is followed in a method, the comprising Method Components aim at
modeling an enterprise or domain of discourse by asking different types of questions, which
results in different types of answers and consequently different types of models. With
Horizontal Composition, different Method Components are applied to a domain of discourse
at the same abstraction level, and the produced results can be linked to each other without
contradiction. Vertical Composition, on the other hand, means aggregating different Method
Components so that linking and integrating their results supports the upper abstraction level.
The possibility of composition is shown by presenting a cascade of several Method
Components. For the purpose of this work, we assume that there is a possibility of
composition, but do not distinguish between Horizontal Composition and Vertical
Composition. Another method part that requires modification is Cooperation & Collection
Principles. As stated above, Collection Principles are mainly about how to put questions and
collect answers. These forms clearly refer to the issue of asking and answering questions, and
are thus more related to the method Procedure than to being a separate method part. This
means the method part Cooperation & Collection Principles should be changed to
Cooperation Principles. The modified method notion, which is what we accept for the rest of
this work, is presented in Figure 5.
[Figure: the method parts Perspective (What is important?), Framework (How are questions
related?), Procedure (What questions to ask?), Notation (How to express answers?), Concepts
(What to talk about?) and Cooperation Principles (Who asks? Who answers?)]
Figure 5: Finalized notion of method
To receive support from EM, we should follow a systematic way of working, i.e. a method. A
method that is specifically developed for the purpose of EM is called an EMM. By applying
an EMM we develop enterprise models as results, which can be used for studying the status
of the enterprise and planning improvement actions. But what are the specifications of an
EMM? In this type of method, the Perspective clarifies from what viewpoints an enterprise can
be studied (and indeed modeled). For this purpose the Framework highlights the phases
that are required for developing a comprehensive (set of) model(s) of an enterprise, and each
phase clarifies what the aim of completing it is. This imposes the need to apply a Method
Component. By applying the Method Component it becomes possible to follow the Procedure in
order to ask questions about what elements exist in the domain of discourse (as a part of the
marked enterprise). Answers to these questions are documented using the Notation part of
the Method Component and are in fact abstract visualizations of the enterprise. According to
this, the determining feature of a method, and here of an EMM, is its Method Component: it is
the presentation of a Method Component that helps the reader understand that the output of
applying an EMM is a visualization of an enterprise, and what this visualization looks like.
Investigation of different focal areas as a subject of EM needs to be covered in an EMM. This
should be supported in the phases of the Framework or as part of Method Components, i.e.
developing enterprise models as visualizations of particular focal areas is to be covered in the
Framework, in Method Components, or in both. Concepts, as the cement between Procedure
and Notation, stress what phenomena and meanings of an enterprise are considered in the
EMM. This is a great help for deciding whether the EMM is applicable for a team’s
objectives. All this is done by a group of people that follow accepted Cooperation Principles
for studying and modeling the enterprise with the help of the other method parts.
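As an illustration only, the method notion adopted above can be sketched as a simple data structure. The class and field names, and all example values, are our own assumptions for the sketch; they are not taken from Goldkuhl et al. (1998) or from any particular EMM:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class MethodComponent:
    """Procedure, Notation and Concepts, tightly coupled (Concepts as the 'cement')."""
    procedure: List[str]   # what questions to ask
    notation: str          # how to express the answers
    concepts: List[str]    # what to talk about (e.g. process, activity, object)

@dataclass
class EMMethod:
    """An EMM: Perspective, Framework, Cooperation Principles and one
    or several Method Components."""
    perspective: str                   # what is important
    framework: List[str]               # phase structure: what to do, in what order
    cooperation_principles: List[str]  # who asks, who answers
    components: List[MethodComponent] = field(default_factory=list)

# A method may contain a single Method Component or a composition of several.
emm = EMMethod(
    perspective="goal- and process-oriented view of the enterprise",
    framework=["model AS-IS", "analyze", "model TO-BE"],
    cooperation_principles=["facilitator asks", "domain experts answer"],
    components=[MethodComponent(
        procedure=["What processes exist?", "Which resources do they use?"],
        notation="process diagram",
        concepts=["process", "activity", "resource"],
    )],
)
assert len(emm.components) >= 1  # at least one Method Component per method
```

The sketch also makes the composition aspect concrete: a method composed of several Method Components simply holds more than one element in `components`, without distinguishing Horizontal from Vertical Composition.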
3.1.3 Quality & Efficiency
According to (Mevius, 2007), a process is of high enough quality if it is suitable for attaining
the pre-defined business process targets. To carry out a process it is necessary to use
different types of resources. Resources are not always cheap and freely available; hence, we
need to take care of resource usage. The extent to which we have been successful in the
proper use of resources is an indicator of process success. Accordingly, to assess whether a
process has been (or will be) successful in achieving the specified aims, one issue that has to
be studied is the amount of resources consumed. This concept is called Efficiency. Although
efficiency seems to be a primitive term that can easily be understood, there is no universally
accepted definition for it, and its meaning has been introduced and discussed in different ways.
Before continuing with the issue of efficiency, a brief account of quality and its meaning in EM is
presented. Quality is a term that shows the extent to which a process or product is acceptable
to the stakeholders, and different definitions and explanations of it have been given. The issue
of quality is of interest in all fields (such as economics & industrial organization, management,
operations & engineering, etc.), but it is perceived and interpreted differently in each field
(Golder et al., 2012). As a result, “no universal, parsimonious, or all-encompassing
definition or model of quality exists” (Reeves & Bednar, 1994).
According to Martin (1989), developing enterprise models is a means to support Information
Engineering, which is about “enabling users to understand and monitor the information they
are dealing with, as well as the process they are executing” (Duarand & Dubreuil, 2001).
Therefore, the perception of quality that is followed in operations and engineering is
applicable in EM. According to Golder et al. (2012), “in operations and engineering, quality means
conformance to design specifications or the reliability of internal processes, even though
many customers do not find these processes meaningful” (Feigenbaum, 1991; Juran, 1992;
Shewhart, 1986). Accordingly, a list of specifications that should be met by a process in order
to conclude that it is of acceptable quality is needed. There exists literature that focuses on this
issue and aims at clarifying quality criteria. Among such works, quality models describe
quality criteria as well as how the criteria affect the fulfillment of one another.
As this thesis is supposed to contribute to EM and Information Engineering, which are
subjects in IT and IS, relevant quality models should be considered. Although these models
are mainly developed for software quality, they can be used in other disciplines of IS, too.
Table 3 shows what criteria are listed in each quality model. This is done using a table that is
adapted from (Ortega et al., 2003) and extended. According to Table 3, efficiency and
maintainability are criteria that are taken into account in all quality models. Consequently,
these two criteria should be prioritized when fulfilling quality. Between them, “the efficiency
criterion determines the choice of alternatives that lead to the maximization of results when
scarce resources are applied” (Simon, 1964). But what does efficiency exactly mean? In a
general sense and in its simplest form, efficiency is about “doing things right” (Gleason &
Barnum, 1986; Zokaei & Hines, 2007), in comparison to effectiveness, which is about “doing the
right things” (Drucker, 2008; Drucker, 1974). The two terms look similar but convey
different meanings. Effectiveness is about the results expected from a process, whereas
efficiency has to do with the minimum amount of resources required for obtaining a
goal. “Doing things right” refers to the fact that not only is achieving the intended results
of interest, but also that a proper way of working has to be pursued. If this is fulfilled,
resources will be used in a more sensible way. By evaluating (and measuring) efficiency one
can find out whether resources are used properly, i.e. “elimination of waste” (Molina, 2003) is
supported through efficiency evaluation. This can naturally be continued by identifying the cause
of resource waste and dispelling it.
It is often assumed that no definition is required for efficiency, because the meaning of the
word is usually established on a common-sense basis. It is, however, necessary to have a
mutual understanding about what efficiency means. There exist two attitudes towards
efficiency. The first attitude defines this phenomenon in terms of the amount of resources
required for obtaining a pre-defined goal. In a process, either the input or the output may vary.
Therefore, definitions that have looked at efficiency from an output/input viewpoint can be
categorized into three groups:
1. Some authors define efficiency as the varying ratio of output to input (e.g.
Farrell, 1957; Lovell, 1993; Chen, 2006).
2. Other authors consider efficiency as the amount of input that should be
determined for gaining a fixed output (e.g. Sink & Shuttle, 1989; Neely et al., 1995;
Jackson, 2000).
3. Efficiency is also defined as increasing output for a fixed amount of resources
(e.g. Sumanath, 1994; Taylor, 1957; Vakkuri, 2005; Morris et al., 2007; Milikowsky,
2008).
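The three groups can be summarized in compact notation. The symbols below (E for efficiency, O for output, I for input) are our own shorthand for illustration, not taken from the cited authors:

```latex
% Group 1: efficiency as a (varying) ratio of output to input
E = \frac{O}{I}
% Group 2: minimize the input needed to reach a fixed output O_0
\min_{I} I \quad \text{subject to} \quad O = O_0
% Group 3: maximize the output obtainable from a fixed input I_0
\max_{O} O \quad \text{subject to} \quad I = I_0
```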
Table 3: Quality criteria presented in different models (an X marks that a model includes the criterion)
Models compared: Boehm (Boehm et al., 1978); McCall (McCall et al., 1977); FURPS (Grady
& Caswell, 1987); ISO 9126 (ISO/IEC 9126, 1991); Dromey (Dromey, 1996); BBN (Signore,
2005); Star (Khosravi & Guéhéneuc, 2004); SPQM (Ortega et al., 2003)
Testability X X X X
Correctness X X
Efficiency X X X X X X X X
Understandability X X X X
Reliability X X X X X X X
Flexibility X X X
Functionality X X X X
Human Engineering X
Integrity X X X
Interoperability X X
Process Maturity X
Maintainability X X X X X X X X
Changeability X
Portability X X X X X X
Reusability X X X
Usability X X X X X
There also exists another attitude towards efficiency. As stated above, efficiency is basically
about doing things right. Following this, the other viewpoint's focus is on the process, not
the resources. This attitude has been taken less into account by researchers when defining
efficiency. A definition developed respecting this viewpoint is:
“Efficiency is used for passive or operational activity, which is usually defined technically so
that the system and its behavior are foreseeable in advance” (Kurosawa, 1991).
At first it seems that the two identified attitudes have nothing in common and
communicate different messages, but they are two sides of the same coin and support each
other. A process that has a foreseeable behavior is more likely to produce the
required output using resources in a worthwhile way. In practice, the “input-output” attitude is
suitable where the work process is not the center of interest and what we care about is the
amount of output versus the amount of input. In such a condition it is required, and likely
possible, to measure the input (resource) usage and the gained output against an agreed-upon
benchmark showing the acceptable ratio of output to input. Finally, by comparing the
benchmark with the measurement results it will be possible to draw conclusions about
the process efficiency. However, there might be cases where the focus is on the work process
and not on the outputs and inputs (resources). In such a case the expectation is that the working
process matches a set of defined criteria. The better the criteria are covered, the more
foreseeable the process is, which means a higher efficiency has been attained.
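The “input-output” attitude described above can be made concrete with a minimal sketch. The function names, the figures and the benchmark value are invented purely for illustration:

```python
def efficiency_ratio(output: float, input_: float) -> float:
    """Efficiency as the ratio of output to input (the first attitude)."""
    if input_ <= 0:
        raise ValueError("input must be positive")
    return output / input_

def is_efficient(output: float, input_: float, benchmark: float) -> bool:
    """Compare the measured output/input ratio against an agreed-upon benchmark."""
    return efficiency_ratio(output, input_) >= benchmark

# Example: 12 models produced in 8 modeling hours, benchmark 1.2 models per hour
print(is_efficient(output=12, input_=8, benchmark=1.2))  # prints True (12/8 = 1.5)
```

The sketch captures only the measurement-and-comparison step; the second, process-oriented attitude would instead check the working process against a set of defined criteria.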
3.2 Approaches for Quality Evaluation in Enterprise Modeling
This section contains a brief on how the issue of quality evaluation in the field of modeling
has been taken into account. This begins by giving an explanation of the existing attitudes
towards quality evaluation in modeling. This is followed by introducing some contributions
regarding quality evaluation in modeling. The issue of quality in modeling has been important
for researchers and several efforts have been carried out regarding that. Bollojy and Leung
(2006) emphasize that it is extremely important for the system’s ultimate success to ensure the
quality of the modeling methods and tools. In (Siau & Rossi, 2007) a classification for
analyzing the underlying philosophies of methods evaluation techniques is presented.
According to this classification, the evaluation techniques can be divided into the following
categories:
Feature comparison;
Theoretical and conceptual investigation; and
Empirical evaluation
Feature analysis as a common methodology for feature comparison, has the risk of
subjectivity (Avison & Fitzgerald, 1995). To resolve this, (Bubenko, 1986) suggests doing:
theoretical investigations of concepts and languages;
experiences of actually applying the methodology to realistic cases; and
cognitive-psychological investigations
In the modeling field, the quality issue has been approached from different viewpoints, and the total amount of research and publications dedicated to quality in modeling is considerably high. Table 4 contains examples of contributions developed regarding quality in modeling.
Table 4: Examples of approaches for quality evaluation in modeling
Reference Scope
(Batini et al., 1992) Quality evaluation of EER models
(Levitin & Redman, 1994) Quality criteria for data models
(Moody & Shanks, 1994; 1998a; 2003) Quality measurement of data models
SEQUAL (Lindland et al., 1994; Krogstie et al., 2006; Krogstie et al., 2012) Quality evaluation of conceptual models
(Kesh, 1995) Quality measurement of ER models
(Shanks & Darke, 1997) Quality evaluation of conceptual models
(Teeuw & van den Berg, 1997) Quality evaluation of conceptual models
Guidelines of Modeling (GoM) (Schütte & Rotthowe, 1998; Becker et al., 2000; Blecken, 2010) Principles for quality of models
(Moody, 1998) Quality measurement of data models
(Moody & Shanks, 1998b) Quality evaluation and improvement of data models
(Schütte & Rotthowe, 1998) Quality improvement of information models
(Opdahl & Henderson-Sellers, 1999) Quality evaluation of OO Modeling Languages
(Mišic & Zhao, 2000) Quality evaluation of reference models
Q-ME (Hommes & van Reijswoud, 2000) Quality measurement of BP modeling techniques
(Maier, 1999; Maier 2001) Quality evaluation of data modeling process
(Cherfi et al., 2002) Quality measurement of UML and EER models
(Fettke & Loos, 2003a) Quality evaluation of reference models
(Fettke & Loos, 2003b) Quality evaluation of EM languages
(Breu & Chimiak-Opoka, 2005) Quality evaluation of models
(Graeme et al., 2005) Quality criteria for data models
Quality of Modeling (QoMo) (van Bommel et al., 2008) Quality measurement of process modeling
(Ssebuggwawo et al., 2009) Quality measurement of modeling processes
(Thörn, 2010) Quality evaluation and improvement in feature models
(Fettke et al., 2012) Quality measurement of conceptual models
While some researchers have concentrated only on the quality of models, others have taken
quality of methods, modeling languages and other artifacts for model development into
account. In both approaches, some contributions focus only on defining quality criteria and what each of them means. There also exist contributions that present tools for evaluating the quality of models, model development methods, languages, etc.; these present criteria that should be met by the artifact under investigation, together with guidelines for evaluating their fulfillment. In addition, some publications present means for measuring the quality of models or model development artifacts.
In the entries of Table 4, the focus of the developers has been on the quality of models, modeling methods or modeling languages, which shows that the quality issue has been of high importance and interest. Since people are more interested in the outputs of modeling processes, quality evaluation and improvement of enterprise models has attracted more attention than that of modeling processes. In the following sections, examples from the Table 4 entries are selected and explained to show how quality in modeling has been investigated by other researchers so far.
Section 3.2.1 contains examples of contributions concentrating on quality of models and
section 3.2.2 contains examples of contributions focusing on quality of modeling processes,
languages and methods.
3.2.1 Quality Evaluation of Models
In this section, brief explanations of the selected works used for evaluating the quality of models are presented:
SEQUAL: According to (Krogstie et al., 2006), SEQUAL is a framework based on semiotic theory that aims at quality evaluation of conceptual models. SEQUAL has three
properties that make it unique:
It separates quality goals from quality means (Krogstie et al., 2006; Krogstie, 2012a),
It is based on a constructivistic world-view, and
It is closely linked to linguistic and semiotic concepts.
In the earliest version of SEQUAL, three quality levels were considered: Syntactic, Semantic
and Pragmatic Quality (Lindland et al., 1994). Over time, as the framework was extended, the quality levels increased to seven: Physical, Empirical, Syntactic, Semantic, Pragmatic, Social and Deontic (Krogstie, 2012a). Figure 6 shows the relevant version of the SEQUAL framework. Quality of models at different semiotic levels is discussed based on
correspondence between statements of the following sets:
“G, the (normally organizationally defined) goals of the modeling.
L, the language extension, that is, the set of all statements that are possible to make according to the graphemes, vocabulary, and syntax of the modeling languages used.
D, the domain, that is, the set of all statements that can be stated about the situation at
hand.
M, the externalized model, that is, the set of all statements in someone’s model of part
of the perceived reality written in a language.
Ks, the relevant explicit knowledge of the set of stakeholders being involved in
modeling (the audience A). A subset of the audience is those actively involved in
modeling, and their knowledge is indicated by KM.
I, the social actor interpretation, that is, the set of all statements that the audience think
that an externalized model consists of.
T, the technical actor interpretation, that is, the statements in the model as ‘interpreted’
by the different modeling tools” (Krogstie et al., 2006).
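The correspondence between these sets can be illustrated with a small sketch. In SEQUAL, semantic quality concerns the match between the externalized model M and the domain D; the toy statements below are invented for illustration, and reducing statements to plain strings is a simplifying assumption, not part of the framework's formal definition.

```python
# Reading SEQUAL's semantic quality as set relations between the model M
# and the domain D. The example statements are invented for illustration.

D = {"order received", "order checked", "order shipped"}    # domain statements
M = {"order received", "order shipped", "invoice emailed"}  # model statements

validity_violations = M - D   # model statements not true in the domain
completeness_gaps   = D - M   # domain statements missing from the model

print(sorted(validity_violations))   # ['invoice emailed']
print(sorted(completeness_gaps))     # ['order checked']

# Perfect semantic quality would mean both sets are empty, i.e. M == D.
```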
[Figure 6, a diagram omitted here, depicts the SEQUAL framework as relations between the sets G, L, D, M, K, I and T, annotated with the quality levels physical, empirical, syntactic, semantic, perceived semantic, pragmatic (human and tool understanding), social and deontic.]
Figure 6: SEQUAL framework for discussing quality of models (Krogstie, 2012a)
The principal idea behind developing SEQUAL was evaluating conceptual models in general. It was later extended and modified for evaluating other sorts of models, such as interactive
models (Krogstie & Jørgensen, 2003), requirement models (Krogstie, 2012b) and process
models (Krogstie et al., 2006) in a more detailed and precise way.
Fettke and Loos’ Proposal: Fettke and Loos (2003a) used the BWW ontological model (Wand & Weber, 1995; Wand & Weber, 1989; Weber, 1977) for evaluating reference models. The main idea of this approach is to normalize reference models ontologically, which can be compared with normalization in databases. The difference is that in normalizing reference models, real-world structures are taken into account rather than technical aspects. The proposed evaluation framework is comprised of four main steps:
1. “Developing a transformation mapping. In the first step of our method, it is necessary
to develop a transformation mapping for the grammar used for representing the
reference model. This transformation mapping allows to convert the constructs of the
used grammar to the constructs of the BWW-model. The transformation mapping
introduces an ontological meaning for each construct of the grammar used by the
reference model. The transformation mapping consists of two mathematical mappings:
First, a representation mapping describes whether and how the constructs of the
BWW-model are mapped onto the grammatical constructs. Second, the interpretation
mapping describes whether and how the grammatical constructs are mapped onto the
constructs of the BWW-model.
2. Identifying ontological modeling deficiencies. The second step is based on the former
constructed transformation mapping in general. It is possible that one ontological
deficiency is resolvable in various ways or even not resolvable at all. To identify the
ontological deficiencies of the reference model all constructs of the reference model
must be reviewed. Each construct of the reference model must be examined with
respect to whether the construct is used correctly regarding the interpretation mapping.
3. Transforming the reference model. In the third step, the reference model will be
transformed to an ontological model. The outcome of this step is an ontologically
normalized reference model. More formally, an ontologically normalized reference
model is a mapping from the constructs of the reference model to the constructs of an
ontological model. While mapping a construct of the reference model on to an
ontological construct, four cases can arise: adequacy, inadequacy, excess and
overload.
4. Assessing the results. In the last step, the reference model can be evaluated regarding
the results of the three mentioned steps above. First, the transformation mapping can
be assessed in general. Based on the representation and interpretation mappings it is
possible to determine the ontological clarity and adequacy of the used grammar.
Second, the ontological deficiencies of constructs of the reference model can be
assessed in particular. While the ontological deficiencies excess and overload have
their roots in the definition of the grammar, the cause of an ontologically inadequate
construct of the reference model is the specific application of a grammatical construct
employed by the person who developed the model. Third, the ontologically
normalized reference model can be assessed. In this case, two different evaluation
aspects are reasonable: isolated assessment and comparative assessment” (Fettke &
Loos, 2003a).
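Steps 2 and 3 above can be illustrated with a toy classifier: given an interpretation mapping from grammar constructs to BWW constructs, each construct can be labeled with one of the ontological cases. The mapping below is invented for illustration (it is not the actual interpretation mapping of any real grammar), and the case of inadequacy, which concerns incorrect use of an otherwise well-mapped construct, is left out because it requires manual inspection.

```python
# Toy classifier for ontological deficiencies: a grammar construct with no
# ontological meaning is "excess", one with several meanings is "overload",
# and a one-to-one mapping is "adequacy". The mapping is invented.

interpretation_mapping = {
    "entity":       ["thing"],               # exactly one meaning -> adequacy
    "relationship": ["mutual property"],
    "note":         [],                      # no meaning -> excess
    "attribute":    ["intrinsic property",
                     "mutual property"],     # several meanings -> overload
}

def classify(construct: str) -> str:
    meanings = interpretation_mapping.get(construct, [])
    if not meanings:
        return "excess"
    if len(meanings) > 1:
        return "overload"
    return "adequacy"

for construct in interpretation_mapping:
    print(construct, "->", classify(construct))
```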
Guidelines of Modeling (GoM): According to (Blecken, 2010), subjectivity of modeling
cannot be removed completely. Thus, it is needed to develop and use a “framework of
principles that improve the quality of information models by reducing subjectivism in the
information modeling process” (Schütte & Rotthowe, 1998). GoM (Becker et al., 2000;
Roseman, 1995; Roseman, 1998; Becker et al., 1995) is the result of selecting relevant aspects
of information modeling from Generally Accepted Accounting Principles (GAAP) and
elements of the existing approaches for the evaluation of information models. GoM includes
six principles for quality of models: Correctness, Relevance, Economic Efficiency, Clarity,
Comparability and Systematic Design (see Figure 7).
[Figure 7, a diagram omitted here, arranges the six guidelines (Correctness, Relevance, Economic Efficiency, Clarity, Comparability and Systematic Design) around the notion of information model quality.]
Figure 7: The six Guidelines of Modeling (GoM) (Becker et al., 2000)
Each guideline clarifies what should be evaluated (and improved) and how. According to (Becker et al., 2000): the guideline of correctness underlines that a model should be syntactically correct (i.e. consistent and complete with respect to the meta model the model is based on) and semantically correct (i.e. the structure and the behavior of the model are consistent with the real world, which supports consistency between different models). The guideline of relevance is about selecting a relevant object system, taking a relevant modeling technique or configuring an existing meta model adequately, and developing a relevant (minimal) model system. The guideline of economic efficiency acts as a constraint on all the other guidelines and restricts, for example, the correctness or the clarity of a model. The guideline of clarity means a model should be readable, understandable and useful for both the modeler and the other people who deal with it. The guideline of comparability is about using all the guidelines in a modeling project consistently. Finally, the guideline of systematic design postulates well-defined relationships between information models that belong to different views. Using syntactic rules besides GoM is a further means for improving the quality of models.
Kesh’s Proposal: Most of the frameworks and approaches that have taken the issue of quality in modeling into account present a set of guidelines on how to do the evaluation.
According to (Moody, 1998), most of these frameworks “rely on experts giving overall
subjective ratings of the quality of a data model with respect to the criteria proposed”.
However, there have been researchers that addressed the quality evaluation by proposing
frameworks for measuring quality in modeling with the purpose of reducing subjectivity (e.g.
Kesh, 1995; Moody & Shanks, 2003). Kesh’s (1995) proposal can be seen in Figure 8. It comprises a model for evaluating the quality of E-R models, relevant metrics and a relevant methodology. The model is developed based on the Artificial Intelligence and Software Engineering disciplines.
[Figure 8, a diagram omitted here, decomposes QUALITY into ONTOLOGY (comprising STRUCTURE and CONTENT) and BEHAVIOR (comprising usability (user), usability (designer), maintainability, accuracy and performance).]
Figure 8: The final framework (Kesh, 1995)
In Kesh’s model, quality is measured based on the Behavior and Ontology of the model. Quality of Ontology is calculated based on the Completeness, Cohesiveness and Validity of CONTENT. Quality of STRUCTURE is calculated based on the Suitability, Soundness and Consistency of the relations between the model statements. In addition to the quality of Ontology, it is required to know on what factors the quality of Behavior depends. The criteria for calculating the quality of Behavior are Usability (User), Usability (Designer), Maintainability, Accuracy and Performance of the E-R models.
The equation Q = w1·s1 + w2·s2 + w3·s3 + w4·s4 + w5·s5 shows how to calculate the quality of a model, where w1, …, w5 are the weights of the behavioral factors (usability (user), usability (designer), maintainability, accuracy and performance) and s1, …, s5 are the scores on these factors. Both the weights and the scores have to be estimated in order to calculate Q.
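Kesh's equation is a plain weighted sum, as the short sketch below shows. The weights and scores are made-up estimates for illustration only; in Kesh's method they would have to be estimated for the E-R model under evaluation.

```python
# Kesh's quality score Q = w1*s1 + ... + w5*s5 over the five behavioral
# factors. Weights and scores below are invented estimates.

behavioral_factors = ["usability (user)", "usability (designer)",
                      "maintainability", "accuracy", "performance"]
weights = [0.3, 0.1, 0.2, 0.3, 0.1]   # w1..w5, assumed to sum to 1
scores  = [4.0, 3.0, 3.5, 4.5, 4.0]   # s1..s5, estimated per factor

Q = sum(w * s for w, s in zip(weights, scores))
print(f"Q = {Q:.2f}")   # Q = 3.95
```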
Shanks and Darke’s Proposal: Developing quality evaluation frameworks is not always
done from scratch. An example of this is Shanks and Darke’s (1997) proposal, which is a
composite framework for evaluating quality of conceptual models. This composite framework
is built on and is an extension of frameworks proposed in (Krogstie et al., 1995) and (Moody
& Shanks, 1994). Krogstie et al.’s (1995) framework, as described above, is based on semiotic theory, but at a high abstraction level that makes it difficult to apply. On the other hand, Moody and Shanks’s (1994) proposal is about evaluating the quality of models in practice, but is not based on a sound theoretical foundation. Shanks and Darke’s (1997) composite framework integrates the two frameworks and formalizes the links between them and how they inform each other.
This is done using components that are similar in both frameworks, which are explored as a means of integrating the two:
Audience and Stakeholder: as Audience in Krogstie et al.’s (1995) framework is more general than Stakeholder in Shanks and Darke’s, Audience is selected.
Goal, Property and Quality Factor: “All quality factors in Moody and Shanks’s (1994) framework, i.e. correctness, completeness, understandability, flexibility and simplicity, can be mapped to Goals and Properties in Krogstie et al.’s (1995) framework.” In order to be able to discuss alternative models, Goal, Property and Quality Factor are kept separate in the composite framework.
Activity and Strategy: the Activity concept from Krogstie et al.’s framework can cover both.
Model: on the one hand, Model in Krogstie et al.’s (1995) framework refers to any model; on the other hand, Moody and Shanks’s (1994) framework supports evaluating all conceptual models. Consequently the Model concept is kept and used in the composite framework.
Components that are disjoint can also be incorporated into the composite framework. Applying this framework supports quality assurance in both the modeling product and the modeling process.
3.2.2 Quality Evaluation of Modeling Processes, Languages, Methods and
Methodologies
In studying the topic of quality evaluation in modeling, the concentration has mainly been on the quality of models rather than on the processes and tools of a modeling process. “Though some have written about detailed stages in and aspects of ways of working in modeling, i.e. its process or procedure, the detailed how behind the activity of creating models is still mostly art rather than science” (van Bommel et al., 2008). Despite the smaller number of contributions in this category, we present examples of this set as follows:
Application of Ontologies for Evaluating Modeling Languages: One approach that has been followed for evaluating modeling languages is to apply ontologies. To shed more light on this, we mention a few such contributions here. Opdahl and Henderson-Sellers (1999) conducted a series of studies using ontologies for evaluating OO Modeling Languages (OOMLs). They applied the same ontology used by Fettke and Loos (2003a; 2003b), i.e. BWW, to evaluate the Object Modeling Language (OML) meta-model, with the aim of drawing conclusions for improving the OOML meta-model. As OML is a good platform for studying OOMLs, the results obtained in this work could be generalized to other OOMLs. Consequently, a framework for evaluating OOMLs resulted from this evaluation work. Details about the ontological evaluation of OML were later published in (Opdahl et al., 2000b). Moreover, Opdahl and Henderson-Sellers (1999) state that BWW was used for analyzing aggregation mechanisms in OOMLs, the results of which were later presented in (Opdahl et al., 2000a).
QoMo: Quality of Modeling (QoMo) (van Bommel et al., 2008) is a framework that takes process quality into account in addition to product quality; it supports quality measurement of process modeling. The outline of QoMo is based on knowledge state transitions, activity costs and a goal structure for modeling activities (usage goals, creation goals, validation goals, argumentation goals, grammar goals, interpretation goals and abstraction goals). The goal structure is directly linked to the main concepts of the SEQUAL framework and expresses the different quality notions. QoMo entails a comprehensive set of modeling-process goal types that depend on strategy descriptions and means for process-modeling description. The process descriptions are based on strategy descriptions (usage strategy, creation strategy and validation strategy), which may be used descriptively, for studying and analyzing real instances of processes, as well as prescriptively, for guiding modeling processes. The development of QoMo is an ongoing project and more details will become accessible after its completion.
An Evaluation Framework Based on the Analytic Hierarchy Process (AHP): Ssebuggwawo et al. (2009) have presented a method for evaluating EM sessions, which they describe as an evaluation approach for modeling processes. As the main purpose of the work is understanding and evaluating the modeling process, its emphasis is on the quality of four artifacts that are used (modeling language, medium-support tool and modeling procedure) or produced (modeling product) during a modeling process. Quality analysis and evaluation of these artifacts provides a way of measuring the quality of the modeling process. The approach is based on AHP, which is borrowed from the Operations Research field. AHP is a technique that helps in making complex decisions by integrating both subjective and objective opinions as well as individual and group priorities, and by combining deterministic and stochastic elements to find out interdependencies between models. With the help of AHP one can deal with the bias that modelers and evaluators might have towards evaluation criteria. Applying the proposed approach helps in taking everyone’s judgment into account, and the overall priority is aggregated as a group decision. It serves as a basis for deriving adequate, theoretically sound and quantified quality criteria for the modeling process using the AHP method. The evaluation approach is comprised of three steps: Structural Decomposition, Comparative Judgment (which itself consists of pairwise comparison, formation of a comparative matrix and priority vector, and checking consistency) and Synthesizing. The Structural Decomposition step is about identifying the real problem and then decomposing it into a hierarchical structure. The Comparative Judgment step aims at establishing (local) priorities at each level by comparing, pair-wise, each criterion, sub-criterion, etc., in the lower hierarchy levels in order to find out the priority of each. The outcome of this step is a comparative matrix whose entries are the comparison values between each pair of criteria. Finally, Synthesizing consists of determining the overall rating and ranking of alternatives.
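The Comparative Judgment step can be illustrated with a minimal AHP sketch: a pairwise comparison matrix over three invented criteria, a priority vector computed with the common column-normalization approximation, and a consistency check. The matrix values are made up, and the random index (RI = 0.58 for n = 3) follows standard AHP practice rather than any data from Ssebuggwawo et al.

```python
# Minimal AHP sketch: pairwise comparisons, priority vector, consistency.
# A[i][j] states how much more important criterion i is than criterion j;
# the matrix is reciprocal (A[j][i] = 1 / A[i][j]). Values are invented.

n = 3
A = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 3.0],
     [1/5, 1/3, 1.0]]

# Priority vector: normalize each column, then average across each row.
col_sums = [sum(A[i][j] for i in range(n)) for j in range(n)]
w = [sum(A[i][j] / col_sums[j] for j in range(n)) / n for i in range(n)]

# Consistency: lambda_max from A.w, then CI and CR (RI = 0.58 for n = 3).
Aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
lambda_max = sum(Aw[i] / w[i] for i in range(n)) / n
CI = (lambda_max - n) / (n - 1)
CR = CI / 0.58

print([round(x, 3) for x in w])   # local priorities, summing to 1
print(round(CR, 3))               # CR < 0.1 is conventionally acceptable
```

In the Synthesizing step, such local priority vectors from different hierarchy levels would be combined into an overall rating and ranking of the alternatives.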
3.3 Conclusions of the Chapter
According to section 3.1.1, EM is a field that helps in analysis and improvement of
organizations. This is done by developing models showing the current (AS-IS) and future
(TO-BE) states of an enterprise, which should be analyzed to identify change needs and change
measures. As explained in section 3.1.2, in order to develop enterprise models, a suitable
EMM should be applied. On the other hand, quality is a broad notion whose meaning has to be clarified to ensure a unified understanding of it. The topic of quality has been studied
in different fields, including IS, and different quality models have been proposed (see section
3.1.3, Table 3). In these proposals, the notion of quality is operationalized by specifying its
different aspects, i.e. defining a set of criteria for it. As seen in Table 3, among different
criteria for quality, efficiency is mentioned as an aspect of quality in all quality models
proposed in IS. This shows that efficiency is an effective factor in IS quality.
In an EM process, the quality of the models and of the modeling process are important issues. As explained in section 3.2, numerous contributions regarding the quality of enterprise models and modeling processes have been developed. While some research concentrates merely on quality evaluation of models and modeling processes, there have also been contributions on how to improve the quality of models. Quality evaluation of a modeling process might include assessing different subjects such as the quality of modeling languages, the quality of modeling methods, the surrounding environment, etc. An EMM is the relevant tool for applying and receiving support from EM. An EM process, like other types of processes, is required to be efficient, and the selected EMM should fulfill this need. Accordingly, investigating whether the EMM is suitable for this purpose is a fundamental step in an EM process (or, in fact, in the EMM application process). However, the issue of efficiency evaluation in EM has not been investigated yet. Due to the importance of this topic in EM and the fact that it has not been addressed yet, it was found necessary to begin research on it. The research results are presented in chapter 5.
4. Enterprise Modeling Case Studies
In this chapter, the research projects and EM cases that were used as case studies for
conducting the present research are described. To cover this, section 4.1 presents the rationales for selecting these particular projects and EM case studies. Sections 4.2 and 4.3 elucidate the details of the selected projects and the EM case studies. The EM case studies helped in identifying some problems that might occur in EM projects. The identified problems can be seen in section 4.4. The chapter concludes with section 4.5, which summarizes the problems explained in section 4.4 and the need for performing the presented research work.
4.1 Rationales behind Selection of the Cases
The selection of the projects, and of EM case studies from them, was based on a number of rationales, which are presented in this section. The EM case studies helped in identifying some problems in EM cases, and the same case studies were followed for developing the research contribution; in fact, they formed the background of this research. For this reason, and also for simplicity, in the rest of this thesis the EM case studies are referred to as “background cases” or simply “cases”.
Diversity: The observed EM cases were selected from the infoFLOW-2 research project (section 4.2) and the EM Course Spring 2012 (section 4.3). The infoFLOW-2 project (henceforth infoFLOW-2) and the EM Course Spring 2012 (henceforth EM Course) differ from each other in several aspects, ranging from the involved modelers and domain experts to the type of cases and the followed approaches. Working with cases from different projects, selected with respect to their differences, enhanced the possibility of identifying more problems. Cases from different projects could still have problems in common: a problem that is vague in one case may be clearer in another case, possibly from a different project, and a problem that is obvious in both underlines its importance and makes it more likely to be taken into account. Problems that are visible in only one of the cases would also be identified, which contributes to finding additional problematic issues, especially those that occur less frequently. For all these reasons, cases from different projects were selected to identify more problems, in terms of both number and accuracy. Table 5 summarizes the differences between the projects.
Table 5: Comparison between infoFLOW-2 and EM Course
InfoFLOW-2 EM Course Spring 2012
Modelers 3 Experts 5 Novices
EMM IDA EKD
Number of Sessions 1 5+
Purpose Actual improvement in the enterprise Education
Type of the Investigated Cases Real Dummy
Approach Participative Consultative
The Thesis Author’s Involvement Expert modeler Tutor
Involvement of the Author: Another reason for selecting the mentioned projects was the involvement of the author: in infoFLOW-2 as an expert modeler and in the EM Course as a tutor. Having a role in the EM cases not merely as an observer but as an active participant helped in obtaining more pervasive knowledge, which facilitated the observation process. The selected projects were carried out by the Information Engineering research group at Jönköping University (School of Engineering). infoFLOW-2 required carrying out several EM cases; the author of this thesis was involved in the project and participated in a number of them, in some conducting the modeling session in collaboration with other modeling experts and in others acting as the only modeling expert. In the EM Course, the author was involved not as a modeler but as a project tutor, supervising students in their efforts towards completing their course projects. By selecting cases in which the author was involved as an active member, it became possible to gain a rather clear image of the details of the cases, from the followed EMMs to the discourse domain. It also provided the basis for a more accurate understanding of the modeling process, such as what people refer to during the discussions and what different parts of the models mean.
Accessibility of EM Results: The observed projects (and cases) were conducted by the Information Engineering research group at Jönköping University, School of Engineering. As a result, access to the output models from the different stages of the EM process was possible. The set of developed models contained different sorts of items, varying from draft models developed in the modeling sessions on plastic sheets (see Figure 9) to finalized models presented in the delivered reports (see Figure 10, the finalized version of the model in Figure 9). Access to different versions of the models was helpful for understanding the outputs of the modeling sessions more appropriately: a point that was missing or not clearly presented in one version could be revealed in later versions. In addition, access to various versions of an enterprise model was a useful means even for studying the EM process.
The points stated above were the major rationales for selecting the EM Course and infoFLOW-2 as background projects (and cases) for observation. This helped in deciding on the relevant research questions and in developing results to address them.
4.2 infoFLOW-2
This section starts by presenting an introduction to the infoFLOW-2 project in 4.2.1. The project required performing several EM cases, and one of them was selected for the purposes of this thesis. Details of this case are given in 4.2.2.
4.2.1 Introduction to infoFLOW-2
This explanation is written based on the project proposal (Scientific Project Plan, 2009) and
final report (Final Report for infoFLOW-2, 2012).
“The infoFLOW-2 project is a continuation of the KK-Foundation funded project infoFLOW.
The problem addressed is information overload in industrial enterprises, which results in
inefficient work processes, quality problems and unnecessary costs. The industrial demand is
to find organizational and technical solutions, and best practices for information flow, which
can be reused in different enterprises. The approach taken was to develop information demand
patterns capturing organizational knowledge for information flow” (Final Report for
infoFLOW-2, 2012). The project ran from 2010-04-01 through 2012-03-31 and was carried out in cooperation between Jönköping University (School of Engineering), SYSteam, c-Business, Proton Finishing, Centrum för Informationslogistik (CIL) and Fraunhofer ISST (ibid).
According to (Scientific Project Plan, 2009), infoFLOW-2 is relevant to the field of Computer
Science. More precisely, it aimed at contributing to Information Logistics, Organizational
Knowledge Modeling and Pattern Use. The core problems had been addressed in the preceding infoFLOW project and imposed the need for further research, i.e. conducting infoFLOW-2. The aim of
infoFLOW-2 was to address industrial demands of enterprises and IT consultants developing
solutions. The project contributed in different ways, such as:
contributing to the area of EMMs, by improving the existing methodology component
for information demand analysis;
advancing the field of EMMs by contributing with practices for information demand
modeling;
contributing to the area of knowledge patterns by advancing the concept of
information demand patterns;
contributing to the areas of organizational knowledge management and EM by
evaluating the effects of information demand pattern use;
contributing to the information logistics community at large by initiating and
establishing a community on the web, gathering interested researchers for purposes of
information demand pattern validation, collection and development (ibid).
Consequently, completing the project resulted in achievement of the following objectives:
development of a large number of information demand patterns for the application
areas covered by the project partners (pattern factory);
evaluation of the benefits of information demand patterns in practical use (compared
to conventional improvement projects and compared to changes in the previous
situation);
establishment of a pattern community on the web for pattern distribution, retrieval
(repository) and continuous improvement;
improvement of the pattern development method in terms of ease-of-use and the
pattern representation in terms of visual quality;
spreading knowledge about pattern development and use (Scientific Project Plan,
2009; Final Report for infoFLOW-2, 2012).
To reach the above stated contributions and objectives, various types of activities such as
meetings and modeling workshops (sessions) were pursued. “infoFLOW-2 meetings with all
partners were arranged on average 4 to 5 times a year. The major tasks of the meetings were
discussion of activities, presentation of results and decision about future work planning.
Additionally, many modeling workshops, interviews or work package specific workshops
were performed” (Final Report for infoFLOW-2, 2012).
During the modeling workshops and the work package specific workshops (also called modeling cases),
the attendants worked on identifying the information demands of a given role. For this
purpose, they selected different roles at different enterprises and identified their information
demands. The EMM used in this project and its modeling sessions was Information Demand
Analysis (IDA) (Lundqvist et al., 2012). This EMM is developed specifically for modeling
information needs of a role in an enterprise. The author of this thesis was involved in a
number of meetings and workshops and acted as a modeling expert in a number of modeling
sessions, in some cases as the only modeling expert and in other cases in collaboration with
other modeling experts.
4.2.2 Observed Modeling Case/Workshop in infoFLOW-2
As mentioned above, a part of this project was to conduct modeling workshops. A modeling
workshop was done with the purpose of covering the whole of, or a fragment of, a particular
case. From the different modeling cases where the author attended as a modeling expert, one
was selected as a case study for this thesis. In this section, the selected case is briefly
introduced. The explanation covers which role and which enterprise were selected for
information demand analysis and how the sessions went. The explanation is based on (Carstensen et
al., 2012c).
Application Case Order Planning: Proton Group4 is a group of several companies,
including Proton Finishing5 . Proton Finishing works with surface treatment in different plants
in Sweden and Denmark. To do its business, Proton Finishing needs to maintain
communication with its customers, which is done by the role “order planner”. “The purpose
with order planning case is to gain a better view on the information demand that the role of
planner has. The planner has to optimize the process between receiving the goods and starting
the actual manufacturing where the surface treatment is done. In this process several
information objects are handled by different stakeholders and by gaining a better knowledge
about this process it is possible to make it more efficient” (ibid).
According to (ibid), two domain experts and three modeling experts (including the author of
this thesis) participated. Although the expectation was to include domain experts acting as
order planners at Proton Finishing, alternative participants holding other roles joined the
session: one QA & environment manager and one logistics & purchasing responsible.
Besides including the (alternative) domain experts, the model from an EM case that had been
completed in an earlier session was used. That session, however, was documented only later.
The modeling participants started the session by giving a brief introduction on their role in the
4 www.proton.se
5 www.proton.se/en/finishing
relevant organization. As a complement to this, the domain experts gave a presentation about
the enterprise and specifically the domain of discourse. After concluding that all participants
had the minimum knowledge required about the case, the IDA modeling started. Since models
are subject to constant change during a modeling session, the modeling was done on a plastic
sheet using post-it notes, which helped in following a convenient modeling process. To ensure
that the “draft” model would not be spoilt, photos were taken of it and later put
together to make a coherent image (see Figure 9).
Figure 9: Draft of information demand model for the application case order planning (Carstensen et al., 2012c)
The captured photos and the plastic sheet were reviewed, refined and transformed into
electronic form (see Figure 10).
Figure 10: Final information demand model for application case order planning (translated from Swedish)
(Carstensen et al., 2012c)
4.3 “Enterprise Modeling (EM)” Course
This section starts by elaborating what the EM Course is about in section 4.3.1. The course
covers different activities, from theory to practice. The main practical part of the EM Course
was working on a course project. Section 4.3.2 contains explanations about the project group
(team) that was observed as a case study for the purpose of this thesis.
4.3.1 Introduction to “Enterprise Modeling (EM)” Course, MSc Level
EM is a course that is included in the syllabus of the master level programs, IT, Management
and Innovation (Programme syllabus IT, Management and Innovation Two Years, 2012;
Programme syllabus IT, Management and Innovation One Year, 2012) and Information
Engineering and Management (Programme Syllabus Master of Science in Informatics,
specialization Information Engineering and Management, 2012) at Jönköping University
(Högskolan i Jönköping: HJ). According to (Course Syllabus Enterprise Modeling, n.d.) by
completing this course the student will be able to use EM in different problem situations, such
as organization development, information system development, standardization of enterprise
processes, etc. Also, they will be able to develop models using the EKD method (Bubenko et
al., 1998) and, finally, obtain knowledge that supports learning various other modeling methods. The EM
course aims at giving students knowledge and skills for developing and analyzing conceptual
enterprise models. For that reason, the course introduces students to various topics
such as:
“Organizations and information systems, information system requirements
engineering
Change management and reengineering
Enterprise Modeling Methods, languages and modeling processes
Enterprise Knowledge Development (EKD) method. The EKD modeling language and
the participative modeling process
Quality issues of Enterprise Models
Other Enterprise Modeling approaches and languages (e.g. business use cases, EPC)
Enterprise Modeling tools (e.g. METIS) and the use of simple drawing tools to
support modeling (e.g. Visio)
Enterprise Modeling and information system development, requirements engineering,
agile development
Reuse of knowledge captured in Enterprise Models. Organizational patterns, task
patterns and pattern creation process
State of the art research direction in Enterprise Modeling” (Course Syllabus Enterprise
Modeling, n.d.).
The teaching forms selected for covering the listed topics are lectures/seminars and
completing a course project (ibid). Although it is not possible to cover all the above-mentioned
topics in depth, in practice the teachers attempted to spread the effort over the major parts.
Accordingly, the focus was on developing models for a specific case, followed by analysis and
identification of the change
needs and change measures. To work on the course project, students had to form groups, and
each group had to develop enterprise models using EKD for a particular case. The selected
EM case was based on a scenario for a fictitious company called Dressed for Success (DfS)
(Seigerroth, 2012). In the course project the students were asked to develop five (out of six)
sub-models of EKD: Goals Model (GM), Business Process Model (BPM), Business Rules
Model (BRM), Actors & Resources Model (ARM) and Concepts Model (CM). These models
were later used to investigate the AS-IS state of DfS and, in view of that, to identify change
needs and change measures.
4.3.2 Observed Modeling Case (and Its Workshops) in EM Course
Of the activities defined for the course, the course project was the relevant activity to
follow in order to prepare the basis for this thesis. To carry out the course project
of the EM Course, students were divided into groups of 5-6 members and followed the EKD
user guide (Bubenko et al., 1998), lecture slides and other course materials. The course
schedule included five supervised sessions that project groups were supposed to attend,
to continue their modeling task and discuss their questions with the project tutors. The
supervised sessions of the selected groups were conducted in parallel with other groups. As
the project tutor in the course, the author had access to the models developed by all groups
and was in charge of answering students’ questions. Although the observation was
concentrating on the selected groups, questions posed and models developed by other groups
were not neglected. Students received even more detailed feedback on their work in a pre-
final and a final tutoring session with the project tutors. However, they were told that they should
arrange extra sessions to be able to cover the whole task. More explanations on the course
project are presented in line with section 4.3.1.
As support to this particular research and to see how students work in a group, one specific
group was selected and their modeling sessions (both supervised and extra sessions) were
video recorded. The selected EM group (team) consisted of five MSc level students. These
videos were later reviewed to find any problems that occurred. On the other hand, questions
and issues that members of other projects groups faced during their work, were also taken into
account. Four samples of EKD models that the observed group developed and included in
their final report can be seen in Figures 11 to 14. In the caption of each figure, the first part is
the title of the figure. The second part of each caption is a short explanation on whether the
model is from acceptable quality.
Figure 11: A sample “Business Process Model” developed in the EM Course. The model is developed according to the
EKD syntax for BPM (Business Process Models); the given labels are understandable.
Figure 12: A sample “Goals Model” developed in the EM Course. The placement of model components is proper.
Figure 13: A sample "Goals Model" developed in the EM Course. Components of the model are placed in a disordered manner.
Figure 14: A sample "Business Process Model" developed in the EM Course. The typed labels are difficult to understand
and the syntax of BPM is not followed correctly.
4.4 Identified Problematic States in EM Projects
By observing background cases from infoFLOW-2 and EM Course, some problems were
identified. Identification of these problems was rather straightforward. They were indeed
obvious issues that occurred during the EM processes, and identifying them did not require
putting in much effort. After detecting the problems, they were categorized into groups.
Elaborations on problems in each group contain examples and references to sample models
from the background cases (that are made known in section 4.2 and section 4.3) and/or
statements from chapter 5. To prevent redundancy, statements relevant to both the identified
problems and research results (that are presented in chapter 5), are not repeated here. In the
following, the identified problems and problem categories are presented:
Low-Quality Enterprise Models: EM, EA and BPM are areas that for a long time have been
part of a tradition where the mission is making improvement in enterprises (Harmon, 2010).
Enterprise models as the results of an EM process are usable for such an improvement goal. A
challenge during EM is quality of enterprise models (Persson & Stirna, 2009). The quality
issue of models has been investigated by different people and several works have addressed
this issue (e.g. cf. Batini et al., 1992; Becker et al., 2000; Fettke & Loos, 2003a; Krogstie et
al., 2006). Although different authors have approached this issue differently, there are matters
that most of the works have taken into account while studying the quality of enterprise
models. Understandability of models, as an aspect of model quality (Moody & Shanks, 1994;
Frank, 2007), refers to the fact that an enterprise model should be found understandable by
the model user and model developer alike (Becker et al., 2000). Judgment of the
understandability of models is subjective (ibid), i.e. it varies from person to person depending
on her/his capabilities. Not only the understandability of models but also their correctness is
underlined as another aspect of model quality (Moody, 1998; Ssebuggwawo et al., 2009), which means
an enterprise model should reflect reality in a correct way and not convey the wrong
message; in short, it should be semantically correct (Lindland, 1994). Another sort of
correctness of models is conformance to syntax of the followed modeling language, i.e.
syntax quality (Krogstie, 2012a). After all this, it is also expected that enterprise models are
relevant and usable for the intended improvement work (Becker et al., 1995; Krogstie,
2012a).
In the observed EM cases, there were several occasions when low-quality enterprise models
were produced. Sometimes the models were not understandable by other people who were
going to review the produced models, such as EM teachers that had to guide the students in
their project work, tool workers that were going to implement the draft models in computers
or those that were going to interpret the models and make decisions. For example, some
models were hard to read and follow. A cause of this problem was poor presentation of the
models, which was itself a combination of different problems. Disordered placement of model
components (see Figure 13 as an example of a Goals Model whose components are placed in a
disordered way; compare with Figure 12, where the placement of components is ordered)
was a problem that made the models difficult to understand. In addition,
texts and labels that are not easy to comprehend (see Figure 14 as an example of a Business
Process Model where the labels of its model components are confusing and not convenient to
understand, especially the “Information/Material” components; compare with Figure 11,
where elements have understandable labels) also made the models inconvenient to
comprehend. It even happened that modelers developed models by following the
EMM syntax incorrectly (see Figure 11 as an example of a Business Process Model matching
the defined syntax; compare with Figure 14, which violates the relevant syntax in different
places). This was mainly about having an incorrect understanding of different EMM parts and,
as a result, using them wrongly.
Occurrence of these problems resulted in low-quality models, which at some points even
imposed the need to repeat the modeling process.
Incorrect Understanding of the EMM: An EMM or a modeling language is a tool used for
developing enterprise models. Understandability of an object-oriented modeling language
(OMG, 1997), as a dimension of its quality, can be generalized to all types of modeling
languages, including EMMs. Tools used for developing models are required to be
understood by users. An EMM, as the relevant tool for developing enterprise models, is
comprised of different parts (see Figure 5). There is a need to have a correct understanding of
different parts of the EMM: Perspective, Method Component (Procedure, Notation and
Concepts), Framework and Cooperation Principles.
The issue of incorrect understanding of the EMM was found in both observed cases and
regarding different EMM parts. In the following, a few examples of this are presented. These
problems were significant and were identified immediately. To apply the EMMs, it was
necessary to ensure they were found helpful by the modeling team. Misunderstanding about
Perspective, and how an EMM can support a purpose, was a problem occurring prior to or
even at the beginning of an EM process (e.g. Statement 3). After confirming the suitability of the
EMMs’ Perspectives, their Frameworks had to be followed. Uncertainty about how to perform
the work, i.e. which phases have to be completed and how, was a notable problem in
applying the EMMs and conducting the EM cases. An example of this can be seen in
Statement 10. Such conditions were misleading or confusing in the process. To
perform the phases of the Frameworks that aim at developing models, the use of Method
Components was necessary. This method part was perceived wrongly in different ways. There
were occasions when modelers were going to apply an irrelevant Method Component (e.g.
Statement 15). It also happened that constituent parts of Method Components were
understood incorrectly. For example, team members were confused about how a notational
constituent (an element or relation) can be used (e.g. Statement 24) or how to differentiate
between different meanings and Concepts (e.g. Statement 29). There were also moments
when team members were not on the right track in investigating the enterprise and extracting
relevant information about it, which is the topic of Procedure. Examples of this occurred
when the team members had not learned the method Procedure correctly (e.g. Statement 17).
As can be seen, the identified problems are relevant to Perspective, Framework and Method
Component (Procedure, Notation and Concepts), but no problem relevant to Cooperation Principles is
stated here. This does not mean that in applying the EMMs no problem regarding this part
occurs. Rather, this was merely how it went in the observed EM cases.
Incorrect Understanding about the Case: To start studying an enterprise and construct
improvement actions, one should have an understanding of the state of the enterprise. “We
should inquire the current situation. When the current state is clarified, it is possible to begin
the design of future solutions” (Goldkuhl & Röstlinger, 2010). This is mainly about which
enterprise components are related to each other in the domain under study and how things
go in the enterprise. This is, however, not an easy process, since a part of it is about
transforming tacit knowledge into explicit knowledge or transforming subjective knowledge
into objective knowledge (Wickramasinghe & Mills, 2001). “Tacit knowledge exists in the
human’s mind, which is the knowledge that people don’t know; in other words people don’t
know what they know” (Allee, 1997); they even do not know what they want. If the process
of extracting explicit knowledge results in producing incorrect output, the audience will
probably get an incorrect perception on the enterprise.
In the performed cases, it frequently happened that the modeling team members had doubts
about a part of the discourse domain. They were not sure of how two elements were related to
each other or how an issue was handled. For example, members had doubts about what tasks a
role should complete (as an example see Statement 20 together with role Final Planer in
Figure 10) or what is needed as input for accomplishing a task (as an example see Statement
21 and Statement 31 together with role Goods Receiver in Figure 10). They realized that there
were issues in the enterprise that they had not been aware of. It was also seen that people had
different or even contradictory perceptions about a specific fact in the enterprise. This resulted
in discussions about which idea was correct, or probably more correct.
There were occasions when team members had to make assumptions in order to proceed with the
work. Especially in the EM Course case, the modeling team members did not have access to
the real domain experts and found it necessary to make assumptions. This presented some
difficulties, since they had not clarified their assumptions. As a consequence, it was
difficult at some points to judge whether the models were developed fully based on the DfS scenario
(Seigerroth, 2012) or whether parts of them were based on assumptions (e.g. Statement 23).
Deviation from the Proper Working Path: To reach the objectives of EM, which are assisting
in a better understanding of and a unified view of the enterprise, building new parts of the enterprise, and
providing a model used to control and monitor the enterprise operations (Madarász, 2005),
relevant models have to be developed. During the process of developing enterprise models,
the focus should at any one time be on modeling only one state of the enterprise. The resulting
model should represent either the current or the future state. Different states cannot be included in
one model. If this mistake is made, the process of identifying the shortcomings will
be hindered.
In infoFLOW-2 and EM course the attendants were supposed to concentrate on modeling the
current state of the enterprise, identify the shortcomings in the enterprise and figure out the
change needs accordingly. During the modeling processes, the team members were
sometimes struggling to remain on the correct working path. This was specifically evident
when they were discussing the current and future states of the enterprises. It occasionally
resulted in deviating from modeling the current state of the enterprise and instead
including information regarding change needs (or the TO-BE state) in the AS-IS models (e.g.
Statement 29). Such a deviation made the diagnosis and solution development processes
difficult.
4.5 Summary of the Identified Problems in the EM Projects
The major problems identified in the performed projects are explained in section 4.4. Having
a closer look at these problems, we can divide them into two main groups. The first group,
which really has only one member, contains “Low-Quality Enterprise Models”. This issue is the
potential cause of further difficulties. Enterprise models are developed to be used as input for
improvement purposes, and models that violate any of the quality criteria of enterprise models
cause inconvenience during further work. As a consequence, the audience might get
wrong interpretations of the domain of discourse, which itself might result in making wrong
decisions. Thus, low-quality models impose the need to carry out extra work
to improve them and make them usable. Such extra work varies from minor changes to
even re-modeling the whole case. The second group of problems, which entails “Incorrect
Understanding of the EMM”, “Incorrect Understanding about the Case” and “Deviation
from the Proper Working Path”, affects an EM process (either directly or indirectly). Just like
the first set of problems, this set inflicts the need for extra effort to keep the work process
in line with the requirements and expectations of the modeling team and stakeholders.
Occurrence of the above-mentioned sets of problems in the observed EM cases demonstrated
that figuring out solutions for addressing them is important. Besides this, process quality has a
direct influence on the product’s quality (see ISO/IEC 9126). A similar topic has been
underlined by researchers in the EM field. It has been discussed that EM processes influence
the quality of enterprise models (van Bommel et al., 2008). In an EM project, both the EM
process and the modeling results are of interest. The produced models should be found relevant
and useful by the stakeholders. It is also important that the EM process be performed in a
proper way. This need became a motive for conducting the current research. A solution for
this is to define a set of criteria that should be covered by the EMM. These criteria underline
factors whose fulfillment by the EMM is decisive. An EMM that meets the specified criteria
supports performing the EM process correctly. Such a process, and the EMM that is used to
conduct it, help in developing models closer to the ideal. This idea was developed to deliver
the research results, presented in chapter 5.
5. Efficiency Evaluation in Enterprise Modeling
This thesis was conducted with the purpose of studying the topic of evaluating efficiency (as an
aspect of quality) in EM. The result of this investigation forms the current chapter. Section 5.1
clarifies what the notion of efficiency in EM means. This is followed by introducing an
approach that supports evaluating the efficiency of EMMs in section 5.2.
5.1 Efficiency in EM
An EM process is performed using a relevant tool, an EMM, to develop enterprise
models. Therefore, an EM process can also be called an EMM application process. Such a
process is expected to be of sufficiently high quality. One aspect of quality in an EM process is
conducting an efficient EMM application process. As stated in section 3.1.3, this thesis follows
Kurosawa’s (1991) definition of efficiency:
“Efficiency is used for passive or operational activity, which is usually defined technically so
that the system and its behavior are foreseeable in advance”.
A process should not only lead to “results attainment” but, according to Kurosawa (1991), it
should also “be foreseeable” to ensure the efficiency of the process. If either of these conditions is
not fulfilled, efficiency is violated.
To have an efficient EM process, the relevant EMM should support this aim. Thus,
application of the EMM should result in fulfilling stakeholders’ needs. It should also fulfill a
number of criteria that clarify how the EMM should be, i.e. the EMM should be foreseeable.
We call an EMM that covers these needs (as a requisite for conducting an efficient EM process)
an efficient EMM.
An EMM, as a kind of method, is not a monolithic artifact. It is built from and comprised of
different parts: Perspective, Framework, Cooperation Principles and Method Component,
where Method Component is comprised of Procedure, Notation and Concepts (see Figure 5).
Applying an EMM means working with its different parts. Therefore, it is necessary that the
different EMM parts aid in conducting the EM process. As a result, the foreseeability not
only of the EMM as a whole but also of its constituent parts matters.
In applying an EMM, it should be noted that there are criteria that always have to be fulfilled,
always in the same way, regardless of the current case. There are also criteria whose
fulfillment depends on the ongoing case and varies from case to case. In short, the need is for
an EMM with parts that support efficiency in the “general case of application” as well as the
“specific case of application”. Given all of the above, an efficient EMM is defined as
follows:
An EMM is efficient if the results of the EM process are according to the needs expressed by
the stakeholders and the process defined by an EMM (and each part of it) is performed
exactly according to the criteria for the general and specific case of application.
5.2 An Approach for Evaluating Efficiency of Enterprise Modeling
Methods
As discussed in section 2.1.2, the contribution of this thesis is an artifact. The intended
artifact is an “Approach for Efficiency Evaluation of EMMs (A3E2M)”, which is presented in
this section. This artifact is developed with respect to the notion of efficiency (elaborated in
section 3.1.3), the notion of method (see Figure 5), the state of research in quality evaluation
in the EM field (see sections 3.2 and 3.3) and the problems identified in the background cases (see
sections 4.4 and 4.5). The same cases were later used for developing the results. In the
development of A3E2M, statements from the background cases were used. Each statement is
extracted either from the followed handbooks ((Bubenko et al., 1998) and (Lundqvist et al.,
2012)) or from discussions between the participants of the EM cases. Table 6 contains the list of
participants in the background cases.
Table 6: Modeling participants of the background cases
Organizational Role; Organization | Project | Role in the EM Case | Identification
Student; HJ | EM Course | Modeler | Participant 0
Head of Quality Management; Proton Finishing | infoFLOW-2 | Domain Expert | Participant 1
IT, Logistics and P Manager; Proton Finishing | infoFLOW-2 | Domain Expert | Participant 2
Researcher, Associate Professor; HJ | infoFLOW-2 & EM Course | Expert Modeler | Participant 3
Researcher, Lecturer; HJ | infoFLOW-2 & EM Course | Expert Modeler | Participant 4
PhD Student, Researcher; HJ | infoFLOW-2 | Expert Modeler | Participant 5
Student; HJ | EM Course | Modeler | Participant 6
Student; HJ | EM Course | Modeler | Participant 7
Student; HJ | EM Course | Modeler | Participant 8
Student; HJ | EM Course | Modeler | Participant 9
Student; HJ | EM Course | Modeler | Participant 10
PhD Student; HJ | infoFLOW-2 & EM Course | Expert Modeler | Participant 13
Section 5.2.1 clarifies how the statements are used in developing the results. The
remainder of this section is dedicated to clarifying the structure of A3E2M (in section 5.2.1)
and explaining how to follow this approach (in section 5.2.2).
5.2.1 Structure of the Approach for Efficiency Evaluation of Enterprise Modeling
Methods
In this section the structure of A3E2M is presented. Figure 15 gives an overview of this
structure. A3E2M entails a list of efficiency criteria that an EMM
should cover to be called efficient. It is explained which criteria should be fulfilled for each
method part and how the efficiency of each part can be evaluated.
Figure 15: An overview of A3E2M structure
To do this, the current section is organized as follows: it consists of sub-sections (5.2.1.1 to
5.2.1.4), each covering one EMM part, i.e. Perspective, Framework, Method Component and
Cooperation Principles; sub-section 5.2.1.3 consists of three sub-sections (5.2.1.3.1 to
5.2.1.3.3) relevant to the constitutive parts of Method Component: Procedure, Notation and
Concepts. Each sub-section starts with elaboration on that particular EMM part. This sheds
more light on what is presented in the method notion about each part. This is followed by
defining efficiency criteria for different method parts. The criteria are divided into two main
categories: Criteria for General Case of Application and Criteria for Specific Case of
Application. Criteria in each category are operationalized by explaining exactly what has to be
fulfilled. They are supported by including statements from the background cases. Each
criterion is given an alphanumeric identification. The letter part consists of the initial letter(s)
of the same EMM part and the numeric part shows the numerical position of the criterion in
the EMM part. Some statements are quotations from discussions between the project
participants and the rest are written based on the EMM user guides followed in the
background cases (see (Bubenko et al., 1998) and (Lundqvist et al., 2012)). Each statement
is given an identification. Since efficiency evaluation is indeed checking whether the defined
criteria are covered, each sub-section ends with a list of suggested driving questions. These
questions can be considered as the starting point of the evaluation process. The user is
nonetheless not restricted to this list and can define additional evaluation questions as well.
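The identification scheme lends itself to a simple machine-readable catalogue. Purely as an illustration (the class names, criterion titles and statement mappings below are invented for this sketch, though the ids Pe1 to Pe3 and the two categories follow the text):

```python
from dataclasses import dataclass
from enum import Enum

class Category(Enum):
    GENERAL = "General Case of Application"
    SPECIFIC = "Specific Case of Application"

@dataclass
class Criterion:
    cid: str           # letter part = initial letter(s) of the EMM part,
                       # numeric part = position of the criterion in that part
    title: str         # paraphrased here, not the thesis wording
    category: Category
    statements: tuple  # identifications of supporting statements

# Hypothetical excerpt of the catalogue (Perspective criteria only)
catalog = [
    Criterion("Pe1", "supported enterprise aspects are clarified", Category.GENERAL, (1, 2)),
    Criterion("Pe2", "unsupported enterprise aspects are clarified", Category.GENERAL, (3, 4)),
    Criterion("Pe3", "Perspective matches the team's case requirements", Category.SPECIFIC, (5, 6)),
]

def criteria_for(part_prefix: str, category: Category) -> list:
    """Select the criterion ids of one EMM part (by letter prefix) in one category."""
    return [c.cid for c in catalog
            if c.cid.startswith(part_prefix) and c.category is category]

print(criteria_for("Pe", Category.GENERAL))  # ['Pe1', 'Pe2']
```

Such a representation is not part of A3E2M itself; it merely shows how the alphanumeric identifications separate criteria per method part and per category.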
Table 6 contains the list of modeling participants, their organizations, the role each
individual held in her/his organization and the role in the EM case. For simplicity, each
participant is given an identification, which is used to clarify which statement in the text was
made by which participant. In this table, Participants 1 to 10 are individuals that were involved in the
followed modeling cases. Participant 0 refers to students of the EM course that were members
of other project groups. They were not directly of interest for this research, but the author was
in contact with them as their project tutor. This contact resulted in identification of a number
of issues that they brought to the table as questions. Thus, such statements are also considered
for developing the results. Participant 13 refers to the author that was involved in both
infoFLOW-2 and EM Course. Participants 11 and 12, that are not included in Table 6, are not
relevant to this chapter. They were involved later in validating the contribution.
5.2.1.1 Perspective
Perspective is a part of a method that sheds light on what is important and can be supported
by the method. Although it might seem sufficient to know only this about Perspective,
in order to have a Perspective that supports efficiency, one needs to go into more detail and
specify criteria that should be covered. To develop enterprise models, the modeling team need
to get a clear perception of the enterprise. The Perspective of an EMM clarifies the viewpoints
that can be followed for developing enterprise models using this specific EMM. In an EMM,
Perspective can be stated explicitly, or be implicit in explanations about the other method
parts. It is, however, preferable that it be as explicit as possible. The clearer and more
unambiguous a method Perspective is, the more correct an understanding the audience will have
of it. Consequently, the possibility of selecting a proper EMM will be higher. Efficiency
criteria for general and specific application cases in an EMM Perspective are presented
below:
Efficiency Criteria for Perspective in General Application Cases:
Pe1: Clarification on modeling what aspect(s) of an enterprise are supported by this
EMM: An EMM is developed to help its users in studying an enterprise. This is done by
modeling the enterprise from one or more viewpoints. An EMM Perspective should clarify
how pervasive an EMM is and from what viewpoint(s) the enterprise can be modeled. Hence,
it is required to highlight what aspects of the enterprise can be modeled using this method:
In the EKD user guide, the Perspective is explicitly exposed by Bubenko et al. (1998). As the EKD
user guide states, the purpose of applying this EMM can be formed in a four-item list:
how the enterprise functions currently
what are the requirements and the reasons for change
what alternatives could be devised to meet these requirements
what are the criteria and arguments for evaluating these alternatives. (Statement 1)
According to the IDA handbook, “understanding information demand is very much about provision
of view about individuals that have such a demand”. Although the title of the EMM implies that it
helps in modeling information demand in an enterprise, it is still required to go through the handbook
to get a more detailed perception. (Statement 2)
Pe2: Clarification on modeling what aspect(s) of an enterprise are not supported by this
EMM: In a definition or delimitation it is preferred to clearly stipulate the borderline between
what can and what cannot be covered by the intended meaning. Although this is not
straightforward, fulfilling it is a considerable aid. Assuming that Perspective
in an EMM is a type of delimitation that clarifies the coverage of the EMM, it should be clear
what types of modeling purpose it does and does not support. Users review the method user
guide to find out what the Perspective is and to confirm whether the EMM supports the
current needs. A common mistake is that people, especially those who are novices in EM or a
particular EMM, have unrealistic expectations of EM in general or of the EMM that they are
using:
“- How an EMM like EKD solves problems?
- An EMM does not solve problems for you. It helps you modeling the organization. [Then] you will
investigate the models to see where the problem is.”
(Statement 3; Participant 10, Participant 13)
Hence, it is necessary to make it clear to the audience what an EMM is and for what purpose
it can (and cannot) be helpful.
In addition to this, exposing Perspectives that cannot be supported by the method helps in
preventing confusion and misinterpretation:
“Another important consequent of above is the clear role-based information need that it has. For
example, the difference with process modeling is an activity logic, which is not the point of our
focus, i.e. when we analyze information need, do not care about in what order activities are done,
neither how information objects are manipulated or applied by different activities” (Lundqvist et al.,
2012). Therefore, by clarifying what is included in the coverage of the method, the authors help the
reader to have a correct image of what is and is not supported by this EMM. (Statement 4)
In developing an EMM, different terms and phrases are defined and used. Definitions of each
term, and the meaning it has in a specific context, help the audience to understand the
Perspective and its application field. It is important that the EMM developer and the potential
user have the same understanding of these terms. A part of defining Perspective is
ensuring that all terms used in it are understandable and clear enough. Although it is expected
to see an elaboration on terms used in Perspective, especially similar terms, in practice this is
a topic relevant to Concepts, as it is in the Concepts part that EMM developers concentrate on
elucidating terms and phrases.
Any change in Perspective affects its strength and its coverage. Thus, the expectations from
the EMM require consideration after any modification to Perspective.
Efficiency Criteria for Perspective in Specific Application Cases:
Pe3: Perspective match with modeling team’s requirements regarding the case: An EMM
is a means that helps in developing models of an enterprise. Enterprise models will be
applied for further usage, mainly analysis and investigation of the models with the aim of
making decisions. Depending on what part of an enterprise the modeling team need to
study, a specific type of EMM is useful. An EMM whose Perspective matches the modeling
team’s needs and requirements is more helpful for a case. These needs are usually identified
and agreed upon before working with the EMM. As part of this agreement, the modeling team
and the involved domain experts (including stakeholders) should decide and confirm what
they intend to assess and investigate:
“- We have discussed a lot about what can be done to improve these things, but never got that far that
we started looking to solutions. But I am sure we can look at some problem areas, as I said… we are
very much dependent on that someone provides us with information, what we have to do is to request
it in different ways.
…
- There are a lot of different roles, we have fine planning, a lot of information going out, driver list,
debriefing and
...
- Whom does the planner send information to? What is the next step? Who is dependent on this
person in the manufacturing?”(Statement 5; Participant 1, Participant 2, Participant 3)
In addition to deciding on what needs to be investigated and discussed, it is necessary to
decide on how encompassing the enterprise models should be. That is, to what degree of detail
the modeling should be done, and what parts of the enterprise should be taken into account:
“- If we include order, delivery, transports and all those things here too, the question is how far we
can go before time runs out. It will suddenly be very big, or if we should look until production
planning.
- Kind of limitation [is required].” (Statement 6; Participant 1, Participant 4)
Evaluation of Perspective:
To start the evaluation task, it is necessary to be familiar with the notion of efficiency in EM
and with what a Perspective should be like to be called efficient. This task is mainly about
checking if the defined criteria for the Perspective are covered. This should be done by asking
various questions, each taking points from an efficiency criterion into account. In the
following, a sample list of driving questions is given that the evaluator can use as the starting
point in evaluating Perspective. If there is a need, the list can be modified:
From what viewpoints can the enterprise be modeled using this EMM?
What output models will be gained by covering each aspect of the enterprise?/ What
will the output models contain?
Is it clarified what the strengths and weaknesses of this EMM are?
What image of the enterprise does the modeling team desire to gain?
What further applications are going to be taken based on the enterprise models
(merely analysis, analysis followed by change action suggestions …)?
Is the EMM pervasive enough to cover all needs of the modeling team?
Is the complete coverage of the EMM required for the current case?
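The role of the driving questions as an extensible starting point can be sketched as follows. This is an assumed workflow for illustration only, not a procedure that A3E2M prescribes; the extra question is invented:

```python
# Base list of driving questions (a subset of those suggested for Perspective)
perspective_questions = [
    "From what viewpoints can the enterprise be modeled using this EMM?",
    "Is it clarified what the strengths and weaknesses of this EMM are?",
    "Is the EMM pervasive enough to cover all needs of the modeling team?",
]

def evaluate(questions, extra_questions=()):
    """Build an answer sheet; the evaluator is not restricted to the base list
    and may append case-specific questions before the evaluation session."""
    sheet = {}
    for q in list(questions) + list(extra_questions):
        sheet[q] = None  # to be filled in during the evaluation
    return sheet

sheet = evaluate(perspective_questions,
                 extra_questions=["Does the Perspective cover the logistics flows of this case?"])
print(len(sheet))  # 4
```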
5.2.1.2 Framework
Framework is the backbone of a method and clarifies how its different parts are
related to each other. This is done by showing the phase structure of the method
and how this structure aims at fulfilling the method Perspective. In an EMM, a Framework
shows the phases for visualizing an enterprise respecting the determined Perspective. A
Framework in an EMM should cover some minimal requirements to facilitate the EM process
and support efficiency. Below, criteria for the general and specific application cases for
Framework of an EMM are presented:
Efficiency Criteria for Framework in General Application Cases:
Fr1: Support provision for the Perspective: In a method, the specification of the phase
structure is mainly about how the modeling team apply the Method Component respecting a
set of Cooperation Principles to cover the method Perspective. In an EMM, a Framework
should comprise phases that, when completed, result in meeting the Perspective of the EMM, i.e.
developing relevant enterprise models. The subject of reaching the EMM goals has to be
operationalized by specifying the relevant phases:
“… This is … what you do today generally. Looking at errors and faults versus all the information
about how you would like to do it [and] what we do today, will be the background and controls how
we design.” (Statement 7; Participant 3)
There are three main activities in an EM process: “understanding the enterprise”, “modeling”
and “analysis”. Therefore, Framework phases should be set up to cover these three main
activities. Framework in an EMM is supposed to provide a structure that helps in meeting the
Perspective, which in an EMM is the visualization of an enterprise from a particular viewpoint. If
an EMM Framework covers the three highlighted activities, fulfillment of Perspective is more
probable. Especially coverage of the first two activities, i.e. “understanding the enterprise”
and “modeling” is needed for this purpose:
The IDA handbook presents a Framework showing which phases have to be completed for developing
and presenting IDA models. It is even shown that an IDA process is followed by activities that are
dependent on the developed IDA models. In this way, it is even required to consider Method
Components from other EMMs. All these support the IDA Perspective that is about analysis of
information demands of a role in an enterprise. (Statement 8)
Bubenko et al. (1998) in the EKD user guide present an overall schema showing how an EKD process
starts by studying the AS-IS state, continued by identifying change needs and followed by identifying
the TO-BE state. EKD prescribes developing six different sub-models: Goals Model, Business
Process Model, Business Rules Model, Actors & Resources Model, Concepts Model and Technical
Components & Requirements model. Assuming development of each sub-model is done in the
“modeling activity”, the proposed Framework supports the EKD Perspective, although the
sequence of completing the phases is not clear. (Statement 9)
Fr2: Clear definition of the Framework structure: An EMM Framework contains phases
that, when completed, support the EMM Perspective. To follow this structure, it is
important to be aware of how this phase structure works:
“How [do] we [continue] the work? [Do] you have some suggestion?” (Statement 10; Participant 9)
Users of an EMM need to know how such a structure works. As in a Framework the focus is
on introducing the phases and their connection with each other, we need to see what the
purpose of each phase is and how it is related to the other phases. Accordingly, the current
criterion is divided into two sub-criteria:
Fr2.1: Clear definition of each Framework phase: A Framework phase, as a step of
enterprise model development, should receive specific types of input and give specific types
of output. Clear specifications of what is required for completing a phase (input), what can be
expected by the audience as well as prospective appliers (output), and the aim of the phase are
what the EMM applier needs to know. It is nevertheless not expected to find details in the
handbook about how the goal of each phase is fulfilled. By fulfilling this set of requirements,
the audience will have a clearer picture of what tasks that should be completed in each step:
The IDA method is comprised of six phases: “Scoping”, “ID-Context Modeling”, “Additional
Analysis”, “ID-Context Analysis & Evaluation”, “Representation & Documentation” and “SE & BPR
Activities” (see Figure 16).
[Figure 16 image: the information demand analysis process, showing the phases Scoping, ID-Context Modelling, Additional Analysis, ID-Context Analysis & Evaluation, Representation & Documentation and SE & BPR Activities, their inputs and outputs (e.g. scope & activities schedule, context models, method documentation), and optional supporting Method Components such as interviews, modeling seminars, Enterprise Modelling, Competence Modelling, Social Network Analysis and information demand/supply patterns.]
Figure 16: Overview of information demand analysis process (translated from Swedish) (Lundqvist et al., 2012)
This figure clearly shows what the required inputs are for each phase and what outputs can be
expected from them. For each phase, some inputs are provided by other phases, whereas the rest
should be supplied through other means. For all Framework phases (except “SE & BPR Activities”),
it is clearly shown what the purpose of each phase is, what the input and/or condition for each
phase is and, finally, what its output will be. (Statement 11)
Fr2.2: Clear definition of the relations between Framework phases: Linkage between the
phases of a Framework ensures that the aim of each phase within the whole Framework is covered. To
pursue a successful EM process, it is necessary to have full control over the practice of
completing phases. Therefore, it is necessary to know in what sequence the phases have to be
performed. This imposes the need for suggestions on whether the phases should be completed
in parallel, in sequence or a combination of both. One expects to see what benefits the
different alternatives have relative to each other. The more explicit this is, the clearer image
the audience will have. This issue is significant for the “modeling” activity and especially in
EMMs that support development of two or more sub-models. In such EMMs, it is important
to know which different (sub-model development) phases are related to each other and how.
This way it is possible to figure out where from, other than the enterprise, inputs for the
current phase can be retrieved and where its outputs should be sent. Poor fulfillment of this
criterion might result in confusion about in which a particular aspect of an enterprise should
be modeled:
“Is it ok to work on two plastic sheets [i.e. two different sub-models]? Or she (the teacher) didn’t say
anything?”(Statement 12, Participant 6)
The IDA handbook makes it clear how different phases are related to each other and in what sequence the
phases of “modeling” activity should be performed. This is done by visualizing the IDA process (see
Figure 16) (Statement 13).
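The handbook conveys this ordering visually; as a rough illustration only, the dependency information in such a figure could be captured as a table and a valid completion sequence derived with a topological sort. The dependency table below is an assumption loosely inspired by Figure 16, not a faithful transcription of the IDA handbook:

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Hypothetical dependency table: each phase maps to the phases whose
# outputs it consumes (invented for this sketch).
depends_on = {
    "Scoping": set(),
    "ID-Context Modelling": {"Scoping"},
    "Additional Analysis": {"ID-Context Modelling"},
    "ID-Context Analysis & Evaluation": {"ID-Context Modelling",
                                         "Additional Analysis"},
    "Representation & Documentation": {"ID-Context Analysis & Evaluation"},
}

# A valid completion sequence exists exactly when the dependencies are
# acyclic; TopologicalSorter raises CycleError otherwise.
order = list(TopologicalSorter(depends_on).static_order())
print(order)  # 'Scoping' first, 'Representation & Documentation' last
```

The point of the sketch is only that explicitly stated phase relations make the possible completion orders computable, which is what criterion Fr2.2 asks an EMM Framework to support.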
Efficiency Criteria for Framework in Specific Application Cases:
Fr3: Support provision by Framework to the case: This criterion is not novel and
independent, but is a composition of two non-aligned criteria. This is automatically fulfilled
by covering “Pe3: Perspective match with modeling team’s requirements regarding the case”
and “Fr1: support provision for the Perspective”. The purpose of Framework is providing
support to the EMM Perspective. It is important that the structure of the Framework aids in
developing models that match the modeling team’s requirements. Fulfillment of this requires
covering of two different topics. First, the method Perspective should match the modeling
team’s needs. This subject is to be considered in evaluating the Perspective. An EMM
Perspective should be reviewed to find out whether it helps in getting enterprise models of the
required viewpoint. In continuation, we need to know how the Perspective is supported by the
Framework. Knowledge about the phases comprising the Framework aids in getting a
perception about how the EM results could be achieved and what they will look like. If the
modeling team has specified what they need to see in enterprise models, they will be able to
confirm whether following a specific EMM and its Framework is enough for obtaining their
goals. This knowledge also helps in finding out if all phases need to be completed, or if there
are phases that could be skipped. Accordingly, confirming whether this support is provided
requires completion of the two mentioned evaluations.
Evaluation of Framework:
To evaluate the Framework part, it is required to ask questions both in general and specific
application cases. The following driving questions can be used to start the evaluation process:
What phases constitute the EMM Framework?
What is the purpose of each phase?
What is the required (input) for completing each phase?
How is each phase related to the other Framework phases?
Does the set of phases provide enough support to the modeling team?
What phases of the Framework are necessary to complete for the intended case?
In what order should the Framework phases be completed?
5.2.1.3 Method Component
An EM process starts with “understanding the enterprise”, continues with “modeling” and ends
with “analysis”. Each of the stated activities has a decisive role in the process; however, most of
what is known as an EM process concerns applying the EMM for developing enterprise
models. In practice, what is mostly considered as an EMM application process is using the
Method Component. This method part itself comprises three parts that are closely related to
each other: Procedure, Notation and Concepts (see Figure 5).
In an EMM, Procedure contains questions regarding the enterprise under study, and their
answers will be documented using Notation in the form of enterprise models. Concepts show
what is mutual between the Procedure and Notation and how they relate to each other. In an
EMM, Method Component is the distinguishing part and indicates the type of the EMM. The
Notation part has the main role in this contribution. In a Method Component of an EMM,
Notation is the part that is applied to expose the EM results. Hence, it has the main role in
differentiating between different EMMs.
It is necessary that not only the Method Component but also its comprising parts are efficient.
Accordingly, each of the three mentioned parts should be efficient. In the following,
efficiency criteria for general and specific application cases in an EMM Method Component
are presented. Sections 5.2.1.3.1 to 5.2.1.3.3 are dedicated to presenting efficiency criteria for
the constituting parts of Method Component.
Efficiency Criteria for Method Component in General Application Cases:
MC1: Support provision to relevant Framework phases: In an EMM, Method Component
is the part that should be used for completing the “modeling” activity in an EMM application
process. If the EMM contains more than one Method Component, it provides the possibility to
model the enterprise from different viewpoints. In such an EMM, either all Method Components
might be needed in one Framework phase, or a subset of them might be required in a specific
phase. All the same, it is essential that each Method Component
provides suitable support to the related phase:
In IDA, the phase “Additional Analysis” suggests using EKD, i* and UECML. The authors motivate
their suggestions by explaining that the mentioned EMMs, and in fact their Method Components, help in
providing relevant material to the phase. (Statement 14)
The more a Method Component matches the relevant Framework phase, the better the
purpose of the phase can be fulfilled.
Efficiency Criteria for Method Component in Specific Application Cases:
MC2: Support provision by Method Component to the current application case: When an
EMM is used in a case, it is expected to support visualizing how different elements in an
enterprise are related to each other. There exist EMMs that entail more than one Method
Component, each useful for making models from a particular focal area. In using such an
EMM, all or a subset of the Method Components might be required. The decision about which
Method Components have to be taken into account for the current case is contingent upon
stakeholders’ requirements for the case. If a Method Component provides support that is
not among the modeling team’s needs, there is no point in using it:
“Goals and Problems relevant to IS should be modeled using TCRM [not GM]. Working on TCRM is
not your job; you should skip it”. (Statement 15; Participant 13)
To make this decision, modeling experts should assess stakeholders' needs as well as the
strengths and weaknesses of a Method Component to conclude whether it is suitable for the
case, or not. This even influences the selection of an EMM for a particular case.
Evaluation of Method Component:
Evaluating the Method Component part requires asking questions in general and specific
application cases. For this purpose, different questions can be asked. In the following, examples
of possible questions are presented:
Are there any sample scenarios and are their models provided, showing how the
Method Component fulfills the aims of the phase?
Is it clear what Framework phase that is supported by each Method Component?
To what extent does the Method Component cover the phase expectations?
What Method Components are provided in this EMM?
Is it clarified what is covered using each Method Component?
Is the coverage of a particular Method Component helpful for the stakeholders’
purpose?
5.2.1.3.1 Procedure
Procedure, as a part of Method Component, is about what questions to ask. Procedure is not
always presented in the form of questions; rather it entails explanations and details on the
execution of the part. In any case, the Procedure of an EMM guides what should be considered for
assessing the enterprise and how. “What” refers to the facts that should be taken into account
in asking questions and “how” refers to the formulation of procedural guidelines. Regardless
of whether Procedure is formulated as questions or explanations, it should conform to some
criteria. Below the efficiency criteria for a Procedure in an EMM are mentioned:
Efficiency Criteria for Procedure in General Application Cases:
Pr1: Proper formulation of Procedure: Method Procedure sheds light on how to work and
is the reference for the audience and potential user to learn what to ask. In an EMM, this part
aids in figuring out how to investigate the enterprise and what questions should be
addressed to provide the basis for producing enterprise models. In formulating a Procedure, it
is important that the potential user of the EMM reaches the same understanding as the EMM
developers about the Procedure. For this purpose, different constituting parts of the
Procedure such as procedural explanations and questions should be explained in detail:
“Looking at Notation is not enough here. Please review the text (driving questions) to see how you
[should] make models.” (Statement 16; Participant 13)
Supplying a properly formulated Procedure is a major step in defining a Procedure, whereas
violation of this results in an incomplete understanding of the Procedure and the EMM,
which might cause several unwanted effects on the EMM application process.
Efficiency Criteria for Procedure in Specific Application Cases:
Pr2: Suitability of the Procedure for the case: In an EM case, Procedure is applied by the
modeling team to ask and answer questions on the enterprise and specifically the domain of
discourse. Thus, they need to work with a Procedure that is suitable for the case. A part of
suitability of Procedure is that the contents can be perceived conveniently and correctly. On
an overall level, a modeling team consists of domain experts and EM (EMM) experts with
different experience and knowledge. Each member needs to understand the Procedure, but
they might have different perceptions about understandability and ease of working with the
Procedure, and its comprising procedural questions:
“- Why is it [the IDA model] wrong?
- [In IDA] you should think role-based, not process-based.” (Statement 17; Participant 5/ Participant
13)
Another aspect of the Procedure suitability is determining the appropriate fragments of the
Procedure and its procedural questions (guidelines) that are needed for a specific case. A
Procedure contains several questions and guidelines that can be used. However, this does not
necessarily mean that the whole Procedure should be applied in an EM case. The reason for
this is that some cases are simpler, and investigating them therefore does not
require going through all details:
“You don’t need to think about AND/OR relations between sub-goals and main goals”. (Statement
18; Participant 13)
The simplicity of a case depends on the level of detail required by the modeling team
(especially the stakeholders) and the complexity of relations within the enterprise:
“- Should we show we have four trucks?
- If you have a reason [for showing the figures], you can do this”. (Statement 19; Participant 0;
Participant 3/Participant 13)
Pr3: Availability of reliable information sources about the case: Working with a
Procedure in an EMM means asking and answering questions about the enterprise. To
provide more accurate answers, the most reliable information sources should be used.
Domain experts who are involved in the enterprise can be considered as reliable information
sources. The more experience and knowledge a domain expert has regarding the discourse
domain, the more truthful answers (s)he can provide:
“I’m not that good at planning myself, it’s not my field. Some things are new, the handling of the
revision; it’s not easy ... What does man do when they check the drive plan, Thomas?” (Statement 20;
Participant 1)
“- What did we call that, which goes back there?
- good question, no clue! (Statement 21; Participant 4, Participant 1)
In addition to domain experts, different types of documentation such as organizational charts,
brochures, already developed enterprise models, etc. should be considered as information
sources about the enterprise:
“- We do have some picture on that. Janne, don't we?
- We have these big activities of course. We could check how it looks. From an order to delivery, in a
pure planning view, big activities. Nothing [is] detailed.” (Statement 22; Participant 2, Participant 1)
The contents of documentation might range from information collected on the current or
future states of the enterprise to any other type of information required for processing the
work:
“- Some of the goals [you have in the model] are not in the [DfS] scenario.
- Actually we have made assumptions.
- Then somewhere in your document you should write the assumptions.” (Statement 23; Participant
7, Participant 13)
Accordingly, it is necessary to be aware of the available information sources. Such awareness
can be used for various purposes, including prioritization of the sources.
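One simple way to act on such awareness is to rank the known sources. The sources and reliability scores below are invented for illustration; the thesis prescribes no scoring scheme:

```python
# Invented example data: candidate information sources for a case, each with
# an assumed reliability score between 0 and 1.
sources = [
    ("organizational chart", 0.7),
    ("domain expert: head of quality management", 0.9),
    ("marketing brochure", 0.4),
    ("earlier enterprise models", 0.8),
]

# Prioritize: the most reliable sources are consulted first.
prioritized = sorted(sources, key=lambda s: s[1], reverse=True)
print(prioritized[0][0])  # the source to consult first
```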
Evaluation of Procedure:
To perform efficiency evaluation of Procedure, we should ask questions that aid in checking
whether efficiency criteria for general and specific application cases are fulfilled. To do so,
the following questions might be useful:
Are the different parts of the Procedure clear and identifiable?
Is the Procedure comprised of guidelines or questions?
Is each of the procedural questions elaborated?
How does the EMM expert, as a team member, interpret each procedural question?
How does the domain expert, as a team member, interpret each procedural question?
Do EMM experts’ and domain experts’ interpretations of procedural questions match
each other?
Do the EMM and domain experts’, as team members, find the Procedure
understandable?
What fragments of the Procedure (procedural questions/guidelines) are applicable for
the current case?
5.2.1.3.2 Notation
Notation, as the second part of Method Component, prescribes how answers to the procedural
questions in Procedure should be documented. Notation part of an EMM entails notational
constituents (elements plus relations between them) and guidelines on how to document the
answers in the form of enterprise models. This is the final step in visualizing the real world
elements and the relations between them. A Notation should cover some criteria and
specifications as fundamental requirements. Below, the efficiency criteria for the Notation of
an EMM are presented and elaborated:
Efficiency Criteria for Notation in General Application Cases:
N1: Proper clarifications about notational constituents: Using this method part is the final
step in applying a Method Component and results in visualizing answers to procedural
questions. To work with Notation, i.e. to represent answers to the procedural questions, it is
necessary to become familiar with Notation and its constituents.
In an EMM, the presentation of element and relation types is the initial step in fulfilling this
need. The audience reviews the notational elements to find out “what they look like”.
However, these constituents are presented in an abstract form. Thus, reviewing the textual
clarifications in the EMM user guide is a necessity, too. The clearer and more detailed the
texts are, the more accurate the knowledge the audience will have about the Notation. In
addition, it is even useful to assess sample models. The presentation of sample models helps
in getting to know the Notation and in recognizing how useful the EMM can be:
“- I don’t get it. What is the difference between AND join and OR join.
- Have you read the user guide?
- Yes, but it is not written.” (Statement 24; Participant 0, Participant 13)
N2: Coverage of Procedure by Notation and contrariwise: A Notation is expected to
provide support for documenting answers obtained by following the relevant Procedure. It is
crucial that in an EMM, all element and relationship types that might appear in answers to
the procedural questions can be depicted by the Notation. Also, the audience should be able
to find out which procedural question (guideline) is supported by each notational
constituent. The results of applying a Procedure (answers to the procedural questions) might
contain not only elements and relations between them, but also other constituents such as
comments or attributes. It should be possible to implement the results gained by following the
Procedure using notational elements in the form of models. On the other hand, each
notational constituent should be presented with the purpose of fulfilling a specific need. To be
more precise, each notational constituent should help in the documentation of answers that
will be gained by following the Procedure.
According to all of the above, the expectation is to see a bi-directional coverage between
Procedure and Notation. If this coverage is violated in either direction, it is a sign that the
efficiency of Notation and Procedure (or indeed Method Component) is compromised:
In the phase “ID Context Modeling” of IDA, a number of driving questions are suggested that are in
fact the main part of the Procedure. Answers to these questions can be conveniently documented
using the given Notation. On the other hand, all of the defined notational elements and relations are
applicable in working with the suggested driving questions. (Statement 25)
Efficiency Criteria for Notation in Specific Application Cases:
N3: Suitability of the Notation for the case: When a Method Component is going to be
applied in a particular project, the Notation should be found suitable by the modeling team.
By suitability we refer to two topics: first the Notation should be understandable by the team
members, i.e. they should be able to figure out how to implement the answers using the
notational constituents:
“Could we have one yellow for two [information objects]?”(Statement 26; Participant 4)
“When you have a Problem and its Sub-Problems, the arrow should go from the Problem to the Sub-
Problems, not other way around.” (Statement 27; Participant 13)
This is heavily dependent on the composition of the team. A Notation that is found
straightforward and easy to understand by an EM expert might require more effort by a
novice modeler.
The understandability and ease of learning of a Notation are decisive factors in selecting an EMM.
In addition to this, team members should find the details of the Notation useful and necessary for
their purpose:
“- We don’t have information overflow [sign] in the model.
-Yes… because we did not have any information.” (Statement 28; Participant 13, Participant 5).
The Notation should be found supportive of the modeling team's needs. This, however, does
not mean that each and every element in the Notation has to be applied in the case. After
ensuring a match between the modeling team’s needs and the Perspective, reviewing the
Notation and sample models is a measure for selecting a proper EMM.
Evaluation of Notation:
In the following, some driving questions for facilitating the efficiency evaluation of the
Notation are suggested. These questions can be used to begin the evaluation process:
What types of clarifications about Notation are given?
Are any sample models included in the user guide?
Are all notational constituents considered in the clarifications?
Are all notational constituents required for the current case?
What shortcomings does the Notation have for being applicable in the intended EM
case?
5.2.1.3.3 Concepts
The last Method Component part, i.e. the Concepts, is the cement between Procedure
and Notation that helps in understanding the terms and meanings used in the two other parts.
The coverage of Concepts can even go beyond this and support the understanding of other parts,
especially the Perspective. For a Concepts part to support efficiency, it should conform to
some specifications. Thus, efficiency criteria for this part are presented as follows.
Efficiency Criteria for Concepts in General Application Cases:
C1: Elucidation about conceptual elements: Concepts in a Method Component underline
terms and meanings that are important. Concepts are the cement between Procedure and
Notation, and having a clear understanding of them in an EMM is a requirement for
understanding how Procedure and Notation cooperate in a “modeling” activity. Accordingly,
presenting the Concepts, i.e. highlighting them and explaining their meanings, is a necessity.
The main cause for this need is that the people involved in applying an EMM need to be
able to communicate with each other. For that reason, they should speak the same language
regarding the appointed EMM. In an EMM, there might be Concepts that look as if they
convey similar meanings, but in reality refer to different notions. Such Concepts can
mislead the method applier in various ways, such as mixing up terms, different participants
gaining contradictory perceptions, and including wrong Concepts in the answers:
“Many of these [Goals] are in fact changes.”
“But… isn’t Goal a kind of change?” (Statement 29; Participant 13 & Participant 8)
“Isn’t [Information] “Sequence” the same as [Information] Flow?... When [do] we use Sequence?”
(Statement 30; Participant 13)
Thus, it is necessary not only to include the meaning of Concepts, but also to help the reader
to distinguish and differentiate them.
Although this topic is presented under Concepts, this criterion can be decisive in assessing
other EMM parts as well, especially the Perspective. By this we mean that clarifying the
differences between similar terms in the Perspective is a necessity that in practice should be
fulfilled in the Concepts. This helps the potential appliers of the EMM to have a clear picture
of the strengths of the EMM.
C2: Coverage of terms and meanings used in Procedure and Notation by Concepts and
contrariwise: In the Method Component of any method (including an EMM), there is a strong
relation between Procedure, Notation and Concepts, and these three parts should be in
harmony with each other. Criteria concerning the relation between Procedure and Notation in
an EMM were already discussed under Notation. In addition to this, it is required that the
relations between Concepts and Procedure, as well as Concepts and Notation, be in an
appropriate form. Concepts should be operationalized by inclusion in the definitions of
Procedure and Notation. This means the two following topics should be covered:
On the one hand, the Concepts should be covered by both Procedure and Notation. By this we
mean that Concepts should be used in the formulation of procedural questions (guidelines).
Also, they should have been applied in the presentation of different notational constituents,
such as elements, relations, comments, etc. By including Concepts in the other two parts, their
meanings and the rationales for their existence become more understandable. The reason for
this is that Procedure and Notation put the discussion of how Concepts are related to each
other on the table:
“- [A] process does not go directly to database. Process gives information, information goes to the
database… [first the] process, [then the] information, [at last the] database.” (Statement 31;
Participant 13)
This last criterion (and its details), presented in the Concepts part, is complementary to the
criteria for Procedure and Notation. It is a considerable means of improving their
understandability; Concepts, especially as the thesaurus for Procedure and Notation, play a
determinant role in making them comprehensible.
On the other hand, the meanings that are pointed out in the Procedure (and the procedural
questions) and the Notation should be reflected in the Concepts. However, we do not insist that
this has to be considered an independent criterion. Rather, it is a complement to the above point.
As long as the Procedure or Notation is understandable and not confusing, the assumption is
that this coverage has been fulfilled.
Efficiency Criteria for Concepts in Specific Application Cases:
C3: Suitability of the Concepts for the case: Concepts underline terms and phrases that help
in understanding the key meanings in a method. In an EMM application case, members of the
modeling team should find Concepts suitable for their modeling purpose and matching what
they need to use for their further purpose. Suitability of Concepts can be assessed from
different viewpoints and is strongly dependent on the people involved in the team and the
subject of the case. An aspect of suitability is understandability. In a specific case of
application, the modeling team reviews the Concepts and their clarifications to gain a better
understanding about the EMM. Accordingly, it is necessary that they find the Concepts
understandable:
“Here [BPM] information means what the process needs as input or gives as output. It doesn’t means
information about the process.” (Statement 32; Participant 13)
As another aspect of suitability, the topic of usefulness should be noted. Stakeholders in a
modeling case have a set of needs that should be fulfilled by developing enterprise models.
These needs might be formally documented. Such documentation can be used as a reference
for identifying important meanings, terms and relations between them. On the other hand,
there might be needs that are not documented and remain in the form of tacit knowledge in
stakeholders’ minds. In such cases, relevant terms and meanings cannot be pointed out. If the
team’s requirements are embedded in any of the stated sources, they can be used for
discussing suitability of Concepts (and consequently the whole EMM):
“- We can’t find any Opportunity [elements to include in the Goals Model].
- You do not necessarily have to have Opportunity [element].” (Statement 33; Participant 0,
Participant 4)
Evaluation of Concepts:
To evaluate whether the Concepts part is efficient, it is required to ask questions in both
general and specific application cases. To start the evaluation process, the following questions
can be applied:
Are all listed Concepts defined?
Is each Concept mapped (or used directly) to a meaning in Notation and Procedure?
To what extent do the team members’ perceptions of each Concept match the EMM
developers’ definitions?
What Concepts have been found useful by the EM team members?
Do all Concepts need to be considered in the specific case?
5.2.1.4 Cooperation Principles
The Cooperation Principles of an EMM concentrate on how different people interact and
cooperate in performing an EM process, or indeed an EMM application process. This part has to
do with which roles should be involved in applying the EMM and how the work should be
divided. To satisfy this need and have Cooperation Principles that are efficient, this
part should conform to a set of criteria. Below, the relevant criteria for the efficiency of
Cooperation Principles in general and specific application cases are presented.
Efficiency Criteria for Cooperation Principles in General Application Cases:
CP1: Clarifications about the required competences: In an EMM user guide, the developers
present explanations on how to conduct the EM process. Applying any method (including an
EMM) requires holding certain competences. This means that there are certain capabilities that
the people involved in an EM process should possess. This can be broken down into several facets.
In general, in order to work with an EMM it is necessary to include two main groups of
people in the modeling group: EM experts and domain experts:
The EKD user guide clearly states what roles and competences are required in an EKD process: “the
project manager, the steering committee, the reference group, the (domain experts) modeling
participants (typically 5-8 people), the modeling facilitators, the modeling technicians, and others.
However, not all of these may be needed in a smaller project.” (Statement 34)
This can be further detailed by clarifying which roles in each of the mentioned groups of
attendants are required, the required number of participants, and their level of expertise.
CP2: Clarifications about appropriate Cooperation Principles: An EM process requires
people cooperating with each other to apply the EMM. Hence, before starting to apply an
EMM, it is necessary to know how people should cooperate with each other in the EMM
process, i.e. what Cooperation Principles should be followed:
In the EKD user guide it is stated that the EM process starts by interviewing the modeling
participants. This is followed by applying the selected EMM. All of these activities are preferably
performed in participative form. The EKD user guide also states what Cooperation Principles
(consultative) should be avoided and why. (Statement 35)
In addition to this, it is useful to know what advantages and disadvantages each of the
mentioned forms has.
In the EKD user guide, three reasons why a participative approach should be followed are given:
The quality of the enterprise model is enhanced because models are created in collaboration
between stakeholders, rather than resulting from a consultant’s interpretation of interviews
Consensus is enhanced because modeling participants have to collectively expose their
thinking, meaning that they have to establish a direct conflict or strive towards consensus.
Achievement of acceptance and commitment is facilitated because stakeholders are actively
involved in the decision-making process.
This list can be considered a specification of advantages for the participative approach. However,
although the above-mentioned motivations for following a participative approach are given, no
drawbacks of it are explained. Neither is it clarified why a consultative approach should be avoided.
(Statement 36)
Having such information helps the potential EMM applier in making comparisons between
different EMMs and selecting a correct method for her/his modeling purpose.
Efficiency Criteria for Cooperation Principles in Specific Application Cases:
CP3: Capability of applying the suggested Cooperation Principles: To use an EMM and
begin an EM process, it is important that the members of the modeling team are capable of
working with the EMM. By capability we refer to having theoretical as well as practical
capability:
“My name is Anders Carstensen … I work at Jönköping School of Engineering. I am both teacher
and researcher, maybe more teacher. My field is Enterprise Modeling and planning.” (Statement 37;
Participant 4)
Although at first glance it seems that theoretical capability (knowledge) of the Cooperation
Principles is not a critical factor, having this knowledge helps people to gain a clearer and
more concrete understanding of what they are going to do. Also, their experience in EM
can be considered a competence factor. Experiences are usually in the form of tacit
knowledge and used unintentionally.
Evaluation of Cooperation Principles:
Efficiency evaluation of this part also requires asking relevant questions. As a support for this,
some questions are suggested to evaluate the Cooperation Principles in general and specific
application cases:
What Cooperation Principles are suggested for this EMM?
Is it specified what roles on the modeling experts’ side and what roles on the domain
experts’ side should be involved?
Is it clarified what level of experience and expertise is required for each of the
stated roles?
Is any explanation given on each of the Cooperation Principles?
What are the pros and cons of each of the stated Cooperation Principles?
How familiar are the team members with the suggested Cooperation Principles?
5.2.2 How to Follow the Efficiency Evaluation Approach
Section 5.2.1 contains an approach, called A3E2M, for evaluating the efficiency of EMMs.
A3E2M helps in evaluating an EMM or, in fact, its different parts: it can be applied for the
purpose of evaluating a complete EMM or just parts of it. To perform either of these, a process
consisting of two phases has to be completed. First, a Preparatory Phase should be carried out
to reconstruct the EMM according to the notion of method. The key point in performing the
Preparatory Phase is that the user’s focus should be on the meaning and purpose of each
EMM part. To confirm what each method part is about and begin the reconstruction process,
the user of A3E2M should go through the structure of this approach (presented in section
5.2.1) and review the elaborations given about each EMM part, in the beginning of each sub-
section. Since these elaborations are detailed explanations about EMM parts and their
purposes, they help in gaining a clearer understanding of the method notion. For
reconstructing an EMM, not only the method notion but also the EMM user guide
(handbook, manual, etc.) should be considered. Explanations of the different parts of an EMM
might be scattered in various parts of the user guide. Therefore, it is required to review the
whole user guide during the reconstruction process. By completing this, the chance of
identifying all explanations of each part increases. EMM reconstruction is a decisive step in
the evaluation of an EMM. It prevents struggling over how each EMM part has been
covered in the user guide. Moreover, the people who are involved in the evaluation process gain
common perceptions of the EMM and its different parts.
After completing the Preparatory Phase, the Main Phase (Evaluation Phase) begins. The
purpose of this phase is evaluating the EMM. For this purpose, the reconstructed EMM
should be checked against the efficiency criteria for general and specific cases of application,
defined for each method part. To commence this, a possibility is to start with the suggested
driving questions. To continue, the user might find it necessary to add her/his own questions.
Hence, we emphasize that the suggested driving questions are not the only possible means to
begin an evaluation process. The list of driving questions can be tailored to the
current needs by modifying the existing questions or adding new ones. In short,
conducting the Evaluation Phase means asking questions to realize whether the different parts
of the EMM meet the efficiency criteria. Asking these questions provides the basis for
discussions about each EMM part and for reaching conclusions about the whole method. The
answers to these questions and the outcome of the discussions should be documented. This
documentation will be used in the following processes (identifying change needs and change
measures). Figure 17 contains an overview of how the developed approach should be
followed.
[Figure 17 depicts the two phases: in the Preparatory (Reconstruction) Phase, the EMM
handbook and the notion of method are used to reconstruct the EMM into its parts
(Perspective, Framework, Method Component, Cooperation Principles); in the Main
(Evaluation) Phase, the reconstructed EMM is evaluated using A3E2M, yielding the results of
the efficiency evaluation of the EMM.]
Figure 17: An overview of how to follow A3E2M
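The two-phase workflow described above can also be sketched as a pair of data structures. This is an illustrative sketch only, not part of A3E2M: the class and function names are assumptions, while the statuses and criterion identifiers follow section 5.2.1 and Table 7:

```python
from dataclasses import dataclass, field

# Statuses as used in Table 7 of the thesis.
SUCCESSFUL, SEMI, UNSUCCESSFUL = "Successful", "Semi-Successful", "Unsuccessful"

@dataclass
class ReconstructedEMM:
    """Output of the Preparatory (Reconstruction) Phase: handbook content
    sorted under the four parts of the method notion."""
    perspective: str = ""
    framework: str = ""
    method_component: str = ""
    cooperation_principles: str = ""

@dataclass
class Evaluation:
    """Output of the Main (Evaluation) Phase: one status plus a free-text
    motivation per criterion (Pe1..Pe3, Fr1..Fr3, MC1..MC2, Pr1..Pr2,
    N1..N3, C1..C3, CP1..CP3)."""
    results: dict = field(default_factory=dict)

    def record(self, criterion_id: str, status: str, motivation: str) -> None:
        self.results[criterion_id] = (status, motivation)

    def summary(self) -> dict:
        """Count how many criteria fall under each status."""
        counts = {}
        for status, _ in self.results.values():
            counts[status] = counts.get(status, 0) + 1
        return counts

# Usage sketch: record two hypothetical criterion outcomes.
emm = ReconstructedEMM(perspective="information-demand view of roles")
ev = Evaluation()
ev.record("Pe1", SUCCESSFUL, "supports understanding information demands")
ev.record("N1", SEMI, "no explanation of notational elements; one sample model")
```

The documented results dictionary corresponds to what section 5.2.2 prescribes: answers and discussion outcomes recorded for use in the subsequent identification of change needs and change measures.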
6. Empirical Validation of the Efficiency Evaluation
Approach
The efficiency evaluation approach (A3E2M) that is presented in chapter 5, has to be
validated to draw conclusions about the extent to which the contributed results are applicable.
For this purpose, two EM cases from infoFLOW-2, different from the one followed in
developing the approach, were selected. The current chapter is dedicated to empirical
validation of A3E2M. It starts by introducing the validation cases in section 6.1 and
reflecting on the efficiency evaluation approach, i.e. A3E2M, in section 6.2. In section 6.3,
some refinements of A3E2M that were developed based on the reflections in the earlier
section are presented. Section 6.4 is dedicated to a discussion of which phases of an EM
process can be supported using A3E2M. The chapter finishes with section 6.5, which contains
a discussion of A3E2M and its applicability in evaluating the efficiency of EMMs.
6.1 Introduction to the Validation Cases
To validate A3E2M, the author followed two cases from the infoFLOW-2 project. In each
case, a modeling expert (the author) and a domain expert were involved. The first case was
“Marketing Department at Jönköping University-School of Engineering (Jönköping Tekniska
Högskolan: JTH)”, henceforth “JTH Marketing Department”. The involved domain expert
was a marketing communicator at JTH. The second case was “the Manager (Head) of the
Information Engineering & Management Program, Jönköping University-School of
Engineering (Jönköping Tekniska Högskolan: JTH)”, henceforth “the Program Manager”.
The domain expert, who was included to identify the relevant tasks and information needs of
this role, was the Program Manager. Both cases were performed by the author at JTH as
contributions to the infoFLOW-2 project. Specifically, they were done to identify problems in
the way information flows for the investigated roles, and a case report was written about each
of them. To perform these cases, the IDA handbook v 2.0 (Lundqvist et al., 2012) was
followed. To read about the infoFLOW-2 project, see section 4.2. Brief introductions to the
cases are presented in section 6.1.1 and section 6.1.2.
6.1.1 Marketing Department at Jönköping University-School of Engineering
In this section the “Marketing Department” modeling case is briefly introduced. This
introduction is extracted from (Carstensen et al., 2012b), using both direct and indirect
citations.
“For this department the most significant challenge is to obtain reliable information about
demanded skills in the job market and on the other hand attracting as many as possible
students to the JTH study programs” (ibid). To do this, the marketing department has to
perform the following tasks:
investigating the job market and identifying the skills in demand,
analyzing and processing the collected information regarding the demanded jobs,
communicating with the educational departments to transfer information about
requirements in the job market,
retrieving information about the new (and existing) study programs and delivering
them to prospective students.
(ibid) shows in detail which activities were performed in this specific EM case to model
and analyze the information demands existing at the JTH Marketing Department. The case was
done with the attendance of one modeling expert and one domain expert. The modeling expert
(the author of the current thesis) works as a PhD student in the Information Engineering
research group at JTH, and the domain expert works as a marketing communicator at the JTH
Marketing Department. Although both attendants had informally met before, the modeling case
started with a brief introduction by each attendant, mainly the domain expert. As a complement
to this, the modeling expert asked the domain expert a series of questions, which can be
regarded as an initial interview. To start modeling, the domain expert was asked to mention
what roles the JTH Marketing Department must communicate with regarding the tasks
defined for it:
“Webmaster
Educational department(s), i.e. computer engineering, mechanical engineering,
industrial engineering, built environment and lighting.
Prospective student
Study counsellor
Advertising company
Current student/ alumni
Host company (-ies)
University services
Jönköping International Business School (JIBS) / Högskolan för Lärande och
Kommunikation (HLK) / Hälsö Högskolan i Jönköping (HHJ)
Partner university (-ies)” (ibid).
While some of the roles that the JTH Marketing Department has to communicate with are
inside JTH, others are inside HJ but outside JTH, or even outside HJ. Identifying what roles
the JTH Marketing Department connects to was based on what information is needed
for completing each task and what roles expect to receive information from the JTH
Marketing Department. The modeling case was carried out in two sessions. The models were
done using post-its on a plastic sheet. While working in the modeling sessions, the domain
expert stated that there exist models showing how things go at JTH, including the JTH
Marketing Department. He presented these models during the modeling process. As a result,
they were also considered as information sources. The produced IDA models were later
implemented in electronic form. The final model output can be seen in Figure 18.
Figure 18: Information demand model for “JTH Marketing Department” (Carstensen et al., 2012b)
6.1.2 The Manager (Head) of Information Engineering & Management Program
In this section a brief introduction to “the Program Manager” modeling case is given.
Information Engineering & Management is a two-year MSc program given at the Computer
Engineering Department at JTH. To complete the program, one should pass a number of
courses and write a relevant thesis. While most of the courses are mandatory, some elective
courses have to be completed, too (see (Course Syllabus Enterprise Modeling, n.d.)). The
introduction is written based on (Carstensen et al., 2011a) using both direct and indirect
citations. The program is administered by a program manager to ensure the quality of the
courses and the program as a whole (ibid). The program manager needs to complete a number
of tasks and communicate with some other roles to run the program. This is done to exchange
information that either side of the communication may need. While a subset of the roles that
should be communicated with are inside the department and the relevant research group, the
rest are outside the department but inside JTH, or even outside JTH but inside HJ.
Members of the modeling team for this particular case were a modeling expert (the author of
this thesis) and the Program Manager. In contrast to the JTH Marketing Department case,
both modeling participants work at the same department and research group. As a result,
they were familiar enough with each other's roles. Thus, the initial interview was skipped
for this case. The roles that the Program Manager communicates with to accomplish these
tasks are:
“Teacher
Head of the (Computer Engineering) department
Informatics program manager
Head of the research group
Student
Course responsible
Administration” (ibid)
During the modeling session, the model was developed using the same material as in the JTH
Marketing Department case, i.e. post-its and a plastic sheet. The models were later
implemented in electronic form (see Figure 19).
Figure 19: Information demand model for “Information Engineering” program manager (Carstensen et al., 2011a)
6.2 Reflections on the Efficiency Evaluation Approach
To validate A3E2M, the two EM cases introduced in section 6.1 were followed. To increase
transparency and demonstrate the applicability of the developed approach, a summary of the
efficiency evaluation of the two cases using A3E2M is presented in Table 7. The contents of
the table are the results of evaluating IDA using the criteria for general and specific cases of
application that are presented in section 5.2. The followed EMM, as well as the aim of both
cases, were the same. Thus, all evaluation results are combined and presented in one table.
Each criterion in A3E2M is given an identifier. The same identifiers are used in Table
7 to show whether IDA was found successful, unsuccessful or semi-successful in meeting
each criterion.
Applying A3E2M to the two cases made it possible to reflect on the approach. Reflections
on the Preparatory Phase (see section 6.2.1) and on the structure of A3E2M (see section 6.2.2)
are explained below.
Table 7: Evaluation results of IDA in ”Marketing Department” and “Program Manager”
(Method Part — Criterion ID & Evaluation Results)

Perspective
Pe1: Successful; The method provides support for understanding the information demands a role has in an organization.
Pe2: Successful; It does not support process-based investigations.
Pe3: Successful; The Perspective was found supportive to the team’s purposes.

Framework
Fr1: Successful; The phase structure of IDA fulfills all three types of activity in a Framework.
Fr2.1: Successful; It is graphically clarified what the inputs and outputs for each phase are.
Fr2.2: Successful; It is graphically clarified how the different phases are related to each other.
Fr3: Successful; The Perspective is relevant to the team’s purposes and the Framework supports the Perspective.

Method Component
MC1: Successful; The Method Component of IDA supports the relevant phases. The Method Components suggested for “Additional Analysis” seem to be supportive, but might need to be revised in case of need.
MC2: Semi-Successful; The Method Component of IDA was found applicable, whereas the Method Components suggested for “Additional Analysis” were not found applicable to the cases. In the “Marketing Department” case, another Method Component was applied to complete “Additional Analysis”.

Procedure
Pr1: Semi-Successful; The Procedure presented in “Scoping”, “ID Context Modeling” and “ID Context Analysis” is explained in detail. To reach explanations about the Procedure in “Representation & Documentation” and “Additional Analysis”, referring to other handbooks is prescribed. No Procedure relevant to “SE & BPR Activities” is presented.
Pr2: Semi-Successful; The procedural guidelines presented for the phases “Scoping”, “ID Context Modeling” and “ID Context Analysis” were found applicable and followed. The Procedure relevant to “Additional Analysis” and “Representation & Documentation” was not found applicable.

Notation
N1: Semi-Successful; No explanation about the different notational elements is given. One sample model is presented.
N2: Successful; All notational elements were found necessary in working with the Procedure. Implementing answers gained by asking the procedural questions is fully supported by the Notation.
N3: Semi-Successful; IDA’s Notation was found relevant. Other (external) Notations were not applicable.

Concepts
C1: Successful; A list containing the Concepts and explanations about them is given. The explanations were found helpful for distinguishing the Concepts.
C2: Semi-Successful; There are Concepts that are not covered by the Notation (e.g. Time, Place, Social network). Some terms in the Notation are not covered by the Concepts (e.g. Indirect Flow, Information Gap, Sequence). All procedural elucidations were found understandable with/without the help of the Concepts.
C3: Successful; All Concepts were understandable and contributed directly/indirectly to the EM process.

Cooperation Principles
CP1, CP2: Semi-Successful; Small hints about what roles/Cooperation Forms should be involved/followed are given in “Scoping”, “ID Context Modeling” and “ID Context Analysis”. No recommendations about the number of attendants or their level of expertise are given. No suggestions about relevant roles/Cooperation Forms are given in “Representation & Documentation”.
CP3: Successful; All team members (the modeling expert and domain expert) were strongly capable in terms of theory and practice.
6.2.1 Reflections on the Preparatory Phase
Below, reflections on the Preparatory Phase that were obtained by validating A3E2M are
presented.
No suggestions about avoiding reference to the wrong handbook: According to section 5.2.2,
in the Preparatory Phase the EMM should be reconstructed based on the EMM handbook and
the method notion (see Figure 5). When initiating the validation process of A3E2M, the author
noticed that there exist different versions of the IDA handbook (see Statement 40). This
imposed the need to investigate how to work with this set of handbooks. On the other hand, it
was realized that no suggestion is presented in A3E2M (and its comprising criteria) regarding
how to proceed if there exist different versions of an EMM handbook. As a result, it was not
clear whether the only allowed source for reconstruction is the version of the handbook used
in the EM sessions or if other versions can be followed, too.
No suggestions about the need for repeating the Preparatory (Reconstruction) Phase:
According to section 5.2.2, completion of the Preparatory Phase is a prerequisite for starting
the Main (Evaluation) Phase. Accordingly, during evaluation of A3E2M, the intention was to
complete the Preparatory Phase before continuing with the Main Phase. It was, however,
found inevitable to repeat the Preparatory Phase now and then during the Evaluation Phase.
To reconstruct IDA and its different parts, it was necessary to go through the IDA handbook
several times. Once the IDA handbook was reviewed, it was found that there exist lines that
can be considered as coverage of more than one EMM part (an example of this can be seen in
Statement 38). Furthermore, explanations about each method part are not necessarily
presented all together (in one paragraph, under a sub-heading, etc.). They might be scattered
in different parts of the handbook (see Statement 39). All this resulted in a need for reviewing
lines and paragraphs that were marked as presentations of EMM parts, to ensure that the most
relevant fragments of the handbook are marked for each EMM part. Nevertheless, by looking
at the presentation of A3E2M (and the hints about how to follow this approach), it can be
seen that no clarification about the possible need for repeating the Preparatory Phase is given.
6.2.2 Reflection on the Structure of A3E2M
In performing the validation EM cases, the reconstructed IDA method was checked against
the efficiency criteria. Lines and fragments in the IDA handbook that were marked as support
for each EMM part were compared to the relevant efficiency criteria. In this way, the
suggested driving questions were used as a starting point. For example, the part of the IDA
handbook that is marked as a clarification of Perspective (see Statement 2) was checked against
the driving question “from what viewpoints can the enterprise be modeled using this EMM?”
to begin the discussion about the coverage of the IDA method. By following the defined efficiency
criteria and the proposed driving questions, it was possible to evaluate each part of the EMM
individually as well as in relation to the other parts. For example, in a method the Method
Component consists of three inter-related parts (Procedure, Notation and Concepts). Each of
these three parts should be efficient in its own place and in relation to the other two. To fulfill
this, the criteria presented in sections 5.2.1.3.1 to 5.2.1.3.3 should be fulfilled. In addition, the
Method Component itself should cover the efficiency criteria defined for it (see section
5.2.1.3).
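The roll-up rule described in this paragraph — a Method Component counts as efficient only if its own criteria and the criteria of all three constitutive parts are fulfilled — can be sketched as a small checklist structure. The criterion labels are taken from the text; the code itself is only an illustrative sketch by the editor, not part of A3E2M.

```python
# Illustrative sketch (not part of A3E2M): efficiency criteria from the
# text, grouped by EMM part, with a roll-up rule for the Method Component.
CRITERIA = {
    "Procedure": ["Pr1", "Pr2"],
    "Notation": ["N1", "N2", "N3"],
    "Concepts": ["C1", "C2", "C3"],
    "MethodComponent": ["MC1", "MC2"],
}

def component_efficient(results):
    """A Method Component counts as efficient only if its own criteria
    and the criteria of all three constitutive parts are fulfilled."""
    return all(results.get(criterion, False)
               for part in CRITERIA
               for criterion in CRITERIA[part])

# Example: an outcome pattern similar to the IDA evaluation, where some
# criteria were only semi-successful (coded here as not fulfilled).
ida = {"Pr1": False, "Pr2": False, "N1": False, "N2": True, "N3": False,
       "C1": True, "C2": False, "C3": True, "MC1": True, "MC2": False}
print(component_efficient(ida))  # → False
```

Any single unfulfilled criterion, in any of the four groups, makes the whole Method Component count as not efficient, which mirrors the conjunctive reading of sections 5.2.1.3 to 5.2.1.3.3.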
It was necessary to confirm the suitability of the EMM for the intended cases. To do this,
questions like “is the EMM pervasive enough to cover all the modeling team’s needs?” had to
be asked for each EM case. Not only the Perspective, but also the other EMM parts had to be
checked against the intended (validation) cases. For example, it had to be determined which
notational elements were suitable for each of the cases. In the “JTH Marketing Department”
case, a question was whether “JIBS/HLK/HHJ Marketing Department” should be depicted as
a “role”, an “organizational unit” or an “external unit” (see Figure 18). In both EM cases it
had to be decided whether all Framework phases should be completed, i.e. it was necessary to
conduct discussions to answer the question “what phases of the Framework are necessary to
complete for the intended case?”. The phase “Additional Analysis” in IDA (see Figure 16) is
optional. This phase was not needed in the “Program Manager” case, whereas it was found
helpful in the “JTH Marketing Department” case. As explained in section 6.1.1, in the “JTH
Marketing Department” case the already existing models were reviewed to obtain additional
information beyond what the domain expert was providing.
A3E2M, and in fact the defined efficiency criteria for the different EMM parts, are presented
in a generic style. Elaborations on the efficiency criteria are simple and straightforward. This
has made them flexible and convenient to work with. For example, in the efficiency criteria
for Cooperation Principles, the criterion “CP1: clarifications about the required competences”
has to be interpreted and operationalized by the audience. It is the audience that should
specify what competences should be possessed to meet this criterion. In the elaboration about
efficiency criteria for Method Component, “MC1: support provision to relevant Framework
phases” is stated as an efficiency criterion. The audience should go through the given
explanation about the criterion to interpret and specialize it. However, during application of
A3E2M it was realized that some parts of the approach are too general and still have room for
improvement to become a mature approach. Two points indicating this type of shortcoming in
the presentation of the structure of A3E2M are presented below.
Promising presentation of efficiency criteria for Framework: In evaluating the Framework
part of IDA, fulfillment of the criterion “Fr1: support provision for the Perspective”
(presented in section 5.2.1.2) was also evaluated. A part of this evaluation was to check
whether all three activities “understanding the enterprise”, “modeling” and “analysis” are
covered in the IDA handbook. Of these three activities, A3E2M underlines “understanding
the enterprise” and “modeling” as the most decisive. The “modeling” activity is already the
focal point of A3E2M and aided the efficiency evaluation in the validation cases. It was,
however, realized that A3E2M does not support evaluating the efficiency of “understanding
the enterprise”.
Promising presentation of efficiency criteria for Method Component: While performing
the validation cases, it had to be checked whether applying the Method Component developed
specifically for IDA is sufficient, or if the Method Components suggested for the “Additional
Analysis” phase (see Figure 16) had to be used as well. To evaluate each Method Component
it was necessary to go through its three constitutive parts. Each of the Procedure, Notation
and Concepts parts was evaluated individually to draw conclusions about whether applying
that particular Method Component was suitable for the case. In other words, fulfillment of
each of the three criteria “Pr2: suitability of the Procedure for the case”, “N3: suitability of
the Notation for the case” and “C3: suitability of the Concepts for the case” had to be
assessed. However, had it been possible to perform an initial evaluation of the suitability of
each Method Component, evaluation of the three mentioned criteria could have been
facilitated or even skipped. This issue has not been taken into account in A3E2M.
6.3 Refinements to A3E2M based on the Reflections
Based on the reflections on A3E2M (presented in section 6.2), some complementary points
that should be taken into account while following the developed approach were developed.
They are in fact details of A3E2M that were not recognized during development of the
approach. The main body of A3E2M consists of criteria for each EMM part that are
supported by statements from the validation cases. A similar style is followed in the current
section. Each complementary point is given an identification, formed of one or more
fragments, and supported by relevant statements from the validation cases. The statements are
either written based on the user guide followed (the IDA handbook) or quotations from the
EM team members. Table 8 shows the list of participants in the cases used for validating
A3E2M. The table shows the involved modeling participants, their organizational role and
their role in the EM case. Also, each participant is given an identification. Three individuals
were involved in the validation cases: one modeling expert (the author of the thesis) who
conducted both cases (Participant 13) and one domain expert per case (Participant 11 and
Participant 12).
Table 8: Modeling participants of the validation cases
Organizational Role; Organization | Role in the EM Case | Identification
Information Engineering Program Manager; JTH | Domain Expert | Participant 11
Marketing Communicator; JTH | Domain Expert | Participant 12
PhD Student; JTH | Expert Modeler | Participant 13
A subset of the identified refinements are concerns relevant to the Preparatory Phase
(presented in section 6.3.1), and the rest are additional efficiency criteria for different EMM
parts (presented in section 6.3.2).
6.3.1 Implications regarding the Preparatory Phase
Validation of A3E2M resulted in identifying problems in the way it is applied. A group of
these problems are relevant to the Preparatory Phase. They imposed the need for figuring out
implications in order to resolve the problems. The identified implications are “Prep1: the
iterative nature of EMM reconstruction” and “Prep2: following the correct reference”,
addressing the problems “dispersed presentation of EMM parts in the handbook” and
“following the correct reference”, respectively. Since these implications are relevant to the
Preparatory Phase, their identifications start with “Prep” as the alphabetical part.
Prep1: The iterative nature of EMM reconstruction: In practice, an efficiency evaluation
process should not be expected to be completed in one iteration. Rather, it is an iterative task.
During application of A3E2M, it is required to switch between the evaluation and the
reconstruction phases. This does not necessarily mean that the reconstruction results are
incorrect in the beginning. Rather, it means that they should be improved, or presented in
different ways, with respect to the current status of the EM case. Changes that occur in the
EM case and in the EM team are two important factors that make the reconstruction phase
iterative. In addition, each iteration of the Reconstruction Phase might itself need to be
repeated several times. The reason for this is that there are statements that can be considered
as support for understanding more than one EMM part. Although this facilitates
comprehension of the EMM, it results in continuous revision of such lines and paragraphs to
ensure that the demarcation of EMM parts has been done in the most suitable way.
The IDA handbook contains a section specifically for clarifying the Perspective: 3.1 Perspective -
what is important in IDA. This section, however, is not wholly dedicated to Perspective. In line with
this, elucidations on the applicable Cooperation Principles, and motivation about why this is the most
appropriate choice, are given. As information demand analysis is about identifying the information
needs that a specific role has, making a differentiation between fragments related to Perspective and
Cooperation Principles was completed in more than one iteration.
Also in the same section, it is emphasized that the IDA process should be performed role-based and
not process-based. This, on the one hand, can support clarifying the Perspective definition and, on the
other hand, is a hint and part of the Procedure. (Statement 38)
Another reason for the iterative nature of the reconstruction phase is that, in practice, the
authors of handbooks do not follow a particular notion of method. As a consequence, the
coverage of different EMM parts is scattered across different parts of the handbook, which
imposes the need for reviewing the handbook several times.
In the IDA handbook, the Procedure (procedural guidelines) relevant to the phases “Scoping”, “ID-
Context Modeling” and “ID-Context Analysis & Evaluation” is presented under the sub-heading
“Activities”, which makes it convenient to find a presentation of this EMM part. On the other hand, in
the phase “Representation & Documentation”, the Procedure is diffused among lines and phrases, and
in “Additional Analysis” the audience has to refer to the handbooks of other EMMs to see the
procedural guidelines. “SE & BPR Activities” are not elaborated in this handbook; thus, their
Procedure cannot be evaluated. In addition, as explained in Statement 38, hints regarding the point
that the IDA process should be role-based and not process-based are stated under the Perspective
heading in the handbook. Accordingly, it is inevitable to review the IDA handbook several times to
ensure that all fragments relevant to the Procedure of IDA have been identified. (Statement 39)
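The switching between reconstruction and evaluation described under Prep1 can be summarized as a simple fixed-point loop: reconstruct, evaluate, and repeat whenever the demarcation of EMM parts changes. This is only a schematic reading of the text; the function names and the stop condition are illustrative assumptions by the editor, not prescriptions from A3E2M.

```python
def evaluate_emm(reconstruct, evaluate, max_rounds=10):
    """Schematic of the Preparatory/Main interplay (illustrative only):
    repeat reconstruction until the demarcation of EMM parts is stable,
    re-running the evaluation after every change."""
    demarcation = reconstruct(None)          # first pass over the handbook
    results = evaluate(demarcation)
    for _ in range(max_rounds):
        revised = reconstruct(demarcation)   # review ambiguous fragments again
        if revised == demarcation:           # demarcation stable: done
            return results
        demarcation = revised
        results = evaluate(demarcation)      # earlier results must be revisited
    return results

# Toy run: a fragment first marked as "Perspective" is re-marked as
# covering both Perspective and Procedure (cf. Statement 38), after
# which the demarcation stabilizes.
passes = iter([{"frag1": {"Perspective"}},
               {"frag1": {"Perspective", "Procedure"}},
               {"frag1": {"Perspective", "Procedure"}}])
result = evaluate_emm(lambda d: next(passes), lambda d: len(d))
print(result)  # → 1
```

The point of the sketch is only the control flow: any change in demarcation invalidates the previous evaluation round, which is exactly why the Preparatory Phase cannot be completed once and for all.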
Prep2: Following the correct reference: An EMM might be changed and modified over
time. This might result in a need for writing a new handbook (also known as a user guide or
manual) or other types of reference material that reflect the applied changes. As a
consequence, there might exist several versions of a handbook. The reconstruction process
should be carried out using the same version that is selected for the EM process. In case the
modeling team has planned to follow more than one handbook during the EM case, this
should also be taken into account when reconstructing the EMM.
At the time when this research was in progress, different versions of the IDA handbook were
available, e.g. version 1.0 (Lundqvist et al., 2009), version 1.6 (Lundqvist et al., 2011) and version
2.0 (Lundqvist et al., 2012). The validation cases were conducted using the latest version, which was
the most complete one. In view of that, the efficiency evaluation had to be done based on the same
version. (Statement 40)
In case a version of the handbook other than the one(s) used in the modeling is selected for
the efficiency evaluation, the results might not be reliable.
6.3.2 Additional Efficiency Criteria for Different Parts of EMM
The validation process of A3E2M contributed reflections on the structure of A3E2M (see
section 6.2.2). The reflections involve problems and shortcomings of A3E2M; to resolve
these problems, A3E2M was refined by adding criteria to it. The result of this refinement is
presented below. While “Fr3: suggestion on approaches for the understanding the enterprise
activity in Framework” aims at resolving “promising presentation of efficiency criteria for
Framework”, “MC3: clarification on coverage of the Method Component” addresses
“promising presentation of efficiency criteria for Method Component”.
Fr3: Suggestion on the approaches for the “understanding the enterprise” activity in
Framework: According to section 5.2.1.2, the Framework of an EMM should support
meeting the Perspective by covering the activities “understanding the enterprise”, “modeling”
and “analysis”. Each of the mentioned activities might be effectuated in different ways,
following different approaches. Of the three mentioned activities, “understanding the
enterprise” has a crucial role in an EM process. Gaining a correct and comprehensive
perception of the enterprise is a necessity. Thus, criteria and specifications for this activity
should be covered in an EMM handbook. This can be done in different ways, e.g.
interviewing the domain experts, arranging a presentation about the discourse domain, and
reviewing the relevant documents.
Lundqvist et al. (2011) suggest a number of approaches for carrying out the first phase of an IDA
process, i.e. “Scoping”. This, however, can be generalized and followed as a means for
“understanding the enterprise”: “the typical way of achieving this is conducting meeting and
interviewing the key stakeholders within customer’s organization.” (Statement 41)
“- We start by your roles and (then) your tasks. Please tell me what roles you have.
- I am program manager and teacher….” (Statement 42; Participant 13, Participant 11)
As stated above, each of the mentioned activities, including “understanding the enterprise”,
can be performed by following different approaches. As each approach has pros and cons, it
would be even more helpful if suggestions on the most suitable alternatives were provided:
The IDA handbook suggests approaches in addition to interviewing to fulfill the need of
“understanding the enterprise”: “It is also possible to do this with more comprehensive approaches
such as observation of typical work situations, reviews of existing business descriptions or quality
documentation by participant, etc”. From the suggested alternatives, business descriptions can be
presented in different models, including graphical models of the enterprise. (Statement 43)
“This is how our webmaster modeled the department. Do you think you could use them?”
(Statement 44; Participant 12)
MC3: Clarification on coverage of the Method Component: In an EM process, it is the use
of the Method Component that results in generating enterprise models. It is necessary to have
a correct and clear image of what output models will be gained by applying a specific Method
Component. Understanding the Perspective provides an overall and abstract understanding of
the viewpoint from which an enterprise can be modeled. This, however, needs to be
operationalized further. Thus, giving an understandable clarification of the Method
Component is necessary. Such a clarification could be provided in different forms, such as a
textual description, sample models or a mixture of both. Especially if the EMM prescribes
applying more than one Method Component, meeting this criterion helps in gaining a clearer
understanding of what focal area can be modeled using each Method Component.
In IDA, different Method Components are to be used, where only one of them is specifically
developed for this EMM. The authors present one sample model that can be used to find out what an
IDA model looks like, but no explicit textual elaboration on the IDA Method Component. For the
Method Components that are imported from other EMMs, no clarification is provided (to learn about
and assess Method Components imported from other EMMs, one should refer to the relevant
reference). (Statement 45)
6.4 Future Work: Support of Different Phases of an EM Process
While applying A3E2M for the purpose of empirical validation, one shortcoming in A3E2M,
other than those presented in section 6.2, was identified. According to chapter 1, this thesis
should contribute to answering the research question “in what phases of an EM process could
efficiency of the applied EMM be evaluated?”. By reviewing chapter 5 it can be seen that this
question cannot be answered with the use of the developed results, i.e. A3E2M. Thus, the
topic is discussed here.
During the different phases of an Information Systems Development (ISD) process, people
frequently need to evaluate the produced results and the process itself. Beynon et al. (2004)
present a model showing how different forms of evaluation approaches fit into an ISD.
Beynon et al. (2004) also distinguish between strategic, formative, summative and post-mortem
analysis. Strategic evaluation (or pre-implementation evaluation, or pre-development
evaluation) helps in assessing whether conducting an IS/IT project can deliver benefits
against costs. Strategic evaluation has a decisive role in establishing both long-term and
short-term IS strategies. Formative evaluation supports the assessment of an IS during its
development process. This type of evaluation is about determining whether the decided
objectives have been met and if any crucial change in the design of the IS is required.
Summative evaluation (or post-implementation evaluation) involves checking whether the
costs and benefits have been according to the strategic evaluation. This is done either by
evaluating whether the system specifications have been covered or by evaluating the system
usability. Summative evaluation might end in suggesting ways for system modification. The
reason behind this type of evaluation is that even if a project reaches completion, there is a
risk of failure in delivery. Post-mortem analysis is a variant of summative evaluation, which
is followed for identifying the cause of failure in case of project abandonment.
According to section 3.1.3, EM is a subject relevant to IS. An evaluation approach (method or
framework) developed in EM should support the four activity types of IS evaluation for an
EMM. Below, it is discussed how the different evaluation forms of EMMs are supported by
A3E2M:
Strategic evaluation: Before applying an EMM and conducting an EM process, it is
necessary to carry out a feasibility study and strategic evaluation. Strategic evaluation
of an EM process is about clarifying whether any EMM could be followed, and even
which alternative is the most suitable one. Performing a strategic evaluation facilitates
the EM process, even in a state where the EM process has not started yet. In applying
A3E2M, reviewing the criteria for the general and specific cases of application is what
the potential EMM users should consider. This is followed by concentrating on the
specific case and finding the extent to which a candidate EMM is suitable for it.
Although many details are unknown before starting the EM process, reviewing the
enterprise, especially the domain of discourse, and checking the criteria for the
specific case of application still provides a basis for discussions about the suitability
of an EMM and for making an appropriate selection.
Formative evaluation: By starting an EM process, i.e. applying the selected EMM,
parts of the enterprise models start to become visible. The evaluation process should
be conducted in parallel with the EM process. In this way, two types of evaluation are
notable: evaluation of the enterprise models and evaluation of the EM process. As a
part of evaluating the EM process, it should be investigated whether the EMM has
lived up to the expectations of the application process. For this purpose, one can apply
A3E2M to evaluate the EMM against the criteria for the specific case of application.
This evaluation can be done at any time during the EM process, and making the
decision about the proper point in time for it is left to the potential applier. For
evaluating whether the enterprise models meet the requirements, the use of a method
or technique that is specifically developed for this purpose can be helpful.
Accordingly, a formative evaluation of an EMM application process is partially
supported by this approach.
Summative evaluation: Once an EMM application process has been completed, the
enterprise models are developed and all relevant parts of the different EMM parts
have been used. Although at this point the mission of conducting an EM process is
completed, it should be confirmed whether the process has been according to
expectations. This means that we should check whether the models meet the defined
requirements, and also whether the application process has been according to the
defined criteria. Unlike formative evaluation, in summative evaluation it is necessary
to evaluate only one set of models, i.e. the final enterprise models. Also, checking the
conformance of the different EMM parts to the defined criteria is done only once. To
do this, one can receive support from A3E2M in studying the whole EMM, as well as
each individual part of the EMM in the process. Following this, conclusions should be
drawn based on the evaluations. Studying the different EMM parts in a process is done
by specifying points in time, assessing the behavior of the EMM at each time using
A3E2M, and converging all interpretations. Like formative evaluation, summative
evaluation is partially supported by A3E2M. By partially we mean that, for evaluating
the enterprise models, means that are specifically developed for the purpose of model
evaluation should be applied as well.
Post-Mortem Analysis: With the help of this variant of summative evaluation,
recognizing the cause of failure in an abandoned EM case becomes feasible. To
perform this activity using A3E2M, one can review the elaborations about how
summative evaluation is supported by A3E2M and apply them for the purpose of this
activity.
The four evaluation forms in an EM process can be mapped onto the three activities “pre-
modeling”, “modeling” and “post-modeling”. Consequently, A3E2M is applicable in the
three activities constituting an EM process.
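The mapping just described — the four evaluation forms of Beynon et al. (2004) folded into the three EM activities, with the degree of A3E2M support as stated in the discussion above — can be summarized in a small table. The structure and the "supported"/"partial" labels are an editorial paraphrase of the text, not part of A3E2M itself.

```python
# Illustrative summary of section 6.4: evaluation form ->
# (EM activity it belongs to, degree of A3E2M support per the text).
SUPPORT = {
    "strategic":   ("pre-modeling",  "supported"),
    "formative":   ("modeling",      "partial"),  # model evaluation needs other means
    "summative":   ("post-modeling", "partial"),  # likewise for the final models
    "post-mortem": ("post-modeling", "partial"),  # variant of summative
}

# The four forms collapse onto exactly three EM activities.
activities = {activity for activity, _ in SUPPORT.values()}
print(sorted(activities))  # → ['modeling', 'post-modeling', 'pre-modeling']
```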
6.5 Discussion on the Applicability of A3E2M
The purpose of this section is to discuss applicability of A3E2M. The remainder of this
section consists of two main points regarding the approach. Each point contains discussions
about a feature of A3E2M, its pros and cons, and suggestions of changes to eliminate the
cons.
Tentative presentation of the Preparatory Phase: A feature of A3E2M is its division into
the Preparatory Phase and the Main (Evaluation) Phase, where the former is about the
reconstruction of an EMM and the latter supports the efficiency evaluation of the EMM. This
phase structure helps the potential user of A3E2M in keeping in mind that both the
Reconstruction Phase and the Evaluation Phase are decisive and should be completed.
Especially, mentioning EMM reconstruction as a separate phase helps keeping in mind that it
should be considered a particular task. According to the implication “Prep1: the iterative
nature of EMM reconstruction” (presented in section 6.3.1), with any change in the
demarcation of EMM parts, the results of the Evaluation Phase should be revisited to confirm
their validity. The iterative nature of the reconstruction, which results in iterating the whole
evaluation process, might cause extra work and extra burden. Although reconstruction is
stated as a “phase” in section 5.2.2, and is mentioned as an iterative process in section 6.3.1,
the explanations of this phase still have room for improvement. For this purpose, it should be
explained how to decide whether to proceed with more reconstruction iterations. According to
the explanations on the same implication, a handbook might contain lines and paragraphs
fitting into more than one EMM part. Users need to know how to distinguish whether an
explanation is about a particular method part. Thus, it is necessary to enrich the presentation
of A3E2M by adding explanations about how to work with such lines and fragments and
prevent confusion about their coverage.
Generic Presentation of A3E2M: This approach is presented on an overall level. By this we
mean that the explanations in each efficiency criterion are generic, which makes them
flexible for adaptation. As a result of this generic presentation, interpretation and even
modification can be done in different ways. As an example, “C1: elucidation about
conceptual elements” (presented in section 5.2.1.3.3) mentions that Concepts should be
explained and clarified. However, details about what the suitable means for fulfilling this are,
and what steps should be taken to reach this purpose, are left to the user. The same style is
followed in section 6.3 for suggesting refinements to A3E2M. For example, in the criterion
“Fr3: suggestion on approaches for understanding the enterprise” (presented in section 6.3.2),
examples of suitable means for understanding an enterprise, such as interviews and
presentation sessions, are given. It is, nevertheless, not specified what sort of interview is
more suitable. Moreover, no restriction regarding other ways of understanding the enterprise
is given.
The generality of A3E2M has made it flexible in terms of use: the user is able to understand it
conveniently and interpret its different parts according to the modeling participants'
knowledge and further plans. On the other hand, the general presentation of A3E2M, and the
potential need for customization, is a challenge to the user. The audience needs to know how
to interpret the criteria for the different EMM parts. To resolve this, one possibility is adding
guidelines to the approach regarding how to adapt it. This would help a user of A3E2M in
following a more concrete way of adapting the approach.
Accordingly, A3E2M is a proper means to study whether an EMM supports conducting an
efficient EM process. This, however, does not mean that it is a fully mature approach. Rather,
it still has room for being improved and becoming more mature.
7. Discussion
This chapter contains discussions on different topics. It starts with section 7.1, which includes
answers to the research questions that were defined in chapter 1. This is followed by
reflection on the applied research discipline, i.e. design science, in section 7.2. Development
and validation of the research results were done based on EM projects and cases. Some
lessons that were learned during observation of these EM cases are presented in section 7.3.
7.1 Answering the Research Questions
Below, answers to the research questions defined in section 1.1 are given:
RQ 1. What is the meaning of efficiency in the context of EMM? EM aids in developing
models for visualizing the current and/or future states of an enterprise, followed by planning
the improvement actions. For this purpose, it is necessary to apply a relevant tool, which is an
EMM. Like any other process, application of an EMM needs resources. Taking care of
resource usage is a crucial need, as resources are scarce and costly. On the other hand, focus
on the issue of resource usage should not end in developing improper models. Indeed,
application of an EMM should support minimum resource usage while, at the same time,
developing enterprise models matching the specified needs, i.e. we require an efficient
EMM. For an EMM to be efficient, it should fulfill criteria and terms regarding the EMM
and, to be more precise, regarding each part of it and the relations between different parts. If
this is fulfilled, it will help in conducting a correct modeling process and gaining models of
high quality. On the other hand, attainment of correct models and correct behavior of the
EMM is a sign of efficiency fulfillment. Especially, correct behavior itself has a decisive role
in resource utilization, since it helps staying on the right track and reduces the number of
extra iterations. While there are terms that always have to be fulfilled, determining the
suitability of the EMM for each new case is necessary. This imposes the need for defining
criteria for general and specific cases. If both types of criteria are fulfilled by an EMM and its
parts, the EMM is efficient.
Accordingly, an efficient EMM is defined as follows:
An EMM is efficient if the results of the EM process are according to the needs expressed by
the stakeholders and the process defined by an EMM (and each part of it) is performed
exactly according to the criteria for the general and specific case of application.
RQ 2. How can efficiency of an EMM be evaluated? A method, such as an EMM, is
comprised of different parts, Perspective, Framework, Method Component (Procedure,
Notation, Concepts) and Cooperation Principles, which are related to each other. An EMM
part, or a relationship between two parts, should satisfy a set of fundamental criteria.
These criteria can be divided into two main groups: one group entails criteria that should
always be true, i.e. regardless of the application case, whereas fulfillment of another group
depends on the application case, especially on the enterprise and the modeling team. Criteria
from either group have to be reviewed and agreed on by the whole team.
In an EMM, different parts are related to each other and working with one particular part
requires seeking support in the other parts, too. Thus, evaluating a specific part imposes the
need for studying it as a part of the whole EMM and also evaluating the other parts. For
conducting the evaluation process, the set of defined and agreed upon criteria for each EMM
part and relation should be checked. Prior to that, the defined criteria should be understood by
each team member, and the team members have to reach a mutual agreement about each
criterion. Since the potential applier of A3E2M is not restricted to these criteria,
the evaluation process can be continued by using more criteria and questions related to them.
To commence the evaluation process, the first step is reconstructing the “artifact” under study
according to the method notion. As the evaluation is done on the result of the reconstruction
step, this step has a decisive role in the evaluation process and its results. After specifying the
EMM parts, the evaluation process starts, i.e. assessing whether each part matches the defined
criteria for the general and specific case of application.
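As a purely illustrative aside, the checklist character of this evaluation step can be sketched in code. The data structures, part names, and criterion codes below are hypothetical conveniences, not part of A3E2M itself; the only assumption is that the team records, per EMM part, which agreed criteria it judges fulfilled.

```python
from dataclasses import dataclass, field

@dataclass
class Criterion:
    """A single efficiency criterion, e.g. 'Pr3: availability of
    reliable information sources about the case'."""
    code: str
    description: str
    general: bool  # True: must hold regardless of the application case

@dataclass
class EMMPart:
    """An EMM part (Perspective, Framework, Method Component, or
    Cooperation Principles) with the criteria agreed on by the team."""
    name: str
    criteria: list = field(default_factory=list)

def evaluate_part(part, fulfilled):
    """Report which agreed criteria the team judged unfulfilled.
    `fulfilled` is the set of criterion codes deemed satisfied."""
    unmet = [c.code for c in part.criteria if c.code not in fulfilled]
    return {"part": part.name, "efficient": not unmet, "unmet": unmet}

# Hypothetical example: the Procedure part with one general criterion.
procedure = EMMPart("Procedure", [
    Criterion("Pr3", "availability of reliable information sources", True),
])
print(evaluate_part(procedure, fulfilled={"Pr3"}))
# -> {'part': 'Procedure', 'efficient': True, 'unmet': []}
```

The sketch mirrors the text's point that the criteria set is open-ended: an applier can extend `part.criteria` with further case-specific criteria before evaluating.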
RQ 3. In what phases of an EM process could method efficiency be evaluated? An EM
process consists of three main phases: pre-modeling, modeling and post-modeling. An EMM
should fulfill the efficiency criteria during each of the stated phases. It is necessary to be
ready to carry out efficiency evaluation during the whole modeling process, i.e. in all three
parts of it. In the pre-modeling phase, a strategic evaluation aids in finding out whether the
EMM will support the case. During the modeling phase, formative evaluation assists in
identifying the change needs that ought to be considered for the currently used EMM. In the
end, during activities of the post-modeling phase, which are summative evaluation and post-
mortem analysis, efficiency evaluation helps in figuring out if the completed EM process was
efficient, or what the failure causes were. Since not all three phases could be investigated with
a sufficient number of cases during the course of this thesis, more research with focus on RQ
3 is required.
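The phase-to-evaluation pairing described above can be summarized in a small lookup table; this is an illustrative sketch only, and the dictionary and function names are hypothetical rather than part of the thesis's artifact.

```python
# Mapping from EM process phases to the evaluation activities named
# in the text (for illustration only).
EVALUATION_BY_PHASE = {
    "pre-modeling": ("strategic evaluation",),
    "modeling": ("formative evaluation",),
    "post-modeling": ("summative evaluation", "post-mortem analysis"),
}

def evaluations_for(phase):
    """Return the evaluation activities applicable in a given phase."""
    try:
        return EVALUATION_BY_PHASE[phase]
    except KeyError:
        raise ValueError(f"unknown EM phase: {phase!r}")

print(evaluations_for("post-modeling"))
# -> ('summative evaluation', 'post-mortem analysis')
```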
7.2 Reflection on the Followed Research Discipline
As elaborated earlier in section 2.1.2, of the different research disciplines in IS, design science
(Hevner et al., 2004) was selected and followed in this thesis. The focus of this thesis was on
developing an approach for efficiency evaluation of EMMs. Design science as a discipline
was an appropriate choice for this aim. In particular, following the seven guidelines of Hevner et
al. (2004) facilitated the research process. Guideline 1 (Design as an Artifact) helped the
author in making clear that the expected outcome of the research had to be an artifact. As a
result, taking this guideline into account supported the development of an artifact, which is an
approach for efficiency evaluation of EMMs. According to section 3.3, the identified problem
(research gap) was the issue of efficiency evaluation in EM, which has not been investigated
by other researchers. Following Guideline 2 (Problem Relevance) ensured that the developed
artifact is relevant to the identified problem. In fact, following Guideline 1 and Guideline 2
was helpful in producing an artifact for the purpose of addressing the specified problem.
Since the developed artifact had to be evaluated, it was necessary to find out what evaluation
methods were available and which of them were best suited. Guideline 3 (Design
Evaluation) in (ibid) contains a list of evaluation methods that can be followed. This opened
up a wide range of evaluation methods that could be followed, but also resulted in spending
more time on investigating different options and choosing the most suitable method. In the
end, the observational (case study) method was selected. Guideline 4 (Research Contribution)
concerns the aim of any research and was therefore considered automatically. It was,
nonetheless, a reminder that the research results should be both clear and verifiable, and that
they should be represented in a clear form. Fulfilling this was rather complex and time consuming. The
author was sufficiently familiar with her own work to find it clear herself. However, deciding
whether other people also found it understandable was not a straightforward process. It was
also unclear in the beginning how the presentation of the output
results should be in order to be verifiable. Consequently, to ensure this guideline was
followed properly, it was necessary to go through the results repeatedly and revise the
presentation style. Guideline 5 (Research Rigor) emphasizes that both construction and
evaluation (validation) processes should be performed following rigorous research methods.
This assisted in paying more attention to the fact that a concrete research strategy should be
pursued. It included looking for rigorous research methods and ensuring they were followed.
According to Guideline 6 (Design as a Search Process), in a design science-based process,
the researcher should focus on searching for results. The main benefit of applying this
guideline was the reassurance that searching for the results is allowed, or even necessary,
and at the same time iterative. It was, however, not helpful in figuring out when the
iteration should stop. Presentation and packaging of results have a decisive impact on
how the audience judges the outcome. This was a challenge in this thesis, too. Guideline
7 (Communication of Research) made it clear that the contributed artifact (approach) should
be understandable to both a technology-oriented and a management-oriented audience. This
guideline was a means to present the results in a way that, on the one hand, stays at an overall
level understandable to a management-oriented audience and, on the other hand, provides
sufficient detail to be understandable to a technology-oriented audience.
All in all, following design science was suitable and appropriate for performing this work.
The main advantage was emphasizing the fact that a design science-based research is an
iterative process, consisting of construction and evaluation. The seven guidelines of (Hevner
et al., 2004) provided a list of concerns that had to be taken into account during the work.
These guidelines constituted a concrete measure, which helped ensure that the correct
process was performed.
7.3 Lessons Learned on Conducting EM Projects
During development and validation of A3E2M, some lessons relevant to EM sessions were
learned. These lessons are not separate from the contribution of this thesis. Rather, they indicate
how following different combinations of efficiency criteria facilitates conducting efficient EM
processes. The lessons are presented in sections 7.3.1 to 7.3.3.
7.3.1 Access to the Relevant Information Sources
According to criterion “CP1: clarifications about the required competences”, in an EM process
the involvement of sufficiently competent people is a necessity. In a modeling team,
modeling experts have relevant knowledge about the EMM and know how to use it to conduct
a smooth EM process. Domain experts are reliable information sources about the enterprise
(see criterion “Pr3: availability of reliable information sources about the case”). The
assumption is that they have pervasive knowledge about the enterprise and the events that are
likely to occur in it. However, in practice, members of an EM team might encounter confusion
even about their field of expertise and require further informational assistance. A modeling
expert might become doubtful about whether (s)he is pursuing the EM work correctly. For
example, one might need to know how to document details of a notational element. Domain
experts might also get confused about different issues in the enterprise, such as work-flows,
rules, relevant individuals, etc. It is therefore necessary to have access to different
information sources relevant to the case. Organizational brochures, charts, existing models
(as it is stated in “Pr3: availability of reliable information sources about the case”) or other
available sources of information on the enterprise can be helpful. To learn about the EMM,
the method handbook should be considered as a relevant source.
Access to different information sources about the enterprise and the EMM should be taken
into account in any EM case. When doing so, the decision about what sources are most
helpful has to be made by the EM team. This choice is based on different factors such as what
resources are available and what information is required. Likewise, domain and modeling
experts are themselves valuable information sources about the enterprise and the EMM and
should not be neglected.
Ensuring access to the relevant information sources is a prerequisite for “following the same
language by members of the modeling team regarding the case”, discussed in section 7.3.2.
7.3.2 Following the Same Language by Members of the Modeling Team regarding
the Case
People who cooperate with each other to pursue a systematic work, need to have a clear
understanding of it. It is necessary that not only each individual understands the work, but
also that they reach unified perceptions about different topics. In an EM process, the subject is
to use an EMM to model an enterprise from particular viewpoints. For such a process, team
members can be divided into two main groups: one group consists of domain experts and the
other group consists of modeling experts. It should be confirmed that the modeling experts
and the domain experts use the same language about the EM case. This need itself can be
divided into the two following needs:
Same understanding about terms and phrases in the enterprise: In every
enterprise or organization people use specific phrases to refer to different facts. These
phrases might be unknown for people who do not work in that enterprise or
department. People who are outside the enterprise and intend to communicate with
the enterprise members should have the same perception of the set of (regularly)
used phrases and terms. When it comes to EM processes, modeling experts should
have a comprehensive understanding about terms and phrases relevant to the
enterprise. Even domain experts that work in different parts of the enterprise should
have such a uniform understanding. Fulfillment of this set of requirements results in
establishing and following the same language, and to cover this need, it is necessary
that all people involved in the EM process have access to reliable information sources
about the enterprise. This is a benefit of meeting the efficiency criterion “Pr3:
availability of reliable information sources about the case”.
Same understanding about terms and phrases in the EMM: As in enterprises, in
EMMs there also exist terms and phrases that should be understood to provide the
basis for applying the EMM and expediting the EM case. Not only the modeling
experts, but also the domain experts should understand the EMM, as well as its
terms and phrases, correctly. To ensure that the EM team members have the correct
understanding about terms and phrases in the EMM, fulfillment of efficiency criteria
for Concepts in the general case of application (“C1: elucidation about conceptual
elements” and “C2: coverage of terms and meanings used in Procedure and Notation
by Concepts and contrariwise”) is necessary. The more understandable the Concepts
used in a handbook are, the more accurate and unified the understanding of the
modeling and domain experts will be.
These two points should be considered in an EM process to make sure that the whole team is
following the same language. Conversely, violating this might result in disparate insights
about the enterprise and the EMM, which in turn might cause a deviation from the right
working path and the development of improper enterprise models.
7.3.3 Following a Concrete Action Plan
In applying an EMM and developing enterprise models, like in any other process, having a
concrete action plan is an effective factor. Without such a plan, the modeling process might
deviate from the correct path. According to (Farrington & Stachenko, 2010), an action plan is
about what should be done, by whom, and when. This requires confirming what should be
achieved and how. Accordingly, in an EM process the following points should be covered:
Mutual agreement about what should be achieved: In an EM process, domain experts
should be confident about what they expect from the case. Also, when several domain experts
are involved in the process, their expectations should match each other and not be
conflicting. For instance, the “application order planning” from infoFLOW-2 (introduced in
section 4.2.1) was done to support gaining a clear image of what information demands the
role “planner” has, whereas the EM Course (introduced in section 4.3.1) was conducted to
support students in learning EKD and the essentials of EM. A prerequisite of clarifying what
is expected from an EM case is to specify the domain of discourse. Depending on what EMM
that is being followed, this might be done in different ways. For example, in applying the IDA
method, it is done by specifying the roles that the investigated role communicates with. This
was followed in all cases presented from infoFLOW-2. This specification is a sort of guide for
both the domain experts and the modeling experts to know what parts of the enterprise they
should take into account, and what parts they should not. Other agreements, such as where the
modeling and assessment should start from and to what degree of detail the developed models
should be elaborated, have to be figured out based on the stakeholders’ further plans. Such a discussion was
conducted more or less in all background cases and validation cases. For instance, it was
discussed what elements should be included in the models (examples of this can be seen in
Statement 15 and Statement 19).
Establishing time plan and work division: Completion of any project or process requires
following a time plan. All expectations from an EM process, such as what competences are
required (see criterion “CP1: clarifications about the required competences”), what phases
should be completed (see criterion “Fr3: support provision by Framework to the case”) or
what Method Components should be used in a Framework phase (see “MC1: support
provision to relevant Framework phases”), should be drawn up respecting not only the
stakeholders’ requirements, but the time plan as well. Moreover, a member of the modeling
team needs to know what role (s)he has in the project, i.e. which of the roles specified by the
EMM handbook is going to be assigned to her/him. Although most of the time a team
member possesses one specific type of role, i.e. an enterprise modeler or a domain expert, this
will still be helpful in case more than one role is assigned to a team member.
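As a purely illustrative aside (not part of the thesis's artifact), the what/who/when structure of such an action plan, in the sense of Farrington & Stachenko (2010) as cited above, can be captured in a minimal record type; all field names and example entries below are hypothetical.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ActionItem:
    """One entry of an EM action plan: what should be done,
    by whom, and when."""
    what: str   # the activity or expected achievement
    who: str    # assigned role, e.g. "domain expert"
    due: date   # deadline from the agreed time plan

# Hypothetical plan fragment for an EM project.
plan = [
    ActionItem("agree on the domain of discourse", "whole EM team",
               date(2013, 4, 1)),
    ActionItem("develop first draft of the models", "modeling expert",
               date(2013, 4, 15)),
]
assert all(item.who for item in plan)  # every activity has an owner
```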
8. Conclusions & Future Work
After developing the results, validating them and conducting discussions, it is time to draw up
final conclusions about the extent to which the contribution matches the aims and
expectations for starting the project. This is covered in section 8.1. This is followed by stating
some possibilities for further research, in 8.2.
8.1 Conclusions
By conducting this research the author aimed at studying the issue of efficiency in the EM
field, and especially for EMMs. For this purpose, the focus was on developing an approach for
evaluating the efficiency of an EMM. The approach supports evaluating criteria that always have
to be fulfilled in the same way, as well as criteria whose fulfillment varies from case to case. By
the latter we mean that there are criteria whose fulfillment for a described case is not
necessarily generalizable to further cases.
For writing this dissertation three research questions were defined:
RQ 1. What is the meaning of efficiency in the context of EMM?
RQ 2. How can the efficiency of an EMM be evaluated?
RQ 3. In what phases of an EM process could method efficiency be evaluated?
Development of the results was based on case studies and EMMs that were used in the cases.
This was followed by validating the results and answering the research questions. Using the
gained results and validation outcomes, it is possible to investigate whether an EMM is
efficient (supports efficiency in an EM process). Evaluating this feature of an EMM has
become possible by developing an efficiency evaluation approach, called A3E2M.
This thesis contributes to both the scientific domain and the practice domain. Investigating
and evaluating quality of models and modeling processes have been the topic of different
research efforts (see section 3.2). Efficiency is mentioned as an aspect of quality in all quality
models relevant to IS (see section 3.1.3). This thesis contributes to the scientific domain by
presenting an Approach for Efficiency Evaluation of EMMs (A3E2M), which can be regarded
as a support for quality evaluation in EM in a more concrete way. The approach itself still has
room for improvement. A number of possibilities for further research are suggested in section
8.2. In addition to this, the idea of efficiency evaluation in EM itself is a novel topic and can
be the basis for further research.
The outcome of this research is useful not only to the research community, but to the practice
community as well. The developed approach, made up of efficiency criteria for different EMM
parts and suggested driving questions, assists the audience (potential applier) of A3E2M by
providing a basis for starting an efficiency evaluation process and reaching a common language
about how a particular EMM should be. The presentation of A3E2M is flexible. It is presented
here on an overall level; thus, practitioners may interpret and adapt it based on their
knowledge and purpose of use. By applying this contribution, users cannot expect to gain
satisfactory results right away. Rather, they should be aware that this is an iterative process
that has to be continued and repeated during the whole lifetime of the EM process.
8.2 Future Work
This dissertation includes an approach for evaluating the efficiency of EMMs; this, however, is
not the end of the work, and there exist areas for further improvement. In addition to what already
has been discussed in 6.4, the identified areas for further development are briefly explained in
sections 8.2.1 to 8.2.3:
8.2.1 Expanding the Efficiency Evaluation Approach
The proposed approach is still in its first stage of development and has to be expanded to
become a more mature evaluation approach. To support this, the following suggestions for
expansion of A3E2M are mentioned.
The approach currently supports evaluating EMMs that are already developed and are going
to be applied. In real life, other application types of EMMs might occur, such as developing
an EMM from scratch to fulfill an uncovered Perspective. It might also be necessary to
compare or integrate two or more EMMs to utilize their strengths and counter their
weaknesses. In such cases, it is also important to develop an efficient EMM. Thus, the current
version of A3E2M should be enriched by adding criteria and driving questions for other
application types of EMMs. In addition to this, it should be clarified in what sequence the
evaluation of different parts should be carried out.
An evaluation process is executed by asking (driving) questions developed for this purpose.
The driving questions should be improved with the purpose of reaching an exhaustive list of
questions. For the driving questions relevant to each EMM part, it should be clarified with
what questions the evaluation process should start, how to pose further questions and how to
end the questioning, in all different application types of EMMs.
Another suggestion for expanding A3E2M is relevant to documentation and presentation of
evaluation results. By applying the evaluation approach, a user of A3E2M obtains answers
and results regarding the efficiency of the investigated EMM. Accordingly, another
opportunity for expansion of A3E2M is to provide guidelines for how to document evaluation
outputs.
8.2.2 Moving from an Evaluation Approach to an Improvement Approach
Evaluation is the first step in an EMM efficiency improvement process. Such an evaluation
process is expected to result in identification of the shortcomings and consequently the
change needs for the EMM under study. To conduct the improvement process of the EMM, it
is necessary to have suitable change measures in place. By change measures we refer to
actions that are required for addressing the change needs discovered in an EMM. Thus, the
evaluation approach is required to be completed in a second phase that supports identifying
and implementing the change measures. Since there exist different application types of EMMs
and the efficiency evaluation process is dependent on the planned application, it should be
identified which change measures are relevant to which application type. In other words,
suggested change measures should be categorized according to the possible applications of
EMMs. By following a concrete approach for setting up the change measures, team members
will have a common language about what changes are possible for each EMM part and the
details of each part.
8.2.3 Developing Guidelines for Conducting an Efficient Evaluation Process
The suggested approach is a general set of criteria for different EMM parts. Each EM case is
done to fulfill a specific and new purpose, probably in a different enterprise and/or by a new
modeling team. To evaluate and improve the efficiency of an EMM, an evaluation followed
by an improvement process should be performed. Both of these additional processes require
extra effort and the use of extra resources. Hence, it is necessary to consider the issue of
efficiency for these purposes as well. Having a clear and concrete understanding of how
the evaluation and improvement processes should be conducted helps in delivering the
required results while a reasonable amount of resources is used. In other words,
development of an artifact that entails criteria and guidelines about how an evaluation and
improvement process of EMMs should be done in order to be called efficient, is another part
of future work.
References
Abdiu, D., Strandberg, M. & Stridberg, M. (2005). The impact of a real-time IT-
Logistics solution (Master’s Thesis). Jönköping International Business School,
Jönköping University, Sweden.
Abdulhadi, S. (2007). i* Guide. Retrieved Sept 18, 2012 from http://istar.rwth-
aachen.de/tiki-index.php?page=i%2A+Guide
van Aken, J., E. (2004). Management Research Based on the Paradigm of the Design
Sciences: The Quest for Field-Tested and Grounded Technological Rules. Journal of
Management Studies, 41(2), 219-246. doi: 10.1111/j.1467-6486.2004.00430.x
Al-Tarawaneh, K., A. (2012). Measuring E-Service Quality from the Customers'
Perspective: An Empirical Study on Banking Services. International Research Journal
of Finance and Economics, 91, 123-137.
Allee, V. (1997). The Knowledge Evolution: Expanding Organizational Intelligence.
Boston, MA: Butterworth Heinemann.
ANSI/NEMA. (1994). Committee Draft 14258 - Industrial automation systems,
Systems architecture: Framework for Enterprise Modeling, Technical Report, ISO
TC184 SC5 WG1.
Anderson, J., Donnellan, B. & Hevner, A. (2012). Exploring the Relationship between
Design Science Research and Innovation: A Case Study of Innovation at Chevron.
Communications in Computer and Information Science, 286. 116-131. New York,
NY: Springer-Verlag Berlin. doi: 10.1007/978-3-642-33681-2_10
Avison, D., E. & Fitzgerald, G. (2006). Information systems development:
methodologies, techniques & tools. Columbus, OH: McGraw-Hill.
Baskerville, R. (1991). Risk analysis as a source of professional knowledge.
Computers &amp; Security, 10(8), 749-764. doi: 10.1016/0167-4048(91)90094-T.
Batini, C., Ceri, S. &amp; Navathe, S. (1992). Conceptual database design. An entity
relationship approach. San Francisco, CA: Benjamin Cummings Publishing
Company.
Becker, J., Rosemann, M. &amp; Schütte, R. (1995). [In German] Guidelines of Modelling
(GoM). Wirtschaftsinformatik, 37 (5), 435-445.
Becker, J., Rosemann, M. &amp; von Uthmann, C. (2000). Guidelines of Business Process
Modeling. W. van der Aalst (Ed.), Business Process Management: Models,
Techniques and Empirical Studies (pp. 30-49). doi: 10.1007/3-540-45594-9_3
Benbasat, I., Goldstein, D. K. & Mead, M. (1987). The case research strategy in
studies of information systems. MIS Quarterly, 11 (3), 368-386. Retrieved from
http://www.jstor.org/stable/248684
Benbasat, I. & Zmud, R., W. (1999). Empirical Research in Information Systems: The
Practice of Relevance. MIS Quarterly, 23(1). 3-16.
Bernus, P. (2001). Some thoughts on enterprise modeling, Production Planning &
Control: The Management of Operations, 12 (2), 110-118. doi:
10.1080/09537280150501211
Bernus, P. (2003). Enterprise models for enterprise architecture and ISO9000:2000,
Annual Reviews in Control, 27, 211-220. doi: 10.1016/j.arcontrol.2003.09.004
Beynon-Davies, P., Owens, I. &amp; Williams, M. D. (2004). Information Systems Evaluation
and the Information Systems Development Process. Journal of Enterprise Information
Management, 17(4), 276-282. doi: 10.1108/17410390410548689
Boehm, B., W., Brown, J., R., Kaspar, H., Lipow M., McCleod, G., J., & Merritt M.,
J. (1978). Characteristics of Software Quality. Amsterdam, North-Holland.
Bolloju, N. &amp; Leung, S.S.K. (2006). Assisting Novice Analysts in Developing Quality
Conceptual Models with UML. Communications of the ACM, 49(7), 108-112. doi:
10.1145/1139922.1139926
van Bommel, P. &amp; Hoppenbrouwers, S.J.B.A. (2008). QoMo: A Modeling Process
Quality Framework based on SEQUAL. H. A. Proper, T. Halpin, and J. Krogstie,
(Eds.), Proceedings of Exploring Modeling Methods for Systems Analysis and Design
07 (pp. 118-127). Trondheim, Norway: Tapir Academic Press.
vom Brocke, J. & Buddendick, Ch. (2006). Reusable Conceptual Models -
Requirements Based on the Design Science Research Paradigm. First International
Conference on Design Science Research in Information Systems and Technology:
February (pp. 24–25).
Brinkkemper, S., Saeki, M. &amp; Harmsen, F. (1999). Meta-modeling based assembly
techniques for situational method engineering. Information Systems, 24(3), 209-
228. doi: 10.1016/S0306-4379(99)00016-2
Breu, R. &amp; Chimiak-Opoka, J. (2005). Towards Systematic Model Assessment.
Proceedings of Perspectives in Conceptual Modeling: ER 2005 Workshops (pp. 398-
409). LNCS 3770. doi: 10.1007/11568346_43
Bubenko, Jr., J., A. (1986). Information systems methodologies - a research view. T.
W. Olle, H. G. Sol &amp; A. A. Verrijn-Stuart (Eds.), Information Systems
Methodologies: Improving the Practice (pp. 289-318). North-Holland.
Bubenko, J. A., Jr., Brash, D. &amp; Stirna, J. (1998). EKD User Guide. Kista, Dept. of
Computer and Systems Science, Royal Institute of Technology (KTH) and Stockholm
University, Stockholm, Sweden. Retrieved from
ftp://ftp.dsv.su.se/users/js/ekd_user_guide.pdf
Bubenko Jr, J., Persson, A., & Stirna, J. (2010). An Intentional Perspective on
Enterprise Modeling. Intentional Perspectives on Information Systems Engineering,
Springer-Verlag. doi: 10.1007/978-3-642-12544-7_12
Carstensen, A. (2011). The Evolution of the Connector View Concept: Enterprise
Models for Interoperability Solutions in the Extended Enterprise (Licentiate Thesis).
Department of Computer and Information Science, Linköping University, Linköping,
Sweden.
Carstensen, A., Khademhosseinieh, B., Lundqvist, M., Seigerroth, U. & Sandkuhl, K.
(2012a). InfoFLOW-2 Deliverable DX.X Version 1.0: Application Case “Information
Engineering (IE) program manager at JTH (Jönköping Tekniska Högskolan)”,
Experiences from IDA-Modelling Activities. Jönköping University, School of
Engineering, Jönköping, Sweden.
Carstensen, A., Khademhosseinieh, B., Lundqvist, M., Seigerroth, U. & Sandkuhl, K.
(2012b). InfoFLOW-2 Deliverable DX.X Version 1.0: Application Case “Marketing
Department at JTH”, Experiences from IDA-Modelling Activities. Jönköping
University, School of Engineering, Jönköping, Sweden.
Carstensen, A., Khademhosseinieh, B., Lundqvist, M., Seigerroth, U. & Sandkuhl, K.
(2012c). InfoFLOW-2 Deliverable DX.X Version 1.0: Application Case “Order
Planning”, Experiences from IDA-Modelling Activities. Jönköping University, School
of Engineering, Jönköping, Sweden.
Cavaye, A. L. M. (1996). Case study research: a multi-faceted research approach for
IS. Information Systems Journal, 6(3), 227-242.
Chen, Ch. (2006). Applying the stochastic frontier approach to measure hotel
managerial efficiency in Taiwan. Tourism Management, 28, pp. 696-702. doi:
10.1016/j.tourman.2006.04.023
Cherfi, S. S., Akoka, J. &amp; Comyn-Wattiau, I. (2002). Conceptual modeling quality-
from EER to UML schemas evaluation. S. Spaccapietra, S.T. March, Y. Kambayashi
(Eds.), Proceedings of the 21st International Conference on Conceptual Modeling
(pp. 414-428). LNCS 2503. doi: 10.1007/3-540-45816-6_38
Christensen, L.C., Christiansen, T.R., Jin, Y., Kunz, J. &amp; Levitt, R.E. (1996). Modeling
and simulation in enterprise integration - A framework and an application in the
offshore oil industry. Concurrent Engineering – Research and Applications, 4(3), pp.
247-259.
Course Syllabus Enterprise Modeling, 7.5 Credits. (n.d.). School of Engineering,
Jönköping University.
Cronholm, S. &amp; Ågerfalk, P. J. (1999). On the Concept of Method in Information
Systems Development. Käkölä (Ed.), Proceedings of the 22nd Information Systems
Research Seminar in Scandinavia: Enterprise Architecture for Virtual Organizations.
Linköping Electronic Articles in Computer and Information Science, 4(19) (pp. 229-
236).
Cox, J. and Dale, B.G. (2001). Service quality and ecommerce: An exploratory
analysis. Managing Service Quality, 11(2), 121-131.
Darke, P., Shanks, G. & Broadbent, M. (1998). Successfully completing case study
research: combining rigour, relevance and pragmatism. Information Systems Journal,
8 (4), 273-289. doi: 10.1046/j.1365-2575.1998.00040.x
Degbelo, A., Matongo, T.& Sandkuhl, K. (2010). Enterprise Models as Interface for
Information Searching. P. Forbrig & H. Günther (Eds.) Proceedings of the 9th
International Conference on Perspectives in Business Informatics Research (pp. 27-
42). LNBIP 64. doi: 10.1007/978-3-642-16101-8_3
Delen, D. & Benjamin, P., C. (2003). Towards a truly integrated enterprise modeling
and analysis environment. Computers in Industry, 51 (3), pp. 257-268. doi:
10.1016/S0166-3615(03)00063-0
Denning, P. J. (1997). A New Social Contract for Research. Communications of the
ACM, 40 (2), 132-134. doi: 10.1145/253671.253755
Dennis, A. R. (2001). Conducting Research in Information Systems. Communications
of the Association for Information Systems, 7(5), 1-41.
Dietz, J. L. &amp; Albani, A. (Eds.). (2008). Advances in Enterprise Engineering I:
Proceedings of the 4th International Workshop CIAO! and 4th International Workshop
EOMAS, held at CAiSE 2008, Montpellier. LNBIP 10. New York, NY: Springer
Berlin Heidelberg.
Doolin, B. (1996). Alternative views of case research in information systems.
Australian Journal of Information Systems, 3(2), 21-29. doi:10.3127/ajis.v3i2.383
Dromey, R. G. (1996). Cornering the Chimera [software quality]. Software,
IEEE, 13(1), 33-43. doi: 10.1109/52.476284
Drucker, P. F. (1974). Management: tasks, responsibilities, practices. New York, NY:
Harper & Row.
Drucker, P. F. (2008). Management (revised edition). New York, NY: Harper Collins.
Durand, T. & Dubreuil, M. (1999). Humanizing the future: managing change with
soft technology. Foresight, 3(4), 285-295. doi: 10.1108/14636680110803283
ElAlfi, A., E., E., ElAlami, M., E. & Asem, Y., M. (2009). Knowledge Extraction for
Discriminating Male and Female in Logical Reasoning from Student Model.
International Journal of Computer Science and Information Security, 6(1), pp. 6-15.
Retrieved from http://arxiv.org/ftp/arxiv/papers/0911/0911.0028.pdf
Farrell, M. J. (1957). The measurement of productive efficiency. Journal of the Royal
Statistical Society. Series A (General), 120(3), 253-290.
Farrington, J., L. & Stachenko, S. (2010). Country capacity for non-communicable
disease prevention and control in the WHO European Region, Preliminary Report.
World Health Organization, Regional Office for Europe.
Feigenbaum, A., V. (1991). Total Quality Control (3rd ed.). New York, NY:
McGraw-Hill.
Fettke, P. & Loos, P. (2003a). Multi-perspective evaluation of reference models:
towards a framework. D. Gentner, G. Poels, H.J. Nelson & M. Piattini (Eds.),
Proceedings of the International Workshop on Conceptual Modeling Quality 03 (pp.
80-91). LNCS 2814. doi: 10.1007/978-3-540-39597-3_9
Fettke, P., Loos, P. (2003b). [In German] Ontological Evaluation of the Semantic
Object Model. E.J. Sinz, M. Plaha & P. Neckel (Eds.), Modellierung betrieblicher
Informationssysteme. Proceedings of the MobIS Workshop (pp. 109-129).
Fettke, P., Houy, C., Vella, A. & Loos, P. (2012). Towards the Reconstruction and
Evaluation of Conceptual Model Quality Discourses – Methodical Framework and
Application in the Context of Model Understandability. I. Bider, T. Halpin, J. Krogstie,
S. Nurcan, E. Proper, R. Schmidt, P. Soffer, S. Wrycza, W. van der Aalst, J.
Mylopoulos, M. Rosemann, M.J. Shaw & C. Szyperski (Eds.), Enterprise, Business-
Process and Information Systems Modeling (pp. 406-421). LNBIP 113. doi:
10.1007/978-3-642-31072-0_28
Final Report for infoFLOW-2. (2012). Jönköping, Sweden: School of Engineering at
Jönköping University.
Fox, M.S. & Gruninger, M. (1998). Enterprise Modelling. AI Magazine, 19(3), pp.
109-122. Retrieved from
http://aaaipress.org/ojs/index.php/aimagazine/article/viewFile/1399/1299
Frank, U. (2002). Multi-Perspective Enterprise Modeling (MEMO) - Conceptual
Framework and Modeling Languages. Proceedings of the 35th Hawaii International
Conference on System Sciences, 3. doi: 10.1109/HICSS.2002.993989
Frank, U. (2007). Evaluation of Reference Models. P. Fettke, & P. Loos (Eds.),
Reference Modeling for Business Systems Analysis (pp. 118-140). Hershey, PA: Idea
Group Publishing. doi: 10.4018/978-1-59904-054-7.ch006
Gandon, F. (2001). Engineering Ontology for a Multi-Agent Corporate Memory
System. Proceedings of the 8th International Symposium on the Management of
Industrial and Corporate Knowledge (ISMICK01) (pp. 209-228). Retrieved from
ftp://138.96.0.43/acacia/fgandon/research/ismick2001/article_fabien_gandon_ismick2001.pdf
Garvin, D.A. (1984). What Does "Product Quality" Really Mean? Sloan
Management Review, 26(1), 25-43.
Ghauri, P. N., Gronhaug, K. & Kristianslund, I. (1995). Research Methods in Business
Studies. Upper Saddle River, NJ: Prentice Hall.
Ghidini, C., Rospocher, M., Serafini, L., Faatz, A., Kump, B., Ley, T., Pammer, V. &
Lindstaedt, S. (2008). Collaborative enterprise integrated modelling. Proceedings of
the 5th International Workshop on Semantic Web Applications and Perspectives,
Volume 426 of CEUR Workshop Proceedings.
Gleason, J.M. & Barnum, D.T. (1986). Toward Valid Measures of Public Sector
Productivity: Performance Measures in Urban Transit. Management Science, 28(4),
pp. 379-386.
Golder, P., N., Mitra, D. & Moorman, Ch. (2012). What is quality? An Integrative
Framework of Processes and States. Journal of Marketing, 76 (4), 1-23. doi:
10.1509/jm.09.0416
Goldkuhl, G., Lind, M. & Seigerroth, U. (1998). Method integration: the need for a
learning perspective. IEE Proceedings - Software, 145(4), pp. 113-118. doi:
10.1049/ip-sen:19982197
Goldkuhl, G. & Röstlinger, A. (2003). The significance of work practice diagnosis:
Socio-pragmatic ontology and epistemology of change analysis. Proceedings of the
International workshop on Action in Language, Organizations and Information
Systems. Linköping University.
Goldkuhl, G. & Röstlinger, A. (2010). Development of public e-services - a method
outline. Presented at the 7th Scandinavian Workshop on E-Government, SWEG-2010.
Grady, R. & Caswell, D. (1987). Software Metrics: Establishing a Company-Wide
Program. Englewood Cliffs, NJ: Prentice-Hall.
Harmon, P. (2010). The Scope and Evolution of Business Process Management. J.
vom Brocke & M. Rosemann (Eds.), Handbook on Business Process Management 1:
Introduction, Methods and Information Systems (pp. 83-106). Germany: Springer
Berlin Heidelberg.
Hayes, J. (2007). The Theory and Practice of Change Management. Basingstoke, UK:
Palgrave Macmillan.
Hevner, A., R., March, S., T., Park, J. & Ram, S. (2004). Design Science in
Information Systems Research. MIS Quarterly, 28(1), 75-106.
Hommes, B.-J. & van Reijswoud, V. (2000). Assessing the quality of business process
modelling techniques. Proceedings of the 33rd Annual Hawaii International
Conference on System Sciences (pp. 1-10). doi: 10.1109/HICSS.2000.926591
ISO/IEC IS 9126. (1991). Information Technology—Software Product Evaluation:
Quality Characteristics and Guidelines for Their Use. Geneva, Switzerland: ISO.
Jackson, M. (2000). An analysis of flexible and reconfigurable production systems
(Doctoral Dissertation). Department of Mechanical Engineering, Linköping
University, Linköping, Sweden.
Juran, J., M. (1992). Juran on Quality by Design: The New Steps for Planning Quality
into Goods and Services. New York, NY: The Free Press.
Järvinen, P. (2007). On Reviewing of Results in Design Research. Proceedings of the
15th European Conference on Information Systems - ECIS 2007 (pp. 1388-1397).
Retrieved from http://www.cs.uta.fi/reports/dsarja/D-2007-8.pdf
Kaidalova, J. (2011). Efficiency indicators for Enterprise Modelling Methods and
Enterprise Models (Master's Thesis), Jönköping University, Sweden.
Kassem, M., Dawood, N. N. & Mitchell, D. (2011). A structured methodology for
enterprise modeling: a case study for modeling the operation of a British organization.
Journal of Information Technology in Construction, 16, pp. 381-410. Retrieved from
http://www.itcon.org/2011/23
Kesh, S. (1995). Evaluating the Quality of Entity Relationship Models. Information
and Software Technology, 37 (12), pp. 681-689. doi: 10.1016/0950-5849(96)81745-9
Khosravi, K. & Guéhéneuc, Y.-G. (2004). A Quality Model for Design Patterns
(Technical Report 1249). Montréal, Québec: University of Montreal.
Kim, M., Kim, J. & Lennon, S., J. (2006). Online service attributes available on
apparel retail web sites: an E-S-QUAL approach, Managing Service Quality, 16 (1),
51-77.
Kosanke, K. & Nell, J.G. (Eds.) (1997). Enterprise Engineering and Integration:
Building International Consensus. Berlin, Germany: Springer-Verlag.
Koubarakis, M. & Plexousakis, D. (2002). A formal framework for business process
modeling and design. Information Systems, 27, pp. 299-319. doi:
10.1016/S0306-4379(01)00055-2
Krogstie, J., Lindland, O., I. & Sindre, G. (1995). Towards a deeper understanding of
quality in requirements engineering. Proceedings of the 7th International Conference
on Advanced Information Systems Engineering - CAiSE 1995 (pp. 82-95). LNCS 932.
doi: 10.1007/3-540-59498-1_239
Krogstie, J. (2012a). Quality of Models. In Model-Based Development and Evolution
of Information Systems: A Quality Approach (pp. 205-247). doi: 10.1007/978-1-4471-
2936-3_4
Krogstie, J. (2012b). Quality of Models. In Model-Based Development and Evolution
of Information Systems: A Quality Approach (pp. 281-326). doi: 10.1007/978-1-4471-
2936-3_6
Krogstie, J. & Jørgensen, H., D. (2003). Quality of Interactive Models. M. Genero, F.
Grandi, W.-J. van den Heuvel, J. Krogstie et al. (Eds.), Advanced Conceptual
Modeling Techniques - ER 2002 Workshops (pp. 351-363). LNCS 2784. doi:
10.1007/978-3-540-45275-1_31
Krogstie, J., Sindre, G. & Jørgensen, H., D. (2006). Process models representing
knowledge for action: a revised quality framework. European Journal of Information
Systems, 15(1), 91-102. doi: 10.1057/palgrave.ejis.3000598
Kurosawa, K. (1991). Productivity measurement and management at the company
level: the Japanese experience. Advances in Industrial Engineering, 14, Chapter 2.
Amsterdam, The Netherlands: Elsevier.
Lee, A. S. (1989). A Scientific Methodology for MIS Case Studies. MIS Quarterly,
13(1), 33-50. Retrieved from http://www.jstor.org/stable/248698
Levitin, A. & Redman, T. (1995). Quality dimensions of a conceptual view.
Information Processing and Management, 31(1), pp. 81-88. doi:
10.1016/0306-4573(95)80008-H
Liles, D., H. (1996). Enterprise Modeling Within an Enterprise Engineering
Framework. J. M. Charnes, D. J. Morrice, D. T. Brunner & J. J. Swain (Eds.),
Proceedings of the 28th Winter Simulation Conference (pp. 993-999).
Lindland, O., Sindre, G. & Sølvberg, A. (1994). Understanding Quality in Conceptual
Modeling. IEEE Software, 11(2), pp. 42-49. doi: 10.1109/52.268955
Lindström, A. & Polyakova. (2010). CRM Tool & Philosophy – The Clue to a
Customer-Centric Organization (Master’s Thesis). Linnaeus University, Sweden.
Lovell, C., A. (1993). Production frontiers and productive efficiency. H. Fried, C. A. K.
Lovell & S. S. Schmidt (Eds.), The Measurement of Productive Efficiency:
Techniques and Applications (pp. 3-67). Oxford, UK: Oxford University Press.
Lundqvist, M., Holmquist, E., Sandkuhl, K., Seigerroth, U. & Strandesjö, J. (2009).
Information Demand Context Modelling for Improved Information Flow: Experiences
and Practices. A. Persson & J. Stirna (Eds.), Proceedings of the 2nd IFIP WG 8.1
Working Conference, PoEM 2009 (pp. 8-22). LNBIP 39. doi:
10.1007/978-3-642-05352-8_3
Lundqvist, M., Sandkuhl, K., Seigerroth, U. & Holmquist, E. (2009). [In Swedish]
Handbok för informationsbehovsanalys Version 1.0. The Information Engineering
Research Group, Department for Computer/Electrical Engineering, School of
Engineering at Jönköping University, Jönköping, Sweden.
Lundqvist, M., Sandkuhl, K., Seigerroth, U. & Holmquist, E. (2011). [In Swedish]
Handbok för informationsbehovsanalys Version 1.6. The Information Engineering
Research Group, Department for Computer/Electrical Engineering, School of
Engineering at Jönköping University, Jönköping, Sweden.
Lundqvist, M., Sandkuhl, K., Seigerroth, U. & Holmquist, E. (2012). [In Swedish]
Handbok för informationsbehovsanalys Version 2.0. The Information Engineering
Research Group, Department for Computer/Electrical Engineering, School of
Engineering at Jönköping University, Jönköping, Sweden.
Madarász, L., Raček, M., Kováč, M. & Timko, M. (2005). Tools and intelligent
methods in enterprise modeling. Proceedings of the IEEE 9th International Conference
on Intelligent Engineering Systems, Mediterranean Sea (pp. 187-192). doi:
10.1109/INES.2005.1555155
Madu, C., N. & Madu, A., A. (2002). Dimensions of e-quality. International Journal of
Quality & Reliability Management, 19(3), 246-259.
Maier, R. (1999). Evaluation of data modeling. J. Eder, I. Rozman, T. Welzer (Eds.),
Proceedings of Advances in Databases and Information Systems (pp. 232-246). LNCS
1691. doi: 10.1007/3-540-48252-0_18
Maier, R. (2001). Organizational concepts and measures for the evaluation of data
modeling. S. Becker (Ed.), Developing Quality Complex Database Systems: Practices,
Techniques and Technologies. Hershey, PA: Idea Group Publishing.
March, S., T. & Smith, G., F. (1995). Design and natural science research on
Information Technology. Decision Support Systems, 15 (4), 251-266. doi:
10.1016/0167-9236(94)00041-2
Marczyk, G., DeMatteo, D. & Festinger, D. (2005). Essentials of Research Design and
Methodology. New York, NY: John Wiley & Sons.
Martin, J. (1989). Information Engineering: Introduction, Book I. Englewood Cliffs,
NJ: Prentice Hall.
McCall, J.A., Richards, P.K., & Walters, G.F. (1977). Factors in Software Quality,
Vols. I–III, AD/A-049-014/015/055. Springfield, VA: National Technical Information
Service.
Mevius, M. (2007). Performance Indicator-Based Business Process Engineering with
Performance Nets. Proceedings of the IADIS International Conference e-Commerce
2007 (pp. 197-204). Retrieved from
http://www.iadisportal.org/digital-library/mdownload/performance-indicator-based-business-process-engineering-with-performance-nets
Mingers, J. (2000). Variety is the spice of life: combining soft and hard OR/MS
methods. International Transactions in Operational Research, 7(6), pp. 673-691. doi:
10.1111/j.1475-3995.2000.tb00224.x
Milikowsky, M. (2008). A Not Intractable Problem: Reasonable Certainty, Tractebel,
and the Problem of Damages for Anticipatory Breach of a Long-Term Contract in a
Thin Market. Columbia Law Review, 108(2), pp. 452-493.
Mišic, V., B. & Zhao, J., L. (2000). Evaluating the quality of reference models.
Proceedings of the 19th International Conference on Conceptual Modeling (pp. 484-
498). LNCS 1920. doi: 10.1007/3-540-45393-8_35
Molina, E., S. (2003). Evaluating IT Investments: A Business Process Simulation
Approach (Doctoral Thesis). Industrial Information and Control Systems Department
of Electrical Engineering KTH, Royal Institute of Technology, Sweden.
Moody, D., L. (1998). Metrics for Evaluating the Quality of Entity Relationship
Models. Proceedings of the 17th International Conference on Conceptual Modeling
(pp. 211-225). LNCS 1507. doi: 10.1007/978-3-540-49524-6_18
Moody, D., L. (2005). Theoretical and practical issues in evaluating the quality of
conceptual models: current state and future directions. Data & Knowledge
Engineering, 55, 243-276. doi: 10.1016/j.datak.2004.12.005
Moody, D., L. & Shanks, G. (1994). What Makes a Good Data Model? Evaluating the
Quality of Entity Relationship Models. Proceedings of the 13th International
Conference on Conceptual Modelling (pp. 94-111). LNCS 881. doi:
10.1007/3-540-58786-1_75
Moody, D., L.& Shanks, G., G. (1998a). Evaluating and improving the quality of
entity relationship models: an action research programme. Australian Computer
Journal, 30 (3), 97-110.
Moody, D., L. & Shanks, G.G. (1998b). What Makes a Good Data Model? A
Framework for Evaluating and Improving the Quality of Entity Relationship Models.
Australian Computer Journal, 30(3), pp. 97-110.
Moody, D., L.& Shanks, G., G. (2003). Improving the quality of data models:
Empirical validation of a quality management framework. Information Systems, 28
(6), pp. 619-650. doi: 10.1016/S0306-4379(02)00043-1
Moody, D., L., Sindre, G., Brasethvik, T. & Sølvberg, A. (2002). Evaluating the
Quality of Process Models: Empirical Testing of a Quality Framework. S. Spaccapietra,
S.T. March & Y. Kambayashi (Eds.), Proceedings of the 21st International Conference
on Conceptual Modeling (pp. 380-396). LNCS 2503. doi: 10.1007/3-540-45816-6_36
Morris, S., Devlin, N. & Parkin, D. (2007). Economic Analysis in Health Care.
Chichester, UK: John Wiley & Sons.
Neely, A., Gregory, M. & Platts, K. (1995). Performance measurement system design:
A literature review and research agenda. International Journal of Operations &
Production Management, 15(4), pp. 80-116.
Ngwenyama, O., K. & Grant, D., A. (1993). Enterprise modeling for CIM information
systems architectures: an object-oriented approach. Computers & Industrial
Engineering, 26(2). Retrieved from
http://deepblue.lib.umich.edu/bitstream/2027.42/31672/1/0000608.pdf
Nurcan, S. (2008). A survey on the flexibility requirements related to business
processes and modeling artifacts. Proceedings of the 41st Hawaii International
Conference on System Sciences (pp. 378-387). doi: 10.1109/HICSS.2008.39
Odell, J., J. (1996). A primer to method engineering. Proceedings of the IFIP TC8,
WG8.1/8.2 Working Conference on Method Engineering: Principles of Method
Construction and Tool Support. London, UK: Chapman & Hall.
Oliga, J. (1988). Methodological foundations of systems methodologies. Systemic
Practice and Action Research, 1(1), pp. 87-112. doi: 10.1007/BF01059890
OMG. (1997). Object Management Group - UML. Retrieved from
http://www.uml.org/
Opdahl, A., L. & Henderson-Sellers, B. (1999). Evaluating and improving OO
modeling languages using the BWW-model. Proceedings of the Information Systems
Foundations Workshop (Ontology, Semiotics and Practice). Retrieved from
http://comp.mq.edu.au/isf99/Opdahl.htm
Opdahl, A., L., Henderson-Sellers, B. & Barbier, F. (2000a). Using Ontology to
Analyse Whole-Part Relationships in OO-Models. Submitted for publication.
Opdahl, A., L., Henderson-Sellers, B. & Barbier, F. (2000b). An Ontological
Evaluation of the OML Metamodel. E. D. Falkenberg, K. Lyytinen & A. A. Verrijn-
Stuart (Eds.), Proceedings of the IFIP TC8/WG8.1 International Conference on
Information System Concepts - ISCO 2000. Retrieved from
http://dl.acm.org/citation.cfm?id=661112
Orlikowski, W., J. & Iacono, C., S. (2001). Research Commentary: Desperately
Seeking the 'IT' in IT Research? A Call to Theorizing the IT Artifact. Information
Systems Research, 12(2), 121-134. doi: 10.1287/isre.12.2.121.9700
Ortega, M., Pérez, M., A. & Rojas, T. (2003). Construction of a Systematic Quality
Model for Evaluating a Software Product. Software Quality Journal, 11(3), 219-242.
Owen, C., L. (1998). Design Research: Building the Knowledge Base. Design Studies,
19 (1), 9-20. doi: 10.1016/S0142-694X(97)00030-6
Perry, D., E., Sim, S., E. & Easterbrook, S., M. (2004). Case Studies for Software
Engineers. Proceedings of the 26th International Conference on Software Engineering
(pp. 736-738). doi: 10.1109/ICSE.2004.1317512
Persson, A. & Stirna, J. (2001). Why Enterprise Modelling? An Explorative Study into
Current Practice. K.R. Dittrich, A. Geppert & M. C. Norrie (Eds.), Proceedings of
the 13th Conference on Advanced Information Systems Engineering (pp. 465-468).
LNCS 2068. Springer Berlin Heidelberg. doi: 10.1007/3-540-45341-5_31
Persson, A. & Stirna, J. (2009). Anti-patterns as a Means of Focusing on Critical
Quality Aspects in Enterprise Modeling. T. Halpin, J. Krogstie, S. Nurcan, E. Proper,
R. Schmidt & P. Soffer (Eds.), Proceedings of the 10th International Workshop,
BPMDS 2009, and 14th International Conference, EMMSAD 2009, held at CAiSE
2009 (pp. 407-418). Springer Berlin Heidelberg. doi: 10.1007/978-3-642-01862-6_33
Persson, A. & Stirna, J. (2010). Towards Defining a Competence Profile for the
Enterprise Modeling Practitioner. P. van Bommel, S. Hoppenbrouwers, S. Overbeek,
E. Proper & J. Barjis (Eds.), Proceedings of the 3rd IFIP WG 8.1 Working Conference
on the Practice of Enterprise Modeling, PoEM 2010 (pp. 232-245). LNBIP 68. doi:
10.1007/978-3-642-16782-9_17
Petrie, C. J. (Ed.). (1992). Enterprise integration modeling. Proceedings of the 1st
international conference on enterprise integration, Cambridge, MA, pp. 1–13.
Prakash, N. (1999). On method statics and dynamics. Information Systems, 24 (8), pp.
613-637. doi: 10.1016/S0306-4379(00)00002-8
Priem, R., L. & Butler, J., E. (2001). Is the resource-based “view” a useful perspective
for strategic management research?. Academy of Management Review, 26(1),22-40.
Retrieved from http://www.jstor.org/stable/259392
Programme Syllabus IT, Management and Innovation (One Year), 60 credits. (2012).
Jönköping International Business School, Jönköping University.
Programme Syllabus IT, Management and Innovation (Two Years), 120 credits.
(2012). Jönköping International Business School, Jönköping University.
Programme Syllabus Master of Science in Informatics, specialization Information
Engineering and Management, 120 credits. (2012). School of Engineering, Jönköping
University.
Ralyté, J., Backlund, P., Kühn, H. & Jeusfeld, M., A. (2006). Method Chunks for
Interoperability. Proceedings of the 25th International Conference on Conceptual
Modeling (ER 2006) (pp. 339-353). LNCS 4215. doi: 10.1007/11901181_26
Reeves, C. & Bednar, D. (1994). Defining quality: alternatives and implications.
Academy of Management Review, 19(3), 419–445. Retrieved from
http://www.jstor.org/stable/258934
Remenyi, D. & Money, A. (2004). Research Supervision for Supervisors and their
Students. Reading, UK: Academic Conferences Limited.
Rolstadås, A. & Andersen, B. (2000). Enterprise modeling - Improving global
industrial competitiveness. Dordrecht, The Netherlands: Kluwer Academic publishers.
Rosemann, M. (1995). A Framework for Efficient Information Modeling - Guidelines
for Retail Enterprises. V. Jacob & R. Krishnan (Eds.), Proceedings of the 3rd
INFORMS Conference on Information Systems and Technology (pp. 442-448).
Rosemann, M. (1998). Managing the complexity of multiperspective information
models using the guidelines of modeling. In the 3rd Australian Conference on
Requirements Engineering (ACRE'98) (pp. 101-118).
Sallis, E. (2002). Total Quality Management in Education, third edition. London, UK:
Kogan Page Ltd.
Samuels, W., J. (2000). Signs, Pragmatism, and Abduction: The Tragedy, Irony, and
Promise of Charles Sanders Peirce, Journal of Economic Issues, 34 (1), pp.207-217.
Retrieved from http://www.jstor.org/stable/4227542
Sannicolo, F., Perini, A. & Giunchiglia, F. (2002). [In Italian] The TROPOS Modeling
Language. A User Guide (Technical Report # DIT-02-0061). University of Trento,
Department of Information and Communication Technology. Retrieved from
http://eprints.biblio.unitn.it/208/1/61.pdf
Schütte, R. & Rotthowe, T. (1998). The Guidelines of Modeling - An Approach to
Enhance Quality in Information Models. In the 17th International Conference on
Conceptual Modeling (pp. 240-254). LNCS 1507. doi: 10.1007/978-3-540-49524-6_20
Scientific Project Plan for Demand Patterns for Efficient Information Logistics
(infoFLOW 2) Version 1.0. (2009). Jönköping, Sweden.
Shanks, G. & Darke, P. (1997). Quality in Conceptual Modeling: Linking Theory and
Practice. Proceedings of the 3rd Pacific Asia Conference on Information Systems (pp.
805-814).
Seigerroth, U. (2011). Enterprise Modeling and Enterprise Architecture: The
Constituents of Transformation and Alignment of Business and IT. International
Journal of IT/Business Alignment and Governance, 2(1), pp. 16-34. doi:
10.4018/jitbag.2011010102
Seigerroth, U. (2012). Dressed for Success for Enterprise Modelling. Description of
the case and the assignment. Jönköping University.
Shah, U., Shaikh, M., U., Shamim, A. & Mehmood, Y. (2011). Proposes Quality
Framework to Incorporate Quality Aspects in Web Warehouse Creation. Journal of
Computing, 3(4), pp. 85-92.
Shen, H., Wall, B., Zaremba, M., Chen, Y. & Browne, J. (2004). Integration of
Enterprise Modeling methods for enterprise information system analysis and user
requirements gathering. Computers in Industry, 54, pp. 307-323.
Shewhart, Walter A. (1986). Statistical Method from the Viewpoint of Quality Control.
New York: Dover.
Siau, K.& Rossi, M. (2007). Evaluation techniques for systems analysis and design
modelling methods – a review and comparative analysis. Information Systems Journal,
21 (3), pp. 249-268. doi: 10.1111/j.1365-2575.2007.00255.x
Signore, O. (2005). Towards a Quality Model for Websites. Proceedings of ACM
International Conferences. Retrieved from
http://www.w3c.it/papers/cmg2005Poland-quality.pdf
Simon, H., A. (1964). On the concept of organizational goal. Administrative Science
Quarterly, 9(1), pp.1-22.
Simon, H., A. (1996). The Sciences of the Artificial (3rd ed.). Cambridge, MA: MIT
Press.
Sink, D. S. & Tuttle, T. C. (1989). Planning and Measurement in your Organization
of the Future. Norcross, GA: Industrial Engineering and Management Press.
Ssebuggwawo, D., Hoppenbrouwers, S. & Proper, E. (2009). Evaluating Modeling
Sessions Using the Analytic Hierarchy Process. A. Persson & J. Stirna (Eds.),
Proceedings of the 2nd IFIP WG 8.1 Working Conference on the Practice of
Enterprise Modeling (pp. 69-83). LNBIP 39. doi: 10.1007/978-3-642-05352-8_7
Stempfle, J. & Badke-Schaub, P. (2002). Thinking in design teams - an analysis of
team communication. Design Studies, 23(5), 473-496. doi:
10.1016/S0142-694X(02)00004-2
Sumanth, D. J. (1994). Productivity Engineering and Management. New York, NY:
McGraw-Hill.
Takeda, H., Veerkamp, P., Tomiyama, T. & Yoshikawa, H. (1990). Modeling Design
Processes. AI Magazine, 11(4), 37-48. Retrieved from
http://aaaipress.org/ojs/index.php/aimagazine/article/viewFile/855/773
Taveter, K. & Wagner, G. (2000). Combining AOR Diagrams and Ross Business
Rules' Diagrams for Enterprise Modeling. Proceedings of the 2nd International Bi-
Conference Workshop on Agent-Oriented Information Systems. Berlin, Germany:
iCue Publishing.
Tsichritzis, D. (1998). The Dynamics of Innovation. P. Denning & R. Metcalfe (Eds.),
Beyond Calculation: The Next Fifty Years of Computing (pp. 259-265). New York,
NY: Copernicus Books.
Taylor, F., W. (1957). [In French] La direction scientifique des entreprises. Paris,
France: Dunod.
Teeuw, W., B. & van den Berg, H. (1997). On the Quality of Conceptual Models. In
S.W. Liddle (Ed.), Proceedings of the ER'97 Workshop on Behavioral Models and
Design Transformations: Issues and Opportunities in Conceptual Modeling. Retrieved
from http://osm7.cs.byu.edu/ER97/workshop4/tvdb.html
Thörn, C. (2010). On the Quality of Feature Models (Doctoral Dissertation).
Department of Computer and Information Science, Linköping University. Linköping,
Sweden.
Tissot, F. & Crump, W. (2006). An Integrated Enterprise Modeling Environment. P.
Bernus, K. Mertins & G. Schmidt (Eds.), Handbook on Architectures of Information
Systems (pp. 539-567). International Handbooks on Information Systems. doi:
10.1007/3-540-26661-5_22
Vakkuri, J. (2005). Doing the Best One Can - Bounded Rationality and the
Measurement of Organizational ‘Best’ Practices. Paper prepared for EGPA
Conference, Bern, Switzerland.
Vernadat, F. B. (1996). Enterprise Modeling and Integration. London, UK: Chapman
& Hall.
Vernadat, F., B. (2001). Enterprise Modeling. Production Planning & Control, 12(2),
107-109.
Vernadat, F.B. (2002). Enterprise Modeling and Integration (EMI): Current Status and
Research Perspectives. Annual Reviews in Control, 26(1), pp. 15-25. doi:
10.1016/S1367-5788(02)80006-2
Wand, Y. & Weber, R. (1989). A Model of Control and Audit Procedure Change in
Evolving Data Processing Systems. The Accounting Review, 64(1), 87-107. Retrieved
from http://www.jstor.org/stable/248130
Wand, Y. & Weber, R. (1995). On the deep structure of information systems.
Information Systems Journal, 5(3), 203-223. doi: 10.1111/j.1365-2575.1995.tb00108.x
Weber, R. (1987). Toward a Theory of Artifacts: A Paradigmatic Base for Information
Systems Research. Journal of Information Systems, 1(2), pp. 3-19.
Weber, R. (1997). Ontological Foundations of Information Systems, Brisbane,
Australia: Coopers & Lybrand.
Whitman, L., E. & Huff, B., L. (2001). A Living Enterprise Model. Proceedings of the
6th Industrial Engineering Research Conference, Miami Beach, FL, USA.
Wolfinbarger, M., F. & Gilly, M., C. (2003). eTailQ: Dimensionalizing, measuring
and predicting etailing quality. Journal of Retailing, 79(3), 183-198. doi:
10.1016/S0022-4359(03)00034-4
Wickramasinghe, N. & Mills, G. (2001). Knowledge management systems: A Health
care initiative with lessons for us all. Proceedings of the 9th European Conference on
Information Systems (pp. 763-774). Retrieved from
http://is2.lse.ac.uk/asp/aspecis/20010122.pdf
Yang, Z., Peterson, R.T. & Cai, S. (2003). Services quality dimensions of Internet
retailing: An exploratory analysis. Journal of Services Marketing, 17(7), pp. 685-701.
Yin, R., K. (1994). Case Study Research: Design and Methods (2nd ed.) (Applied
Social Research Methods, Vol. 5). Thousand Oaks, CA: Sage Publications.
Zhao, Y. & Fan, Y. (2003). Integrated Enterprise Modeling Framework for Developing
Consistent Distributed Systems. Journal of Tsinghua University, pp. 63-69. Retrieved
from http://simflow.net/Publications/Papers/Year2003/zy-SDM-DS-0312.pdf
Zokaei, K. & Hines, P. (2007). Achieving consumer focus in supply chains.
International Journal of Physical Distribution and Logistics Management, 37(3), pp.
223-247. doi: 10.1108/09600030710742434
Department of Computer and Information Science Linköpings universitet
Licentiate Theses
Linköpings Studies in Science and Technology Faculty of Arts and Sciences
No 17 Vojin Plavsic: Interleaved Processing of Non-Numerical Data Stored on a Cyclic Memory. (Available at: FOA,
Box 1165, S-581 11 Linköping, Sweden. FOA Report B30062E) No 28 Arne Jönsson, Mikael Patel: An Interactive Flowcharting Technique for Communicating and Realizing Al-
gorithms, 1984. No 29 Johnny Eckerland: Retargeting of an Incremental Code Generator, 1984. No 48 Henrik Nordin: On the Use of Typical Cases for Knowledge-Based Consultation and Teaching, 1985. No 52 Zebo Peng: Steps Towards the Formalization of Designing VLSI Systems, 1985. No 60 Johan Fagerström: Simulation and Evaluation of Architecture based on Asynchronous Processes, 1985. No 71 Jalal Maleki: ICONStraint, A Dependency Directed Constraint Maintenance System, 1987. No 72 Tony Larsson: On the Specification and Verification of VLSI Systems, 1986. No 73 Ola Strömfors: A Structure Editor for Documents and Programs, 1986. No 74 Christos Levcopoulos: New Results about the Approximation Behavior of the Greedy Triangulation, 1986. No 104 Shamsul I. Chowdhury: Statistical Expert Systems - a Special Application Area for Knowledge-Based Computer
Methodology, 1987. No 108 Rober Bilos: Incremental Scanning and Token-Based Editing, 1987. No 111 Hans Block: SPORT-SORT Sorting Algorithms and Sport Tournaments, 1987. No 113 Ralph Rönnquist: Network and Lattice Based Approaches to the Representation of Knowledge, 1987. No 118 Mariam Kamkar, Nahid Shahmehri: Affect-Chaining in Program Flow Analysis Applied to Queries of Pro-
grams, 1987. No 126 Dan Strömberg: Transfer and Distribution of Application Programs, 1987. No 127 Kristian Sandahl: Case Studies in Knowledge Acquisition, Migration and User Acceptance of Expert Systems,
1987. No 139 Christer Bäckström: Reasoning about Interdependent Actions, 1988. No 140 Mats Wirén: On Control Strategies and Incrementality in Unification-Based Chart Parsing, 1988. No 146 Johan Hultman: A Software System for Defining and Controlling Actions in a Mechanical System, 1988. No 150 Tim Hansen: Diagnosing Faults using Knowledge about Malfunctioning Behavior, 1988. No 165 Jonas Löwgren: Supporting Design and Management of Expert System User Interfaces, 1989. No 166 Ola Petersson: On Adaptive Sorting in Sequential and Parallel Models, 1989. No 174 Yngve Larsson: Dynamic Configuration in a Distributed Environment, 1989. No 177 Peter Åberg: Design of a Multiple View Presentation and Interaction Manager, 1989. No 181 Henrik Eriksson: A Study in Domain-Oriented Tool Support for Knowledge Acquisition, 1989. No 184 Ivan Rankin: The Deep Generation of Text in Expert Critiquing Systems, 1989. No 187 Simin Nadjm-Tehrani: Contributions to the Declarative Approach to Debugging Prolog Programs, 1989. No 189 Magnus Merkel: Temporal Information in Natural Language, 1989. No 196 Ulf Nilsson: A Systematic Approach to Abstract Interpretation of Logic Programs, 1989. No 197 Staffan Bonnier: Horn Clause Logic with External Procedures: Towards a Theoretical Framework, 1989. No 203 Christer Hansson: A Prototype System for Logical Reasoning about Time and Action, 1990. No 212 Björn Fjellborg: An Approach to Extraction of Pipeline Structures for VLSI High-Level Synthesis, 1990. No 230 Patrick Doherty: A Three-Valued Approach to Non-Monotonic Reasoning, 1990. No 237 Tomas Sokolnicki: Coaching Partial Plans: An Approach to Knowledge-Based Tutoring, 1990. No 250 Lars Strömberg: Postmortem Debugging of Distributed Systems, 1990. No 253 Torbjörn Näslund: SLDFA-Resolution - Computing Answers for Negative Queries, 1990. No 260 Peter D. Holmes: Using Connectivity Graphs to Support Map-Related Reasoning, 1991. 
No 283 Olof Johansson: Improving Implementation of Graphical User Interfaces for Object-Oriented Knowledge-Bases, 1991.
No 298 Rolf G Larsson: Aktivitetsbaserad kalkylering i ett nytt ekonomisystem, 1991.
No 318 Lena Strömbäck: Studies in Extended Unification-Based Formalism for Linguistic Description: An Algorithm for Feature Structures with Disjunction and a Proposal for Flexible Systems, 1992.
No 319 Mikael Pettersson: DML-A Language and System for the Generation of Efficient Compilers from Denotational Specification, 1992.
No 326 Andreas Kågedal: Logic Programming with External Procedures: an Implementation, 1992.
No 328 Patrick Lambrix: Aspects of Version Management of Composite Objects, 1992.
No 333 Xinli Gu: Testability Analysis and Improvement in High-Level Synthesis Systems, 1992.
No 335 Torbjörn Näslund: On the Role of Evaluations in Iterative Development of Managerial Support Systems, 1992.
No 348 Ulf Cederling: Industrial Software Development - a Case Study, 1992.
No 352 Magnus Morin: Predictable Cyclic Computations in Autonomous Systems: A Computational Model and Implementation, 1992.
No 371 Mehran Noghabai: Evaluation of Strategic Investments in Information Technology, 1993.
No 378 Mats Larsson: A Transformational Approach to Formal Digital System Design, 1993.
No 380 Johan Ringström: Compiler Generation for Parallel Languages from Denotational Specifications, 1993.
No 381 Michael Jansson: Propagation of Change in an Intelligent Information System, 1993.
No 383 Jonni Harrius: An Architecture and a Knowledge Representation Model for Expert Critiquing Systems, 1993.
No 386 Per Österling: Symbolic Modelling of the Dynamic Environments of Autonomous Agents, 1993.
No 398 Johan Boye: Dependency-based Groundness Analysis of Functional Logic Programs, 1993.
No 402 Lars Degerstedt: Tabulated Resolution for Well Founded Semantics, 1993.
No 406 Anna Moberg: Satellitkontor - en studie av kommunikationsmönster vid arbete på distans, 1993.
No 414 Peter Carlsson: Separation av företagsledning och finansiering - fallstudier av företagsledarutköp ur ett agentteoretiskt perspektiv, 1994.
No 417 Camilla Sjöström: Revision och lagreglering - ett historiskt perspektiv, 1994.
No 436 Cecilia Sjöberg: Voices in Design: Argumentation in Participatory Development, 1994.
No 437 Lars Viklund: Contributions to a High-level Programming Environment for a Scientific Computing, 1994.
No 440 Peter Loborg: Error Recovery Support in Manufacturing Control Systems, 1994.
FHS 3/94 Owen Eriksson: Informationssystem med verksamhetskvalitet - utvärdering baserat på ett verksamhetsinriktat och samskapande perspektiv, 1994.
FHS 4/94 Karin Pettersson: Informationssystemstrukturering, ansvarsfördelning och användarinflytande - En komparativ studie med utgångspunkt i två informationssystemstrategier, 1994.
No 441 Lars Poignant: Informationsteknologi och företagsetablering - Effekter på produktivitet och region, 1994.
No 446 Gustav Fahl: Object Views of Relational Data in Multidatabase Systems, 1994.
No 450 Henrik Nilsson: A Declarative Approach to Debugging for Lazy Functional Languages, 1994.
No 451 Jonas Lind: Creditor - Firm Relations: an Interdisciplinary Analysis, 1994.
No 452 Martin Sköld: Active Rules based on Object Relational Queries - Efficient Change Monitoring Techniques, 1994.
No 455 Pär Carlshamre: A Collaborative Approach to Usability Engineering: Technical Communicators and System Developers in Usability-Oriented Systems Development, 1994.
FHS 5/94 Stefan Cronholm: Varför CASE-verktyg i systemutveckling? - En motiv- och konsekvensstudie avseende arbetssätt och arbetsformer, 1994.
No 462 Mikael Lindvall: A Study of Traceability in Object-Oriented Systems Development, 1994.
No 463 Fredrik Nilsson: Strategi och ekonomisk styrning - En studie av Sandviks förvärv av Bahco Verktyg, 1994.
No 464 Hans Olsén: Collage Induction: Proving Properties of Logic Programs by Program Synthesis, 1994.
No 469 Lars Karlsson: Specification and Synthesis of Plans Using the Features and Fluents Framework, 1995.
No 473 Ulf Söderman: On Conceptual Modelling of Mode Switching Systems, 1995.
No 475 Choong-ho Yi: Reasoning about Concurrent Actions in the Trajectory Semantics, 1995.
No 476 Bo Lagerström: Successiv resultatavräkning av pågående arbeten - Fallstudier i tre byggföretag, 1995.
No 478 Peter Jonsson: Complexity of State-Variable Planning under Structural Restrictions, 1995.
FHS 7/95 Anders Avdic: Arbetsintegrerad systemutveckling med kalkylprogram, 1995.
No 482 Eva L Ragnemalm: Towards Student Modelling through Collaborative Dialogue with a Learning Companion, 1995.
No 488 Eva Toller: Contributions to Parallel Multiparadigm Languages: Combining Object-Oriented and Rule-Based Programming, 1995.
No 489 Erik Stoy: A Petri Net Based Unified Representation for Hardware/Software Co-Design, 1995.
No 497 Johan Herber: Environment Support for Building Structured Mathematical Models, 1995.
No 498 Stefan Svenberg: Structure-Driven Derivation of Inter-Lingual Functor-Argument Trees for Multi-Lingual Generation, 1995.
No 503 Hee-Cheol Kim: Prediction and Postdiction under Uncertainty, 1995.
FHS 8/95 Dan Fristedt: Metoder i användning - mot förbättring av systemutveckling genom situationell metodkunskap och metodanalys, 1995.
FHS 9/95 Malin Bergvall: Systemförvaltning i praktiken - en kvalitativ studie avseende centrala begrepp, aktiviteter och ansvarsroller, 1995.
No 513 Joachim Karlsson: Towards a Strategy for Software Requirements Selection, 1995.
No 517 Jakob Axelsson: Schedulability-Driven Partitioning of Heterogeneous Real-Time Systems, 1995.
No 518 Göran Forslund: Toward Cooperative Advice-Giving Systems: The Expert Systems Experience, 1995.
No 522 Jörgen Andersson: Bilder av småföretagares ekonomistyrning, 1995.
No 538 Staffan Flodin: Efficient Management of Object-Oriented Queries with Late Binding, 1996.
No 545 Vadim Engelson: An Approach to Automatic Construction of Graphical User Interfaces for Applications in Scientific Computing, 1996.
No 546 Magnus Werner: Multidatabase Integration using Polymorphic Queries and Views, 1996.
FiF-a 1/96 Mikael Lind: Affärsprocessinriktad förändringsanalys - utveckling och tillämpning av synsätt och metod, 1996.
No 549 Jonas Hallberg: High-Level Synthesis under Local Timing Constraints, 1996.
No 550 Kristina Larsen: Förutsättningar och begränsningar för arbete på distans - erfarenheter från fyra svenska företag, 1996.
No 557 Mikael Johansson: Quality Functions for Requirements Engineering Methods, 1996.
No 558 Patrik Nordling: The Simulation of Rolling Bearing Dynamics on Parallel Computers, 1996.
No 561 Anders Ekman: Exploration of Polygonal Environments, 1996.
No 563 Niclas Andersson: Compilation of Mathematical Models to Parallel Code, 1996.
No 567 Johan Jenvald: Simulation and Data Collection in Battle Training, 1996.
No 575 Niclas Ohlsson: Software Quality Engineering by Early Identification of Fault-Prone Modules, 1996.
No 576 Mikael Ericsson: Commenting Systems as Design Support - A Wizard-of-Oz Study, 1996.
No 587 Jörgen Lindström: Chefers användning av kommunikationsteknik, 1996.
No 589 Esa Falkenroth: Data Management in Control Applications - A Proposal Based on Active Database Systems, 1996.
No 591 Niclas Wahllöf: A Default Extension to Description Logics and its Applications, 1996.
No 595 Annika Larsson: Ekonomisk Styrning och Organisatorisk Passion - ett interaktivt perspektiv, 1997.
No 597 Ling Lin: A Value-based Indexing Technique for Time Sequences, 1997.
No 598 Rego Granlund: C3Fire - A Microworld Supporting Emergency Management Training, 1997.
No 599 Peter Ingels: A Robust Text Processing Technique Applied to Lexical Error Recovery, 1997.
No 607 Per-Arne Persson: Toward a Grounded Theory for Support of Command and Control in Military Coalitions, 1997.
No 609 Jonas S Karlsson: A Scalable Data Structure for a Parallel Data Server, 1997.
FiF-a 4 Carita Åbom: Videomötesteknik i olika affärssituationer - möjligheter och hinder, 1997.
FiF-a 6 Tommy Wedlund: Att skapa en företagsanpassad systemutvecklingsmodell - genom rekonstruktion, värdering och vidareutveckling i T50-bolag inom ABB, 1997.
No 615 Silvia Coradeschi: A Decision-Mechanism for Reactive and Coordinated Agents, 1997.
No 623 Jan Ollinen: Det flexibla kontorets utveckling på Digital - Ett stöd för multiflex? 1997.
No 626 David Byers: Towards Estimating Software Testability Using Static Analysis, 1997.
No 627 Fredrik Eklund: Declarative Error Diagnosis of GAPLog Programs, 1997.
No 629 Gunilla Ivefors: Krigsspel och Informationsteknik inför en oförutsägbar framtid, 1997.
No 631 Jens-Olof Lindh: Analysing Traffic Safety from a Case-Based Reasoning Perspective, 1997.
No 639 Jukka Mäki-Turja: Smalltalk - a suitable Real-Time Language, 1997.
No 640 Juha Takkinen: CAFE: Towards a Conceptual Model for Information Management in Electronic Mail, 1997.
No 643 Man Lin: Formal Analysis of Reactive Rule-based Programs, 1997.
No 653 Mats Gustafsson: Bringing Role-Based Access Control to Distributed Systems, 1997.
FiF-a 13 Boris Karlsson: Metodanalys för förståelse och utveckling av systemutvecklingsverksamhet. Analys och värdering av systemutvecklingsmodeller och dess användning, 1997.
No 674 Marcus Bjäreland: Two Aspects of Automating Logics of Action and Change - Regression and Tractability, 1998.
No 676 Jan Håkegård: Hierarchical Test Architecture and Board-Level Test Controller Synthesis, 1998.
No 668 Per-Ove Zetterlund: Normering av svensk redovisning - En studie av tillkomsten av Redovisningsrådets rekommendation om koncernredovisning (RR01:91), 1998.
No 675 Jimmy Tjäder: Projektledaren & planen - en studie av projektledning i tre installations- och systemutvecklingsprojekt, 1998.
FiF-a 14 Ulf Melin: Informationssystem vid ökad affärs- och processorientering - egenskaper, strategier och utveckling, 1998.
No 695 Tim Heyer: COMPASS: Introduction of Formal Methods in Code Development and Inspection, 1998.
No 700 Patrik Hägglund: Programming Languages for Computer Algebra, 1998.
FiF-a 16 Marie-Therese Christiansson: Inter-organisatorisk verksamhetsutveckling - metoder som stöd vid utveckling av partnerskap och informationssystem, 1998.
No 712 Christina Wennestam: Information om immateriella resurser. Investeringar i forskning och utveckling samt i personal inom skogsindustrin, 1998.
No 719 Joakim Gustafsson: Extending Temporal Action Logic for Ramification and Concurrency, 1998.
No 723 Henrik André-Jönsson: Indexing time-series data using text indexing methods, 1999.
No 725 Erik Larsson: High-Level Testability Analysis and Enhancement Techniques, 1998.
No 730 Carl-Johan Westin: Informationsförsörjning: en fråga om ansvar - aktiviteter och uppdrag i fem stora svenska organisationers operativa informationsförsörjning, 1998.
No 731 Åse Jansson: Miljöhänsyn - en del i företags styrning, 1998.
No 733 Thomas Padron-McCarthy: Performance-Polymorphic Declarative Queries, 1998.
No 734 Anders Bäckström: Värdeskapande kreditgivning - Kreditriskhantering ur ett agentteoretiskt perspektiv, 1998.
FiF-a 21 Ulf Seigerroth: Integration av förändringsmetoder - en modell för välgrundad metodintegration, 1999.
FiF-a 22 Fredrik Öberg: Object-Oriented Frameworks - A New Strategy for Case Tool Development, 1998.
No 737 Jonas Mellin: Predictable Event Monitoring, 1998.
No 738 Joakim Eriksson: Specifying and Managing Rules in an Active Real-Time Database System, 1998.
FiF-a 25 Bengt E W Andersson: Samverkande informationssystem mellan aktörer i offentliga åtaganden - En teori om aktörsarenor i samverkan om utbyte av information, 1998.
No 742 Pawel Pietrzak: Static Incorrectness Diagnosis of CLP (FD), 1999.
No 748 Tobias Ritzau: Real-Time Reference Counting in RT-Java, 1999.
No 751 Anders Ferntoft: Elektronisk affärskommunikation - kontaktkostnader och kontaktprocesser mellan kunder och leverantörer på producentmarknader, 1999.
No 752 Jo Skåmedal: Arbete på distans och arbetsformens påverkan på resor och resmönster, 1999.
No 753 Johan Alvehus: Mötets metaforer. En studie av berättelser om möten, 1999.
No 754 Magnus Lindahl: Bankens villkor i låneavtal vid kreditgivning till högt belånade företagsförvärv: En studie ur ett agentteoretiskt perspektiv, 2000.
No 766 Martin V. Howard: Designing dynamic visualizations of temporal data, 1999.
No 769 Jesper Andersson: Towards Reactive Software Architectures, 1999.
No 775 Anders Henriksson: Unique kernel diagnosis, 1999.
FiF-a 30 Pär J. Ågerfalk: Pragmatization of Information Systems - A Theoretical and Methodological Outline, 1999.
No 787 Charlotte Björkegren: Learning for the next project - Bearers and barriers in knowledge transfer within an organisation, 1999.
No 788 Håkan Nilsson: Informationsteknik som drivkraft i granskningsprocessen - En studie av fyra revisionsbyråer, 2000.
No 790 Erik Berglund: Use-Oriented Documentation in Software Development, 1999.
No 791 Klas Gäre: Verksamhetsförändringar i samband med IS-införande, 1999.
No 800 Anders Subotic: Software Quality Inspection, 1999.
No 807 Svein Bergum: Managerial communication in telework, 2000.
No 809 Flavius Gruian: Energy-Aware Design of Digital Systems, 2000.
FiF-a 32 Karin Hedström: Kunskapsanvändning och kunskapsutveckling hos verksamhetskonsulter - Erfarenheter från ett FOU-samarbete, 2000.
No 808 Linda Askenäs: Affärssystemet - En studie om teknikens aktiva och passiva roll i en organisation, 2000.
No 820 Jean Paul Meynard: Control of industrial robots through high-level task programming, 2000.
No 823 Lars Hult: Publika Gränsytor - ett designexempel, 2000.
No 832 Paul Pop: Scheduling and Communication Synthesis for Distributed Real-Time Systems, 2000.
FiF-a 34 Göran Hultgren: Nätverksinriktad Förändringsanalys - perspektiv och metoder som stöd för förståelse och utveckling av affärsrelationer och informationssystem, 2000.
No 842 Magnus Kald: The role of management control systems in strategic business units, 2000.
No 844 Mikael Cäker: Vad kostar kunden? Modeller för intern redovisning, 2000.
FiF-a 37 Ewa Braf: Organisationers kunskapsverksamheter - en kritisk studie av ”knowledge management”, 2000.
FiF-a 40 Henrik Lindberg: Webbaserade affärsprocesser - Möjligheter och begränsningar, 2000.
FiF-a 41 Benneth Christiansson: Att komponentbasera informationssystem - Vad säger teori och praktik?, 2000.
No 854 Ola Pettersson: Deliberation in a Mobile Robot, 2000.
No 863 Dan Lawesson: Towards Behavioral Model Fault Isolation for Object Oriented Control Systems, 2000.
No 881 Johan Moe: Execution Tracing of Large Distributed Systems, 2001.
No 882 Yuxiao Zhao: XML-based Frameworks for Internet Commerce and an Implementation of B2B e-procurement, 2001.
No 890 Annika Flycht-Eriksson: Domain Knowledge Management in Information-providing Dialogue systems, 2001.
FiF-a 47 Per-Arne Segerkvist: Webbaserade imaginära organisationers samverkansformer: Informationssystemarkitektur och aktörssamverkan som förutsättningar för affärsprocesser, 2001.
No 894 Stefan Svarén: Styrning av investeringar i divisionaliserade företag - Ett koncernperspektiv, 2001.
No 906 Lin Han: Secure and Scalable E-Service Software Delivery, 2001.
No 917 Emma Hansson: Optionsprogram för anställda - en studie av svenska börsföretag, 2001.
No 916 Susanne Odar: IT som stöd för strategiska beslut, en studie av datorimplementerade modeller av verksamhet som stöd för beslut om anskaffning av JAS 1982, 2002.
FiF-a 49 Stefan Holgersson: IT-system och filtrering av verksamhetskunskap - kvalitetsproblem vid analyser och beslutsfattande som bygger på uppgifter hämtade från polisens IT-system, 2001.
FiF-a 51 Per Oscarsson: Informationssäkerhet i verksamheter - begrepp och modeller som stöd för förståelse av informationssäkerhet och dess hantering, 2001.
No 919 Luis Alejandro Cortes: A Petri Net Based Modeling and Verification Technique for Real-Time Embedded Systems, 2001.
No 915 Niklas Sandell: Redovisning i skuggan av en bankkris - Värdering av fastigheter, 2001.
No 931 Fredrik Elg: Ett dynamiskt perspektiv på individuella skillnader av heuristisk kompetens, intelligens, mentala modeller, mål och konfidens i kontroll av mikrovärlden Moro, 2002.
No 933 Peter Aronsson: Automatic Parallelization of Simulation Code from Equation Based Simulation Languages, 2002.
No 938 Bourhane Kadmiry: Fuzzy Control of Unmanned Helicopter, 2002.
No 942 Patrik Haslum: Prediction as a Knowledge Representation Problem: A Case Study in Model Design, 2002.
No 956 Robert Sevenius: On the instruments of governance - A law & economics study of capital instruments in limited liability companies, 2002.
FiF-a 58 Johan Petersson: Lokala elektroniska marknadsplatser - informationssystem för platsbundna affärer, 2002.
No 964 Peter Bunus: Debugging and Structural Analysis of Declarative Equation-Based Languages, 2002.
No 973 Gert Jervan: High-Level Test Generation and Built-In Self-Test Techniques for Digital Systems, 2002.
No 958 Fredrika Berglund: Management Control and Strategy - a Case Study of Pharmaceutical Drug Development, 2002.
FiF-a 61 Fredrik Karlsson: Meta-Method for Method Configuration - A Rational Unified Process Case, 2002.
No 985 Sorin Manolache: Schedulability Analysis of Real-Time Systems with Stochastic Task Execution Times, 2002.
No 982 Diana Szentiványi: Performance and Availability Trade-offs in Fault-Tolerant Middleware, 2002.
No 989 Iakov Nakhimovski: Modeling and Simulation of Contacting Flexible Bodies in Multibody Systems, 2002.
No 990 Levon Saldamli: PDEModelica - Towards a High-Level Language for Modeling with Partial Differential Equations, 2002.
No 991 Almut Herzog: Secure Execution Environment for Java Electronic Services, 2002.
No 999 Jon Edvardsson: Contributions to Program- and Specification-based Test Data Generation, 2002.
No 1000 Anders Arpteg: Adaptive Semi-structured Information Extraction, 2002.
No 1001 Andrzej Bednarski: A Dynamic Programming Approach to Optimal Retargetable Code Generation for Irregular Architectures, 2002.
No 988 Mattias Arvola: Good to use!: Use quality of multi-user applications in the home, 2003.
FiF-a 62 Lennart Ljung: Utveckling av en projektivitetsmodell - om organisationers förmåga att tillämpa projektarbetsformen, 2003.
No 1003 Pernilla Qvarfordt: User experience of spoken feedback in multimodal interaction, 2003.
No 1005 Alexander Siemers: Visualization of Dynamic Multibody Simulation With Special Reference to Contacts, 2003.
No 1008 Jens Gustavsson: Towards Unanticipated Runtime Software Evolution, 2003.
No 1010 Calin Curescu: Adaptive QoS-aware Resource Allocation for Wireless Networks, 2003.
No 1015 Anna Andersson: Management Information Systems in Process-oriented Healthcare Organisations, 2003.
No 1018 Björn Johansson: Feedforward Control in Dynamic Situations, 2003.
No 1022 Traian Pop: Scheduling and Optimisation of Heterogeneous Time/Event-Triggered Distributed Embedded Systems, 2003.
FiF-a 65 Britt-Marie Johansson: Kundkommunikation på distans - en studie om kommunikationsmediets betydelse i affärstransaktioner, 2003.
No 1024 Aleksandra Tešanovic: Towards Aspectual Component-Based Real-Time System Development, 2003.
No 1034 Arja Vainio-Larsson: Designing for Use in a Future Context - Five Case Studies in Retrospect, 2003.
No 1033 Peter Nilsson: Svenska bankers redovisningsval vid reservering för befarade kreditförluster - En studie vid införandet av nya redovisningsregler, 2003.
FiF-a 69 Fredrik Ericsson: Information Technology for Learning and Acquiring of Work Knowledge, 2003.
No 1049 Marcus Comstedt: Towards Fine-Grained Binary Composition through Link Time Weaving, 2003.
No 1052 Åsa Hedenskog: Increasing the Automation of Radio Network Control, 2003.
No 1054 Claudiu Duma: Security and Efficiency Tradeoffs in Multicast Group Key Management, 2003.
FiF-a 71 Emma Eliason: Effektanalys av IT-systems handlingsutrymme, 2003.
No 1055 Carl Cederberg: Experiments in Indirect Fault Injection with Open Source and Industrial Software, 2003.
No 1058 Daniel Karlsson: Towards Formal Verification in a Component-based Reuse Methodology, 2003.
FiF-a 73 Anders Hjalmarsson: Att etablera och vidmakthålla förbättringsverksamhet - behovet av koordination och interaktion vid förändring av systemutvecklingsverksamheter, 2004.
No 1079 Pontus Johansson: Design and Development of Recommender Dialogue Systems, 2004.
No 1084 Charlotte Stoltz: Calling for Call Centres - A Study of Call Centre Locations in a Swedish Rural Region, 2004.
FiF-a 74 Björn Johansson: Deciding on Using Application Service Provision in SMEs, 2004.
No 1094 Genevieve Gorrell: Language Modelling and Error Handling in Spoken Dialogue Systems, 2004.
No 1095 Ulf Johansson: Rule Extraction - the Key to Accurate and Comprehensible Data Mining Models, 2004.
No 1099 Sonia Sangari: Computational Models of Some Communicative Head Movements, 2004.
No 1110 Hans Nässla: Intra-Family Information Flow and Prospects for Communication Systems, 2004.
No 1116 Henrik Sällberg: On the value of customer loyalty programs - A study of point programs and switching costs, 2004.
FiF-a 77 Ulf Larsson: Designarbete i dialog - karaktärisering av interaktionen mellan användare och utvecklare i en systemutvecklingsprocess, 2004.
No 1126 Andreas Borg: Contribution to Management and Validation of Non-Functional Requirements, 2004.
No 1127 Per-Ola Kristensson: Large Vocabulary Shorthand Writing on Stylus Keyboard, 2004.
No 1132 Pär-Anders Albinsson: Interacting with Command and Control Systems: Tools for Operators and Designers, 2004.
No 1130 Ioan Chisalita: Safety-Oriented Communication in Mobile Networks for Vehicles, 2004.
No 1138 Thomas Gustafsson: Maintaining Data Consistency in Embedded Databases for Vehicular Systems, 2004.
No 1149 Vaida Jakoniené: A Study in Integrating Multiple Biological Data Sources, 2005.
No 1156 Abdil Rashid Mohamed: High-Level Techniques for Built-In Self-Test Resources Optimization, 2005.
No 1162 Adrian Pop: Contributions to Meta-Modeling Tools and Methods, 2005.
No 1165 Fidel Vascós Palacios: On the information exchange between physicians and social insurance officers in the sick leave process: an Activity Theoretical perspective, 2005.
FiF-a 84 Jenny Lagsten: Verksamhetsutvecklande utvärdering i informationssystemprojekt, 2005.
No 1166 Emma Larsdotter Nilsson: Modeling, Simulation, and Visualization of Metabolic Pathways Using Modelica, 2005.
No 1167 Christina Keller: Virtual Learning Environments in higher education. A study of students’ acceptance of educational technology, 2005.
No 1168 Cécile Åberg: Integration of organizational workflows and the Semantic Web, 2005.
FiF-a 85 Anders Forsman: Standardisering som grund för informationssamverkan och IT-tjänster - En fallstudie baserad på trafikinformationstjänsten RDS-TMC, 2005.
No 1171 Yu-Hsing Huang: A systemic traffic accident model, 2005.
FiF-a 86 Jan Olausson: Att modellera uppdrag - grunder för förståelse av processinriktade informationssystem i transaktionsintensiva verksamheter, 2005.
No 1172 Petter Ahlström: Affärsstrategier för seniorbostadsmarknaden, 2005.
No 1183 Mathias Cöster: Beyond IT and Productivity - How Digitization Transformed the Graphic Industry, 2005.
No 1184 Åsa Horzella: Beyond IT and Productivity - Effects of Digitized Information Flows in Grocery Distribution, 2005.
No 1185 Maria Kollberg: Beyond IT and Productivity - Effects of Digitized Information Flows in the Logging Industry, 2005.
No 1190 David Dinka: Role and Identity - Experience of technology in professional settings, 2005.
No 1191 Andreas Hansson: Increasing the Storage Capacity of Recursive Auto-associative Memory by Segmenting Data, 2005.
No 1192 Nicklas Bergfeldt: Towards Detached Communication for Robot Cooperation, 2005.
No 1194 Dennis Maciuszek: Towards Dependable Virtual Companions for Later Life, 2005.
No 1204 Beatrice Alenljung: Decision-making in the Requirements Engineering Process: A Human-centered Approach, 2005.
No 1206 Anders Larsson: System-on-Chip Test Scheduling and Test Infrastructure Design, 2005.
No 1207 John Wilander: Policy and Implementation Assurance for Software Security, 2005.
No 1209 Andreas Käll: Översättningar av en managementmodell - En studie av införandet av Balanced Scorecard i ett landsting, 2005.
No 1225 He Tan: Aligning and Merging Biomedical Ontologies, 2006.
No 1228 Artur Wilk: Descriptive Types for XML Query Language Xcerpt, 2006.
No 1229 Per Olof Pettersson: Sampling-based Path Planning for an Autonomous Helicopter, 2006.
No 1231 Kalle Burbeck: Adaptive Real-time Anomaly Detection for Safeguarding Critical Networks, 2006.
No 1233 Daniela Mihailescu: Implementation Methodology in Action: A Study of an Enterprise Systems Implementation Methodology, 2006.
No 1244 Jörgen Skågeby: Public and Non-public gifting on the Internet, 2006.
No 1248 Karolina Eliasson: The Use of Case-Based Reasoning in a Human-Robot Dialog System, 2006.
No 1263 Misook Park-Westman: Managing Competence Development Programs in a Cross-Cultural Organisation - What are the Barriers and Enablers, 2006.
FiF-a 90 Amra Halilovic: Ett praktikperspektiv på hantering av mjukvarukomponenter, 2006.
No 1272 Raquel Flodström: A Framework for the Strategic Management of Information Technology, 2006.
No 1277 Viacheslav Izosimov: Scheduling and Optimization of Fault-Tolerant Embedded Systems, 2006.
No 1283 Håkan Hasewinkel: A Blueprint for Using Commercial Games off the Shelf in Defence Training, Education and Research Simulations, 2006.
FiF-a 91 Hanna Broberg: Verksamhetsanpassade IT-stöd - Designteori och metod, 2006.
No 1286 Robert Kaminski: Towards an XML Document Restructuring Framework, 2006.
No 1293 Jiri Trnka: Prerequisites for data sharing in emergency management, 2007.
No 1302 Björn Hägglund: A Framework for Designing Constraint Stores, 2007.
No 1303 Daniel Andreasson: Slack-Time Aware Dynamic Routing Schemes for On-Chip Networks, 2007.
No 1305 Magnus Ingmarsson: Modelling User Tasks and Intentions for Service Discovery in Ubiquitous Computing, 2007.
No 1306 Gustaf Svedjemo: Ontology as Conceptual Schema when Modelling Historical Maps for Database Storage, 2007.
No 1307 Gianpaolo Conte: Navigation Functionalities for an Autonomous UAV Helicopter, 2007.
No 1309 Ola Leifler: User-Centric Critiquing in Command and Control: The DKExpert and ComPlan Approaches, 2007.
No 1312 Henrik Svensson: Embodied simulation as off-line representation, 2007.
No 1313 Zhiyuan He: System-on-Chip Test Scheduling with Defect-Probability and Temperature Considerations, 2007.
No 1317 Jonas Elmqvist: Components, Safety Interfaces and Compositional Analysis, 2007.
No 1320 Håkan Sundblad: Question Classification in Question Answering Systems, 2007.
No 1323 Magnus Lundqvist: Information Demand and Use: Improving Information Flow within Small-scale Business Contexts, 2007.
No 1329 Martin Magnusson: Deductive Planning and Composite Actions in Temporal Action Logic, 2007.
No 1331 Mikael Asplund: Restoring Consistency after Network Partitions, 2007.
No 1332 Martin Fransson: Towards Individualized Drug Dosage - General Methods and Case Studies, 2007.
No 1333 Karin Camara: A Visual Query Language Served by a Multi-sensor Environment, 2007.
No 1337 David Broman: Safety, Security, and Semantic Aspects of Equation-Based Object-Oriented Languages and Environments, 2007.
No 1339 Mikhail Chalabine: Invasive Interactive Parallelization, 2007.
No 1351 Susanna Nilsson: A Holistic Approach to Usability Evaluations of Mixed Reality Systems, 2008.
No 1353 Shanai Ardi: A Model and Implementation of a Security Plug-in for the Software Life Cycle, 2008.
No 1356 Erik Kuiper: Mobility and Routing in a Delay-tolerant Network of Unmanned Aerial Vehicles, 2008.
No 1359 Jana Rambusch: Situated Play, 2008.
No 1361 Martin Karresand: Completing the Picture - Fragments and Back Again, 2008.
No 1363 Per Nyblom: Dynamic Abstraction for Interleaved Task Planning and Execution, 2008.
No 1371 Fredrik Lantz: Terrain Object Recognition and Context Fusion for Decision Support, 2008.
No 1373 Martin Östlund: Assistance Plus: 3D-mediated Advice-giving on Pharmaceutical Products, 2008.
No 1381 Håkan Lundvall: Automatic Parallelization using Pipelining for Equation-Based Simulation Languages, 2008.
No 1386 Mirko Thorstensson: Using Observers for Model Based Data Collection in Distributed Tactical Operations, 2008.
No 1387 Bahlol Rahimi: Implementation of Health Information Systems, 2008.
No 1392 Maria Holmqvist: Word Alignment by Re-using Parallel Phrases, 2008.
No 1393 Mattias Eriksson: Integrated Software Pipelining, 2009.
No 1401 Annika Öhgren: Towards an Ontology Development Methodology for Small and Medium-sized Enterprises, 2009.
No 1410 Rickard Holsmark: Deadlock Free Routing in Mesh Networks on Chip with Regions, 2009.
No 1421 Sara Stymne: Compound Processing for Phrase-Based Statistical Machine Translation, 2009.
No 1427 Tommy Ellqvist: Supporting Scientific Collaboration through Workflows and Provenance, 2009.
No 1450 Fabian Segelström: Visualisations in Service Design, 2010.
No 1459 Min Bao: System Level Techniques for Temperature-Aware Energy Optimization, 2010.
No 1466 Mohammad Saifullah: Exploring Biologically Inspired Interactive Networks for Object Recognition, 2011.
No 1468 Qiang Liu: Dealing with Missing Mappings and Structure in a Network of Ontologies, 2011.
No 1469 Ruxandra Pop: Mapping Concurrent Applications to Multiprocessor Systems with Multithreaded Processors and Network on Chip-Based Interconnections, 2011.
No 1476 Per-Magnus Olsson: Positioning Algorithms for Surveillance Using Unmanned Aerial Vehicles, 2011.
No 1481 Anna Vapen: Contributions to Web Authentication for Untrusted Computers, 2011.
No 1485 Loove Broms: Sustainable Interactions: Studies in the Design of Energy Awareness Artefacts, 2011.
FiF-a 101 Johan Blomkvist: Conceptualising Prototypes in Service Design, 2011.
No 1490 Håkan Warnquist: Computer-Assisted Troubleshooting for Efficient Off-board Diagnosis, 2011.
No 1503 Jakob Rosén: Predictable Real-Time Applications on Multiprocessor Systems-on-Chip, 2011.
No 1504 Usman Dastgeer: Skeleton Programming for Heterogeneous GPU-based Systems, 2011.
No 1506 David Landén: Complex Task Allocation for Delegation: From Theory to Practice, 2011.
No 1507 Kristian Stavåker: Contributions to Parallel Simulation of Equation-Based Models on Graphics Processing Units, 2011.
No 1509 Mariusz Wzorek: Selected Aspects of Navigation and Path Planning in Unmanned Aircraft Systems, 2011.
No 1510 Piotr Rudol: Increasing Autonomy of Unmanned Aircraft Systems Through the Use of Imaging Sensors, 2011.
No 1513 Anders Carstensen: The Evolution of the Connector View Concept: Enterprise Models for Interoperability Solutions in the Extended Enterprise, 2011.
No 1523 Jody Foo: Computational Terminology: Exploring Bilingual and Monolingual Term Extraction, 2012.
No 1550 Anders Fröberg: Models and Tools for Distributed User Interface Development, 2012.
No 1558 Dimitar Nikolov: Optimizing Fault Tolerance for Real-Time Systems, 2012.
No 1586 Massimiliano Raciti: Anomaly Detection and its Adaptation: Studies on Cyber-physical Systems, 2013.
No 1588 Banafsheh Khademhosseinieh: Towards an Approach for Efficiency Evaluation of Enterprise Modeling Methods, 2013.