
Publié par : Published by : Publicación de la :

Faculté des sciences de l'administration
Québec (Québec) Canada G1K 7P4
Ph. Tel. : (418) 656-3644
Fax : (418) 656-2624

Édition électronique : Electronic publishing : Edición electrónica :

Céline Frenette
Vice-décanat à la recherche et au développement
Faculté des sciences de l'administration

Disponible sur Internet : Available on Internet : Disponible por Internet :

http://www.fsa.ulaval.ca/rd
[email protected]

DOCUMENT DE TRAVAIL 1999-021

MANAGING A DECISION-MAKING SITUATION IN THE CONTEXT

OF THE CANADIAN AIRSPACE PROTECTION

A. Guitouni, J.-M. Martel, M. Bélanger, C. Hunter

Centre de recherche en modélisation, information et décision (CERMID)

Version originale : Original manuscript : Versión original :

ISBN – 2-89524-093-0    ISBN –    ISBN –

Série électronique mise à jour : On-line publication updated : Serie electrónica, puesta al día :

12-1999

Managing a Decision-Making Situation in the Context of the Canadian Airspace Protection1

A. Guitouni1, J.-M. Martel2, M. Bélanger3, and C. Hunter4

1. Decision Support Technologies Section, Command and Control Sector, Defense Research Establishment Valcartier (DREV), 2459 Pie-XI Blvd. North, Val-Bélair (Québec) G3J 1X5, Canada, [email protected]; Professor, Operations and Decision Systems Department, Faculté des sciences de l'administration, Université Laval (Québec) G1K 7P4, CANADA, [email protected]

2. Operations and Decision Systems Department, Faculté des sciences de l'administration, Université Laval (Québec) G1K 7P4, CANADA, [email protected]

3. Decision Support Technologies Section, Command and Control Sector, Defense Research Establishment Valcartier (DREV), 2459 Pie-XI Blvd. North, Val-Bélair (Québec) G3J 1X5, Canada, [email protected] / [email protected]

4. Center for Operational Research and Analysis, 1st Canadian Air Division, Department of National Defense, Box 17000, Stn Forces, Winnipeg, MB, R3J 3Y5, CANADA, [email protected]

Abstract: An important activity of the Air Operation Center (AOC) is the elaboration, mitigation and evaluation of different Courses of Action (CoAs) in order to respond to emergency situations. Defense Research Establishment Valcartier (DREV) initiated a research activity to provide the AOC Commander and his senior staff with advisory tools. The Commander's Advisory System for Airspace Protection (CASAP), a distributed and asynchronous knowledge-based decision support system, was then designed to help the AOC staff manage counter-drug events and their related CoAs. This C2 support system helps the Commander (or his representative) to screen and prioritize the proposed CoAs to overcome the emergency situation. CASAP uses multicriterion decision analysis (MCDA) concepts and models. A MCDA methodology was used for the knowledge engineering. This methodology was implemented in CASAP, with different add-ins to help the Decision-Maker analyze the CoAs and minimize the risk component introduced by the subjectivity and the uncertainty of the evaluation process. In this paper, we review the modeling process of the decision-making situation, the formulation of the different criteria, and the aggregation procedure. We also introduce the analysis tools implemented in CASAP.

Keywords: Multicriterion analysis; multicriterion aggregation procedure; counter-drug operations; Canadian airspace protection; courses of action analysis; decision-making situation

1.0 INTRODUCTION

Rapid developments in artificial intelligence and in other information technology areas, and especially in the modeling of decision-making processes, are likely to greatly influence the Command and Control (C2) strategy and structure of the 21st century armed forces. In fact, these technologies might allow shorter chains of command, a flatter organizational structure, rapid sharing of information, and faster dissemination of task orders. Modern decision aids that are able to deal effectively with complex operational situations involving masses of current information and prior knowledge might help to enhance the C2 capabilities and structures.

1 This document has been submitted to the Military Operations Research Journal.


Within 1 Canadian Air Division/Canadian NORAD Region headquarters (1 CAD/CANR), the Air Operation Centre (AOC) is responsible for the day-to-day monitoring and control of air force operations. For routine operations, C2 decisions are effected through established doctrine, orders, and procedures. As contingency operations are introduced (crisis, contingency deployment or conflict), the AOC becomes the focal point for planning, directing, controlling and monitoring assigned forces. An important element for either type of operation is the elaboration, mitigation and evaluation of different Courses of Action (CoAs) in order to respond to the emergency situations. During such situations, AOC staff members work under stress conditions and have to process a large amount of information within a short time cycle. In fact, time constrains the process to extensively assess the situation, generate a wide range of CoAs, and intensively evaluate them according to significant points of view, before selecting and executing the "best" possible compromise. Moreover, it might be impossible to have access to the experience of resources that are not inside the command post at that specific time.

To help the people of the AOC, Defense Research Establishment Valcartier (DREV) started a research activity aimed at investigating approaches and concepts, and developing advanced technologies, to provide the Commander and his senior staff with advisory tools for the planning, management and employment of air defense resources and capabilities. The Commander's Advisory System for Airspace Protection (CASAP), a distributed and asynchronous knowledge-based Decision Support System (DSS), was foremost developed to assist the AOC staff in managing counter-drug events and their related CoAs, as well as to analyze, evaluate and prioritize these CoAs according to different evaluation criteria. Since the Commanders need to balance several conflicting and incommensurable criteria to make "wise" decisions, a Multicriterion Decision Aid (MCDA) methodology was deemed appropriate for implementation in CASAP.

In this paper, we report the methodology and procedures used to develop CASAP. In particular, in section 2, we review the context of operation of the Canadian Airspace Protection (CAP), and we introduce the MCDA methodology. In section 3, we discuss the structuring (modeling and formulation) process of the decision-making situation (DMS), and the knowledge engineering used to formulate the different evaluation criteria. We introduce, in section 4, a multicriterion aggregation procedure based on the outranking synthesizing approach. This procedure is implemented in CASAP along with different add-ins to help the Decision-Maker (DM) analyze the CoAs and minimize the risk component introduced by the subjectivity and the uncertainty of the evaluation process. In section 5, we present a stability analysis of the relative importance coefficients of the criteria. In section 6, we present CASAP's functional architecture as well as some of its interfaces. Finally, in section 7, we discuss the MCDA methodology and we propose further developments.

2.0 REVIEW OF THE CONTEXT OF OPERATIONS AND THE MCDA METHODOLOGY

2.1 Context of operations

Within 1 CAD/CANR, the AOC is responsible for the day-to-day monitoring and control of air force operations. For routine operations, C2 decisions are effected through established doctrine, orders, and procedures. As contingency operations are introduced (crisis, contingency deployment or conflict), the AOC becomes the focal point for planning, directing, controlling and monitoring assigned forces. During stressful situations, AOC staff members have to process a large amount of information within a short time cycle. Moreover, it might be impossible to have access to the experience of resources that are not inside the command post at that specific time.

In case of airspace violation, the Force Employment staff within the 1 CAD/CANR AOC must elaborate, mitigate and evaluate different CoAs for the airspace protection. If an airspace violation, whether it is intentional or not, falls into the routine operations category [1 CAD], existing C2 relationships and aerospace doctrine will govern the response to this event. Through the application of this operation order, which clearly delineates what is expected from whom, a response to an airspace violation under normal circumstances can be handled in a timely fashion with minimal AOC staff effort. In this situation, the C2 process, modeled by the Observe-Orient-Decide-Act (OODA) loop (Figure 1), particularly the Decide portion, gets compressed since there are pre-defined operation orders and, more often than not, a default CoA is in place.

[Figure 1: the C2 OODA loop. Observe (Sense, Collect); Orient (Analyse, Understand); Decide (Identify/Generate COAs, Evaluate COA, Select COA, Plan/Schedule); Act (Execute, Monitor/Control).]

FIGURE 1 - The C2 OODA loop

However, if the airspace violation event doesn't fit into the routine operations, it will be considered as an immediate operational contingency. The AOC then moves to the Immediate Action Team (IAT). The IAT's responsibilities include initial assessment, contingency planning, CoA development and analysis, as well as the issuance of warning and execution orders [1 CAD]. The IAT therefore completes all the activities in the Orient, Decide and Act processes once the team has been activated. The IAT is comprised of personnel who have the appropriate skill-sets to react to short-notice crisis situations and develop comprehensive contingency plans for CF air operations. In fact, time constrains the process to extensively assess the situation, generate a wide range of CoAs, and intensively evaluate them according to significant points of view, before selecting and executing the "best" CoA, i.e. achieving the best compromise among all the evaluation criteria. It is here that a CoA advisory tool would greatly assist the IAT in ensuring that all the necessary functional activities are completed and verified, the CoA development and analysis process is correctly performed, and the selection of a suitable CoA is accomplished in a timely manner.

2.2 MCDA methodology

Commanders have always needed to balance objectives and factors to make "wise" decisions. MCDA models and procedures therefore appear to be appropriate to deal with the aerospace protection DMS. The MCDA methodology aims to aid the DM in handling semi-structured, multidimensional decision problems with multiple criteria, where the components are transitional and the required information insufficient [Siskos and Spyridakos, 1999]. Within the paradigm of classical operational research, a decision problem is modeled by an objective function (f) to be optimized over a set of feasible solutions (X); it is the essence of classical mathematical programming. Within the MCDA perspective, the idea of the optimal solution is replaced by the concept of a satisfactory alternative, which realizes the best possible compromise for the DM considering all the conflicting evaluation criteria. This change was the beginning of the development of many MCDA methods.

A DM faced with a decision is often called upon to reconcile several aspects or points of view, which are often conflicting and incommensurable. MCDA models and procedures are emerging from the OR field, and are based on theories such as social choice, decision-making, preference modeling, measurement theory, etc. Within the MCDA perspective, the decision aid process can be seen as a recursive (iterative) and non-linear process composed of five steps [Guitouni, 1998]: i) structuring and formulation of the decision-making situation: knowledge acquisition and engineering, and problem modeling (A - F - E), ii) articulation and modeling of the DM's preferences for each criterion (local preferences modeling) and inter-criteria information, iii) aggregating these preferences and information to establish one/many global relationship system(s) among the alternatives (CoAs), iv) exploiting the relationship system, and v) result and recommendation.

In the context of MCDA, a set of alternatives (CoAs) has to be evaluated according to a set of criteria. Let A be the set of the CoAs, A = {a1, a2,..., ai, ..., am}, and F, a coherent family of criteria, F = {C1, C2,..., Cj, ..., Cn}. The value of the evaluation of the ith CoA according to the jth criterion is denoted eij. The evaluation of all the CoAs according to the set of criteria produces the multicriterion performance table E (see Table 1). Subsequent to the evaluation step, the decision-aid process involves an extensive effort of articulation and modeling of the DM's preferences, which is discussed in the following sub-section.

Table 1: Multicriterion Performance Table (E)

                        Criteria (1…n)
                  C1     ...   Cj     ...   Cn
CoAs (1…m)  a1    e11    ...   e1j    ...   e1n
            :     :            :            :
            ai    ei1    ...   eij    ...   ein
            :     :            :            :
            am    em1    ...   emj    ...   emn

2.3 Preference modeling

Modeling the DM's preferences is not an easy task because it is difficult to apprehend the subjectivity and the imprecision/ambiguity of human behavior. The major challenge facing the implementation of any decision aid is the "accurate" assessment of these preferences. Siskos (1982) declared that "... the major problem for the analyst facing multicriterion phenomena is the assessment of a model reflecting at best the preferences of the decision-maker."

The literature reviewed reveals many ways and theories for preference articulation and modeling. For example, utility functions, value functions, pairwise comparisons, tradeoffs, and discrimination thresholds could be used (see [Fishburn, 1970; Keeney and Raiffa, 1976; Keeney, 1992; Roubens and Vincke, 1985; Roy and Bouyssou, 1993; Saaty, 1980; Vincke, 1989] for more details). In this section, we limit our review to the discrimination thresholds for preference modeling.

Generally, when comparing two CoAs taking into account many aspects, a DM may be in one of the following situations: i) he is indifferent between ai and ak (denoted ai ~ ak), ii) he strictly prefers ai to ak (denoted ai ≻ ak), iii) he weakly prefers ai to ak (hesitation between indifference and strict preference, denoted ai ≻w ak), or iv) he considers that ai is incomparable to ak (hesitation between ai ≻ ak and ak ≻ ai, or the two CoAs are a priori matchless: denoted ai ? ak).

When comparing two alternatives ai and ak according to a "true-criterion", we get the following preference relations:

    a_i \succ_j a_k \Leftrightarrow e_{ij} > e_{kj}, \qquad a_i \sim_j a_k \Leftrightarrow e_{ij} = e_{kj}    (1.)

However, the evaluation of a CoA with regard to a criterion j is often uncertain and imprecise. Moreover, on top of these uncertainties and imprecision, the DM's preferences may involve some hesitations, doubts, indecision, etc. In order to take these behaviors into account, one can introduce an indifference threshold qj ≥ 0, such that if the performances of two alternatives on a criterion j differ by less than qj, then there is an indifference relation ~j:

    a_i \succ_j a_k \Leftrightarrow e_{ij} - e_{kj} \geq q_j, \qquad a_i \sim_j a_k \Leftrightarrow |e_{ij} - e_{kj}| < q_j    (2.)

A criterion having an indifference threshold is called a "quasi-criterion". If qj is set to 0, then the quasi-criterion collapses to a true-criterion. Moreover, one may define a strict preference threshold for a quasi-criterion j, pj ≥ 0, such that if the performances of ai and ak according to this criterion differ by at least pj, then one alternative is strictly preferred (≻j) to the other one. However, if this difference is between qj and pj, we can conclude that there is a weak preference (≻j^w) between the two CoAs. A criterion with preference and indifference thresholds is called a "pseudo-criterion", and is illustrated as follows:

    a_i \succ_j a_k \Leftrightarrow e_{ij} \geq e_{kj} + p_j
    a_i \succ_j^{w} a_k \Leftrightarrow q_j < e_{ij} - e_{kj} < p_j
    a_i \sim_j a_k \Leftrightarrow |e_{ij} - e_{kj}| \leq q_j
    \text{where } p_j > q_j    (3.)

If qj is set to 0, then the pseudo-criterion collapses to a pre-criterion, and if pj = qj, then the pseudo-criterion collapses to a quasi-criterion. One should note that these thresholds are not necessarily constant; i.e. qj = qj(eij) and pj = pj(eij).
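
The discrimination thresholds lend themselves to a direct implementation. The following minimal sketch (not taken from CASAP; the function name and the boundary conventions of equation (3) are our own) classifies the local relation between two crisp evaluations on a criterion assumed to be maximized; setting qj = pj = 0 recovers the true-criterion of equation (1), and qj = pj > 0 the quasi-criterion of equation (2).

    # Illustrative sketch of equations (1)-(3): the local relation on criterion j for a
    # pseudo-criterion with indifference threshold q_j and strict preference threshold p_j.
    # Crisp evaluations and a criterion to be maximized are assumed.
    def local_relation(e_ij: float, e_kj: float, q_j: float = 0.0, p_j: float = 0.0) -> str:
        # 0 <= q_j <= p_j: q_j = p_j = 0 gives a true-criterion (eq. 1),
        # q_j = p_j > 0 a quasi-criterion (eq. 2), q_j < p_j a pseudo-criterion (eq. 3)
        delta = e_ij - e_kj
        if abs(delta) <= q_j:
            return "a_i ~_j a_k (indifference)"
        if delta > 0:
            return ("a_i is strictly preferred to a_k" if delta >= p_j
                    else "a_i is weakly preferred to a_k")      # q_j < delta < p_j
        return ("a_k is strictly preferred to a_i" if -delta >= p_j
                else "a_k is weakly preferred to a_i")

    # With q_j = 2 and p_j = 5, a 3-unit advantage only yields a weak preference:
    print(local_relation(18.0, 15.0, q_j=2.0, p_j=5.0))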

The MCDA methodology also introduces the inter-criteria information. This information concerns the relative importance of the criteria, as well as the attribution of veto power to particular ones. The relative importance of the criteria can be an explicit set of scalars representing tradeoffs, scale harmonization, compromise, or intrinsic coefficients. Some procedures, like the lexicographic one, introduce only a priority ranking of the criteria. We discuss the veto power in section 4.

2.4 Multicriterion aggregation approaches

The DM's preferences and the data contained in E should be aggregated in order to recommend the "best decision". During the last three decades, the MCDA discipline has undergone significant development, which has given rise to several methods and procedures. Each procedure is based on a particular hypothesis and operational approach. The review of the literature suggests that the different known discrete MCAPs are based on different operational approaches [Guitouni and Martel, 1998]. Here are the two most important:

- Single criterion synthesizing approach: The MCAPs of this approach seek to build an aggregation function V to represent the global score of a CoA ai by a single synthesizing criterion obtained from the scores with regard to the criteria C1,...,Cn: V(ai) = f[C1(ai),…,Cn(ai)] (see [Fishburn, 1970; Keeney and Raiffa, 1976; Keeney, 1992]). This approach aims at the construction of a value system that aggregates the DM's preferences on the criteria (attributes) based on strict assumptions; only the strict preference and the indifference are considered, and these preference relations are complete and transitive. For example, the MultiAttribute (Value) Utility Theory (MAUT/MAVT) falls within the framework of this approach. The DM's preferences are modeled through partial utility functions, then aggregated into a single synthesizing utility function.

- Outranking synthesizing approach: The idea behind this approach is to be able to establish an outranking relation between two CoAs: ai S ak means "ai outranks ak", i.e. ai "is at least as good as" ak (see [Roy, 1985; Roy and Bouyssou, 1993; Vincke, 1992]). Instead of computing a score for each CoA ai, it computes V(ai, ak) for each pair of alternatives (ai, ak) ∈ A×A. This approach is inspired from social choice theory. Each criterion is considered as a voter, with a particular power, and each CoA as a candidate. The DM's preferences are modeled through a set of parameters (thresholds), and the aggregation is partially compensatory. This aggregation approach is based on the pairwise comparison of the CoAs on each criterion. The preference relational system (p.r.s.) obtained could include indifference, strict preference, weak preference, intransitivity or even incomparability.

The review of the MCDA literature shows that the exploitation phase of a MCAP can be accomplished in different manners. For instance, if the MCAP is based on the single criterion synthesizing approach, the exploitation step is trivial in the sense that a unique value (according to the unique synthesizing criterion) is associated with each CoA during the aggregation phase. The aggregation phase is often a weighted sum of the CoA performances obtained according to each criterion. Sometimes, a multiplicative model may be used. Generally speaking, aggregation of this type is totally compensatory. Then, no matter what the decisional problematic (sorting, ranking, choice, etc.) is, this unique value allows the proper problematic to be addressed directly. However, if the MCAP belongs to the outranking approach, the exploitation phase is performed in light of the decisional problematic.
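
To make the contrast concrete, the following toy example (our own; the weights and performances are invented and already rescaled so that every criterion is maximized on [0,1]) shows the totally compensatory weighted sum used by the single criterion synthesizing approach.

    # Illustrative weighted-sum aggregation (single criterion synthesizing approach).
    # The weights and the normalized performance table are invented.
    weights = [0.5, 0.3, 0.2]                    # relative importance of C1, C2, C3
    performance = {
        "CoA1": [0.9, 0.2, 0.6],
        "CoA2": [0.6, 0.7, 0.5],
    }
    scores = {a: sum(w * e for w, e in zip(weights, row)) for a, row in performance.items()}
    # A very good score on one criterion can fully compensate a very poor one on another:
    # this is the (total) compensation that the outranking approach tries to control.
    print(sorted(scores.items(), key=lambda item: item[1], reverse=True))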

The aggregation phase of the MCAPs based on the outranking approach is achieved through a series of pairwise comparisons. For these procedures, the aggregation phase consists of computing a concordance index for each pair of alternatives, which most of the time takes the form of a weighted sum of the pairwise comparison results. The DM's local preference modeling usually introduces discrimination thresholds: an indifference threshold, a strict preference threshold, and sometimes a veto threshold. Some outranking methods, like ELECTRE III [Roy, 1978], introduce a discordance analysis to consider the opposition of the discordant criteria. An outranking index for each pair of alternatives is then computed by fusing the global concordance index with the local discordance indexes. These thresholds and the discordance analysis help to attenuate the compensation phenomena.

3.0 STRUCTURING THE DECISION MAKING SITUATION: FORMULATION PROCESS

In order to develop CASAP, researchers from DREV worked for a few years with the Executive Elements (Commander and his senior staff) at the former Fighter Group/CANR headquarters, in order to acquire knowledge about the DMS and to identify the key factors considered to evaluate different CoAs. After a first prototype, the multicriterion analysis methodology was chosen to engineer the knowledge acquired. The reading of Canadian military operation documents (e.g. [1 CAD, 1999; FG, 1990; FG, 1992; FG/CANR, 1992]) and operational checklists, confronted with knowledge acquisition sessions with the operational air force personnel, led to the identification of five factors to be considered while evaluating CoAs for counter-drug scenarios in a peacetime context. The key factors identified are Flexibility, Complexity, Sustainability, Cost-of-Resources, and Risk. Each of these factors was decomposed into one or many evaluation criteria. In fact, 14 criteria were formulated and partially validated with the DM. This set of criteria was also presented to various plan staff in the new single air force operational headquarters (1 CAD/CANR HQ). In this section, we briefly present and discuss the various criteria obtained.

3.1 The Flexibility

The flexibility factor is concerned with the ability to adapt to various possible changes that may occur while implementing a CoA. This dimension is divided into three criteria: Covering Operational Tasks (C1), Covering Mission's Possible Locations (C2), and Covering Enemy's CoAs (C3). The criterion C1 is concerned with the ability of CoA ai to adapt to possible changes in operational task k (OTk), which may occur during its implementation. OTk is the result of the hierarchical decomposition of the national security objectives, and can take values such as: Monitor transport routes / Coordinate with civilian and military units for joint operations / Participate in detection, tracking and interdiction of narcotics transport / Intercept communications, etc. Different OTk may have different importance π_OTk. For each selected operational task, a staff member assesses a coverage degree xik for each CoA ai over each OTk. This criterion is to be maximized, and the evaluation of ai is given by ei1 ∈ [0,1] as follows:

    e_{i1} = \sum_{k} \pi_{OT_k} \cdot x_{ik},
    \quad \text{where } x_{ik} =
    \begin{cases}
    1 & \text{if } a_i \text{ completely covers } OT_k \\
    \text{between 0 and 1} & \text{if } OT_k \text{ is partially covered} \\
    0 & \text{otherwise}
    \end{cases}    (4.)

The criterion of covering mission's possible locations (C2) is concerned with the ability of a CoA to adapt to possible changes in the predicted mission's locations which may occur during the implementation of a COA. This criterion is to be maximized. Considering the weather conditions, the distance to the Area of Operations (AOO), as well as the AOO characteristics, the staff member assesses the conditional probability p^{time}_{ik} that ai covers in time (no partial covering) the mission's location MLk. Each possible mission location is predicted with a probability p_{ML_k}. A mission's possible location can be specified by its coordinates or by its known name. The evaluation of ai with respect to C2 is given by ei2 ∈ [0,1] in the following manner:

    e_{i2} = \sum_{k} p_{ML_k} \cdot p^{time}_{ik}    (5.)

The criterion of covering enemy's CoAs (C3) is concerned with the ability of a COA ai to adapt in time to possible changes in the enemy's CoAs that may occur during the implementation. This criterion is also to be maximized. The operator assesses the conditional probability p^{cover}_{ik} that ai covers in time the enemy's course of action given that the enemy executes ECOAk. Each ECOAk is predicted with a probability p_{ECOA_k}. The evaluation of ai with respect to C3 is given by ei3 ∈ [0,1]:

    e_{i3} = \sum_{k} p_{ECOA_k} \cdot p^{cover}_{ik}    (6.)
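
Equations (4)-(6) are simple expectations of the staff assessments. The sketch below (our own, with invented assessment values and illustrative function names) shows how the three flexibility evaluations could be computed once the assessments are available; the operational task weights are assumed to sum to 1 so that ei1 stays in [0,1].

    # Sketch of the flexibility evaluations of equations (4)-(6); all values are invented.
    def covering_operational_tasks(task_weights, coverage):      # equation (4)
        # task_weights: importance pi_OTk of each operational task OT_k
        # coverage: degree x_ik in [0, 1] to which CoA a_i covers OT_k
        return sum(w * x for w, x in zip(task_weights, coverage))

    def covering_mission_locations(location_probs, p_time):      # equation (5)
        # location_probs: predicted probability p_MLk of each possible mission location
        # p_time: conditional probability that a_i covers location ML_k in time
        return sum(p * pt for p, pt in zip(location_probs, p_time))

    def covering_enemy_coas(ecoa_probs, p_cover):                # equation (6)
        # ecoa_probs: predicted probability p_ECOAk of each enemy CoA
        # p_cover: conditional probability that a_i covers ECOA_k in time
        return sum(p * pc for p, pc in zip(ecoa_probs, p_cover))

    e_i1 = covering_operational_tasks([0.5, 0.3, 0.2], [1.0, 0.4, 0.0])
    e_i2 = covering_mission_locations([0.6, 0.4], [0.9, 0.5])
    e_i3 = covering_enemy_coas([0.7, 0.2, 0.1], [0.8, 0.3, 1.0])
    print(e_i1, e_i2, e_i3)    # each evaluation lies in [0, 1]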

3.2 Complexity

This factor is concerned with the CoA implementation complexity. It is divided into three criteria: operations complexity, logistics complexity, and C2 complexity. The Operations Complexity (C4) is concerned with the COA implementation difficulties caused by its operational requirements. For each important resource in operation to execute ai, and for each function to be performed by this resource, the operator assesses the complexity on an ordinal scale (High, Medium, Low) by taking into account the AOO characteristics and deployment conditions. The resources might have different importance in the execution of the COA. An ordinal evaluation ei4 of each ai with respect to C4 is obtained by redefining a global ordinal scale and aggregating the function complexity evaluations (see [Abi-Zeid et al., 1998] for more details). The redefined ordinal scale contains 5 levels: very low complexity, low complexity, moderate complexity, high complexity, and very high complexity. This criterion is to be minimized.

The criterion logistics complexity (C5) is concerned with the COA implementation difficulties caused by its logistics requirements. This criterion is to be minimized. For each ai, the operator assesses its Scheduling Logistics Requirements, Refueling Logistics Requirements, Landing and Take-Off Logistics Requirements, Communication Logistics Requirements, Personnel Logistics Requirements and Re-supply Requirements according to an ordinal scale of three levels (High, Medium, Low). The different requirements may have different importance. Here again, an ordinal aggregation procedure amalgamates the requirement assessments to obtain an ordinal evaluation ei5 of ai with respect to C5; ei5 is expressed using the same scale as ei4 [Abi-Zeid et al., 1998].

The C2 Complexity criterion (C6) is concerned with the COA implementation difficulties caused by C2 relationships and Coordination Requirements (CR) in operation. This criterion is to be minimized. For each ai, the operator assesses the C2 relationship complexity and the CR complexity on an ordinal scale of 5 levels (very high, high, medium, low, very low). Then, the evaluation ei6 of ai with respect to C6 is modeled by a distribution (see [Abi-Zeid et al., 1998]).

3.3 Sustainability

This factor is concerned with the ability to continue (stay in) the operation. It is represented by only one criterion: Sustainability (C7). The sustainability criterion is used to evaluate an expected ratio of the COA estimated on-station time over the total required on-station time. This latter time includes the event required on-station time as well as an estimated additional operational time. It is obvious that this criterion is to be maximized.

3.4 Optimum use of resources

This factor is concerned with the optimality of the resource employment. It is represented by only one criterion: Cost of Resources (C8). This criterion is to be minimized. The evaluation ei8 of each ai according to C8 is obtained by summing the incremental costs, in dollars, associated with the use of each resource (equipment) involved in this CoA ai.

3.5 Risk

This factor represents the risks of mission failure as well as the risks associated with the mission. This factor is developed into six criteria: Impact of the Sensors Coverage Gap, Risk of Military Personnel Loss, Risk of Collateral Damage, Confrontation Risk, CoA Equipment Reliability, and COA Personnel Effectiveness. The criterion Impact of the Sensors Coverage Gap (C9) is concerned with the possibility of mission failure caused by possible radar and/or radio gaps. This criterion is to be minimized. For each ai, the operator assesses the mission failure risk due to a radar gap and/or a radio gap on an ordinal scale (High, Medium, Low). These evaluations are combined using the importance of each sensor to build a distribution, which represents the evaluation ei9 of ai according to C9.

The criterion risk of military personnel loss (C10) represents the likelihood of military personnel loss during the mission. The operator assesses for each ai the likelihood of military personnel loss on an ordinal scale (highly likely, very likely, likely, unlikely, highly unlikely) given that the enemy is expected to execute ECOAk with a probability p_{ECOA_k}. Then, one will obtain as many conditional evaluations as there are predicted enemy CoAs. We use an ordinal aggregation procedure to transform this series of evaluations into an ordinal evaluation according to a new ordinal scale with 7 levels. The evaluation ei10 is an ordinal evaluation to be minimized [Abi-Zeid et al., 1998].

The criterion risk of collateral damage (C11) measures the possibility of collateral damage (anything but the target) during the mission. The criterion confrontation risk (C12) is concerned with the assessment of mission failure due to possible confrontations. The evaluations ei11 and ei12 of a CoA ai according to these two criteria are to be minimized, and are obtained in the same way as ei10.

The criterion C13 is concerned with the equipment reliability and the robustness of the COA. The evaluation ei13 for each CoA ai is a cardinal evaluation computed using the information about the failure risks of the different resources used to implement this CoA. This information can be expressed by the mean time between failures (MTBF). We use system reliability theory to compute the evaluation ei13. This criterion is to be maximized. The criterion C14 is concerned with the effectiveness of the personnel, which may be jeopardized by fatigue, stress, etc. at any moment during the mission. This criterion is to be maximized. For each ai, and taking into account its complexity, the operator assesses the effectiveness of the personnel on an ordinal scale with 5 levels (very highly effective, very effective, effective, lowly effective, very lowly effective). The evaluation ei14 is considered as a direct linguistic evaluation, and is modeled by a fuzzy number (fuzzy set theory).

The features of the 14 criteria are summarized in Table 2. The first column identifies the factor, the second the criterion, and the third whether the criterion should be maximized or minimized. The fourth column identifies the type of individual scale used for the evaluation of the CoAs with regard to the criterion, and the last column describes the evaluation: it can be deterministic or a probability value; it can be crisp (given by a single number) or a distribution (given by a vector of evaluations or by a fuzzy number); the measurement scale can also be continuous or discrete.


Table 2: Summary of the 14 criteria [Abi-Zeid et al., 1998]

Factor             Criterion                             Optimization  Scale                Evaluation
Flexibility        C1: Covering Operational Tasks        Maximize      Cardinal on [0,1]    Crisp, Deterministic, Continuous
                   C2: Covering Mission's Locations      Maximize      Cardinal on [0,1]    Crisp, Probabilistic, Continuous
                   C3: Covering Enemy's CoA              Maximize      Cardinal on [0,1]    Crisp, Probabilistic, Continuous
Complexity         C4: Operations Complexity             Minimize      Ordinal, 5 echelons  Crisp, Deterministic, Discrete
                   C5: Logistics Complexity              Minimize      Ordinal, 5 echelons  Crisp, Deterministic, Discrete
                   C6: C2 Complexity                     Minimize      Ordinal, 5 echelons  Distribution, Discrete
Sustainability     C7: Sustainability                    Maximize      Cardinal, R+         Crisp, Deterministic, Continuous
Cost of resources  C8: Cost of Resources                 Minimize      Cardinal, R+         Crisp, Deterministic, Continuous
Risk               C9: Impact of Sensors Coverage Gap    Minimize      Ordinal, 3 echelons  Distribution, Discrete
                   C10: Military personnel loss          Minimize      Ordinal, 7 echelons  Crisp, Probabilistic, Discrete
                   C11: Collateral damage                Minimize      Ordinal, 7 echelons  Crisp, Deterministic, Discrete
                   C12: Confrontation risk               Minimize      Ordinal, 7 echelons  Crisp, Probabilistic, Discrete
                   C13: Equipment reliability            Maximize      Cardinal on [0,1]    Crisp, Probabilistic, Discrete
                   C14: Personnel effectiveness          Maximize      Ordinal, 5 echelons  Fuzzy, Distribution, Continuous

4.0 MCAP FOR CASAP: PAMSSEM I AND II

In order to select an appropriate MCAP to deal with the DMS in a context of CAP, one should match the MCAPs' features to the decisional context requirements. In the context of a violation of Canadian airspace, the requirements of CASAP could be summarized as follows:

- Evaluations are mixed: ordinal and cardinal, crisp and distribution;

- DM wants to get a ranking (prioritization) of the CoAs (also called Decision problematic P.γ);

- DM wants to assign a relative importance coefficient to balance the criteria;

- The DMS requires a “controlled” compensation between the criteria;

- Decision-making in a risky context: model the uncertainties, the DM's hesitations and preferences according to the doctrine and the rules of operations;

- Limited time to evaluate and prioritize the CoAs (in case of emergency);

- The DM is well identified (single DM).

The majority of the MCAPs proposed in the literature implicitly consider the hypothesis that the nature and the type of the data contained in E (representing the evaluations of the CoAs) are relatively homogenous. The MCAP to be implemented in CASAP should limit the compensation between the criteria and deal with their heterogeneity and their measurement scales. Moreover, the thresholds should help the DM articulate and distinguish his preferences. Thus, a MCAP based on an outranking approach is suitable to deal with the DMS in the context of CAP. This approach allows the compensation phenomena among the criteria to be controlled. Among the different MCAPs, PAMSSEM seemed to be the most appropriate for CASAP. In fact, PAMSSEM was designed to deal with mixed and missing evaluations. It produces either a partial or a total preorder of the alternatives. It offers enough flexibility to consider the DM's preferences and to control the compensations.

A preliminary version of PAMSSEM I and II was developed by Martel et al. (1996). PAMSSEM belongs to the outranking approach and encompasses two phases: the aggregation and the exploitation. This MCAP is based on ELECTRE III [Roy, 1978] for the construction of the outranking relation, on PROMETHEE I and II [Brans et al., 1984] for the exploitation of these relations, and on NAIADE [Munda, 1995] for the manipulation of fuzzy evaluations. The purpose of PAMSSEM is to construct an outranking relation, and to provide further exploitation to recommend a "good" alternative. The construction of an outranking relation is based on a formal modeling of the DM's preferences according to each criterion: the local preference modeling phase.

Like ELECTRE III, PAMSSEM uses discrimination thresholds (indifference and strict preference) with the different criteria. PAMSSEM's inter-criteria information is of two types: an explicit set of coefficients representing the relative importance of the criteria, and veto thresholds (νj) for some criteria where it is appropriate. The veto threshold represents a limit of tolerance that the DM is willing to accept for any compensation. In other words, if the performance of ak is higher than the performance of ai according to the criterion j by at least νj, the DM may refuse to choose the CoA ai over ak. For example, to intercept a drug smuggler, the Commander may have to consider two CoAs: ai and ak. The CoA ai involves largely more losses of lives than ak. A veto threshold on the criterion risk of losses of lives will prevent ai from globally outranking ak, even if ai is at least as good as (even better than) ak on the other criteria. The veto threshold may vary with the position of the evaluation on the measurement scale associated with the criterion j (νj(eij)).

PAMSSEM I and II allow different natures and different types of data to be dealt with. In fact, the evaluations can be crisp quantities, discrete/continuous random variables (with mass functions or probability density functions fij(xij)), or linguistic or fuzzy numbers (which are associated with membership functions µij(xij)). PAMSSEM also deals with missing evaluations, which could be the result of lack of knowledge, forgetfulness or inappropriateness.

4.1 Treatment of distributional evaluations

PAMSSEM is able to deal with an evaluation eij which is a distribution representing a random variable (or statistical variable) Xij with a probability density or mass function fij(xij). A crisp evaluation can be seen as a degenerate distribution; the distribution is reduced to only one point (P(Xij = eij) = 1). In the case of fuzzy evaluations (e.g. linguistic evaluations) represented by a membership function, a "rescaling of the ordinates of the membership function" is adapted from NAIADE [Munda, 1995]. One can define a function fij(xij) = Kij · µij(xij) in such a way as to obtain a "probability density function" verifying the following condition:

    \int_{-\infty}^{+\infty} f_{ij}(x_{ij})\, dx_{ij} = 1    (7.)
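
The rescaling of equation (7) amounts to dividing the membership function by its integral. The sketch below (our own; the triangular membership function and the numeric grid are invented) illustrates the idea on a discretized support.

    # Sketch of the rescaling of equation (7): a linguistic evaluation with membership
    # function mu_ij is turned into a "probability density" f_ij = K_ij * mu_ij whose
    # integral is 1. The triangular membership and the grid are invented.
    import numpy as np

    x = np.linspace(0.0, 1.0, 201)                        # support of the evaluation
    mu = np.clip(1.0 - np.abs(x - 0.7) / 0.2, 0.0, 1.0)   # triangular membership around 0.7

    K = 1.0 / np.trapz(mu, x)                             # rescaling constant K_ij
    f = K * mu                                            # rescaled "density" f_ij(x_ij)
    print(np.trapz(f, x))                                 # ~1.0, as required by (7.)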

4.2 PAMSSEM aggregation phase

The aggregation phase of PAMSSEM begins by computing a concordance index C(ai, ak) for each pair of CoAs (ai, ak) ∈ A×A. This index is obtained as follows:

    C(a_i, a_k) = \sum_{j=1}^{n} \pi_j \cdot \delta_j(a_i, a_k) = \sum_{j=1}^{n} \pi_j \cdot \delta_j(e_{ij}, e_{kj})    (8.)

where πj is a normalized scalar representing the relative importance of the jth criterion, with \sum_{j=1}^{n} \pi_j = 1. δj(ai, ak) is a local outranking index computed for each pair of CoAs according to each criterion as follows:

    \delta_j(a_i, a_k) = \sum_{x_{ij}} \sum_{x_{kj}} \delta_j(x_{ij}, x_{kj}) \cdot f_{ij}(x_{ij}) \cdot f_{kj}(x_{kj})    (9.)

fij(xij) and fkj(xkj) are respectively the (discrete) probability distribution functions of xij and xkj. In the case of a crisp evaluation, we obtain P(X_{ij} = x_{ij}) = f_{ij}(x_{ij}) = 1. δj(xij, xkj) is an index computed according to the following formula:

    \delta_j(x_{ij}, x_{kj}) =
    \begin{cases}
    1 & \text{if } -\Delta_j \leq q_j \\
    \dfrac{p_j + \Delta_j}{p_j - q_j} & \text{if } q_j < -\Delta_j < p_j \\
    0 & \text{if } -\Delta_j \geq p_j
    \end{cases}    (10.)

where \Delta_j = x_{ij} - x_{kj} \approx e_{ij} - e_{kj}, and 0 \leq q_j \leq p_j \leq E_j; Ej is the maximum range of the measurement scale of the jth criterion; qj = qj(xkj) and pj = pj(xkj). The crisp evaluations can be handled by considering fij(xij) = 1 if xij = eij and 0 otherwise. From the purist point of view, it is required to consider an ordinal criterion as a true-criterion. In this case, we will have qj = pj = 0. For the sake of processing uniformity for the cardinal and ordinal evaluations, we suggest a slight modification in the computation of the concordance index for the ordinal criteria (when the number of levels of the ordinal scale is greater than 3) as follows:

    \delta_j(x_{ij}, x_{kj}) =
    \begin{cases}
    1 & \text{if } 0 \leq \Delta_j \\
    1/2 & \text{if } -\bar{\Delta}_j \leq \Delta_j < 0 \\
    0 & \text{if } \Delta_j < -\bar{\Delta}_j
    \end{cases}    (11.)

where \bar{\Delta}_j is the inter-level gap of the jth ordinal scale.
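
For crisp evaluations the double sum of equation (9) reduces to a single term, and the concordance computation of equations (8) and (10) becomes straightforward. The following sketch (our own, not the CASAP code; all evaluations, weights and thresholds are invented, and every criterion is assumed to be maximized) illustrates it.

    # Sketch of the concordance computation of equations (8)-(10) for crisp evaluations.
    def local_concordance(e_ij, e_kj, q_j, p_j):
        """delta_j(e_ij, e_kj) of equation (10)."""
        diff = e_ij - e_kj
        if diff >= -q_j:                      # a_i is not worse than a_k by more than q_j
            return 1.0
        if diff <= -p_j:                      # a_i is worse than a_k by at least p_j
            return 0.0
        return (p_j + diff) / (p_j - q_j)     # linear interpolation in between

    def global_concordance(e_i, e_k, weights, q, p):
        """C(a_i, a_k) of equation (8); the weights are assumed normalized to sum to 1."""
        return sum(w * local_concordance(eij, ekj, qj, pj)
                   for w, eij, ekj, qj, pj in zip(weights, e_i, e_k, q, p))

    # Two CoAs on three maximized criteria, with invented thresholds and weights:
    e_i, e_k = [0.8, 0.4, 0.6], [0.5, 0.7, 0.6]
    print(global_concordance(e_i, e_k, weights=[0.5, 0.3, 0.2],
                             q=[0.1, 0.1, 0.1], p=[0.3, 0.3, 0.3]))    # 0.7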

The aggregation phase also involves the computation of another index: a local discordance index. This index is computed for each criterion and for each pair of CoAs (ai, ak) ∈ A×A. The local discordance index Dj(ai, ak) states the opposition of the criterion j to the assertion that ai outranks ak. This index is computed according to the following formula:

    D_j(e_{ij}, e_{kj}) = \sum_{x_{ij}} \sum_{x_{kj}} D_j(x_{ij}, x_{kj}) \cdot f_{ij}(x_{ij}) \cdot f_{kj}(x_{kj})    (12.)

where D_j(a_i, a_k) = D_j(e_{ij}, e_{kj}); and

    D_j(x_{ij}, x_{kj}) =
    \begin{cases}
    0 & \text{if } -\Delta_j \leq p_j \\
    \dfrac{-(\Delta_j + p_j)}{\nu_j - p_j} & \text{if } p_j < -\Delta_j < \nu_j \\
    1 & \text{if } -\Delta_j \geq \nu_j
    \end{cases}    (13.)

νj = νj(xkj) is the veto threshold; νj > pj. Note that the value of this threshold is influenced by the importance of the jth criterion. Here again, from the purist point of view, it is difficult to compute discordance indexes or to define a veto threshold in the case of ordinal criteria (true-criteria). However, if we want to process the ordinal and cardinal evaluations uniformly, we suggest computing the local discordance index for the ordinal criteria as follows:

    D_j(x_{ij}, x_{kj}) =
    \begin{cases}
    0 & \text{if } -\Delta_j \leq \ell_j/2 + 1 \\
    \min\left[1,\; \xi(\pi_j) \cdot \left(-\Delta_j - (\ell_j/2 + 1)\right)\right] & \text{if } -\Delta_j > \ell_j/2 + 1
    \end{cases}    (14.)

where ℓj is the number of levels of the measurement scale associated with the jth ordinal criterion, ℓj > 3. ξ(πj) is a non-decreasing function of the relative importance of the jth criterion. This function could be, for instance, linear or exponential: ξ(πj) = 0.2 (1 + πj/2), or ξ(πj) = 0.2 (2 + e^{πj/2}), etc.

Then, PAMSSEM aggregates the concordance and the local discordance indexes to establish an outranking degree σ(ai, ak) for each pair of CoAs (ai, ak) ∈ A×A. As suggested by Rousseau and Martel [Rousseau and Martel, 1994], these degrees can be obtained as follows:

    \sigma(a_i, a_k) = C(a_i, a_k) \cdot \prod_{j=1}^{n} \left[1 - D_j^{3}(a_i, a_k)\right] \;\Rightarrow\; 0 \leq \sigma(a_i, a_k) \leq 1    (15.)

σ(ai, ak) represents the "consistency" level of the conclusion that "the CoA ai globally outranks ak", taking into account all the evaluation criteria. For example, if σ(ai, ak) = 1, then the conclusion "ai outranks ak" is very well established. The valued outranking relations obtained can be represented in an outranking matrix (see Table 3). The aggregation phase of PAMSSEM allows for incomparability; it is possible that at the same time ai outranks ak and ak outranks ai. This result is mainly due to the introduction of the discrimination and veto thresholds.

Table 3: Outranking matrix

      a1           a2           …     am
a1    -            σ(a1, a2)    …     σ(a1, am)
a2    σ(a2, a1)    -            …     σ(a2, am)
:     :            :            …     :
am    σ(am, a1)    σ(am, a2)    …     -
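
The same crisp-evaluation simplification applies to the discordance of equation (13) and to the outranking degree of equation (15). The sketch below (our own; it reuses global_concordance() from the previous sketch, and the veto thresholds and evaluations are invented) shows how a single entry of the outranking matrix could be obtained.

    # Sketch of the local discordance of equation (13) and the outranking degree of
    # equation (15) for crisp evaluations of maximized criteria.
    def local_discordance(e_ij, e_kj, p_j, v_j):
        """D_j(e_ij, e_kj) of equation (13), with veto threshold v_j > p_j."""
        diff = e_ij - e_kj
        if diff >= -p_j:
            return 0.0                        # criterion j raises no significant opposition
        if diff <= -v_j:
            return 1.0                        # criterion j vetoes "a_i outranks a_k"
        return (-diff - p_j) / (v_j - p_j)    # partial discordance in between

    def outranking_degree(e_i, e_k, weights, q, p, v):
        """sigma(a_i, a_k) of equation (15)."""
        sigma = global_concordance(e_i, e_k, weights, q, p)
        for e_ij, e_kj, p_j, v_j in zip(e_i, e_k, p, v):
            sigma *= 1.0 - local_discordance(e_ij, e_kj, p_j, v_j) ** 3
        return sigma

    e_i, e_k = [0.8, 0.4, 0.6], [0.5, 0.7, 0.6]
    print(outranking_degree(e_i, e_k, weights=[0.5, 0.3, 0.2],
                            q=[0.1] * 3, p=[0.3] * 3, v=[0.6] * 3))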

4.3 PAMSSEM exploitation phase

The exploitation of the outranking relations consists of making a synthesis of this outranking matrix to provide a recommendation. In CASAP, this recommendation will consist of a ranking of the CoAs. This exploitation may be obtained by introducing the concept of the entering and leaving flows of PROMETHEE. For each CoA ai, we compute its leaving flow σ+(ai) and its entering flow σ-(ai) as follows:

    \sigma^{+}(a_i) = \sum_{\forall a_k \neq a_i} \sigma(a_i, a_k)    (16.)

    \sigma^{-}(a_i) = \sum_{\forall a_k \neq a_i} \sigma(a_k, a_i)    (17.)

The leaving flow represents the overall relative strengths of the CoA ai, and the entering flow represents its overall relative weaknesses. Note that the values of these flows can change if a CoA is introduced into or removed from the set A. On the basis of these two flows, PAMSSEM I computes a partial preorder by the following procedure:

• Rank the CoAs of A according to a decreasing order of σ+(ai): this ranking constitutes a first complete preorder P+ of the CoAs;

• Rank the CoAs of A according to an increasing order of σ-(ai): this ranking constitutes a second complete preorder P- of the CoAs;

• A partial preorder P is obtained by the intersection of these two complete preorders: P = P+ ∩ P-.

P can be a total preorder if and only if P+ = P-. PAMSSEM I's ranking includes incomparability. PAMSSEM II overcomes this incomparability by deriving (forcing) a total preorder. This preorder is the result of computing for each CoA ai a net flow Φ(ai) = σ+(ai) - σ-(ai), and then ranking the CoAs according to a decreasing order of this flow.
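
The exploitation phase only needs the outranking matrix. The sketch below (our own; the 3×3 matrix σ is invented) computes the leaving and entering flows of equations (16) and (17), the PAMSSEM II total preorder by net flow, and the pairwise PAMSSEM I conclusion obtained by intersecting the two complete preorders.

    # Sketch of the exploitation phase: flows of equations (16)-(17), PAMSSEM II net-flow
    # ranking, and the pairwise PAMSSEM I conclusion. The outranking matrix is invented.
    sigma = {("a1", "a2"): 0.9, ("a1", "a3"): 0.6,
             ("a2", "a1"): 0.3, ("a2", "a3"): 0.8,
             ("a3", "a1"): 0.5, ("a3", "a2"): 0.4}
    coas = ["a1", "a2", "a3"]

    leaving  = {a: sum(sigma[(a, b)] for b in coas if b != a) for a in coas}   # sigma+(a_i)
    entering = {a: sum(sigma[(b, a)] for b in coas if b != a) for a in coas}   # sigma-(a_i)
    net_flow = {a: leaving[a] - entering[a] for a in coas}                     # PAMSSEM II

    print(sorted(coas, key=lambda a: net_flow[a], reverse=True))               # total preorder

    def pamssem1_relation(ai, ak):
        # PAMSSEM I keeps a_i ahead of a_k only if it is at least as good in both
        # complete preorders (decreasing sigma+ and increasing sigma-).
        better_plus = leaving[ai] >= leaving[ak]
        better_minus = entering[ai] <= entering[ak]
        if better_plus and better_minus:
            return ai + " is ranked at least as well as " + ak
        if (not better_plus) and (not better_minus):
            return ak + " is ranked at least as well as " + ai
        return ai + " and " + ak + " are incomparable"

    print(pamssem1_relation("a1", "a3"))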


4.4 Missing evaluations

In addition to mixed evaluations, PAMSSEM I and II can handle missing evaluations. We might have to deal with two types of missing evaluations: i) a non-relevant evaluation, or ii) a relevant evaluation that is impossible to obtain. In the first case, we try to make sure that the "holes" in the performance table E do not favor or disfavor the concerned CoAs. If an evaluation eij is missing, then we consider that

    \Delta_j(a_i, a_k) = \Delta_j(a_k, a_i) = 0, \quad \forall a_k \in A    (18.)

This implies:

    \delta_j(a_i, a_k) = \delta_j(a_k, a_i) = 1, \qquad D_j(a_i, a_k) = D_j(a_k, a_i) = 0, \quad \forall a_k \in A    (19.)

In fact, expression (19) means that we replace, each time, the missing evaluation eij by the evaluation of the CoA ak (ekj) to which it is compared. In the second case (relevant missing evaluations), instead of replacing the missing evaluation by the mean value of the other CoAs' evaluations according to this criterion, as is regularly done in statistics, we replace it by a distribution of evaluations retrieved from the other (non-missing) ones. If a relevant evaluation eij is missing, we replace it by the set {e1j, e2j, ..., ekj, ..., emj} for all the CoAs ak ∈ A\{ai}; i.e. the evaluation of ai will be modeled by a statistical variable Xij taking all the values ekj, ak ∈ A\{ai}, with a mass function f(xij) = 1/(m-1) (m is the number of CoAs in A).
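
The two treatments of missing evaluations can be summarized as follows (our own sketch; the small performance column is invented).

    # Sketch of the two treatments of missing evaluations; None marks the missing value.
    column_j = {"a1": 0.7, "a2": None, "a3": 0.4, "a4": 0.9}   # evaluations e_ij on criterion j

    # Case i) non-relevant evaluation: during each pairwise comparison the missing e_2j is
    # simply taken equal to the evaluation e_kj it is compared with, so that delta_j = 1 and
    # D_j = 0 in both directions (equation 19); nothing has to be stored.

    # Case ii) relevant but unobtainable evaluation: replace it by the equally weighted
    # distribution of the other CoAs' evaluations on the same criterion (mass 1/(m-1)).
    others = [e for a, e in column_j.items() if e is not None]
    distribution_for_a2 = [(value, 1.0 / len(others)) for value in others]
    print(distribution_for_a2)    # [(0.7, 1/3), (0.4, 1/3), (0.9, 1/3)]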

5.0 RELATIVE IMPORTANCE COEFFICIENTS STABILITY ANALYSIS FOR PAMSSEM II

The relative importance coefficients of the criteria play a major role in the outcome of the evaluation and aggregation process. These coefficients are in reality an estimate of the relative importance (πj) that the DM (Commander) gives to each criterion in order to balance his decision. CASAP includes a graphical and comprehensive procedure to set these relative importance coefficients, inspired by the method proposed by [Roy and Figueira, 1998], which is not discussed in this document. However, we cannot completely eliminate the imprecision and vagueness of human judgements. In these circumstances, it is helpful to determine to what extent the solution (ranking) obtained is sensitive to variations of the relative importance coefficients. We therefore introduce a stability analysis for PAMSSEM II. This analysis determines stability intervals for the relative importance coefficient of each criterion, π'j. These intervals represent the limits of variation for each πj without any rank reversal. This analysis can help the Commander of the AOC identify the sensitive factors that can affect the outcomes of the decision analysis process.

Recall that PAMSSEM II leads to a total preorder. The rank of each CoA ai is determined by computing a net flow Φ(ai) for each CoA. The expression of σ(ai, ak) is given by equation (15). We can show that, for quantitative criteria, X_{ik} = \prod_{j=1}^{n} \left(1 - D_j^{3}(a_i, a_k)\right) is independent of πj. For qualitative criteria, we can suppose, without any loss of generality, that Dj(ai, ak) is a constant function of πj. Hence, we can rewrite the expression of σ(ai, ak) as follows:

    \sigma(a_i, a_k) = C(a_i, a_k) \cdot X_{ik} \;\Rightarrow\; \Phi(a_i) = \sum_{j=1}^{n} \pi_j \left[ \sum_{k=1, k \neq i}^{m} \left( \delta_j(a_i, a_k) \cdot X_{ik} - \delta_j(a_k, a_i) \cdot X_{ki} \right) \right]    (20.)

Let Q^{j}_{ik} = \delta_j(a_i, a_k) \cdot X_{ik}, and Q^{j}_{ii} = 0, \forall i, j. Then

    \Phi(a_i) = \sum_{j=1}^{n} \pi_j \cdot \left( \sum_{k=1}^{m} \left( Q^{j}_{ik} - Q^{j}_{ki} \right) \right), \quad \forall i \in \{1, 2, ..., m\}    (21.)

If Φ(ai) ≥ Φ(ak), then ri ≤ rk, where ri and rk are respectively the ranks of ai and ak. The objective of the sensitivity analysis is to determine π'j for each selected criterion in a way that respects the following condition:

    \Phi(a_i) \geq \Phi(a_k) \;\Leftrightarrow\; \Phi'(a_i) \geq \Phi'(a_k)    (22.)

where Φ'(ai) is the net flow computed using π'j. Let \pi'_j = \pi_j + d^{+}_{j} - d^{-}_{j}, where d^{+}_{j} and d^{-}_{j} are respectively the positive and negative deviations of πj (this decomposition was originally suggested in [Wolters and Mareschal, 1995]). The new net flow can be computed as follows:

    \Phi'(a_i) = \Phi(a_i) + \sum_{j=1}^{n} \left( d^{+}_{j} - d^{-}_{j} \right) \cdot \left( \sum_{k=1}^{m} \left( Q^{j}_{ik} - Q^{j}_{ki} \right) \right), \quad \forall i \in \{1, 2, ..., m\}    (23.)

This transformation reduces the problem of finding the intervals of variation of the relative importance coefficients to a problem of maximum deviations. This problem is in reality an optimization problem whose objective function can be written as follows:

    \mathrm{Max} \; \sum_{j=1}^{n} \left( d^{+}_{j} + d^{-}_{j} \right)    (24.)

The first constraints should represent the condition (22). In order to reduce the number of constraints and reduce the computational burden, we suggest ordering the set of CoAs according to their rank obtained by PAMSSEM II. Let (A) be the ordered set of CoAs; then we have

    a_{(i)} \in (A) \;\Rightarrow\; \Phi(a_{(i)}) \geq \Phi(a_{(i+1)}), \quad \forall i \in \{1, 2, ..., m-1\}    (25.)

By ordering the set of CoAs, the condition (22) will be expressed by only m-1 constraints. These first constraints can be formulated as follows:

    \sum_{j=1}^{n} \left( d^{+}_{j} - d^{-}_{j} \right) \cdot \left( \sum_{k=1}^{m} \left( H^{j}_{(i)k} - H^{j}_{(i+1)k} \right) \right) \geq \Phi(a_{(i+1)}) - \Phi(a_{(i)}), \quad \text{for } i = 1, 2, ..., m-1    (26.)

where H^{j}_{(i)k} = Q^{j}_{(i)k} - Q^{j}_{k(i)}. Each π'j should be between 0 and 1, so we have these constraints:

    -\pi_j \leq d^{+}_{j} - d^{-}_{j} \leq 1 - \pi_j, \quad \text{for } j = 1, 2, ..., n    (27.)

To impose maximum deviations, we add the following constraints to the mathematical program:

    \pi_j + d^{+}_{j} \leq 1, \qquad \pi_j - d^{-}_{j} \geq 0, \quad \text{for } j = 1, 2, ..., n    (28.)

The sum of the π'j should be 1. This constraint is given by the following expression:

    \sum_{j=1}^{n} \left( d^{+}_{j} - d^{-}_{j} \right) = 0    (29.)

Finally, we add the following constraints to impose that any increase or decrease of πj should respect the limits authorized on the other criteria:

    d^{+}_{j} - \sum_{k=1, k \neq j}^{n} d^{-}_{k} \leq 0, \qquad d^{-}_{j} - \sum_{k=1, k \neq j}^{n} d^{+}_{k} \leq 0, \quad \text{for } j = 1, 2, ..., n    (30.)

The mathematical program obtained is a linear program that is easy to solve using any solver based on the simplex method. The number of constraints of this program is limited to at most (m + 6n), with only 2n variables. It is also possible to perform a local sensitivity analysis by considering only a sub-set of the criteria. For example, it is possible to perform a local analysis by considering only the risk factor's criteria. In order to implement such functionality, we add the following constraints to the mathematical program for each unselected criterion j:

    d^{+}_{j} = 0, \qquad d^{-}_{j} = 0, \quad \text{for each unselected criterion } j    (31.)
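
As a concrete illustration, the maximum-deviation program (24)-(30) can be assembled and solved with any LP solver. The sketch below (our own, not the CASAP implementation) uses scipy.optimize.linprog with the variable layout x = [d+_1..d+_n, d-_1..d-_n]; the weights π, the ranked net flows Φ and the coefficients G[i][j] = Σk (H^j_(i)k - H^j_(i+1)k) are invented placeholders.

    # Sketch of the maximum-deviation linear program (24)-(30) with scipy.optimize.linprog.
    import numpy as np
    from scipy.optimize import linprog

    pi = np.array([0.5, 0.3, 0.2])               # current relative importance coefficients
    phi = np.array([0.40, 0.10, -0.50])          # net flows of the CoAs, already ranked
    G = np.array([[0.6, -0.2, 0.1],              # one row per consecutive ranked pair (i, i+1)
                  [0.3, 0.4, -0.1]])
    n, m = len(pi), len(phi)

    c = -np.ones(2 * n)                          # maximize sum(d+ + d-)  ->  minimize -sum

    A_ub, b_ub = [], []
    for i in range(m - 1):                       # constraints (26): no rank reversal
        A_ub.append(np.concatenate([-G[i], G[i]]))
        b_ub.append(phi[i] - phi[i + 1])
    for j in range(n):                           # constraints (27): 0 <= pi'_j <= 1
        row = np.zeros(2 * n)
        row[j], row[n + j] = 1.0, -1.0
        A_ub.append(row)
        b_ub.append(1.0 - pi[j])
        A_ub.append(-row)
        b_ub.append(pi[j])
    for j in range(n):                           # constraints (30): deviations compensated elsewhere
        row = np.zeros(2 * n)
        row[j] = 1.0
        row[n:] = -1.0
        row[n + j] = 0.0
        A_ub.append(row)
        b_ub.append(0.0)
        row = np.zeros(2 * n)
        row[n + j] = 1.0
        row[:n] = -1.0
        row[j] = 0.0
        A_ub.append(row)
        b_ub.append(0.0)

    A_eq = [np.concatenate([np.ones(n), -np.ones(n)])]               # constraint (29)
    b_eq = [0.0]
    bounds = [(0.0, 1.0 - w) for w in pi] + [(0.0, w) for w in pi]   # constraint (28), d >= 0

    res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    d_plus, d_minus = res.x[:n], res.x[n:]
    # variation allowed on each pi_j without reversing the PAMSSEM II ranking
    print(list(zip(pi - d_minus, pi + d_plus)))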

6.0 CASAP PROTOTYPE

The prototype CASAP was developed to deal with events of counter-drug operations. CASAP helps the AOC team to describe and share information about such an incident, to develop pertinent CoAs, to evaluate these CoAs and to determine which is most appropriate. This knowledge-based DSS is based on a distributed architecture implemented with JAVA applets and servlets, and can be accessed through an Intranet browser (see Figure 2).

[Figure 2: three CASAP applet clients (Event Descriptor, CoA Descriptor, Commander) communicate over HTTP, transporting JAVA objects, with the CASAP servlets on the server; the server hosts the CoA evaluation component and the databases layer (ATBMS database, CASAP database).]

FIGURE 2 - CASAP structure

The functional architecture of CASAP comprises six modules (Figure 3). One allows a user to describe a counter-drug event and to share this information with other users. It includes a retrieval facility to search out similar past events and manage the events database. A second module assists in the generation and description of CoAs that might be executed to respond to the described event. This module includes the ability to retrieve and duplicate CoAs from past or archived events. Once a satisfactory set of candidate CoAs is generated, CASAP notifies the Commander. Then the selection process begins. To help in this process, a third module evaluates each CoA according to each selected criterion. These criteria are selected and weighted using the fourth module. A screening procedure (using a conjunctive method) is implemented to ensure the quality of the CoAs presented to the Commander.

[Figure 3: the six CASAP modules (1: Event Management; 2: COAs Management; 3: CoAs Evaluation; 4: Criteria Selection and Weighting; 5: Comparison, Analysis and Selection; 6: Post-execution Analysis) and their supporting databases (Events and COAs database, Criteria database, AEDM database), from the violation of Canadian airspace, through the current event description, the possible CoAs and the multicriteria performance table, to execution.]

FIGURE 3 - CASAP Modules

The fifth module makes use of the multicriterion decision aid method and the analysis tools that were previously described to help the DM select the "best" CoA. Moreover, the Commander can communicate with anyone logged on to the CASAP system, either to announce the selection of a specific CoA or to request additional candidate CoAs. After the execution of a particular CoA, a post-analysis module (6) can be used to summarize and manage lessons learned. Although the present system was developed to deal specifically with counter-drug scenarios, this module allows it to be extended easily to other situations by suitable adjustments to the criteria and other parameters. To help the DM in the selection process, CASAP uses many user-friendly interfaces. For example, the ranking of the CoAs is presented using a graphic and different result charts (Figure 4).

FIGURE 4 - PAMSSEM results

Since the evaluations of the CoAs according to the different criteria might include uncertainty, ambiguity, fuzziness, and subjectivity, we developed several analysis tools to help the DM minimize the risk component introduced in the evaluation process. It was then possible to implement the relative importance coefficient stability intervals (see section 5 and Figure 5), and two types of "what if" analysis: i) a "what if" analysis on the CoA evaluations, which consists of answering the following question: "What could happen to the actual result if one or more evaluations of one or many CoAs change?", and ii) a "what if" analysis on the DM's preference thresholds. In Figure 5, the blue chart represents the stability variation limits of the criteria weights for which the ranking will be almost unaffected (no rank reversal).

FIGURE 5 - Stability Analysis Result

7.0 CONCLUSION

This paper describes the work done related to the development and the implementation of a distributed and asynchronous DSS for C2 in a context of airspace violation (counter-drug operations). CASAP was developed based on the MCDA methodology for knowledge engineering and criteria formulation. In order to evaluate and prioritize the different CoAs, a MCAP, called PAMSSEM, was implemented. This MCAP is based on the outranking approach and is able to deal with mixed and missing evaluations. PAMSSEM uses discrimination thresholds with the different criteria to model the DM's preferences. The inter-criteria information is of two types: relative importance coefficients to balance the criteria, as well as veto thresholds for some criteria.

To better assist the DM in analyzing the alternatives, analysis facilities have been developed and included in CASAP. These facilities are related to: i) the relative importance coefficient stability analysis method that has been proposed for PAMSSEM, ii) a "what if" analysis method to analyse the possible effects of modifications of the evaluations, and iii) a "what if" analysis method to deal with the effects of the thresholds. This last analysis allows the DM to see the possible effects of different thresholds on the ranking.

This work allowed us to better define the types of decision aid that could be provided to AOC staff. There is still a lot of validation to be accomplished before being able to qualify or quantify the advantages of such a tool. The criteria and their settings have to be validated to make sure that they represent a coherent family of criteria for the DM. The analysis approaches developed have to be fully validated with users to get a precise idea of the understanding that the DM will have of the analysis results. These validations will allow the prototype to be completed; it will then be used to demonstrate the potential and the effectiveness of such tools in an operational environment. In further publications, we will present in depth the functional architecture of CASAP as well as the implementation developments.

ACKNOWLEDGEMENTS

The authors would like to thank Mr K. Jabeur for the implementation of PAMSSEM's calculation module, Mr K. El-hage and Mr C. Gauthier (Neosapiens inc.) for the implementation of the CASAP modules, as well as Ms S. Leclerc for the implementation of some basic user interfaces.

REFERENCES

1 CAD Op Orders: Volume 3, Book 8, “Aerospace Control and Surveillance”.

1 CAD/CANR. 1999. Operations Centre: Concept of Operations - DP 900.

Abi-Zeid, I., Bélanger, M., Guitouni, A., and Martel, J.-M. 1998. The Criteria for an Advisor Tool in Canadian Airspace Violations Situations, C2 Research and Technology Symposium, Naval Postgraduate School, Monterey, California, 731-739.

Brans, J. P., Mareschal, B., and Vincke, Ph. 1984. PROMETHEE: A New Family of Outranking Methods in Multicriterion Analysis, in J. P. Brans (ed.), Operational Research '84. Elsevier Science Publishers, B.V.: North Holland, 408-421.

Canadian Forces Fighter Group Canadian NORAD Region. 1990. OPLAN 3000-90.

Fighter Group/Canadian NORAD Region. 1992. FG/CANR DP-900, WAR Headquarters Defence PLAN.

Fighter Group: Air Command Defence Plan (ACDP) 580. 1992. OPLAN SPECTRE.

Fishburn, P. C. 1970. Utility Theory for Decision Making. John Wiley and Sons, New York.

Guitouni, A. 1998. L'ingénierie du choix d'une procédure d'agrégation multicritère. Ph.D. Dissertation, Département Opérations et systèmes de décision, Université Laval, Quebec.

Guitouni, A. and Martel, J.-M. 1998. Tentative Guidelines to Help Choosing an Appropriate MCDA Method, European Journal of Operational Research, No 109, 501-521.

Keeney, R. and Raiffa, H. 1976. Decisions with Multiple Objectives: Preferences and Value Trade-Offs. Wiley, New York.

Keeney, R. 1992. Value Focused Thinking: A Path to Creative Decision Making. Cambridge: Harvard University Press.

Martel, J.-M., Kiss, L. N., and Rousseau, M. A. 1996. PAMSSEM : Une procédure d'agrégation multicritère de type surclassement de synthèse pour évaluations mixtes. FSA Working Document, Département Opérations et systèmes de décision, Université Laval, Quebec.

Munda, G. 1995. Multicriterion Evaluation in a Fuzzy Environment. Springer-Verlag, New York.

Roubens, M. and Vincke, Ph. 1985. Preference Modelling. Springer-Verlag, Berlin.

Rousseau, A. and Martel, J.-M. 1994. Environmental Evaluation of an Electric Transmission Line Project: An MCDM Method, in M. Paruccini (ed.), Applying Multicriterion Decision Aid for Environmental Management. Kluwer Academic Publishers.

Roy, B. 1978. ELECTRE III: un algorithme de classement fondé sur une représentation floue des préférences en présence de critères multiples, Cahiers du Centre d'Études de Recherche Opérationnelle, Vol 1, No 20, 3-24.

Roy, B. and Bouyssou, D. 1993. Aide Multicritère à la Décision : Méthodes et Cas. Economica, Paris.

Roy, B. and Figueira, J. 1998. Détermination des poids des critères dans les méthodes de type ELECTRE avec la méthode de SIMOS révisée. Document du Lamsade, No 109, Université Paris-Dauphine.

Saaty, T. L. 1980. The Analytic Hierarchy Process. McGraw-Hill, New York.

Siskos, J. 1982. A Way to Deal with Fuzzy Preferences in Multi-Criteria Decision Problems, European Journal of Operational Research, No 10, 314-324.

Siskos, Y. and Spyridakos, A. 1999. Intelligent Multicriteria Decision Support: Overview and Perspective, European Journal of Operational Research, No 113, 314-324.

Vincke, Ph. 1989. L'aide multicritère à la décision. Éditions de l'Université de Bruxelles, Brussels.

Wolters, W.T.M. and Mareschal, B. 1995. Novel Types of Sensitivity Analysis for Additive MCDM Methods, European Journal of Operational Research, No 81, 281-290.

