Page 1: CHAPTER 6 (handout)

1

CHAPTER 6 (handout): Decision Trees

Page 2: CHAPTER 6 (handout)

2

6.1. Introduction

Sequential decision making
• A sequence of chance-dependent decisions
• Presentation of the analysis can be complex

Decision trees
• A pictorial device to represent the problem and its calculations
• Useful for problems with a small number of sequential decisions

Page 3: CHAPTER 6 (handout)

3

6.3. Another Decision Tree Ex.

Two externally identical boxes; we must decide which box we have.
• a1: guess box 1 (contains 6 black balls, 4 white balls)
• a2: guess box 2 (contains 8 black balls, 2 white balls)

• Correct guess: receive $100
• Wrong guess: receive $0

Prior probabilities: P(θ1) = 0.5, P(θ2) = 0.5 (θ1 = it is box 1, θ2 = it is box 2)

Page 4: CHAPTER 6 (handout)

4

Decision Tree

A connected set of nodes & arcs

• Nodes: join arcs
• Arcs: have direction (left to right)
• Branch: an arc and all elements that follow it
• Two branches from the same initial node cannot have elements in common
• Two nodes cannot be joined by more than one arc

Page 5: CHAPTER 6 (handout)

5

Example of a Decision Tree

Page 6: CHAPTER 6 (handout)

6

A diagram which is not a tree

Page 7: CHAPTER 6 (handout)

7

Types of nodes

• Decision point: choose the next action (branch)
• Chance node: an uncontrollable probabilistic event
• Terminal node: specifies the final payoff

Page 8: CHAPTER 6 (handout)

8

Example of a Sequential Decision Problem: Car Exchange Problem

A person must decide whether to keep his car or exchange it at a showroom. There are 2 decisions:
• a1: keep, cost = 1400 SR
• a2: exchange, with 2 possibilities:
  • good buy, P(G) = 0.6, cost = 1200 SR
  • bad buy, P(B) = 0.4, cost = 1600 SR

A good or bad buy can be identified only after buying and using the car. What should he do to minimize his expected cost?

Page 9: CHAPTER 6 (handout)

9

Car Exchange Problem (no information)

Payoff (Cost) Matrix

                      P(θ)   a1: keep   a2: exchange
θ1: Good buy           0.6     1400         1200
θ2: Bad buy            0.4     1400         1600
Expected value (EV)             1400         1360
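As a quick check of the expected values above, here is a minimal Python sketch (the dictionary names are illustrative, not from the handout) that computes the expected cost of each action from the prior probabilities:

```python
# Expected cost of each action in the car exchange problem (no information).
priors = {"good": 0.6, "bad": 0.4}

# Cost of each action under each state of nature (SR).
costs = {
    "a1_keep":     {"good": 1400, "bad": 1400},
    "a2_exchange": {"good": 1200, "bad": 1600},
}

for action, by_state in costs.items():
    ev = sum(priors[s] * c for s, c in by_state.items())
    print(action, ev)   # a1_keep 1400.0, a2_exchange 1360.0
```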

Page 10: CHAPTER 6 (handout)

10

Car exchange decision tree

• Keep: G 0.6 → $1400; B 0.4 → $1400
• Exchange: G 0.6 → $1200; B 0.4 → $1600

Page 11: CHAPTER 6 (handout)

11

Car exchange decision tree

• Keep: G 0.6 → $1400; B 0.4 → $1400; EV = $1400
• Exchange: G 0.6 → $1200; B 0.4 → $1600; EV = $1360

Page 12: CHAPTER 6 (handout)

12

6.2. A Sequential Test Problem: Car Exchange Problem

Assume the person has 5 options for deciding whether to keep or exchange his car.

(i) Decide without extra information
(ii) Decide on the basis of a free road (driving) test
(iii) Decide after an oil consumption test costing $25
(iv) Decide after a combined road/oil test costing $10
(v) Decide sequentially: road test first, then possibly an oil test costing $10

In (iv), both tests must be taken.
In (v), the oil test is optional, depending on the road test result.

Page 13: CHAPTER 6 (handout)

13

Car Exchange Problem (with information)

• The decision tree is complicated; it cannot fit in 1 slide
• It has 5 main branches, one per option
• Probabilities after extra information are conditional (posterior)
• To illustrate, we follow the branch of option (v): road test, then, depending on the result, a possible oil test costing $10

Page 14: CHAPTER 6 (handout)

14

Car Exchange Problem (with information)

Result of road test:
• y1: fair, p(y1) = 0.5
• y2: poor, p(y2) = 0.5

Result of oil consumption test:
• Z1: high, p(Z1|y)
• Z2: medium, p(Z2|y)
• Z3: low, p(Z3|y)

Page 15: CHAPTER 6 (handout)

15

Car exchange decision tree (with information)

Road test → y1: 0.5 or y2: 0.5
After each road-test result: either No test, or Oil test with outcomes Z1, Z2, Z3

Page 16: CHAPTER 6 (handout)

16

Car exchange decision tree with information (y1 branch)

y1: 0.5
• No test: probabilities (G 0.6, B 0.4); payoffs a1 → 1400, 1400; a2 → 1200, 1600
• Oil test ($10 test cost included in payoffs):
  • Z1: 0.28, posteriors (G 0.43, B 0.57); a1 → 1410, 1410; a2 → 1210, 1610
  • Z2: 0.24, posteriors (G 0.50, B 0.50); a1 → 1410, 1410; a2 → 1210, 1610
  • Z3: 0.48, posteriors (G 0.75, B 0.25); a1 → 1410, 1410; a2 → 1210, 1610

Page 17: CHAPTER 6 (handout)

17

Car exchange decision tree with information (y1 branch)

y1: 0.5
• No test (G 0.6, B 0.4): a1 EV = 1400; a2 EV = 1360 → choose a2, 1360
• Oil test ($10 test cost included in payoffs):
  • Z1: 0.28, posteriors (G 0.43, B 0.57): a1 EV = 1410; a2 EV = 1439 → choose a1, 1410
  • Z2: 0.24, posteriors (G 0.50, B 0.50): a1 EV = 1410; a2 EV = 1410 → 1410
  • Z3: 0.48, posteriors (G 0.75, B 0.25): a1 EV = 1410; a2 EV = 1310 → choose a2, 1310
  • Oil-test EV = 0.28(1410) + 0.24(1410) + 0.48(1310) = 1362
• Best after y1: No test, EV = 1360

Page 18: CHAPTER 6 (handout)

18

Car exchange decision tree with information (y2 branch)

y2: 0.5
• No test: probabilities (G 0.4, B 0.6); payoffs a1 → 1400, 1400; a2 → 1200, 1600
• Oil test ($10 test cost included in payoffs):
  • Z1: 0.32, posteriors (G 0.25, B 0.75); a1 → 1410, 1410; a2 → 1210, 1610
  • Z2: 0.26, posteriors (G 0.31, B 0.69); a1 → 1410, 1410; a2 → 1210, 1610
  • Z3: 0.42, posteriors (G 0.57, B 0.43); a1 → 1410, 1410; a2 → 1210, 1610

Page 19: CHAPTER 6 (handout)

19

Car exchange decision tree with information (y2 branch)

y2: 0.5
• No test (G 0.4, B 0.6): a1 EV = 1400; a2 EV = 1440 → choose a1, 1400
• Oil test ($10 test cost included in payoffs):
  • Z1: 0.32, posteriors (G 0.25, B 0.75): a1 EV = 1410; a2 EV = 1510 → choose a1, 1410
  • Z2: 0.26, posteriors (G 0.31, B 0.69): a1 EV = 1410; a2 EV = 1487 → choose a1, 1410
  • Z3: 0.42, posteriors (G 0.57, B 0.43): a1 EV = 1410; a2 EV = 1381 → choose a2, 1381
  • Oil-test EV = 0.32(1410) + 0.26(1410) + 0.42(1381) = 1398
• Best after y2: Oil test, EV = 1398

Page 20: CHAPTER 6 (handout)

20

Decision Tree Calculations

• The tree is developed from left to right
• Calculations are made from right to left (rollback); a sketch of this rollback is given below
• Many calculations are redundant: they belong to inferior solutions and are not needed in the final solution
• Probabilities after extra information (road or oil tests) are conditional (posterior), calculated by Bayes' theorem
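To make the right-to-left procedure concrete, here is a minimal Python sketch of rollback on the option (v) subtree for the y1 road-test result (the node representation and names are illustrative; the payoffs and posteriors are the ones on the earlier pages). Decision nodes take the minimum expected cost; chance nodes take the probability-weighted average:

```python
# Rollback (backward induction) on a small cost-minimization tree.
# A node is either ("decision", {label: subtree}), ("chance", [(prob, subtree), ...]),
# or a terminal numeric cost.

def rollback(node):
    if isinstance(node, (int, float)):          # terminal payoff (cost)
        return node
    kind, body = node
    if kind == "decision":                      # choose the cheapest branch
        return min(rollback(sub) for sub in body.values())
    return sum(p * rollback(sub) for p, sub in body)   # chance node: expected cost

# y1 branch of the car exchange problem (oil test cost $10 already in the payoffs).
def act(p_good, cost_good, cost_bad, cost_keep):
    return ("decision", {
        "a1_keep": cost_keep,
        "a2_exchange": ("chance", [(p_good, cost_good), (1 - p_good, cost_bad)]),
    })

y1_branch = ("decision", {
    "no_test": act(0.6, 1200, 1600, 1400),
    "oil_test": ("chance", [
        (0.28, act(0.43, 1210, 1610, 1410)),   # Z1
        (0.24, act(0.50, 1210, 1610, 1410)),   # Z2
        (0.48, act(0.75, 1210, 1610, 1410)),   # Z3
    ]),
})

print(rollback(y1_branch))   # 1360.0 (no test is best after y1; oil test gives 1362)
```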

Page 21: CHAPTER 6 (handout)

21

Initial Payoff Data (no information)

Payoff (Reward) Matrix

                      P(θ)   a1: Box 1   a2: Box 2
θ1: it is Box 1        0.5      100           0
θ2: it is Box 2        0.5        0         100
Expected value (EV)               50          50

Page 22: CHAPTER 6 (handout)

22

Initial Probability Data (no information)

Prior Probability Matrix

                P(θ)   B: Black   W: White
θ1: Box 1        0.5     0.6        0.4
θ2: Box 2        0.5     0.8        0.2

Page 23: CHAPTER 6 (handout)

23

Decision tree without information

• a1: Box 1 → θ1 0.5: $100; θ2 0.5: $0; EV = $50
• a2: Box 2 → θ1 0.5: $0; θ2 0.5: $100; EV = $50

Page 24: CHAPTER 6 (handout)

24

Decision Tree Example with information

• Samples may be drawn from the box
• Each sampled ball is returned to the box (sampling with replacement)
• Up to 2 samples are allowed
• Cost = $3 per sample

What is the optimal plan?

Page 25: CHAPTER 6 (handout)

25

Posterior probabilities for sample 1

Probability Calculations

        P(θ)   P(B|θ)   P(W|θ)   Joint B   Joint W   P(θ|B)   P(θ|W)
θ1:      0.5     0.6      0.4       0.3       0.2      0.43     0.67
θ2:      0.5     0.8      0.2       0.4       0.1      0.57     0.33
Total    1.0                        0.7       0.3      1.00     1.00
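The same table can be produced with a few lines of Python. This is a minimal sketch of the Bayes' theorem update used throughout the handout (function and variable names are illustrative):

```python
# Posterior P(theta | observation) from priors and likelihoods (Bayes' theorem).
def posterior(priors, likelihoods):
    joint = {theta: priors[theta] * likelihoods[theta] for theta in priors}
    total = sum(joint.values())                 # marginal probability of the observation
    return {theta: joint[theta] / total for theta in joint}

priors = {"box1": 0.5, "box2": 0.5}
p_black = {"box1": 0.6, "box2": 0.8}            # P(B | theta)
p_white = {"box1": 0.4, "box2": 0.2}            # P(W | theta)

print(posterior(priors, p_black))   # box1 ≈ 0.43, box2 ≈ 0.57
print(posterior(priors, p_white))   # box1 ≈ 0.67, box2 ≈ 0.33
```

Feeding a posterior back in as the new prior reproduces the sample 2 tables on the following pages; for example, posterior(posterior(priors, p_black), p_black) gives 0.36 / 0.64.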

Page 26: CHAPTER 6 (handout)

26

Decision tree with information

• No information (no sample): choose a1 or a2, EV = $50
• Sample 1 ($3): outcome B: 0.7 or W: 0.3; after each outcome, either take no further sample and choose a1 or a2, or take Sample 2 ($3)

Page 27: CHAPTER 6 (handout)

27

Posterior probabilities for sample 2 when sample 1 is Black

Probability Calculations

        P(θ)   P(B|θ)   P(W|θ)   Joint B   Joint W   P(θ|B)   P(θ|W)
θ1:      0.43    0.6      0.4       0.26      0.17     0.36     0.61
θ2:      0.57    0.8      0.2       0.46      0.11     0.64     0.39
Total    1.0                        0.72      0.28     1.00     1.00

Page 28: CHAPTER 6 (handout)

28

Sample 1 Black, No Sample 2

After a black sample 1, posteriors are θ1: 0.43, θ2: 0.57 and net payoffs are $97 / $-3.
• No 2nd sample: a1 EV = 0.43(97) + 0.57(-3) = 40; a2 EV = 0.43(-3) + 0.57(97) = 54 → choose a2, $54
• Sample 2 (outcomes B: 0.72, W: 0.28) is evaluated on the following pages

Page 29: CHAPTER 6 (handout)

29

Samples 1 & 2 Both Black

After two black samples, posteriors are θ1: 0.36, θ2: 0.64 and net payoffs are $94 / $-6.
• a1 EV = 0.36(94) + 0.64(-6) = 30; a2 EV = 0.36(-6) + 0.64(94) = 58 → choose a2, $58
• (Stopping after the first black sample was worth $54.)

Page 30: CHAPTER 6 (handout)

30

Sample 1 Black, Sample 2 White

After a black then a white sample, posteriors are θ1: 0.61, θ2: 0.39 and net payoffs are $94 / $-6.
• a1 EV = 0.61(94) + 0.39(-6) = 55; a2 EV = 0.61(-6) + 0.39(94) = 33 → choose a1, $55
• Value of taking sample 2 after a black sample 1 = 0.72(58) + 0.28(55) = 57.16, which beats stopping at $54

Page 31: CHAPTER 6 (handout)

31

Posterior probabilities for sample 2 when sample 1 is White

Probability Calculations

        P(θ)   P(B|θ)   P(W|θ)   Joint B   Joint W   P(θ|B)   P(θ|W)
θ1:      0.67    0.6      0.4       0.40      0.27     0.61     0.79
θ2:      0.33    0.8      0.2       0.26      0.07     0.39     0.21
Total    1.0                        0.66      0.34     1.00     1.00

Page 32: CHAPTER 6 (handout)

32

Sample 1 White, No Sample 2

After a white sample 1, posteriors are θ1: 0.67, θ2: 0.33 and net payoffs are $97 / $-3.
• No 2nd sample: a1 EV = 0.67(97) + 0.33(-3) = 64; a2 EV = 0.67(-3) + 0.33(97) = 30 → choose a1, $64
• Sample 2 (outcomes B: 0.66, W: 0.34) is evaluated on the following pages

Page 33: CHAPTER 6 (handout)

33

Sample 1 White, Sample 2 Black

After a white then a black sample, posteriors are θ1: 0.61, θ2: 0.39 and net payoffs are $94 / $-6.
• a1 EV = 0.61(94) + 0.39(-6) = 55; a2 EV = 0.61(-6) + 0.39(94) = 33 → choose a1, $55
• (Stopping after the first white sample was worth $64.)

Page 34: CHAPTER 6 (handout)

34

Samples 1 & 2 Both White

After two white samples, posteriors are θ1: 0.79, θ2: 0.21 and net payoffs are $94 / $-6.
• a1 EV = 0.79(94) + 0.21(-6) = 73; a2 EV = 0.79(-6) + 0.21(94) = 15 → choose a1, $73
• Value of taking sample 2 after a white sample 1 = 0.66(55) + 0.34(73) = 61.12, which is less than stopping at $64

Page 35: CHAPTER 6 (handout)

35

Decision tree summary of results

Boxes: a1 guesses the 6B, 4W box; a2 guesses the 8B, 2W box.
• No information (no sample): $50
• Sample 1:
  • B: 0.7 → take sample 2 (worth $57.2 vs. $54 for stopping); then B, 0.72: choose a2; W, 0.28: choose a1
  • W: 0.3 → no 2nd sample, choose a1 ($64 vs. $61.1 for sampling again)
• EV of the sampling plan = 0.7(57.2) + 0.3(64) ≈ 59.2 → take sample 1
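The whole sequential plan above can be reproduced with the same rollback idea, here maximizing expected net payoff. This is a minimal sketch (the helper names are illustrative, not from the handout); it keeps exact posteriors, so it prints 58.9, while the handout's 59.2 comes from rounding the posteriors to two decimals at each step:

```python
# Optimal sequential sampling plan for the two-box problem (at most 2 samples, $3 each).
P_BLACK = {"box1": 0.6, "box2": 0.8}   # P(black ball | true box)

def posterior(prior, color):
    like = {b: (P_BLACK[b] if color == "B" else 1 - P_BLACK[b]) for b in prior}
    joint = {b: prior[b] * like[b] for b in prior}
    total = sum(joint.values())
    return total, {b: joint[b] / total for b in joint}

def best_guess(prior, spent):
    # Expected net payoff of guessing the more likely box and stopping.
    return max(prior.values()) * 100 - spent

def value(prior, spent, samples_left):
    stop = best_guess(prior, spent)
    if samples_left == 0:
        return stop
    # Expected value of taking one more $3 sample, then continuing optimally.
    sample = 0.0
    for color in ("B", "W"):
        p_obs, post = posterior(prior, color)
        sample += p_obs * value(post, spent + 3, samples_left - 1)
    return max(stop, sample)

print(round(value({"box1": 0.5, "box2": 0.5}, 0, 2), 1))
# 58.9 (the handout's 59.2 uses posteriors rounded to 2 decimals at each step)
```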

Page 36: CHAPTER 6 (handout)

36

Decision Tree with Fixed Costs

Example of a fixed cost:
• sampling cost = $3 per sample in the previous example

If the objective is to maximize expected payoff, constant costs can be deducted either from:
• the terminal node payoffs, or
• the expected values

Page 37: CHAPTER 6 (handout)

37

Example: Including fixed costs

Sample 1 Black, cost = $3. The two ways of handling the cost give the same result:
• Deduct from the expected value: with gross payoffs $100 / $0 and posteriors (θ1: 0.43, θ2: 0.57), a1 EV = 43, then 43 − 3 = 40
• Deduct from the terminal payoffs: with net payoffs $97 / $-3, a1 EV = 0.43(97) + 0.57(-3) = 40
(Recall the earlier Sample 1 Black calculation.)
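A quick check of this equivalence in Python (illustrative sketch using the slide's numbers):

```python
p = {"theta1": 0.43, "theta2": 0.57}
gross = {"theta1": 100, "theta2": 0}          # a1 payoffs before the $3 sample cost
cost = 3

ev_minus_cost = sum(p[t] * gross[t] for t in p) - cost          # deduct from the EV
ev_of_net     = sum(p[t] * (gross[t] - cost) for t in p)        # deduct from each payoff
print(round(ev_minus_cost, 2), round(ev_of_net, 2))   # 40.0 40.0 (identical, by linearity)
```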

Page 38: CHAPTER 6 (handout)

38

Fixed Costs & Utilities

Utilities can be used instead of payoffs. If the objective is to maximize expected utility:
• constant costs must be deducted from the terminal node payoffs
• net payoffs are converted to net utilities
• expected values are taken of the utilities of the net payoffs

Page 39: CHAPTER 6 (handout)

39

Including fixed costs

Sample 1 Black, cost = $3; posteriors (θ1: 0.43, θ2: 0.57):
• Incorrect: take the EU of U(100) and U(0), then subtract U(3): EU − U(3)
• Correct: convert to net payoffs first and take the EU of U(97) and U(−3)

Page 40: CHAPTER 6 (handout)

40

Allowing an optional 3rd sample

• Suppose now a 3rd sample is allowed
• Sample cost = $3
• Assume the decision whether or not to take sample 3 depends on the results of samples 1 and 2

What is the optimal plan?

Page 41: CHAPTER 6 (handout)

41

Posterior probabilities for sample 3

After 2 black samples (posteriors 0.36 / 0.64 from Page 27):

        P(θ)   P(B|θ)   P(W|θ)   Joint B   Joint W   P(θ|B)   P(θ|W)
θ1:      0.36    0.6      0.4       0.22      0.14     0.30     0.52
θ2:      0.64    0.8      0.2       0.51      0.13     0.70     0.48
Total    1.0                        0.73      0.27     1.00     1.00

Page 42: CHAPTER 6 (handout)

42

Decision tree with optional sample 3

Same structure as the earlier summary, with a Sample 3 / No 3rd sample choice added after each sample 2 outcome:
• No sample: $50; Sample 1 → B: 0.7 or W: 0.3
• After B: No 2nd sample ($54) or Sample 2 ($57.2), now followed by the Sample 3 / No 3rd sample choice
• After W: No 2nd sample ($64) or Sample 2 ($61.1), now followed by the Sample 3 / No 3rd sample choice

Page 43: CHAPTER 6 (handout)

43

Fixing the number of samples

• Suppose now a 3rd sample is allowed
• Sample cost = $3
• Assume we must decide the number of samples in advance: 0, 1, 2, or 3

What is the optimal plan?

Page 44: CHAPTER 6 (handout)

44

Zero samples

• a1: Box 1 → θ1 0.5: $100; θ2 0.5: $0; EV = $50
• a2: Box 2 → θ1 0.5: $0; θ2 0.5: $100; EV = $50
No samples: EV = $50

Page 45: CHAPTER 6 (handout)

45

One Sample

Sample once (cost $3, net payoffs $97 / $-3):
• B: 0.7, posteriors (θ1 0.43, θ2 0.57): a1 EV = 40; a2 EV = 54 → choose a2, $54
• W: 0.3, posteriors (θ1 0.67, θ2 0.33): a1 EV = 64; a2 EV = 30 → choose a1, $64
• EV of one sample = 0.7(54) + 0.3(64) = 57

Page 46: CHAPTER 6 (handout)

46

Posterior probabilities for 2 samples

Examples (likelihoods given θ1; BW means one black and one white in either order):
• P(BB|θ1) = 0.6(0.6) = 0.36
• P(BW|θ1) = P(BW) + P(WB) = 0.6(0.4) + 0.4(0.6) = 0.48
• P(WW|θ1) = 0.4(0.4) = 0.16

        P(θ)   P(BB|θ)   P(BW|θ)   P(WW|θ)   Joint BB   Joint BW   Joint WW
θ1:      0.5     0.36       0.48      0.16      0.18       0.24       0.08
θ2:      0.5     0.64       0.32      0.04      0.32       0.16       0.02
Total                                           0.50       0.40       0.10

Posteriors: P(θ1|BB) = 0.36, P(θ1|BW) = 0.60, P(θ1|WW) = 0.80
            P(θ2|BB) = 0.64, P(θ2|BW) = 0.40, P(θ2|WW) = 0.20
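These tables (and the corresponding ones for 3 samples on Page 48) can be generated for any fixed number of samples from the binomial likelihoods. A minimal Python sketch (names are illustrative):

```python
from math import comb

# Posterior P(box | k black balls in n draws with replacement), uniform prior.
def posteriors_after(n):
    p_black = {"box1": 0.6, "box2": 0.8}
    prior = {"box1": 0.5, "box2": 0.5}
    result = {}
    for k in range(n + 1):                       # k = number of black balls observed
        joint = {b: prior[b] * comb(n, k) * p_black[b] ** k * (1 - p_black[b]) ** (n - k)
                 for b in prior}
        total = sum(joint.values())
        result[k] = {b: round(joint[b] / total, 2) for b in joint}
    return result

print(posteriors_after(2))
# {0: {'box1': 0.8, 'box2': 0.2}, 1: {'box1': 0.6, 'box2': 0.4}, 2: {'box1': 0.36, 'box2': 0.64}}
```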

Page 47: CHAPTER 6 (handout)

47

Two Samples

Sample twice (cost $6, net payoffs $94 / $-6):
• BB: 0.5, posteriors (θ1 0.36, θ2 0.64): a1 EV = 30; a2 EV = 58 → choose a2, $58
• BW: 0.4, posteriors (θ1 0.60, θ2 0.40): a1 EV = 54; a2 EV = 34 → choose a1, $54
• WW: 0.1, posteriors (θ1 0.80, θ2 0.20): a1 EV = 74; a2 EV = 14 → choose a1, $74
• EV of two samples = 0.5(58) + 0.4(54) + 0.1(74) = 58

Page 48: CHAPTER 6 (handout)

48

Posterior probabilities for 3 samples

Likelihoods given θ1:
• P(BBB|θ1) = 0.6(0.6)(0.6) = 0.216
• P(BBW|θ1) = P(BBW) + P(BWB) + P(WBB) = 3(0.6)(0.6)(0.4) = 0.432
• P(BWW|θ1) = P(BWW) + P(WBW) + P(WWB) = 3(0.6)(0.4)(0.4) = 0.288
• P(WWW|θ1) = 0.4(0.4)(0.4) = 0.064

        P(θ)    BBB     BBW     BWW     WWW    Joint BBB   Joint BBW   Joint BWW   Joint WWW
θ1:      0.5   0.216   0.432   0.288   0.064     0.108       0.216       0.144       0.032
θ2:      0.5   0.512   0.384   0.096   0.008     0.256       0.192       0.048       0.004
Total                                            0.364       0.408       0.192       0.036

Posteriors: P(θ1|·) = 0.30, 0.53, 0.75, 0.89
            P(θ2|·) = 0.70, 0.47, 0.25, 0.11

Page 49: CHAPTER 6 (handout)

49

Three Samples

Sample 3 times (cost $9, net payoffs $91 / $-9):
• BBB: 0.36, posteriors (θ1 0.30, θ2 0.70): a1 EV = 21; a2 EV = 61 → choose a2, $61
• BBW: 0.41, posteriors (θ1 0.53, θ2 0.47): a1 EV = 44; a2 EV = 38 → choose a1, $44
• BWW: 0.19, posteriors (θ1 0.75, θ2 0.25): a1 EV = 66; a2 EV = 16 → choose a1, $66
• WWW: 0.04, posteriors (θ1 0.89, θ2 0.11): a1 EV = 80; a2 EV = 2 → choose a1, $80
• EV of three samples = 0.36(61) + 0.41(44) + 0.19(66) + 0.04(80) ≈ 55.7

Page 50: CHAPTER 6 (handout)

50

Summary of results with fixed number of samples

• 0 samples: $50
• 1 sample:  $57
• 2 samples: $58
• 3 samples: $55.7
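The four values above can be computed directly. A minimal sketch under the same assumptions as before (uniform prior, $100 for a correct guess, $3 per sample), using the binomial likelihoods:

```python
from math import comb

P_BLACK = {"box1": 0.6, "box2": 0.8}
PRIOR = {"box1": 0.5, "box2": 0.5}

def ev_fixed_samples(n):
    """Expected net payoff when exactly n samples are taken, then the best box is guessed."""
    total = 0.0
    for k in range(n + 1):                              # k black balls out of n
        joint = {b: PRIOR[b] * comb(n, k) * P_BLACK[b] ** k * (1 - P_BLACK[b]) ** (n - k)
                 for b in PRIOR}
        p_obs = sum(joint.values())
        best = max(joint.values()) / p_obs              # posterior prob. of the best guess
        total += p_obs * (100 * best - 3 * n)
    return total

for n in range(4):
    print(n, round(ev_fixed_samples(n), 1))
# 0 50.0, 1 57.0, 2 58.0, 3 55.8 (the handout's 55.7 reflects rounded intermediate values)
```

Two samples remain the best fixed plan, matching the summary above.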

Page 51: CHAPTER 6 (handout)

51

Value of a Sample (New Information)

Results of the previous example:
• with sequential samples (summary on Page 35)
• with a fixed number of samples (summary on Page 50)
A 3rd sample is never needed.

Questions:
• How many samples should be taken?
• Is it better to decide immediately or after more information?

Page 52: CHAPTER 6 (handout)

52

Expected Value of Information

Assume P(θ1) = p, P(θ2) = 1 - p. Then:

        P(θ)    P(B|θ)   P(W|θ)   Joint B      Joint W
θ1:      p       0.6      0.4      0.6p         0.4p
θ2:     1-p      0.8      0.2      0.8(1-p)     0.2(1-p)
Total    1.0                       (4-p)/5      (1+p)/5

Posteriors:
P(θ1|B) = 3p/(4-p)        P(θ1|W) = 2p/(1+p)
P(θ2|B) = 4(1-p)/(4-p)    P(θ2|W) = (1-p)/(1+p)

Page 53: CHAPTER 6 (handout)

53

Expected payoff

Best expected payoff if Black is observed = 100 max{3p/(4-p), 4(1-p)/(4-p)}
Best expected payoff if White is observed = 100 max{2p/(1+p), (1-p)/(1+p)}

Expected outcome:
F(p) = 100 (4-p)/5 max{3p/(4-p), 4(1-p)/(4-p)} + 100 (1+p)/5 max{2p/(1+p), (1-p)/(1+p)}
F(p) = 100 [ max{0.6p, 0.8(1-p)} + max{0.4p, 0.2(1-p)} ]
F(p) = max{60p, 80(1-p)} + max{40p, 20(1-p)}
F(p) = max{a, b} + max{c, d}, where a = 60p, b = 80(1-p), c = 40p, d = 20(1-p)
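A small numerical check of F(p) (illustrative sketch; the breakpoints p = 1/3 and p = 4/7 shown in the graph on the next page are where the max terms switch):

```python
# Expected payoff F(p) of deciding after one free sample, prior P(box1) = p.
def F(p):
    return max(60 * p, 80 * (1 - p)) + max(40 * p, 20 * (1 - p))

def no_info(p):
    return 100 * max(p, 1 - p)        # expected payoff of guessing without sampling

for p in (0.2, 1 / 3, 0.45, 4 / 7, 0.8):
    print(round(p, 2), round(F(p), 1), round(no_info(p), 1))
# F(p) exceeds the no-information payoff only for 1/3 < p < 4/7.
```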

Page 54: CHAPTER 6 (handout)

54

Graph of expected payoff

[Figure: piecewise-linear plot of F(p) for 0 ≤ p ≤ 1, with breakpoints at p = 1/3 and p = 4/7; the vertical axis is marked at 80 and 100]

Page 55: CHAPTER 6 (handout)

55

Maximum Expected Payoff

To maximize F(p) on 0 < p < 1, the graphical solution gives:
• 0 < p < 1/3:   F(p) = 100(1 - p)   (= b + d)
• 1/3 < p < 4/7: F(p) = 80 - 40p     (= b + c)
• 4/7 < p < 1:   F(p) = 100p         (= a + c)

For the 1st and 3rd ranges, the solution is the same as the expected payoff given only P(θ1) = p, P(θ2) = 1 - p.
Only the 2nd range improves the expected payoff, so a sample should be taken only if 1/3 < p < 4/7.

Page 56: CHAPTER 6 (handout)

56

Expected Value of Sample Information

Value of sample information = expected improvement in payoff
= 80 - 40p - (100 - 100p) = 60p - 20,   0 < p < 0.5
= 80 - 40p - 100p = 80 - 140p,          0.5 < p < 1

Page 57: CHAPTER 6 (handout)

57

Range of p for sample cost = 3

For a sample cost of $3, the sample should be taken only if the improvement is > 3:
• 60p - 20 > 3  ⟹  p > 23/60 ≈ 0.383
• 80 - 140p > 3  ⟹  p < 77/140 = 0.55

Thus, 0.383 < p < 0.55
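A quick check of this range in Python (sketch; it evaluates the EVSI formulas from the previous page against the $3 cost):

```python
# Range of priors p for which one $3 sample is worth taking.
def evsi(p):
    # Expected value of sample information from the previous page.
    return 60 * p - 20 if p < 0.5 else 80 - 140 * p

cost = 3
lower = (20 + cost) / 60      # solve 60p - 20 = cost
upper = (80 - cost) / 140     # solve 80 - 140p = cost
print(round(lower, 3), round(upper, 3))          # 0.383 0.55
print(evsi(0.45) > cost, evsi(0.3) > cost)       # True False
```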

Page 58: CHAPTER 6 (handout)

58

For fixed no. of samples

Posteriors after 2 samples (from Page 46):

                BB     BW     WW
P(θ1) = p      0.36   0.60   0.80

Since all of these probabilities fall outside the range 0.383 < p < 0.55, a 3rd sample should not be taken.

Page 59: CHAPTER 6 (handout)

59

How many samples?

So far, the analysis gives the value of 1 sample; we can estimate the value of several samples.

Maximum number of samples:
• expected payoff with no information = 50
• payoff with perfect information = 100
• maximum number of worthwhile samples = (100 - 50)/3 = 16.7, i.e., at most 16

