Page 1:

Forecasting and Decision Making Under Uncertainty

Thomas R. Stewart, Ph.D.
Center for Policy Research

Rockefeller College of Public Affairs and Policy
University at Albany, State University of New York

[email protected]

Warning Decision Making II Workshop

Page 2:

Outline

• Uncertainty
• Decision making and judgment
• Inevitable error
• Problem 1: Choosing the warn/no warn cutoff
• Problem 2: Reducing error by improving forecast accuracy

Page 3:

Uncertainty

• Uncertainty occurs when, given current knowledge, there are multiple possible states of nature.

Page 4:

Probability is a measure of uncertainty

• Relative frequency

• Subjective probability (Bayesian)

Page 5:

Uncertainty

• Uncertainty 1 - States (events) and probabilities of those events are known
  – Coin toss
  – Dice toss
  – Precipitation forecasting (approximately)

Page 6:

Uncertainty

• Uncertainty 2 - States (events) are known, probabilities are unknown
  – Elections
  – Stock market
  – Forecasting severe weather

Page 7:

Uncertainty

• Uncertainty 3 - States (events) and probabilities are unknown
  – Y2K
  – Global climate change

• The differences among the types of uncertainty are a matter of degree.

Page 8:

Picturing uncertainty

• There are many ways to depict uncertainty. For example:
  – Continuous events: scatterplot
  – Discrete events: decision table

[Scatterplot: forecast (0-100) vs. tomorrow's actual weather event (0-100)]

Decision table:

                             Forecast for tomorrow's weather
Tomorrow's actual weather    No rain for tomorrow    Rain for tomorrow
Rain                                   6                    14
No rain                               71                     9

Page 9:

Scatterplot: Correlation = .50

Page 10:

Scatterplot: Correlation = .20

Page 11:

Scatterplot: Correlation = .80

Page 12:

Scatterplot: Correlation = 1.00

The perfect forecast

Page 13:

Decision table: Data for an imperfect categorical forecast over 100 days (uncertainty)

                             Forecast for tomorrow's weather
Tomorrow's actual weather    No rain for tomorrow    Rain for tomorrow
Rain                                   6                    14
No rain                               71                     9

Base rate = 20/100 = .20

Page 14:

Decision table terminology: Data for an imperfect categorical forecast over 100 days (uncertainty)

                             Forecast for tomorrow's weather
Tomorrow's actual weather    No rain for tomorrow       Rain for tomorrow
                             (negative forecast)        (positive forecast)
Rain (positive)              6 (false negative)         14 (true positive)
No rain (negative)           71 (true negative)         9 (false positive)

Base rate = 20/100 = .20
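To make the terminology concrete, here is a minimal Python sketch (an illustration, not part of the original talk) that computes the base rate and the true and false positive proportions from the cell counts in the table above:

```python
# Cell counts from the 100-day rain example above.
tp, fn = 14, 6   # rain days: forecast rain / forecast no rain
fp, tn = 9, 71   # no-rain days: forecast rain / forecast no rain

n = tp + fn + fp + tn                 # 100 days
base_rate = (tp + fn) / n             # P(rain) = 20/100 = .20
true_positive_prop = tp / (tp + fn)   # hits among days with rain
false_positive_prop = fp / (fp + tn)  # false alarms among days without rain

print(f"base rate = {base_rate:.2f}")
print(f"true positive proportion = {true_positive_prop:.2f}")
print(f"false positive proportion = {false_positive_prop:.2f}")
```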

Page 15:

Uncertainty, Judgment, Decision, Error

• Taylor-Russell diagram
  – Decision cutoff
  – Criterion cutoff (linked to base rate)
  – Correlation (uncertainty)
  – Errors
    • False positives (false alarms)
    • False negatives (misses)

Page 16:

Taylor-Russell diagram

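As a rough numerical companion to the diagram (a sketch under assumed numbers, not the talk's figure): simulate an event and a forecast that share a given correlation, impose a criterion cutoff defining the event and a decision cutoff defining a warning, and count the four outcome cells. The correlation of 0.5 and the cutoffs are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 100_000, 0.5                   # sample size and forecast-event correlation
event = rng.standard_normal(n)
# Construct a forecast with correlation r to the event (bivariate normal).
forecast = r * event + np.sqrt(1 - r**2) * rng.standard_normal(n)

criterion_cutoff = 1.0                # what counts as an event (sets the base rate)
decision_cutoff = 1.0                 # when a warning is issued
occurs = event > criterion_cutoff
warned = forecast > decision_cutoff

print("true positive  proportion:", np.mean(occurs & warned))
print("false negative proportion:", np.mean(occurs & ~warned))
print("false positive proportion:", np.mean(~occurs & warned))
print("true negative  proportion:", np.mean(~occurs & ~warned))
```

Moving the decision cutoff in this sketch trades false negatives against false positives, which is the tradeoff pictured on the next slide.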

Page 17:

Tradeoff between false positives and false negatives

Page 18:

Uncertainty, Judgment, Decision, Error

• Another view: ROC analysis
  – Decision cutoff
  – False positive proportion
  – True positive proportion
  – Az measures forecast quality
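A sketch of the same idea in code (illustrative, with simulated data): sweep the decision cutoff over the forecasts, trace the (false positive proportion, true positive proportion) pairs, and approximate Az as the area under the resulting curve.

```python
import numpy as np

rng = np.random.default_rng(1)
n, r = 50_000, 0.5
event = rng.standard_normal(n)
forecast = r * event + np.sqrt(1 - r**2) * rng.standard_normal(n)
occurs = event > 1.0                                   # illustrative criterion cutoff

# One ROC point per decision cutoff.
cutoffs = np.linspace(forecast.min(), forecast.max(), 200)
tpp = np.array([np.mean(forecast[occurs] > c) for c in cutoffs])
fpp = np.array([np.mean(forecast[~occurs] > c) for c in cutoffs])

# Trapezoid-rule area under the curve, with fpp sorted in increasing order.
order = np.argsort(fpp)
az = np.sum(np.diff(fpp[order]) * (tpp[order][1:] + tpp[order][:-1]) / 2)
print(f"Az (area under the ROC curve) ≈ {az:.3f}")
```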

Page 19:

ROC Curve

Page 20:

Problem 1: Optimal decision cutoff

• Given that it is not possible to eliminate both false positives and false negatives, what decision cutoff gives the best compromise?
  – Depends on values
  – Depends on uncertainty
  – Depends on base rate

• Decision analysis is one optimization method.

Page 21:

Decision tree

Page 22:

Expected value

Expected Value = P(O1)V(O1) + P(O2)V(O2) + P(O3)V(O3) + P(O4)V(O4)

where P(Oi) is the probability of outcome i and V(Oi) is the value of outcome i.
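In code, the rule is one line. A minimal sketch with the four warn/storm outcomes; the probabilities reuse the relative frequencies from the 100-day decision table, and the values are hypothetical placeholders:

```python
def expected_value(p, v):
    """Sum of probability times value over the outcomes."""
    return sum(p[outcome] * v[outcome] for outcome in p)

# Outcome probabilities (from the decision table) and illustrative values.
p = {"true positive": 0.14, "false negative": 0.06,
     "false positive": 0.09, "true negative": 0.71}
v = {"true positive": 80, "false negative": 0,
     "false positive": 98, "true negative": 100}
print(expected_value(p, v))   # 11.2 + 0 + 8.82 + 71 = 91.02
```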

Page 23:

Expected value

• One of many possible decision-making rules

• Used here for illustration because it’s the basis for decision analysis

• Intended to illustrate principles

Page 24:

Where do the values come from?

Page 25:

Descriptions of outcomes

• True positive (hit: a warning is issued and the storm occurs as predicted)
  – Damage occurs, but people have a chance to prepare. Some property and lives are saved, but probably not all.

• False positive (false alarm: a warning is issued but no storm occurs)
  – No damage or lives lost, but people are concerned and prepare unnecessarily, incurring psychological and economic costs. Furthermore, they may not respond to the next warning.

Page 26:

Descriptions of outcomes (cont.)

• False negative (miss: no warning is issued, but the storm occurs)
  – People do not have time to prepare, and property and lives are lost. NWS is blamed.

• True negative (no warning is issued and no storm occurs)
  – No damage or lives lost. No unnecessary concern about the storm.

Page 27:

Values depend on your perspective

• Forecaster

• Emergency manager

• Public official

• Property owner

• Business owner

• Many others...

Page 28:

Measuring values

Which is the best outcome?

• True positive?
• False positive?
• False negative?
• True negative?

Give the best outcome a value of 100.

Page 29:

Measuring values

Which is the worst outcome?

• True positive?
• False positive?
• False negative?
• True negative?

Give the worst outcome a value of 0.

Page 30:

Measuring values

Rate the remaining two outcomes:

• True positive?
• False positive?
• False negative?
• True negative?

Rate them relative to the worst (0) and the best (100).

Page 31:

Measuring values: values reflect different perspectives

                    Perspective 1    Perspective 2    Perspective 3
True positive             40               90               80
False positive            50               80               98
False negative             0                0                0
True negative            100              100              100

Page 32:

Expected value

Expected Value = P(O1)V(O1) + P(O2)V(O2) + P(O3)V(O3) + P(O4)V(O4)

where P(Oi) is the probability of outcome i and V(Oi) is the value of outcome i.

Page 33:

Expected value depends on the decision cutoff

Page 34:

Expected value depends on the value perspective
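Both dependencies can be made explicit with a little algebra. If p is the probability of a storm, warning is preferred when p·V(TP) + (1-p)·V(FP) ≥ p·V(FN) + (1-p)·V(TN); solving for p gives a break-even probability cutoff. The sketch below (an illustration, not the talk's method) computes that cutoff for the three value perspectives in the earlier table, where the false negative is valued 0 and the true negative 100:

```python
def breakeven_cutoff(v_tp, v_fp, v_fn, v_tn):
    """Warn whenever the storm probability exceeds this value."""
    return (v_tn - v_fp) / ((v_tn - v_fp) + (v_tp - v_fn))

# (true positive, false positive) values for perspectives 1-3 from the table;
# false negative = 0 and true negative = 100 in every perspective.
for label, v_tp, v_fp in [(1, 40, 50), (2, 90, 80), (3, 80, 98)]:
    print(f"perspective {label}: warn when p > "
          f"{breakeven_cutoff(v_tp, v_fp, 0, 100):.2f}")
```

Under these numbers, perspective 3 warrants a warning at almost any nonzero storm probability (cutoff ≈ .02), while perspective 1 warrants one only when a storm is more likely than not (cutoff ≈ .56).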

Page 35:

Whose values?

• Forecasting weather is a technical problem.

• Issuing a warning to the public is a social act.

• Each warning has an implicit set of values.

• Should those values be made explicit and subject to public scrutiny?

Page 36:

Problem 2: Improving forecast accuracy

• Examine the components of forecast skill. This requires a detailed analysis of the forecasting task.

• Address those components that are problematic, but be aware that solving one problem may create others.

• Problems are addressed by changing the forecast environment and by training. Training alone has little effect.

Page 37:

Problem 2: Improving forecast accuracy

• Metatheoretical issue: Correspondence vs. coherence

Page 38:

Coherence research

Coherence research measures the quality of judgment against the standards of logic, mathematics, and probability theory. Coherence theory argues that decisions under uncertainty should be coherent with respect to the principles of probability theory.

Page 39:

Correspondence research

Correspondence research measures the quality of judgment against the standards of empirical accuracy. Correspondence theory argues that decisions under uncertainty should result in the fewest errors possible, within the limits imposed by irreducible uncertainty.

Page 40:

Coherence and correspondence theories of competence

Coherence theory of competence: uncertainty → irrationality → error

Correspondence theory of competence: uncertainty → inaccuracy → error

What is the relation between coherence and correspondence?

Page 41:

Fundamental tenet of coherence research

"Probabilistic thinking is important if people are to understand and cope successfully with real-world uncertainty."

Page 42:

Fundamental tenet of correspondence research

"Human competence in making judgments and decisions under uncertainty is impressive. Sometimes performance is not. Why? Because sometimes task conditions degrade the accuracy of judgment."

Hammond, K. R. (1996). Human Judgment and Social Policy: Irreducible Uncertainty, Inevitable Error, Unavoidable Injustice. New York: Oxford University Press (p. 282).

Page 43:

Brunswik's lens model

[Lens model diagram: the event (Ye) and the forecast (Ys) are linked through a common set of cues (X)]

Page 44:

Expanded lens model

[Diagram: the event is related to true descriptors and objective cues; the forecast is based on subjective cues]

Page 45:

Components of skill and the lens model

[Expanded lens model diagram, annotated with the components of skill:]

• Environmental predictability
• Fidelity of the information system
• Match between environment and judge
• Reliability of information acquisition
• Reliability of information processing

Page 46:

Decomposition of Skill Score

SS = Skill Score = 1 - (MSE_Y / MSE_B)

Step 1, Murphy (1988):

SS = (r_YO)^2 - [r_YO - (s_Y / s_O)]^2 - [(Ȳ - Ō) / s_O]^2

The three terms are the squared correlation, the conditional (regression) bias, and the unconditional (base rate) bias.

Step 2, Tucker's (1964) lens model equation, r_YO ≅ G · R_Y.X · R_O.X, giving:

SS ≅ (G · R_Y.X · R_O.X)^2 - [conditional bias] - [unconditional bias]

Step 3, the expanded lens model, r_YO ≅ R_O.T · V_T.X · G · R_Y.U · V_U.X, giving:

SS ≅ (R_O.T · V_T.X · G · R_Y.U · V_U.X)^2 - [conditional bias] - [unconditional bias]

Components of skill:

1. Environmental predictability
2. Fidelity of the information system
3. Match between environment and judge
4. Reliability of information acquisition
5. Reliability of information processing
6. Conditional/regression bias
7. Unconditional/base rate bias

Page 47:

Decomposition of skill score

Skill score ≅ (environmental predictability × fidelity of the information system × match between environment and forecaster × reliability of information acquisition × reliability of information processing)^2 - conditional bias - unconditional bias
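The step 1 identity can be verified numerically. A minimal sketch (illustrative data; only Murphy's decomposition, not the lens model terms): compute the skill score directly from forecasts Y and observations O, then recompute it as squared correlation minus the conditional and unconditional bias terms.

```python
import numpy as np

def skill_score_two_ways(y, o):
    # Direct definition: SS = 1 - MSE(forecast) / MSE(climatology baseline).
    mse = np.mean((y - o) ** 2)
    mse_base = np.mean((o.mean() - o) ** 2)
    ss_direct = 1 - mse / mse_base

    # Murphy (1988): squared correlation minus the two bias terms.
    r = np.corrcoef(y, o)[0, 1]
    conditional_bias = (r - y.std() / o.std()) ** 2
    unconditional_bias = ((y.mean() - o.mean()) / o.std()) ** 2
    ss_decomposed = r**2 - conditional_bias - unconditional_bias
    return ss_direct, ss_decomposed

rng = np.random.default_rng(2)
o = rng.standard_normal(1_000)
y = 0.6 * o + 0.4 * rng.standard_normal(1_000) + 0.1  # imperfect, biased forecast
print(skill_score_two_ways(y, o))                     # the two numbers agree
```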

Page 48:

Components of skill addressed by selected methods for improving judgments

Method for improving judgments                     Component(s) of skill* addressed
Identify new descriptors through research          1
Develop better measures of true descriptors        2
Develop clear definitions of cues                  3
Training to improve cue judgments                  3
Improve information displays                       3
Bootstrapping: replace judge with model            4
Require justification of judgments                 4, 5
Combine several judgments                          4
Decompose judgment task                            4
Mechanical combination of cues                     4
Train judge about environmental system             5
Experience with problem                            5, 6, 7
Cognitive feedback                                 5
Train judge to ignore non-predictive cues          5
Statistical training                               6, 7
Feedback about nature of biases in judgment        6, 7
Search for discrepant information                  6
Statistical correction for bias                    6, 7

*Components of skill: 1. Environmental predictability; 2. Fidelity of the information system; 3. Reliability of information acquisition; 4. Reliability of information processing; 5. Match between environment and judge; 6. Conditional/regression bias; 7. Unconditional/base rate bias.


Page 49:

1. Environmental predictability

• Environmental predictability is conditional on current knowledge and information. It can be improved through research that results in improved information and improved understanding of environmental processes.

• Environmental predictability determines an upper bound on forecast performance and therefore indicates how much improvement is possible through attention to other components.


Page 50:

Environmental predictability limits accuracy of forecasts
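A small simulation makes the ceiling visible (illustrative numbers, not from the talk): if the event correlates only R with the best available cues, then even a forecast that uses those cues perfectly cannot correlate with the event above R.

```python
import numpy as np

rng = np.random.default_rng(5)
n, R = 50_000, 0.7           # R: assumed environmental predictability given the cues
cues = rng.standard_normal(n)
event = R * cues + np.sqrt(1 - R**2) * rng.standard_normal(n)

forecast = cues              # a forecaster making perfect use of the available cues
print(np.corrcoef(forecast, event)[0, 1])   # ≈ 0.7: the upper bound on accuracy
```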

Page 51:

2. Fidelity of information system

• Forecasting skill may be degraded if the information system that brings data to the forecaster does not accurately represent actual conditions, i.e., if the cues do not accurately measure the true descriptors. Fidelity of the information system refers to the quality, not the quantity, of information about the cues that are currently being used.

• Fidelity is improved by developing better measures, e.g., through improved instrumentation or increased density in space or time.


Page 52:

3. Match between environment and forecaster

• The match between the model of the forecaster and the environmental model is an estimate of the potential skill that the forecaster's current strategy could achieve if the environment were perfectly predictable (given the cues) and the forecasts were unbiased and perfectly reliable.

• This component might be called “knowledge.” It is addressed by forecaster training and experience. If the forecaster learns to rely on the most relevant information and ignore irrelevant information, this component will generally be good.


Page 53:

Reliability

• Reliability is high if identical conditions produce identical forecasts.

• Humans are rarely perfectly reliable.

• There are two sources of unreliability:
  – Reliability of information acquisition
  – Reliability of information processing


Page 54:

Reliability

• Reliability decreases as amount of information increases.

[Figure: theoretical relation between amount of information and accuracy of forecasts]

Page 55:

Reliability decreases as environmental predictability decreases.


Page 56:

4. Reliability of information acquisition

• Reliability of information acquisition is the extent to which the forecaster can reliably interpret the objective cues.

• It is improved by organizing and presenting information in a form that clearly emphasizes relevant information.


Page 57:

5. Reliability of information processing

• Decreases with increasing information and with increasing environmental uncertainty.

• Methods for improving reliability of information processing:
  – Limit the amount of information used in judgmental forecasting; use a small number of very important cues.
  – Use mechanical methods to process information (e.g., MOS).
  – Combine several forecasts (consensus).
  – Require justification of forecasts.
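The consensus item can be illustrated with a simulation (all numbers assumed): give several forecasters the same valid judgment policy plus independent unreliability, and compare one forecaster's accuracy with that of their average.

```python
import numpy as np

rng = np.random.default_rng(3)
n, k = 20_000, 5
event = rng.standard_normal(n)
policy = 0.6 * event + 0.8 * rng.standard_normal(n)     # shared, imperfect policy
forecasts = policy + 0.7 * rng.standard_normal((k, n))  # individual unreliability

r_single = np.corrcoef(forecasts[0], event)[0, 1]
r_consensus = np.corrcoef(forecasts.mean(axis=0), event)[0, 1]
print(f"single forecaster: r = {r_single:.2f}")         # ≈ 0.49
print(f"consensus of {k}:  r = {r_consensus:.2f}")      # ≈ 0.57
```

Averaging cancels the independent noise but not the shared policy, so reliability, and with it accuracy, rises.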

Page 58:

Theoretical relation between amount of information and accuracy of forecasts

[Figure: accuracy (from no accuracy to perfect accuracy) plotted against amount of information. The gap between perfect accuracy and the theoretical limit of accuracy reflects limited information and environmental uncertainty; the gap between the theoretical limit and actual accuracy reflects limitations in information processing.]

Page 59:

The relation between information and accuracy depends on environmental uncertainty

[Figure: two panels, a low predictability task and a high predictability task. Each plots the theoretical limit of accuracy (dashed) and actual accuracy (solid) against amount of information, separating the effect of limited information and environmental uncertainty from the effect of limitations in information processing.]

Page 60:

6 and 7. Bias: conditional (regression bias) and unconditional (base rate bias)

Together, the two bias terms measure forecast "calibration" (sometimes called "reliability" in meteorology).

Reducing bias:
• Experience
• Statistical training
• Feedback about the nature of biases in forecasts
• Search for discrepant information
• Statistical correction for bias

Page 61:

Calibration (a.k.a. reliability) of forecasts depends on the task

[Figures: calibration data for precipitation forecasts (Murphy and Winkler, 1974); Heideman (1989)]
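A sketch of how such calibration data are tabulated (illustrative; the studies above used real forecast archives): bin the probability forecasts and compare each bin's mean forecast probability with the observed relative frequency of the event.

```python
import numpy as np

def calibration_table(prob_forecast, occurred, n_bins=10):
    """Print mean forecast vs. observed frequency per probability bin."""
    bins = np.clip((prob_forecast * n_bins).astype(int), 0, n_bins - 1)
    for b in range(n_bins):
        in_bin = bins == b
        if in_bin.any():
            print(f"forecast ≈ {prob_forecast[in_bin].mean():.2f}   "
                  f"observed = {occurred[in_bin].mean():.2f}   (n = {in_bin.sum()})")

# Synthetic, perfectly calibrated example: rain occurs with exactly the
# forecast probability, so the two columns should match within noise.
rng = np.random.default_rng(4)
p = rng.uniform(0, 1, 5_000)
rain = (rng.uniform(0, 1, 5_000) < p).astype(float)
calibration_table(p, rain)
```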

Page 62:

Reading about judgmental forecasting

• Components of skill
  – Stewart, T. R., & Lusk, C. M. (1994). Seven components of judgmental forecasting skill: Implications for research and the improvement of forecasts. Journal of Forecasting, 13, 579-599.

• Principles of Forecasting Project
  – http://www-marketing.wharton.upenn.edu/forecast/
  – Principles of Forecasting: A Handbook for Researchers and Practitioners, J. Scott Armstrong (ed.). Norwell, MA: Kluwer Academic Publishers (scheduled for publication in 1999).
  – Stewart, T. R., Improving Reliability of Judgmental Forecasts (http://www.albany.edu/cpr/StewartPOF98.PDF)

Page 63:

Conclusion

• Problem 1: Choosing the warn/no warn cutoff
  – Value tradeoffs are unavoidable.
  – Warnings are based on values that should be critically examined.

• Problem 2: Improving forecast accuracy
  – Understanding and improving forecasts requires understanding the task and the forecasting environment.
  – Decomposing skill can aid in identifying the factors that limit forecasting accuracy.

