Title: Forecasting and Decision Making Under Uncertainty
1. Forecasting and Decision Making Under Uncertainty
Warning Decision Making II Workshop
- Thomas R. Stewart, Ph.D.
- Center for Policy Research
- Rockefeller College of Public Affairs and Policy
- University at Albany
- State University of New York
- T.STEWART@ALBANY.EDU
2. Outline
- Uncertainty
- Decision making and judgment
- Inevitable error
- Problem 1: Choosing the warn/no warn cutoff
- Problem 2: Reducing error by improving forecast accuracy
3. Uncertainty
- Uncertainty occurs when, given current knowledge,
there are multiple possible states of nature.
4. Probability is a measure of uncertainty
- Relative frequency
- Subjective probability (Bayesian)
5. Uncertainty
- Uncertainty 1: States (events) and probabilities of those events are known
  - Coin toss
  - Dice toss
  - Precipitation forecasting (approximately)
6. Uncertainty
- Uncertainty 2: States (events) are known, probabilities are unknown
  - Elections
  - Stock market
  - Forecasting severe weather
7. Uncertainty
- Uncertainty 3: States (events) and probabilities are unknown
  - Y2K
  - Global climate change
- The differences among the types of uncertainty are a matter of degree.
8. Picturing uncertainty
- There are many ways to depict uncertainty. For example,
  - Continuous events: scatterplot
  - Discrete events: decision table
9. Scatterplot: Correlation = .50
10. Scatterplot: Correlation = .20
11. Scatterplot: Correlation = .80
12. Scatterplot: Correlation = 1.00
The perfect forecast
13. Decision table
Data for an imperfect categorical forecast over 100 days (uncertainty)
Base rate = 20/100 = .20
14. Decision table terminology
Data for an imperfect categorical forecast over 100 days (uncertainty)
Base rate = 20/100 = .20
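Tallying such a decision table in code is straightforward. A minimal sketch; the four cell counts below are hypothetical, since the slides fix only the base rate at 20/100:

```python
# Hypothetical counts for 100 days with a base rate of 20/100 = .20.
# Keys: (decision, event); values: number of days.
table = {
    ("warn", "storm"): 14,        # true positives (hits)
    ("warn", "no storm"): 12,     # false positives (false alarms)
    ("no warn", "storm"): 6,      # false negatives (misses)
    ("no warn", "no storm"): 68,  # true negatives
}

total = sum(table.values())
base_rate = (table[("warn", "storm")] + table[("no warn", "storm")]) / total
print(total, base_rate)  # 100 0.2
```

The base rate depends only on the event column totals, not on how the forecaster sets the cutoff.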
15. Uncertainty, Judgment, Decision, Error
- Taylor-Russell diagram
- Decision cutoff
- Criterion cutoff (linked to base rate)
- Correlation (uncertainty)
- Errors
- False positives (false alarms)
- False negatives (misses)
16. Taylor-Russell diagram
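The Taylor-Russell setup can be simulated: draw correlated event-forecast pairs, apply a criterion cutoff and a decision cutoff, and count the four outcomes. A sketch; the correlation (.5) and the cutoffs (chosen to give roughly the .20 base rate) are assumptions for illustration:

```python
import random

random.seed(1)
r = 0.5  # forecast-event correlation (uncertainty)

# Draw correlated (event, forecast) pairs from a bivariate normal.
pairs = []
for _ in range(100_000):
    e = random.gauss(0, 1)
    f = r * e + (1 - r**2) ** 0.5 * random.gauss(0, 1)
    pairs.append((e, f))

criterion = 0.84  # criterion cutoff: ~20% of days are "storms" (base rate .20)
cutoff = 0.84     # decision cutoff: warn when the forecast exceeds this

hits = sum(1 for e, f in pairs if e > criterion and f > cutoff)
false_alarms = sum(1 for e, f in pairs if e <= criterion and f > cutoff)
misses = sum(1 for e, f in pairs if e > criterion and f <= cutoff)

# Lowering the decision cutoff trades misses for false alarms.
low = 0.0
misses_low = sum(1 for e, f in pairs if e > criterion and f <= low)
print(hits, false_alarms, misses, misses_low)
```

Moving the decision cutoff left or right in the diagram changes only how the fixed scatter of points is partitioned, which is why false positives and false negatives trade off against each other.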
17. Tradeoff between false positives and false negatives
18. Uncertainty, Judgment, Decision, Error
- Another view: ROC analysis
- Decision cutoff
- False positive proportion
- True positive proportion
- Az measures forecast quality
19. ROC Curve
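An ROC curve traces the true positive proportion against the false positive proportion as the decision cutoff varies; the area under the curve summarizes forecast quality (Az is the binormal-model version of this area). A sketch with simulated data and a simple trapezoidal area estimate:

```python
import random

random.seed(2)

# Simulated events (base rate .20) and noisy, better-than-chance forecasts.
events = [random.random() < 0.2 for _ in range(20_000)]
scores = [random.gauss(1.0 if e else 0.0, 1.0) for e in events]

pos = sum(events)
neg = len(events) - pos

# Sweep the decision cutoff to trace the ROC curve.
cutoffs = [c / 10 for c in range(-30, 41)]
roc = []
for c in cutoffs:
    tp = sum(1 for e, s in zip(events, scores) if e and s > c)
    fp = sum(1 for e, s in zip(events, scores) if not e and s > c)
    roc.append((fp / neg, tp / pos))
roc.sort()

# Trapezoidal area under the curve: 0.5 = no skill, 1.0 = perfect.
area = sum((x2 - x1) * (y1 + y2) / 2 for (x1, y1), (x2, y2) in zip(roc, roc[1:]))
print(round(area, 2))
```

Each point on the curve is one possible decision cutoff; the curve itself is fixed by forecast quality, and choosing a cutoff means choosing a point on it.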
20. Problem 1: Optimal decision cutoff
- Given that it is not possible to eliminate both false positives and false negatives, what decision cutoff gives the best compromise?
  - Depends on values
  - Depends on uncertainty
  - Depends on base rate
- Decision analysis is one optimization method.
21. Decision tree
22. Expected value
Expected Value = P(O1)V(O1) + P(O2)V(O2) + P(O3)V(O3) + P(O4)V(O4)
where P(Oi) is the probability of outcome i and V(Oi) is the value of outcome i.
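The formula is a probability-weighted average of the outcome values and can be written directly in code. The probabilities and values below are placeholders, not figures from the slides:

```python
def expected_value(probs, values):
    """Probability-weighted average of outcome values."""
    assert abs(sum(probs) - 1.0) < 1e-9, "outcome probabilities must sum to 1"
    return sum(p * v for p, v in zip(probs, values))

# Hypothetical outcome probabilities and 0-100 values
# (order: true positive, false positive, false negative, true negative).
ev = expected_value([0.14, 0.12, 0.06, 0.68], [90, 80, 0, 100])
print(ev)  # 90.2
```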
23. Expected value
- One of many possible decision making rules
- Used here for illustration because it's the basis for decision analysis
- Intended to illustrate principles
24. Where do the values come from?
25. Descriptions of outcomes
- True positive (hit--a warning is issued and the storm occurs as predicted)
  - Damage occurs, but people have a chance to prepare. Some property and lives are saved, but probably not all.
- False positive (false alarm--a warning is issued but no storm occurs)
  - No damage or lives lost, but people are concerned and prepare unnecessarily, incurring psychological and economic costs. Furthermore, they may not respond to the next warning.
26. Descriptions of outcomes (cont.)
- False negative (miss--no warning is issued, but the storm occurs)
  - People do not have time to prepare, and property and lives are lost. NWS is blamed.
- True negative (no warning is issued and no storm occurs)
  - No damage or lives lost. No unnecessary concern about the storm.
27. Values depend on your perspective
- Forecaster
- Emergency manager
- Public official
- Property owner
- Business owner
- Many others...
28. Which is the best outcome?
Measuring values
- True positive?
- False positive?
- False negative?
- True negative?
Give the best outcome a value of 100.
29. Which is the worst outcome?
Measuring values
- True positive?
- False positive?
- False negative?
- True negative?
Give the worst outcome a value of 0.
30. Rate the remaining two outcomes
Measuring values
- True positive?
- False positive?
- False negative?
- True negative?
Rate them relative to the worst (0) and the best
(100)
31. Values reflect different perspectives
Measuring values

Outcome          Perspective 1   Perspective 2   Perspective 3
True positive         90              40              80
False positive        80              50              98
False negative         0               0               0
True negative        100             100             100
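With the outcome probabilities held fixed, the three value perspectives can yield different expected values. The values below are the ones tabled on this slide; the outcome probabilities are hypothetical:

```python
# Outcome values from the slide (TP, FP, FN, TN) for three perspectives.
perspectives = {
    1: [90, 80, 0, 100],
    2: [40, 50, 0, 100],
    3: [80, 98, 0, 100],
}

# Hypothetical outcome probabilities (must sum to 1).
probs = [0.14, 0.12, 0.06, 0.68]

evs = {
    k: sum(p * v for p, v in zip(probs, vals))
    for k, vals in perspectives.items()
}
print(evs)
```

The same warning policy can thus look better or worse depending on whose values are plugged in.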
32. Expected value
Expected Value = P(O1)V(O1) + P(O2)V(O2) + P(O3)V(O3) + P(O4)V(O4)
where P(Oi) is the probability of outcome i and V(Oi) is the value of outcome i.
33. Expected value depends on the decision cutoff
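To see this dependence, sweep the decision cutoff over simulated forecast-event pairs and score each cutoff by its expected value. A sketch; the correlation (.5), base rate (.20), and outcome values are all assumptions for illustration:

```python
import random

random.seed(3)

# Correlated (event, forecast) pairs; event > 0.84 is a "storm" (~.20 base rate).
r = 0.5
pairs = []
for _ in range(50_000):
    e = random.gauss(0, 1)
    pairs.append((e > 0.84, r * e + (1 - r**2) ** 0.5 * random.gauss(0, 1)))

values = {"tp": 90, "fp": 80, "fn": 0, "tn": 100}  # hypothetical 0-100 values

def ev_at(cutoff):
    """Average value per day when warning whenever the forecast exceeds cutoff."""
    total = 0.0
    for storm, f in pairs:
        warn = f > cutoff
        key = ("tp" if storm else "fp") if warn else ("fn" if storm else "tn")
        total += values[key]
    return total / len(pairs)

cutoffs = [c / 10 for c in range(-20, 21)]
best = max(cutoffs, key=ev_at)
print(best, round(ev_at(best), 1))
```

Re-running with a different value table shifts the best cutoff, which is the point of the next slide.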
34. Expected value depends on the value perspective
35. Whose values?
- Forecasting weather is a technical problem.
- Issuing a warning to the public is a social act.
- Each warning has an implicit set of values.
- Should those values be made explicit and subject
to public scrutiny?
36. Problem 2: Improving forecast accuracy
- Examine the components of forecast skill. This requires a detailed analysis of the forecasting task.
- Address those components that are problematic, but be aware that solving one problem may create others.
- Problems are addressed by changing the forecast environment and by training. Training alone has little effect.
37. Problem 2: Improving forecast accuracy
- Metatheoretical issue: Correspondence vs. coherence
38. Coherence research
- Coherence research measures the quality of judgment against the standards of logic, mathematics, and probability theory. Coherence theory argues that decisions under uncertainty should be coherent with respect to the principles of probability theory.
39. Correspondence research
- Correspondence research measures the quality of judgment against the standard of empirical accuracy. Correspondence theory argues that decisions under uncertainty should result in the fewest errors possible, within the limits imposed by irreducible uncertainty.
40. Coherence and correspondence theories of competence
- Coherence theory of competence: Uncertainty → irrationality → error
- Correspondence theory of competence: Uncertainty → inaccuracy → error
- What is the relation between coherence and correspondence?
41. Fundamental tenet of coherence research
- "Probabilistic thinking is important if people are to understand and cope successfully with real-world uncertainty."
42. Fundamental tenet of correspondence research
- "Human competence in making judgments and decisions under uncertainty is impressive. Sometimes performance is not. Why? Because sometimes task conditions degrade the accuracy of judgment."
- Hammond, K. R. (1996). Human Judgment and Social Policy: Irreducible Uncertainty, Inevitable Error, Unavoidable Injustice. New York: Oxford University Press (p. 282).
43. Brunswik's lens model
(Diagram: the event and the forecast are linked through a set of cues.)
44. Expanded lens model
45. Components of skill and the lens model
(Diagram labels: true descriptors, subjective cues, event, forecast.)
- Environmental predictability
- Reliability of information processing
- Fidelity of the information system
- Reliability of information acquisition
- Match between environment and judge
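These components are commonly tied together by the lens model equation. Assuming its standard form, achievement ra = G·Re·Rs + C·√(1−Re²)·√(1−Rs²), where G is the environment-judge match, Re environmental predictability, Rs forecast consistency, and C the unmodeled component:

```python
import math

def achievement(G, Re, Rs, C=0.0):
    """Lens model equation (assumed standard form): forecast-event
    correlation from knowledge G, environmental predictability Re,
    forecast consistency Rs, and the unmodeled component C."""
    return G * Re * Rs + C * math.sqrt(1 - Re**2) * math.sqrt(1 - Rs**2)

# Even with perfect knowledge (G = 1), limited predictability and
# imperfect reliability cap achievement well below 1.
print(round(achievement(G=1.0, Re=0.8, Rs=0.9), 2))  # 0.72
```

This is why the slides stress that environmental predictability bounds performance: no value of G can compensate for a small Re.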
47. Decomposition of skill score
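The decomposition itself is not spelled out in the text, but a form often used in this framework (Murphy, 1988) is SS = r² − (r − sf/sx)² − ((f̄ − x̄)/sx)²: potential skill minus conditional (regression) bias minus unconditional (base rate) bias. A sketch, assuming that form:

```python
def skill_score_components(forecasts, observations):
    """Decompose the MSE skill score vs. climatology (assumed Murphy form):
    SS = r**2 - (r - sf/sx)**2 - ((fbar - xbar)/sx)**2."""
    n = len(forecasts)
    fbar = sum(forecasts) / n
    xbar = sum(observations) / n
    sf = (sum((f - fbar) ** 2 for f in forecasts) / n) ** 0.5
    sx = (sum((x - xbar) ** 2 for x in observations) / n) ** 0.5
    r = sum((f - fbar) * (x - xbar)
            for f, x in zip(forecasts, observations)) / (n * sf * sx)
    potential = r ** 2                        # squared correlation
    conditional_bias = (r - sf / sx) ** 2     # regression bias
    unconditional_bias = ((fbar - xbar) / sx) ** 2  # base rate bias
    return potential, potential - conditional_bias - unconditional_bias

# Toy data: slightly overdispersed forecasts, no mean bias.
potential, ss = skill_score_components([1.0, 2.0, 3.0, 4.0],
                                       [1.5, 2.0, 2.5, 4.0])
print(round(potential, 3), round(ss, 3))
```

The decomposition recovers the direct skill score 1 − MSE/var(x), while separating how much skill is lost to each bias term.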
49. Components of skill: 1. Environmental predictability
- Environmental predictability is conditional on current knowledge and information. It can be improved through research that results in improved information and improved understanding of environmental processes.
- Environmental predictability determines an upper bound on forecast performance and therefore indicates how much improvement is possible through attention to other components.
50. Environmental predictability limits the accuracy of forecasts
51. Components of skill: 2. Fidelity of the information system
- Forecasting skill may be degraded if the information system that brings data to the forecaster does not accurately represent actual conditions, i.e., if the cues do not accurately measure the true descriptors. Fidelity of the information system refers to the quality, not the quantity, of information about the cues that are currently being used.
- Fidelity is improved by developing better measures, e.g., through improved instrumentation or increased density in space or time.
52. Components of skill: 3. Match between environment and forecaster
- The match between the model of the forecaster and the environmental model is an estimate of the potential skill that the forecaster's current strategy could achieve if the environment were perfectly predictable (given the cues) and the forecasts were unbiased and perfectly reliable.
- This component might be called "knowledge." It is addressed by forecaster training and experience. If the forecaster learns to rely on the most relevant information and ignore irrelevant information, this component will generally be good.
53. Components of skill: Reliability
- Reliability is high if identical conditions produce identical forecasts.
- Humans are rarely perfectly reliable.
- There are two sources of unreliability:
  - Reliability of information acquisition
  - Reliability of information processing
54. Components of skill: Reliability
- Reliability decreases as the amount of information increases.
(Figure: theoretical relation between amount of information and accuracy of forecasts.)
55. Components of skill: Reliability decreases as environmental predictability decreases.
56. Components of skill: 4. Reliability of information acquisition
- Reliability of information acquisition is the extent to which the forecaster can reliably interpret the objective cues.
- It is improved by organizing and presenting information in a form that clearly emphasizes relevant information.
57. Components of skill: 5. Reliability of information processing
- Decreases with increasing information and with increasing environmental uncertainty
- Methods for improving reliability of information processing:
  - Limit the amount of information used in judgmental forecasting. Use a small number of very important cues.
  - Use mechanical methods to process information (e.g., MOS).
  - Combine several forecasts (consensus).
  - Require justification of forecasts.
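The consensus idea can be checked with a quick simulation: averaging several unreliable forecasts of the same signal gives a combined forecast that correlates better with the event. A sketch with assumed noise levels:

```python
import random

random.seed(4)

def corr(xs, ys):
    """Pearson correlation of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (sx * sy)

events = [random.gauss(0, 1) for _ in range(5000)]
# Five forecasters see the event through independent noise (unreliability).
forecasters = [[e + random.gauss(0, 1) for e in events] for _ in range(5)]
consensus = [sum(f[i] for f in forecasters) / 5 for i in range(len(events))]

single = corr(events, forecasters[0])  # ~0.71 in expectation
combined = corr(events, consensus)     # ~0.91 in expectation
print(round(single, 2), round(combined, 2))
```

Averaging cancels independent noise, which is exactly the unreliability component this slide targets; it does nothing for shared biases.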
58. Theoretical relation between amount of information and accuracy of forecasts
59. The relation between information and accuracy depends on environmental uncertainty
(Figure: two panels comparing actual accuracy with the theoretical limit of accuracy.)
60. Components of skill: 6 and 7. Bias -- Conditional (regression bias) and unconditional (base rate bias)
- Together, the two bias terms measure forecast "calibration" (sometimes called "reliability" in meteorology).
- Reducing bias:
  - Experience
  - Statistical training
  - Feedback about the nature of biases in forecasts
  - Search for discrepant information
  - Statistical correction for bias
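Calibration can be checked by binning probability forecasts and comparing each bin's stated probability with the observed relative frequency. A sketch with made-up data:

```python
# Paired (forecast probability, event occurred) records -- made-up data.
records = [(0.1, False)] * 9 + [(0.1, True)] * 1 + \
          [(0.5, False)] * 5 + [(0.5, True)] * 5 + \
          [(0.9, False)] * 3 + [(0.9, True)] * 7

# Tally (events observed, forecasts issued) per stated probability.
calibration = {}
for p, occurred in records:
    hits, n = calibration.get(p, (0, 0))
    calibration[p] = (hits + occurred, n + 1)

# A well-calibrated forecaster's observed frequency matches the stated
# probability in each bin; here the 0.9 forecasts verify only 70% of the time.
for p, (hits, n) in sorted(calibration.items()):
    print(p, hits / n)
```

Plotting observed frequency against stated probability gives the calibration curves referred to on the next slide.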
61. Calibration (a.k.a. reliability) of forecasts depends on the task
- Calibration data for precipitation forecasts (Murphy and Winkler, 1974)
- Heideman (1989)
62. Reading about judgmental forecasting
- Components of skill:
  - Stewart, T. R., & Lusk, C. M. (1994). Seven components of judgmental forecasting skill: Implications for research and the improvement of forecasts. Journal of Forecasting, 13, 579-599.
- Principles of Forecasting Project:
  - http://www-marketing.wharton.upenn.edu/forecast/
  - Principles of Forecasting: A Handbook for Researchers and Practitioners, J. Scott Armstrong (ed.), Norwell, MA: Kluwer Academic Publishers (scheduled for publication in 1999).
- Stewart, Improving Reliability of Judgmental Forecasts (http://www.albany.edu/cpr/StewartPOF98.PDF)
63. Conclusion
- Problem 1: Choosing the warn/no warn cutoff
  - Value tradeoffs are unavoidable.
  - Warnings are based on values that should be critically examined.
- Problem 2: Improving forecast accuracy
  - Understanding and improving forecasts requires understanding the task and the forecasting environment.
  - Decomposing skill can aid in identifying the factors that limit forecasting accuracy.