Transcript and Presenter's Notes

Title: Forecast Verification


1
Forecast Verification
  • Presenter: Neil Plummer
  • National Climate Centre
  • Lead Author: Scott Power
  • Bureau of Meteorology Research Centre
  • Acknowledgements
  • A. Watkins, D. Jones, P. Reid, NCC

2
Introduction
  • Verification - what it is and why it is important
  • Terminology
  • Potential problems
  • Comparing various measures
  • Assisting users of climate information

3
What is verification?
  • check the truth or correctness of (something)
  • process of determining the quality of forecasts
  • objective analysis of the degree to which a series
    of forecasts compares and contrasts with the
    equivalent observations over a given period

4
Why bother with verification?
  • Scientific / administrative support
  • is a new system better?
  • assist with consensus forecasts
  • Application of forecasts
  • how good are your forecasts?
  • should I use them?
  • can be used to help estimate value

5
Terminology can be confusing
  • Verification is made a little tricky by the fact
    that everyday words are used to describe
    quantities with a precise statistical meaning.
    Common words include
  • accuracy
  • skill
  • reliability
  • bias
  • value
  • hit rates, percent consistent, false alarm rate,
    ...
  • all have special meanings in statistics

6
Accuracy
  • Average correspondence between forecasts and
    observations
  • Measures
  • mean absolute error, root mean square error
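A minimal Python sketch of these two accuracy measures (an illustration added here, not from the original slides; the temperature values are hypothetical):

    import math

    def mean_absolute_error(forecasts, observations):
        # Average magnitude of the forecast errors
        return sum(abs(f - o) for f, o in zip(forecasts, observations)) / len(forecasts)

    def root_mean_square_error(forecasts, observations):
        # Square root of the average squared error; penalises large misses more heavily
        return math.sqrt(sum((f - o) ** 2 for f, o in zip(forecasts, observations)) / len(forecasts))

    # Hypothetical temperature forecasts and observations (degrees C)
    fc, obs = [26.0, 24.5, 30.1], [25.0, 25.5, 28.0]
    print(mean_absolute_error(fc, obs))      # about 1.37
    print(root_mean_square_error(fc, obs))   # about 1.46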

7
Bias
  • Correspondence between the average forecast and
    the average observation
  • e.g. bias = average forecast - average observation
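A corresponding sketch of the bias as defined above, i.e. the mean forecast minus the mean observation (hypothetical values again):

    def bias(forecasts, observations):
        # Mean forecast minus mean observation; zero means no systematic over- or under-forecasting
        return sum(forecasts) / len(forecasts) - sum(observations) / len(observations)

    # Same hypothetical temperatures as before: these forecasts run about 0.7 degrees warm
    print(bias([26.0, 24.5, 30.1], [25.0, 25.5, 28.0]))   # about 0.7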

8
Skill
  • Accuracy of forecasts relative to accuracy of
    forecasts using a reference method (e.g.
    guessing, persistence, climatology, damped
    persistence, ...)
  • Measures
  • numerous!
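One common construction (a sketch only, not necessarily the measure used operationally) compares the mean squared error of the forecasts with that of a reference such as climatology:

    def mse(forecasts, observations):
        return sum((f - o) ** 2 for f, o in zip(forecasts, observations)) / len(forecasts)

    def skill_score(forecasts, observations, reference):
        # 1 = perfect, 0 = no better than the reference, negative = worse than the reference
        return 1.0 - mse(forecasts, observations) / mse(reference, observations)

    # Hypothetical case: the reference forecast is always the climatological value of 25 degrees
    obs = [25.0, 25.5, 28.0]
    print(skill_score([26.0, 24.5, 30.1], obs, [25.0, 25.0, 25.0]))   # about 0.3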

9
Reliability
  • Degree of correspondence between the average
    observation, given a particular forecast, and
    that forecast taken over all forecasts
  • e.g. suppose forecasts of a 10% or 30% or ... or
    70% or ... chance of rain tomorrow are routinely
    issued for many years
  • if we go back through all of the forecasts issued,
    looking for occasions when a forecast probability
    of 70% was issued, then we would expect to find
    rainfall on 70% of those occasions if the forecast
    system is reliable
  • this is often not the case
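A rough sketch of how this check can be made: collect all occasions on which each probability was issued and compare it with the observed relative frequency of rain (hypothetical data; a reliable system gives frequencies close to the issued probabilities):

    from collections import defaultdict

    def reliability_table(issued_probs, rain_occurred):
        # For each issued probability, the fraction of those occasions on which rain fell
        counts = defaultdict(lambda: [0, 0])   # probability -> [occasions, rain occasions]
        for p, rained in zip(issued_probs, rain_occurred):
            counts[p][0] += 1
            counts[p][1] += int(rained)
        return {p: wet / n for p, (n, wet) in counts.items()}

    # Hypothetical record: the 70% forecasts verified on 2 of 3 occasions (about 67%)
    probs = [0.7, 0.3, 0.7, 0.1, 0.7]
    rain = [True, False, True, False, False]
    print(reliability_table(probs, rain))   # 0.7 -> about 0.67, 0.3 -> 0.0, 0.1 -> 0.0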

10
Reliability Graph
11
Value
  • Impact that prudent use of a given forecast
    scheme has on the user's profits, COMPARED WITH
    profits made using a reference strategy
  • Measures
  • $, lives saved, disease spread reduced, ...

12
Contingency Table
(2x2 table of forecast yes/no against OBSERVED yes/no,
giving Hits, False Alarms, Misses and Correct Rejections)
HIT RATE = Hits / (Hits + Misses)
FALSE ALARM RATE = False Alarms / (False Alarms + Correct Rejections)
PERCENT CONSISTENT = 100 x (Hits + Correct Rejections) / Total
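A minimal Python sketch of the three quantities above, written directly from the table's four counts:

    def hit_rate(hits, misses):
        # Proportion of observed events that were correctly forecast
        return hits / (hits + misses)

    def false_alarm_rate(false_alarms, correct_rejections):
        # Proportion of observed non-events that were forecast as events
        return false_alarms / (false_alarms + correct_rejections)

    def percent_consistent(hits, correct_rejections, total):
        # Percentage of all forecasts (events and non-events) that were correct
        return 100.0 * (hits + correct_rejections) / total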
13
Accuracy measures
  • Hit rates
  • Proportion of observed events correctly forecast
  • False alarm rates
  • Proportion of observed non-events forecast as
    events
  • Percent Correct
  • 100x (proportion of all forecasts that are
    correct)

14
1. Forecast performance: 2x2 contingency table
15
Is this a good scheme?
  • 1. Original Scheme
  • Percent correct = 100 x (28 + 2680) / 2803
  • = 96.6%
  • so it is a very accurate scheme!
  • or is it?

16
2. Performance of 2nd (reference) forecast
method: never predict a tornado - a lazy
forecast scheme!
17
Performance measures
Percent Correct
  • 1. Original Scheme
  • Percent correct = 100 x (28 + 2680) / 2803
  • = 96.6%
  • 2. Reference Lazy Scheme
  • Percent correct = 100 x (0 + 2752) / 2803
  • = 98.2% !!

18
Performance measures
  • Hit rates
  • (1) 28/51 - so over half the tornadoes predicted
  • (2) reference scheme: 0/51 - no tornadoes
    predicted
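Putting the two schemes side by side with the counts from the slides above (the 23 misses of the original scheme are implied by 51 observed tornadoes minus 28 hits; the other numbers are quoted directly):

    def percent_correct(hits, correct_rejections, total):
        return 100.0 * (hits + correct_rejections) / total

    def hit_rate(hits, misses):
        return hits / (hits + misses)

    # 1. Original scheme
    print(percent_correct(28, 2680, 2803))   # 96.6 - looks very accurate
    print(hit_rate(28, 51 - 28))             # 0.55 - over half the tornadoes predicted

    # 2. Reference "lazy" scheme: never predict a tornado
    print(percent_correct(0, 2752, 2803))    # 98.2 - higher, despite predicting nothing
    print(hit_rate(0, 51))                   # 0.0  - no tornadoes predicted

The comparison shows why percent correct alone can mislead for rare events: the hit rate exposes the difference between the two schemes.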

19
Value
  • Suppose an unexpected (unpredicted) tornado
    causes $500 million damage and that an expected
    (predicted) tornado results in only $100 million
    damage
  • So forecast scheme (1) saves 28 x $400 million
    compared to forecast scheme (2)
  • a huge saving - highly valuable!!
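The saving quoted above follows from a simple damage comparison, sketched here with the slide's costs ($500 million per unpredicted tornado, $100 million per predicted one) and the 51 observed tornadoes from the contingency table:

    UNPREDICTED_COST = 500e6   # damage from an unexpected (unpredicted) tornado
    PREDICTED_COST = 100e6     # damage when the tornado was predicted

    # Scheme (1) predicted 28 of the 51 tornadoes (51 - 28 = 23 missed);
    # the lazy scheme (2) predicted none of them
    damage_scheme_1 = 28 * PREDICTED_COST + 23 * UNPREDICTED_COST
    damage_scheme_2 = 51 * UNPREDICTED_COST
    print(damage_scheme_2 - damage_scheme_1)   # 11.2e9, i.e. 28 x $400 million saved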

20
(No Transcript)
21
(No Transcript)
22
Categorical versus probabilistic
  • Categorical
  • The temperature will be 26ºC tomorrow
  • Probabilistic
  • There is a 30% chance of rain tomorrow
  • There is a 90% chance that wet season rainfall
    will be above the median

23
Artificial Skill
  • danger of too many inputs
  • danger of trying too many inputs
  • independent data
  • cross-validation (see the sketch after this list)
  • importance of supporting evidence
  • simple plausible hypothesis
  • climate models
  • process studies
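Cross-validation is one common guard against artificial skill: the scheme is refitted with each year withheld in turn and verified only on the withheld year, so the verification data are independent of the data used for fitting. A minimal leave-one-out sketch, using hypothetical predictor and rainfall values and simple linear regression as a stand-in for the forecast scheme:

    def fit_linear(xs, ys):
        # Least-squares fit of y = a + b*x (stand-in for a statistical forecast scheme)
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
        return my - b * mx, b

    def leave_one_out_errors(predictor, target):
        # Refit without year i, forecast year i, and collect the independent errors
        errors = []
        for i in range(len(target)):
            xs = predictor[:i] + predictor[i + 1:]
            ys = target[:i] + target[i + 1:]
            a, b = fit_linear(xs, ys)
            errors.append(a + b * predictor[i] - target[i])
        return errors

    # Hypothetical predictor index and seasonal rainfall totals (mm)
    print(leave_one_out_errors([-5.0, 2.0, 8.0, -1.0, 4.0],
                               [310.0, 420.0, 500.0, 350.0, 460.0]))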

24
How do users verify predictions?
  • No single answer, however
  • some switch from probabilistic to categorical
  • media prefer categorical forecasts
  • assessments made on a single season
  • extrapolation

25
How can we assist users in verification?
  • Increase access to verification information
  • Simplify information
  • Build partnerships
  • media
  • users / user groups
  • other government departments
  • Education (booklets, web, ...)

26
Summary
  • Verification is crucial but care is needed!
  • Familiarise yourself with the terminology used
  • skill, accuracy, value, ...
  • No single measure tells the whole story
  • Importance of using independent data in
    verification
  • Keep it simple
  • Communicating verification results is challenging
  • Users sometimes do their own verification -
    sobering
  • Most people like to think categorically -
    challenging
  • Dialogue with end-users is very important