1
How reliable are ONS indicators?
  • Paul Smith
  • Methodology Group
  • Office for National Statistics

2
Outline
  • What is reliability?
  • Quality attributes
  • Designing in quality
  • Examples
  • trade-offs of quality attributes
  • trade-offs of quality and cost
  • Conclusion

3
What is reliability?
  • Resists numerical definition
  • Fitness for purpose
  • Purpose is defined by the user
  • Many statistics have multiple purposes

4
Quality Attributes
  • ONS concentrating on quality measures
  • National Statistics quality measurement and
    reporting framework
  • Relevance
  • Completeness
  • Timeliness
  • Accessibility
  • Accuracy
  • Comparability
  • Coherence
  • Derived from Eurostat quality framework

5
Relevance
  • Statistical concepts should be those that users
    need
  • Concepts
  • international standards (eg ILO unemployment)
  • How do we know what users need?
  • triennial reviews, NS quality reviews, user
    groups
  • users generally ask for the same (only more)

6
Completeness
  • Domains for which estimates are available should
    reflect users' needs
  • industrial classification domains
  • SIC2003
  • regional domains less well served
  • Allsopp review
  • may need survey redesigns
  • microdata: the ultimate freedom for defining
    domains

7
Timeliness
  • How soon after the reference period are results
    produced?
  • what other attributes of quality are affected by
    speed?
  • Is the frequency right for decisions based on the
    results?
  • Dates pre-announced

8
Accessibility and clarity
  • Data clearly presented
  • Accessible in a variety of formats
  • paper
  • web
  • electronic
  • Unpublished data available on request
  • Metadata
  • Assistance for users

9
Accuracy
  • Sampling errors
  • random sampling (a standard formula is given
    below)
  • Non-sampling errors
  • non-response error
  • coverage error
  • measurement error
  • processing error
  • modelling errors
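For reference, a textbook result (not shown on the slide): under simple random sampling without replacement, the sampling variance of a sample mean is

$$ \operatorname{Var}(\bar{y}) = \Bigl(1 - \frac{n}{N}\Bigr)\,\frac{S^2}{n}, \qquad S^2 = \frac{1}{N-1}\sum_{i=1}^{N} (Y_i - \bar{Y})^2 , $$

so sampling error shrinks predictably with the sample size n. The non-sampling errors listed above have no comparably simple formula, which is why the next slides fall back on ideal measures and practical indicators.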

10
Accuracy (2)
  • non-response error
  • ideal: measure bias due to non-response
  • indicator: response rates (by questionnaire/
    activity)
  • coverage error
  • ideal: measure bias due to under-/over-coverage
  • requires supplementary survey
  • measurement error
  • ideal: measure bias due to measurement error
  • indicator: questionnaire problems, proxy response
    rate

11
Accuracy (3)
  • processing error
  • ideal: measure error due to poor processing
    (miskeys, bad scans, etc.); this can be measured
    with some effort
  • modelling errors
  • ideal: how sensitive are the results to the
    choice of model/methods?
  • Do the models reflect the data structures (are
    model assumptions valid)?

12
Comparability...
  • across space and time
  • Consistent approach and methods in different
    areas/regions
  • Breaks in time series accompanied by
  • description of change
  • estimate of effect of change
  • clear labelling of change
  • back series estimated on a consistent basis

13
Coherence
  • Coherence is promoted by common
  • definitions
  • classifications
  • methods
  • sources
  • Do different sources tell a similar story?
  • National Accounts balancing
  • Changes from provisional to final estimates are
    unbiased
  • Different periodicities

14
Designing in quality
  • ONS actively seeking information on how
    statistics are used
  • NS theme groups, quality reviews
  • triennial reviews
  • Quality assurance procedures for all new
    methodology

15
Some examples
16
What do users want?
  • What do users look at first when new data are
    published?
  • Has it changed from last time?
  • Decisions usually based on changes in statistics
  • has policy X had an effect?
  • have we reached a turning point in the economy?
  • should we change interest rates?

17
Stability of variance estimates
  • Variance estimates are often themselves variable
  • What should be published? Users prefer an
    indicator of accuracy (a simulation sketch of the
    problem follows below)
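A minimal simulation sketch of that instability (the lognormal population, sample size and replication count are all invented for illustration; nothing here is taken from the presentation):

```python
import numpy as np

rng = np.random.default_rng(42)

# Invented set-up: a skewed population; repeatedly draw samples of
# size n and compute the usual estimate s^2/n of the variance of the
# sample mean, then see how much that estimate itself fluctuates.
population = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)
n = 50

var_estimates = []
for _ in range(1_000):
    sample = rng.choice(population, size=n, replace=False)
    var_estimates.append(sample.var(ddof=1) / n)  # estimated Var(mean)

var_estimates = np.array(var_estimates)
cv = var_estimates.std(ddof=1) / var_estimates.mean()
print(f"mean of the variance estimates: {var_estimates.mean():.5f}")
print(f"CV of the variance estimates:   {cv:.2f}")
```

With a skewed population the coefficient of variation of the variance estimate comes out well above 50 per cent, which is exactly why a raw variance estimate can be an unhelpful thing to publish on its own.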

18
(Figure slide: no transcript available)
19
Non-response error and coherence vs accuracy
  • LFS weighting compensates for differential
    non-response
  • Ensures consistency with population estimates...
  • ...at detailed regional level
  • Can potentially adjust to many variables
  • Variance is made up of variability due to the
    data and variability due to the weights (the
    general calibration set-up is sketched below)
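The LFS weighting described here is an instance of calibration estimation. In one standard formulation, chi-square-distance calibration (Deville and Särndal 1992), noting that the actual LFS implementation may differ in detail, the weights $w_i$ stay as close as possible to the design weights $d_i$ while exactly reproducing known population totals:

$$ \min_{w}\; \sum_{i \in s} \frac{(w_i - d_i)^2}{d_i} \quad \text{subject to} \quad \sum_{i \in s} w_i \mathbf{x}_i = \mathbf{X}, $$

where $\mathbf{x}_i$ collects the auxiliary variables (for the LFS, indicators of age, sex and region) and $\mathbf{X}$ the corresponding known population totals. Each extra constraint buys coherence with the population estimates at the price of more variable weights, which is the trade-off the next slide takes up.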

20
Existing balance
  • LFS - many constraints, lots of consistency
  • probably suboptimal accuracy at national level
  • different impact on regional accuracy and
    national accuracy
  • Business surveys - constraints in strata
  • one constraint per stratum - accuracy within
    strata
  • too much stratification?
  • trade-off between bias and variance in strata
    with small sample sizes (see the MSE identity
    below)
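The small-stratum trade-off in the last bullet is conventionally judged on mean squared error, via the standard decomposition

$$ \operatorname{MSE}(\hat{\theta}) = \operatorname{Var}(\hat{\theta}) + \operatorname{Bias}(\hat{\theta})^2 , $$

so a slightly biased estimator (for example, one that borrows strength across strata) can beat an unbiased one whose variance explodes when the stratum sample is small.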

21
Constraints and variance
22
RSI (Retail Sales Index) - stratum minimum sample size
  • Ratio estimator (a standard form is sketched
    below)
23
Variance of changes
  • To support users, we want the variance of changes
  • If the population and sample stayed the same,
    calculating this would be easy (see the identity
    below)
  • but things are not so simple
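The formula lost from the slide is presumably the elementary identity for the variance of a difference,

$$ \operatorname{Var}(\hat{Y}_t - \hat{Y}_{t-1}) = \operatorname{Var}(\hat{Y}_t) + \operatorname{Var}(\hat{Y}_{t-1}) - 2\,\operatorname{Cov}(\hat{Y}_t, \hat{Y}_{t-1}) . $$

With a fixed population and an overlapping sample, the covariance term is large and positive, so changes are estimated far more precisely than levels; births, deaths and sample rotation (the next slide) are what make the covariance hard to pin down in practice.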

24
Population and sample dynamics
25
Illustrative example: variances of changes
  • Monthly production inquiry 2000 (not seasonally
    adjusted)

26
Sampling error for the IoP (Index of Production)
  • Variance calculation is not straightforward for
    complex statistics
  • Kokic (1998): a bootstrap method for estimating
    the sampling error of (the level of) the
    non-seasonally adjusted IoP (a generic bootstrap
    sketch follows below)
  • Work includes the effects of the sampling
    variability in the estimates of
  • turnover
  • adjustment for inventories
  • deflation
  • Component sampling errors needed as an input
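A generic bootstrap sketch of the resample-and-recompute idea. This is not Kokic's (1998) procedure; the toy index function, distributions and sample sizes below are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def index_of_production(turnover, inventory_adj, deflator):
    """Toy stand-in for the IoP: deflated, inventory-adjusted
    turnover expressed as an index. The real calculation is
    far richer than this."""
    return 100.0 * (turnover.sum() + inventory_adj.sum()) / deflator

# Invented sample data for one industry in one month.
turnover = rng.lognormal(3.0, 1.0, size=200)
inventory_adj = rng.normal(0.0, 5.0, size=200)
deflator = 1.02

# Bootstrap: resample businesses with replacement, recompute the
# index, and use the spread of the replicates as a sampling-error
# estimate.
replicates = []
for _ in range(1_000):
    idx = rng.integers(0, len(turnover), size=len(turnover))
    replicates.append(
        index_of_production(turnover[idx], inventory_adj[idx], deflator)
    )

point = index_of_production(turnover, inventory_adj, deflator)
print(f"point estimate:           {point:.1f}")
print(f"bootstrap standard error: {np.std(replicates, ddof=1):.1f}")
```

The real method must respect the stratified survey design and propagate the component sampling errors (turnover, inventories, deflation) listed above; this sketch shows only the basic pattern.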

27
Illustrative IoP variance estimates (not seasonally adjusted)
28
Decisions and variability
  • Economic decisions are based on movements in
    published series
  • The reliability of a movement is affected by its
    variability
  • Take an LFS example (from David Steel's work in
    the mid-1990s)
  • How fast is turning point detection?
  • How reliable is turning point detection? (a
    standard detectability criterion is given below)
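One conventional way to make "how reliable is the change" concrete (textbook practice, not stated on the slide) is to call a movement statistically detectable when it exceeds roughly twice its standard error:

$$ \bigl|\hat{Y}_t - \hat{Y}_{t-k}\bigr| > 1.96\,\sqrt{\operatorname{Var}\bigl(\hat{Y}_t - \hat{Y}_{t-k}\bigr)} , $$

so the speed and reliability of turning-point detection depend directly on the variance of changes discussed above.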

29
Artificial LFS - unemployment
30
Artificial LFS - 3 month change measured in May
31
Artificial LFS - 1 month change measured in May
32
How reliable is the change?
33
How reliable are ONS indicators?
  • Depends
  • on the various dimensions of quality
  • on the use of the data
  • ONS already produces some quality measures
  • Quality measurement and reporting framework
    leading to development of further measures
  • Not possible for ONS to evaluate all the uses of
    data
  • Important for users to understand the quality
    measures and documentation that are produced