1
Uncertainty Analysis and Model Validation
2
Final Project
  • Summary of Results
  • Conclusions

3
In a real-world problem we need to establish model-specific calibration criteria and define targets, including their associated error.
[Figure: calibration target diagrams. Each target pairs a calibration value (20.24 m) with an associated error (± 0.80 m); one target is shown with a smaller associated error, the other with a relatively large associated error.]
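As a minimal sketch of how such criteria can be applied (the well names and numbers below are hypothetical, not from the presentation), each simulated head can be checked against its calibration value plus or minus the associated error:

```python
# Hypothetical sketch: flag calibration targets whose simulated head
# falls outside (calibration value +/- associated error).
targets = [
    # (name, calibration value [m], associated error [m], simulated head [m])
    ("MW-1", 20.24, 0.80, 20.61),
    ("MW-2", 18.10, 0.30, 18.55),
]

for name, obs, err, sim in targets:
    residual = sim - obs
    status = "OK" if abs(residual) <= err else "MISS"
    print(f"{name}: residual = {residual:+.2f} m (target +/- {err:.2f} m) -> {status}")
```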
4
  • Smith Creek Valley (Thomas et al., 1989)
  • Calibration Objectives
    1. Heads within 10 ft of measured heads; allows for measurement error and interpolation error.
    2. Absolute mean residual (ARM) between measured and simulated heads close to zero (0.22 ft) and standard deviation minimal (4.5 ft); see the sketch after this list.
    3. Head difference between layers 1 and 2 within 2 ft of field values.
    4. Distribution of ET and ET rates match field estimates.
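A minimal sketch of how objective 2 might be evaluated (the head values are invented for illustration; NumPy is used for the statistics):

```python
import numpy as np

# Hypothetical measured vs. simulated heads [ft] at target wells.
measured = np.array([4510.2, 4498.7, 4475.0, 4460.3])
simulated = np.array([4509.8, 4499.5, 4474.1, 4461.0])

residuals = measured - simulated
arm = np.mean(np.abs(residuals))   # absolute mean residual (ARM)
sd = np.std(residuals, ddof=1)     # standard deviation of residuals

# Objective 2 asks that ARM be close to zero and the standard deviation
# be small (0.22 ft and 4.5 ft in the Smith Creek Valley study).
print(f"ARM = {arm:.2f} ft, std dev = {sd:.2f} ft")
```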

5
Also need to identify calibration parameters and
their reasonable ranges.
6
(No Transcript)
7
Calibration to Fluxes
  • When recharge rate (R) is a calibration
    parameter, calibrating to fluxes can help in
    estimating K and/or R.

8
In this example, flux information helps calibrate K.
Darcy's law: q = K I, where the hydraulic gradient I = (H1 - H2) / L is computed from the measured heads H1 and H2. K = ? is the unknown; a measured flux q pins it down via K = q / I.
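A minimal sketch of backing K out of a measured flux under these assumptions (all numbers are hypothetical):

```python
# Hypothetical one-dimensional Darcy calculation: estimate K from a
# measured flux q and the head gradient between two wells.
h1, h2 = 105.0, 100.0   # heads H1, H2 [m]
distance = 500.0        # distance L between the wells [m]
q = 0.02                # measured Darcy flux [m/day]

gradient = (h1 - h2) / distance   # I = (H1 - H2) / L
K = q / gradient                  # q = K * I  =>  K = q / I
print(f"estimated K = {K:.1f} m/day")
```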
9
In this example, discharge information helps calibrate R.
R = ? is the unknown recharge rate.
10
In our example, total recharge is known/assumed to be 7.14 × 10⁸ ft³/year, and at steady state discharge = recharge. All water discharges to the playa, so calibration to ET merely fine-tunes the discharge rates within the playa area.
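A minimal sketch of this steady-state water-balance check (the split between playa ET and other playa discharge is invented for illustration):

```python
# Hypothetical steady-state water balance: total simulated discharge
# must equal the known/assumed total recharge.
recharge = 7.14e8   # total recharge [ft^3/year]

# Invented breakdown of simulated discharge components [ft^3/year].
discharge = {"playa ET": 6.90e8, "other playa discharge": 0.24e8}

total_discharge = sum(discharge.values())
imbalance = total_discharge - recharge
print(f"discharge - recharge = {imbalance:.3e} ft^3/year "
      f"({100 * imbalance / recharge:.2f}% of recharge)")
```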
11

12
Includes results from 2000, 2001, 2003
13
Includes results from 2000, 2001, 2003
14
Particle Tracking
15
Observations
Predicted ARM > Calibrated ARM
Predicted ARM at pumping wells > Predicted ARM at nodes with targets
Flow predictions are more robust (consistent among different calibrated models) than transport (particle tracking) predictions.
16
Conclusions
  • Calibrations are non-unique.
  • A good calibration (even if ARM ≈ 0) does not ensure that the model will make good predictions.
  • You can never have enough field data.
  • Modelers need to maintain a healthy skepticism about their results.
  • An uncertainty analysis is needed to accompany calibration results and predictions.

17
Uncertainty in the Calibration
Involves uncertainty in
  • Targets
  • Parameter values
  • Conceptual model, including boundary conditions, zonation, geometry, etc.

18
Ways to analyze uncertainty in the calibration:
  • Sensitivity analysis (sketched below)
  • Use an inverse model (automated calibration) to quantify uncertainties and optimize the calibration.
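A minimal sketch of a one-at-a-time sensitivity analysis (the `run_model` function is a hypothetical stand-in for a forward run of the groundwater model, and its response surface is invented):

```python
# Hypothetical one-at-a-time sensitivity analysis: perturb each
# calibration parameter and record the change in a head-residual metric.

def run_model(K, R):
    """Stand-in for a forward model run; returns the absolute mean
    residual (ARM) for conductivity K and recharge R."""
    # Invented response surface, for illustration only.
    return abs(K - 2.0) * 0.5 + abs(R - 7.14e8) * 1e-9

base = {"K": 2.0, "R": 7.14e8}
for name in base:
    for factor in (0.9, 1.1):   # perturb each parameter by +/- 10%
        p = dict(base)
        p[name] *= factor
        arm = run_model(**p)
        print(f"{name} x {factor:.1f}: ARM = {arm:.3f} ft")
```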
19
Uncertainty in the Prediction
  • Reflects uncertainty in the calibration.
  • Involves uncertainty in how parameter values (e.g., recharge) will vary in the future.

20
Ways to quantify uncertainty in the prediction:
  • Sensitivity analysis
  • Stochastic simulation
21
MADE site (Feehley and Zheng, 2000, WRR 36(9)).
22
(No Transcript)
23
A Monte Carlo analysis considers 100 or more
realizations.
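A minimal sketch of the Monte Carlo idea (the lognormal K distribution and the `predict_head` stand-in are hypothetical, not from the presentation):

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def predict_head(K):
    """Hypothetical stand-in for a model run returning a predicted
    head [m] at a well for conductivity K [m/day]."""
    return 20.0 + 1.5 / K   # invented response, for illustration only

# 100 or more realizations of K drawn from a lognormal distribution.
Ks = rng.lognormal(mean=np.log(2.0), sigma=0.5, size=200)
heads = np.array([predict_head(K) for K in Ks])

# Summarize prediction uncertainty across the realizations.
print(f"mean = {heads.mean():.2f} m, "
      f"90% interval = [{np.percentile(heads, 5):.2f}, "
      f"{np.percentile(heads, 95):.2f}] m")
```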
24
(No Transcript)
25
Stochastic modeling option in GW Vistas
26
Ways to quantify uncertainty in the prediction:
  • Sensitivity analysis
  • Scenario analysis
  • Stochastic simulation
27
Model Validation
How do we validate a model so that we have
confidence that it will make accurate predictions?
28
Modeling Chronology
  • 1960s: Flow models are great!
  • 1970s: Contaminant transport models are great!
  • 1975: What about uncertainty of flow models?
  • 1980s: Contaminant transport models don't work (because of failure to account for heterogeneity).
  • 1990s: Are models reliable? Concerns over reliability in predictions arose over efforts to model a geologic repository for high-level radioactive waste.
29
The objective of model validation is "to determine how well the mathematical representation of the processes describes the actual system behavior in terms of the degree of correlation between model calculations and actual measured data" (NRC, 1990).
30
What constitutes validation? (code vs. model)
NRC study (1990): Model validation is not possible.
Oreskes et al. (1994) paper in Science:
  • Calibration = forced empirical adequacy
  • Verification = assertion of truth (possible in a closed system, e.g., testing of codes)
  • Validation = establishment of legitimacy (does not contain obvious errors), confirmation, confidence building
31
How to build confidence in a model:
  • Calibration (history matching): steady-state calibration(s), transient calibration
  • Verification: requires an independent set of field data
  • Post-audit: requires waiting for the prediction to occur
  • Models as interactive management tools
32
HAPPY MODELING!
33
Have a good summer!