Title: Uncertainty analysis and Model Validation
1 Uncertainty Analysis and Model Validation
2 Final Project
- Summary of Results
- Conclusions
3 In a real-world problem we need to establish model-specific calibration criteria and define targets, including associated error.
[Figure: calibration targets, each shown with a calibration value (e.g., 20.24 m) and an associated error (e.g., ±0.80 m): one target with a smaller associated error, one with a relatively large associated error.]
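A minimal Python sketch of this idea, using the figure's target value (20.24 m) and associated error (±0.80 m); the simulated heads are made-up examples:

# Check whether a simulated head matches a calibration target
# to within the target's associated error.
def meets_target(simulated, target, associated_error):
    return abs(simulated - target) <= associated_error

# Target value and error from the figure; simulated heads are hypothetical.
print(meets_target(simulated=20.80, target=20.24, associated_error=0.80))  # True
print(meets_target(simulated=21.40, target=20.24, associated_error=0.80))  # False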
4 Smith Creek Valley (Thomas et al., 1989)
Calibration Objectives
1. Heads within 10 ft of measured heads. Allows for measurement error and interpolation error.
2. Absolute mean residual between measured and simulated heads close to zero (0.22 ft) and standard deviation minimal (4.5 ft). (See the sketch after this list.)
3. Head difference between layers 1 and 2 within 2 ft of field values.
4. Distribution of ET and ET rates match field estimates.
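A minimal Python sketch of the residual statistics behind objectives 1 and 2; the measured and simulated heads are hypothetical values, and only the thresholds come from the slide:

import numpy as np

measured = np.array([5012.3, 5008.7, 5021.4, 5015.0])   # ft, made-up values
simulated = np.array([5010.9, 5009.5, 5022.8, 5013.6])  # ft, made-up values

residuals = measured - simulated
arm = np.mean(np.abs(residuals))   # absolute mean residual
sd = np.std(residuals)             # standard deviation of residuals

print(f"max |residual| = {np.max(np.abs(residuals)):.2f} ft (objective: within 10 ft)")
print(f"ARM = {arm:.2f} ft (objective: close to zero)")
print(f"SD  = {sd:.2f} ft (objective: minimal)")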
5 Also need to identify calibration parameters and their reasonable ranges.
7 Calibration to Fluxes
- When recharge rate (R) is a calibration parameter, calibrating to fluxes can help in estimating K and/or R.
8 In this example, flux information helps calibrate K.
[Figure: Darcy's law, q = KI, with K unknown and the gradient I defined by measured heads H1 and H2.]
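A minimal Python sketch of rearranging Darcy's law q = KI to estimate K from a measured flux and the gradient between H1 and H2; all numbers are hypothetical, since the slide gives only the symbolic relationship:

q = 0.5                # measured flux, ft/day (hypothetical)
H1, H2 = 105.0, 100.0  # measured heads, ft (hypothetical)
L = 1000.0             # distance between head measurements, ft (hypothetical)

I = (H1 - H2) / L      # hydraulic gradient (dimensionless)
K = q / I              # estimated hydraulic conductivity, ft/day
print(f"I = {I:.4f}, K = {K:.1f} ft/day")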
9 In this example, discharge information helps calibrate R.
[Figure: discharge measurement with R unknown.]
10 In our example, total recharge is known/assumed to be 7.14E8 ft3/year, and discharge equals recharge. All water discharges to the playa. Calibration to ET merely fine-tunes the discharge rates within the playa area.
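A minimal Python sketch of this water balance; the total recharge comes from the slide, while the recharge area is a made-up value used only to show how a recharge rate R would be backed out:

total_recharge = 7.14e8   # ft^3/year (given on the slide)
recharge_area = 2.0e9     # ft^2 (hypothetical model recharge area)

R = total_recharge / recharge_area   # recharge rate, ft/year
print(f"R = {R:.3f} ft/year")

# Steady state: all water discharges to the playa, so discharge equals recharge.
playa_discharge = total_recharge     # ft^3/year, by mass balance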
12 Includes results from 2000, 2001, 2003
13 Includes results from 2000, 2001, 2003
14 Particle Tracking
15 Observations
- Predicted ARM > calibrated ARM.
- Predicted ARM at pumping wells > predicted ARM at nodes with targets.
- Flow predictions are more robust (consistent among different calibrated models) than transport (particle tracking) predictions.
16 Conclusions
- Calibrations are non-unique.
- A good calibration (even if ARM = 0) does not ensure that the model will make good predictions.
- You can never have enough field data.
- Modelers need to maintain a healthy skepticism about their results.
- An uncertainty analysis is needed to accompany calibration results and predictions.
17 Uncertainty in the Calibration
Involves uncertainty in
- the conceptual model, including boundary conditions, zonation, geometry, etc.
18 Ways to analyze uncertainty in the calibration
- Sensitivity analysis
- Use of an inverse model (automated calibration) to quantify uncertainties and optimize the calibration
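A minimal one-at-a-time sensitivity sketch in Python; run_model is a toy stand-in for a real groundwater model run, so only the procedure, not the physics, is meaningful:

def run_model(K, R):
    # Toy "simulated head" as a function of conductivity K and recharge R.
    return 100.0 + 50.0 * R / K

base = {"K": 10.0, "R": 0.5}   # hypothetical calibrated values
h0 = run_model(**base)

for name in base:
    perturbed = dict(base)
    perturbed[name] *= 1.10    # perturb one parameter at a time by +10%
    dh = run_model(**perturbed) - h0
    print(f"{name}: +10% -> head change {dh:+.3f} ft")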
19 Uncertainty in the Prediction
- Reflects uncertainty in the calibration.
- Involves uncertainty in how parameter values (e.g., recharge) will vary in the future.
20 Ways to quantify uncertainty in the prediction
- Sensitivity analysis
- Stochastic simulation
21 MADE site (Feehley and Zheng, 2000, WRR 36(9))
23 A Monte Carlo analysis considers 100 or more realizations.
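A minimal Python sketch of such a Monte Carlo analysis; the parameter distributions and the toy head model are assumptions made only for illustration:

import numpy as np

rng = np.random.default_rng(0)
n = 100   # number of realizations

# Hypothetical input distributions for conductivity and recharge.
K = rng.lognormal(mean=np.log(10.0), sigma=0.5, size=n)  # ft/day
R = rng.normal(loc=0.5, scale=0.1, size=n)               # ft/year

heads = 100.0 + 50.0 * R / K   # toy predicted head for each realization, ft
lo, hi = np.percentile(heads, [5, 95])
print(f"5th-95th percentile of predicted head: {lo:.2f} to {hi:.2f} ft")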
25 Stochastic modeling option in GW Vistas
26 Ways to quantify uncertainty in the prediction
- Sensitivity analysis
- Scenario analysis
- Stochastic simulation
27 Model Validation
How do we validate a model so that we have confidence that it will make accurate predictions?
28 Modeling Chronology
1960s: Flow models are great!
1970s: Contaminant transport models are great!
1975: What about uncertainty of flow models?
1980s: Contaminant transport models don't work (because of failure to account for heterogeneity).
1990s: Are models reliable? Concerns over reliability in predictions arose over efforts to model a geologic repository for high-level radioactive waste.
29 "The objective of model validation is to determine how well the mathematical representation of the processes describes the actual system behavior in terms of the degree of correlation between model calculations and actual measured data" (NRC, 1990).
30 What constitutes validation? (code vs. model)
NRC study (1990): Model validation is not possible.
Oreskes et al. (1994) paper in Science:
- Calibration = forced empirical adequacy
- Verification = assertion of truth (possible in a closed system, e.g., testing of codes)
- Validation = establishment of legitimacy (does not contain obvious errors); confirmation; confidence building
31 How to build confidence in a model
- Calibration (history matching): steady-state calibration(s) and transient calibration
- Verification: requires an independent set of field data
- Post-audit: requires waiting for the prediction to occur
- Models as interactive management tools
32 HAPPY MODELING!
33 Have a good summer!