Transcript and Presenter's Notes

Title: Uncertainty Analysis


1
Uncertainty Analysis and Model Validation,
or Confidence Building
2
Conclusions
  • Calibrations are non-unique.
  • A good calibration (even if ARM ≈ 0) does not
    ensure that the model will make good predictions.
  • Field data are essential in constraining the
    model so that the model can capture the
    essential features of the system.
  • Modelers need to maintain a healthy skepticism
    about their results.

3
Conclusions
  • Head predictions are more robust (consistent
    among different calibrated models) than transport
    (particle tracking) predictions.
  • Need for an uncertainty analysis to accompany
    calibration results and predictions.

Ideally models should be maintained for the long
term and updated to establish confidence in the
model. Rather than a single calibration exercise,
a continual process of confidence building is
needed.
4
Uncertainty in the Calibration
Involves uncertainty in
  • Conceptual model, including boundary conditions,
    zonation, geometry, etc.
  • Parameter values
  • Targets

5
(Figure: parameter field represented by zonation vs. by kriging.)
6
Zonation vs Pilot Points
To use conventional inverse models/parameter
estimation models in calibration, you need to
have a pretty good idea of the zonation (of K,
for example). (A newer version of PEST with pilot
points does not need zonation, as it works with a
continuous distribution of parameter values.)
You also need to identify reasonable ranges for
the calibration parameters and the weights.
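For reference, weighted parameter estimation of this kind
conventionally minimizes a weighted least-squares objective of the
following standard form (a generic statement, not a formula from
these slides; w_i is the weight on target i and sigma_i its
associated error):

  \Phi(\mathbf{p}) \;=\; \sum_{i=1}^{n} w_i \,
  \bigl[ h_i^{\mathrm{obs}} - h_i^{\mathrm{sim}}(\mathbf{p}) \bigr]^2,
  \qquad w_i \;\propto\; 1/\sigma_i^2

The parameter vector p is searched only within the reasonable ranges
identified above, which is what keeps the inverse problem physically
meaningful.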
7
Parameter Values
  • Field data are essential in constraining the
    model so that the model can capture the
    essential features of the system.

8
Calibration Targets
Need to establish model-specific calibration
criteria and define targets, including the
associated error.
(Figure: a calibration target consists of the calibration value and
its associated error, e.g., 20.24 m ± 0.80 m; one target is shown
with a smaller associated error and one with a relatively large
associated error.)
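A minimal sketch of how such targets might be checked; the target
list and the numbers in it are hypothetical, not from the
presentation:

  # Hypothetical calibration-target check: a target is met when the
  # simulated head falls within the calibration value +/- its
  # associated error.
  targets = [
      # (name, calibration value [m], associated error [m], simulated [m])
      ("well-1", 20.24, 0.80, 20.71),
      ("well-2", 18.02, 0.25, 18.40),
  ]

  for name, value, error, simulated in targets:
      residual = simulated - value
      met = abs(residual) <= error
      print(f"{name}: residual = {residual:+.2f} m, "
            f"within +/-{error:.2f} m? {met}")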
9
Examples of Sources of Error in a Calibration
Target
  • Surveying errors
  • Errors in measuring water levels
  • Interpolation error
  • Transient effects
  • Scaling effects
  • Unmodeled heterogeneities

10
Importance of Flux Targets
  • When recharge rate (R) is a calibration
    parameter, calibrating to fluxes can help in
    estimating K and/or R.

R was not a calibration parameter in our final
project.
11
In this example, flux information helps calibrate
K.
q = Ki,   K = ?
(Figure: flow between measured heads H1 and H2.)
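Rearranging Darcy's law shows how a measured flux q, together with
the gradient observed between H1 and H2, pins down K (the numbers in
the comment below are hypothetical, for illustration only):

  q = K\,i, \qquad i = \frac{H_1 - H_2}{L}
  \;\Rightarrow\; K = \frac{q\,L}{H_1 - H_2}
  % hypothetical: q = 0.05 m/d, H_1 - H_2 = 2 m, L = 100 m
  % gives K = (0.05)(100)/2 = 2.5 m/d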
12
Here discharge information helps calibrate R.
R = ?
(Figure: basin with measured discharge Q.)
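The underlying constraint is the steady-state water balance:
measured discharge Q must equal recharge over the contributing area
A (a generic relation, not a slide formula):

  Q = R\,A \;\Rightarrow\; R = \frac{Q}{A}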
13
In this example, flux information helps calibrate
K.
q = Ki,   K = ?
(Figure: flow between measured heads H1 and H2.)
14
In our example, total recharge is known/assumed
to be 7.14E08 ft3/year, and discharge = recharge.
All water discharges to the playa. Calibration to
ET merely fine-tunes the discharge rates within
the playa area. Calibration to ET does not help
calibrate the heads and K values except in the
immediate vicinity of the playa.
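In steady state this is simply the basin water balance, with all
recharge leaving as ET at the playa:

  \sum \text{recharge} \;=\; \sum \text{playa ET discharge}
  \;=\; 7.14\times10^{8}\ \mathrm{ft^3/yr}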
15
  • Smith Creek Valley (Thomas et al., 1989)
  • Calibration Objectives (matching targets)
  1. Heads within 10 ft of measured heads. Allows for
     measurement error and interpolation error.
  2. Absolute residual mean between measured and
     simulated heads close to zero (0.22 ft) and
     standard deviation minimal (4.5 ft). (See the
     residual-statistics sketch after this list.)
  3. Head difference between layers 1-2 within 2 ft
     of field values.
  4. Distribution of ET and ET rates match field
     estimates.
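A minimal sketch of how residual statistics like the absolute
residual mean and standard deviation can be computed; the observed
and simulated heads below are hypothetical placeholders, not project
data:

  # Hypothetical residual statistics for head targets (ft).
  import statistics

  observed  = [6012.3, 6008.1, 5998.7, 6020.5]  # measured heads (hypothetical)
  simulated = [6011.8, 6009.0, 5999.9, 6019.6]  # simulated heads (hypothetical)

  residuals = [o - s for o, s in zip(observed, simulated)]

  mean_residual     = statistics.mean(residuals)                  # near zero?
  abs_residual_mean = statistics.mean(abs(r) for r in residuals)  # ARM
  std_dev           = statistics.stdev(residuals)                 # spread

  print(f"mean residual      = {mean_residual:+.2f} ft")
  print(f"abs. residual mean = {abs_residual_mean:.2f} ft")
  print(f"std. deviation     = {std_dev:.2f} ft")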

16
724 Project Results
(Figure: includes results from 2006 and 4 other years.)
A good calibration does not guarantee an
accurate prediction.
17
Sensitivity analysis to analyze uncertainty in
the calibration:
Use an inverse model (automated calibration) to
quantify uncertainties and optimize the
calibration. Perform sensitivity analysis during
calibration, using sensitivity coefficients (see
below).
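A sensitivity coefficient is conventionally defined as the partial
derivative of a simulated value with respect to a parameter,
estimated by perturbing the parameter (a standard definition; the
notation here is generic, not copied from the slide):

  X_{ik} \;=\; \frac{\partial h_i}{\partial a_k}
  \;\approx\; \frac{h_i(a_k + \Delta a_k) - h_i(a_k)}{\Delta a_k}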
18
Steps in Modeling
(Figure: steps-in-modeling flow chart with a calibration loop;
sensitivity analysis is performed during the calibration.
After Zheng and Bennett.)
19
Uncertainty in the Prediction
  • Reflects uncertainty in the calibration.
  • Involves uncertainty in how parameter values
    (e.g., recharge) or pumping rates will vary
    in the future.
20
Ways to quantify uncertainty in the prediction
Sensitivity analysis - parameters
Scenario analysis - stresses
Stochastic simulation
21
Steps in Modeling
Traditional Paradigm
(Figure: steps-in-modeling flow chart; sensitivity analysis is
performed after the prediction. After Zheng and Bennett.)
22
New Paradigm for Sensitivity/Scenario Analysis:
Multi-model Analysis (MMA)
Predictions and sensitivity analysis are inside
the calibration loop.
(From J. Doherty, 2007)
23
Ways to quantify uncertainty in the prediction
Sensitivity analysis - parameters
Scenario analysis - stresses
Stochastic simulation
24
Stochastic simulation
Stochastic modeling option available in GW Vistas.
MADE site: Feehley and Zheng, 2000, WRR 36(9).
26
A Monte Carlo analysis considers 100 or more
realizations.
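A minimal sketch of the idea in Python, assuming a lognormal K
distribution and a toy response function standing in for the
groundwater model; every name and number here is hypothetical:

  # Hypothetical Monte Carlo sketch: propagate uncertainty in K
  # through a stand-in model by running many realizations.
  import random
  import statistics

  random.seed(0)
  N_REALIZATIONS = 100          # Monte Carlo analyses typically use 100+

  def toy_model(K):
      """Stand-in for a model run: travel time (yr) shortens as K (m/d) grows."""
      return 500.0 / K

  travel_times = []
  for _ in range(N_REALIZATIONS):
      K = random.lognormvariate(0.0, 1.0)   # lognormal K, median 1 m/d
      travel_times.append(toy_model(K))

  travel_times.sort()
  print(f"median travel time : {statistics.median(travel_times):.1f} yr")
  print(f"5th-95th percentile: {travel_times[4]:.1f} - {travel_times[94]:.1f} yr")

Because each realization is an independent model run, the spread of
the outputs (here, the 5th-95th percentile range) is a direct
picture of how parameter uncertainty propagates into the prediction.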
28
(Zheng and Bennett, Fig. 13.2)
29
(Zheng and Bennett, Fig. 13.5: realizations varying hydraulic
conductivity, initial concentrations (plume configuration), or both.)
30
Reducing Uncertainty
(Zheng and Bennett, Fig. 13.6: hypothetical example comparing the
"truth" with estimates based on hard data only, soft and hard data,
and inverse flow modeling.)
31
Model Validation
How do we validate a model so that we have
confidence that it will make accurate predictions?
Confidence Building
32
Modeling Chronology
1960s: Flow models are great!
1970s: Contaminant transport models are great!
1975: What about uncertainty of flow models?
1980s: Contaminant transport models don't work
(because of failure to account for
heterogeneity).
1990s: Are models reliable? Concerns over
reliability in predictions arose from efforts to
model geologic repositories for high-level
radioactive waste.
33
"The objective of model validation is to
determine how well the mathematical
representation of the processes describes the
actual system behavior in terms of the degree of
correlation between model calculations and actual
measured data" (NRC, 1990).
Hmmmm... sounds like calibration. What they
really mean is that a valid model will yield an
accurate prediction.
34
What constitutes validation? (code vs. model)
NRC study (1990): Model validation is not possible.
Oreskes et al. (1994) paper in Science:
Calibration = forced empirical adequacy
Verification = assertion of truth (possible
in a closed system, e.g., testing of codes)
Validation = establishment of legitimacy (does
not contain obvious errors);
confirmation, confidence building
35
How to build confidence in a model:
Calibration (history matching)
  steady-state calibration(s)
  transient calibration
Verification
  requires an independent set of field data
Post-Audit
  requires waiting for the prediction to occur
Models as interactive management tools
  (e.g., the AEM model of The Netherlands)
36
HAPPY MODELING!