Title: Martin Pilch, PhD, PMP


1
Welcome and Opening Remarks
  • Martin Pilch, PhD, PMP
  • Validation and Uncertainty Quantification, 1533
  • SNL/ASC VV Program Element Manager
  • Email: mpilch@sandia.gov
  • Ph: 505-845-3047
  • Presented at
  • Validation Challenge Workshop
  • Albuquerque, NM
  • May 22-23, 2006

Sandia is a multiprogram laboratory operated by
Sandia Corporation, a Lockheed Martin Company,
for the United States Department of Energy
under contract DE-AC04-94AL85000.
2
Welcome to Albuquerque
  • Land of Enchantment

3
Why Care About Validation?
  • Modeling and simulation is playing an increasing
    role in the design of high-consequence systems
    and in the assessment of their regulatory
    compliance
  • Accurate quantification of margins and
    uncertainties in the decision context requires
    that (see the sketch after this list)
  • 1) models accurately capture trends appropriate
    to the application parameter space
  • 2) sources of application-important variabilities
    can be reflected through the model
  • 3) uncertainties associated with the use of the
    model in the application parameter space can be
    quantified
  • Model validation is a necessary, but not
    sufficient, element in establishing the
    credibility of models for these important types
    of applications
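A minimal sketch of what those three requirements produce in practice. Everything here is hypothetical (the model form, parameter distributions, and requirement value are invented for illustration, not taken from the workshop problems): input variability is propagated through a stand-in model and summarized as a margin-to-uncertainty ratio of the kind a QMU analysis reports.

```python
import numpy as np

rng = np.random.default_rng(42)

def model_response(theta):
    """Hypothetical stand-in for a computational model: maps an
    uncertain input vector to a scalar system response."""
    return 10.0 + 2.0 * theta[0] - 0.5 * theta[1] ** 2

# Requirements 1) and 2): propagate application-relevant input
# variability through the model (plain Monte Carlo sampling).
samples = rng.normal(loc=[1.0, 0.5], scale=[0.2, 0.1], size=(10_000, 2))
responses = np.array([model_response(t) for t in samples])

# Requirement 3): summarize margin and uncertainty against a
# (hypothetical) regulatory limit that the response must stay below.
requirement = 14.0
margin = requirement - responses.mean()  # nominal distance to the limit
uncertainty = responses.std(ddof=1)      # spread induced by input variability

print(f"M = {margin:.2f}, U = {uncertainty:.2f}, M/U = {margin / uncertainty:.2f}")
```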

4
Definition of Validation
  • Validation: The process of determining the degree
    to which a model is an accurate representation of
    the real world from the perspective of the
    intended uses of the model
  • AIAA, Guide for the Verification and Validation
    of Computational Fluid Dynamics Simulations,
    (1998)

5
Intended Use: Validation is Application Specific
  • Regulatory Assessment (Application) - validation
    is best judged in the application context, which
    often involves a rigorous assessment against
    design or regulatory requirements
  • Accreditation - subsystem or full-system testing
    with application hardware under conditions that
    more closely represent the application of the
    model
  • Ensemble Validation - separate physics or
    low-order interactions of important physics in
    stylized or de-featured geometries, often for
    environments that are not fully representative of
    the application parameter space
  • Material Characterization - identification of
    material properties or constitutive-law
    parameters

6
Challenge Problems: Benchmarks for Methodology
Comparison
  • Assessing accuracy and adequacy of a model when
    there is a database of multiple tests
  • Assessing accuracy and adequacy of a model when
    there is only a single test (a simple numerical
    illustration of this contrast follows the list)
  • Assessing the impact of variabilities and
    uncertainties when using the model to extrapolate
    beyond existing databases
  • Assessing confidence in regulatory assessments
    based on limited data and uncertainties in the
    use of the model
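As a toy illustration of the ensemble-versus-single-test contrast, the sketch below (the prediction and measurements are made-up numbers, not workshop data, and this is only one of many possible metrics) forms a confidence interval on the mean model-minus-experiment discrepancy when repeated tests exist; the closing comment notes why the single-test case is harder.

```python
import numpy as np
from scipy import stats

model_prediction = 3.2                               # model output at the test condition
experiments = np.array([3.5, 3.1, 3.8, 3.4, 3.6])    # repeated tests, same condition

d = experiments - model_prediction                   # observed discrepancies
n = d.size
half_width = stats.t.ppf(0.975, df=n - 1) * d.std(ddof=1) / np.sqrt(n)

print(f"mean discrepancy = {d.mean():.3f} +/- {half_width:.3f} (95% CI)")
# With a single test (n = 1), the sample standard deviation is undefined:
# model-form error cannot be separated from test-to-test variability,
# which is what makes the single-test benchmark the harder question.
```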

7
Three Challenge Problems
  • Three disciplines to engage broad interest and
    historical perspective
  • The hope is that the methodology is independent
    of discipline

8
Key Features Incorporated into Each Challenge
Problem
  • Provides an application context, requiring
    extrapolation of models beyond their validation
    basis, with a regulatory requirement stated in
    probabilistic terms
  • Reflects a hierarchical approach to validation:
    material characterization, validation against an
    ensemble of data, validation against a single
    test
  • Easy-to-evaluate models that should not require
    subject-matter expertise
  • Synthetic experimental data generated from a
    "truth model" acting as a surrogate for Mother
    Nature (a toy version of this setup is sketched
    below)

Truth model(s) never to be revealed!
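A toy version of the synthetic-data setup, purely for illustration: the exponential-decay form, parameter values, and function names below are invented and are not the workshop's actual generator. Participants would receive only the sampled array, never the hidden function that produced it.

```python
import numpy as np

rng = np.random.default_rng(0)

def _truth_model(x, k):
    """Hidden surrogate for Mother Nature; in the challenge problems
    the generator is never revealed to participants."""
    return np.exp(-k * x)

def generate_experiments(x, n_units=4):
    """Each test unit draws its own material parameter k, so the data
    carry unit-to-unit variability. No measurement error is added,
    matching the challenge-problem ground rules."""
    ks = rng.lognormal(mean=np.log(0.8), sigma=0.15, size=n_units)
    return np.array([_truth_model(x, k) for k in ks])

x = np.linspace(0.0, 2.0, 11)     # sensor locations (or times)
data = generate_experiments(x)    # all that participants would see
print(data.shape)                 # (4, 11): four units, eleven locations
```

Note the design choice in this sketch: variability enters through unit-to-unit parameter draws rather than added measurement noise, consistent with the ground rules listed on the next slide.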
9
Key Features NOT Incorporated into Each Challenge
Problem
  • Diagnostic variability and uncertainty
    (measurement errors) were not added to the
    experimental data
  • Over-simplification of real-world experimental
    conditions
  • Numerical errors need not be addressed
  • Simple, easy-to-evaluate models can be assumed to
    be free of numerical error
  • Many real-world applications may require the use
    of under-resolved models
  • Nonlinear coupled multi-physics
  • Nonlinear coupled multi-physics is common in many
    real-world applications
  • May require validation against system response
    quantities (SRQs) that are different from what
    the application demands

10
Workshop Format
  • Presentations grouped by problem discipline
  • 15 min problem description
  • 30 min presentation/15 min (immediate)
    discussion/questions
  • Address the questions identified in the tasking
    document
  • 45 min discussion period after the 4
    presentations
  • Discuss/compare/contrast methodologies for the
    same problem
  • Workshop summary and closure
  • 60 min summary, lessons learned, path forward

Discussion is an integral part of the workshop!
11
Workshop Participants Chosen Because of Their
Diverse Perspectives
  • Communities: academia, professional committees,
    national laboratories, and industry
  • Backgrounds: various engineering disciplines,
    math/statistics
  • Viewpoints: Bayesian, engineering, frequentist,
    validation-as-calibration, validation-as-assessment

We expect discussions from diverse perspectives!
12
Workshop Proceedings
  • We ask your permission to post copies of the
    presentations on the workshop web site
  • Special issue of Computer Methods in Applied
    Mechanics and Engineering, ed. T.J.R. Hughes,
    J.T. Oden, M. Papadrakakis
  • Proposed schedule
  • Jan. 1, 2007: submit papers for peer review
  • Special issue to be published in 2008
  • Guidelines
  • Maximum of 25 pages (8-1/2 x 11)
  • CMAME format (minimal color)

13
Summary
  • Focus is methodology, not results
  • Questions/Discussion encouraged during the
    presentations

Setting the national agenda!
14
Concluding Programmatic Challenge
How do you measure and communicate progress in
Predictive Capability?