

Page 1: Verification and Validation in Computational Engineering

Verification and Validation in Computational Engineering and Sciences

Serge Prudhomme

Département de mathématiques et de génie industriel, École Polytechnique de Montréal, Canada

SRI Center for Uncertainty Quantification, King Abdullah University of Science and Technology, Saudi Arabia

TERATEC 2015 FORUM, École Polytechnique, Palaiseau, France

June 23-24, 2015

S. Prudhomme V&V in CES June 23-24, 2015 1 / 37

Page 2: Verification and Validation in Computational Engineering

Outline

• Introduction and definitions.

• Description of the validation process.

• Numerical examples.
  – Data model reduction
  – Model selection

• Concluding remarks

Model validation requires HPC!


Page 3: Verification and Validation in Computational Engineering

Introduction

[Figures: vehicle during atmospheric entry; ablation rate.]

Project conducted at the PSAAP Center at ICES, UT Austin.
Objective: predict the ablation rate of a vehicle during atmospheric entry.


Page 4: Verification and Validation in Computational Engineering

Introduction

SLEIPNER A Offshore Platform – Norway, 1991

• Damage: $700 million

• Finite element analysis

• Nastran software

• Stresses underestimated

• Tricell structure


Page 5: Verification and Validation in Computational Engineering

Introduction

Predictive Science: The Fundamentals

MATHEMATICAL MODEL:

A collection of mathematical constructions that represent the essential aspects of a system in a usable form (a mathematical representation of observations and theory proposed for explaining specific physical phenomena).

QUANTITIES OF INTEREST:

Specific objectives that can be expressed as the target outputs of a model (mathematically, they are often defined by functionals of the solutions and provide focus on the goal of scientific computation).


Page 6: Verification and Validation in Computational Engineering

Introduction

ABSTRACT MATHEMATICAL MODEL:

A mathematical model determines a map of physical parameters θ ∈ M into theoretical observables Q ∈ D for various scenarios s ∈ S:

Evaluate Q(u(θ, s)) s.t. A(θ, s; u) = 0, given θ ∈ M

with
A = the mathematical model
θ = the model (material) parameters
s = the particular scenario to which the model is applied
Q = the quantity of interest (observable) for scenario s

COMPUTATIONAL MODEL:

A discretization (or corruption) of a mathematical model designed to render it to a form that can be processed by computing devices.

Evaluate Q(uh(θ, s)) s.t. Ah(θ, s; uh) = 0, given θ ∈ M
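As a concrete (and deliberately simple) illustration of this abstract setting, the sketch below takes A to be a one-dimensional diffusion problem −θ u″ = 1 with a single parameter θ, uses central finite differences as the computational model Ah, and evaluates the QoI Q(u) = u(1/2). The model problem, function names, and parameter value are all illustrative choices, not taken from the talk:

```python
import numpy as np

def solve_heat(theta, n):
    """Computational model A_h(theta; u_h) = 0: central differences for
    -theta * u'' = 1 on (0, 1) with u(0) = u(1) = 0 (a toy stand-in for A)."""
    h = 1.0 / n
    x = np.linspace(0.0, 1.0, n + 1)
    A = (np.diag(2.0 * np.ones(n - 1)) - np.diag(np.ones(n - 2), 1)
         - np.diag(np.ones(n - 2), -1)) * theta / h**2
    u = np.zeros(n + 1)
    u[1:-1] = np.linalg.solve(A, np.ones(n - 1))
    return x, u

def qoi(x, u):
    """Quantity of interest Q(u): the midpoint value of the solution."""
    return float(np.interp(0.5, x, u))

theta = 2.0                       # single "material" parameter
x, uh = solve_heat(theta, 64)
Q_exact = 1.0 / (8.0 * theta)     # exact solution is u = x(1-x)/(2*theta)
Q_h = qoi(x, uh)
print(Q_exact, Q_h)
```

Here Q(u(θ, s)) and Q(uh(θ, s)) coincide up to roundoff, since central differences are nodally exact for this particular problem; in general the two differ by the discretization error.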


Page 7: Verification and Validation in Computational Engineering

Introduction

Verification

The process of determining the accuracy with which a computational model can produce results deliverable by the mathematical model on which it is based.

”Do we solve the equations right?” (Roache, 2009)

• Code Verification:
  – Manufactured solution method;
  – Estimation of rates of convergence; etc.

• Solution Verification:
  – It is based on a posteriori error estimation and adaptive methods (with respect to quantities of interest).
  – Its main objective is to verify that discretization parameters (in space, time, and stochastic dimension) and numerical parameters (model reduction method, nonlinear solver, etc.) are correctly chosen.
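Code verification by manufactured solutions can be sketched in a few lines: pick a solution, derive the forcing that makes it exact, and check that the observed convergence rate matches the scheme's theoretical order. A minimal example for −u″ = f with second-order finite differences (the model problem is my choice for illustration, not the speaker's):

```python
import numpy as np

def fd_solve(f, n):
    """Second-order central-difference solve of -u'' = f on (0, 1), u(0) = u(1) = 0."""
    h = 1.0 / n
    x = np.linspace(0.0, 1.0, n + 1)
    A = (np.diag(2.0 * np.ones(n - 1)) - np.diag(np.ones(n - 2), 1)
         - np.diag(np.ones(n - 2), -1)) / h**2
    u = np.zeros(n + 1)
    u[1:-1] = np.linalg.solve(A, f(x[1:-1]))
    return x, u

# Manufactured solution and the forcing term derived from it
u_m = lambda x: np.sin(np.pi * x)
f_m = lambda x: np.pi**2 * np.sin(np.pi * x)   # f = -u_m''

errors = []
for n in (16, 32, 64):
    x, uh = fd_solve(f_m, n)
    errors.append(np.max(np.abs(uh - u_m(x))))
rates = [float(np.log2(errors[i] / errors[i + 1])) for i in range(2)]
print(rates)  # observed rates should approach the theoretical order 2
```

A rate that fails to approach 2 under refinement would flag a coding error, which is exactly the point of the exercise.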


Page 8: Verification and Validation in Computational Engineering

Introduction

Validation

The process of determining the accuracy with which a model can predict observed physical events (or the important features of a physical reality).

”Do we solve the right equations?” (Roache 2009)

An art, not a science (yet) . . . It involves many a priori choices:

• Experiments or observations, splitting between calibration and validation datasets.

• Likelihood and prior in Bayesian inference.

• Acceptance metric to assess the validity of the model, etc.

An iterative process . . .

• Goal is to test as many hypotheses of the model as possible.


Page 9: Verification and Validation in Computational Engineering

Introduction

Paths to Knowledge

[Diagram: THE UNIVERSE of PHYSICAL REALITIES is linked through observational errors to OBSERVATIONS and THEORY / MATHEMATICAL MODELS, and through modeling and discretization errors to COMPUTATIONAL MODELS, leading to KNOWLEDGE and DECISION. VALIDATION compares mathematical models with physical realities; VERIFICATION compares computational models with mathematical models.]

Oden and Prudhomme, IJNME (Sept. 2010); Oden, Moser, and Ghattas, SIAM News (Nov. 2010)


Page 10: Verification and Validation in Computational Engineering

Introduction

The Path of Truth . . .

“If error is corrected whenever it is recognized as such, the path to error is the path of truth.”

Hans Reichenbach (1891–1953), The Rise of Scientific Philosophy, 1951.


Page 11: Verification and Validation in Computational Engineering

Introduction

Control of Errors

Errors are all a matter of comparison!

• Code Verification: Using the method of “manufactured solutions”, for example, we can easily compare the computed solution with the manufactured solution.

• Solution Verification: In this case, the solution of the problem is unknown and one can use a posteriori error estimates to assess the accuracy of the approximate solutions.

• Calibration Process: Comparison of observable data with model estimates of the observables.

• Validation Process: The main idea behind validation is to know whether a model can be used for prediction purposes.

⇒ What should we compare in this case?


Page 12: Verification and Validation in Computational Engineering

Validation Process

Model Validation

1. Is the model a good model? How good is the model? What is a good model?

2. Experimental data.

3. Calibration: identification of model parameters so that the model reproduces experimental measurements. Bayesian inference:

   p(θ|D) = L(θ|D) p(θ) / p(D)

   where
   D ∈ R^n = calibration data
   L(θ|D) = likelihood
   p(θ) = prior
   p(θ|D) = posterior

4. Assessment with respect to a quantity of interest on a prediction scenario (which cannot be measured or observed, otherwise it would not be a prediction any longer).

5. Definition of an acceptance metric.
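The Bayesian update in step 3 can be sampled with a basic random-walk Metropolis algorithm. The sketch below calibrates the single parameter of a toy linear model against synthetic data; the model, noise level, and prior bounds are all illustrative assumptions, not details from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical calibration data D: noisy observations of y = theta_true * x
theta_true, sigma = 1.5, 0.1
x_d = np.linspace(0.1, 1.0, 10)
D = theta_true * x_d + sigma * rng.normal(size=x_d.size)

def log_post(theta):
    """Log posterior: Gaussian log likelihood L(theta|D) plus a uniform prior on [0, 5]."""
    if not 0.0 <= theta <= 5.0:
        return -np.inf
    r = D - theta * x_d
    return -0.5 * np.sum(r**2) / sigma**2

# Random-walk Metropolis sampling of p(theta|D)
theta, lp = 1.0, log_post(1.0)
samples = []
for _ in range(20000):
    prop = theta + 0.1 * rng.normal()
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject step
        theta, lp = prop, lp_prop
    samples.append(theta)
post = np.array(samples[5000:])   # discard burn-in
print(post.mean(), post.std())
```

The posterior mean should land near the data-generating value, with a spread set jointly by the noise level and the amount of data; in a real calibration one would use a dedicated library (e.g. QUESO, mentioned later in the talk) rather than this hand-rolled sampler.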


Page 13: Verification and Validation in Computational Engineering

Validation Process

The Prediction Pyramid

[Diagram: the validation pyramid for validating a single model. At the base, calibration scenarios S_C (as simple as possible, numerous) with experimental data D_C on the calibration scenarios; above them, validation scenario(s) S_V (more complex than calibration) with experimental data D_V on the validation scenario(s); at the top, the prediction scenario S_P, where no data are available and the QoI Q_P is predicted. If the model predicts the QoI within the prescribed accuracy, the model is said to be not invalidated.]


Page 14: Verification and Validation in Computational Engineering

Validation Process

Classical Approach for Validation

[Flowchart: starting from the model with unknown parameter(s) θ, solve the inverse (calibration) problem on the calibration scenario using data D_c, which yields the calibrated model with parameter(s) θ_c. Then solve the forward problem on the validation scenario and estimate the observables O_c. Compare against the validation data D_v: if M(O_cal, D_val) ≤ γ_tol, the model is not invalid and is used with increased confidence; otherwise the model is invalid.]


Page 15: Verification and Validation in Computational Engineering

Validation Process

Proposed Validation Process (Babuska, Nobile, Tempone, 2009)

[Flowchart: starting from the same model with unknown parameter(s) θ, solve the inverse problem on the calibration scenario with data D_c to obtain the calibrated model with parameter(s) θ_c, and solve the inverse problem on the validation scenario with data D_v to obtain the re-calibrated model with parameter(s) θ_v. With each parameter set, solve the forward problem on the prediction scenario, including sensitivity analysis and uncertainty quantification, to estimate the QoI values Q_c and Q_v. Compare them: if M(Q_cal, Q_val) ≤ γ_tol, the model is not invalid and is used with increased confidence; otherwise the model is invalid.]


Page 16: Verification and Validation in Computational Engineering

Validation Process

Metric

Probability density functions:

pC(Q) = p(Q(u(p^C_post(θ), sp)))
pV(Q) = p(Q(u(p^V_post(θ), sp)))

Cumulative distribution functions:

cdfC(Q) = ∫_{−∞}^{Q} pC(Q′) dQ′
cdfV(Q) = ∫_{−∞}^{Q} pV(Q′) dQ′
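One practical way to realize such a metric is to build empirical CDFs from posterior-predictive samples of Q and measure their sup-norm distance, a Kolmogorov–Smirnov-type comparison. This is one possible choice of metric, not necessarily the one used in the talk, and the sample distributions below are purely synthetic:

```python
import numpy as np

def ecdf(samples):
    """Empirical CDF built from samples of the QoI."""
    s = np.sort(samples)
    return lambda q: np.searchsorted(s, q, side="right") / s.size

rng = np.random.default_rng(1)
# Stand-in posterior-predictive samples of Q under the calibration (p_C)
# and validation (p_V) posteriors -- synthetic, for illustration only.
Q_cal = rng.normal(1.0, 0.2, 5000)
Q_val = rng.normal(1.1, 0.2, 5000)

cdf_C, cdf_V = ecdf(Q_cal), ecdf(Q_val)
grid = np.linspace(0.0, 2.5, 400)
metric = float(np.max(np.abs(cdf_C(grid) - cdf_V(grid))))  # sup-norm distance
gamma_tol = 0.3   # assumed acceptance tolerance
print(metric, metric <= gamma_tol)
```

Comparing whole CDFs rather than point estimates is what lets the metric account for the uncertainty propagated through both calibrations.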


Page 17: Verification and Validation in Computational Engineering

Planning of Validation Processes

The validation process requires detailed planning:

1. Description of goals: Describe the background and goals of the predictions. Clearly define the quantity (or quantities) of interest.

2. Modeling: Write the mathematical equations of the selected model(s); list all parameters that are necessary to solve the problem, as well as the assumptions and limitations of the model(s).

3. Data collection: Collect as many data as possible from the literature or available sources (data should include, if available, the statistics).

4. Sensitivity analysis: Quantify the sensitivity of the QoI with respect to the parameters of the model. Rank parameters according to their influence.

5. Calibration experiments: Provide a description of the scenario (as precise as possible), the observables and their statistics, and the prior and likelihood of the parameters to be calibrated.

6. Validation experiments: Provide the same as above + clearly state the assumption to be validated.
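Step 4 (sensitivity analysis) can be prototyped with local finite-difference sensitivities, scaled by the nominal parameter values and ranked. The QoI and parameter names below are hypothetical, chosen only to show the mechanics of ranking:

```python
import numpy as np

def qoi(theta):
    """Hypothetical QoI as a function of three model parameters."""
    a, b, c = theta
    return a**2 + 10.0 * b + 0.1 * np.sin(c)

def rank_parameters(q, theta0, names, h=1e-6):
    """Local sensitivities dQ/dtheta_i by central differences, scaled by the
    nominal parameter values, then ranked by decreasing influence."""
    theta0 = np.asarray(theta0, dtype=float)
    sens = {}
    for i, name in enumerate(names):
        e = np.zeros_like(theta0)
        e[i] = h
        dq = (q(theta0 + e) - q(theta0 - e)) / (2.0 * h)
        sens[name] = abs(dq * theta0[i])
    return sorted(sens.items(), key=lambda kv: -kv[1])

ranking = rank_parameters(qoi, [1.0, 1.0, 1.0], ["a", "b", "c"])
print(ranking)  # parameter "b" dominates for this toy QoI
```

Local derivatives are only a first screen; a global method (e.g. variance-based indices) would be needed when parameter interactions matter.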


Page 18: Verification and Validation in Computational Engineering

Planning of Validation Processes

Morgan & Henrion’s “Ten Commandments” (1990)∗

In relation to quantitative risk and policy analysis

1. Do your homework with literature, experts and users.

2. Let the problem drive the analysis.

3. Make the analysis as simple as possible, but no simpler.

4. Identify all significant assumptions.

5. Be explicit about decision criteria and policy strategies.

6. Be explicit about uncertainties.

7. Perform systematic sensitivity and uncertainty analysis.

8. Iteratively refine the problem statement and the analysis.

9. Document clearly and completely.

10. Expose to peer review.

∗ Extracted from D. Vose, “Risk Analysis: A Quantitative Guide” (2008)


Page 19: Verification and Validation in Computational Engineering

Planning of Validation Processes

A systematic approach to the planning and implementation of experiments (Chapter 1, Section 2)

In Wu & Hamada, “Experiments: Planning, Analysis, and Optimization” (2009)

1. State objective.

2. Choose response.

3. Choose factors and levels.

4. Choose experimental plan.

5. Perform the experiment.

6. Analyze the data.

7. Draw conclusions and make recommendations:

. . . the conclusions should refer back to the stated objectives of the experiment. A confirmation experiment is worthwhile, for example, to confirm the recommended settings. Recommendations for further experimentation in a follow-up experiment may also be given. For example, a follow-up experiment is needed if two models explain the experimental data equally well and one must be chosen for optimization.


Page 20: Verification and Validation in Computational Engineering

Planning of Validation Processes

Planning

• Planning is a cumbersome and time-consuming process.

• Planning of validation processes involves many choices that eventually need to be carefully checked.

Choices are made about:
• Physical models
• Quantities of interest and surrogate quantities of interest
• Experiments for calibration and validation purposes
• Data sets to be used in calibration and validation
• Prior pdf and likelihood function
• Probabilistic models . . .

Our preliminary experience with validation has revealed that many “sanity checks” need to be added within the proposed validation process.

Our objective is to develop a suite of tools to systematically verify the correctness of each stage of the validation process.


Page 21: Verification and Validation in Computational Engineering

Examples

Examples: EAST (Shock-Tube) Experiments

[Figure: spectral image from Shot B38, distance [cm] vs. wavelength [nm], showing the shock front, atomic line radiation, and the contact discontinuity.]


Page 22: Verification and Validation in Computational Engineering

Examples

Data for Thermal and Chemical Non-equilibrium Models


Page 23: Verification and Validation in Computational Engineering

Examples

Data reduction model for ICCD Camera


Page 24: Verification and Validation in Computational Engineering

Examples

Data reduction model for ICCD Camera

[Plot: reciprocity factor vs. gate width [µs] (0.01–10), comparing EAST data, Pecos data (Nov, Dec, Mar), and a least-squares parabola fit.]

The (non)-reciprocity factor quantifies the deviation w.r.t. the linear regime.


Page 25: Verification and Validation in Computational Engineering

Examples

Data reduction model for ICCD Camera

[Diagram: ICCD camera internals. Photons strike the photocathode, producing photo-electrons that are amplified in the microchannel plate (MCP) under an applied voltage, converted back to light by the phosphor, and guided out through the fiber optic output window.]

The opening/closing of the camera involves complex electronics phenomena. In high-speed shock-tube experiments, the opening/closing of the camera has to be done on the order of one microsecond. The major issue in this study was to identify whether the camera was being used in its linear or nonlinear regime for proper estimation of the intensity.


Page 26: Verification and Validation in Computational Engineering

Examples

Validation Planning

Proposed physical models and corresponding model parameters.

[Based on RC circuits and “expert” opinion; sketches show the response for long and short gate widths.]

Model | α1 | α2 | β | δ | ν | Photon counts N(∆t)
M1 | ✗ | ✗ | ✓ | ✗ | ✗ | β∆t
M2 | ✓ | ✓ | ✓ | ✗ | ✗ | β∆t − Λ(∆t, α1, α2, β)
M3 | ✓ | ✓ | ✓ | ✓ | ✗ | β(∆t + δ) − Λ(∆t + δ, α1, α2, β)
M4 | ✓ | ✓ | ✓ | ✗ | ✓ | (β + ν)∆t − Λ(∆t, α1, α2, β)
M5 | ✓ | ✓ | ✓ | ✓ | ✓ | (β + ν)(∆t + δ) − Λ(∆t + δ, α1, α2, β)

Symbols ✓ or ✗ indicate that the parameter is or is not part of the model, respectively.


Page 27: Verification and Validation in Computational Engineering

Examples

Validation Planning

QoI
• Reciprocity at ∆t = 0.1 [µs]

Hypothesis to be validated
• Can the model(s) be predictive at low gate widths?

Calibration/Validation data
• Calibration data: in range 0.9–10.0 [µs]
• Validation data: in range 0.5–0.8 [µs]

[Plot: reciprocity factor vs. gate width [µs], comparing EAST data, Pecos data (Nov, Dec, Mar), and a least-squares parabola fit.]

Calibration
• Based on Bayesian inference.
• Uniform priors.
• Uncertainties in data and modeling error are combined.


Page 28: Verification and Validation in Computational Engineering

Examples

Results

CDF of QoI for model M1 (left) and model M5 (right):

[Plots: marginal CDFs of the reciprocity QoI obtained with calibration ranges [0.5, 10.0] and [0.9, 10.0]; the predictions differ by 21% for M1 and by 133% for M5.]


Page 29: Verification and Validation in Computational Engineering

Examples

Analysis of calibrated variance

PDF of the calibrated variance σ² for model M1 (left) and model M5 (right):

[Plots: marginal PDFs of log10(σ²) obtained with calibration ranges [0.5, 10.0] and [0.9, 10.0].]

The validation process presupposes that the models can accurately predict observable data.


Page 30: Verification and Validation in Computational Engineering

Examples

Convergence of the calibration process with respect to the number of data points

CDF of QoI for model M1 (left) and model M5 (right) for different data sets:

[Plots: marginal CDFs of the reciprocity QoI for calibration ranges [0.3, 10.0], [0.4, 10.0], [0.5, 10.0], [0.6, 10.0], [0.7, 10.0], [0.8, 10.0], [0.9, 10.0], and [2.0, 10.0].]


Page 31: Verification and Validation in Computational Engineering

Examples

Models are rejected . . . now what?

• Acquire more data?

• Improve the models?

[Plot: normalized photo-counts vs. gate width, for gate settings of 100 ns, 200 ns, 300 ns, and 800 ns. From Aaron Brandis, NASA, September 2011.]


Page 32: Verification and Validation in Computational Engineering

Examples

Model Selection

Fully-developed incompressible channel flow:

Mean flow equations, with u = U + u′:

∇ · (ν∇U − u′iu′j) = ρ⁻¹ ∇P
∇ · U = 0

Eddy-viscosity assumption:

u′iu′j = −νT (Ui,j + Uj,i)

Channel equations: assuming homogeneous turbulence in x, i.e. U = U(y),

∂y ((ν + νT) ∂yU) = 1,  y ∈ (0, H)
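Once a νT profile is supplied, the channel equation above is an ordinary two-point boundary-value problem. The sketch below solves it with finite differences, under two assumptions of mine that the slide leaves open: no-slip conditions on both walls, and a constant eddy viscosity so the result can be checked against the exact parabola:

```python
import numpy as np

def channel_solve(nu, nu_t, H=1.0, n=100):
    """Finite-difference solve of d/dy((nu + nu_t(y)) dU/dy) = 1 on (0, H),
    with assumed no-slip conditions U(0) = U(H) = 0."""
    y = np.linspace(0.0, H, n + 1)
    h = H / n
    nu_e = nu + nu_t(0.5 * (y[:-1] + y[1:]))   # effective viscosity at cell faces
    A = np.zeros((n - 1, n - 1))
    for i in range(n - 1):
        A[i, i] = -(nu_e[i] + nu_e[i + 1]) / h**2
        if i > 0:
            A[i, i - 1] = nu_e[i] / h**2
        if i < n - 2:
            A[i, i + 1] = nu_e[i + 1] / h**2
    U = np.zeros(n + 1)
    U[1:-1] = np.linalg.solve(A, np.ones(n - 1))
    return y, U

# Constant eddy viscosity: exact solution is U = (y^2 - H*y) / (2*(nu + nu_t0))
nu, nu_t0 = 1e-3, 9e-3
y, U = channel_solve(nu, lambda z: np.full_like(z, nu_t0))
U_exact = (y**2 - y) / (2.0 * (nu + nu_t0))
print(float(np.max(np.abs(U - U_exact))))
```

A closure such as Spalart-Allmaras (next slide) would replace the constant `nu_t` with a profile coupled to the mean flow, turning this linear solve into a nonlinear iteration.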


Page 33: Verification and Validation in Computational Engineering

Examples

Spalart-Allmaras (SA) model

The eddy viscosity is given by:

νT = ν̃ fv1,  fv1 = χ³/(χ³ + cv1³),  χ = ν̃/ν

where ν̃ is governed by the transport equation

Dν̃/Dt = Pν̃(κ, cb1) − εν̃(κ, cb1, σSA, cw2)
       + (1/σSA) [ ∂/∂xj ((ν + ν̃) ∂ν̃/∂xj) + cb2 (∂ν̃/∂xj)² ]

with

• Pν̃ = production term

• εν̃ = wall destruction term

Parameter values:
κ = 0.41, cb1 = 0.1355, σSA = 2/3, cb2 = 0.622, cv1 = 7.1, cw2 = 0.3

¹Allmaras, Johnson, and Spalart, 2012; Oliver and Darmofal, 2009

Page 34: Verification and Validation in Computational Engineering

Examples

Calibration data:
• Data is obtained from direct numerical simulation (DNS)¹
• Mean velocity measurements at Reτ = 944 and Reτ = 2003

Uncertainty models:
• Three multiplicative error models

  ⟨u⟩⁺(z; ξ) = (1 + ε(z; ξ)) U⁺(z; ξ)

  – independent homogeneous covariance
  – correlated homogeneous covariance
  – correlated inhomogeneous covariance

• Reynolds stress model

  ⟨u′iu′j⟩⁺(z; ξ) = T⁺(z; ξ) − ε(z; ξ)

¹del Álamo et al., 2004; Hoyas and Jiménez, 2006

Page 35: Verification and Validation in Computational Engineering

Examples

Bayesian inference

Bayes rule:

p(θ|D) = L(θ|D) p(θ) / p(D)

where
D ∈ R^n = calibration data
L(θ|D) = likelihood
p(θ) = prior
p(θ|D) = posterior

Model selection:
• Set of models M = {M1, M2, . . . , Mn}
• Likelihood = p(D|Mi, M) = ∫θ p(D|θ, Mi, M) p(θ|Mi, M) dθ
• Posterior plausibility = p(Mi|D, M):

  p(Mi|D, M) ∝ p(D|Mi, M) p(Mi|M)

• Evidence = p(D)

¹QUESO (developed at ICES, UT Austin)
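For a low-dimensional parameter, the evidence integral can be sketched by direct quadrature. The two competing models and the synthetic data below are toy assumptions of mine, meant only to show how evidence values translate into posterior plausibility:

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0.0, 1.0, 20)
sigma = 0.1
D = 2.0 * x + sigma * rng.normal(size=x.size)  # synthetic data from a line

def log_like(pred):
    """Gaussian log likelihood of the data given model predictions."""
    return -0.5 * np.sum((D - pred)**2) / sigma**2

def evidence(model, lo, hi, n=2000):
    """p(D | M_i) = integral of p(D | theta, M_i) p(theta | M_i) dtheta,
    approximated by trapezoid quadrature with a uniform prior on [lo, hi]."""
    thetas = np.linspace(lo, hi, n)
    like = np.exp(np.array([log_like(model(t)) for t in thetas]))
    prior = 1.0 / (hi - lo)
    dtheta = thetas[1] - thetas[0]
    return float(prior * np.sum(0.5 * (like[1:] + like[:-1])) * dtheta)

M1 = lambda t: t * x                  # linear model: parameter is the slope
M2 = lambda t: t * np.ones_like(x)    # constant model: parameter is the level

E1, E2 = evidence(M1, 0.0, 5.0), evidence(M2, 0.0, 5.0)
P1 = E1 / (E1 + E2)    # posterior plausibility with equal model priors
print(E1, E2, P1)
```

In realistic settings the integral is high-dimensional and is estimated by samplers (as in QUESO) rather than quadrature, and evidences are compared on a log scale, as in the table on the next slide.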

Page 36: Verification and Validation in Computational Engineering

Examples

Numerical Results: Model selection

Model evidence (log(E)):

Error model | Surrogate | Full model
Independent homogeneous | −1.457 | 8.862
Correlated homogeneous | 1.963 | 8.045
Correlated inhomogeneous | 164.9 | 164.0
Reynolds stress | 164.8 | 169.0

Runtimes (in seconds):

Error model | Surrogate | Full model
Independent homogeneous | 130 | 1720
Correlated homogeneous | 162 | 1906
Correlated inhomogeneous | 151 | 1735
Reynolds stress | 147 | 1743
Cumulative | 590 | 7104


Page 37: Verification and Validation in Computational Engineering

Conclusions

Concluding remarks

1. Model validation is a complex, time-consuming, and CPU-intensive process.
   – A model is never validated; it is at best not invalidated.
   – Validation should be performed on a prediction scenario with respect to given QoIs.
   – UQ should be included for comparison of data and computer outputs.

2. Validation planning requires insight and creativity.
   – Documentation
   – Data selection and analysis, etc.

3. Verification tools are needed to test the correctness of processes.
   – Manufactured data to test the calibration process
   – Sensitivity analysis to partially test the quality of the data
   – Evidence/plausibility to select the best model among a class of models
   – Tools to test the selection of calibration and validation data sets
   – Tools to test various probabilistic models: prior pdf, likelihood
   – Experimental design to select the most informative experimental data
