

Page 1: Title

Streamlining Uncertainty: Conceptual Model and Scenario Uncertainty

FRAMES-2.0 Workshop
U.S. Nuclear Regulatory Commission
Bethesda, Maryland
November 15-16, 2007

Pacific Northwest National Laboratory
Richland, Washington

Page 2: Model Applications

Regulatory and design applications of hydrologic models of flow and contaminant transport often involve using the models to make predictions of future system behavior:
- Performance assessment of new facilities (safety evaluation, environmental impact assessment)
- Monitoring network design for contaminant detection or performance monitoring
- License termination
- Design of a subsurface contaminant remediation system

Page 3: New Reactor Potential Model Applications

- Assessing effects of accidental releases on ground and surface waters: groundwater flow pathways, transport characteristics
- Assessing flood design bases: stream flooding; local flooding and site drainage
- Impacts of water use: watershed analysis of impacts on other users of the water source upstream and downstream, particularly during drought conditions

Page 4: Framework for the Application of Hydrologic Models to Regulatory Decision Making

[Figure: concentration vs. time, showing a history-matching period (~10 years, labeled "Model Development & Evaluation") followed by a predictive period (~10-1000 years, labeled "Model Application for Comparison with Regulatory or Design Criteria").]

- History matching: reproduce observed behavior
  - Demonstrate understanding of site behavior
  - Provide confidence in use of models to support decisions
- Prediction: forecast future behavior
  - Apply model results to decisions
  - For risk-informed decision making, provide estimates of risk

Page 5: Model Predictive Uncertainty Quantifies Element of Risk

[Figure: concentration vs. time, showing the history-matching period (~10 years) followed by the predictive period (~10-1000 years).]

- In general, uncertainty is assessed in the history-matching period and propagated into the predictive period
  - Reduce these uncertainties by collecting additional data
- Some uncertainties only apply in the predictive period
  - Irreducible characteristics of the system being modeled

Page 6: Prediction Uncertainty

- Model conceptualization uncertainty
  - A hypothesis about the behavior of the system being modeled and the relationships between the components of the system
  - Each site is unique and heterogeneous/variable; behavior typically involves complex processes; site characterization data are limited
  - Assessed in the history-matching period, applied in the predictive period
- Parameter uncertainty
  - Model-specific quantities required to obtain a solution
  - Measurement/sampling errors; disparity among the sampling, simulation, and actual scales of the system
  - Assessed in the history-matching period, applied in the predictive period
- Scenario uncertainty
  - A future state or condition that affects the hydrology
  - Historical record not representative of future conditions: process variability, limited historical record, land/water use changes, climate change
  - Applies to the predictive period only

Page 7: Model Uncertainty

Common to rely on a single conceptual model of a system. This approach is inadequate when there are:
- different interpretations of data
- insufficient data to resolve differences between conceptualizations

Page 8: Failure to Consider Model Uncertainty

Has two potential pitfalls:
- rejection by omission of valid alternatives (underestimates uncertainty)
- reliance on an invalid model (produces biased results)

Page 9: Examples of Hydrogeologic Model Application Errors

Hydrogeologic Model Application (Ref.) | Comments | Error
Phoenix (Konikow 1986) | Assumed past groundwater pumping would continue in future | Scenario/Conceptual
Cross Bar Ranch Wellfield (Stewart and Langevin 1999) | Assumed a 75-day, no-recharge scenario would represent long-term maximum drawdown | Scenario/Conceptual
Arkansas Valley (Konikow and Person 1985) | Needed a longer period of calibration | Scenario/Parameter
Coachella Valley (Konikow and Swain 1990) | Recharge events unanticipated | Scenario
INEL (Lewis and Goldstein 1982) | Dispersivities poorly estimated | Parameter
Milan Army Plant (Andersen and Lu 2003) | Extrapolated localized pump test results to larger area | Parameter
Blue River (Alley and Emery 1986) | Storativity poorly estimated | Parameter/Conceptual
Houston (Jorgensen 1981) | Including subsidence in model improved predictions | Conceptual
HYDROCOIN (Konikow et al. 1997) | Boundary condition modeled poorly | Conceptual
Ontario Uranium Tailings (Flavelle et al. 1991) | Inadequate chemical reaction model | Conceptual
Los Alamos (Bredehoeft 2005) | Flow through unsaturated zone not understood | Conceptual
Los Angeles (Bredehoeft 2005) | Flow vectors 90° off in model | Conceptual
Summitville (Bredehoeft 2005) | Seeps on mountain unaccounted for | Conceptual
Santa Barbara (Bredehoeft 2005) | Fault zone flow unaccounted for | Conceptual
WIPP (Bredehoeft 2005) | Assumed salt had no mobile interstitial brine | Conceptual
Fractured Rock Waste Disposal (Bredehoeft 2005) | Preferential flow in unsaturated zone unaccounted for | Conceptual

Page 10: How to Proceed?

Desirable characteristics of a methodology for uncertainty assessment:
- Comprehensive: as many types of uncertainty as possible should be included
- Quantitative: it should be possible to compare results with regulatory criteria or design requirements
- Systematic: able to be applied to a wide range of sites and objectives and to enable the common application of computer codes and methods

Page 11: Deterministic Approach

[Figure: probability density p(peak dose) vs. peak dose (mrem/yr, 0-80), with the regulatory threshold marked.]

Assumptions:
- Model parameters are correct
- Model is correct
- Scenario is known

Page 12: Parameter Sensitivity Approach

[Figure: probability density p(peak dose) vs. peak dose (mrem/yr, 0-80), with several candidate results marked "?" and the regulatory criterion marked.]

Assumptions:
- Model parameters are unknown
- Model is correct
- Scenario is known

Page 13: Parameter Sensitivity Approach

Results:
- Probability of peak dose represents the degree of plausibility of the model result
- "?" indicates that the actual values of the probabilities are unknown; statements about the relative values may be possible
- Bounding (conservative) analysis: the desired predicted value represents the worst plausible behavior of the system

Limitations:
- Can't quantitatively estimate risk since probabilities are unknown [risk = p(peak dose > 25 mrem/yr)]
- Significance of the bounding case must be assessed to avoid over-conservatism
- Significant sources of uncertainty not included

Page 14: Parameter Uncertainty Approach

Assumptions:
- Model parameters are uncertain
- Model is correct
- Scenario is known

[Figure: probability density p(peak dose) vs. peak dose (mrem/yr, 0-80), showing the parameter-uncertainty distribution relative to the regulatory criterion.]

Page 15: Parameter Uncertainty Approach

Method:
- Assign a joint probability distribution to the model parameters and propagate it through the model (e.g., using Monte Carlo simulation)

Results:
- Peak dose probability density represents the degree of plausibility of the model result
- Quantitative estimates of probabilities can be computed
- Quantitative estimates of risk can be computed

Limitations:
- Joint probability distribution of parameters must be determined
- May be computationally expensive
- Significant sources of uncertainty not included
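A minimal sketch of the Monte Carlo propagation step described above, assuming a hypothetical stand-in dose_model() with two illustrative parameters and assumed lognormal distributions; in a real analysis each draw would be run through the site's flow-and-transport model.

```python
# Minimal sketch of the parameter uncertainty approach (Monte Carlo propagation).
# dose_model(), the parameters, and their distributions are hypothetical placeholders
# for a real hydrologic flow-and-transport simulation.
import numpy as np

rng = np.random.default_rng(seed=1)

def dose_model(hydraulic_conductivity, recharge):
    """Stand-in for a flow/transport simulation returning peak dose (mrem/yr)."""
    return 5.0 + 200.0 * recharge / hydraulic_conductivity

# Joint parameter distribution (assumed independent lognormals, for illustration only)
n = 10_000
conductivity = rng.lognormal(mean=np.log(10.0), sigma=0.5, size=n)  # m/d
recharge = rng.lognormal(mean=np.log(0.5), sigma=0.3, size=n)       # m/yr

# Propagate parameter uncertainty through the model and estimate risk
peak_dose = dose_model(conductivity, recharge)
risk = np.mean(peak_dose > 25.0)  # risk = p(peak dose > 25 mrem/yr)
print(f"mean peak dose = {peak_dose.mean():.1f} mrem/yr, p(dose > 25) = {risk:.3f}")
```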

Page 16: Conceptual Model Sensitivity Approach

[Figure: probability density p(peak dose) vs. peak dose (mrem/yr, 0-80) for Model 1, Model 2, and Model 3, with the regulatory criterion marked.]

Page 17: Conceptual Model Sensitivity Approach

Method:
- Postulate alternative conceptual models for a site that are each consistent with site characterization data and observed system behavior

Results:
- Each model is used to simulate the desired predicted quantity
- Parameters of each model (which may be different) are represented using a joint probability distribution

Limitations:
- Without a quantitative measure of the degree of plausibility of the model alternatives, it is impossible to determine the risk of a decision based on the model predictions
- A conservative approach to model uncertainty relies on an implied belief that the most conservative model has a non-negligible degree of plausibility
- Requires formulation & simulation of multiple models

Page 18: Quantitative Model Uncertainty

- Assign a discrete probability distribution to the conceptual model alternatives
- Analogous to the interpretation of parameter probability, the discrete model probability distribution represents the degree of plausibility of the model alternatives
- What quantity should be compared with regulatory/design criteria?

Page 19: Probability-Based Model Selection

- Use the model with the highest probability for predictions
- Potentially biased result if significant probability lies with alternative models
- If the variance due to model uncertainty is desired, the predicted value must be computed using each model

[Figure: probability density p(peak dose) vs. peak dose (mrem/yr, 0-80) for Model 1 (Prob = 0.5), Model 2 (Prob = 0.25), and Model 3 (Prob = 0.25), with the regulatory threshold marked.]

Page 20: Conservative Model Selection

[Figure: probability density p(peak dose) vs. peak dose (mrem/yr, 0-80) for Model 1 (Prob = 0.5), Model 2 (Prob = 0.25), and Model 3 (Prob = 0.25), with the regulatory threshold marked.]

- Use the model with the most significant consequence
- How little probability must lie with the highest-consequence model before it is judged implausible?
- The consequence must be computed with each model to determine the conservative model

Page 21: Probability-Weighted Model Averaging

[Figure: probability density p(peak dose) vs. peak dose (mrem/yr, 0-80) for Model 1 (Prob = 0.5), Model 2 (Prob = 0.25), Model 3 (Prob = 0.25), and the model-averaged result, with the regulatory threshold marked.]

Page 22: Probability-Weighted Model Averaging

Method:
- Model predictions are combined using a weighted average, with the weight for each model's prediction being that model's probability

Results:
- The model-averaged probability density function represents the degree of plausibility of the predicted value, taking into consideration the joint effect of parameter and model uncertainties
- Reduces bias
- Less likely to underestimate predictive uncertainty
- Consistent treatment of parameter and model uncertainties
- Quantitative estimates of risk can be computed from the model-averaged result
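A minimal sketch of the weighted-average step, assuming each conceptual model has already produced its own Monte Carlo ensemble of peak doses; the ensembles below are synthetic placeholders, while the model probabilities (0.5/0.25/0.25) are the illustrative values used on these slides.

```python
# Minimal sketch of probability-weighted model averaging over per-model Monte Carlo
# ensembles of peak dose (synthetic placeholder ensembles).
import numpy as np

rng = np.random.default_rng(seed=2)
ensembles = {
    "Model 1": rng.normal(10.0, 6.0, 10_000),
    "Model 2": rng.normal(20.0, 7.0, 10_000),
    "Model 3": rng.normal(45.0, 8.0, 10_000),
}
model_prob = {"Model 1": 0.50, "Model 2": 0.25, "Model 3": 0.25}

# Means and exceedance probabilities average linearly with the model weights
mean_avg = sum(model_prob[m] * ensembles[m].mean() for m in ensembles)
risk_avg = sum(model_prob[m] * np.mean(ensembles[m] > 25.0) for m in ensembles)

# The model-averaged pdf is a mixture; percentiles must be taken from the mixture itself
models = list(ensembles)
stacked = np.stack([ensembles[m] for m in models])                   # shape (3, 10000)
pick = rng.choice(len(models), size=stacked.shape[1], p=[model_prob[m] for m in models])
mixture = stacked[pick, np.arange(stacked.shape[1])]

print(f"model-averaged mean = {mean_avg:.1f} mrem/yr, p(dose > 25) = {risk_avg:.3f}, "
      f"90th percentile = {np.percentile(mixture, 90):.1f} mrem/yr")
```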

Page 23: Probability-Weighted Model Averaging

Limitations:
- Model probability is a relative measure with respect to the other model alternatives considered
- Requires specifying the model probability distribution
- Requires formulating & simulating multiple models
- Doesn't consider scenario uncertainty

Page 24: Model-Averaging Informative Results

                       | Mean Dose (mrem/yr) | Prob(Dose > 25 mrem/yr), % | 90th percentile (mrem/yr)
Model 1 (prob = 0.50)  |                10.0 |                        8.2 |                      23.0
Model 2 (prob = 0.25)  |                20.0 |                       23.9 |                      32.7
Model 3 (prob = 0.25)  |                45.0 |                       97.7 |                      57.8
Model Average          |                21.2 |                       34.5 |                      48.5

- Results suggest collection of additional data to better discriminate between the models (i.e., to modify the model probabilities until one model dominates)
- The exceedance probability and 90th percentile suggest that a conservative regulatory action may be preferred based on a fully informed consideration of model and parameter uncertainty (i.e., risk), rather than on adoption of the most conservative model
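As a check on the table, the model-averaged row follows directly from the probability weights applied to the per-model values (reading the exceedance probabilities as percentages):

$\bar{D} = 0.50(10.0) + 0.25(20.0) + 0.25(45.0) = 21.25 \approx 21.2~\text{mrem/yr}$

$P(D > 25) = 0.50(8.2\%) + 0.25(23.9\%) + 0.25(97.7\%) \approx 34.5\%$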

Page 25: Scenario Uncertainty: Unknown Future State or Condition of the System

[Figure: concentration vs. time, showing the history-matching period followed by a predictive period in which the projection differs under Scenario 1, Scenario 2, and Scenario 3.]

- Scenario uncertainty can't be reduced through the application of data (unlike parameter & model uncertainty)

Page 26: Scenario Uncertainty Sensitivity Approach

[Figure: two panels of probability density p(peak dose) vs. peak dose (mrem/yr, 0-80), one for Scenario 1 and one for Scenario 2, each showing Model 1 (Prob = 0.5), Model 2 (Prob = 0.25), Model 3 (Prob = 0.25), and the model-averaged result, with the regulatory threshold marked.]

Page 27: Scenario Averaging Approach

[Figure: probability density p(peak dose) vs. peak dose (mrem/yr, 0-80) for Scenario 1 (Prob = 0.7), Scenario 2 (Prob = 0.3), and the scenario-averaged result, with the regulatory threshold marked.]

Page 28: Probability-Weighted Scenario Averaging

Method:
- Model-averaged predictions for each scenario are combined using a weighted average, with the weight for each scenario's prediction being that scenario's probability

Results:
- The scenario- and model-averaged probability density function represents the degree of plausibility of the predicted value, taking into consideration the joint effect of parameter, model, and scenario uncertainties
- Quantitative estimates of risk can be computed from the scenario- and model-averaged result

Limitations:
- Requires specifying scenario probabilities
- Requires simulations of each model under each scenario
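A minimal sketch of the nested weighting, assuming a Monte Carlo ensemble of peak doses exists for every (scenario, model) pair; the ensembles and the scenario shift are synthetic placeholders, while the scenario (0.7/0.3) and model (0.5/0.25/0.25) probabilities are the illustrative values from these slides.

```python
# Minimal sketch of scenario- and model-averaging over Monte Carlo ensembles produced
# for every (scenario, model) pair (synthetic placeholder ensembles).
import numpy as np

rng = np.random.default_rng(seed=3)
scenario_prob = {"Scenario 1": 0.7, "Scenario 2": 0.3}
model_prob = {"Model 1": 0.50, "Model 2": 0.25, "Model 3": 0.25}

ensembles = {}
for scenario, shift in (("Scenario 1", 0.0), ("Scenario 2", 6.0)):   # hypothetical shift
    for model, center in (("Model 1", 10.0), ("Model 2", 20.0), ("Model 3", 45.0)):
        ensembles[(scenario, model)] = rng.normal(center + shift, 6.0, 5_000)

# The joint weight of each (scenario, model) pair is the product of its two probabilities
weights = {key: scenario_prob[key[0]] * model_prob[key[1]] for key in ensembles}

mean_avg = sum(w * ensembles[key].mean() for key, w in weights.items())
risk_avg = sum(w * np.mean(ensembles[key] > 25.0) for key, w in weights.items())
print(f"scenario- and model-averaged mean = {mean_avg:.1f} mrem/yr, "
      f"p(dose > 25) = {risk_avg:.3f}")
```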

Page 29: Scenario-Averaging Informative Results

                                         | Mean Dose (mrem/yr) | Prob(Dose > 25 mrem/yr), % | 90th percentile (mrem/yr)
Scenario 1 (prob = 0.7) (model-average)  |                21.2 |                       34.5 |                      48.5
Scenario 2 (prob = 0.3) (model-average)  |                27.3 |                       44.8 |                      58.5
Scenario Average                         |                23.0 |                       37.6 |                      52.1

- The per-scenario mean doses straddle the regulatory threshold, suggesting that a conservative regulatory action may be preferred based on a fully informed consideration of model, parameter, and scenario uncertainty (i.e., risk), rather than on adoption of the most conservative modeling choices
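As a consistency check on the table, the scenario-averaged mean and exceedance probability follow from the 0.7/0.3 scenario weights applied to the model-averaged rows:

$0.7(21.2) + 0.3(27.3) = 23.03 \approx 23.0~\text{mrem/yr}$

$0.7(34.5\%) + 0.3(44.8\%) \approx 37.6\%$

The scenario-averaged 90th percentile (52.1) is not the weighted average of the per-scenario percentiles (which would give about 51.5); it is evaluated on the combined scenario- and model-averaged distribution.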

Page 30: NRC Staff Application of Probability-Weighted Model Averaging

[Figure: the four alternative conceptual models (MODEL 2, MODEL 3, MODEL 4, MODEL 5).]

Page 31: Alternative Model Development

Models developed using the Groundwater Modeling System (GMS):
- Model 2: average values for hydraulic conductivity, recharge, and evapotranspiration
- Model 3: average values for hydraulic conductivity and evapotranspiration, zonal values for recharge
- Model 4: average value for hydraulic conductivity, zonal values for recharge and evapotranspiration
- Model 5: same as Model 4 with a general head boundary, recharge, and evapotranspiration

Page 32: Model & Scenario Averaging Application: Simulation Results under Two Scenarios (Well 399-1-1)

[Figure: two panels of uranium concentration (µg/L) at Well 399-1-1 vs. date (1/1/2005 to 1/1/2025), one for Model 4 under the alternative scenario and one for Model 4 under the baseline scenario, each showing 200 Monte Carlo realizations and their average.]

Page 33: Scenario Average (Baseline 70%)

[Figure: empirical pdf and empirical cdf of the predicted uranium concentration (µg/L) at Well 399-1-1 on 1/1/2025, for the baseline-scenario model average, the alternative-scenario model average, and the scenario average; mean +/- 1 standard deviation shown.]

Page 34: Project Objectives

- Improve access to the uncertainty assessment methodology by integrating the methods with FRAMES
- Provide guidance on the use of model abstraction techniques to generate plausible and realistic alternative conceptual models for a site
- Parameter estimation
- Quantitative model comparison
- Simulation using multiple models and scenarios
- Demonstrate using a realistic application relevant to NRC/NRO analyses

Page 35: Project Schedule

- Summer 2008: implementation of methods completed; NRC workshop
- Summer 2009: completion of application; NRC workshop