A Bayesian framework for quantifying uncertainty in stochastic simulation

Wei Xie, Barry L. Nelson, Russell R. Barton

Research output: Contribution to journal › Article

39 Citations (Scopus)

Abstract

When we use simulation to estimate the performance of a stochastic system, the simulation often contains input models that were estimated from real-world data; therefore, there is both simulation and input uncertainty in the performance estimates. In this paper, we provide a method to measure the overall uncertainty while simultaneously reducing the influence of simulation estimation error due to output variability. To reach this goal, a Bayesian framework is introduced. We use a Bayesian posterior for the input-model parameters, conditional on the real-world data, to quantify the input-parameter uncertainty; we propagate this uncertainty to the output mean using a Gaussian process posterior distribution for the simulation response as a function of the input-model parameters, conditional on a set of simulation experiments. We summarize overall uncertainty via a credible interval for the mean. Our framework is fully Bayesian, makes more effective use of the simulation budget than other Bayesian approaches in the stochastic simulation literature, and is supported with both theoretical analysis and an empirical study. We also make clear how to interpret our credible interval and why it is distinctly different from the confidence intervals for input uncertainty obtained in other papers.
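
To make the procedure in the abstract concrete, the following is a minimal, illustrative sketch in Python rather than the authors' exact method. It assumes a toy exponential input model with a conjugate Gamma posterior for the rate, a hypothetical run_simulation stand-in for a real stochastic simulation, and scikit-learn's GaussianProcessRegressor as the metamodel; the design points, kernel settings, and propagation step are simplifying choices made only for illustration.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Step 1: Bayesian posterior for the input-model parameter.
# Hypothetical real-world data: 50 i.i.d. Exp(rate = 1) observations.
data = rng.exponential(scale=1.0, size=50)
# A Gamma(a0, b0) prior on the exponential rate is conjugate, so the
# posterior is Gamma(a0 + n, b0 + sum(data)).
a0, b0 = 1.0, 1.0
theta_draws = rng.gamma(shape=a0 + data.size,
                        scale=1.0 / (b0 + data.sum()), size=1000)

# Step 2: simulation experiments at a small set of design points.
def run_simulation(rate, n_reps=20):
    """Stand-in stochastic simulation: noisy estimate of the mean of Exp(rate)."""
    return rng.exponential(scale=1.0 / rate, size=n_reps).mean()

design = np.linspace(theta_draws.min(), theta_draws.max(), 10)
responses = np.array([run_simulation(r) for r in design])

# Step 3: Gaussian process posterior for the simulation response as a
# function of the input-model parameter, conditional on the experiments.
kernel = RBF(length_scale=0.5) + WhiteKernel(noise_level=0.01)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(design.reshape(-1, 1), responses)

# Step 4: propagate the parameter posterior through the GP and summarize
# overall uncertainty with a percentile credible interval for the mean.
mean, std = gp.predict(theta_draws.reshape(-1, 1), return_std=True)
# Independent GP-posterior draw at each parameter value (a simplification
# that ignores correlation across parameter values in the response surface).
samples = mean + std * rng.standard_normal(mean.shape)
lo, hi = np.percentile(samples, [2.5, 97.5])
print(f"95% credible interval for the mean response: ({lo:.3f}, {hi:.3f})")

For brevity the sketch draws from the GP posterior independently at each parameter value; a fuller treatment would sample the response surface jointly across the posterior draws.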

Original language: English (US)
Pages (from-to): 1439-1452
Number of pages: 14
Journal: Operations Research
Volume: 62
Issue number: 6
DOIs: 10.1287/opre.2014.1316
State: Published - Nov 1 2014

Fingerprint

  • Stochastic systems
  • Error analysis
  • Stochastic simulation
  • Uncertainty
  • Simulation
  • Experiments
  • Estimation error
  • Theoretical analysis
  • Confidence interval
  • Parameter uncertainty
  • Posterior distribution
  • Simulation experiment
  • Empirical study
  • Bayesian approach
  • Gaussian process
  • Simulation estimation

All Science Journal Classification (ASJC) codes

  • Computer Science Applications
  • Management Science and Operations Research

Cite this

@article{324a6f14118746769d9cabffd4923b84,
title = "A bayesian framework for quantifying uncertainty in stochastic simulation",
abstract = "When we use simulation to estimate the performance of a stochastic system, the simulation often contains input models that were estimated from real-world data; therefore, there is both simulation and input uncertainty in the performance estimates. In this paper, we provide a method to measure the overall uncertainty while simultaneously reducing the influence of simulation estimation error due to output variability. To reach this goal, a Bayesian framework is introduced. We use a Bayesian posterior for the input-model parameters, conditional on the real-world data, to quantify the input-parameter uncertainty; we propagate this uncertainty to the output mean using a Gaussian process posterior distribution for the simulation response as a function of the input-model parameters, conditional on a set of simulation experiments. We summarize overall uncertainty via a credible interval for the mean. Our framework is fully Bayesian, makes more effective use of the simulation budget than other Bayesian approaches in the stochastic simulation literature, and is supported with both theoretical analysis and an empirical study. We also make clear how to interpret our credible interval and why it is distinctly different from the confidence intervals for input uncertainty obtained in other papers.",
author = "Wei Xie and Nelson, {Barry L.} and Barton, {Russell R.}",
year = "2014",
month = "11",
day = "1",
doi = "10.1287/opre.2014.1316",
language = "English (US)",
volume = "62",
pages = "1439--1452",
journal = "Operations Research",
issn = "0030-364X",
publisher = "INFORMS Inst.for Operations Res.and the Management Sciences",
number = "6",

}

A Bayesian framework for quantifying uncertainty in stochastic simulation. / Xie, Wei; Nelson, Barry L.; Barton, Russell R.

In: Operations Research, Vol. 62, No. 6, 01.11.2014, p. 1439-1452.

Research output: Contribution to journal › Article
