A Bayesian Approach to Sequential Optimization based on Computer Experiments

Sam Davanloo Tajbakhsh, Enrique Del Castillo, James Landis Rosenberger

Research output: Contribution to journal › Article

8 Citations (Scopus)

Abstract

Computer experiments are frequently used to study and improve a process. Optimizing a process directly through its computer model is costly, so an approximation of the computer model, or metamodel, is used instead. Efficient global optimization (EGO) is a sequential optimization method for computer experiments based on a Gaussian process model approximation to the computer model response. A long-standing problem in EGO is that it does not consider the uncertainty in the parameter estimates of the Gaussian process. Treating these estimates as if they were the true parameters leads to an improper assessment of the precision of the approximation, a precision that is crucial to assess not only in optimization but in metamodeling in general. One way to account for these uncertainties, studied by previous authors, is bootstrapping. Alternatively, other authors have noted that a Bayesian approach may be the best way to incorporate parameter uncertainty into the optimization, but no fully Bayesian approach for EGO has been implemented in practice. In this paper, we present a fully Bayesian implementation of the EGO method. The proposed Bayesian EGO algorithm is validated through simulation of noisy nonlinear functions and compared with the standard EGO method and the bootstrapped EGO. We also apply the Bayesian EGO algorithm to the optimization of a stochastic computer model. We show how a Bayesian approach to EGO allows one to optimize any function of the posterior predictive density.
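
The abstract describes the idea only at a high level. The minimal sketch below (not the authors' implementation) illustrates how a Bayesian EGO step can average the expected-improvement criterion over posterior draws of the Gaussian process parameters instead of plugging in point estimates. The squared-exponential kernel, the hypothetical posterior draws, and helper names such as bayesian_ei are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of the core idea behind a Bayesian EGO step:
# average expected improvement over posterior draws of the GP parameters.
import numpy as np
from scipy.stats import norm

def sq_exp_kernel(a, b, length_scale, sigma2):
    # Squared-exponential covariance between 1-D input arrays a and b.
    d = a[:, None] - b[None, :]
    return sigma2 * np.exp(-0.5 * (d / length_scale) ** 2)

def gp_predict(x_new, x_obs, y_obs, length_scale, sigma2, noise):
    # Posterior mean and standard deviation of a zero-mean GP at x_new.
    K = sq_exp_kernel(x_obs, x_obs, length_scale, sigma2) + noise * np.eye(len(x_obs))
    k = sq_exp_kernel(x_obs, x_new, length_scale, sigma2)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_obs))
    v = np.linalg.solve(L, k)
    mean = k.T @ alpha
    var = sigma2 - np.sum(v ** 2, axis=0)
    return mean, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mean, sd, f_min):
    # Closed-form expected improvement for minimization: E[max(f_min - Y, 0)].
    z = (f_min - mean) / sd
    return (f_min - mean) * norm.cdf(z) + sd * norm.pdf(z)

def bayesian_ei(x_cand, x_obs, y_obs, posterior_draws, noise=1e-6):
    # Average EI over posterior draws of (length_scale, sigma2); in a full
    # implementation the draws would come from an MCMC sampler.
    f_min = y_obs.min()
    ei = np.zeros(len(x_cand))
    for length_scale, sigma2 in posterior_draws:
        mean, sd = gp_predict(x_cand, x_obs, y_obs, length_scale, sigma2, noise)
        ei += expected_improvement(mean, sd, f_min)
    return ei / len(posterior_draws)

# Toy usage: pick the next design point on a 1-D grid.
x_obs = np.array([0.1, 0.4, 0.7, 0.95])
y_obs = np.sin(6 * x_obs)                        # stand-in for an expensive simulator
draws = [(0.15, 1.0), (0.25, 0.8), (0.20, 1.2)]  # hypothetical posterior samples
x_cand = np.linspace(0.0, 1.0, 201)
x_next = x_cand[np.argmax(bayesian_ei(x_cand, x_obs, y_obs, draws))]
```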

Original language: English (US)
Pages (from-to): 1001-1012
Number of pages: 12
Journal: Quality and Reliability Engineering International
Volume: 31
Issue number: 6
DOI: 10.1002/qre.1658
State: Published - Jan 1 2015

Fingerprint

  • Global optimization
  • Experiments
  • Computer experiments
  • Bayesian approach

All Science Journal Classification (ASJC) codes

  • Safety, Risk, Reliability and Quality
  • Management Science and Operations Research

Cite this

Tajbakhsh, Sam Davanloo; Del Castillo, Enrique; Rosenberger, James Landis. / A Bayesian Approach to Sequential Optimization based on Computer Experiments. In: Quality and Reliability Engineering International. 2015; Vol. 31, No. 6. pp. 1001-1012.
@article{1096ff2c9a1d4f7699de051673dcbac7,
title = "A Bayesian Approach to Sequential Optimization based on Computer Experiments",
abstract = "Computer experiments are used frequently for the study and improvement of a process under study. Optimizing such process based on a computer model is costly, so an approximation of the computer model, or metamodel, is used. Efficient global optimization (EGO) is a sequential optimization method for computer experiments based on a Gaussian process model approximation to the computer model response. A long-standing problem in EGO is that it does not consider the uncertainty in the parameter estimates of the Gaussian process. Treating these estimates as if they are the true parameters leads to an improper assessment of the precision of the approximation, a precision that is crucial to assess not only in optimization but in metamodeling in general. One way to account for these uncertainties is to use bootstrapping, studied by previous authors. Alternatively, some other authors have mentioned how a Bayesian approach may be the best way to incorporate the parameter uncertainty in the optimization, but no fully Bayesian approach for EGO has been implemented in practice. In this paper, we present a fully Bayesian implementation of the EGO method. The proposed Bayesian EGO algorithm is validated through simulation of noisy nonlinear functions and compared with the standard EGO method and the bootstrapped EGO. We also apply the Bayesian EGO algorithm to the optimization of a stochastic computer model. It is shown how a Bayesian approach to EGO allows one to optimize any function of the posterior predictive density.",
author = "Tajbakhsh, {Sam Davanloo} and {Del Castillo}, Enrique and Rosenberger, {James Landis}",
year = "2015",
month = "1",
day = "1",
doi = "10.1002/qre.1658",
language = "English (US)",
volume = "31",
pages = "1001--1012",
journal = "Quality and Reliability Engineering International",
issn = "0748-8017",
publisher = "John Wiley and Sons Ltd",
number = "6",

}

TY - JOUR

T1 - A Bayesian Approach to Sequential Optimization based on Computer Experiments

AU - Tajbakhsh, Sam Davanloo

AU - Del Castillo, Enrique

AU - Rosenberger, James Landis

PY - 2015/1/1

Y1 - 2015/1/1

UR - http://www.scopus.com/inward/record.url?scp=84942234898&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84942234898&partnerID=8YFLogxK

U2 - 10.1002/qre.1658

DO - 10.1002/qre.1658

M3 - Article

AN - SCOPUS:84942234898

VL - 31

SP - 1001

EP - 1012

JO - Quality and Reliability Engineering International

JF - Quality and Reliability Engineering International

SN - 0748-8017

IS - 6

ER -