The impact of sampling methods on bias and variance in stochastic linear programs

Michael B. Freimer, Jeffrey T. Linderoth, Douglas J. Thomas

Research output: Contribution to journal › Article

14 Citations (Scopus)

Abstract

Stochastic linear programs can be solved approximately by drawing a subset of all possible random scenarios and solving the problem based on this subset, an approach known as sample average approximation (SAA). The value of the objective function at the optimal solution obtained via SAA provides an estimate of the true optimal objective function value. This estimator is known to be optimistically biased; the expected optimal objective function value for the sampled problem is lower (for minimization problems) than the optimal objective function value for the true problem. We investigate how two alternative sampling methods, antithetic variates (AV) and Latin Hypercube (LH) sampling, affect both the bias and variance, and thus the mean squared error (MSE), of this estimator. For a simple example, we analytically express the reductions in bias and variance obtained by these two alternative sampling methods. For eight test problems from the literature, we computationally investigate the impact of these sampling methods on bias and variance. We find that both sampling methods are effective at reducing mean squared error, with Latin Hypercube sampling outperforming antithetic variates. For our analytic example and the eight test problems we derive or estimate the condition number as defined in Shapiro et al. (Math. Program. 94:1-19, 2002). We find that for ill-conditioned problems, bias plays a larger role in MSE, and AV and LH sampling methods are more likely to reduce bias.
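
To make the estimator comparison concrete, the following sketch (not taken from the paper) replicates the SAA optimal-value estimator under plain Monte Carlo, antithetic variates, and Latin Hypercube sampling for a one-product newsvendor problem, a simple stochastic linear program whose true optimal value is known in closed form. The problem, its cost and revenue parameters, and the sample sizes are illustrative assumptions, chosen only so that the downward bias, variance, and MSE of the estimator can be measured directly.

# Illustrative sketch (not from the paper): measure the optimistic bias of the
# SAA optimal-value estimator under plain Monte Carlo (MC), antithetic variates
# (AV), and Latin Hypercube (LH) sampling. The newsvendor test problem with
# Uniform(0,1) demand and the parameter values below are assumptions made so
# that the true optimal value is available in closed form.

import numpy as np

rng = np.random.default_rng(0)

C, R = 1.0, 2.0      # assumed unit purchase cost and unit revenue
TRUE_OPT = -0.25     # true optimal expected cost: x* = 0.5, C*x* - R*E[min(x*, D)] = -0.25

def saa_optimal_value(demand):
    """Solve the SAA newsvendor problem for one demand sample.

    The SAA objective (1/n) * sum(C*x - R*min(x, D_i)) is piecewise linear and
    convex in x, so its minimum is attained at one of the sampled demands."""
    candidates = np.sort(demand)
    values = [C * x - R * np.minimum(x, demand).mean() for x in candidates]
    return min(values)

def mc_uniforms(n):
    """Plain Monte Carlo: i.i.d. Uniform(0,1) draws."""
    return rng.random(n)

def av_uniforms(n):
    """Antithetic variates: pair each draw U with its mirror 1 - U (n assumed even)."""
    u = rng.random(n // 2)
    return np.concatenate([u, 1.0 - u])

def lh_uniforms(n):
    """Latin Hypercube sampling in one dimension: one draw per stratum of width 1/n."""
    strata = rng.permutation(n)
    return (strata + rng.random(n)) / n

def summarize(sampler, n=50, reps=2000):
    """Replicate the SAA estimator and report its bias, variance, and MSE."""
    estimates = np.array([saa_optimal_value(sampler(n)) for _ in range(reps)])
    bias = estimates.mean() - TRUE_OPT
    var = estimates.var(ddof=1)
    return bias, var, bias**2 + var

for name, sampler in [("MC", mc_uniforms), ("AV", av_uniforms), ("LH", lh_uniforms)]:
    bias, var, mse = summarize(sampler)
    print(f"{name}: bias={bias:+.4f}  var={var:.5f}  mse={mse:.5f}")

In this one-dimensional example LH sampling reduces to simple stratification; the paper's test problems involve higher-dimensional random vectors, but the bias, variance, and MSE bookkeeping is the same, and the bias estimates here should come out negative, matching the optimistic bias described in the abstract.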

Original language: English (US)
Pages (from-to): 51-75
Number of pages: 25
Journal: Computational Optimization and Applications
Volume: 51
Issue number: 1
DOI: 10.1007/s10589-010-9322-x
State: Published - Jan 1 2012


All Science Journal Classification (ASJC) codes

  • Control and Optimization
  • Computational Mathematics
  • Applied Mathematics

Cite this

Freimer, Michael B. ; Linderoth, Jeffrey T. ; Thomas, Douglas J. / The impact of sampling methods on bias and variance in stochastic linear programs. In: Computational Optimization and Applications. 2012 ; Vol. 51, No. 1. pp. 51-75.
@article{8cf1dd59cd134b8db68702ebeac1acb8,
title = "The impact of sampling methods on bias and variance in stochastic linear programs",
author = "Freimer, {Michael B.} and Linderoth, {Jeffrey T.} and Thomas, {Douglas J.}",
year = "2012",
month = "1",
day = "1",
doi = "10.1007/s10589-010-9322-x",
language = "English (US)",
volume = "51",
pages = "51--75",
journal = "Computational Optimization and Applications",
issn = "0926-6003",
publisher = "Springer Netherlands",
number = "1",

}
