TY - GEN
T1 - On the use of statistics in design and the implications for deterministic computer experiments
AU - Simpson, Timothy W.
AU - Peplinski, Jesse
AU - Koch, Patrick N.
AU - Allen, Janet K.
N1 - Publisher Copyright:
© 1997 American Society of Mechanical Engineers (ASME). All rights reserved.
PY - 1997
Y1 - 1997
N2 - Perhaps the most prevalent use of statistics in engineering design is through Taguchi's parameter and robust design, using orthogonal arrays to compute signal-to-noise ratios in a process of design improvement. In our view, however, there is an equally exciting use of statistics in design that could become just as prevalent: the concept of metamodeling, whereby statistical models are built to approximate detailed computer analysis codes. Although computers continue to get faster, analysis codes always seem to keep pace so that their computational time remains non-trivial. Through metamodeling, approximations of these codes are built that are orders of magnitude cheaper to run. These metamodels can then be linked to optimization routines for fast analysis, or they can serve as a bridge for integrating analysis codes across different domains. In this paper we first review metamodeling techniques that encompass design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We discuss their existing applications in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of metamodeling techniques in given situations and how common pitfalls can be avoided.
AB - Perhaps the most prevalent use of statistics in engineering design is through Taguchi's parameter and robust design, using orthogonal arrays to compute signal-to-noise ratios in a process of design improvement. In our view, however, there is an equally exciting use of statistics in design that could become just as prevalent: the concept of metamodeling, whereby statistical models are built to approximate detailed computer analysis codes. Although computers continue to get faster, analysis codes always seem to keep pace so that their computational time remains non-trivial. Through metamodeling, approximations of these codes are built that are orders of magnitude cheaper to run. These metamodels can then be linked to optimization routines for fast analysis, or they can serve as a bridge for integrating analysis codes across different domains. In this paper we first review metamodeling techniques that encompass design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We discuss their existing applications in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of metamodeling techniques in given situations and how common pitfalls can be avoided.
UR - http://www.scopus.com/inward/record.url?scp=84888831107&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84888831107&partnerID=8YFLogxK
U2 - 10.1115/DETC97/DTM-3881
DO - 10.1115/DETC97/DTM-3881
M3 - Conference contribution
AN - SCOPUS:84888831107
T3 - Proceedings of the ASME Design Engineering Technical Conference
BT - 9th International Design Theory and Methodology Conference
PB - American Society of Mechanical Engineers (ASME)
T2 - ASME 1997 Design Engineering Technical Conferences, DETC 1997
Y2 - 14 September 1997 through 17 September 1997
ER -