Activities such as sensitivity analysis, statistical effect screening, uncertainty propagation, and model calibration have become integral to the Verification and Validation (V&V) of numerical models and computer simulations. Because these analyses require multiple runs of a computer code, they can rapidly become computationally expensive. For example, propagating uncertainty with 1,000 Monte Carlo samples wrapped around a finite element calculation that takes only 10 minutes to run requires roughly seven days of single-processor time. An alternative is to combine a design of computer experiments with meta-modeling, replacing the potentially expensive computer simulation with a fast-running surrogate. The surrogate can then be used to estimate sensitivities, propagate uncertainty, and calibrate model parameters at a fraction of the cost of wrapping a sampling algorithm or optimization solver around the analysis code. In this publication, we focus on the dangers of using a too sparsely populated design of experiments to propagate uncertainty or train a fast-running surrogate model. One danger for sensitivity analysis or calibration is developing meta-models that include erroneous sensitivities. This is illustrated with a high-dimensional, nonlinear mathematical function in which several parameter effects are statistically insignificant, thereby mimicking a situation often encountered in practice. It is shown that using a sparse design of computer experiments leads to an incorrect approximation of the function.
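The danger described above can be sketched in a few lines. The snippet below is an illustrative toy example, not the paper's high-dimensional test function: a one-dimensional response `f(x) = sin(2*pi*x)` (a hypothetical choice) sampled on a 3-run design happens to hit the zeros of the sine, so a surrogate trained on those runs would wrongly screen the parameter as having no effect, while a 21-run design recovers the full effect.

```python
import numpy as np

# Hypothetical nonlinear response with one full oscillation on [0, 1].
# This stands in for an expensive simulation; it is NOT the function
# studied in the paper.
def f(x):
    return np.sin(2.0 * np.pi * x)

# Sparse 3-run design: x = 0, 0.5, 1 -- every point lands on a zero of f.
x_sparse = np.linspace(0.0, 1.0, 3)
y_sparse = f(x_sparse)

# Denser 21-run design resolves the oscillation.
x_dense = np.linspace(0.0, 1.0, 21)
y_dense = f(x_dense)

# Observed output range as a crude effect screen (rounding clears
# floating-point residue of order 1e-16 at the sine's zeros).
range_sparse = np.ptp(np.round(y_sparse, 12))
range_dense = np.ptp(y_dense)

print(range_sparse)  # zero range: x wrongly appears inactive
print(range_dense)   # large range: the effect is real
```

A surrogate fit to the sparse design would be the constant zero function, a qualitatively wrong approximation; this is the aliasing mechanism by which sparse designs produce erroneous sensitivities.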