Metastudies for robust tests of theory

Beth Baribault, Chris Donkin, Daniel R. Little, Jennifer S. Trueblood, Zita Oravecz, Don van Ravenzwaaij, Corey N. White, Paul De Boeck, Joachim Vandekerckhove

Research output: Contribution to journal › Article › peer-review


Abstract

We describe and demonstrate an empirical strategy useful for discovering and replicating empirical effects in psychological science. The method involves the design of a metastudy, in which many independent experimental variables—that may be moderators of an empirical effect—are indiscriminately randomized. Radical randomization yields rich datasets that can be used to test the robustness of an empirical claim to some of the vagaries and idiosyncrasies of experimental protocols and enhances the generalizability of these claims. The strategy is made feasible by advances in hierarchical Bayesian modeling that allow for the pooling of information across unlike experiments and designs and is proposed here as a gold standard for replication research and exploratory research. The practical feasibility of the strategy is demonstrated with a replication of a study on subliminal priming.
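The pooling idea in the abstract can be illustrated with a minimal empirical-Bayes sketch: many randomized micro-experiments estimate the same underlying effect, and each estimate is shrunk toward the grand mean by an amount set by the estimated between-experiment heterogeneity. This is only a toy analogue of the paper's full hierarchical Bayesian analysis; all quantities below (the number of experiments, effect sizes, and noise levels) are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical metastudy: K micro-experiments with randomized protocols,
# each yielding a noisy estimate of the same underlying effect.
K = 12                         # number of randomized micro-experiments
mu_true, tau_true = 0.3, 0.15  # population-level effect and heterogeneity
sigma = 0.10                   # within-experiment standard error (assumed known)

theta = rng.normal(mu_true, tau_true, K)  # per-experiment true effects
y = rng.normal(theta, sigma)              # observed effect estimates

# Empirical-Bayes partial pooling: shrink each observed effect toward
# the grand mean by a factor set by the variance components.
mu_hat = y.mean()
tau2_hat = max(y.var(ddof=1) - sigma**2, 0.0)  # method-of-moments estimate
shrink = tau2_hat / (tau2_hat + sigma**2)      # pooling factor in [0, 1)
theta_hat = mu_hat + shrink * (y - mu_hat)     # partially pooled effects
```

The partially pooled estimates `theta_hat` sit between each experiment's raw estimate and the grand mean, which is the sense in which information is shared across unlike experiments; a full hierarchical Bayesian treatment would instead place priors on the population mean and heterogeneity and sample the joint posterior.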

Original language: English (US)
Pages (from-to): 2607-2612
Number of pages: 6
Journal: Proceedings of the National Academy of Sciences of the United States of America
Volume: 115
Issue number: 11
DOIs
State: Published - Mar 13 2018

All Science Journal Classification (ASJC) codes

  • General
