
Deciphering the Decline Effect – A Prospective Multi-laboratory Replication Study

Brian Nosek, Ph.D., University of Virginia, Charlottesville, USA

There has been increasing concern among scientists about the irreproducibility of scientific findings. Across several fields of research, including medicine, psychology, economics, and genetics, evidence has accumulated that a considerable number of reported findings are smaller, less robust, or simply less true than originally believed. This has contributed to the emergence of the field of meta-science, which uses quantitative methods to understand how scientific practices influence the veracity of scientific conclusions.

To date, the understanding of reproducibility and the lack thereof has been impeded by two related challenges:

1. the lack of transparency of the scientific record, and
2. the retrospective nature of reproducibility studies.

The suggestion, for example, that reproducibility issues are due to publication bias and selective reporting hinges on the presumption of a large body of unpublished studies with negative outcomes. Although the existence of such studies can hardly be doubted, their contribution to reproducibility issues is difficult to assess. Compounding this, replications of published studies are of little value, in and of themselves, for establishing why initial studies frequently report inflated results.

This project aims to overcome those obstacles. Four labs (at UC Santa Barbara, Stanford University, UC Berkeley, and the University of Virginia) are conducting a prospective multi-laboratory replication study. Whenever one of the participating labs discovers a new experimental effect, the study that revealed the effect is systematically replicated both by the originating laboratory and by the others, following a pre-specified sequence of replications and analyses. In so doing, the project both assesses the individual hypotheses explored in each study and provides the context for a deeper meta-scientific understanding. This will help to evaluate the evidence for different accounts of variation in the reproducibility of scientific findings, such as false-positive effects, statistical artifacts, selective reporting, publication bias, and changes in procedure or sampling.

This project has three primary goals:

1. To develop a gold-standard replication protocol, in which every effort is made to design experiments and implement replications in a manner that maximizes the likelihood of full replicability.
2. To examine whether replications of newly devised experimental protocols are associated with declining effect sizes, even when all reasonable efforts are made to minimize such declines (the sketch after this list illustrates how such a decline can be quantified).
3. If declining effect sizes are still observed, to identify their possible locus, for example by assessing whether other labs can replicate the findings as effectively as the originating lab.
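
To make the idea of a declining effect size concrete, the following minimal Python sketch estimates a standardized effect size (Cohen's d) for a hypothetical original study and three replications, then fits a linear trend across the sequence. The data, sample sizes, and shrinking population effects are invented for illustration only; this is not the project's preregistered analysis plan.

import numpy as np
from scipy import stats

def cohens_d(treatment, control):
    # Standardized mean difference using the pooled standard deviation.
    n1, n2 = len(treatment), len(control)
    pooled_sd = np.sqrt(((n1 - 1) * np.var(treatment, ddof=1) +
                         (n2 - 1) * np.var(control, ddof=1)) / (n1 + n2 - 2))
    return (np.mean(treatment) - np.mean(control)) / pooled_sd

rng = np.random.default_rng(0)

# Hypothetical population effects: an original study followed by three
# replications, with the true effect assumed to shrink each time.
true_effects = [0.8, 0.5, 0.4, 0.3]
effect_sizes = []
for d in true_effects:
    control = rng.normal(0.0, 1.0, 100)   # simulated control group (n = 100)
    treatment = rng.normal(d, 1.0, 100)   # simulated treatment group (n = 100)
    effect_sizes.append(cohens_d(treatment, control))

# Fit a linear trend over the replication sequence; a negative slope
# indicates a decline in the estimated effect size.
order = np.arange(len(effect_sizes))
slope, intercept, r, p, se = stats.linregress(order, effect_sizes)
print("Estimated effect sizes:", np.round(effect_sizes, 2))
print(f"Change per replication: {slope:.2f} (p = {p:.3f})")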

The study has been preregistered with the Open Science Framework (OSF). For the overall preregistration, see https://osf.io/6t9vm. For the preregistrations of the studies conducted at the University of Virginia, see https://osf.io/y8adg/, https://osf.io/gn4ex/, and https://osf.io/edt62/.

Upcoming Conference, September 5-8, 2019, at Stanford University:
The current status of the project will be presented at the Metascience 2019 Symposium, whose main organizers are the principal investigators of this project: Brian Nosek (Center for Open Science, USA), Jonathan Schooler (UC Santa Barbara, USA), Jon Krosnick (Stanford University, USA), Leif Nelson (UC Berkeley, USA), and Jan Walleczek (Phenoscience Laboratories, DE).

Publications:

There are five principal investigators on this project. The list below contains publications related to Brian Nosek. For publications from Krosnick, Nelson, Schooler, and Walleczek, go back to the overview or download the project description.

Project-related Publications as of July 2019:
Axt, J. R., & Lai, C. K. (in press). Reducing discrimination: A bias versus noise perspective. Journal of Personality and Social Psychology.

Axt, J. R., Casola, G. M., & Nosek, B. A. (in press). Reducing social judgment biases may require identifying the potential source of bias. Personality and Social Psychology Bulletin.

Axt, J. R., Nguyen, H., & Nosek, B. A. (2018). The Judgment Bias Task: A reliable, flexible method for assessing individual differences in social judgment biases. Journal of Experimental Social Psychology, 76, 337-355.

Ebersole, C. R., Axt, J. R., & Nosek, B. A. (2016). Scientists' reputations are based on getting it right, not being right. PLOS Biology, 14, e1002460. doi:10.1371/journal.pbio.1002460