From a University of Waterloo news release:
Fewer than half of published psychology studies can have their original results replicated, according to University of Waterloo researchers involved in the most comprehensive investigation ever conducted into the rate and predictors of reproducibility in a field of science.
Published today in the journal Science, the Reproducibility Project: Psychology found that only 35 of 100 attempted replications produced the same findings as the original study. The project, launched nearly four years ago, asked 270 researchers around the world to replicate studies published in three prominent psychology journals: Psychological Science, the Journal of Personality and Social Psychology, and the Journal of Experimental Psychology: Learning, Memory, and Cognition.
“Error correction is central to science moving forward in the pursuit of new knowledge and innovation,” said Professor Michael Barnett-Cowan, of the Faculty of Applied Health Sciences at Waterloo, who replicated a study as part of the project. “While reproducing all scientific experiments is not feasible, sciences such as psychology need to occasionally take stock and question previously published results.”
The study reports that failure to reproduce does not necessarily mean that the original research was incorrect. Researchers involved in the project noted that even though most teams worked with the original authors to use the same materials and methods, small differences in when, where, or how they conducted the replication study might have influenced the results. The replication might have failed to detect the original result by chance, or the original result might have been a false positive.
Science is unique among ways of gaining knowledge in that it relies on reproducibility to build confidence in ideas. The Reproducibility Project: Psychology is the first project of its kind to make its data and replication reports public.
“Research that is novel and innovative is most likely to be published in prestigious journals, which benefits the scientist’s career,” said Professor Denise Marigold, of Renison University College at Waterloo, who also replicated a study as part of the project. “Research reporting the precise conditions under which other scientists’ findings do or do not replicate doesn’t earn the same kind of recognition, but is necessary to move science forward as a whole.”
In recent years, many journals have taken steps to improve reproducibility by increasing the transparency of original research materials, code, and data. A growing number of publishers encourage researchers to submit reports of replication studies and to share their results through open-access initiatives and data archiving. At Waterloo, many researchers build replications into their projects from the outset.
“This study sounds a cautionary note to researchers and those on editorial boards who ultimately choose what gets published,” said Professor Michael Dixon, chair of the Department of Psychology at Waterloo. “This landmark study provides clues about the factors that promote reproducibility, and will hopefully spur editorial boards to reward researchers who take concrete steps to ensure that their findings meet this basic tenet of science.”