Researchers at 125 institutions, including three scientists at the UO, have found that fewer than half of 100 studies published in three leading psychology journals in 2008 produced similar results when researchers sought to reproduce them.
That doesn't mean the original results were incorrect, but it does suggest more diligence is needed in how studies are replicated to confirm or build on original insights, the authors noted in a paper published in the Aug. 28 issue of the journal Science. Reproducibility is a core principle of the scientific method and depends heavily on detailed methodology. Through replication, science moves forward in a self-correcting system.
The paper in Science emerged from a large-scale collaborative project led by the University of Virginia-based Center for Open Science that has been exploring the replication of research in the psychological sciences since November 2011.
Among the paper's 270 co-authors were UO psychology professor Pranjal H. Mehta, his former lab manager Bethany Lassetter, who now is a doctoral student at the University of Iowa, and Grace Binion, a doctoral student in clinical psychology whose participation took place while she was an undergraduate student at Georgia Gwinnett College.
Mehta and Lassetter teamed with researchers at Tilburg University in the Netherlands to replicate a study involving optimism and the future that had appeared in the journal Psychological Science. This study was among those in the project that were successfully replicated.
In a news release, the contributing authors said that regardless of the analytic method or criteria used, just 39 percent of the replicated studies produced the same findings as the originals. Science described the paper in its own news coverage, and the journal Nature immediately published a news story about it online.
It isn't necessarily a goal to have a higher rate of replication, Mehta said, because research often tackles challenging and new questions that sometimes have unwieldy or complicated answers. Paying more attention to replication, he added, can help sort out false starts and identify initial studies that are more likely to reveal promising leads to build on.
"The way scientific research is published must change because the current incentive structure is not consistent with a core scientific value: reproducibility," Mehta said. "Journals should amend their editorial policies and publish studies that test for reproducibility."
Mehta's lab got involved in the project as Mehta followed discussions about incentive structures for researchers on an email listserv hosted by the Center for Open Science. "In one email the researchers were asking for labs to collaborate on the reproducibility project. Bethany was in my laboratory at the time, and I thought this could be a great project for her. I asked her if she was interested in participating, and she said yes."
His concerns about incentives also were raised in the group's news release.
"Scientists must publish their research in top-tier scientific journals to obtain jobs and research grants," Mehta said. "However, many top-tier scientific journals tend to publish only groundbreaking, novel findings that yield statistically significant results. These journals typically reject papers that test the extent to which the findings from previously reported studies can be reproduced using similar methods and procedures. The upshot is that very little is known about the reproducibility of previously published findings."
The decision by Science to publish the paper may signal that top-tier journals are changing their policies to be consistent with the core scientific value of reproducibility, Mehta said. However, he added: "We have to ask if scientific journals value and promote reproducibility as much as they should. This is a question that all journal policymakers should now be asking themselves."
Lassetter helped examine the original paper that the UO helped replicate, and she had a lead role in conducting the study, analyzing the data and writing the results. The UO sample included 20 men and 29 women who completed a series of exercises, one of which replicated the scenario from the original research.
"I enjoyed being a part of each stage of our specific replication, and seeing it contribute to the larger effort to examine reproducibility," said Lassetter, who earned a psychology degree in 2012 from the UO and then worked in Mehta's lab before beginning her doctoral work in social psychology at Iowa.
Binion participated in two failed replications. One involved a study that explored how emotion priming and emotional experience influence cognition; the other focused on a study that looked at the roles of eye gaze and perspective as preschool children learned new words.
Like Lassetter, Binion was involved in every step of the research.
"While this hands-on experience was, without a doubt, extremely valuable, I am most grateful for the opportunity to meaningfully engage in a larger dialogue about the way in which psychological science is conducted," Binion said. "Not only was I able to make a concrete contribution to this immensely beneficial effort, but I am now an informed participant in what seems to be a markedly positive transformational dialogue aimed at revolutionizing the way we both practice and conceptualize science.
— By Jim Barlow, Public Affairs Communications