In September 2016, the psychologist Dana Carney came forward with a confession: She no longer believed the findings of a high-profile study she had co-authored in 2010 to be true. The study was about “power-posing” — the idea that adopting expansive, powerful stances can psychologically and physiologically help people in high-pressure situations. Carney’s co-author, Amy Cuddy, a psychologist at Harvard University, had earned considerable fame from power poses, and her 2012 TED talk on the topic is the second most-watched TED talk of all time.
Carney, now based at the University of California, Berkeley, had, however, changed her mind. “I do not believe that ‘power pose’ effects are real,” she wrote on her website in 2016. The reason, she added, was that “since early 2015 the evidence has been mounting suggesting there is unlikely any embodied effect of nonverbal expansiveness.” Other researchers, it turned out, could not replicate the power pose results, and withering scrutiny of the Carney and Cuddy study by fellow scientists mounted.
Carney’s assertions and Cuddy’s responses were widely covered in the media. (Earlier this year, Forbes reported that Cuddy had successfully refuted criticism of the power-posing study.) And despite her own eventual disavowal of the findings, Carney did not believe the original paper warranted a full retraction, because it “was conducted in good faith based on phenomena thought to be true at the time,” she told the research integrity blog Retraction Watch.
But in many researchers’ minds, Carney’s change of heart and the pointed questions surrounding power pose research typify the replication crisis that has dogged fields like social psychology for years. Some researchers have gone so far as to suggest that most published research findings are false.
A 2016 survey by Nature of more than 1,500 scientists found that more than 70 percent of researchers had tried and failed to reproduce another researcher’s work, and more than half had failed to reproduce their own. Psychology is among the most affected disciplines: findings suggesting that wearing red makes a person more attractive, or that smiling makes people happier, have proved difficult for follow-up researchers to reproduce.
Of course, some researchers have argued that the replication crisis is exaggerated. But even if that is the case, there really is no effective way for scientists to quickly and publicly inform colleagues that they are no longer confident in their published work. Public declarations like Carney’s are one way to go, but they are often difficult to track.
So an ambitious new effort, motivated by Carney’s move, is encouraging psychologists to own up to shortcomings in their published work by submitting official loss-of-confidence statements to a single online clearinghouse for such confessions, called the Loss of Confidence Project.
“An Idealistic Approach”
The aim is to simplify how such statements are reported as opposed to the current process, in which researchers bicker in back-and-forth commentaries and rebuttals, says Julia Rohrer, who studies personality psychology at the International Max Planck Research School on the Life Course in Berlin, Germany, and is one of three researchers working on the project.
“People will defend their scientific claims until their death,” Rohrer said. “As scientists, we should be aware that people are often wrong.” Carney’s move, for example, was generally well-received by psychologists who welcomed her transparency, she noted.
Rohrer and her colleagues, Tal Yarkoni of the University of Texas at Austin and Christopher Chabris of the Geisinger Health System in Pennsylvania, are currently accepting submissions of loss-of-confidence statements, focusing on psychology studies — and with some ground rules: Authors submitting a loss-of-confidence statement, for example, are expected to take primary responsibility for methodological or theoretical problems with their paper — otherwise, the entry strays into whistleblowing territory and is not eligible for publication. The researchers eventually plan to publish the statements in an academic paper, Rohrer said.
Only a handful of psychologists have submitted proposals so far, Rohrer said, despite an initial show of enthusiasm for the initiative. The dearth of submissions might be because it’s a bold step, Rohrer suggested, or because researchers don’t have the time to sift through all of their past work. Co-authors of a paper may also disagree over the decision to submit a statement, she added — especially if they are at different stages of their careers.
“We do know that more people have these stories,” Rohrer noted, but they are reluctant to share them. She also admitted that her project is an “idealistic approach to scientific self-correction,” since it relies on researchers being honest about their own work.
Still, more emphasis on self-correction will help to address the fact that some researchers whose work has not held up feel they have been treated unfairly, Rohrer said. Researchers are often too cautious about admitting errors because they assume the consequences will be negative, she added, even though reactions to cases like Carney’s have been predominantly positive.
A Warning, Not a Punishment
Rohrer’s supervisor, Stefan Schmukle, a psychologist at the University of Leipzig, has submitted a loss-of-confidence statement based on a 2007 study he co-authored about variations in the lengths of ring and index fingers in men and women.
“It is important to communicate to other scholars that I have lost confidence in these specific results so that other researchers are aware that it might not make sense to follow this line of research,” says Schmukle.
Like Carney, Schmukle also feels his paper shouldn’t be retracted even though he doubts the study’s main findings are reproducible. “In my view, a retraction would be appropriate if the data were faked, if the statistical analysis was wrong, or something like that,” Schmukle says. The problem with his paper, however, is that some of the results, which in hindsight are important for understanding it, were not reported.
Participating in Rohrer’s project doesn’t necessarily imply that authors should retract or correct their papers, she noted — adding that she would consider it dysfunctional if papers were retracted every time authors admitted to mistakes, because that would be a disincentive for coming forward in the first place.
Marcus Munafò, a biological psychologist at the University of Bristol in the U.K. who has pulled a paper after spotting an error, made a similar case. “Whether or not to retract a paper is a tricky issue,” he said. “But I wouldn’t retract papers that report results that are almost certainly wrong but that were conducted in good faith.”
(For its part, Retraction Watch credits authors who retract their own papers by acknowledging them for “doing the right thing.”)
Munafò, who co-authored a manifesto for reproducible science, described the new loss-of-confidence project as an ambitious effort with the potential to introduce a valuable self-correction mechanism. “The only real drawback is that it might not take off, perhaps because it’s a bit ahead of its time,” he said.
Although science is often described as self-correcting, Munafò added, many of the means of self-correction — such as replicating studies and reporting null results — are not currently prioritized.
· · ·
“Take the Chance of Disclosing Now”
Some researchers are tackling loss-of-confidence incidents on their own. Rebecca Willén, an independent psychology and meta-science researcher who divides her time between Indonesia and Europe, publishes disclaimers about her studies on her own website. “I believe there is a difference between loss-of-confidence and retroactive disclosure statements,” said Willén, who added that she plans to participate in Rohrer’s project.
One difference, she noted, is that unlike loss-of-confidence statements, retroactive disclosures can also just say everything was reported transparently in the study, as Willén does for some of her papers listed on her site. “A few of my disclosure statements were made public already in my Ph.D. thesis, while a couple of others were not published until just recently.”
Another psychologist who has acknowledged shortcomings in his published work is Will Gervais of the University of Kentucky. In 2012, a flashy study published in the journal Science by Gervais and colleagues — which attracted several media headlines — claimed that thinking analytically can make people less religious.
But when outside researchers took a second look at his paper, it failed to hold up. The replicators praised Gervais and his team for their openness with their data, and Gervais published a statement about the replication study and his personal perspective on his own website. In it, he wrote that his “methodological awakening” started around 2012.
Gervais said the loss-of-confidence project would seem to offer researchers an opportunity to publicly distance themselves from previous work that they’ve come to believe is not robust. “This is certainly better than just holding your breath and hoping nobody tries to replicate it,” he said.
And Willén agreed, adding that this is a good time to consider coming forward with such admissions, because the atmosphere is charitable. “Take the chance of disclosing now,” she said. “Once this phase is over, it’s more likely that retroactive disclosures will result in negative consequences for your career.”
Daniël Lakens, an experimental psychologist at Eindhoven University of Technology in the Netherlands, included a loss-of-confidence statement within a commentary he wrote after a meta-analysis and a large replication study raised doubts about findings in his own paper. “Scientific papers are regrettably not like software,” he said. “I can update software I write, and the latest version is always the best version.”
Opening Up the “File Drawer”
In recent years, some academic publishers have made efforts to turn scholarly articles into living documents, but most papers remain static, often containing outdated or refuted information. One problem psychology and many other disciplines suffer from is publication bias, in which academic journals favor positive results over negative ones.
As a result, negative findings are stashed away by researchers and never published — the so-called “file-drawer problem” — leaving the scholarly literature with a skewed picture of whether a phenomenon really exists.
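To make the mechanism concrete, here is a minimal simulation sketch (not drawn from the article; the sample sizes, significance threshold, and variable names are illustrative assumptions): if journals publish only the studies that happen to find a positive, statistically significant effect, the published record can suggest a sizable effect even when the true effect is zero.

```python
import random
import statistics

# Illustrative sketch of publication bias (assumed parameters, not real data):
# simulate many small studies of an effect that is actually zero, then
# "publish" only the studies that reach positive statistical significance.

random.seed(42)
TRUE_EFFECT = 0.0   # the phenomenon does not actually exist
N_PER_STUDY = 20    # small sample per study
N_STUDIES = 1000

all_effects = []
published_effects = []  # everything else stays in the "file drawer"

for _ in range(N_STUDIES):
    # Each study estimates the effect from noisy data.
    sample = [random.gauss(TRUE_EFFECT, 1.0) for _ in range(N_PER_STUDY)]
    mean = statistics.mean(sample)
    se = statistics.stdev(sample) / (N_PER_STUDY ** 0.5)
    all_effects.append(mean)
    # Crude one-sided filter: a t-statistic above 2 roughly means p < 0.05,
    # mimicking journals that favor positive, significant results.
    if mean / se > 2:
        published_effects.append(mean)

print(f"Mean effect across all studies:       {statistics.mean(all_effects):+.3f}")
print(f"Mean effect across published studies: {statistics.mean(published_effects):+.3f}")
print(f"Share of studies published:           {len(published_effects) / N_STUDIES:.1%}")
```

In this toy setup, the full set of studies — file drawer included — averages out near zero, while the small fraction that clears the significance filter reports a clearly positive average effect, which is the skew the file-drawer problem describes.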
To solve that problem, a journal called Meta-Psychology was launched last year. The journal, which has no publication fee, welcomes papers reporting negative findings — dubbed “file-drawer reports” — and replication reports in which authors confirm or refute their own or somebody else’s results.
Whatever the method, owning up to mistakes is not easy. Willén, for instance, said she became unpopular in her department for struggling against what she calls “questionable practices” during her Ph.D. years. “Publishing my retroactive disclosure statements were simply the result of many years of inner struggle,” she said. “Follow my conscience or be loyal to people I deeply respect and would love to continue working with? My conscience finally won that battle.”
Dalmeet Singh Chawla is a freelance science journalist based in London. Between January 2016 and January 2017, he worked as a full-time reporter for Retraction Watch. This article was first published on Undark.