Editor’s Note: This article was originally published in The Conversation. It was also picked up by the National Post.

(Shutterstock)

“Statistics are like a bikini. What they reveal is suggestive, but what they conceal is vital,” Aaron Levenstein, a business professor at Baruch College, once said.

I first heard a version of this quote in an undergraduate social psychology class in 2003. Nearly a decade and a half later, psychology is having a replication crisis — and the “bikini” is largely to blame.

Recently, more than 270 psychologists set out to repeat 100 experiments to see if they could generate the same results. They successfully replicated only 39 of the 100 studies.

Over several years, failed attempts to replicate published studies have caused generally accepted bodies of research to be called into question — or rejected outright.

One example is the idea that your willpower is a limited resource that, like a muscle, becomes exhausted when it is used. Another is that power posing — standing like a superhero for two minutes — makes you feel bolder, reduces stress hormones and increases testosterone. Both have fallen by the wayside after failed replications.

Psychology was wrong about the power pose.
(Shutterstock)

These aren’t dusty, arcane findings limited to academic journals; a TED talk by social psychologist Amy Cuddy on the effectiveness of power posing has been viewed over 45 million times and is near the top of the list of the most popular TED talks of all time.

Bad habits

The “bikini” at the centre of the crisis refers to the way researchers collect and analyze data and report their results. Many important details and decisions are often concealed.

When carrying out experiments, researchers make decisions about how much data to collect, whether some observations should be excluded from the analysis and what controls, if any, should be included in analyses.

After the data has been collected, researchers have additional, undisclosed leeway.

They may “torture the data” until it reaches statistical significance (a conventional cut-off, typically p < .05, taken to suggest the real effect may not be zero), a practice called “p-hacking.”

Or they may engage in the practice of “HARKing,” short for “hypothesizing after results are known.” Creating a hypothesis to confirm a result that has already been found makes it easier to satisfy journal reviewers and editors who are interested in publishing statistically significant results.

In academia, where researchers are under pressure to “publish or perish,” amassing publications is the route to career advancement and grant funding.

All told, this undisclosed flexibility can lead to extremely high rates of false positive results. A false positive is essentially claiming there is an effect when there isn’t one. An example would be concluding that standing up straight increases testosterone levels, when it doesn’t.
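The effect of this flexibility can be illustrated with a short simulation (a minimal sketch, not from the original article; the setup and numbers are assumptions chosen purely for illustration). It mimics one form of p-hacking known as optional stopping: a researcher compares two groups drawn from exactly the same distribution, peeks at the p-value after every few participants, and stops as soon as it dips below .05. Even though there is no real effect, the long-run false positive rate lands far above the nominal 5 per cent.

```python
# Minimal sketch of p-hacking via "optional stopping" (illustrative only).
# Both groups come from the same distribution, so any "significant"
# result is a false positive.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def one_phacked_study(max_n=100, batch=10, alpha=0.05):
    """Return True if a null effect ends up 'significant' after repeated peeking."""
    group_a, group_b = [], []
    while len(group_a) < max_n:
        # Add a few more participants to each group, then peek at the p-value.
        group_a.extend(rng.normal(0, 1, batch))
        group_b.extend(rng.normal(0, 1, batch))
        _, p = stats.ttest_ind(group_a, group_b)
        if p < alpha:
            return True   # stop early and report the "finding"
    return False

n_studies = 2000
false_positives = sum(one_phacked_study() for _ in range(n_studies))
print(f"False positive rate with peeking: {false_positives / n_studies:.1%}")
# Typically well above the nominal 5% (often in the 15-20% range for this setup).
```

Preregistering the sample size and analysis plan in advance, one of the reforms described below, removes exactly this kind of undisclosed flexibility.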

A new research culture

Despite all the upheaval, psychology’s replication crisis may have a silver lining. In a few short years, researchers have proposed many ideas and recommendations for reforming how research is done.

Journals and granting agencies are demanding more from authors with respect to openness and transparency. Accessible online repositories, such as GitHub, the Open Science Framework and OpenDOAR, allow researchers to share their raw materials, exact protocols, scripts, data, code and more with anyone who has an internet connection. The aim is to leave essentially nothing concealed in the scientific process.

Researchers who manipulate their data or engage in poor research practices will wind up with results that can’t be replicated.
(Shutterstock)

Some journals, such as Psychological Science and, more recently, the American Psychological Association’s journals, are encouraging authors to store their data and code in these repositories and to disclose details about data collection decisions before submitting a manuscript for peer review. Researchers can also preregister their hypotheses. But something has been missing.

The missing link

While psychological science has been moving toward more open and transparent methods, graduate student training has been largely left out of discussions.

Many of the practices that created the crisis are embedded in our research culture: We do things a certain way because we have always done them that way, and because other people do, too. Much of this culture is assimilated when researchers are in graduate school.

To maintain the momentum of positive change, it is important for graduate education to keep up with changes in the field. If training fails to keep up, graduate students may leave programs with antiquated ideas and practices. These ideas and practices can proliferate as students become faculty members, start their own labs and train graduate students in the same manner they were taught.

Part of educating students is ensuring they are aware of the changing cultural landscape, and then explicitly teaching them to follow open and transparent research practices and avoid bad habits.

Finding the light

In our department at the University of Guelph, a group of methodologically minded faculty have recognized the importance of tackling this problem head-on. Our goal is to create positive change and take steps to avoid history repeating itself with the next generation of researchers.

We created “Statistical methods in theses: Guidelines and explanations” to help students when conducting their thesis research. Students can work through the guidelines with their advisers, allowing them to make better decisions in the planning stages of their research projects.

The document’s rather humble-sounding purpose belies an unintended provocative side. The guidelines identify questionable research practices and offer explanations and advice for students who wish to follow open and transparent methods. Because some of the practices it identifies may be standard, previously unquestioned and sometimes explicitly taught procedures, the document has the potential to be viewed, by some, as extreme.

Culture is not something that can be changed overnight. But with explicit efforts to cultivate a new research culture, change can be targeted and purposeful.

This crisis in psychology makes me think about a line in John Milton’s epic poem, Paradise Lost: “Long is the way and hard, that out of Hell leads up to Light.”

By acting on the crisis, psychology has embarked upon its symbolic journey back to “light.” It will be current and future graduate students who decide how, and where, the journey ends.