Those amazing research results we’re always reading about may not be so amazing after all, says a business professor at Longwood University.

Research in journal articles is sometimes manipulated through questionable practices that include deleting, adding or altering data to fit hypotheses, or changing hypotheses to fit the results, said Dr. George Banks, co-author of a study that tracked the transformation some research findings went through on the way from dissertation (when there is less pressure to support a hypothesis and more oversight) to journal article. The study investigated outcome-reporting bias, a subset of the phenomenon known as publication bias, or "the file drawer problem," so called because nonsignificant findings are sometimes hidden away in file drawers.

The study, "The Chrysalis Effect: How Ugly Data Metamorphosize Into Beautiful Articles," has been accepted for presentation at the conference of the Academy of Management, the largest management organization, in August in Orlando, Fla. It also received a "best paper" award in the conference's research methods division.

"The pressure to publish encourages people to do things they shouldn’t do," said Banks, assistant professor of management, whose study focused on publication bias in the management science field. "Most researchers don’t outright cheat, but many are comfortable engaging in practices that are questionable.

"For example, in journal articles, researchers often collect data and look at results—then they create a hypothesis. They report hypotheses that were supported and fail to disclose those that weren’t supported, which is misleading. In some studies, researchers find a negative relationship when they had predicted a positive one, but in the journal article they say they predicted a negative relationship all along, which is unethical."

Although Banks’ study focused on a business field, he said publication bias cuts across disciplines.

"Publication bias has been a problem in the medical field, and the field has been cracking down on this—they’re light years ahead of us in business and the social sciences as a whole. In disciplines other than medicine, the potential consequences are less serious, so it’s been overlooked. I’d like my field, and other fields including education and psychology, to use more effective research methods."

The study looked at 142 articles in management or industrial-organizational psychology, a similar field, that have appeared in refereed journals since 2000. All of the articles were originally dissertations. The study found that from dissertation to journal publication, the ratio of supported to unsupported hypotheses more than doubled.

"This rise is directly attributable to the dropping of nonsignificant hypotheses, the addition of statistically significant hypotheses, the reversing of the predicted direction of hypotheses, and data manipulation," the study states. "The published literature overestimates the predictive accuracy of management science and as such is biased."
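The arithmetic behind that ratio shift can be sketched with hypothetical numbers (these figures are illustrative, not the study's actual counts):

```python
# Illustration with made-up numbers: how quietly dropping unsupported
# hypotheses inflates the ratio of supported to unsupported hypotheses.

def support_ratio(supported: int, unsupported: int) -> float:
    """Ratio of supported to unsupported hypotheses."""
    return supported / unsupported

# Suppose a dissertation tests 10 hypotheses: 6 supported, 4 not.
dissertation_ratio = support_ratio(6, 4)   # 1.5

# If 2 of the unsupported hypotheses are dropped before journal
# submission, the published ratio doubles without any new evidence:
journal_ratio = support_ratio(6, 2)        # 3.0

print(dissertation_ratio, journal_ratio)
```

The same inflation occurs, in this toy setup, whether hypotheses are dropped, post-hoc significant ones are added, or predictions are reversed after the fact, which is why the study treats these practices as a single family of outcome-reporting biases.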

Among the "questionable research practices" (QRPs) the study measured were adding, deleting or altering data after hypothesis testing; selectively deleting or adding variables; dropping unsupported hypotheses or adding post-hoc hypotheses; and reversing the direction of, or reframing, hypotheses.

The study includes several recommendations, including asking researchers, when submitting a prospective article to a journal, to sign a disclosure statement affirming that they have not engaged in any of the listed QRPs. "You can't stop true cheaters, but this would help keep the honest researchers honest, which would reduce the problem," said Banks. "Many researchers engage in some of these QRPs, typically out of ignorance, but they need to stop. Some people are on the fence between the dark and the light, and you want to push them toward the light."

The study’s co-authors are at the University of Iowa: Dr. Ernest O’Boyle, assistant professor of management, and Erik Gonzalez-Mule, a Ph.D. candidate.
