A couple of recent stories caught my attention. Stories about scientific studies that can’t be replicated. Stories about scientific papers that had to be retracted. And stories about welfare fraud and mismanagement of General Assistance.

What’s the connection? Let me see if I can explain.

The Washington Post, Nature magazine, the Committee on Publication Ethics, and Retraction Watch report that fake peer-reviews have caused publishers of scholarly medical and scientific articles to retract about 170 articles in recent years.

Peer review is one of the ways that scholarly journals ensure the quality of the material they publish. They have other experts in the field read submissions to see if they make sense and seem accurate, reliable and worthy of publication. Some of the fake reviews were the product of laziness. Some were the product of collusion amongst friends. And some were the product of sophisticated schemes to generate fraudulent favorable reviews.

The Boston Globe reports that a quiet crisis is mounting in science because of the large number of findings in top research journals that cannot be replicated. For example, scientists at Bayer could replicate only a quarter of 67 published discoveries in oncology, women’s health and cardiovascular disease. Amgen could repeat only 6 of 53 major reported cancer findings.

Science is one of our more credible institutions. Its goals are to understand, explain, predict and control. Science improves understanding by making predictions, conducting experiments to test those predictions, and adjusting the predictions to accommodate the results of those experiments. Part of that process involves publishing experimental results to allow others to try to confirm and build upon them. The inability to reproduce reported results undermines the process.


There may be a number of explanations for the inability to reproduce reported findings. The subjects of science are not neat and tidy. They are messy and irregular. And some experiments may detect that messiness. Other experiments may be impaired by some sort of error in the sample, measurement or calculation, by some malfunction in the equipment, or bias in the observer. And some experimental results may be fraudulent and motivated by the pressure to publish and preference for unexpected findings.

The results of bad science can be doozies. Like the 2011 discovery of particles that travel faster than light, which turned out to be the product of mismeasurement caused by a loose cable. Or the 1989 discovery of cold fusion that was going to solve the world’s energy problem, but turned out to be the product of experimental error.

It’s hard to know how longstanding, widespread and serious the problem in science is. There is little demand for publicizing the cases when scientists can’t replicate reported experimental results. People tend to assume that scientists are motivated by the purest of motives, the quest for knowledge for knowledge’s sake. But scientists can be proud, competitive and driven by the desire for prestige and money, like anyone else.

The other stories on my mind are about EBT cards being used to buy inappropriate items. About the couple who got $39,000 worth of food stamp, MaineCare and LIHEAP benefits based on the false claim that they were separated and homeless. About the guy who fraudulently obtained $30,000 worth of food stamps. And about the mismanagement of General Assistance in Portland.

It seems to me that if there is error and fraud in science, then there is bound to be error and fraud in less rigorous institutions like welfare. Denial invites greater mistakes and abuse. Better to acknowledge our imperfections and be vigilant about them.



Halsey Frank is a Portland resident, attorney and former chairman of the Republican City Committee.

