That wide schism between science and journalism
Having been in scientific research for almost a decade, I am always appalled to hear science reporters make sweeping generalizations and oversimplifications on television. While this is by no means confined to broadcast news, the most erroneous bits of science information do come from soundbites of so-called experts called upon to enlighten lay audiences about complex scientific topics, usually on the pretext of an equally poorly reported study in a newspaper. Unfortunately, most scientists aren’t listening to that news report, and most laymen aren’t schooled to realize that it ranges anywhere from mildly misleading to dangerously inaccurate. This article in today’s Washington Post, which distinguishes between evidence and opinion in scientific reporting, is illuminating.
The biggest problem in science reporting, as I see it, is that scientific studies rarely have enough evidence to support categorical conclusions. Journalism, on the other hand, is all about conclusions; journalists are trained to jump to conclusions long before all the relevant facts are available – think of all the speculation and analysis and hearsay and exit polls that get widely reported in mainstream media as if they were indistinguishable from fact. "You are making news here," Chris Matthews often cries after doggedly challenging a political strategist to make a prediction about an election result that is months away.
While a political pundit’s word is as good as news in politics, no research will be published in the journal Nature merely because a Nobel prize-winning scientist “thinks” something might be true (notwithstanding the deference to big names that does exist in academic research).
As David Brown is quoted as saying in the Post piece, science reporters have to take great care not to present foregone conclusions. They should give the reader sufficient facts and background on all the evidence that is available on a given subject, and follow up as research progresses. A reporter should spend less time telling readers what to conclude from a study and more time detailing the available evidence, so that readers may draw their own conclusions.
A good example of the really shoddy reporting that happens in science is the recent coverage of a study that found links between areas with higher rainfall and the prevalence of autism. All the study really concludes is that potential environmental factors linked to precipitation levels might, when combined with genetic and behavioral influences, cause an increased incidence of autism. As Ewen Callaway notes in the New Scientist, headlines in newspapers ranged from “Heavy rainfall could be linked to autism” to -- appallingly -- “Autism: Blame it on the rain.”
This is not to say that the problem of opinions overriding evidence does not exist in other areas. For better or for worse, it is more clearly visible in science reporting, I think, because science so steadfastly relies on hard data and factual evidence. Every result you ever report has to be tested, refuted, challenged, reproduced, and proved beyond doubt to be true. “That could just be an artifact,” my professor would say in graduate school, after my fifth consistent reproduction of an experiment. In spite of this ruthless insistence on accuracy (one that, among many other things, prompted me to quit science), nothing you read in a peer-reviewed science journal can be taken as the gospel truth (boy, do gospels have it easier).
For every successful experiment in science, there are usually a dozen failed attempts, about two dozen repetitions, and a hundred contradictory results published elsewhere. Then there are a whole slew of exceptions, conditions, factors, and biases that might have influenced the results.
So, while it might be sensational and eye-catching to report that rain causes autism, it is far from the truth. “Environmental factors related to precipitation might contribute to autism” is less exciting but closer to the truth, and might prevent a potential mass exodus from Seattle.