I’ve found Chris Mooney’s past work on the politics of science and on scientific literacy interesting, but there is something that gently grates on me in his Mother Jones essay published last week.
In the essay, Mooney reviews arguments from neuroscience about why we believe what we believe, how we react to new information that contradicts our existing convictions, and what cognitive processes are actually involved in persuading or being persuaded, concluding that at least some of the tendency to reject new information or challenges to our beliefs is cognitively hard-wired. Mooney extends this observation to explain how many people arrive at misreadings of scientific knowledge, in part because the norms of scientific publication require the provision of information that permits or encourages such misreading. Much of his analysis dovetails with established arguments about the power of framing discourses in the media, about confirmation-seeking consumption of information, and about the degree to which strongly held values trump factual information or rational persuasion.
I have a lot of complicated misgivings about the implications of this overall approach for how we think about the public sphere, deliberative processes, the act of persuasion, and our models of subjectivity, agency and consciousness. But I have a simpler objection to this particular subset of the bigger paradigm: namely, that it is not irrational or unreasonable to regard scientific claims which recommend or insist upon particular public policy initiatives with sharply pronounced skepticism across the board. Not because science itself requires a particular form of skepticism (though it does), but because such skepticism is evidence-based, derived from the history of the relationship between policy, the modern state, and science, a history which even non-experts have often viscerally experienced or witnessed.
Three kinds of evidence particularly warrant this preemptive skepticism. The first is the spectacular, well-known examples of flagrant ethical misconduct in the pursuit of scientific knowledge justified by an appeal to the public good or in service to a public policy objective. The conventional response is that such incidents, like the Tuskegee Experiment or the fudging of informed consent in the creation of the HeLa cell line, have been dramatically reduced through institutional reforms and safeguards. Perhaps, but the record on this point alone is sufficient to warrant caution about scientific work whose procedures and costs are justified or demanded by some allegedly urgent public good or policy priority.
Second, the interests of political elites and institutional actors within modern states are demonstrably not identical to the public good in all or even most instances, and those actors have a history in their own right of delivering policies which subsequently prove to have unintended, uneven, self-interested or destructive effects. When scientific knowledge gets caught up in that process, it becomes by definition less trustworthy, or more worthy of skepticism, than research which is not strongly directed towards justifying political or bureaucratic decisions. Add to this the intrusion of businesses and other private institutions with a strong interest in the production (or suppression) of particular kinds of scientific knowledge relevant to the making of public policy. A historical perspective quickly demonstrates that many claims imbued with the authority of science and deployed in service to policy have had powerful consequences but a very weak relationship to scientific truth. Anyone who has lived through the last fifty years can remember a great many things which public officials and influential scientists told the public to do or believe that were not just wrong but primarily served the self-interest of government, private industry or research institutions. It is completely rational to recall this evidence every single time that science and policy intersect.
Third, more and more studies today suggest that a good deal of scientific research, both that which concerns policy-making and that which does not, is covertly or subconsciously manipulated to produce results which just barely cross the threshold of statistical significance or otherwise qualify as “legitimate” findings. What this suggests is less that many researchers (both scientists and social scientists) operate in conscious bad faith and more that an underlying system of incentives pushes research communities towards the entrepreneurial overproduction of unnecessary or marginal knowledge. This has particular implications for the intersection of policy and science because the same incentives govern the overstatement of results, and the simplification of their interpretation, to suit policy formation and public debate. As a result, attempts to apply or deploy research findings in policy formation are often drastically premature, or mismatch the expense and difficulty of a policy to the strength of the findings behind it. Moreover, the rhetoric of scientific truth is often used in such cases to push aside more complex or humanistic ethical and practical objections to a particular finding and its application. Again, the historical record of the last fifty years in the United States and Western Europe is replete with expensive or drastic public policies adopted on the strength of thin or tentative findings which were easily contradicted or reversed by later research. And so again, presumptive skepticism towards scientific and social-scientific claims that inform or demand policy implementation is justified by evidence, not by underlying, intrinsic cognitive orientations.
Mooney’s essay addresses a lack of belief in the findings of fundamental or basic science. You could argue that this lack of belief is not justified by the evidence I’ve described above, that basic science should be subjected to the ordinary skepticism demanded by the scientific method but not judged against some particular historical propensity for error. Indeed, quite the opposite, given basic science’s strong record of continuous, progressive improvement in the quality and depth of its understanding of the universe. The problem is that scientists operating in this domain rarely take pains to distinguish themselves from science which is claimed by policy-makers, or which claims to have found concrete solutions to real-world problems. Nor are many scientists particularly eager to acknowledge the sociology, politics or history of science as having any relationship to their research, whether pure or applied. Instead, many would rather do what Mooney offers: use science to explain away even the critique or suspicion of science as definitionally extra-rational, and consign any actual engagement with that popular skepticism to humanists who wallow in the rhetorical and discursive to begin with.