Science is uncertain; and yet we have no better basis for making important decisions about the future than the best scientific knowledge currently available. Moreover, there are powerful economic interests that exert themselves to undermine the confidence of the public and our policy makers in the findings of science that appear to harm those interests. How should we think about these two factors, one epistemic and the other political? The first lays out the reasons for thinking that some of our most confident theories may in fact be erroneous; the second makes us worry that even strongly credible science will be undermined by corporate and financial interests.
Naomi Oreskes and Erik Conway explore the second, political dynamic in substantial detail in *Merchants of Doubt*. And Henry Pollack, a noted and respected climate scientist, explores the implications of the first, epistemic point in *Uncertain Science … Uncertain World*.
Oreskes’ work on the politics and methods of science denial is substantial and convincing. She is an historian of science, and she has carefully traced the pathways through which business interests have exerted themselves to affect the outcome of a range of scientific debates: for example, the harmful effects of tobacco, the causes of acid rain, the reality of the ozone hole, and the reality of global warming. She traces the influence that conservative think tanks and corporations have had on the scientific debates over these issues. More than that, she demonstrates that a small number of conservative nuclear scientists have played a key and recurring role in drumming up spurious attacks on the scientific credentials of researchers in a number of these fields.
Call it the “Tobacco Strategy.” Its target was science, and so it relied heavily on scientists— with guidance from industry lawyers and public relations experts— willing to hold the rifle and pull the trigger. Among the multitude of documents we found in writing this book were Bad Science: A Resource Book— a how-to handbook for fact fighters, providing example after example of successful strategies for undermining science, and a list of experts with scientific credentials available to comment on any issue about which a think tank or corporation needed a negative sound bite. (kl 170)
Here is what the tobacco strategy looked like in 1979 in the hands of tobacco corporation R. J. Reynolds, in the words of Colin Stokes, former chairman of R. J. Reynolds:
“Science really knows little about the causes or development mechanisms of chronic degenerative diseases imputed to cigarettes,” Stokes went on, “including lung cancer, emphysema, and cardiovascular disorders.” Many of the attacks against smoking were based on studies that were either “incomplete or … relied on dubious methods or hypotheses and faulty interpretations.” The new program would supply new data, new hypotheses, and new interpretations to develop “a strong body of scientific data or opinion in defense of the product.” Above all, it would supply witnesses. (kl 316)
The purpose of this strategy was clear to its creators:
The industry’s position was that there was “no proof” that tobacco was bad, and they fostered that position by manufacturing a “debate,” convincing the mass media that responsible journalists had an obligation to present “both sides” of it. Representatives of the Tobacco Industry Research Committee met with staff at Time, Newsweek, U.S. News and World Report, BusinessWeek, Life, and Reader’s Digest, including men and women at the very top of the American media industry. (kl 403)
Oreskes and her colleagues make a very worrisome case that good scientific research on controversial issues is likely to be drowned out by money and astute public relations strategies deployed by self-interested corporations. And this dynamic has potentially devastating consequences for public health and our global future, if the public and our policy makers succumb to this attack on science.
The attack on the scientific legitimacy and credentials of climate science is of equal concern to Henry Pollack. Pollack honestly acknowledges the uncertainty that is characteristic of every area of science. But he strongly defends the rational confidence we can have in the results of empirical and scientific inquiry into the major natural and social processes that surround us. Here are his four key ideas:
- Uncertainty is always with us and can never be fully eliminated from our lives, either individually or collectively as a society. Our understanding of the past and our anticipation of the future will always be obscured by uncertainty.
- Because uncertainty never disappears, decisions about the future, big and small, must always be made in the absence of certainty. Waiting until uncertainty is eliminated before making decisions is an implicit endorsement of the status quo, and often an excuse for maintaining it.
- Predicting the long-term future is a perilous business, and seldom do the predictions fall very close to reality. As the future unfolds, ‘mid-course corrections’ can be made that take into account new information and new developments.
- Uncertainty, far from being a barrier to progress, is actually a strong stimulus for, and an important ingredient of, creativity. (2-3)
Pollack urges the public and our legislators to take the time to understand the nature of the scientific enterprise more fully and to inoculate themselves against self-interested efforts to undermine the enterprise and its core findings on controversial subjects.
Now consider a third perspective on this topic of the reliability and vulnerability of science, the point of view associated with Science and Technology Studies (STS) and the Sociology of Scientific Knowledge (SSK). A good exemplar of this approach is Harry Collins and Trevor Pinch's *The Golem at Large*. By the “golem” they mean that science, like almost any other human activity, is two-sided in its effects on human wellbeing. So they are as interested in the failures of technology and science as in the successes. In this volume they investigate cases of technological success and failure, including the effectiveness of Patriot missile defense systems in the Gulf War, the causes of the Challenger explosion, the effects of the Chernobyl radiation plume on Cumbrian sheep, tests of nuclear fuel flasks in the 1980s, and several other interesting cases. In their own way their message is similar to Pollack's: science and technology involve investigations, inferences, and manipulations that are inherently fallible. And yet there is no better basis on which to assess risky alternatives and solutions.
One of the signature themes of STS and SSK is attention to the non-rational and political factors that influence the conduct of science. Philosophers of science often focus on the positive ability of science to gain truths about the world. STS scholars, in contrast, are often inclined to bracket the objectivity and veridicality of science, and to focus instead on the multiple social processes that influence the development of a body of scientific thought. This leads to an interpretation of science along the lines of a “social construction” model.
Pragmatism seems to point towards the most plausible view of scientific knowledge, one that incorporates both perspectives. Nothing in the methods or practices of science guarantees success. But we have a capacity to observe, theorize, measure, and test; and these abilities are crucial to our human ability to navigate an uncertain world. So we should look at the institutions and findings of science much as pragmatists like Israel Scheffler and W. V. O. Quine did: as imperfect but valuable tools on the basis of which to learn some of the more important properties and dynamics of the world around us.
In the current context this means we should pay a lot of attention indeed to the convergence of evidence about climate change that environmental and climate scientists have painstakingly arrived at. And we should be vigilant in uncovering the secretive efforts in play to undermine those findings.