Chapter 29
ENVIRONMENTAL
POLICY AND SCIENCE
william ascher and
toddi a. steelman
1. The Problems of Science and
Environmental Policy
Few would dispute that we are faced with pressing environmental challenges.
Oceans are under assault, our climate is changing in unprecedented ways, and biodiversity loss is at historic levels. Science clearly has an important role to play in
helping us understand these problems and what we might do about them. After
all, who could be against science-based decision making or using the best available
science? Despite its appeal, however, enormous controversy surrounds the use of
science in environmental policy.
Consider, for instance, the efforts made throughout the 2000s, not only in the United States but in other countries as well, to place science at the heart of
decision making. The European Union (Holmes and Savgard 2008), United Kingdom
(Holmes and Clark 2008), Australia (Marston and Watts 2003), Canada (Bielak et al.
2008), and New Zealand (Parliamentary Commissioner for the Environment 2004)
pushed reforms so that policy would be driven by science and evidence of what is
effective instead of ideologically driven politics (Holmes and Clark 2008; Nutley
2003). “Data rather than dogma,” it was argued, is needed to shape policy (Shaxon
2005). These moves mirrored efforts advanced in the United States by the Barack H.
Obama administration in 2009. Under the George W. Bush administration (2001–
2008), science was believed to have played “second fiddle” to politics (Becker and
Gellman 2007; Grifo 2007; Union of Concerned Scientists 2008). Shortly after taking office, Obama issued a memorandum on the importance of restoring scientific integrity to White House decision making. In December 2010 this memorandum was followed by a directive outlining the administration’s formal policy on “the preservation and promotion of scientific integrity” in policy making (Holdren 2010, 1).
Understanding the complex relationship between science and environmental
policy is crucial for bringing the broadest range of sound science and related considerations to policy makers. In this chapter, we cover the intellectual traditions
that have shaped how we view science, current controversies related to science and
environmental policy, and some alternatives to deal with the complexities of science and environmental policy making. We suggest along the way that science
ought to be seen as one form of knowledge that can inform policy and that understanding the role of science and these other types of knowledge within the distinct
phases of the policy process can provide greater insight into how we might think
about and use science more constructively.
We touch only on the broadest themes due to space considerations. These
themes point to a larger research agenda dealing with the complex relationship
between science and policy. To address this complex relationship, future research
on the roles of knowledge in the environmental policy process ought to focus on
how to uncover the largely unexamined biases embedded in how science is currently incorporated into the policy process. Research should also be devoted to
refining the criteria for evaluating the generation, transmission, and use of science, by examining case studies to determine what is most important for a healthy
relationship between science and policy. In a more prescriptive vein, research
should explore how to overcome the obstacles to providing policy makers with a
more balanced set of considerations for their deliberations. Finally, we suggest that
case studies could be constructive ways to shed light on how and why knowledge
hybrids succeed and fail.
We offer our own outlook, focusing on the combination of technical and political
considerations responsible for privileging particular types of knowledge over others
and consequently privileging, often unfairly, the considerations conveyed by those
types of knowledge. Our premise is that technical constraints, institutional interests,
and broad value commitments all shape the effective criteria that elevate particular
efforts and outcomes of knowledge processes over other knowledge elements.
2. Intellectual Traditions in Science
and Environmental Policy
A well-developed, lengthy discourse has shaped current debates about the role
of science in environmental policy, and several threads are relevant to current
controversies. We elaborate on five of these threads and demonstrate how they
weave together to create a complicated tapestry for interpreting how science relates
to policy.
While the early intellectual history of science and policy could be traced back
to Aristotle, Sir Francis Bacon, and René Descartes, the intellectual traditions
most relevant for our purposes start with the Progressive movement in the late
nineteenth century, with its faith in a technocratic approach to management efficiency. Comprehensive rationality and objective technical approaches were proposed as the solution to the pervasive influence of corruptive local politics of the
time (Hays 1959). These beliefs shaped the nascent institutions that came to govern
natural resources management agencies. Over time, these Progressive assumptions
extended to the environmental and health agencies as well.
The second intellectual thread that influences current beliefs about science is
positivism. The hallmarks of positivist knowledge are that “facts can be pursued
independently of values, only information subject to empirical verification counts
as fact, and the goal of policy research is to discover universal laws that can then be
applied to all policy problems” (Shapiro and Schroeder 2008, 1). Positivists believe
that there is an objective, knowable reality that can be revealed through the scientific process. Karl Popper (1968) advocated the idea that empirical falsification
and rigorous hypothesis testing led by deductive reasoning was a better alternative to knowledge generation than an observational, inductive approach. Facts are
separable from values, according to positivists; consequently, it is important for
researchers to be neutral or unbiased in how they conduct research.
A belief in a linear model of how science influences policy makes up the
third thread. In the postwar period, Vannevar Bush, an engineer and the first
presidential science adviser, reemphasized the role of science and how it could
contribute to the betterment of society in the twentieth century. Based on his
experience leading some 6,000 scientists during World War II to leverage the
application of science to warfare, Bush wanted to see the same structure applied
to peacetime activities (Zachary 1997). With the publication of Science: The
Endless Frontier in 1945, Bush advocated the creation of the National Science
Foundation and laid out a process—a fundamental social contract—by which
basic science would inform policy making. The assumptions that drove Bush
included a belief that knowledge passed from basic research to applied research
and that the benefits ultimately trickled down to society (Byerly and Pielke 1995).
This linear model implied that science could inform policy and benefit society
simply by unearthing truth, setting it free, and allowing it to wind its way into
practical applications.
A fourth thread challenges the linear-model view and calls for a new relationship between scientists and society. These critiques question the practical
relevance of the linear-rational model whereby science leads directly to policy influence (Brunner and Ascher 1992; Funtowicz and Ravetz 2001; Owens, Petts, and
Bulkeley 2006). Some have called for the social contract for science outlined by
Bush in 1945 to be replaced with a new social contract (Byerly and Pielke 1995;
Lubchenko 1998; Stokes 1997). This new social contract would have scientists serve
society directly by encouraging them to be more responsive to pressing societal
problems, rather than work on basic, purely theoretical problems defined by scientists in the absence of greater input from the nonscientific community. In other
words, the linearity of the model would be changed to have society influence science and have scientists work more collaboratively with those whom science ultimately serves.
The final thread in this intellectual tapestry involves science as social construction. Conventional framing suggests that politics is separate from science.
The academic field known as Science and Technology Studies (STS) questions
this view. Scientists are not godlike generators of pure knowledge. Rather, they
are subjected to a variety of influences and biases that affect what they produce
and how they produce it. In the 1960s and 1970s, STS scholars characterized science as a social institution (Kuhn 1962; Latour and Woolgar 1979; Merton 1973;
Nelkin 1979; Ziman 1978). Challenges to the objectivity and neutrality of science
reemerged in the 1990s (Beck 1992; Fischer 1990, 2000; Forsythe 2002; Jasanoff
and Wynne 1998; Jasanoff 1990, 1995, 1996, 2003; Latour 1987; Wynne 1996). Even
if some scientists conceive of their role as providing values-free analysis, there are
fundamental limitations in the concept of scientists’ neutrality. Scientists are part
of a process in which they interact with political actors and respond to institutional
interests and forces. These institutional interests are not necessarily selfish—they
include strengthening the respect for the overall scientific enterprise, filtering out
what scientists regard as “questionable knowledge,” and securing more funding
for what the scientists view as compelling. Yet their actions, either as individuals
or within scientific institutions, shape what values will be furthered by the science
that emerges from these processes.
This history of thought creates a complex context into which the rest of
this chapter is situated. The first three threads are mutually reinforcing, while
the last two threads challenge this prevailing view. Progressive beliefs resulted
in the creation of long-lasting institutions that emphasized the bureaucratic
ideal of efficiency guided by expertise. Often these institutions excluded nonexperts. Faith in positivism reinforced the status of scientific expertise within
these institutions and within society at large. Science was generated using clear
protocols that privileged it above other types of knowledge. Finally, the linear
model of how science influenced society, and hence policy, maintained a clear
distinction between science and society and the status of scientists in driving
this process. Challenging this cumulative view of science, STS scholars question
positivistic beliefs that the process and outputs of science are objective and unbiased and call for new institutions that can readily accommodate other types of
knowledge or information that is equally legitimate for informing policy. Those
who critique the linear-rational model call for new institutional structures that
reorder how science could play a role in influencing the scientific process and
outputs to better serve society at large. Our own beliefs about science and policy
align more closely with these last two threads of thought than the first three.
3. Controversies and Shortcomings
Related to Science and Environmental
Policy
Science has not always been able to deliver what it has promised, leading to disappointment over what science can do for environmental policy. These shortcomings,
on which we elaborate below, are indicative of long-standing controversies related
to the use of science. These controversies include the politicization of science and
the inherent challenges associated with complexity and uncertainty.
3.1. Politicization of Science
Science—that is, the science of conventionally credentialed scientists operating within
the mainstream scientific protocols—is widely seen as the source of rationality and
efficiency. In contrast, politics is often seen as, at best, a necessary evil, leading to
inferior choices and delay. However, politics infuses all aspects of these controversies, because the boundary between science and politics is permeable and indistinct
throughout the policy process. Scientific knowledge is used politically; and science
itself is often driven by institutional interests and other manifestations of politics.
The politicization of science highlights several shortcomings associated with
science and policy. First, whose science do we use as “the best available science”?
Different sides in a debate may leverage their own science to advance their own
position (Service 2003). Second, there may be legitimate disagreements about what
“the best available science” is, as in the cases of the debates over the health risks of
the plastics additive bisphenol A (Ascher, Steelman, and Healy 2010, 145–152) and
the environmental risks of genetically engineered grains (Sarewitz 2004). In both
of these cases, scientists interpret data differently. Contextual differences between
studies can produce inconsistencies in data. What may constitute the best science in one place may not be the best in another. These differences
can be used to exploit the process for political ends (Ascher 2004; Davis 2007). For
example, the high-stakes game of continuing the development of Yucca Mountain
as the central repository for spent nuclear fuel has been fought in large part over
different scientific approaches to address the technical question of whether and
how precipitation would percolate through the rock shielding the nuclear waste
buried under the mountain (Metlay 2000). Third, different sides in a debate may
disagree over the levels of certainty or uncertainty associated with data. Scientists
and politicians have different levels of tolerance for uncertainty (Pouyat 1999). As in the case of global climate change, decision makers may exploit uncertainty to delay a decision, which inherently advantages the side that would prefer
inaction on an issue. In all of these instances, science is marginalized or devalued
because it is subjected to unrealistic expectations.
In addition to these challenges, there is a highly optimistic outlook that science leads to greater consensus among stakeholders; that greater certainty about trends, conditions, and the potential impacts of alternative policies clarifies the common interest; and that sound science discredits special-interest proposals based on unsound
knowledge (Sarewitz 2004). While in some cases additional scientific knowledge
can create consensus, as in the Montreal Protocol on chlorofluorocarbons in the
1980s, we should calibrate our optimism about what science can do. More and better
science alone does not necessarily lead to improved environmental decision making. New knowledge does not necessarily induce contending groups to converge
toward a science-based agreement. The gulf between European and American perspectives on whether to permit genetically engineered organisms into agricultural
production and food consumption has persisted despite large amounts of research
on health and environmental risks. Despite the expenditure of $40 billion on climate change research in the United States since the 1990s, there is arguably less consensus around the topic now than in previous decades. As long as people
have different goals, additional scientific information that clarifies the stakes and
who the winners and losers will be for each policy option may consequently reinforce value disputes and competing interests rather than harmonize them (Healy
and Ascher 1995).
Sabatier (1988) focuses on the interpretations and differential acceptability of
the content of science, according to whether data conform to the positions and preferences of each “advocacy coalition” involved in the environmental policy debates.
The premise is that information that runs counter to what a particular group wants
to believe will be rejected, unless and until the evidence supporting the data is
extremely strong. Thus, the reception of science is politicized. Sabatier’s outlook
also reinforces the skepticism that additional science, even if sound, would lead
to the convergence of policy views. The science on a host of issues, ranging from
genetically engineered organisms and childhood vaccinations to nuclear risks and
global climate change, encounters deeply entrenched skepticism born of suspicion
of political or economic motivations.
In fact, politics, as contestation over the outcomes that a society generates and
how they will be shared, is inevitable; fair politics that permits the full range of
interests to be taken into account is an essential mechanism required to pursue
the public interest, as is science. Sound environmental policy—based on constructive public involvement, balanced weighing of interests, relevant science, pertinent local information, and fair mechanisms of conflict resolution—can serve to
clarify and secure the common interest. Yet unless a constructive and open synthesis of science and politics can be established, political factors will continue to
be seen as illegitimate in orienting science, or will be embedded in implicit forms
that distort the roles or functions that science can play. In contrast, science and
politics have been merged constructively in a number of cases: the work of the
Intergovernmental Panel on Climate Change has recently brought more coherence
to the deliberations over cutting greenhouse gas emissions; the science shops in
Denmark have been able to provide local communities with contextually relevant
science in deliberations over water pollution, organic farming, and energy-saving
transportation (Brodersen, Jorgenson, and Hansen 2006).
3.2. Inherent Challenges of Complexity and Uncertainty
A key belief in the science-policy process is that the role of science is to reduce
uncertainty and, once uncertainty is addressed, policy solutions will easily follow
(Sarewitz 2004). However, two challenges to this view highlight the limitations of
science to fulfill these expectations.
First, complexity itself is a barrier to resolving uncertainty. Environmental
problems are problems of socio-ecological systems, and the sheer complexity of
socio-ecological systems calls into question the ability of science to fully reduce the uncertainties associated with understanding them (Gallopín 2004). Complex natural and social systems inescapably involve uncertainty because they are coevolving systems (Gunderson 1999). Surprises are inevitable. Ecological theory has
changed since the 1960s to recognize the randomness of events that cause unique
path dependencies rather than tending toward more uniform climax states (Botkin
1990; Gunderson and Holling 2002; Holling 1978). Under these ever-changing conditions, more scientific information will not necessarily lead to better understanding as the system continues to evolve. Confidence in science may even contribute
to a misguided sense of certainty in such dynamic systems. The main point in
appreciating the role of complexity is that there is a big difference between the
view that nature is not in equilibrium and is inherently unpredictable and the view
that science is just difficult, and if we know enough we can clarify uncertainties
and move forward (Berkes, Colding, and Folke 2003). As complexity increases, our
confidence in what we know ought to be more modest.
Second, the political implication of the dynamics of complexity and uncertainty is that science may raise new questions, making agreement less likely, rather
than more likely (Rayner 2006; Sarewitz 2004). In the protracted debate over the
siting of the high-level radioactive waste site at Yucca Mountain, Nevada, more
science actually led to greater disagreement over whether and how to store radioactive waste at this location (Sarewitz 2004). Additional scientific studies may aggravate rather than alleviate problems in environmental policy making. More science
means what, exactly?
These limitations of science related to complexity and uncertainty rest in part
with societal expectations about science. Gallopín et al. (2001) question the “existing rules of enquiry” of conventional science. Importantly, they do not lay all the
blame on the scientists, in that they argue that society has thus far not questioned
the inability of what they call the “classical science project” of conventional, nonintegrative science to address complex socio-ecological interactions. They assert that the prevailing proscience ideology interprets shortcomings as limitations in the application of science rather than in science itself. The presumption on behalf of society is “that more knowledge
will reduce uncertainties, increase capacity for control, and permit the remedying
of past mistakes” (Gallopín et al. 2001, 227). As such, Gallopín et al. (2001) suggest
that it is important to separate the nonapplication or misapplication of what science can do from the need to actually change the processes of science.
Despite the challenges that complexity and uncertainty impose on science,
Sarewitz (2004, 393) suggests that “uncertainty about facts need not be an impediment to political resolution of heated controversy.” Scientific uncertainty does not
have to be resolved before taking action, because no amount of data or theory can
eliminate all uncertainty. Policy can and does proceed under uncertainty. Given the
complexities inherent in environmental policy, uncertainty will always be prevalent.
The real challenge is in how to proceed in light of these inevitable uncertainties.
4. Knowledge in the Environmental
Policy Process
Given the significant challenges posed in the previous section, how do we proceed
to make sense of science and environmental policy? To make sense of this complexity, it is useful to make two distinctions. First, environmental policy making is
potentially responsive to three broad knowledge types, not just conventional science
(Ascher, Steelman, and Healy 2010). Conventional science, as one subset of knowledge, includes biophysical data as well as effects on human health, ecosystem health,
and natural resources. It is incomplete unless it also includes understanding of
social, political, and economic aspects of human behavior. For example, predictions
of pollution levels that would result from a more stringent regulation depend not
only on the biophysical changes that would occur if affected actors comply with the
regulations, but also on the economic and social factors that determine the degree
of compliance. As broad as this seems to be, other types of knowledge are also crucial in informing policy. These additional knowledge types include local knowledge
and public preferences. Local knowledge includes information and insights about
contextual and practical environmental conditions, insights into environmental
patterns and management derived from practice, and awareness of political relationships within a community or patterns of interaction. Knowing whether the
migratory waterfowl passing through a particular locality use the natural ponds or
reservoirs may be critical for deciding how much water to release from dams during
droughts, yet this knowledge cannot be found in scientific journals. Public preferences, beliefs, and priorities provide insight into various individuals’ or groups’
support of or opposition to a given position or outcome. For example, the public’s
concern over nuclear reactor meltdowns has been a crucial factor in the halt to expanding nuclear energy in the United States following Three Mile Island, Chernobyl, and Fukushima Daiichi, regardless of whether the likely number of deaths
from coal-fired electricity generation is far greater.1 Policy makers need to be exposed to all three
of these forms of knowledge about how natural systems work, how people interact
with ecosystems, what the citizenry prefers in terms of policies and outcomes, and
what effects conceivable policy alternatives would have.
Second, conceptualizing how knowledge is influenced in different phases of
the policy process provides greater opportunities for understanding the limitations
and opportunities for science, as well as other types of knowledge. The straightforward distinction among the knowledge processes of generation, transmission, and use is helpful in this respect (Ascher, Steelman, and Healy 2010). Knowledge generation involves six aspects: gathering, interpreting, theorizing, modeling, endorsing, and synthesizing information and insights. Knowledge transmission may take place through a variety of pathways, including the scientific literature, the minds of
experts, or those who support or oppose a given policy. Knowledge may be transmitted through an equally diverse set of channels—technical journal articles, popular media, congressional testimony, or word of mouth at the hairdresser or local
barber shop. The uses of knowledge are also varied, from bringing an issue onto the
active policy agenda to helping formulate policy options to choosing and enacting
or evaluating a particular policy. Each of these generation, transmission, and use
processes entails distinctive actors, technical challenges, and political issues.
In one respect, generation, transmission, and use comprise a linear sequence
in that one can trace any element of generated knowledge to determine whether
it is or is not transmitted, and whether or not the transmitted knowledge is used.
However, this straightforward analytic distinction among the knowledge processes disguises the processes’ intricate interactions, which are by no means simply sequential. This “ecology” of knowledge processes begins with the fact that
the transmission of particular knowledge elements tends to enhance their standing, often encouraging support for those who created them: transmission not only
transforms and gives additional meaning to generated knowledge, but also stimulates knowledge generation. By the same token, the use of knowledge can stimulate
knowledge generation and transmission through both demand and rewards: by
the gratification that knowledge generators derive from knowing that their work is
heeded, by the grants and contracts that government agencies or other interested
parties provide to generate knowledge, and by the requirements for information
and analysis that new laws and regulations impose.
The twin conceptual lenses of knowledge (science, local knowledge, and public
preferences) and the policy process (generation, transmission, and use) provide
insight into further dysfunctions related to science and policy.
• Some forms of knowledge are inappropriately privileged, at the expense of
other forms.
• Local, practical knowledge is frequently given short shrift in favor of more
abstract science developed in laboratories, simulation modeling efforts,
1. See Dunlap, Kraft, and Rosa 1993 for analyses of public attitudes and their importance.
or straightforward theorizing. The disastrous water management of the
Klamath River Basin, stemming from the neglect of farmers’ long experience with rain and flooding cycles related to the endangered suckerfish,
is a sad case in point. The ostensibly scientific “biological opinions” of the
U.S. Fish and Wildlife Service disregarded decades of experience of the
region’s farmers, leading to years’ worth of battle over the region’s resources
that was hard on the farmers, Native Americans, and endangered species
alike (Brunner and Steelman 2005).
• Public preferences, an obviously important form of knowledge in democratic systems, are frequently poorly incorporated into planning and policy
making. For example, although the U.S. National Forest Management Act
requires public input, U.S. Forest Service officials have minimal guidance for using this input (Steelman and Ascher 1997).
• Progress in formulating environmental regulations is sometimes stymied
by “quality-control” watchdogs, such as the U.S. Office of Information and
Regulatory Affairs (OIRA), through their claims that analysis about the
benefits of environmental protection and conservation fails the standards
of rigor. This became a highly politicized issue during the George W. Bush
administration, as critics of OIRA accused it of using its authority as part
of an antiregulation campaign by unjustifiably claiming that regulatory
impact analyses were not of high enough quality to proceed through the
approval process. OIRA officials claimed that they were merely upholding
scientific standards (Adams 2002, 525; Heinzerling 2006).
• Whereas in some nations the suspicion of environmental risks is enough
to prompt regulation in keeping with some version of the Precautionary
Principle, in the United States the invocation of uncertainty of knowledge
often allows potentially hazardous conditions to persist (Vig and Faure
2004). Many examples can be found beyond the well-known tactic of delaying action on global climate change on the absurd grounds that it is better
to wait until all uncertainty is resolved. For instance, the delay in the U.S.
government’s actions to regulate the chemical BPA, a known endocrine
disruptor present in a wide range of plastics, reflects the ease with which
the opponents of regulation can paralyze progress by invoking scientific
uncertainty. While the Canadian government banned BPA, the governments of the United States, Germany, and Japan simply called for more
studies, and the U.S. chemical industry association denounced all the studies linking BPA to the potential for health problems, including a report by
an expert panel convened by the Center for the Evaluation of Risks to Human Reproduction of the National Institute of Environmental Health Sciences.
• Existing data-gathering efforts are often slow to give way to gathering
newly relevant data. For instance, wildfire managers continue to collect
data on the number of acres treated to reduce wildfire threats, even when those acres are in wilderness areas too remote to affect human communities. The more relevant data would be the number of acres
treated near human communities, where the need to mitigate the effects of catastrophic wildfires is greatest (Steelman and Burke 2007). Obsolete
or otherwise questionable knowledge sometimes takes on iconic status,
insulating questionable understandings from proper scrutiny. The severely
flawed research on the so-called resource curse published in the mid-1990s
continues to be cited as gospel (Banks 2005; Bravo-Ortega and De Gregorio
2006; Matsen and Torvik 2005). The outdated early-1990s estimates of benefits accruing from better water quality were still used in assessing these
benefits in the mid-2000s (U.S. Environmental Protection Agency 2009).
• Existing laws, regulations, and executive orders often restrict the decision-aiding analysis that is generated, transmitted, or used in environmental
policy making. For example, although presidential executive orders require that formal benefit-cost analysis be conducted in assessing air quality and water quality regulations, the U.S. Clean Air and Clean Water Acts forbid the explicit application of benefit-cost analysis in determining the outcome. While benefit-cost analysis can add to the understanding of the stakes, these analyses are often incomplete. Professional and institutional disincentives often limit analysts’ ability to conduct a comprehensive analysis. The benefits that are the most difficult to monetize are often underinvestigated because pursuing them may expose the analyst to professional ridicule. Consequently, the benefits that are easy to monetize, and thus most professionally defensible, are the most likely to be included. The result is that this additional information is considered, even though by law it is prohibited from determining the outcome, and because it understates benefits it more than likely reduces the impetus for stronger environmental regulations in future rounds of rule making (a stylized numerical sketch follows this list). By the same token,
the 1973 U.S. Endangered Species Act narrows data collection to only the
listed species and their habitats, excluding other relevant information. This
is an impediment to adopting policies that would protect land containing
not only the target species, but others, listed and unlisted, as well.
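As flagged above, a minimal numerical sketch of the incomplete benefit-cost problem. The dollar figures and benefit categories below are entirely hypothetical placeholders, invented only to make the logic concrete.

```python
# Hypothetical benefit-cost sketch: all figures are illustrative only.
# Annualized costs and benefits (in millions of dollars) of a proposed rule.
control_costs = 120.0

# Benefits that are easy to monetize (e.g., avoided treatment costs).
easy_to_monetize = {"avoided_health_costs": 60.0, "avoided_material_damage": 25.0}

# Benefits that are hard to monetize (e.g., aesthetics, ecosystem persistence);
# these are the categories analysts tend to leave out.
hard_to_monetize = {"aesthetics": 30.0, "ecosystem_persistence": 45.0}

partial_benefits = sum(easy_to_monetize.values())                   # 85.0
full_benefits = partial_benefits + sum(hard_to_monetize.values())   # 160.0

print(f"Net benefits, easy-to-monetize only: {partial_benefits - control_costs:+.1f}")  # -35.0
print(f"Net benefits, all categories:        {full_benefits - control_costs:+.1f}")     # +40.0
```

Counting only the defensibly monetized categories, the rule appears to fail a benefit-cost test; counting the harder-to-monetize categories reverses the conclusion.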
When laws or regulations formally prohibit the consideration of relevant
knowledge, these laws or regulations may be subverted to adjust decisions to take
the knowledge into account. For example, when the operation of the $100 million Tellico Dam in Tennessee was stalled because of the potential extinction of
the snail darter fish, Congress amended the Endangered Species Act to permit
a cabinet-level committee to decide whether an exemption should be permitted;
when the committee denied the exemption, Congress gave the dam the go-ahead
anyway (Wheeler and McDonald 1986).
The potential of sound knowledge to reveal the weaknesses of unsound proposals pushed by special interests is frequently compromised by uncertainty about
what criteria ought to be applied to judge the quality of knowledge inputs. In the
United Kingdom, local sheep herders’ concerns about the Chernobyl disaster’s
impact on their flocks were ignored because they “merely” had local historical
knowledge of the soils where radioactive disposition landed. Scientists’ confidence
in their models led to the loss of entire herds of sheep (Wynne 1996). U.S. nuclear
policy has foundered in no small part because federal officials dismissed
public anxieties over nuclear risks as scientifically naive, although the key political
fact has been the anxieties themselves (Dunlap, Kraft, and Rosa 1993).
5. Privileged Knowledge
A broad conclusion covering all three of the knowledge processes and knowledge
types is the abundance of possibilities of privileging some forms of knowledge over
others. This often creates self-reinforcing patterns that shape the very processes
by which future knowledge is produced and applied. Interactive effects among the
generation, transmission, and use of knowledge result in changes in the institutions
that shape policy making. The adverse consequence is that important knowledge,
like local knowledge and public preferences, receives insufficient standing because
certain methods of analysis and certain types of scientists are privileged over others. The relative power and status of certain participants in the policy-making
process are elevated, while others are diminished. For instance, misconceptions
about what constitutes “scientific rigor” often elevate quantification over genuine relevance (Majone 1989). Easily measurable aspects that can be captured with
apparent rigor are privileged over less tangible aspects of equal or greater importance. For example, estimates of the economic costs of stringent environmental
protection are typically regarded as straightforward and reliable. In contrast, estimates of less tangible environmental amenities such as aesthetics and ecosystem
persistence are typically regarded as shakier, and, therefore, are often deemphasized in the benefit-cost analyses conducted by environmental agencies such as
the U.S. Environmental Protection Agency. These same biases are prevalent given
the reputation for rigor of simulation models, regression analysis, and controlled
laboratory experiments. Simulation models may have hundreds of equations, but
they cannot represent anything beyond the modeler’s theoretical understanding of
the dynamics of the modeled systems, with parameters that are estimated on the
basis of past data that may or may not be relevant in projecting future trends. By
the same token, regression models used to assess the value of environmental amenities are vulnerable to model mis-specification, omitted variables, and inaccurate
or missing data. Laboratory experiments appear rigorous by virtue of their capacity to include controls, but their correspondence with real behavior is beyond the
ability of the experiments to confirm. Despite these vulnerabilities, such methods
are typically presumed to be rigorous and reliable.
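To make the omitted-variable point concrete, here is a small simulation sketch. The data-generating process, coefficient values, and variable names are invented purely for illustration; they are not drawn from any actual valuation study.

```python
# Sketch of omitted-variable bias in a regression-based amenity valuation.
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Suppose home prices respond to both air quality and neighborhood income,
# and the two are positively correlated (all relationships are assumed).
income = rng.normal(size=n)
air_quality = 0.6 * income + rng.normal(size=n)
price = 2.0 * air_quality + 3.0 * income + rng.normal(size=n)

# Full model: regress price on both air quality and income.
X_full = np.column_stack([np.ones(n), air_quality, income])
coef_full = np.linalg.lstsq(X_full, price, rcond=None)[0]

# Misspecified model: income is omitted, so air quality absorbs part of its effect.
X_short = np.column_stack([np.ones(n), air_quality])
coef_short = np.linalg.lstsq(X_short, price, rcond=None)[0]

print("air-quality coefficient, full model:     %.2f" % coef_full[1])   # near the true 2.0
print("air-quality coefficient, income omitted: %.2f" % coef_short[1])  # biased upward
```

The misspecified model looks just as precise on its face, yet it substantially overstates the amenity's value, which is the kind of vulnerability the presumption of rigor obscures.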
Previously generated knowledge is endorsed by the very act of using it, or even
simply citing it. This influences the standing of different elements of knowledge.
For example, as environmental agencies assess toxicity risks, they necessarily highlight some dosage-response studies over others. Going further, they may decide
to require reportage of a broader range of discharges, generating more knowledge
about toxic releases. They may also call for more laboratory studies, thereby stimulating (and perhaps funding) knowledge generation.
Institutional interests come into play in the efforts to secure higher standing
for particular knowledge generation efforts by particular institutions, such as university labs or independent think tanks, or by particular sets of people with specific credentials of expertise, such as people with doctorates in biology or economics.
Institutional interests are also reflected in the reliance on particular transmission
mechanisms, ranging from the highly technical, such as biochemical engineering
journals, to the highly popular, such as USA Today. They also play a role in efforts
to secure decision-making routines that take up particular types of knowledge,
ranging from the species-population inventories that drive the application of the
U.S. Endangered Species Act to the admissibility of expert evidence in the courts.
Broader value orientations shape the formulation of the laws, regulations, and
processes that use knowledge—and thereby create demand for particular forms
of knowledge. The most striking example is the range of principles that underlie
the most important U.S. environmental laws. The 1970 Clean Air Act and the 1972
Federal Water Pollution Control Act enshrine the rights of Americans to “clean” air
and water—though how clean is unclear. The Endangered Species Act indirectly
enshrines the rights of nonhuman species—though its rhetoric is couched in terms
of benefits for citizens. The National Forest Management Act calls for the systematic
incorporation of public input—though what to make of that input is also unclear.
The Clean Water Act requires the best available technology economically achievable to reduce toxic emissions, which entails balancing benefits and costs, but this is not a full societal benefit-cost analysis because the social benefits of controlling some discharges may still exceed control costs even if the controls are deemed not economically achievable.
Broad value orientations also shape how different forms of knowledge inputs
are received. For example, a high societal priority for wealth will predispose society to demand economic analyses. The risk-averse will favor the Precautionary
Principle and will be more responsive to knowledge about risks than about opportunities. Scientific inputs will be more acceptable to those who have high levels of
technological optimism.
In addition, assuming a distinct boundary between science and the expression
of policy preferences, with the implication that science must be given precedence,
leads to the neglect of other types of knowledge in decision making and filters out
important considerations in formulating environmental policy. People care about both their policy preferences and the particular processes used to arrive at these preferences (Sagoff 1994). Because of these filterings and attributions of standing that
occur in both generating and transmitting knowledge, the knowledge that policy
makers come to regard as reliable overly narrows the range of considerations that
they take into account. There are limitations on the capacity to convey the full
set of sound, relevant knowledge or to filter out unsound knowledge. On the one
hand, some limits to generated and transmitted knowledge are inevitable; on the
other hand, some unsound knowledge will inevitably escape the filter. In short, the
culture, institutions, and rules that shape the filtering of knowledge run the risk
of limiting comprehensiveness, jeopardizing dependability, and inappropriately
bounding selectivity for knowledge transmitted in policy making.
Yet we cannot regard these problems as arising from merely technical constraints. The filtering, attributions of standing, invocations of some knowledge
inputs over others, and the treatment of uncertainty all present opportunities and
vehicles for politics at several levels. Control over the generation and dissemination of knowledge is a key resource of the bureaucratic politics that pits one
agency against another for standing, jurisdiction, and budget. Consider the proposed building of a cement plant that pits a state-level Department of Commerce
against the Department of Environment and Natural Resources. The Commerce
Department will argue for the jobs the plant will bring and suggest that the state
provide incentives to locate the plant. The Environment and Natural Resources
Department will counter that mercury emissions will exceed regulatory standards
and that the state should encourage cleaner economic options. The Commerce
Department may present its own analysis of environmental impacts, find existing
studies with lower predictions of environmental damage, or cite the analysis generated by the cement industry. The “politics of expertise” is employed by agencies;
they play the card of expertise, given their respective perspectives in the broader
game of bureaucratic politics (Benveniste 1977; Fischer 1990, 2003).
6. Opportunities for Science, Politics,
and Policy
The intertwining of knowledge processes, science, policy making, and politics
should not be viewed as an alarming sign that science has been inappropriately
infiltrated by politics, or that the environmental policy process is incapable of
making the best of the various forms of knowledge (Brown 2009). Yet finding better ways to integrate science, politics, and policy requires exploring new relationships and new institutions.
The overarching thrusts of our specific recommendations are to
• Redefine the boundaries between science and politics. This includes moving beyond the misguided mind-set that conveying values undermines
knowledge. In doing so, we prescribe a greater role for alternative forms of
knowledge, including increasing the participation of stakeholders in shaping the generation, transmission, and use of knowledge. Counterintuitively,
this also means defending the integrity of science.
• Deal constructively with complexity and uncertainty. This includes confronting and overcoming the paralysis that often emerges from the recognition that environmental effects are subject to uncertainty.
• Improve knowledge integration, synthesis, and communication between
knowledge generators, transmitters, and users, especially policy makers.
This includes ensuring that a broad range of considerations can move
through the gauntlet of filtering to the policy makers, while still applying a
sufficient degree of quality control.
6.1. Redefining the Boundaries between
Science and Politics
We cannot depoliticize science, nor would we want to. The challenge is not to eradicate politics or interest groups from the process but, rather, to build constructive
roles for them. Current knowledge processes are dominantly structured to separate
politics from science—a futile effort because science is thoroughly enmeshed in
social and political practices. This is not to say that science is unimportant—it is
essential. The goal is to acknowledge openly the social and political influences on
science, while doing a better job of integrating the knowledge about the biophysical
world into the policy process. Additionally, we need to improve on the legitimacy
of all types of knowledge, including local knowledge and public preferences. These
alternative knowledge types are often unjustifiably regarded as flawed because of
their explicit social and political connection.
Our world is too complex for the policy-making process ever to have perfect, comprehensive knowledge. Since the universe of knowledge brought to bear on a policy issue is impossible to bound, we must be selective about which knowledge we choose to use. Being selective means we are employing biases, either implicitly or
explicitly. Given this reality, being clear about what influenced the generation and
transmission of the knowledge is crucial. We need to make science more transparent,
including clarifying the assumptions behind the generation, transmission, and use of
science. A whole range of institutional, professional, and personal interests influence
how knowledge is created, passed on, and used. Simply ignoring that these interests
exist may help perpetuate the myth of objective science and neutral scientists. Science
is not values-free. In the long run the effort to develop values-free knowledge generation is counterproductive if we want to restore confidence in science. As Rykiel (2001,
435) observes, “Scientific procedures are aimed at minimizing subjectivity, not at the
unattainable goal of eliminating it.” Awareness of values positions of scientists, other
knowledge generators, and knowledge users allows us to confront those biases that are
inappropriate. This will entail restructuring incentives and accountability structures throughout our social institutions, such as universities, peer-reviewed publications, and grant-making organizations and agencies; repositioning professional reward systems; and encouraging changes in personal behavior.
The environmental policy process requires more space for forms of knowledge
beyond conventional science, without undermining science. Science and the scientific process are already in a state of crisis. Increasingly, science as an enterprise is
castigated in debates ranging from climate change to BPA to the listing of endangered species. This creeping cynicism about the scientific process undermines its
long-term legitimacy among policy makers and the public alike.
How then do we defend the integrity of formal science? The Union of
Concerned Scientists has suggested several ways to restore the integrity of science.
These reforms include legislating whistle-blower rights, enforcing whistle-blower
protection, and making government more transparent. Specific policies include
giving the public better access to federal science, reforming agency media policies,
reforming the Freedom of Information Act, ending overclassification of documents
as secret, and disclosing and mitigating conflicts of interest (Union of Concerned
Scientists 2008).
Scientists should be allowed to be advocates, and in fact have to be. But this
requires transparency in the process. Consequently, it is important to partition the
work of science from the advocacy of that science. Science has an important role to
play in the policy-making process: to help define problems, identify key trends as
they relate to problems, inform debate, and educate (Karl, Susskind, and Wallace
2007). However, science cannot provide objective answers to policy questions that
inherently involve values. Science is better at identifying the problems than it is at
providing clear-cut solutions to those problems (Pouyat 1999). Scientists can advocate their findings, but this is an explicitly values-laden process. The peer review
process exists to provide a set of methodological protocols, including checks and
balances, to create legitimacy in the science that can be used to inform policy. It is
imperative that these processes be maintained, albeit with opportunities to make
the biases that go into these processes explicit.
Science has often been privileged because it has been seen as a more dependable form of knowledge relative to local knowledge and public preferences. Local
knowledge and public preferences are more explicitly influenced by political and
social processes, whereas conventional science is influenced more implicitly by
these values. Recognizing that these social and political influences are prevalent
in all knowledge generation, transmission, and use is the first step to evening the
playing field for these different knowledge types. To do so, we need to make better
progress in the processes that can render local knowledge and public preferences
more dependable for specific decision processes. A strictly science-based policy
process will be inadequate for the demands of policy makers. Nutley (2003) argues that strictly evidence-based approaches are naive given the practicalities of
policy making and suggests the need for “evidence-informed” or “evidence-aware”
policy making. The idea here is that the evidence is clearly important but needs to
be tempered by other considerations in the policy process. In the United Kingdom,
this has come to mean evidence from stakeholders, experts, departmental staff,
street-level bureaucrats, and agencies that will be affected by the policy (United
Kingdom Cabinet Office 1999). The inclusion of local knowledge and policy
preferences, however, raises the discomfort level of those who want to rely on
evidence-only approaches. Beierle and Cayford’s (2002) meta-analysis of 237 case
studies of stakeholder-based decision-making processes indicated that the quality
of the decisions was not compromised and that greater participation had a positive
effect on the outcomes of the decision.
Science can be useful, but it depends on how credibility and legitimacy are
created. According to Karl, Susskind, and Wallace (2007, 23), “The credibility and
legitimacy of science depend upon how and by whom information is gathered and
the process by which scientific inquiry is conducted.” In part, it is the processes
we utilize that politicize science. Our constitutional and administrative systems
are based on a model of conflict envisioned by the founding fathers such that no
one group or branch of government should dominate another. Within this model
one side does battle with the other with the belief that interest groups and different
branches of government will check and balance each other—the outcome of which
will serve the best interests of society at large. The role for science in this
system is less to inform decision making than to be used as a weapon in a political process to advance one’s agenda (Karl, Susskind, and Wallace 2007). Science
becomes a casualty in the process, collateral damage within a system that unintentionally devalues what science can deliver. If we restructure our processes to be
less intentionally conflictual, then we open up opportunities for science to be more
constructive. These opportunities include providing a greater role for collaboration between scientists and nonscientists in the generation, transmission, and use
of “knowledge hybrids” (Ascher, Steelman, and Healy 2010). Creating the appropriate incentive structures will be important in moving these visions forward. This
will include creating consistent institutional and professional incentives within
educational and research-based institutions to encourage rather than discourage
the inclusion of nonscientists in these processes.
Many models have been proposed to enhance collaboration among scientists from different disciplines, and between scientists and nonscientists, in the generation,
transmission, and use of knowledge. Knowledge hybrids integrate science, local
knowledge, and public preferences in different combinations. The intention is to
preserve the usefulness of formal science while embracing the benefits of other
knowledge forms, including citizen-scientist collaboration, joint fact-finding, and
stakeholder-based decision making, among others.
Two scenarios call for different types of knowledge hybrids. One scenario
entails employing technical knowledge to implement policies that have been established in broad outline through a participatory process. For example, if the policy
emerging from a participatory process is to consolidate nuclear waste in one location, external reviewers should be given authority to undertake their assessments
insulated from political pressures. The methodologies to pursue this approach are
well developed through advances in expert elicitation, modeling, and forecasting
(Arkes et al. 1997; Armstrong 2001; Brewer 2007). The other scenario pertains to
situations in which technical issues exist alongside still-unresolved questions of
goals and priorities. Some form of public participation is necessary, but it has to be
integrated with systematic assessment of information and theory.
A host of approaches has been offered—and tested in various contexts—to
effect this integration. Numerous design dimensions exist to structure interaction
between experts and the public.2 Understanding these design advantages and limitations can help facilitate choosing the right hybrid for the right problem under
the right political and technical conditions. For instance, in choosing between a
joint fact-finding process and an initiative process, one might want to consider
how they differ on the dimensions of representation and the centrality of science.
Representation of participants in the initiative process is high, but the centrality of science is low. Conversely, the joint fact-finding process compares less well
on representation, but the centrality of science is much higher. Additional design
dimensions that are important, but are not covered here due to space limitations,
include the role of the convener, independence of participants, early involvement of
the public, public influence on final policy, process transparency, resource accessibility, task definition, structured decision making, potential for local knowledge
to influence, resource demands, cost-effectiveness, potential for the expression
of preferences, the public’s role in framing the problem, and the public’s role in
framing alternatives (Ascher, Steelman, and Healy 2010, 190–201; Rowe and Frewer
2000).
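A toy sketch of how such design dimensions might be used as a rough decision aid when choosing among hybrids. The ratings below simply restate the qualitative contrast drawn in the text (high representation but low centrality of science for the initiative process, the reverse emphasis for joint fact-finding); the data structure, the "moderate" rating, and the helper function are hypothetical, not a formal scoring system.

```python
# Illustrative-only ratings for two knowledge hybrids on two design dimensions.
HYBRIDS = {
    "initiative process": {"representation": "high", "centrality_of_science": "low"},
    "joint fact-finding": {"representation": "moderate", "centrality_of_science": "high"},
}

def shortlist(priority_dimension: str) -> list[str]:
    """Return the hybrids rated 'high' on the design dimension that matters most."""
    return [name for name, dims in HYBRIDS.items() if dims.get(priority_dimension) == "high"]

# If representativeness is the overriding concern, the initiative process stands out;
# if the centrality of science is, joint fact-finding does.
print(shortlist("representation"))         # ['initiative process']
print(shortlist("centrality_of_science"))  # ['joint fact-finding']
```

A fuller aid would score the other dimensions listed above (role of the convener, process transparency, resource demands, and so on) in the same way, conditioned on the political and technical circumstances of the problem at hand.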
A challenge remains concerning what standards should be applied to these approaches. Norms are beginning to emerge for assessing how effectively local knowledge and public-preference knowledge are integrated into decision making. Such norms are necessary to identify best practices both in processes (citizen-science collaboration, joint fact-finding, citizen advisory councils, etc.) and in outcomes in such areas as soil management, watershed buffers, and so on. Best practices derived from practical knowledge are inherently localized and contextual. One implication is that decision aids that draw on these best practices must give users direction about whether and when the practices apply to their particular situations.
6.2. Dealing Constructively with
Complexity and Uncertainty
One approach to dealing constructively with complexity and uncertainty is to use scientific conventions to reveal science's own limitations. This could be done by demanding that conventional science be explicit about its assumptions, incompleteness, and legitimate disagreements, rather than burying them under arcane formulas or ignoring them altogether. One path forward is to encourage standardized reporting for expressing uncertainty in peer-reviewed work (Moss and Schneider 2000). These standards could be enforced by journal editors and government agencies as criteria for publication. The straightforward acts of expressing uncertainty explicitly and explaining how conventional models do not eliminate
all uncertainty go a long way toward sensitizing policy makers and the public to science's strengths as well as its limitations.
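One way to picture such standardized reporting is a fixed mapping from a probability judgment to a calibrated likelihood phrase, in the spirit of the guidance Moss and Schneider (2000) prepared for IPCC authors. The thresholds and wording below are only illustrative and are not the published scale.

```python
# Illustrative mapping from a probability estimate to a standardized likelihood
# phrase, so every statement of uncertainty in a report uses the same calibrated
# vocabulary. Thresholds and phrases are invented for illustration, not the
# published IPCC guidance.

LIKELIHOOD_SCALE = [  # (minimum probability, standardized phrase)
    (0.95, "very likely"),
    (0.66, "likely"),
    (0.33, "about as likely as not"),
    (0.05, "unlikely"),
    (0.00, "very unlikely"),
]


def likelihood_phrase(probability):
    """Translate a probability (0-1) into the standardized phrase for reporting."""
    for threshold, phrase in LIKELIHOOD_SCALE:
        if probability >= threshold:
            return phrase
    return "very unlikely"


if __name__ == "__main__":
    estimate = 0.7  # e.g., a model-based estimate that a threshold will be crossed
    print(f"It is {likelihood_phrase(estimate)} (p = {estimate}) "
          "under the stated model assumptions.")
```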
Another approach to uncertainty and complexity is to make policy more flexible. Adaptive management, an approach built around monitoring, analysis, and adjustment of policy in light of new information, has received considerable attention since the early 1990s because it deals more realistically with the uncertainties inherent in environmental policy (Holling 1978, 1995, 1997). Iterative feedback loops that facilitate social learning among diverse participants are key to making midcourse corrections when a management action is not meeting specified goals. Given the great uncertainties inherent in socio-ecological systems, these feedback loops can detect problems and rectify them without waiting long periods for comprehensive policy appraisals (Lasswell 1971).
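The feedback logic can be pictured as a simple monitor, compare, and adjust loop. The sketch below is schematic only: the indicator, target, toy response model, and adjustment rule are hypothetical stand-ins, since real adaptive management specifies these through the participatory and technical processes discussed earlier.

```python
# Schematic adaptive-management loop: monitor an indicator, compare it with the
# goal, and adjust the management action each cycle rather than waiting for a
# comprehensive policy appraisal. All values and rules here are hypothetical.

def adaptive_management(initial_action, target, monitor, adjust, cycles=5):
    """Run several monitoring cycles, adjusting the action whenever the goal is missed."""
    action = initial_action
    for cycle in range(1, cycles + 1):
        observed = monitor(action)
        print(f"cycle {cycle}: action={action:.2f}, observed={observed:.2f}")
        if observed < target:  # goal not met: make a midcourse correction
            action = adjust(action, target - observed)
    return action


if __name__ == "__main__":
    # Hypothetical example: restrict harvest (the action) until a monitored
    # population index recovers to the agreed target.
    def monitor(harvest_rate):      # toy response: lower harvest, higher index
        return 1.0 - harvest_rate

    def adjust(rate, shortfall):    # tighten the harvest in proportion to the miss
        return max(0.0, rate - 0.5 * shortfall)

    adaptive_management(initial_action=0.6, target=0.7,
                        monitor=monitor, adjust=adjust)
```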
6.3. Improve Knowledge Integration,
Synthesis, and Communication
If knowledge is to be used by policy makers, then it is incumbent on scientists and
other knowledge producers and transmitters to integrate and synthesize findings
so they can be communicated better to those who will ultimately use the information. Scientists think they know how to communicate, when often they do not
(Pouyat 1999). Conversely, scientists believe nonscientists need to be better educated to understand what scientists have to say. The responsibility rests on both
sides.
On one side, scientists plead that they are misunderstood. Policy makers often have unrealistic expectations about what science can deliver (Pouyat
1999). Scientists commonly believe that policy makers think science can provide value-free, universal “truths” that point to the “right” answer for policy decisions (Herrick and Jamieson 1995; Pouyat 1999). Conveniently for policy makers,
if scientists can provide the right answer, then this exculpates policy makers
from having to make tough values-based decisions. Pouyat (1999, 282) suggests
that what policy makers want is a “bright line” to guide their decision making. Revising and clarifying expectations is a first step in defending the integrity of formal science, as we mention above, and can also facilitate better communication.
Improving the overall knowledge that will inform policy will come with better
attempts to integrate science, local knowledge, and public preferences. After all,
the knowledge base is more dependable when science and experience reinforce
each other. The challenge is to discover how to organize and integrate the interactions among knowledge generators operating within different paradigms, knowledge transmitters applying different standards, and knowledge users employing
knowledge to pursue different values. Knowledge needs to be integrated among
the many different disciplines that are relevant to environmental policy. Several
factors have led to the increasingly specialized and esoteric output of professional
scientists. This reflects the expansion of foci and scientific personnel as well as
a reward structure within professional science that still favors specialization.
Because in most fields publishing in disciplinary journals is still a more secure
path to promotion and prestige, scientists have incentives to conduct their research
and couch their findings to meet specialized disciplinary criteria. Skill in integrating across disciplines is valued less within the culture of science than is specialized knowledge generation. Government agencies, and research organizations that receive government funding, might be required to hire technical writers to synthesize and translate findings into usable formats.
Because greater interaction among those who generate, transmit, and use
knowledge is essential both for maintaining the relevance of knowledge inputs and
their responsible use, boundary-spanning organizations and individuals are known to play key roles in bringing knowledge to bear on policy questions (Guston 2001). Haas (2004) suggests the need to create “usable knowledge” that transforms “raw” science into a usable form for policy makers, as well as mechanisms for transferring that knowledge from scientists to the policy world. The boundary manager has been described by Pielke (2007) as a “science arbiter” and “honest broker” of policy alternatives. “Boundary organizations produce boundary objects, such as reports, conferences, and the like” (McNie 2007, 28). The Cooperative State Research, Education, and Extension Service, U.S. Sea Grant, the Consultative Group on International Agricultural Research, the International Institute for Applied Systems Analysis,
the U.S. National Research Council, the now-defunct Congressional Office of
Technology Assessment, the Health Effects Institute, the European Environment
Agency, and NOAA’s Regional Integrated Sciences and Assessments Program are
all examples of boundary organizations. However, organizational innovation is
needed to entice scientists into these potentially uncomfortable interactions and to
induce policy makers to accept the knowledge generated or transmitted by boundary organizations.
7. Conclusions
In conclusion, we note two paths that the future of science and environmental
policy may follow. On the first path, the prospect of broader knowledge incorporation into environmental policy making threatens conventional science institutions, which would respond by becoming more rigid, insular, and confined to conventional protocols. In essence, scientists and those who support science would attempt
to assert the institution of science as the best and exclusive option for informing policy. This path would devalue other forms of knowledge in the policy process. The consequence of this path would be to drive politics underground so that
values-based disputes would be hammered out within the practices of formalized
science without participants knowing the value consequences of what appear to be “scientifically sound” policies.
The second path entails embracing politics and other forms of knowledge to create a constructive role for all knowledge types as they come to bear on contested claims. This path includes explicit acknowledgement of the biases and distributional effects associated with science, local knowledge, and public preferences in the generation, transmission, and use of knowledge. In a democratic society, the common good is most likely served by moving toward this path of openness, inclusion, and greater participation. It is unlikely that focusing on science alone as a privileged source of knowledge will serve the common interest. Maintaining strict
boundaries between science and politics in a “science-centric” world is increasingly
unrealistic given the diversification in participants, arenas, and resources associated with knowledge generation, transmission, and use. We encourage movement
toward the second path for the many reasons laid out in this chapter. This is why
further research needs to focus on the biases lurking in the protocols of the generation and transmission of conventional science, as well as on how to improve
the evaluation of how knowledge is used or abused in an inevitably politicized
policy process. By making our perspectives explicit and our logic clear, we hope to
encourage others to debate, argue, or follow our lead in reimagining these boundaries between science, policy, and politics.
REFERENCES
Adams, R. 2002. “Regulating the Rule-Makers: John Graham at OIRA.”
Congressional Quarterly Weekly Report 23 (February): 520–526.
Arkes, H. R., J. L. Mumpower, and T. R. Stewart. 1997. “Combining Expert
Opinions.” Science 275: 461–465.
Armstrong, J. S. 2001. Principles of Forecasting: A Handbook for Researchers and
Practitioners. Dordrecht, The Netherlands: Kluwer Academic.
Ascher, W. 2004. “Scientific Information and Uncertainty: Challenges for the Use of
Science in Policymaking.” Science and Engineering Ethics 10: 437–455.
Ascher, W., T. Steelman, and R. Healy. 2010. Knowledge and Environmental Policy:
Re-imagining the Boundaries of Science and Politics. Cambridge, MA: MIT Press.
Banks, G. 2005. “Linking Resources and Conflict the Melanesian Way.” Pacific
Economic Bulletin 20 (1) (May): 185–191.
Beck, U. 1992. Risk Society: Towards a New Modernity. London: Sage.
Becker, J., and B. Gellman. 2007. “Leaving No Tracks.” Washington Post, June 27.
Available at blog.washingtonpost.com/cheney/chapters/leaving_no_tracks/index.
html (accessed April 15, 2009).
Beierle, T., and J. Cayford. 2002. Democracy in Practice: Public Participation in
Environmental Decisions. Washington, DC: Resources for the Future.
Benveniste, G. 1977. The Politics of Expertise. 2nd ed. San Francisco: Boyd & Fraser.
Berkes, F., J. Colding, and C. Folke, eds. 2003. Navigating Social-Ecological Systems:
Building Resilience for Complexity and Change. Cambridge: Cambridge University
Press.
Bielak, A., A. Campbell, S. Pope, K. Schaefer, and L. Shaxson. 2008. “From Science
Communications to Knowledge Brokering: The Shift from ‘Science Push’ to
‘Policy Pull.’” In Communicating Science in Social Contexts: New Models, New
Practices, ed. D. M. Cheng, D. M. Claessens, T. Gascoigne, J. Metcalfe, and B.
Schiele, 201–226. Dordrecht, The Netherlands: Springer.
Botkin, D. B. 1990. Discordant Harmonies: A New Ecology for the Twenty-First
Century. New York: Oxford University Press.
Bravo-Ortega, C., and J. De Gregorio. 2006. “The Relative Richness of the Poor?
Natural Resources, Human Capital and Economic Growth.” In Natural
Resources, Neither Curse nor Destiny, ed. D. Lederman and W. F. Maloney, 71–99.
Washington, DC: World Bank.
Brewer, G. D. 2007. “Inventing the Future: Scenarios, Imagination, Mastery, and
Control.” Sustainability Science 2 (1): 159–177.
Brodersen, S., M. S. Jorgensen, and A. Hansen. 2006. “Environmental Empowerment:
The Role of Co-operation between Civil Society, Universities, and Science Shops.”
PATH Conference Proceedings, Scotland. Available at http://www.macaulay.ac.uk/
PATHconference/outputs/PATH_abstract_3.2.3.pdf (accessed April 20, 2007).
Brown, M. B. 2009. Science in Democracy: Expertise, Institutions, and Representation.
Cambridge, MA: MIT Press.
Brunner, R. D., and W. Ascher. 1992. “Science and Social Responsibility.” Policy
Sciences 25 (3): 295–331.
Brunner, R. D., and T. A. Steelman. 2005. “Toward Adaptive Governance.” In
Adaptive Governance: Integrating Science, Policy, and Decision-Making, ed. R. D.
Brunner, T. A. Steelman, L. Coe-Juell, C. Cromley, C. Edwards, and D. Tucker,
268–304. New York: Columbia University Press.
Bush, V. 1945. Science: The Endless Frontier. Washington, DC: U.S. Government
Printing Office. Available at www.nsf.gov/about/history/vbush1945.htm (accessed
December 28, 2010).
Byerly, R., and R. Pielke Jr. 1995. “The Changing Ecology of United States Science.”
Science 269: 1531–1532.
Davis, D. 2007. The Secret History of the War on Cancer. New York: Basic Books.
Dunlap, R. E., M. E. Kraft, and E. A. Rosa, eds. 1993. Public Reactions to Nuclear
Waste: Citizens’ Views of Repository Siting. Durham, NC: Duke University Press.
Fischer, F. 1990. Technocracy and the Politics of Expertise. Newbury Park, CA: Sage.
———. 2000. Citizens, Experts, and the Environment: The Politics of Local Knowledge.
Durham, NC: Duke University Press.
———. 2003. Reframing Public Policy: Discursive Politics and Deliberative Practices.
Oxford: Oxford University Press.
Forsyth, T. 2002. Critical Political Ecology: The Politics of Environmental Science.
London: Routledge.
Funtowicz, S., and J. Ravetz. 2001. “Global Risk, Uncertainty, and Ignorance.” In Global
Environmental Risk, ed. J. Kasperson and R. Kasperson, 173–194. London: Earthscan.
Gallopín, G. C. 2004. “What Kind of System of Science (and Technology) Is Needed
to Support the Quest for Sustainable Development?” In Earth Systems Analysis
for Sustainability, ed. H. J. Schellnhuber, P. J. Crutzen, W. C. Clark, and H. Held,
367–386. Cambridge, MA: MIT Press.
Gallopín, G. C., S. Funtowicz, M. O’Connor, and J. Ravetz. 2001. “Science for
the Twenty-First Century: From the Social Contract to the Scientific Core.”
International Social Science Journal 53 (168): 219–229.
Grifo, F. T. 2007. “Senior Scientist with Union of Concerned Scientists Scientific
Integrity Program.” Written Testimony before the Committee on Oversight and
Government Reform, U.S. Congress, House of Representatives, 110th Cong., 1st
sess., January 30.
Gunderson, L. 1999. “Resilience, Flexibility, and Adaptive Management—Antidotes
for Spurious Certitude?” Conservation Ecology 3 (1). Available at www.consecol.
org/vol3/iss1/art7/ (accessed July 10, 2009).
Gunderson, L. H., and C. S. Holling. 2002. Panarchy: Understanding Transformations
in Human and Natural Systems. Covelo, CA: Island.
Guston, D. 2001. “Boundary Organizations in Environmental Policy and Science: An
Introduction.” Science, Technology, and Human Values 26 (4): 399–408.
Haas, P. 2004. “When Does Power Listen to Truth? A Constructivist Approach to the
Policy Process.” Journal of European Public Policy 11 (August): 569–592.
Hays, S. 1959. Conservation and the Gospel of Efficiency: The Progressive Conservation
Movement, 1890–1920. Cambridge, MA: Harvard University Press.
Healy, R., and W. Ascher. 1995. “Knowledge in the Policy Process: Incorporating New
Environmental Information in Natural Resources Policy Making.” Policy Sciences
28: 1–19.
Heinzerling, L. 2006. “Statutory Interpretation in the Era of OIRA.” Fordham Urban
Law Journal 33: 101–120.
Herrick, C., and D. Jamieson. 1995. “The Social Construction of Acid Rain: Some
Implications for Science/Policy Assessment.” Global Environmental Change 5: 105–112.
Holdren, J. P. 2010. “Scientific Integrity: Memorandum for the Heads of Executive
Departments and Agencies.” Available at www.whitehouse.gov/sites/default/files/
microsites/ostp/scientific-integrity-memo-12172010.pdf (accessed January 2, 2010).
Holling, C. S., ed. 1978. Adaptive Environmental Assessment and Management. New
York: John Wiley.
———. 1995. “What Barriers? What Bridges?” In Barriers and Bridges to the Renewal
of Ecosystems and Institutions, ed. L. H. Gunderson, C. S. Holling, and S. S. Light,
1–34. New York: Columbia University Press.
———. 1997. “Two Cultures of Ecology.” Conservation Ecology 2 (2). www.consecol.
org/vol2/iss2/art4 (accessed March 4, 2009).
Holmes, J., and R. Clark. 2008. “Enhancing the Use of Science in Environmental
Policy-Making and Regulation.” Environmental Science and Policy 11: 702–711.
Holmes, J., and J. Savgard. 2008. “Dissemination and Implementation of
Environmental Research.” Swedish Environmental Protection Agency Report
5681, February. www.skep-era.net/site/files/WP4_final%20report.pdf (accessed
January 2, 2010).
Jasanoff, S. 1990. The Fifth Branch: Science Advisors as Policymakers. Cambridge,
MA: Harvard University Press.
———. 1995. Science at the Bar: Law, Science, and Technology in America.
Cambridge, MA: Harvard University Press.
———. 1996. “Beyond Epistemology: Relativism and Engagement in the Politics of
Science.” Social Studies of Science 26 (2): 393–418.
———. 2003. “Breaking the Waves in Science Studies: Comment on H. M. Collins and
Robert Evans, the Third Wave of Science Studies.” Social Studies of Science 33 (3):
389–400.
Jasanoff, S., and B. Wynne. 1998. “Science and Decision Making.” In Human Choice
and Climate Change. Vol. 1, The Societal Framework, ed. S. Raynor and E. L.
Malone, 1–87. Columbus, OH: Battelle Institute.
Karl, H., L. Susskind, and K. Wallace. 2007. “A Dialogue, Not a Diatribe: Effective
Integration of Science and Policy through Joint Fact Finding.” Environment:
Science and Policy for Sustainable Development 49 (1): 20–34.
Kuhn, T. 1962. The Structure of Scientific Revolutions. Chicago: University of Chicago
Press.
Lasswell, H. D. 1971. A Pre-view of Policy Sciences. New York: American Elsevier.
Latour, B. 1987. Science in Action: How to Follow Scientists and Engineers through
Society. Cambridge, MA: Harvard University Press.
Latour, B., and S. Woolgar. 1979. Laboratory Life: The Social Construction of Scientific
Facts. Beverly Hills, CA: Sage.
Lubchenco, J. 1998. “Entering the Century of the Environment: A New Social
Contract for Science.” Science 279: 491–497.
Majone, G. 1989. Evidence, Argument and Persuasion in the Policy Process. New
Haven, CT: Yale University Press.
Marston, G., and R. Watts. 2003. “Tampering with the Evidence: A Critical Appraisal
of Evidence-Based Policy Making.” The Drawing Board: An Australian Review of
Public Affairs 3 (3): 143–163.
Matsen, E., and R. Torvik. 2005. “Optimal Dutch Disease.” Journal of Development
Economics 78: 494–515.
McNie, E. 2007. “Reconciling the Supply of Scientific Information with User
Demands: An Analysis of the Problem and Review of the Literature.”
Environmental Science and Policy 10 (1): 17–38.
Merton, R. K. 1973. “The Normative Structure of Science.” In The Sociology of
Science: Theoretical and Empirical Investigations, ed. N. W. Storer, 267–278.
Chicago: University of Chicago Press.
Metlay, D. 2000. “From Tin Roof to Torn Wet Blanket: Predicting and Observing
Groundwater Movement at a Proposed Nuclear Waste Site.” In Prediction: Science,
Decision Making, and the Future of Nature, ed. D. Sarewitz, R. Pielke, and R.
Byerly, 199–227. Washington, DC: Island.
Moss, R. H., and S. H. Schneider. 2000. “Uncertainties in the IPCC TAR:
Recommendations to Lead Authors for More Consistent Assessment and
Reporting.” In Guidance Papers on the Cross Cutting Issues of the Third Assessment
Report of the IPCC, ed. R. Pachauri, T. Taniguchi, and K. Tanaka, 33–51. Geneva:
World Meteorological Organization.
Nelkin, D. 1979. Controversy: Politics of Technical Decisions. London: Sage.
Nutley, S. 2003. “Bridging the Policy/Research Divide. Reflections and Lessons from
the UK.” Keynote Paper, Facing the Future: Engaging Stakeholders and Citizens in
Developing Public Policy. NIG Conference. Canberra, Australia, April 28.
Owens, S., J. Petts, and H. Bulkeley. 2006. “Boundary Work: Knowledge, Policy and
the Urban Environment.” Environment and Planning C: Government and Policy 24
(5): 633–643.
Parliamentary Commissioner for the Environment. 2004. “Missing Links:
Connecting Science with Environmental Policy.” Wellington, New Zealand,
September.
Pielke, R., Jr. 2007. The Honest Broker: Making Sense of Science in Policy and Politics. New
York: Cambridge University Press.
Popper, K. 1968. The Logic of Scientific Discovery. London: Hutchinson.
Pouyat, R. 1999. “Science and Environmental Policy—Making Them Compatible.”
BioScience 49 (4): 281–286.
Rayner, S. 2006. “What Drives Environmental Policy?” Global Environmental
Change 16: 4–6.
Rowe, G., and L. Frewer. 2000. “Public Participation Methods: A Framework for
Evaluation.” Science, Technology, and Human Values 25 (1): 3–29.
Rykiel, E. 2001. “Scientific Objectivity, Value Systems, and Policymaking.” BioScience
51 (6): 433–436.
Sabatier, P. 1988. “An Advocacy Coalition Framework of Policy Change and the Role
of Policy-Oriented Learning Therein.” Policy Sciences 21 (Fall): 129–168.
Sagoff, M. 1994. “Should Preferences Count?” Land Economics 70: 127–144.
Sarewitz, D. 2004. “How Science Makes Environmental Controversies Worse.”
Environmental Science and Policy 7: 385–403.
Service, R. F. 2003. “‘Combat Biology’ on the Klamath.” Science 300: 36–39.
Shapiro, S. A., and C. H. Schroeder. 2008. “Beyond Cost-Benefit Analysis: A
Pragmatic Reorientation.” Harvard Environmental Law Review 32: 433.
Shaxson, L. 2005. “Is Your Evidence Robust Enough? Questions for Policy Makers
and Practitioners.” Evidence and Policy 1 (1): 101–111.
Steelman, T. A., and W. Ascher. 1997. “Public Involvement in Natural Resource
Policymaking.” Policy Sciences 30: 71–90.
Steelman, T. A., and C. Burke. 2007. “Is Wildfire Policy in the United States
Sustainable?” Journal of Forestry 105 (2): 67–72.
Stokes, D. 1997. Pasteur’s Quadrant: Basic Science and Technological Innovation.
Washington, DC: Brookings Institution Press.
Union of Concerned Scientists. 2008. Interference at the EPA: Science and Politics at
the US Environmental Protection Agency. Cambridge, MA: Union of Concerned
Scientists, April. Available at www.ucsusa.org/scientific_integrity/interference/
interference-at-the-epa.html (accessed April 5, 2008).
United Kingdom Cabinet Office. 1999, March. Modernising Government. London:
Prime Minister and the Minister for the Cabinet Office.
U.S. Environmental Protection Agency, Science Advisory Board. 2009, May.
Valuing the Protection of Ecological Systems and Services. Washington, DC: U.S.
Environmental Protection Agency.
Vig, N. J., and M. G. Faure, eds. 2004. Green Giants? Environmental Policies of the
United States and the European Union. Cambridge, MA: MIT Press.
Wheeler, W. B., and M. J. McDonald. 1986. TVA and the Tellico Dam 1936–1979: A
Bureaucratic Crisis in Post-Industrial America. Knoxville: University of Tennessee
Press.
Wynne, B. 1996. “May the Sheep Safely Graze? A Reflexive View of the Expert-Lay
Knowledge Divide.” In Risk, Environment, and Modernity: Towards a New Ecology,
ed. S. Lash, B. Szerszynski, and B. Wynne, 44–83. London: Sage.
Zachary, G. P. 1997. Endless Frontier: Vannevar Bush, Engineer of the American
Century. New York: Free Press.
Ziman, J. 1978. Reliable Knowledge: An Exploration of the Grounds for Belief in
Science. Cambridge: Cambridge University Press.