Causal counterfactual theory provides clear semantics and sound logic for causal reasoning
and may help foster research on, and clarify dissemination of, weather and climate-related
event attribution.
AFFILIATIONS: HANNART—IFAECI, CNRS/CONICET/UBA, Buenos Aires, Argentina; PEARL—Computer Science Department, University of California, Los Angeles, Los Angeles, California; OTTO—Environmental Change Institute, University of Oxford, Oxford, United Kingdom; NAVEAU—LSCE, CNRS/CEA, Gif-sur-Yvette, France; GHIL—École Normale Supérieure, Paris, France, and Department of Atmospheric and Oceanic Sciences, University of California, Los Angeles, Los Angeles, California
CORRESPONDING AUTHOR: Alexis Hannart, IFAECI, Ciudad Universitaria, Pab. II, Piso 2, 1428 Buenos Aires, Argentina; E-mail: [email protected]
The abstract for this article can be found in this issue, following the table of contents.
DOI:10.1175/BAMS-D-14-00034.1
In final form 8 February 2015
©2016 American Meteorological Society

A significant and growing part of climate research studies the causal links between climate forcings and observed responses. This part has been consolidated into a separate research topic known as detection and attribution (D&A). The D&A community has increasingly been faced with the challenge of generating causal information about episodes of extreme weather or unusual climate conditions. This challenge arises from the needs for public dissemination, litigation in a legal context, adaptation to climate change, or simply improvement of the science associated with these events (Stott et al. 2013). For clarity, we start by introducing a few notations that will be used throughout this article: an event here is associated with a binary variable, say Y, which is equal to 1 when the event occurs and to 0 when it does not, and we use the term event Y as an abbreviation for the event defined by Y = 1. In any event attribution study, the precise definition of the event to be studied—that is, the choice of the variable Y—is crucial. Often, Y is defined ad hoc in the aftermath of an observed extreme situation based on exceedance over a threshold u of a relevant climate index Z, where both the index and the threshold are to a large extent arbitrary.
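
As a concrete illustration of this notation, the short Python sketch below constructs the binary event variable Y from a climate index series Z and a threshold u; the index values and the threshold used here are purely illustrative placeholders rather than values from any particular study.

```python
import numpy as np

# Hypothetical values of a climate index Z (e.g., an area-averaged
# temperature anomaly); these numbers are illustrative placeholders.
Z = np.array([0.3, 1.2, 2.7, 0.9, 3.1, 1.8, 2.5])

# Threshold u defining the event; its choice is largely arbitrary,
# as emphasized in the text.
u = 2.0

# Event indicator: Y = 1 when the index exceeds the threshold, 0 otherwise.
Y = (Z > u).astype(int)

print(Y)  # -> [0 0 1 0 1 0 1]
```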

In the conventional approach, which was introduced one decade ago by M. R. Allen and colleagues (Allen 2003; Stone and Allen 2005), one evaluates the extent to which a given external climate forcing f ∈ F—where F encompasses, for instance, solar irradiation, greenhouse gas (GHG)

probability distribution, provided conditional dependence relationships are fully specified; such specification is conveniently encoded by using an oriented graph in which each arrow represents a causal relationship. The existence of causal relationships has various implications on the joint dependence structure; for example, independent causes become dependent conditional upon their common effect, and dependent effects become independent conditional upon their common cause. From the moment we have access to enough observations to infer the dependence structure, we are able to detect these signatures and thereby provide evidence of causal relationships. Algorithms such as those described in Spirtes et al. (2000) and Shimizu et al. (2006) basically follow this strategy and could perfectly be applied to the natural observations of R, B, and L collected by O.
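
To make these dependence signatures concrete, the short Python simulation below encodes a toy version of the article's running example, reading L as a low pressure system that is the common cause of rain R and of a falling barometer B; all probabilities in the script are invented for illustration. In the simulated data, R and B are dependent, but the dependence vanishes once we condition on their common cause L, which is precisely the kind of signature that the algorithms cited above exploit.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Common cause: L = 1 if a low pressure system is present
# (the probability is a made-up illustrative value).
L = rng.random(n) < 0.3

# Effects: rain R and a falling barometer B both depend on L, but not on
# each other once L is known (conditional independence given the common cause).
R = rng.random(n) < np.where(L, 0.8, 0.1)
B = rng.random(n) < np.where(L, 0.9, 0.05)

def p(event, given):
    """Empirical conditional probability P(event | given)."""
    return event[given].mean()

# Marginal dependence: conditioning on B changes the probability of R.
print(p(R, B), p(R, ~B))            # clearly different

# Conditioning on the common cause L makes the dependence disappear.
print(p(R, B & L), p(R, ~B & L))    # approximately equal
print(p(R, B & ~L), p(R, ~B & ~L))  # approximately equal
```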

An important limitation of using natural data is that several graphs can be compatible with the same joint distribution and hence with the same observations; identifiability is an issue. For instance, simultaneous changes in X and Y are compatible with both the causal relationships X → Y and Y → X whenever only these two variables are observed (e.g., when observing R and B but not L). The experimental approach is thus required for disambiguation of the causal relationship between X and Y. Several outcomes Y are thereby experimentally collected for each tested value of X. The value of X is thus chosen by the experimenter, and treating it as a random variable is no longer relevant in this experimental context. However, a probabilistic treatment of the response Y is still relevant because other factors potentially affecting Y may not be controlled in the experimental setup. The notion of intervention was hence introduced to describe the situation where X is set by the experimenter at a chosen value x; it is denoted do(X = x). The notion of interventional probability then corresponds to the distribution of Y obtained in an experiment under the intervention do(X = x). It is denoted P[Y|do(X = x)] or alternatively P(Yx), where Yx denotes the new random variable obtained for Y subject to the intervention do(X = x). The set {P(Yx = y) | x, y = 0, 1} obtained by collecting all the interventional probabilities of Y for every possible value of X is termed the causal effect of X on Y. It is important to note that, in general,

P[Y|do(X = x)] ≠ P(Y|X = x),  (5)

which is why the notation do(X = x) is required. Indeed, in our example, P(R = 1|B = 1) reads as the probability of rain knowing that the barometer is decreasing in a nonexperimental context in which the barometer evolution is left unconstrained, whereas P[R = 1|do(B = 1)] reads as the probability of rain when forcing the barometer to decrease in an experimental context in which the barometer is manipulated. The two probabilities are obviously distinct, and it is their difference that allows for disambiguation, as it reveals the absence of a causal link between B and R.
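
The inequality in Eq. (5) can be checked directly on the same toy model. In the sketch below, which reuses the invented probabilities from the previous example, the observational quantity P(R = 1|B = 1) is estimated by conditioning on simulated data in which the barometer evolves freely, whereas the interventional quantity P[R = 1|do(B = 1)] is estimated by rerunning the model with B forced to 1. Because B has no causal influence on R in this model, the intervention leaves the probability of rain at its marginal value, and this difference is what reveals the absence of a causal link from B to R.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

def simulate(intervene_B=None):
    """Draw (L, R, B) from the toy structural model; optionally force B
    to a fixed value, which is what the intervention do(B = b) means."""
    L = rng.random(n) < 0.3                        # low pressure (common cause)
    R = rng.random(n) < np.where(L, 0.8, 0.1)      # rain depends on L only
    if intervene_B is None:
        B = rng.random(n) < np.where(L, 0.9, 0.05) # barometer responds to L
    else:
        B = np.full(n, intervene_B, dtype=bool)    # do(B = intervene_B)
    return L, R, B

# Observational (conditional) probability P(R = 1 | B = 1).
_, R, B = simulate()
print(R[B].mean())    # roughly 0.72 with these illustrative numbers

# Interventional probability P[R = 1 | do(B = 1)].
_, R_do, _ = simulate(intervene_B=True)
print(R_do.mean())    # roughly 0.31, i.e., the marginal P(R = 1)
```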

Nonetheless, confusion is still possible because P[Y|do(X = x)] and P(Y|X = x) may also sometimes be equal. This is the case when X satisfies a property called exogeneity with respect to Y. Without going into details,