significant, is either more significant than the other? Also, are both
features necessary for the presence of genuine suffering? Is the belief
that one's desire is frustrated also necessary? And whatever the essential
components of suffering happen to be, must all of them, or any of them,
be conscious mental states? Is it possible for the pain sensation itself to
occur non-consciously? The ethicist will need to figure out which mental
processes and which of their components matter ethically and to what
degree – but, it seems, not without sufficient attention to questions such
as those above about the nature of mind.
If the mere feeling of pain makes one a suitable object of moral
concern, then it would appear that the range of individuals to whom we
potentially have moral obligations is much larger than it would be if the
frustration of desire were what matters. After all, thoughts, beliefs,
desires, and other ‘‘intentional states’’5 seem to be more complex
mental items, and therefore harder to come by, than pain sensations. It
is arguable, however, that intentionality is required in the latter case as
well. For according to one popular theory of consciousness, a mental
state is a conscious mental state if and only if it is the object of a suitable
higher-order thought, a belief to the effect that one has that mental
state.6 So if some version of the higher-order thought theory is correct,
then even if the sensation of pain, i.e., the conscious sensation of pain,
were enough to warrant moral concern, being a suitable object of moral
concern would still require having intentional states.
With a higher-order thought theory, the requirements for con-
scious states might be more stringent still. Donald Davidson has
argued that having thoughts requires the capacity for language.7
While this view is dubious,8 Davidson is also famous for supporting
5. The term ''intentional'' indicates the representational character, the about-ness, of a thought, belief, or desire. Since their content can typically be expressed in the form of a proposition (e.g., the belief that it will rain), intentional states are also called ''propositional attitudes.''
6. See David Rosenthal, ''Two Concepts of Consciousness,'' Philosophical Studies 49 (1986), pp. 329–359; and Peter Carruthers, ''Natural Theories of Consciousness,'' European Journal of Philosophy 6 (1998), pp. 203–222.
7. Donald Davidson, ''Rational Animals,'' in Ernest LePore and Brian McLaughlin (eds.), Actions and Events: Perspectives on the Philosophy of Donald Davidson (Oxford: Basil Blackwell, 1985), pp. 473–480. See also Donald Davidson, ''Thought and Talk,'' in Samuel Guttenplan (ed.), Mind and Language (Oxford: Oxford University Press, 1980), pp. 7–23.
8. For it seems there can be good evidence for thought in the absence of language. One might argue, though, that certain complex types of thought require language [see, e.g., José Bermudez, Thinking without Words (Oxford: Oxford University Press, 2003), Chapter 8].
the more plausible view that having any one intentional state requires
having lots of others.9 This follows from the idea that an intentional
state acquires its content in virtue of its logical relations to other
states within a network of intentionality. If we conjoin this holistic
view of mental content with a higher-order theory of consciousness,10
we get the result that a conscious mental state, even a pain sensation,
requires lots of intentionality indeed!
While many of us might be unsure whether fish, snakes, or frogs
have genuine beliefs and desires, most will find it odd to credit any
invertebrates with these mental states. And yet, in an essay appearing
in this issue, Carruthers describes the results of research on the
navigating behavior of honey bees and jumping spiders, and argues
that the behavior observed in these studies is best explained by
supposing that the bees and spiders possess a genuine belief-desire
psychology. While this conclusion might not be easy to accept, after
reading the description of the impressive behavior recorded in these
studies, one is hard pressed to say exactly what it is about having a
belief or a desire (or at least what it is about the requisite behavioral
dispositions) that these little creatures lack. And if one is impressed by
the complexity of the behavior observed but still reluctant to say these
invertebrates really have intentional states, one might be tempted to
accept some brand of instrumentalism regarding the mind. One might,
for instance, agree with Daniel Dennett that there is no fact of the
matter of whether an individual really has intentional states – i.e., no
fact of the matter other than the usefulness of adopting the intentional
stance when explaining and predicting its behavior.11
It would seem, then, that ethicists who are tempted to classify the moral status of nonhuman animals based on their presumed mental capacities should bear in mind that determining which mental states an individual has is a difficult issue, in part because there is much controversy about what exactly having the relevant mental states
consists in. Another reason that questions about animal mentality are
difficult is that in many cases it is far from easy to find adequate
behavioral evidence for the presence or absence of various mental states
in other animals. There is the Quinean worry that any amount of
9. Davidson, ''Rational Animals,'' p. 475.
10. This is not to imply that Davidson himself endorses the conjunction.
11. See Daniel Dennett, The Intentional Stance (Cambridge: The MIT Press, 1987), Chapter 1, for a general introduction to the intentional stance and the notion of an intentional system, and Daniel Dennett, Brainchildren: Essays on Designing Minds (Cambridge: The MIT Press, 1998), Chapter 22, for application to other animals.
12. W. V. O. Quine, Word and Object (Cambridge: The MIT Press, 1960).
13. Davidson, ''Rational Animals,'' p. 474.
14. Davidson, ''Rational Animals,'' p. 475.
15. See, for example, Richard Seyfarth and Dorothy Cheney, ''The Structure of Social Knowledge in Monkeys,'' in Marc Bekoff, Colin Allen, and G. M. Burghardt (eds.), The Cognitive Animal (Cambridge: The MIT Press, 2002), pp. 379–384.
16. See Dennett, The Intentional Stance, Chapter 7.
the trees. But perhaps we should also attribute the desire to make the
others believe there is a leopard nearby, which is an instance of second-
order intentionality. Or maybe we should say the monkey wants the
others to recognize that he wants them to head for the trees. In addition
to this third-order intentional ascription, there is the possible fourth-
order Gricean attribution of meaning – wanting the others to believe
there is a leopard nearby in virtue of their recognition of that intention.17
Even if the highly limited overall range of vervet verbal behavior
renders attributions of third- and fourth-order intentionality implau-
sible (as Dennett notes18), we are still left with the daunting task of
deciding between the first- and second-order interpretations.19
Of course, the issue of whether there is anything more than first-order
intentionality underlying the communicative behavior observed arises
for other mammals as well – and also for birds. Irene Pepperberg's parrot, Alex, is able to correctly name various objects, along with their shapes and colors, and to correctly answer questions about the respects in which pairs of objects are the same or different.20 This seems to show
that the bird has genuine concepts – color concepts, shape concepts, and
concepts of different objects of various types. Does the parrot's verbal
behavior also suggest the presence of thoughts, beliefs, and desires? Does
Alex desire to get his trainer to believe that, for example, the object is
blue? If the behavior observed does not support this second-order
attribution, then why not? What behavioral evidence would justify it?
Does Alex at least have the first-order intentional state of desiring to
answer correctly? Here, especially, it is unclear why the behavioral
evidence does or does not justify the attribution.21
17. H. P. Grice, ''Meaning,'' The Philosophical Review 66 (1957), pp. 377–388.
18. Dennett, The Intentional Stance, p. 247.
19. The higher-order thoughts (HOTs) required by a HOT theory of consciousness differ from the higher-order intentionality mentioned here in two ways. Here we are focusing on intentional states directed toward the mental states of others. Also, depending on one's version of the HOT theory, the HOT might be directed toward a mental state without representing it as a mental state, i.e., without employing the concept of a mental state (perhaps with the content, ''I am having this'' or ''This is occurring,'' where ''this'' refers to the target mental state).
20. See, for example, Irene M. Pepperberg, ''Cognitive and Communicative Abilities of Grey Parrots,'' in Marc Bekoff, Colin Allen, and G. M. Burghardt (eds.), The Cognitive Animal (Cambridge: The MIT Press, 2002), pp. 247–253.
21. It might even be wondered whether the bird has genuine concepts. What concept-possession consists in is a big issue in the philosophy of mind. To get a sense of the debate in connection with other animals, see Colin Allen and M. Hauser, ''Concept Attribution in Nonhuman Animals,'' Philosophy of Science 58 (1991), pp. 221–240.
26. Charles Darwin, The Descent of Man, and Selection in Relation to Sex (London: D. Appleton, 1897), Chapter 4.
27. Gould and Gould, The Animal Mind, p. 150.
28. Deceptive behavior, such as that mentioned earlier, is another possible source of evidence for the kind of higher-order intentionality involved in moral agency. Desiring to produce a false belief in others, and acting upon that desire, might be enough to make one morally blameworthy for one's action – provided there is suitable autonomy involved.
is a forensic term, appropriating actions and their merit; and so belongs only to
intelligent agents, capable of a law, and happiness and misery. This personality
extends itself beyond present existence to what is past, only by consciousness, –
whereby it becomes concerned and accountable ...29
33. DeGrazia, ''On the Question of Personhood beyond Homo Sapiens,'' pp. 44–46.
34. DeGrazia, ''On the Question of Personhood beyond Homo Sapiens,'' pp. 46–48.
35. Consider, for example, the linguistic behavior of Francine Patterson's gorilla, Koko [F. Patterson and W. Gordon, ''The Case for the Personhood of Gorillas,'' in P. Cavalieri and P. Singer (eds.), The Great Ape Project (London: St. Martin's Griffin, 1993), pp. 58–77].
36. Gordon Gallup Jr., ''Self-Recognition in Primates,'' American Psychologist 32 (1977), pp. 329–338.
37. See J. Bennett, ''Thoughtful Brutes,'' Proceedings and Addresses of the American Philosophical Association 62 (1988), pp. 197–210.
38. Lynne R. Baker, Persons and Bodies (Cambridge: Cambridge University Press, 2000), Chapter 3.
THE ESSAYS
for those interested in animal minds, and might also have important
implications for the moral status of languageless animals. Bermudez
notes, however, that the types of cognitive activity ruled out by his
argument are more limited than they might immediately seem.
For example, by differentiating between two types of desire, goal-
desires and situation-desires, Bermudez shows it is possible for non-
linguistic creatures to have a type of knowledge regarding the
psychological states of their conspecifics. This knowledge, he
explains, allows non-linguistic creatures to engage in a primitive
form of psychological explanation, and allows them to predict the
behavior of others with some success. Whatever type of reasoning is
involved here will not, however, count as logical reasoning. Logical
reasoning involves the deployment of logical concepts, which, Bermudez explains, requires thoughts about thoughts and therefore depends on language. Thus, non-linguistic animals are incapable of
logical reasoning. Yet, Bermudez argues, it may be possible to
identify other forms of reasoning, ones that can be explained without
assuming the animal (or prelinguistic infant) is using logical concepts.
He describes for us what this non-logical reasoning might involve.
Thanks are due to the contributing authors for showing the importance of various issues concerning animal mentality to the field of animal ethics. These essays are certainly of value for those trying to
understand the moral status of nonhuman animals. And assuming it
is true, in general, that one's moral features are a function of one's
mentality, the following discussions may shed light on the moral
status of human animals as well.
Department of Philosophy
San Diego State University
San Diego, CA, 92182 USA
E-mail: [email protected]