Chapter 21
Mario Bunge and Contemporary
Cognitive Science
Peter Slezak
Abstract Bunge’s writings on the mind-body problem (Bunge 1980, 1991, 2010)
provide a rigorous, analytical antidote to the persistent anti-materialist tendency
that has characterized the history of philosophy and science. Bunge suggests that
dualism can be neutralized “with a bit of philosophical analysis” (Bunge 1991) but
this is clearly too optimistic in view of the recent revival of dualism as a respectable
doctrine despite a vast industry of philosophical analysis. The conceivability of
zombies (Chalmers 1996) is held to lead to the possibility of dualism and thereby to the
falsity of materialism. Bunge relies on his general case that “arguably all the
factual (“empirical”) sciences only study concrete (or material) entities, from
photons to rocks to organisms to societies” (Bunge 2010). Bunge’s immunity to
philosophical extravagance is to be commended, but he is perhaps like someone
who rejects Zeno’s paradoxes as physical absurdities and thereby leaves the puzzle
itself untouched. While philosophers need to be cured of their paradoxes, perhaps
Bunge’s strategy of just getting on with real scientific inquiry is, after all, the best
approach.
Bunge has mastery of an improbably vast range of disciplines. His writings provide
a manifesto for the materialist conception of mind and body, placing ideas in their
rich historical context. Few philosophers have written in such depth and breadth
as we see in Bunge’s work, and there is much to cheer about for someone who shares
Bunge’s views and prejudices. However, the very dazzling comprehensiveness and
deftness of Bunge’s analyses often mean a correlative lack of sufficiently detailed
argumentation for their many insights, bold claims and criticisms. Accordingly, I have
focused my following comments selectively on Bunge’s more controversial claims
in order to engage with them seriously as his eminent status deserves.
P. Slezak
School of Humanities and Languages, University of New South Wales, Sydney, Australia
e-mail: [email protected]
21.1 Functionalism, Behaviourism, Dualism
Following an erudite discourse on the various kinds of matter (physical, chemical,
living, thinking, social and artificial), it is difficult to dispute Bunge’s jaundiced
observation: “The reader is invited to compare this rich crop to the contributions
made by metaphysicians to both their own discipline and to science during the
same period” (Bunge 2010, pp. 83–84). Indeed, despite recent heroic efforts (Stoljar
2017) to demonstrate progress in philosophy, a wide consensus among professional
philosophers since Russell (1912) concurs with Bunge’s scepticism (Slezak 2018).
Nevertheless, a difficulty arising from Bunge’s provocative critiques is that not
all the targets of his admitted “bashings” (2010, p. xi) are equally deserving. For
example, Bunge suggests “most contemporary philosophers of mind are indifferent
to psychology, or are remarkably uninformed about it” (2010, p. ix). This charge
cannot be sustained today in light of the work of such eminent philosophers as
Stich, Fodor, Cummins, Dennett, the Churchlands, Thagard, Nersessian, Bechtel, Egan,
Bickhard, Elster and dozens of others. Bunge’s diagnosis is quite misplaced, as, for
example, when he suggests that in the philosophy of mind “few of its practitioners
bother to keep up to date with the science of mind” (Bunge 2010, p. x).
Such sweeping ad hominem pronouncements lead Bunge to hand-waving dismissals of important, subtle doctrines. Thus, for example, he says “the functionalist
view of the mind, favoured by most contemporary philosophers of mind, should
be dropped as being both scientifically shallow and medically hazardous” (Bunge
2010, p. 155).
Functionalism in philosophy of mind is the doctrine that a mental state is
identified not on the basis of its material composition, but on the basis of its
functional role in the system of which it is a part. The doctrine may be seen in
Aristotle, who begins his De Anima by asking:
A . . . problem presented by the affections of soul is this: are they all affections of the
complex of body and soul, or is there any one among them peculiar to the soul by itself? To
determine this is indispensable but difficult. (Aristotle 1941, p. 536)
Bunge asks the same question: “Are mind and body two separate entities?”
(Bunge 1980, p. xiii). As Bunge himself notes, Aristotle recognized that the question
whether body and soul are one is as misleading as the question of whether the wax
and its shape are one (Bunge 1980).
The modern version of functionalism derives from insights into computational
or abstractly specifiable states that are not intrinsically dependent on their composition or realization. An early statement of the view was Fodor’s Psychological
Explanation (Fodor 1968) in which he acknowledged particular inspiration from
Chomsky’s (1965) highly idealized, mathematical models of linguistic competence.
Functionalism is therefore neutral between materialism and dualism, because mental states are identified by their abstractly specified roles rather than by their substance or by any of their multiple possible realizations. The case for materialism then becomes the general scientific case that physical systems are the only possible realizations of the relevant functional roles. Dualism is ruled out on this conception because there
are no scientific grounds for believing that an immaterial substance is a possible
realization for the relevant functional states.
Although the conception of the physical realization has changed, the mind-body
problem has remained essentially the same for centuries. The functionalist solution
is captured loosely in the familiar distinction between hardware and software in a
computer. That is, the mind is to be understood in a literal sense as the software
rather than the hardware of the brain. Pylyshyn’s (1984) landmark Computation
and Cognition is perhaps the locus classicus of this doctrine that takes computation
to be more than merely a metaphor and, rather, when appropriately spelled out, a
literal account of information processing in the brain.
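The software/hardware analogy can be made concrete with a toy sketch of my own (the state and stimulus names are invented for illustration, not an example drawn from Bunge or Pylyshyn). On the functionalist view, a mental state is specified purely by its role in a transition table, and two quite different realizations of that table have exactly the same functional profile:

# Functional specification: each state is identified by its role, i.e. by which
# stimuli send it to which successor states and responses, not by what realizes it.
SPEC = {  # (state, stimulus) -> (next_state, response)
    ("calm", "injury"):  ("pain", "wince"),
    ("pain", "injury"):  ("pain", "groan"),
    ("pain", "aspirin"): ("calm", "sigh"),
    ("calm", "aspirin"): ("calm", "shrug"),
}

class SymbolicRealization:
    """One 'hardware': states are strings and transitions a direct table lookup."""
    def __init__(self):
        self.state = "calm"
    def step(self, stimulus):
        self.state, response = SPEC[(self.state, stimulus)]
        return response

class NumericRealization:
    """A different 'hardware': states are integers, and the same abstract table
    is consulted through a numeric encoding. The functional roles are identical."""
    CODE = {"calm": 0, "pain": 1}
    NAME = {0: "calm", 1: "pain"}
    def __init__(self):
        self.state = 0
    def step(self, stimulus):
        next_name, response = SPEC[(self.NAME[self.state], stimulus)]
        self.state = self.CODE[next_name]
        return response

history = ["injury", "injury", "aspirin", "aspirin"]
a, b = SymbolicRealization(), NumericRealization()
# Identical input-output profiles: on the functionalist view the two systems
# pass through the same mental states despite their different realizations.
assert [a.step(s) for s in history] == [b.step(s) for s in history]

Dualism would enter only as the further claim that an immaterial substance could also realize such a table; the scientific case for materialism is that, as far as we know, only physical systems do.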
To be sure, functionalism remains open to a variety of philosophical criticisms
but cannot be dismissed in Bunge’s manner (Levin 2010). Characterizing modern
information processing psychology as “brainless cognitive science,” Bunge suggests
that, seen in historical perspective, “computationalism is a sophisticated version of
behaviourism” (Bunge 2010, p. 227). On the face of it, this is a surprising claim
because the ‘Cognitive Revolution’ and its computational paradigm have been seen as
emerging from the downfall of behaviourism (Gardner 1987) once internal, mental
representations and processes became legitimate theoretical constructs again. However, Bunge (2010) explicitly identifies computationalism with the behaviourism it
has displaced. He offers the following schematic diagram to illustrate the parallel:
(a) Classical behaviourism
Stimulus ➔ Black box ➔ Response
(b) Computationalism
Stimulus ➔ Program ➔ Readout
However, the “program” in computational models of cognition cannot be compared with the “Black Box” of behaviourism precisely because the “program”
constitutes the theoretical postulation of internal representations that were eschewed
by behaviourism. The ‘box-and-arrow’ models and formalisms of computational
theories are precisely hypotheses about mental events and processes that were taboo
for Skinner on the grounds that they must be question-begging.
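The contrast can be made vivid with a toy sketch of my own (the stimuli, responses and class names are hypothetical, not drawn from Bunge or from any actual cognitive model): the behaviourist “black box” pairs each stimulus directly with a response, whereas even the simplest computational model interposes an explicit internal representation on which its responses depend.

from typing import Dict

class BehaviouristBlackBox:
    """Stimulus -> response pairings only; nothing inside is theorized about."""
    def __init__(self, table: Dict[str, str]):
        self.table = table
    def respond(self, stimulus: str) -> str:
        return self.table.get(stimulus, "no response")

class ComputationalModel:
    """The 'program' is a hypothesis about internal representations:
    here, a toy belief store that mediates between stimulus and response."""
    def __init__(self):
        self.beliefs: Dict[str, str] = {}                       # internal representation
    def respond(self, stimulus: str) -> str:
        if stimulus.startswith("the ball is in the "):
            self.beliefs["ball"] = stimulus.rsplit(" ", 1)[-1]  # update the representation
            return "noted"
        if stimulus == "where is the ball?":
            return self.beliefs.get("ball", "I don't know")
        return "no response"

box = BehaviouristBlackBox({"where is the ball?": "I don't know"})
agent = ComputationalModel()
agent.respond("the ball is in the kitchen")
print(box.respond("where is the ball?"))    # always the same: fixed by the stimulus alone
print(agent.respond("where is the ball?"))  # "kitchen": depends on the internal state

The point is not that such a toy is psychologically adequate, but that what goes on in the middle is itself the content of the theory – exactly what Skinner’s strictures forbade.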
Daniel Dennett has been foremost in elucidating the significance of AI and
computation for the philosophy of mind. For example, he has shown how modern
computational approaches can avoid the difficulty that Skinner posed – the apparent
paradox that psychology with homunculi seems circular, whereas
psychology without homunculi seems empty. The successive decomposition of
tasks “discharging the loan on intelligence” with progressively stupider homunculi
shows how psychological theories may posit internal states without falling afoul of
Skinner’s worries. Dennett explains that AI is a form of psychological theorising
adopting the “Intentional Stance” and differing from traditional philosophy in that
“the AI worker pulls his armchair up to a console” (Dennett 1978a, p. 58).
AI shares with philosophy (in particular, with epistemology and philosophy of mind) the
status of most abstract investigation of the principles of psychology. But it shares with
psychology in distinction from philosophy a typical tactic in answering its questions.
(Dennett 1978a, p. 60)
In a seeming inconsistency, on the one hand, as we have just noted, Bunge takes
computationalism to be behaviourist but, on the other hand, he also takes high-level
computational theories to be dualist. For example, Bunge suggests that “advocates
of this view [linguistic naturalism or biolinguistics] have adopted Cartesian mind-body dualism” (Bunge 2010, p. 112), but this is, at best, a caricature of Chomsky’s
program (see Jenkins 2001; Di Sciullo and Boeckx 2011) for which Bunge gives no
argument or justification. Bunge thinks that inquiry that stays at the higher level of
mental phenomena rather than neuroscience and “objective brain facts” is evidence
that “psycho-neural dualism prevails” (Bunge 2010, p. 154).
This might be charitably understood as simply characterising the distinction
between different levels of analysis rather than the usual Cartesian connotation
of the term “dualism,” but Bunge appears to intend the latter pejorative reading.
This reading is further suggested in his warning “that the dualist philosophies of
mind are hazardous to mental health because they divert the researcher’s and the
therapist’s attention from the brain to an immaterial and therefore inaccessible
item” (Bunge 2010, p. 155). On the contrary, the biolinguistics program has
been explicitly conceived as integrating high level theories with their biological,
neurological substrate.
Fodor (1968) and Pylyshyn (1984) articulated the rationale for a high-level
inquiry with its proprietary vocabulary and explanatory principles, just as we see
elsewhere in science. One can fully acknowledge the reality of many phenomena
without falling into dualism or ascribing them to the basic, elementary physical
constituents of the world. As Fodor (1989) has argued in countering the ailment of
“epiphobia,” rivers, sails and mountains are no less real for not figuring in basic
physics – Bunge’s own reason for being a materialist rather than a physicalist, as he
uses these terms. Indigestion, inflation and other kinds of “being” in good standing
are surely real without being properties of elementary particles.
Thus, for example, whatever may be its other failings, economics is not
committed to dualism or occult entities by virtue of seeking generalizations above
the level of the individuals who make up the economic system. This was, of
course, Durkheim’s (1898) famous conception of social facts as “things” – widely,
but unjustly, seen as a kind of mysticism, yet in fact an anticipation of Fodor’s
(1989) criticism of “epiphobia,” the fear of postulating theoretical entities.
In this spirit, Chomsky has referred to his abstract idealizations as adopting a
‘Galilean’ approach to science (see Pylyshyn 1972, 1973). Chomsky writes:
. . . we are keeping to abstract conditions that unknown mechanisms must meet. We might
go on to suggest actual mechanisms, but we know that it would be pointless to do so in the
present stage of our ignorance concerning the functioning of the brain. . . . If we were able
to investigate humans as we study other, defenceless organisms, we might well proceed to
inquire into the operative mechanisms . . . (Chomsky 1980, p. 197)
Chomsky’s functionalist view was unmistakable in his Aspects, where he wrote:
The mentalist . . . need make no assumptions about the possible physiological basis
for the mental reality he studies. . . . One would guess . . . that it is the mentalistic
studies that will ultimately be of greatest value for the investigation of neurophysiological
mechanisms, since they alone are concerned with determining abstractly the properties that
such mechanisms must exhibit and the functions they must perform. (Chomsky 1965, p.
193, fn. 1)
Of course, philosophers are not the only ones tempted by the barren doctrine.
As Bunge notes, some of the foremost neuroscientists have been avowed dualists,
including Sherrington, Penfield, Sperry and Eccles. Thus, Bunge’s advice that researchers
should stick to the brain is questionable if the alternative is not an avowed Cartesian
ontological dualism but only high-level, top-down theory. The virtues of such an
‘Intentional Stance’ (Dennett 1989) or semantic, knowledge-level (Newell 1990)
analysis above the level of cognitive architecture have been seen as providing
the rationale for the enterprise of cognitive psychology. The approach is seen
paradigmatically in Marr’s (1982) distinction between the levels of computation,
algorithm and implementation, corresponding roughly to Chomsky’s competence-performance distinction. Only the implementation or “realization” level is concerned directly with the physical, neural substrate that Bunge appears to insist on as
the only respectable level of analysis.
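Marr’s distinction can be illustrated with a deliberately trivial sketch of my own (not Marr’s example): the computational level specifies what is computed – here, the median of a data set – while distinct procedures realize that specification at the algorithmic level, and neither level says anything about the implementation, whether in silicon or in neurons.

# A toy illustration (my own, not Marr's example) of the computational /
# algorithmic / implementation distinction. The computational level fixes
# *what* is computed: the median, i.e. the middle value of the data.
from typing import List

def median_by_sorting(xs: List[float]) -> float:
    """Algorithmic level, procedure 1: sort the data and take the middle."""
    ys = sorted(xs)
    n = len(ys)
    return ys[n // 2] if n % 2 else (ys[n // 2 - 1] + ys[n // 2]) / 2

def kth_smallest(xs: List[float], k: int) -> float:
    """Return the k-th smallest element (0-indexed) by counting comparisons."""
    for x in xs:
        below = sum(y < x for y in xs)
        equal = sum(y == x for y in xs)
        if below <= k < below + equal:
            return x
    raise IndexError("k out of range")

def median_by_selection(xs: List[float]) -> float:
    """Algorithmic level, procedure 2: select the middle rank(s) without sorting."""
    n = len(xs)
    if n % 2:
        return kth_smallest(xs, n // 2)
    return (kth_smallest(xs, n // 2 - 1) + kth_smallest(xs, n // 2)) / 2

# Two different algorithms, one computational-level specification; the
# implementation level (Python VM, silicon, or neurons) is not mentioned at all.
data = [7.0, 1.0, 5.0, 3.0, 9.0]
assert median_by_sorting(data) == median_by_selection(data) == 5.0

It is this autonomy of the higher levels that the argument above appeals to, without any commitment to Cartesian substances.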
21.2 Materialism, Physicalism and Dualism
Bunge declares “I am an unabashed monist” and “I am a materialist but not a
physicalist” (2010, p. vii). These latter terms have often been used interchangeably
and so it is important to understand Bunge’s specific meaning. By “physicalist”
Bunge means someone who holds that the laws of physics are explanatory for
all phenomena. Bunge explains that his own expertise as a physicist led him to
appreciate “that physics can explain neither life nor mind nor society” nor “chemical
reactions, metabolism, color, mentality, sociality, or artifact” (Bunge 2010, p. vii).
Bunge’s mission is “to reunite matter and mind” at a time when this materialist
message is more timely than it would have been a decade or two earlier. Bunge’s
writings on the mind-body problem (Bunge 1980, 1991, 2010) are intended to
provide an antidote to a persistent anti-materialist tendency that has characterized
the history of philosophy and science. He concludes:
. . . psychoneural dualism is worse than barren: it is an obstacle to the advancement
of science and medicine. Fortunately, this obstacle can easily be removed with a bit of
philosophical analysis. (Bunge 1991, p. 520)
This remark is clearly too optimistic in view of the revival of dualism as a
respectable doctrine despite a vast industry of philosophical analysis. The materialist
orthodoxy of the mid-twentieth century has been eroded and dualism has, indeed,
regained a certain respectability (Chalmers 1996). However, besides taking a
passing swipe (Bunge 2010, p. 177), Bunge does not address the principal arguments
based on the ‘Method of Conceivability’ and zombies (see also Kirk 2005).
Furthermore, there is a certain irony in the fact that Bunge’s criticism of
functionalist theories places him in the same camp as leading critics of materialist
theories such as Strawson (2006, 2018) who charge functionalism with leaving
out the essentially subjective, first person, phenomenal, qualitative features of
experience. For example, Strawson (2006, 2008) holds materialist philosophers to
be guilty of “the silliest view ever held by any human being” (Strawson 2008, p.
8). He construes the Lucretius world-view of Dennett and others as a grievous error
exceeding the implausibility of “every known religious belief.” He says,
For this particular denial is the strangest thing that has ever happened in the whole history
of human thought, not just the whole history of philosophy. It falls, unfortunately, to
philosophy, not religion, to reveal the deepest woo-woo of the human mind. (Strawson
2006, pp. 5–6)
Albeit for different reasons, Bunge joins Strawson (2006) and Searle (1997) in
heaping scorn on philosophical adversaries such as Dennett who are said to deny the
most obvious reality of their own experience. Searle parodies the title of Dennett’s
book as Consciousness Denied instead of Consciousness Explained. If Searle is
right, Dennett has managed an intellectual achievement that Descartes showed
to be impossible. This suggests that these rhetorical features of the debate about
consciousness are not irrelevant matters of polemical style but rather symptoms of
the peculiarity of the views at stake.
Thus, Searle charges materialists with making “stunning mistakes” (Searle 1992,
p. 246) and “saying things that are obviously false” (Searle 1992, p. 247). Block
(1990, p. 129), too, suggests that Dennett’s (1991) book would be more aptly titled
Consciousness Ignored. “Such authors pretend to think that consciousness exists,
but in fact they end up denying its existence” (Searle 1992, p. 7). Searle (1997)
includes Armstrong (1968, 1980) among such deniers. Searle writes acidly “I regard
Dennett’s denial of the very existence of consciousness not as a new discovery or
even as a serious possibility but rather as a form of intellectual pathology” (Searle
1997, p. 112). Since these accusations are directed at our foremost philosophers, we
are confronted with a peculiar situation that deserves attention as something more
than mere ad hominem rhetorical excess.
21.3 Fantasy Worlds
Despite the difficulties of articulating a version of materialism that is immune from
philosophical objections, from the 1950s there was a broad consensus on materialism,
with progress from early ‘Identity’ versions to the more recent ‘functionalist’
accounts (Fodor 1968). Thus, for example, Thomas Nagel (1965) originally avowed
an intellectual commitment to materialism as the only scientifically respectable
account while confessing a psychological discomfort with it: before his apostasy, Nagel took it as a datum of subjective experience that materialism has a deep, intuitive, introspective implausibility, one that is independent of its overwhelming systematic merits as a true scientific, philosophical thesis. Bunge makes a typically acerbic remark: “responsible people do not mistake
conceptual possibility, or conceivability, for factual possibility or lawfulness; and
they do not regard the ability to invent fantasy worlds as evidence for their real
existence” (Bunge 2010, p. 177).
It is perhaps understandable that philosophers will elevate their only research tool
to a pre-eminent status as a guide to metaphysical possibilities, but the tendency
gives grounds for Bunge’s jaundiced view of their discipline. Indeed, the balance
has become reversed, with intuitions coming to dominate systematic scientific
considerations. Thus, Nagel has been among those who have shifted their allegiance
to various forms of ‘Mysterianism’ (McGinn 1989), outright dualism or even
panpsychism (Strawson 2006). Of this latter doctrine, in a related context, Bunge
remarks that it “illustrates the cynical principle that, given an arbitrary extravagance,
there is at least one philosopher capable of inventing an even more outrageous one”
(Bunge 2010, p. 167).
This is the sentiment that Descartes had expressed in his Discourse, remarking
that “nothing can be imagined which is too strange or incredible to have been
said by some philosopher” (1637/1985, p. 118). The modern shift in the consensus
has been due, not to any new scientific revelations that would provide grounds for
doubting the broad materialist picture but rather to the increased weight placed on
philosophical intuitions. Despite having been elevated to the status of an official
“Conceivability Argument” (Stoljar 2001) or “Method of Conceivability” (Chalmers
2002), it is difficult to find anything other than a bare description of the faculty itself
as “a kind of rational intuition or intellectual presentation of a possibility: a clear and
distinct idea” (Stoljar 2001, p. 393). Indeed, as Levine puts it, “The conceivability of
zombies is . . . the principal manifestation of the explanatory gap” (Levine 2001, p.
79). Yablo, too, notes that for Chalmers (1996), “Almost everything in The Conscious
Mind turns on a single claim . . . that there can be zombie worlds” (Yablo 1999, p.
455). The conceivability of zombies is held to lead to the possibility of dualism and thereby
to the falsity of materialism.
Bunge suggests that this argument “does not even distinguish between conceptual
and physical possibility” (Bunge 2010, p. 23) and, although this is not accurate,
it captures something important about the extravagance of such contemporary
philosophy, which Bunge characterizes as “just jeux d’esprit” (Bunge 2010, p. 23).
Current arguments concerning the conceivability of zombies and the “explanatory
gap” are little more than unwitting, often verbatim, rehearsals of Descartes’ own
reasoning in his Meditations. However, John Cottingham has bluntly remarked
that Descartes’ argument from conceivability to dualism “is, or ought to be,
regarded as one of the most notorious nonsequiturs in the history of philosophy”
(Cottingham 1992, p. 242). The alleged metaphysical, ontological implications
of conceivability intuitions have become a central topic of philosophical debate
(Gendler and Hawthorne 2002), but the question to be considered is, in Loar’s words
echoing Bunge, whether “we have managed to break out” (Loar 1999) of purely
conceptual premises to metaphysical conclusions.
In fairness to Descartes, it is worth noting that he had good, essentially scientific
reasons for his dualism and even his cogito meditations have an important logical
structure that has not been properly recognized (see Slezak 1988, 2010). The
criticism of Cottingham and Loar applies more to modern dualists than to Descartes
himself.
21.4 Exorcising the Ghost or the Machine?
Regarding the thesis of materialism, Bunge makes the important and perhaps surprising point that “there is no generally accepted concept of matter” and, therefore,
“We do not have a generally accepted materialist theory of mind” (Bunge 2010,
p. xvii). Although Bunge is harshly critical of Chomsky’s views on various issues
concerning language, on this question Chomsky has made a similar point. Chomsky
(2000) appears to undermine the entire philosophical mind-body enterprise as it has
been traditionally conceived. Chomsky makes the surprising suggestion that the
problem cannot even be formulated coherently. Whereas the problem is universally
seen as the mystery of the mind and how it might be explained in material terms,
Chomsky reverses the puzzle as one about the body. He suggests that since Isaac
Newton “the theory of body was demonstrated to be untenable” (Chomsky 2000,
p. 84). Ironically, he notes “Newton eliminated the problem of “the ghost in the
machine” by exorcising the machine; the ghost was unaffected” (Chomsky 2000,
p. 84).
With this development, “the mind-body problem disappeared, and can be
resurrected, if at all, only by producing a new notion of body (material, physical
etc.) to replace the one that was abandoned” (Chomsky 2000, p. 84). For this reason,
there is generally no reduction of one science to another but rather the reducing
science changes to permit unification of the previously recalcitrant theory. This is a
remarkable analysis of the puzzle of consciousness that turns everything on its head.
If Chomsky is right, and Bunge would appear to concur, there is no more a mind-body problem than there was a valence-atom problem or an electricity-matter problem.
Chomsky writes: “the traditional mind-body problem became unformulable with
the disappearance of the only coherent notion of the body (physical, material, and
so on)” (Chomsky 2009, p. 189).
While Bunge recognizes that commitment to computationalism is the central dogma of
modern cognitive science, his critique seems to miss its mark here. He writes:
. . . computers are not exactly natural. Worse, unlike live human brains, they are limited to
performing algorithmic operations. They lack spontaneity, creativity, insight (intuition), the
ability to feel emotions, and sociality. Indeed, computers have to be programmed; there can
be no programs for coming up with original ideas . . . (Bunge 2010, p. 110)
Elsewhere Bunge explains further, “And, of course, by definition of “original,” an
original design is one that has never been described before – that is, one that is so far
unknown” (Bunge 2010, p. 228). Ironically, given his own harsh criticism of social
constructivists, here Bunge falls into the error seen notoriously in Brannigan (1981)
who sees creativity and originality in science as a matter of social achievement
and priority as if this precluded rational, cognitive, intellectual processes. Bunge
cites exactly the same social notion of originality and thereby entirely side-steps
the key question of whether computers can do what we do when we make original
inventions or discoveries regardless of whether they happen to have been anticipated
in an uninteresting sociological sense.
21.5 Programming Original Creativity
On the more fundamental issue, Bunge’s critique of computer creativity and
originality is the so-called “Lady Lovelace” objection addressed in Alan Turing’s
classic article ‘Computing Machinery and Intelligence’ (Turing 1950). Aside from
issues of principle, the empirical facts refute the charge that computer algorithms are
incapable of the originality, spontaneity and creativity that humans display. The AI programs
of Newell and Simon (1972), developed further by Langley et al. (1987; see Slezak
1989), are an existence proof of original, creative scientific discovery by computer.
For example, the BACON program has discovered Boyle’s Law and Kepler’s Law
from observational, numerical data, a result that deserves to be regarded as original
in the relevant sense that it was not programmed in advance but found by heuristic
problem-solving methods that are essentially the methods that underlie human
creative thought.
Bunge’s suggestion that computers “are limited to performing algorithmic
operations” (Bunge 2010, p. 110) fails to recognize the crucial distinction between
algorithmic and heuristic problem solving as developed by Newell and Simon
(1972). Although even heuristic methods are strictly algorithmic by virtue of being implemented as computer programs, they apply techniques of problem solving that are not guaranteed to find a solution, since many interesting problems are not susceptible to guaranteed algorithmic solution.
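The flavour of such heuristic, non-guaranteed search can be conveyed with a minimal sketch in the spirit of BACON’s simplest heuristics (a toy reconstruction of my own, not the actual program of Langley et al., with rounded planetary figures): if the current term is roughly constant across the observations, conjecture a law; if it increases with the previous term, form their ratio; if it decreases as the previous term increases, form their product.

def direction(xs, ys):
    """+1 if ys strictly increase with xs, -1 if they strictly decrease, else 0."""
    ys_by_x = [y for _, y in sorted(zip(xs, ys))]
    if all(a < b for a, b in zip(ys_by_x, ys_by_x[1:])):
        return +1
    if all(a > b for a, b in zip(ys_by_x, ys_by_x[1:])):
        return -1
    return 0

def nearly_constant(vals, tol=0.01):
    """Heuristic test of invariance: relative spread below tol."""
    mean = sum(vals) / len(vals)
    return (max(vals) - min(vals)) / abs(mean) < tol

def bacon_search(prev_name, prev, cur_name, cur, max_steps=10):
    """Form ratios or products of successive terms until one is invariant."""
    for _ in range(max_steps):
        if nearly_constant(cur):
            return cur_name, cur                 # conjectured law
        trend = direction(prev, cur)
        if trend == +1:                          # increase together: try the ratio
            new_name = f"({cur_name}/{prev_name})"
            new = [c / p for c, p in zip(cur, prev)]
        elif trend == -1:                        # opposite trends: try the product
            new_name = f"({cur_name}*{prev_name})"
            new = [c * p for c, p in zip(cur, prev)]
        else:
            return None                          # the heuristic gives no guidance
        prev_name, prev, cur_name, cur = cur_name, cur, new_name, new
    return None

# Approximate orbital radius (AU) and period (years): Mercury, Venus, Earth, Mars.
r = [0.387, 0.723, 1.000, 1.524]
T = [0.241, 0.615, 1.000, 1.881]
law, values = bacon_search("r", r, "T", T)
print(law)      # a term algebraically equivalent to T**2 / r**3
print(values)   # ~[1.00, 1.00, 1.00, 1.00]: constant across the planets

Run on these data, the search terminates at a term algebraically equivalent to T²/r³ – Kepler’s third law – although nothing in the code mentions that law, which is the relevant sense in which the result is found rather than programmed.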
Apart from such actual developments in AI, Bunge’s argument overlooks the fact
that human beings are also strictly subject to programming in the sense that we are
deterministic machines whose brains are a complex combination of inheritance and
learning – all forms of programming, albeit not by a human or other independent
intelligence. Unless human originality is ascribed to some mysterious, inexplicable
indeterministic source, our own creative discoveries must also be due to describable
cognitive processes that are ultimately products of the information processing
physically embodied in the brain. On Bunge’s own materialist view of human minds,
he must be committed to just such a view of originality and creativity as describable,
that is, programmable.
21.6 Plato’s Problem: Nature or Nurture?
Perhaps the most sophisticated computational theory of mental phenomena is
Chomsky’s generative grammar of language, which Bunge acknowledges to be a
“naturalization project” (Bunge 2010, p. 112). However, far from being a kind of
Cartesian dualism (Bunge 2010, p. 122), the generative program is a vindication
of materialism by showing how a physical, biological system might embody
the special properties of language such as compositionality and recursiveness.
Moreover, Bunge’s response to Chomsky’s claims for the innateness of Universal
Grammar (UG) does not address the fundamental grounds for the claim – the idea
that the human brain has an initial state that includes the principles underlying all
humanly learnable languages and is, therefore, a species-specific aspect of the human
genetic endowment. Purporting to answer Chomsky, Bunge asserts “all knowledge
is learned” (Bunge 2010, p. 166). Bunge complains:
Unfortunately no one has bothered to explicitly state the rules of UG, and geneticists have
not found the presumptive UG gene(s). Nor is there any reason to expect such findings, for
languages are highly conventional . . . . (Bunge 2010, p. 112)
First, Bunge overlooks the fact that Chomsky’s claim at this level of generality
should be uncontroversial because even the most extreme empiricist or behaviourist
must agree that something is innate to permit language “learning” at all. The only
question at issue is how much. Moreover, the “innateness” claim for language is
essentially the same as for other cognitive systems, such as vision in mammals, which
are not fully determined by genetically specified structures in the brain but are
partly fixed by innate factors and partly determined by experience.
More specifically, the “reason to expect such findings” includes the “poverty of
the stimulus” argument or what Chomsky refers to as “Plato’s Problem” – namely,
that of explaining how we can know so much on the basis of so little data. Bunge
asserts “there is no evidence whatsoever that anything learnable is encoded in the
genome” (Bunge 2010, p. 183), but he doesn’t address the persuasive evidence from
acquisition of the complex structures of language in all children by the age of three
without effort, without instruction, without adequate evidence. This phenomenon
is quite different from “learning” and rather typical of biologically determined
maturation along a pre-determined course of development, triggered by experience
but not learned from it.
And, of course, despite Bunge’s assertion that “no one has bothered to explicitly
state the rules of UG,” modern generative linguistics in the ‘Minimalist’ program
is engaged in stating precisely those rules, conjectured provisionally, as always,
as in any other branch of empirical science. Bunge makes the surprising remark
that “idealists like Chomsky and his followers ignore empirical linguistics” (Bunge
2010, p. 136), by which he means “real speakers and linguistic communities” rather
than “abstract systems” (Bunge 2010, p. 136).
In a certain sense Bunge is right, but this is merely the “Galilean” approach to
science that Chomsky has championed on the basis of his famous “competence –
performance” distinction (see Pylyshyn 1973; Slezak 2014). Galileo ignored the
empirical evidence of real pendulums and real projectiles in favour of mathematically
idealized “abstract systems” for the same reason. Bunge complains that “brainless
psychology” can only describe mental phenomena but not explain them “because
genuine explanation involves revealing mechanisms” (Bunge 2010, p. 159). However, this would rule out Newton’s law of gravitation and Kepler’s law of elliptical
planetary orbits, inter alia, which famously did not reveal underlying mechanisms.
Like Marr’s (1982) computational model of vision, Chomsky’s competence model
of tacit knowledge abstracts and idealizes from underlying mechanisms and thereby
provides the most fruitful approach to ultimately discovering them.
21.7 Conclusion
Bunge’s work is exhilarating and exasperating at the same time. Its characteristic scope, insight and erudition are joined with a refreshing impatience for the
many varieties of nonsense in the academy. I confess to considerable sympathy for
Bunge’s jaundiced view of those “professors who play parlour games instead of
tackling serious problems” (Bunge 2010, p. 11) and I share his conviction that much
modern philosophy is guilty of this kind of lapse. However, as stated at the outset,
there are points where his insights, bold claims and criticisms need much finer
detail and attention to contemporary literature in philosophy of mind and cognitive
science.
References
Aristotle (1941). De Anima. In R. McKeon (Ed.), The basic works of Aristotle (pp. 535–561). New
York: Random House.
Armstrong, D. M. (1968). A materialist theory of mind. London: Routledge & Kegan Paul.
Armstrong, D. M. (1980). The nature of mind. Sydney: University of Queensland Press.
Block, N. (1990). Consciousness ignored. Review of Daniel Dennett, Consciousness explained. The
Journal of Philosophy, 181–193.
Brannigan, A. (1981). The social basis of scientific discoveries. Cambridge: Cambridge University
Press.
Bunge, M. (1980). The mind-body problem: A psychobiological approach. New York: Pergamon
Press.
Bunge, M. (1991). A philosophical perspective on the mind-body problem or, why neuroscientists
and psychologists should care about philosophy. Proceedings of the American Philosophical
Society, 135(4), 513–523.
Bunge, M. (2010). Matter and mind: A philosophical inquiry. Boston studies in the philosophy of
science volume 287. Dordrecht: Springer.
Chalmers, D. (1996). The conscious mind: In search of a fundamental theory. Oxford: Oxford
University Press.
Chalmers, D. (2002). Does conceivability entail possibility? In T. Gendler & J. Hawthorne (Eds.),
Conceivability and possibility (pp. 145–200). Oxford: Oxford University Press.
Chomsky, N. (1965). Aspects of the theory of syntax. Cambridge, MA: MIT Press.
Chomsky, N. (1980). Rules and representations. New York: Columbia University Press.
Chomsky, N. (2000). New horizons in the study of language and mind. Cambridge: Cambridge
University Press.
Chomsky, N. (2009). The mysteries of nature: How deeply hidden? The Journal of Philosophy,
106(4), 167–200.
Cottingham, J. (1992). “Introduction” The Cambridge companion to Descartes. Cambridge:
Cambridge University Press.
Dennett, D. C. (1978a). Brainstorms. Vermont: Bradford Books.
Dennett, D. C. (1978b). Current issues in the philosophy of mind. American Philosophical
Quarterly, 15, 249–261.
Dennett, D. C. (1989). The intentional stance. Cambridge, MA: MIT Press.
Dennett, D. C. (1991). Consciousness explained. London: Penguin.
Descartes, R. (1637/1985). Discourse on the method. In The philosophical writings of Descartes,
volume 1 (J. Cottingham, R. Stoothoff & D. Murdoch, Trans.). Cambridge: Cambridge
University Press.
Di Sciullo, A. M., & Boeckx, C. (Eds.). (2011). The biolinguistic enterprise: New perspectives on
the evolution and nature of the human language faculty. Oxford: Oxford University Press.
Durkheim, E. (1898). Individual and collective representations. Revue de Métaphysique et de
Morale, vi, May. Reprinted in D.F. Pocock, 1974, translator, Sociology and Philosophy by
Emile Durkheim. New York: The Free Press.
Fodor, J. A. (1968). Psychological explanation. New York: Random House.
Fodor, J. A. (1989). Making mind matter more. Philosophical Topics, 17, 59–79. Reprinted in
Fodor, A theory of content and other essays ([DATE] pp. 137–160). Cambridge, MA: MIT
Press.
Gardner, H. (1987). The mind’s new science. New York: Basic Books.
Gendler, T., & Hawthorne, J. (Eds.). (2002). Conceivability and possibility. Oxford: Oxford
University Press.
Jenkins, L. (2001). Biolinguistics: Exploring the biology of language. Cambridge: Cambridge
University Press.
Kirk, R. (2005). Zombies and consciousness. Oxford: Oxford University Press.
Langley, P., Simon, H., Bradshaw, G., & Zytkow, J. (1987). Scientific discovery: Computational
explorations of the creative processes. Cambridge, MA: MIT Press.
Levin, J. (2010). Functionalism. In E. N. Zalta (Ed.), The Stanford Encyclopedia of Philosophy
(Summer 2010 Edition). http://plato.stanford.edu/archives/sum2010/entries/functionalism/.
Accessed 10 June 2018.
Levine, J. (2001). Purple haze: The puzzle of consciousness. Oxford: Oxford University Press.
Loar, B. (1999). David Chalmers’s the conscious mind. Philosophy and Phenomenological
Research, 59(2), 465–472.
Marr, D. (1982). Vision: A computational investigation into the human representation and processing of visual information. Cambridge, MA: MIT Press.
McGinn, C. (1989). Can we solve the mind-body problem? Mind, 98, 891.
Nagel, T. (1965). Physicalism. The Philosophical Review, 74, 336–356. Reprinted in J. O’Connor
ed., Modern Materialism. New York: Harcourt Brace & World.
Newell, A. (1990). Unified theories of cognition. Cambridge, MA: Harvard University Press.
Newell, A., & Simon, H. A. (1972). Human problem solving. New Jersey: Prentice-Hall.
Pylyshyn, Z. (1972). Competence and psychological reality. American Psychologist, 27, 546–552.
Pylyshyn, Z. (1973). The role of competence theories in cognitive psychology. Journal of
Psycholinguistic Research, 2(1), 21–50.
Pylyshyn, Z. (1984). Computation and cognition. Cambridge, MA: MIT Press.
Russell, B. (1912/1967). The problems of philosophy. London: Oxford University Press.
Searle, J. R. (1992). The rediscovery of the mind. Cambridge, MA: MIT Press.
Searle, J. R. (1997). The mystery of consciousness. London: Granta Books.
Slezak, P. (1988). Was Descartes a liar? Diagonal doubt defended. British Journal for the
Philosophy of Science, 39, 379–388.
Slezak, P. (1989). Scientific discovery by computer as refutation of the strong programme. Social
Studies of Science, 19(4), 563–600.
Slezak, P. (2010). Doubts about Descartes’ indubitability: The cogito as intuition and inference.
Philosophical Forum, 41(4), 389–412.
Slezak, P. (2014). Intuitions in the study of language: Syntax and semantics. In L. M. Osbeck &
B. S. Held (Eds.), Rational intuition: Philosophical roots, scientific investigations. Cambridge:
Cambridge University Press.
Slezak, P. (2018). Is there progress in philosophy? The case for taking history seriously.
Philosophy, 93(4), 529–555. https://doi.org/10.1017/S0031819118000232
Stoljar, D. (2001). The conceivability argument and two conceptions of the physical. Philosophical
Perspectives, 15: Metaphysics, 393–413.
Stoljar, D. (2017). Philosophical progress: In defence of a reasonable optimism. Oxford: Oxford
University Press.
Strawson, G. (2006). Consciousness and its place in nature. Exeter: Imprint Academic.
Strawson, G. (2008). Real materialism and other essays. Oxford: Clarendon Press.
Strawson, G. (2018). Things that bother me. New York: NYRB Press.
Turing, A. (1950). Computing machinery and intelligence. Mind, 59, 433–460.
Yablo, S. (1999). Concepts and consciousness. Philosophy and Phenomenological Research, 59(2),
455–463.