
The evolution of consciousness
Max Velmans, Goldsmiths, University of London, Lewisham Way, New Cross, London SE14
6NW.

Email: [email protected]; Website: http://www.gold.ac.uk/psychology/staff/velmans/


In D. Canter and D. Turnbull (eds.) Biologising the Social Sciences. Special Issue of Contemporary
Social Science: Journal of the Academy of Social Sciences (prepublication version)

Abstract. There have been various attempts to apply Darwinian evolutionary theory to an
understanding of the human condition within psychology and the social sciences. This paper
evaluates whether Darwinian Theory can explain human consciousness. Starting with a brief
definition of phenomenal consciousness and the central features of evolutionary theory, the paper
examines whether random variations in the genome that confer a selective, reproductive advantage
can explain both the emergence of consciousness and its varied forms. To inform the discussion, the
paper reviews what is known about the conditions for consciousness within the human mind/brain,
understood in both structural (neural) terms and functional terms (in terms of human information
processing), and concludes that “random variations in the genome” provide no explanatory
mechanism for why some neural activities (but not others) are accompanied by
consciousness. The paper then evaluates the many functional advantages that have been
proposed for various forms of phenomenal consciousness once they emerge, and concludes
that, on close examination, phenomenal experiences themselves do not carry out the
information processing functions attributed to them, which challenges the Darwinian
requirement that they could only have persisted (once emergent) if they enhanced
reproductive fitness. The paper turns finally to what can be said about the wider distribution of
consciousness in non-humans, contrasting discontinuity theories with continuity theories.
Discontinuity theories argue for a critical functional transition that “switches on
consciousness” while continuity theories argue for a gradual transition in consciousness from
unrecognisable to recognisable. All theories accept that there is an intimate, natural
relationship of conscious experiences with their associated material forms. Consequently, as
the material forms evolve, their associated experiences co-evolve—suggesting an indirect
mechanism by which the emergence of species-specific forms of consciousness can be
influenced by Darwinian evolution. It also allows a non-reductive understanding of human
consciousness within the social sciences.

INTRODUCTION

Darwinian accounts of the emergence and functions of consciousness

It has long been suspected that the emergence and functions of human consciousness can be
explained by Darwinian evolution. As Darwin's friend, the naturalist George Romanes argued in
1885,

“Is it not itself a strikingly suggestive fact that consciousness only, yet always, appears
upon the scene when the adjustive actions of any animal body rise above a certain level
of intricacy. ... Surely, this large and general fact points with irresistible force to the
conclusion, that in the performance of these more complex adjustments, consciousness
or the power of feeling or the power of willing are of some use. Assuredly on the
principles of evolution, which materialists at all events cannot afford to disregard, it
would be a wholly anomalous fact that so wide and important a class of faculties of
mind should have become developed in constantly ascending degrees throughout the
animal kingdom, if they were entirely without use to animals.” (cited in Vesey, 1970,
p. 182)

The notion that consciousness is necessary, or at any rate useful, in the performance of complex tasks, particularly when these are novel or require flexibility, is a recurring theme in subsequent psychological theory. Many such theories develop William James’ (1890) suggestion
that consciousness is associated with selective attention and primary memory. According to
James, the current contents of consciousness define the “psychological present” and are
contained in “primary memory”, a form of short-term working store; by contrast, the contents
of “secondary memory”, an unconscious long-term store, define the “psychological past”.
James also suggested that stimuli that enter consciousness are at the focus of attention, having
been selected from competing stimuli to enable effective interaction with the world.

While James associated conscious experiences with these forms of mental processing, from the
early 1970s many psychologists took it for granted that one could reduce conscious
experiences to such processing. Up to the early 1990s, for example, “preconscious” processing
was commonly identified with “pre-attentive” processing, whereas “conscious” processing was
identified with “focal-attentive” processing (e.g. Baars, 1991; Mandler, 1975, 1985, 1991;
Miller, 1962). Following Waugh & Norman’s (1965) re-introduction of James’ ideas into
psychological research, many psychologists also identified the contents of consciousness with
the contents of primary memory or some similar short-term working store (e.g. Norman, 1969,
Atkinson & Shiffrin, 1968, Mandler, 1975, and, in more detail, Ericsson & Simon, 1984, and Baddeley,
1993).

With such reductive identifications in place, various attempts were then made to spell out the
evolutionary advantages of consciousness in finer detail (See for example Mandler 1975, 1985,
1997, Crook, 1980, Dixon, 1981, Johnson-Laird, 1988, Baars, 1988, 2007, Shallice 1978, 1988,
Schacter, 1990). We evaluate such reductive identifications of consciousness with adaptive
functioning below. To begin with however, we need to return to first principles.

What is consciousness?

As with any term that refers to something that one can observe or experience, it is useful, if
possible, to begin with an ostensive definition, i.e. to "point to" or "pick out" the phenomena to
which the term refers and, by implication, what is excluded.

Normally we point to some thing that we observe or experience. The term “consciousness”
however refers to experience itself. Rather than being exemplified by a particular thing that we
observe or experience, it is exemplified by all the things that we observe or experience.
Something happens when we are conscious that does not happen when we are not
conscious—and something happens when we are conscious of something that does not
happen when we are not conscious of that thing. We know what it is like to be conscious when
we are awake as opposed to not being conscious when in dreamless sleep. We also know what
it is like to be conscious of something (when awake or dreaming) as opposed to not being
conscious of that thing.
This everyday understanding of consciousness based on the presence or absence of
experienced phenomena provides a simple place to start. A person, or other entity, is
conscious if they experience something; conversely, if a person or entity experiences nothing
they are not conscious. Elaborating slightly, we can say that when consciousness is present,
phenomenal content (consciousness of something) is present. Conversely, when phenomenal
content is absent, consciousness is absent. In modern consciousness studies and related brain
and social sciences consciousness understood in this way is commonly referred to as
phenomenal consciousness.

Problems for Darwinian accounts of the emergence and functions of consciousness

Why does phenomenal consciousness present problems for Darwinian evolutionary theory?
Darwinian theory is a functional theory. Stripped down to its essence, it has only one
explanatory mechanism: novel biological forms and functions emerge through random
variation of genes, and only persist if in some way they enhance the ability of organisms (or
populations of organisms) to propagate their genes. Given this, for evolutionary theory to
explain the existence of consciousness, it must show (a) how consciousness emerged
through random variation in the genome of organisms in which it was previously absent,
and (b) how the emergence of various forms of consciousness enhanced the ability of those
organisms to propagate their genes. Unfortunately, (a) presents a “hard problem” much
discussed in the consciousness studies literature and, as we will see below, there are similar
difficulties with (b).

ACTIVITIES THAT SUPPORT PHENOMENAL CONSCIOUSNESS IN THE HUMAN BRAIN

In order to clarify the nature of the problems posed by consciousness (of any kind) it is
instructive to start with the case that is most easily investigated—the necessary and
sufficient conditions for consciousness in the human brain. Activities in the human brain
that support consciousness can be defined both functionally (viewing the brain as a complex
information processing system) and structurally, in terms of the neural hardware that
embodies such functioning. As one might expect, the relevant research literature is vast and
I do not have space to review it here, although I have reviewed it elsewhere (Velmans, 2009,
chapters 4, 5, 10 and 11). For our present purposes, a brief overview will do.

Viewed in functional terms, the brain can be thought of as an information processing system. Although cognitive models of consciousness differ in their details, for example in how selection of input, focal attention, primary memory and the operations of a neural ‘central processor’ relate to each other, a number of themes recur in their attempts to relate consciousness to such functioning. It is generally agreed that the initial
processing of information arriving at the sense organs proceeds, at least to some extent, in a
parallel, automatic, preconscious fashion. When a stimulus is sufficiently well identified to be
judged more important or “pertinent” than competing stimuli it may be selected for more
detailed attention. It is only if this happens that the stimulus enters primary memory (or some
equivalent short-term “working memory”) in which case it enters consciousness and may be
subject to further processing of a novel, flexible kind in some form of ‘central processor’ or
‘global workspace’. In this there is a trade-off between the greater range of processing
resources that can be allocated to a given, attended-to task, and the smaller number of tasks
that can be at the focus of attention at any given moment. Attentional processing may involve
categorisation, choice, planning, reorganisation, retrieval from and transfer to secondary (long-
term) memory and so on. As a result of such processing, information at the focus of attention is
integrated in a coherent way, and becomes generally available (widely disseminated or
“broadcast”) throughout the system, providing the basis for a co-ordinated, adaptive, overt
response. While novel, complex tasks require such conscious processing for their successful
execution, once they are well learnt they may be dealt with in an automatic, unconscious way.
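
To make the sequence just described concrete, the sketch below renders it as a toy program: preconsciously analysed stimuli arrive in parallel, the most pertinent are selected for a limited-capacity workspace, and the winning contents are broadcast system-wide. This is a minimal illustration only; the class and parameter names are invented and no specific published model is implied.

```python
from dataclasses import dataclass, field

@dataclass
class Stimulus:
    label: str
    pertinence: float  # importance assigned by preconscious, parallel analysis

@dataclass
class ToyWorkspace:
    capacity: int = 1                                # attention is a limited resource
    subscribers: list = field(default_factory=list)  # specialised processors

    def cycle(self, stimuli):
        # Select the most pertinent stimuli for focal attention...
        focus = sorted(stimuli, key=lambda s: s.pertinence, reverse=True)[: self.capacity]
        # ...and broadcast ("globally disseminate") the winning contents.
        for processor in self.subscribers:
            processor(focus)
        return focus

ws = ToyWorkspace(capacity=1)
ws.subscribers.append(lambda focus: print("broadcast:", [s.label for s in focus]))
ws.cycle([Stimulus("background hum", 0.2), Stimulus("own name spoken", 0.9)])
# broadcast: ['own name spoken']
```

On this picture the trade-off noted above falls out of the capacity parameter: more processing resources converge on the attended item precisely because so few items can occupy the workspace at once.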

Viewed in terms of neural activity the antecedent neural causes of conscious experiences need
to be distinguished from their proximal, co-temporal neural correlates. The systems and
conditions that govern the existence of consciousness such as the sleep-wake cycle and the
operation of selective attention also have to be distinguished from the added conditions
required to support its varied forms (visual, auditory, tactile, etc.). At any given moment the
bulk of neural processing remains unconscious and only very specific activities appear to be
eligible for consciousness such as the end products of interoceptive and exteroceptive
perceptual processing that result in inner experiences (such as thoughts and visual images),
body sensations (such as pleasure and pain) and a surrounding, phenomenal world extended in
three-dimensional space and time. While these contents are indefinitely varied, the basic
experiential materials from which they are constructed are drawn from a limited number of
sensory resources and their derivatives. The external phenomenal world for example is
constructed from what we see, hear, touch, taste and smell, while verbal thoughts and dreams
draw on auditory-phonemic and visual imagery. Such sensory qualia appear to be largely
cortically based although midbrain structures play a major role in giving such qualia an affective
tone. In the visual system (and perhaps in other systems) there appear to be “essential
nodes”—topographically distinct neural assemblies that are specialised for the processing of
individual features of visual input (such as colour, shape and motion). Activation of an
essential node appears to be necessary to have an experience of the feature that it
processes which makes such activation a prime candidate for being a neural correlate of
that experienced feature. Integrated experiences of objects require integration of their features, and phase-locked neural oscillations (in the 40 Hz region) might be the mechanism
which “binds” such widely distributed feature representations into the integrated neural
activity required to support an integrated conscious field.
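
The binding-by-synchrony idea lends itself to a simple illustration. In the hedged sketch below, each "essential node" is reduced to nothing but an oscillation phase, and nodes whose phases lock together (within a tolerance) are grouped into a single bound object; the phases and threshold are invented for the example and carry no physiological weight.

```python
import math

def bind_by_synchrony(node_phases, tolerance=0.2):
    """Group feature nodes whose oscillation phases (in radians) coincide."""
    coalitions = []
    for node, phase in node_phases.items():
        for coalition in coalitions:
            # Circular distance between this node's phase and the coalition's.
            delta = phase - coalition["phase"]
            if abs(math.atan2(math.sin(delta), math.cos(delta))) < tolerance:
                coalition["nodes"].append(node)
                break
        else:
            coalitions.append({"phase": phase, "nodes": [node]})
    return [c["nodes"] for c in coalitions]

# 'red' and 'moving' fire in phase, so they bind into one experienced object;
# 'vertical edge' oscillates out of phase and stays separate.
print(bind_by_synchrony({"red": 0.10, "moving": 0.15, "vertical edge": 2.0}))
# [['red', 'moving'], ['vertical edge']]
```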

Is there a specific place in the brain where conscious experiences are generated? While
there are some vital, precisely located, early links in the chain of neural causation that
support human consciousness, for example the intralaminar nuclei of the thalamus where
lesions produce irreversible coma, the general answer appears to be no. And while it makes
sense to treat those neural activities that code the features of a given conscious experience
(e.g. the “essential nodes”) as the proximal neural correlates of that experience, such
activities are normally supported by activation and attentional systems elsewhere in the
brain. Rather than being a place in the brain, the global workspace in which aspects of
neural activity are integrated and disseminated can be thought of as a continuously shifting
population of synchronously firing neurons that encode information at the current focus of
attention. Normal exteroception also involves complex, preconscious interactions with the
external world, so, ultimately, in the chain of causation, the entire brain, body and
embedding world might be directly or indirectly involved.

Is there a special kind of neuron for different modalities of experience (vision versus
audition and so on)? Again (on present evidence) the answer seems to be no. Different
affective qualia appear to be associated with different neurotransmitters (pleasure with
dopamine, anxiety with noradrenaline, etc.); however, different sensory qualia appear to be
linked to the functional organisation of the nodes that process the features associated with
those qualia, and to the roles that these functions play within the global economy of the
brain.

What makes neural activity conscious?

What is it about neural activity that makes it conscious? In 1976, the neurophysiologist E. Roy
John confessed that,

"We do not understand the nature of ... the physical and chemical interactions which
produce mental experience. We do not know how big a neuronal system must be
before it can sustain the critical reactions, nor whether the critical reactions depend
exclusively upon the properties of neurons or only require a particular organisation
of energy and matter". (John, 1976, p. 2)

Thirty-five years later, this is still not known. In the human brain, the level of activation
appears to be important. However, at any given moment, very little of the brain’s activity
reaches consciousness, and only some activities, such as the results of perceptual processing
appear to be eligible for consciousness. Those activities in the human brain that are eligible
for consciousness also have to compete for consciousness—a recurring theme that can be
traced back to the dawn of thinking about attentional processes in the writings of William
James. Competition is also a common strand in current neuropsychological theories of
consciousness. As Crick & Koch (2007) and Singer (2007) make clear, synchronous firing is likely
to be a mechanism by which given neural assemblies enter into winning coalitions, and,
although their theories differ in detail, Dehaene & Naccache (2001), and Edelman & Tononi
(2000) suggest that entry of information into a “global workspace” and more specifically,
dominating the information in the global workspace is what makes neural information
conscious.

What is perhaps most surprising about these converging views is that no special, added
ingredient may be required for consciousness! Neural assemblies that are eligible for
consciousness might be more or less active, and they might or might not enter into phase-
locked synchronous firing with other assemblies which allow their firing patterns to become
integrated and dominant. But doing more of what they normally do, or doing this in
synchrony with other neural assemblies, does not fundamentally alter the nature of such
neural activities. In short, eligible neural activities that remain unconscious may not be
different in kind from those which become conscious, any more than the sound of individual voices at a football stadium is different in kind from the concerted singing of the crowd that
drowns them out.

And that brings us back to the problem that the so-called hard problem of consciousness
presents to evolutionary theory. How did neural activity of any kind become associated
with consciousness? “Random variation in the genome of organisms” provides no
explanatory mechanism.

EVOLUTIONARY THEORY AND THE ADAPTIVE FUNCTIONS OF CONSCIOUSNESS


While many would accept that evolutionary theory doesn’t have any way of explaining the
emergence of consciousness, many proposals have been made about how consciousness
aids effective functioning once an organism has it, thereby explaining its persistence by
enhancing the propagation of that organism’s genes. As noted above, many early proposals
identified the contents of consciousness with the operations of focal attentive processing
and/or “primary memory” or some similar working store. In an attempt to unify such
proposals, Baars (1988) integrated them into what he called a “global workspace” architecture
of the brain. In their review of cognitive models of consciousness, Baars & McGovern (1996)
point out that the brain has hundreds of different types of unconscious specialised processors
such as feature detectors for colours, line orientation and faces, which can act independently
or in coalition with one another, thereby bypassing the limited capacity of consciousness.
These processors are extremely efficient, but restricted to their dedicated tasks. The
processors can also receive global messages and transmit them by ‘posting’ messages to a
limited-capacity, global workspace whose architecture enables system-wide integration and
dissemination of such information. Such communications allow new links to be formed
between the processors, and the formation of novel expert ‘coalitions’ able to work on new or
difficult problems. Baars et al. (1997, p. 423) liken this global workspace to a “theatre of
consciousness in the society of the mind.”

Like prior theories that reduced consciousness to information “at the focus of attention”,
“in a working store”, and so on, Baars & McGovern (1996) assert that “information in the
global workspace corresponds to conscious contents” (p. 89, my italics). Accordingly, they give
consciousness a central role in the economy of mind that corresponds to the functions of the
global workspace. Within their model, the global workspace is essential for organising novel,
complex activities. So Baars & McGovern give consciousness many things to do:

1. By relating input to its context, consciousness defines input, removing its ambiguities in
perception and understanding.

2. Consciousness is required for successful problem solving and learning, particularly where novelty is involved.

3. Making an event conscious raises its “access priority,” increasing the chances of
successful adaptation to that event.

4. Conscious goals can recruit subgoals and motor systems to carry out voluntary acts.
Making choices conscious helps to recruit knowledge resources essential to arriving at
an appropriate decision.

5. Conscious inner speech and imagery allow us to reflect on and, to an extent, control
our conscious and unconscious functioning.

6. In facing unpredictable conditions, consciousness is indispensable in allowing flexible responses.

“In sum, consciousness appears to be the major way in which the central nervous system
adapts to novel, challenging and informative events in the world” (ibid., p. 92). Romanes (1885)
came to a similar conclusion, as noted above.

The strengths of functionalism in cognitive psychology


For our present purposes we do not need to consider the extensive experimental work which
led to the development of the many models of conscious processing outlined above. Suffice it
to say that the evidence in support of broad functional links between consciousness, attention
and primary memory along the lines described above is considerable (see reviews in Velmans,
1991a, 2009 chapters 4 and 10, Baars & McGovern, 1996, Mandler, 1997, Styles, 1997,
Pashler, 1999). These broad associations of consciousness with human information processing
are also supported by everyday experience. It is easy to demonstrate for example that one
attends to only a small amount of the information that arrives at the sense organs. Just notice,
as you read, the pressure of your feet against the floor, the range of environmental sounds, the
sensation of your own breathing, and so on. These other inputs only enter consciousness once
one allocates attention to them. So it is reasonable to suppose that there must be a process
which governs selection of input, allocation of attentional resources and entry into
consciousness. The observation that complex, novel tasks require conscious attention is also
evident to anyone learning to drive a car or play a musical instrument. Once in consciousness,
an event also becomes part of one's “psychological present”–which makes it possible for it to
become part of one's psychological past, involving storage in long-term memory, the possibility
of later recall, and so on.

In short, the cognitive psychological approach, which treats mind as a complex system that can
be analysed into its constituent functions and processes, seems to be both productive and
plausible (unlike the behaviourism that it replaced, which ignored or denied the existence of
mind). Information processing accounts have also significantly advanced our understanding of
the processes most closely associated with consciousness in the economy of mind. In principle,
functional accounts of mental operations can also be combined with neurophysiological
accounts of how the wetware of the brain operates (as in cognitive neuropsychology) with
potentially unifying results. One might have doubts about whether it makes sense to reduce
functional descriptions of the mind to neurophysiology, but few would deny that it makes
sense to investigate the manner in which mental functions are embodied in neurophysiology.

The limits of functionalism in cognitive psychology

But what does this tell us about the functions of phenomenal consciousness as such?
Developments in 20th Century psychology and beyond have made it very clear that the
functions of “mind” need to be distinguished from the functions of “consciousness” for the
simple reason that mental processes may or may not be conscious. Crucially, cognitive theories
of mind, expressed in functional, information processing terms are “third-person” accounts of
what is going on. That is, they are inferences about intervening processes based on
observations of input-output contingencies. Neurophysiological accounts of mental functioning
are similarly based on “third-person” observations of the brain. By contrast, phenomenal
consciousness is, in essence, a “first-person” observable phenomenon (we cannot observe
someone else's consciousness from the outside, so if we did not have it ourselves, we would
not suspect it was there). Consequently, one cannot take it for granted that third-person
functional accounts of mind or brain are also accounts of the functions of consciousness.

The truth of this is evident from the fact that, for many years, cognitive accounts of mental
processes now thought to be closely associated with consciousness made little if any reference
to consciousness. Theories of selective attention for example, focused on how processing
capacity was allocated, on determining the stage of input analysis at which stimulus selection
takes place, and on how pre-attentive processing differs from focal-attentive processing.
Theories of short-term memory tried to specify its capacity, the principles governing
information entry to and loss from the memory system, the modes of encoding used, and so
on. While there are good reasons to believe that phenomenal consciousness in humans is
closely associated with attentional processing and short-term memory, the nature of this
association was rarely at issue in such cognitive investigations. Consequently, it is usually not
specified in such reductive information processing accounts. In the models above, for example,
there are no “bridging laws” which cross the gap from third-person information processing
accounts to first-person accounts of phenomenal experience. Cognitive theories which identify
consciousness with one or another information processing “box” simply assume or define it to
be ontologically identical to a given form of processing in the brain (largely ignoring its
phenomenology). Such theories typically move, without blinking, from relatively well-justified
claims about the forms of information processing with which consciousness is associated, to
entirely unjustified claims about what consciousness is or what it does. Baars & McGovern
(1996), for example, move without any discussion, from the somewhat ambiguous claim that
“information in the global workspace corresponds to conscious contents,” to the claim that
consciousness actually carries out the functions of the global workspace. However, such
manoeuvres beg the question; that is, they assume or posit what they need to establish.

In short, an association of first-person experience with a given form of information processing does not warrant the reduction of that experience to that processing. Nor does it
follow that the functions carried out by a given form of information processing are also the
functions of its associated phenomenal consciousness! The need to be cautious about such
reductive identifications is also suggested by research showing (a) that many information processing functions associated with consciousness can also be carried out without it, (b) that we are not consciously aware of how we carry out most of the processes that are said to be conscious, which leads one to question the precise sense in which those processes can be said to be “conscious”, and (c) that for processing commonly thought of as
“conscious” the phenomenal experience accompanying that processing follows the
processing to which it most closely relates and cannot therefore enter into it. This again
introduces an extensive literature on the exact ways in which phenomenal consciousness
relates to human information processing which there is not enough space to review here,
although I have reviewed it elsewhere (cf. Velmans, 1991a, b, 2009, chapters 4 and 10). A quick
example should however suffice to illustrate the principles involved.

Consider for example Baars’ claim 1, that “By relating input to its context, consciousness
defines input, removing its ambiguities in perception and understanding.” Reading is
universally thought to be a highly complex, conscious form of input analysis that requires
one to relate input to context in a way that requires both focal attention and the resources
of a global workspace. So try noticing what you experience as you read the sentence, “The
dustmen said they would refuse to collect the refuse without a raise in pay.” Notice that the
word “refuse” appeared twice—and on its first appearance in your consciousness in the
form of “inner speech” the stress was on the second syllable (re-FUSE, corresponding to a verb), while on its second appearance the stress was on the first syllable (REF-use, corresponding to a noun). But how could it be that the stress patterns were already
allocated and different when these words first appeared in your consciousness as inner
speech? It could only be because by the time these words appeared in consciousness, the
semantic and grammatical analysis of the sentence had already been done! And the same
applies to reading this paragraph and the rest of this text. As you read you may visually
experience print on paper, phonemic imagery (inner speech) associated with the spoken
versions and specific meanings of the words, and a feeling of having understood the
meaning of the entire sentence (or not). But the processes responsible for recognising and
decoding visually presented text which result in these experiences are preconscious. And, if
preconscious semantic and syntactic analysis has already been completed by the time the
experienced words arise, these experiences come too late to affect such analysis—so the
conscious experiences can’t be said to carry out the analysis in the manner claimed by Baars
above. In fact the only sense in which reading is a “conscious” process is in the sense that it
results in a conscious experience!

While a single example does not suffice to establish a general principle, extensive analyses
of how phenomenal consciousness relates to human information processing suggest that
consciousness can neither be reduced to the third-person functioning with which it is
associated, nor does it carry out any of the third-person observable functions claimed for it
(cf. Velmans, 1991a, b, 2009, chapters 3, 4, 5, 10, 13). Information processing accounts of
mental functioning, like evolutionary accounts, are third-person accounts which require no
appeal to first-person experiences for completeness. As noted above, third-person,
cognitive models of how human minds process novel or complex stimuli (in speaking,
reading, problem solving and other so-called conscious tasks) require only information
processing of a kind that could be programmed into a computer. If a model of how a system
might perform such a task is a good one, the computer should work, leaving any conscious
experience that might or might not accompany such functioning nothing to do that isn’t
already explained by the information processing built into the machine. And the same
applies to brains: once one has adequately explained their functioning in terms of
information processing or in terms of the neurophysiology that embodies it, there is nothing
left, in third-person functional terms, for first-person experiences to do—a problem
recognised at the dawn of cognitive psychology by the psychologist George Miller (1962) in
his book Psychology: The Science of Mental Life, which the philosopher Saul Kripke later called the
problem of “overdetermination”.

This poses an acute, added problem for evolutionary theory: If consciousness does not
enhance third-person observable adaptive functioning why did it persist? Ironically, the
naturalist Thomas Huxley, Darwin’s staunchest defender, came to a similar conclusion about
the functional impotence of consciousness, comparing its role in third-person observable
functioning to the way a steam whistle relates to the engine that powers it (Huxley, 1874).
Huxley’s view that consciousness is just an “epiphenomenon” has often been interpreted to
mean that consciousness has no function at all. However the arguments and evidence above
merely cast doubt on whether it has a third-person function of the kind required by
Darwinian evolutionary theory for it to persist. As Huxley himself pointed out, consciousness
symbolises the activities of mind (to the individual that has it)—and there are in fact good
reasons to believe that, viewed subjectively, consciousness is of central importance to
human life. Without it life would be like nothing and there would be no point to survival!

As readers of this journal will recognise, whether “life has a point” is not an issue that can be
easily addressed within the reductive, mechanistic confines of evolutionary theory—and
this, again, is not something that can be addressed here due to space restrictions, although I
have done so elsewhere (see, for example, the analysis in Velmans, 2009, chapter 14, and
the critical review of Humphrey’s, 2010, reductionist analysis of the subjective importance
of consciousness in Velmans, 2011).

THE DISTRIBUTION OF CONSCIOUSNESS

Given that evolutionary theory has little to say about the sufficient conditions for, or
functions of phenomenal consciousness in human brains, it follows that it can tell us little
about the wider distribution of consciousness in non-humans. Consequently, it is not
surprising that opinions about the distribution of consciousness have ranged from the ultra-
conservative (only humans are conscious) to the extravagantly libertarian (everything that
might possibly be construed as having consciousness does have consciousness).

The view that only humans have consciousness has a long history in theology, following
naturally from the doctrine that only human beings have souls. Some philosophers and
scientists have elaborated this doctrine into a philosophical position. According to
Descartes, only humans combine res cogitans (the thinking stuff of consciousness) with res
extensa (extended material stuff). Non-human animals, which he refers to as “brutes”, are
nothing more than nonconscious machines. Lacking consciousness, they do not have reason
or language. Eccles (in Popper & Eccles, 1976) adopted a similar, dualist position—but
argued that it is only through human language that one can communicate sufficiently well
with another being to establish whether it is conscious. Without language, he suggests, the
only defensible option is agnosticism or doubt. Jaynes (1979), by contrast, argued that
human language is a necessary condition for consciousness. And Humphrey (1983) adopted
a similar view, arguing that consciousness emerged only when humans developed a “theory
of mind.” He accepts that we might find it useful for our own ethical purposes to treat
other animals as if they were conscious, but without self-consciousness of the kind provided
by a human “theory of mind” they really have no consciousness at all! There are other,
modern variants of this position (e.g. Carruthers, 1998) but we do not need an exhaustive
survey. It is enough to note that thinkers of very different persuasions have held this view.
Early versions of this position appear to be largely informed by theological doctrine; later
versions are based on the supposition that higher mental processes of the kinds unique to
humans are necessary for consciousness of any kind.

It is easily shown that this extreme position has little to recommend it when applied to
humans, let alone other animals. Phenomenal consciousness in humans is constructed from
different exteroceptive and interoceptive resources and is composed of different
“experiential materials” (what we see, hear, touch, taste, smell, feel and so on). It is true
that our higher cognitive functions also have manifestations in experience, for example, in
the form of verbal thoughts. Consequently, without language and the ability to reason, such
thoughts would no longer be a part of what we experience (in the form of “inner speech”).
But one can lose some sensory and mental capacities while other capacities remain intact
(in cases of sensory impairment, aphasia, agnosia and so on). And there is no scientific
evidence to support the view that language, the ability to reason and a theory of mind are
necessary conditions for having visual, auditory and other sensory experiences. Applied to
humans, this view is in any case highly counterintuitive. If true, we would have to believe
that, prior to the development of language and other higher cognitive functions, babies
experience neither pleasure nor pain, and that their cries and chuckles are just the non-
conscious output of small biological machines. We would also have to accept that autistic
children without a “theory of mind” never have any conscious experience! To any parent,
such views are absurd.

Distinguishing the conditions for the existence of consciousness from the added
conditions required to support its many forms.

Such views confuse the necessary conditions for the existence of consciousness with the
added conditions required to support its many forms. Consciousness in humans appears to
be regulated by global arousal systems, modulated by attentional systems that decide which
representations (of the external world, body and mind/brain itself) are to receive focal
attention. Neural representations, arousal systems, affective systems and mechanisms
governing attention are found in many other animals (Jerison, 1985; Panksepp, 2007).
Other animals have sense organs that detect environmental information and perceptual and
cognitive processes that analyse and organise that information (see reviews in Dawkins,
1998; Velmans, 2009, chapter 8). Many animals are also able to communicate and live in
complex social worlds. Overall, the precise mix of sensory, perceptual, cognitive and social
processes found in each species is likely to be species-specific. Given this, it might be
reasonable to suppose that only humans can have full human consciousness. But it is
equally reasonable to suppose that some non-human animals have unique, non-human
forms of consciousness.

Even self-consciousness (of a kind) may not be confined to humans. Gallup (1977, 1982) for
example, found that individually housed chimpanzees, given access to a full-length mirror,
initially threatened and vocalised towards their mirror images as they would another
chimpanzee. However, within two or three days their behaviour changed. They began to
use their mirror reflections to groom themselves, remove food particles from between their
teeth, and inspect parts of their body that they could not otherwise see. On the eleventh
day the chimps were anaesthetised and a spot of red dye was placed above one eyebrow
and on top of the opposite ear. On recovery, the chimps, who were unable to see the spots,
took no notice of them, touching them only rarely. However, once the mirrors were
reintroduced they gave clear indications of noticing the change in their appearance. The
frequency of touches to the marked spots increased twenty-five-fold, and, on occasion, they
would touch the spots and then inspect and lick their fingers (although the dye was an
indelible one). In short, after a few days of familiarisation with mirrors, the chimps gave
every indication that they recognised the mirror image as a reflection of themselves. Similar
findings have been obtained with orang-utans (Tobach et al., 1997), gorillas (Shillito et al.,
1999) and tamarins (Hauser et al., 1995); mirror recognition has also been found in
elephants (Plotnik et al, 2006) and dolphins (Reiss & Marino, 2001).

Given the evidence for the gradual evolution of the human brain, it seems unlikely that
consciousness first emerged in the universe, fully formed, in Homo sapiens. As Thomas
Huxley observed,

"The doctrine of continuity is too well established for it to be permissible to me to


suppose that any complex natural phenomenon comes into existence suddenly, and
without being preceded by simpler modifications; and very strong arguments would
be needed to prove that such complex phenomena as those of consciousness, first
make their appearance in man.” (Huxley, 1874, cited in Vesey, 1970, p. 138)

Is consciousness confined to complex brains?

One cannot be certain that other animals are conscious—or even that other people are
conscious (the classical problem of “other minds”). However, the balance of evidence
strongly supports it (Beshkar, 2008; Dawkins, 1998; Panksepp, 2007). In cases where other
animals have brain structures that are similar to humans, that support social behaviour that
is similar to humans (aggression, sexual activity, pair-bonding and so on), it is difficult to
believe that they experience nothing at all! But if one does not place the
conscious/nonconscious boundary between humans and non-humans, where should one
place it?

It might be that consciousness is confined to animals whose brains have achieved some
(unknown) critical mass or critical complexity. In the human case, only representations at
the focus of attention reach consciousness, and then only in a sufficiently aroused state (an
awake or dreaming state, but not coma or deep sleep). But we need to be cautious about
treating such conditions as universal. Within the animal kingdom creatures that sleep
include mammals, birds, many reptiles, amphibians and fish, and even ants and fruit flies.
However not all active animals appear to sleep (for example, fish that swim continuously in
shoals), and while sleep is generally thought to be restorative, its precise biological function
remains unknown. Given that sleep is associated with diminished consciousness, it seems in
any case unlikely that having a sleep-wake cycle is a prerequisite for consciousness.

Selective attention might seem to be a more likely condition, and it is also found in many
other animals—even fruit flies (van Swinderen, 2007)! In humans, the mind/brain receives
simultaneous information from a range of sense organs that monitor the external and
internal environment and this information needs to be related to information in long-term
memory, and assessed for importance in the light of ongoing needs and goals. In short,
there are many things going on at once. But we cannot give everything our full, undivided
attention. As Donald Broadbent pointed out in 1958, there is a “bottleneck” in human
information processing. The human effector system is also limited. We only have two eyes,
hands, legs, etc., and effective action in the world requires precise co-ordination of eye-
movements, limbs and body posture. As a result, the mind/brain needs to select the most
important information, to decide on the best strategy, and to co-ordinate its activity
sufficiently well to interact with the world in a coherent, integrated way.

To achieve this, it is as important to stop things happening in the brain as it is to make them
happen. As William Uttal observed

“There is an a priori requirement that some substantial portion, perhaps a majority, of the synapses that occur at the terminals of the myriad synaptic contacts of the
three-dimensional ... (neural) ... lattice must be inhibitory. Otherwise the system
would be in a constant state of universal excitement after the very first input signal,
and no coherent adaptive response to complex stimuli would be possible” (Uttal, 1978, p. 192).

To prevent information overload, not to mention utter confusion, attended-to information needs to become dominantly active and conscious, while information outside the focus of
attention is inhibited (and similar inhibition of eligible activities takes place during dreamless
sleep).
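
A toy winner-take-all rule captures this point: the attended representation is amplified while everything outside the focus is inhibited, so a single coherent pattern comes to dominate. The gain values below are arbitrary illustrations, not estimates of any neural quantity.

```python
def winner_take_all(activations, boost=1.5, inhibition=0.2):
    """Amplify the strongest representation and suppress its competitors."""
    winner = max(activations, key=activations.get)
    return {
        unit: level * (boost if unit == winner else inhibition)
        for unit, level in activations.items()
    }

print(winner_take_all({"text being read": 0.8,
                       "environmental sounds": 0.5,
                       "pressure of feet": 0.3}))
# {'text being read': 1.2..., 'environmental sounds': 0.1, 'pressure of feet': 0.06}
```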

As noted above, eligible neural activities in the human brain that remain unconscious may
not be different in kind from those that become conscious, any more than the sound of individual voices at a football stadium is different in kind from the concerted singing of the
crowd that drowns them out. If so, consciousness might be a natural accompaniment of
certain forms of neural representation, and while having an attentional system allows a
choice of what will be conscious in complex brains that have many options, this might not
be required by simple brains, with few options, to be conscious of anything at all!

It goes without saying that the neural states that support everyday human experiences must
be extremely complex. The contents of consciousness are constructed from different sense
modalities, and within a given sense modality, experiences can be of unlimited variety and
be exquisitely detailed. Complexity might also be a means whereby neural coalitions
compete for dominance (Tononi, 2007). However it does not follow from this that only
brains of similar complexity can support any experience. Once again we need to distinguish
the conditions for the existence of consciousness from the added conditions that determine
the many forms that it can take. The mechanisms required to select, co-ordinate, integrate
and disseminate conscious information in the human brain may not be required for simpler
creatures with simpler brains. Complex, highly differentiated brains are likely to be needed
to support complex, highly differentiated experiences. But it remains possible that relatively
simple brains can support relatively simple experiences.

Frogs, worms and molluscs

The visual system of the frog, for example, appears to be structured to respond to just four
stimulus features: a sustained contrast in brightness between two portions of the visual
field, the presence of moving edges, the presence of small moving spots and an overall
dimming of the visual field. This is a far cry from the variety and detail provided by the
human visual system. But there seems little reason to jump to the conclusion that the frog
sees nothing. Rather, as Lettvin et al. (1959) proposed, the frog may see just four things
relating to its survival. A sudden dimming of the light or a moving edge may indicate the
presence of a predator and is likely to initiate an escape response. Sustained differences in
brightness may allow the frog to separate water from land and lily pad. And moving spot
detectors may allow the frog to see (and catch) a moving fly at tongue's length.
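
A schematic sketch makes the economy of such a system vivid: four detector outputs suffice to drive the frog's entire behavioural repertoire. The encoding and thresholds below are invented for illustration and are not drawn from Lettvin et al.'s data.

```python
def frog_percepts(contrast, edge_motion, spot_motion, dimming):
    """Map four feature-detector outputs (each 0-1) to survival-relevant responses."""
    percepts = []
    if dimming > 0.5 or edge_motion > 0.5:
        percepts.append("possible predator: escape")          # dimming / moving edge
    if spot_motion > 0.5:
        percepts.append("possible fly: strike")               # small moving spot
    if contrast > 0.5:
        percepts.append("boundary: water / land / lily pad")  # sustained contrast
    return percepts or ["nothing of interest"]

print(frog_percepts(contrast=0.7, edge_motion=0.1, spot_motion=0.9, dimming=0.0))
# ['possible fly: strike', 'boundary: water / land / lily pad']
```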

As one continues to descend the evolutionary ladder, the plausibility of extrapolating from
human to non-human animal consciousness becomes increasingly remote. There may, for
example, be critical transition points in the development of consciousness which accompany
critical transitions in functional organisation (Sloman, 1997a, b). Self-awareness, for
example, probably occurs only in creatures capable of self-representation. That said,
phenomenal consciousness (of any kind) might only require representation. If so, even
simple invertebrates might have some rudimentary awareness, in so far as they are able to
represent and, indeed, respond to certain features of the world.

Planarians (flatworms), for example, can be taught to avoid a stimulus light if it has been
previously associated with an electric shock (following a classical conditioning procedure).
And simple molluscs such as the sea-hare Aplysia, which withdraw their gills when touched, respond to stimulus “novelty”. For example, they may habituate (show diminished
withdrawal) after repeated stimulation at a given site, but withdraw fully if the same
stimulation is applied to another nearby site. Habituation in Aplysia appears to be mediated
by events at just one centrally placed synapse between sensory and motor neurons (Uttal,
1978). This is very simple learning, and it is very difficult to imagine what a mollusc might
experience. But if the ability to learn and respond to the environment were the criterion for
consciousness, there would be no principled grounds to rule this out. It might be, for
example, that simple approach and avoidance are associated with rudimentary experiences
of pleasure and pain.
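
The site-specific habituation described above can be captured in a few lines. In this hedged sketch, repeated stimulation at one site progressively weakens the response there while a novel site still evokes full withdrawal; the decay constant is an arbitrary illustration rather than a measured value.

```python
class HabituatingReflex:
    """Toy model of site-specific habituation of a withdrawal response."""

    def __init__(self, decay=0.6):
        self.decay = decay
        self.strength = {}  # response strength remembered per stimulation site

    def stimulate(self, site):
        response = self.strength.get(site, 1.0)      # novel sites respond fully
        self.strength[site] = response * self.decay  # repeated sites habituate
        return response

reflex = HabituatingReflex()
print([round(reflex.stimulate("site A"), 2) for _ in range(3)])  # [1.0, 0.6, 0.36]
print(reflex.stimulate("site B"))  # 1.0 -- full withdrawal at a nearby new site
```

The same scheme, keyed on stimulus type rather than site, would reproduce the stimulus-specific habituation reported for the Mimosa plant below.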

Is consciousness confined to brains?

It is commonly thought that the evolution of human consciousness is intimately linked to the evolution of the neocortex (e.g. Jerison, 1985), and it seems likely that mid-brain as well
as cortical structures play a central role in determining the forms of consciousness that we
experience (see review in Velmans, 2009, chapter 11). However, whether consciousness first
emerged with the development of such subcortical and cortical structures, or whether there
is something special about the nature of brain cells that somehow “produces” consciousness
is less certain. As Charles Sherrington pointed out, there appears to be nothing special
about the internal structure of brain cells that might make them uniquely responsible for
mind or consciousness. For,

"A brain-cell is not unalterably from birth a brain-cell. In the embryo-frog the cells
destined to be brain can be replaced by cells from the skin of the back, the back even
of another embryo; these after transplantation become in their new host brain-cells
and seem to serve the brain's purpose duly. But cells of the skin it is difficult to
suppose as having a special germ of mind. Moreover cells, like those of the brain in
microscopic appearance, in chemical character, and in provenance, are elsewhere
concerned with acts wholly devoid of mind, e.g. the knee-jerk, the light-reflex of the
pupil. A knee-jerk ‘kick’ and a mathematical problem employ similar-looking cells.
With the spine broken and the spinal cords so torn across as to disconnect the body
below from the brain above, although the former retains the unharmed remainder of
the spinal cord consisting of masses of nervous cells, and retains a number of
nervous reactions, it reveals no trace of recognizable mind…. Mind, as attaching to
any unicellular life would seem to be unrecognizable to observation; but I would not
feel that permits me to affirm that it is not there. Indeed, I would think, that since
mind appears in the developing source that amounts to showing that it is potential in
the ovum (and sperm) from which the source springs. The appearance of
recognizable mind in the source would then be not a creation de novo but a
development of mind from unrecognizable into recognizable." (Sherrington, 1942,
cited in Vesey, 1970, p. 323)

Unicellular organisms, fungi and plants

Indeed, given our current, limited knowledge of the necessary and sufficient conditions for
consciousness in humans, we cannot, as yet, rule out even more remote possibilities. If the
ability to represent and respond to the world, or the ability to modify behaviour consequent
on interactions with the world are the criteria for consciousness, then it may be that consciousness extends not just to simple invertebrates (such as Planaria) but also to
unicellular organisms, fungi and plants. For example, the leaflets of the Mimosa plant
habituate to repeated stimulation, i.e. the leaflets rapidly close when first touched, but after
repeated stimulation they re-open fully and do not close again while the stimulus remains
the same. Surprisingly, this habituation is stimulus-specific. For example, Holmes & Yost
(1966) induced leaflet closure using either water droplets or brush strokes, and after
repeated stimulation (with either stimulus) habituation occurred. But, if the stimulus was
changed (from water drops to brush strokes or vice-versa) leaflet closure re-occurred (see
also Applewhite, 1975, for a review).

For many who have thought about this matter, the transition from rudimentary
consciousness in animal life to sentience in plants is one transition too far. Perhaps it is. It is
important to note however that a criterion of consciousness based on the ability to respond
to the world does not prevent it. Nor, on this criterion, can we rule out the possibility of
consciousness in systems made of materials other than the carbon-based compounds that
(on this planet) form the basis for organic life. For example, silicon-based computers can in
principle carry out many functions that, in humans, we take to be evidence of conscious
minds. So how can we be certain that they are not conscious?

One should recognise too, that even a criterion for the existence of consciousness based on
the ability to respond or adapt to the world is entirely arbitrary. It might for example be
like something to be something irrespective of whether one does anything!
Panexperientialists such as Whitehead (1929) have suggested that there is no arbitrary line
in the ascent from microscopic to macroscopic matter at which consciousness suddenly
appears out of nothing. Rather, elementary forms of matter may be associated with
elementary forms of experience. And if they encode information they may be associated
with rudimentary forms of mind.

Can one draw a line between things that have consciousness and those that don’t?

Where then should one draw the line between entities that are conscious and those that
are not? Theories about the distribution of consciousness divide into continuity and
discontinuity theories. Discontinuity theories all claim that consciousness emerged at a
particular point in the evolution of the universe. They merely disagree about which point.
Consequently, discontinuity theories all face the same problem. What switched the lights
on? What is it about matter, at a particular stage of evolution, which suddenly gave it
consciousness? As noted above, most try to define the point of transition in functional
terms, although they disagree about the nature of the critical function. Some think
consciousness “switched on” only in humans, for example once they acquired language or a
theory of mind. Some believe that consciousness emerged once brains reached a critical size
or complexity. Others believe it co-emerged with the ability to learn, or to respond in an
adaptive way to the environment.

As noted above, such theories confuse the conditions for the existence of consciousness
with the conditions that determine the many forms that it can take. Who can doubt that
verbal thoughts require language, or that full human self-consciousness requires a theory of
mind? Without internal representations of the world, how could consciousness be of
anything? And without motility and the ability to approach or avoid, what point would
there be to rudimentary pleasure or pain? However, none of these theories explains what it
is about such biological functions that suddenly switches on consciousness.

The co-evolution of conscious experiences with their associated material forms

Continuity theorists do not face this problem for the simple reason that they do not believe
that consciousness suddenly emerged at any stage of evolution. Rather, as Sherrington
suggests above, consciousness is a “development of mind from unrecognisable into
recognisable.” On this panpsychist or panexperientialist view, all forms of matter have an
associated form of consciousness (see Skrbina, 2005a, b; de Quincey, 2002; Weber & Desmond,
2008; Seager, 2012; Weekes, 2012). In the cosmic explosion that gave birth to the universe,
consciousness co-emerged with matter and has co-evolved with it ever since. As matter became more
differentiated and developed in complexity, consciousness became correspondingly
differentiated and complex. Carbon-based life forms developed into creatures with sensory systems that had associated sensory “qualia.” The development of
representation was accompanied by the development of consciousness that is of something.
The development of self-representation was accompanied by the dawn of differentiated
self-consciousness and so on. On this view, evolutionary theory can in principle account for
the different forms that consciousness takes. But, consciousness, in some primal form, did
not emerge at any particular stage of evolution. Rather, it was there from the beginning. Its
emergence, with the birth of the universe is neither more nor less mysterious than the
emergence of matter and energy.

Most discontinuity theorists take it for granted that consciousness could only have appeared
(out of nothing) through some random mutation in complex life forms that happened to
confer a reproductive advantage that can be specified in third-person functional terms. This
deeply ingrained, pre-theoretical assumption has set the agenda for what discontinuity
theorists believe they need to explain. Within cognitive psychology, for example,
consciousness has been thought by one or another theorist to be necessary for every major
phase of human information processing, for example in the analysis of complex or novel
input, learning, memory, problem solving, planning, creativity, and the control and
monitoring of complex, adaptive response. It should be apparent that continuity theory
shifts this agenda. The persistence of different, emergent biological forms may be governed
by reproductive advantage. If each of these biological forms has a unique, associated
consciousness, then matter and consciousness co-evolve. However, conventional
evolutionary theory does not claim that matter itself came into being, or persists, through
random mutation and reproductive advantage. According to continuity theory, neither does
consciousness.

Which view is correct? One must choose for oneself. However, in the absence of anything
other than arbitrary criteria for when consciousness suddenly emerged, continuity theory is
arguably more elegant. Continuity in the evolution of consciousness favours continuity in the
distribution of consciousness, although there may be critical transition points in the forms of
consciousness associated with the development of life, representation, self-representation,
and so on.

SUMMARY AND CONCLUSIONS

Darwinian evolutionary theory has powerful explanatory value within biology, so it is not surprising
that various attempts have been made to apply it to a more general understanding of the human
condition within psychology and the social sciences. Viewed subjectively, human consciousness is
central to the human condition, so it is important to assess whether random variations in the
genome that confer a selective, reproductive advantage can also explain the existence of
consciousness.

To inform this evaluation, I have reviewed what is known about the conditions for consciousness
within the human mind/brain, understood in both structural (neural) terms and functional terms (in
terms of human information processing). Current research suggests that there is no specific place in
the brain where consciousness is generated, nor is there a special kind of neuron for each distinct
modality of experience (vision, audition, etc.) although different affective qualia appear to be
associated with different neurotransmitters (pleasure with dopamine, anxiety with
noradrenaline, and so on). Different sensory qualia also appear to be linked to the
functional organisation of the neural nodes that process the features associated with those
qualia. However, why certain forms of neural activity are associated with consciousness is
not known. Although only some forms of neural activity are eligible for consciousness, and the
formation of coalitions of synchronously firing neurons into dominant patterns appears to play
a role in deciding which of these eligible activities become conscious, there appears to be
nothing special in the activities of individual neurons that makes them conscious (other than
doing more of what they normally do, or doing it in synchrony with other neurons). This is one
aspect of what is sometimes known as the “hard problem” of consciousness, for which “random
variations in the genome” provide no explanatory mechanism.

We then considered the many functional advantages that have been proposed for
phenomenal consciousness once it emerges, ranging from the ability to deal with novel,
complex input stimuli to enabling voluntary choices and decisions and complex skills such as
speech production, reading, thinking and so on. Although this paper does not have space to
present a full analysis of all these processes, a close examination of the literature on how
the conscious experiences that accompany such tasks relate to the detailed information
processing required to perform them suggests that the phenomenal experiences, in each
case, follow the information processing to which they most closely relate and therefore
arrive too late to enter into it. The complex details of human information processing are
in any case unconscious and, provided that they are well specified, could in principle be
carried out by a non-conscious machine. While not challenging the subjective importance of
conscious experiences to human life, such findings do challenge the Darwinian requirement
that such experiences could only have persisted (once emergent) if they enhanced
reproductive fitness.

The paper finally turns to what can be said about the wider distribution of consciousness in
non-humans. Given that evolutionary theory has little to say about the sufficient conditions
for, or adaptive functions of, phenomenal consciousness in human brains, current theories
about the distribution of consciousness largely reflect the pre-theoretical commitments of
the protagonists, ranging from the anthropocentric view that only humans are conscious to
the pan-experientialist view that, in some sense, everything is conscious—although that
sense might be unrecognisable by humans. Broadly speaking, these widely ranging views
can be categorised into continuity theories versus discontinuity theories.

Discontinuity theorists argue that consciousness only emerged from an entirely non-
conscious universe once a given evolutionary development took place, although they
disagree about which transition is the critical one (e.g. the acquisition of motility or
adaptive response, the emergence of brains, brains of a given complexity, the acquisition of
language, etc.). Discontinuity theories all face the so-called “hard problem” of
consciousness: In a previously non-conscious universe, what is it about the emergence of a
particular biological function that suddenly “switches on the light of consciousness”? And, if
the adaptive advantage of the chosen biological function can be explained without
consciousness (as, viewed from a third-person perspective, it always can be), why should
that function be responsible for the emergence of consciousness?

Continuity theorists do not face this problem for the simple reason that they do not believe
that consciousness suddenly emerged at any stage of evolution. Rather, consciousness is
basic to the universe in the same sense that matter is basic, although in its more primal
forms it progressively loses its resemblance to human consciousness and might be
unrecognisable to humans. At the same time, consciousness has a discoverable, natural
relationship to material forms (a point on which continuity and discontinuity theories
agree). Consequently, as the material forms evolve, for example in the way that species
evolve, their associated experiences co-evolve, providing a mechanism by which the varied
forms of consciousness can be influenced by Darwinian evolution. In sum, although
evolutionary theory (on present evidence) cannot explain the existence of consciousness
(any more than it can explain the existence of matter), it can, through the principle of co-
evolution, explain the development of some of its basic, species-specific forms.

There are various other ways beyond the scope of this paper in which consciousness may be
said to evolve. Within a human, socio-cultural and historical context, communal ways of
understanding social issues have a developmental history sometimes thought of as “raising
consciousness” about those issues in ways that lead to social change. In ways addressed by
psychotherapeutic, meditative and related practices the conscious experiences of
individuals can also (potentially) be said to “evolve” in ways that promote well-being.
Although there are large literatures devoted to such methods of individual and social
change, they make little sense within an understanding of the world of the kind promoted
by the more extreme, reductive-materialist defenders of Darwinism. While the analysis
above recognises the important role of Darwinian principles in the evolution of human
consciousness it does so in an entirely non-reductive way. Although it closely associates
given conscious experiences with given brain states and functions, it does not require
conscious experiences to be reduced to their correlated brain states or functions. Nor does
it require experiences to be illusions or tricks of the brain whose only purpose is the
enhanced propagation of genes. I do not have space here to elaborate on what a non-reductive
understanding of consciousness and its functions might be like, although I have done
so extensively elsewhere (see, for example, Velmans, 2003, 2009, chapters 6 to 14). Suffice
it to say that what it is like to experience the world is of profound first-person significance to
human life, as without it, life would be like nothing.

REFERENCES

Applewhite, P.B., 1975. Learning in bacteria, fungi, and plants. In: W.C. Corning, J.A. Dyal, and
A.O.D. Willows, eds. Invertebrate Learning, Vol. 3: Cephalopods and Echinoderms. New York
and London: Plenum Press.

Atkinson, R.C. and Shiffrin, R.M., 1968. Human memory: A proposed system and its control
processes. In: K. W. Spence and J. T. Spence (eds.) The Psychology of Learning and Motivation,
Vol. 2, New York: Academic Press.

Baars, B.J., 1988. A Cognitive Theory of Consciousness. New York: Cambridge University Press.

Baars, B.J., 1991. A curious coincidence? Consciousness as an object of scientific scrutiny fits
our personal experience remarkably well. Behavioral and Brain Sciences, 14(4), pp. 669-670.

Baars, B.J., 2007. The global workspace theory of consciousness. In: M. Velmans and S.
Schneider, eds. The Blackwell Companion to Consciousness. Malden, MA: Blackwell, pp. 236-
246.

Baars, B.J., Fehling, M.R., LaPolla, M. and McGovern, K., 1997. Consciousness creates access:
Conscious goal images recruit unconscious action routines, but goal competition serves to
“liberate” such routines, causing predictable slips. In J.D. Cohen and J.W. Schooler, eds.
Scientific Approaches to Consciousness. Hillsdale, N.J.: Lawrence Erlbaum Associates, pp. 423-
444.

Baars, B.J. and McGovern, K., 1996. Cognitive views of consciousness: What are the facts? How
can we explain them? In: M. Velmans, ed. The Science of Consciousness: Psychological,
Neuropsychological, and Clinical Reviews. London: Routledge, pp. 63-95.

Baddeley, A.D., 1993. Working memory and conscious awareness. In: A.F. Collins, S.E.
Gathercole, M.A. Conway and P.E. Morris, eds. Theories of Memory. Hillsdale, N.J: Lawrence
Erlbaum Associates, pp. 11-28.

Beshkar, M., 2008. Animal consciousness. Journal of Consciousness Studies, 15(3), pp. 5-33.

Broadbent, D.E., 1958. Perception and Communication. New York: Pergamon Press.

Carruthers, P., 1998. Natural theories of consciousness. European Journal of Philosophy, 6(2),
pp. 203-222.

Crick, F. and Koch, C., 2007. A neurobiological framework for consciousness. In: M. Velmans and
S. Schneider, eds. The Blackwell Companion to Consciousness. Malden, MA: Blackwell, pp. 567-
579.

Crook, J.H., 1980. The Evolution of Human Consciousness. Oxford: Clarendon Press.

Dawkins, M.S., 1998. Through Our Eyes Only? The Search for Animal Consciousness. Oxford:
Oxford University Press.

Dehaene, S. and Naccache, L., 2001. Towards a cognitive neuroscience of consciousness: Basic
evidence and a workspace framework. Cognition, 79, pp. 1-37.

De Quincy, C., 2002. Radical Nature: Rediscovering the Soul of Matter. Montpelier, VT: Invisible
Cities Press.

Dixon, N.F., 1981. Preconscious Processing. Chichester: Wiley.

Edelman, G. M. and Tononi, G., 2000. A Universe of Consciousness: How Matter Becomes
Imagination. New York, NY: Basic Books.

Ericsson, K.A. and Simon, H., 1984. Protocol Analysis: verbal reports as data, Cambridge, Mass.:
MIT Press.

Gallup, G.G., 1977. Chimpanzees: self-recognition. Science, 167, pp. 86-87.

Gallup, G.G., 1982. Self-awareness and the emergence of mind in primates. American Journal
of Primatology, 2, pp. 237-248.

Hauser, M.D., Kralik, J., Botto-Mahan, C., Garrett, M. and Oser, J., 1995. Self-recognition in
primates: phylogeny and the salience of species-typical features. Proceedings of the National
Academy of Sciences of the United States of America, 92, pp. 10811-10814.

Holmes, E. and Yost, M., 1966. Behavioral studies in the sensitive plant. Worm Runner's
Digest, 8, p. 38.

Humphrey, N., 1983. Consciousness Regained. Oxford: Oxford University Press.

Humphrey, N., 2010. Soul Dust: The Magic of Consciousness, London, Quercus.

Huxley, T. H., 1874, On the Hypothesis that Animals are Automata, and its History (Address,
British Association for the Advancement of Science, Belfast), reprinted from T. H. Huxley,
1898, Collected Essays, Vol. 1, London: Macmillan.

James, W., 1890. The Principles of Psychology. New York: Henry Holt.

Jaynes, J., 1979. The Origin of Consciousness in the Breakdown of the Bicameral Mind, London:
Allen Lane.

Jerison, H.J., 1985. On the evolution of mind. In: D. A. Oakley, ed. Brain and mind, London:
Methuen, pp. 1-31.

John, E.R., 1976. A model of consciousness. In: G. Schwartz and D. Shapiro, eds. Consciousness
and Self-Regulation, New York: Plenum Press, pp. 1-50.

Johnson-Laird, P.N., 1988. A computational analysis of consciousness. In: A. Marcel and E.
Bisiach, eds. Consciousness and Contemporary Science. Oxford: Oxford University Press, pp.
357-368.

Lettvin, J.Y., Maturana, H.R., McCulloch, W.S. and Pitts, W.H., 1959. What the frog’s eye tells
the frog’s brain. Proceedings of the Institute of Radio Engineers, 47, pp. 1940-1951.

Mandler, G., 1975. Mind and Emotion, New York: Wiley.

Mandler, G., 1985. Cognitive Psychology: An essay in cognitive science, Hillsdale, NJ: Erlbaum.

Mandler, G., 1991. The processing of information is not conscious, but its products often are.
Behavioral and Brain Sciences, 14(4), pp. 688-689.

Mandler, G., 1997. Consciousness redux. In: J.D. Cohen and J.W. Schooler, eds. Scientific
Approaches to Consciousness, Hillsdale, N.J.: Lawrence Erlbaum Associates, pp. 479-498.

Miller, G., 1962. The Science of Mental Life, Gretna, LA: Pelican Books.

Norman, D., 1969. Memory and Attention: an introduction to human information processing,
New York: Wiley.

Panksepp, J., 2007. Affective consciousness. In: M. Velmans and S. Schneider, eds. The
Blackwell Companion to Consciousness. Malden, MA: Blackwell, pp. 114-129.

Pashler, H., 1999. The Psychology of Attention. London: MIT Press.

Plotnik, J.M., de Waal, F.B. and Reiss, D., 2006. Self-recognition in an Asian elephant.
Proceedings of the National Academy of Sciences of the United States of America, 103, pp.
17053-17057.

Popper, K.R. and Eccles, J.C., 1993[1976]. The Self and its Brain. London: Routledge.

Reiss, D. and Marino, L., 2001. Mirror self-recognition in the bottlenose dolphin: A case of
cognitive convergence. Proceedings of the National Academy of Sciences of the United States
of America, 98, pp. 5937-5942.

Romanes, G.J., 1896[1885]. Mind and Motion (Rede Lecture). Reprinted in: G.J. Romanes, Mind
and Motion and Monism. London: Longmans, Green & Co.

Schacter, D.L., 1990. Toward a cognitive neuropsychology of awareness: implicit knowledge
and anosognosia. Journal of Clinical and Experimental Neuropsychology, 12(1), pp. 155-178.

Seager, W., 2012. Emergentist panpsychism. In: M. Velmans and Y. Nagasawa, eds. Monist
Alternatives to Physicalism, Special Issue of the Journal of Consciousness Studies (in press).

Shallice, T.R., 1978. The dominant action system: an information processing approach to
consciousness. In: K.S. Pope and J.L. Singer, eds. The Stream of Consciousness: scientific
investigations into the flow of experience, New York: Plenum, pp. 117-157.

Shallice, T.R., 1988. Information processing models of consciousness: possibilities and
problems. In: A. Marcel and E. Bisiach, eds. Consciousness and Contemporary Science. Oxford:
Oxford University Press, pp. 305-333.

Sherrington, C.S., 1942. Man on His Nature, Cambridge: Cambridge University Press.

Shillito, D.J., Gallup, G.G. and Beck, B.B., 1999. Factors affecting mirror behaviour in western
lowland gorillas. Animal Behaviour, 55, pp. 529-536.

Singer, W., 2007. Large-scale temporal coordination of cortical activity as a prerequisite for
conscious experience. In: M. Velmans and S. Schneider, eds. The Blackwell Companion to
Consciousness. Malden, MA: Blackwell, pp. 605-615.

Skrbina, D., 2005a. Panpsychism. In: Stanford Encyclopedia of Philosophy,
http://plato.stanford.edu/entries/panpsychism/

Skrbina, D., 2005b. Panpsychism in the West, Cambridge, MA: MIT Press.

Sloman, A., 1997a. Design spaces, niche spaces and the “hard” problem.
http://66.102.9.104/search?q=cache:wo--
I9fMte8J:www.cs.bham.ac.uk/research/projects/cogaff/Sloman.design.and.niche.spaces.ps
+Design+spaces,+niche+spaces+and+the+%22hard%22+problem%27,&hl=en&ct=clnk&cd=3

Sloman, A., 1997b. What sorts of machine can love? Architectural requirements for human-like
agents both natural and artificial.
http://66.102.9.104/search?q=cache:TEfwTa36DlUJ:www.cs.bham.ac.uk/research/projects/co
gaff/Sloman.voicebox.2page.ps+What+sorts+of+machine+can+love&hl=en&ct=clnk&cd=1

Styles, E., 1997. The Psychology of Attention, Hove: Psychology Press Ltd.

Tobach, E., Skolnick, A.J., Klein, I. and Greenberg, G., 1997. Viewing of self and nonself images
in a group of captive orangutans (Pongo pygmaeus abelii). Perceptual and Motor Skills, 84,
pp. 355-370.

Tononi, G., 2007. The information integration theory. In: M. Velmans and S. Schneider, eds. The
Blackwell Companion to Consciousness. Malden, MA: Blackwell, pp. 287-299.

Uttal, W.R., 1978. The Psychobiology of Mind. Hillsdale, NJ: Lawrence Erlbaum.

van Swinderen, B., 2007. Attention-like processes in Drosophila require short-term memory
genes. Science, 315, pp. 1590-1593.

Velmans, M., 1991a. Is human information processing conscious? Behavioral and Brain Sciences,
14(4), pp. 651-669.

Velmans, M., 1991b. Consciousness from a first-person perspective. Behavioral and Brain
Sciences, 14(4), pp. 702-726.

Velmans, M., 2003. How could conscious experiences affect brains? Exeter: Imprint Academic.

Velmans, M., 2009. Understanding Consciousness, Edition 2. London: Routledge/Psychology
Press.

Velmans, M., 2011. Can evolutionary theory explain the existence of consciousness? A review
of N. Humphrey, 2010, Soul Dust: The Magic of Consciousness, London: Quercus. Journal of
Consciousness Studies, 18(11-12), pp. 243-254.

Vesey, G.N.A., ed., 1970. Body and Mind: readings in philosophy, London: George Allen &
Unwin.

Waugh, N.C. and Norman, D.A., 1965. Primary memory. Psychological Review, 72, pp. 89-104.

Weber, M. and Desmond Jr., W., eds., 2008. Handbook of Whiteheadian Process Thought.
Frankfurt / Lancaster: Ontos Verlag.

Weekes, A., 2012. The mind-body problem and Whitehead’s nonreductive monism. In: M.
Velmans and Y. Nagasawa, eds. Monist Alternatives to Physicalism, Special Issue of the
Journal of Consciousness Studies (in press).

Whitehead, A.N., 1957[1929]. Process and Reality, New York: Macmillan.
