Educational Research Review 2 (2007) 75–97
Can we be intelligent about intelligence?
Why education needs the concept of plastic general ability
Philip Adey a,∗, Benő Csapó b,1, Andreas Demetriou c,2, Jarkko Hautamäki d,3, Michael Shayer a,4
a King’s College London, Department for Education and Professional Studies, Franklin Wilkins Building, Stamford Street, London SE1 9NH, UK
b Graduate School of Educational Sciences, Department of Education, University of Szeged, H-6722 Szeged, Petőfi sgt. 30-34, Hungary
c University of Cyprus, P.O. Box 20537, 1678 Nicosia, Cyprus
d Centre for Educational Assessment, P.O. Box 26, University of Helsinki, Helsinki FIN-00014, Finland
Received 5 November 2006; received in revised form 29 April 2007; accepted 2 May 2007
Abstract
We explore the notion of general cognitive ability (or ‘intelligence’) and ask why the time might now be ripe for educators
to reconsider the power offered by a general intellectual capacity which is itself amenable to educational influence. We review
existing knowledge concerning general intelligence, including the cohabitation of general and special abilities, cognitive modules,
development, and evidence for plasticity of the general processor. We examine why this knowledge is generally absent from
educational practice and present a number of models that attempt to synthesise the main aspects of current psychological theories.
We explore how the models might be used in educational applications and look at examples of effective cognitive stimulation,
considering both practicalities and theoretical notions of what in our cognitive models is affected by stimulation. Finally, we discuss
the possible political, cultural and social barriers to the inclusion of general ability as central to educational aims.
© 2007 Elsevier Ltd. All rights reserved.
Keywords: Intelligence; General mental ability; Specialised abilities; Cognitive stimulation; Academic achievement
1. Part I: Characteristics of general ability and some explanatory models
1.1. Introduction
In modern societies where the content of applicable knowledge changes rapidly, educational theorists, researchers,
developers and practitioners are searching for something stable in the mind that helps adaptation throughout the life
span. It is quite clear that just transmitting the core knowledge of scientific disciplines – however up-to-date that
knowledge is – within the traditional school subjects does not solve the problem. This tendency is visible, for
example by the efforts of international organizations that are searching for valid indicators of achievement in school
systems (see the PISA frameworks, especially the notion of Cross Curricular Competencies, OECD, 2000; and the
assessment of complex problem solving, OECD, 2003). Other projects are looking for new ways of setting goals for
education (see Rychen & Salganik, 2001, also the EU framework of key competencies which includes the idea of
“learning to learn”, Hautamäki et al., 2002). Throughout the history of education, at least since Aristotle, the goal
of improving general abilities has frequently been raised but in the absence of adequate theoretical frameworks or
supporting scientific knowledge, these intentions could not be fulfilled.
All these efforts point in one direction, although they avoid naming the key construct they are looking for: this is
the conception of a general ability or intelligence. We believe that without re-introducing general cognitive ability into
the educational discourse, efforts at theorizing educational goals, setting standards and conceptualizing assessment
frameworks cannot be successful. But what about the semantics? What should we call this general ability? ‘Intelligence’
may have fallen into disrepute, but ‘general ability’ seems just a bit too general, and phrases such as ‘general mental
ability’ or ‘general cognitive ability’, quite common in the literature and apparently acceptable to many educators,
are rather awkward. As a compromise, and in the interests of varying style, we propose to use all of those terms
interchangeably throughout this paper. But in doing this, there should be no doubt that the construct we are talking
about is the one that cognitive psychologists call “intelligence”, notwithstanding the fact that the same word has rather
different, and often distasteful, connotations in the world of education. To some extent we would like to rehabilitate
the use of the word with its proper psychological meaning, because in recent years there has been powerful research in
cognitive neuroscience (Jung & Haier, in press) and evolutionary psychology (Geary, 2005) suggesting its importance
both as an explanatory construct and as a construct that may direct interventions. Once one appreciates the plasticity
of intelligence and sees it as perhaps the most complete way both of accounting for individual differences, intellectual
development, and learning, and of offering prescriptions for raising academic achievement across contexts, we believe
it becomes hard to deny intelligence its proper place in the armament of educators. We are here taking the optimistic
line that the world of education has matured somewhat since 1996, when it effectively ignored a seminal paper in
Educational Researcher on the subject by Robert Sternberg: Myths, Countermyths, and Truths about Intelligence
(Sternberg, 1996; see also Cianciolo & Sternberg, 2004).
1.2. The Problem
We have been struck by the disparity which exists between the conceptions of ‘intelligence’ as routinely debated
within the psychological literature, and the conceptions of intelligence which find their way into educational practice and
folklore. If you ask teachers anywhere in the world what counts to them as a ‘clever’ or ‘smart’ response from a student,
you get a remarkably consistent set of characteristics (Adey, 2007). High on the lists of common answers are a set
of essentially convergent abilities (‘thinks logically’, ‘applies knowledge from one context to another’, ‘demonstrates
deep understanding of a concept’) and also a set of generally divergent abilities (‘creative’, ‘asks surprising questions’,
‘goes beyond the given’). These responses explicate professional, intuitive, experience-based conceptions of general
ability. But as soon as you try to suggest that a good word to describe this general ability is ‘intelligence’, you encounter
resistance. One may speculate about the reasons for this disparity: is it because of the rather idiosyncratic uses of the
word now common in educational circles, such as ‘multiple intelligences’ or ‘emotional intelligence’? Or is it because
the word intelligence became associated in the 1970s with the idea of something fixed, immutable, inherited, and even
as the basis for a spurious racism? Actually the reasons do not matter, but what does matter is that an idea which
offers a powerful explanatory mechanism for many of the accepted, but essentially pragmatic, ideas currently offered
as guides to educators (such as expert-novice distinctions, powerful learning environments, or conceptual change) has
been repressed.
In this paper we would like to explore the nature of general cognitive ability, and in particular to show not only that
it is a valid construct (it corresponds to something in reality), but that being general does not equate to being fixed.
It is precisely because this general ability is (a) general and (b) modifiable that it offers such a powerful opportunity
to educators. In this exploration we will need to re-visit some old arguments such as that which pits general versus
special abilities, the question of what develops in cognitive development, the nature of individual differences and
most importantly that related to the influence of the environment on intellectual development. We will draw on the
established literature in the field and from our own empirical work and look at a series of models which explain at least
some of the characteristic features of intelligence. From this we will draw out implications for educational policy and
practice.
One might reasonably ask: what is the practical value of intelligence research and theorizing such that a new round
of discussion would be something more than a simple academic exercise? A straightforward answer to this question
comes from research on the validity of different real life measures which suggests that the most valid predictor of future
performance and learning of new higher-level employees is their general mental ability. This is so because the more
intelligent one is, the more job-relevant knowledge and skills one acquires and the better one can use them (Schmidt &
Hunter, 1998). In fact, intelligence is also the best predictor of future career development (Gottfredson, 2003).
During the 20th century, three main scientific paradigms conceptualised the question of cognitive abilities. First,
there is the psychometric approach, which converged on the conception of general intelligence; then there is the developmental approach mapped out by Piaget; and finally (and under the influence of computer-like models of the mind),
there is the information processing approach. For a variety of reasons, although much promoted by educational theorists
(a few of whom actually realised their theories to the extent of devising curricula based on them), these paradigms
did not find their way into mainstream educational practice as represented by textbooks, national curricula, national
and local examinations, or actual classroom practice. We have already touched on the problem with ‘intelligence’ as
conceived by educators, where ideologically and politically heated debates distorted its scientific meaning. Although
intelligence research accumulated a large amount of scientifically sound, practically relevant knowledge, this knowledge is largely lost to the educational theorist because of essentially semantic problems with the term. At the same time,
the Piagetian developmental model was often misunderstood as implying some sort of lock-step progression through
‘stages’, impervious to environmental influence. Nothing could be further from the equilibration model propounded
by Piaget. As for information processing, only a limited sub-set of its essential features (e.g. expert-novice differences)
are routinely applied in education. The conception of expert knowledge is often exaggerated so that the role of content
and context is overestimated, and the role of transfer is underestimated. Expertise is only one type of valid knowledge.
From a theoretical point of view the main problem is that each of these paradigms explains only one aspect of the
question. This is why we will need to look at a series of models which synthesise several aspects of these paradigms
in an attempt to improve both explanatory power and acceptability. Before we present these models, we will discuss
five characteristics of intelligence which have been prominent in our experience of working with and arguing about
the concept, and which commonly arise in the academic and popular literature (e.g. Murray, 2007; Sternberg, 1996):
connectivity, general versus special, modules, development, and plasticity. Models of intelligence will have to offer
explanations for some, at least, of these characteristics.
1.3. The nature of general ability/intelligence
We have seen that practitioners’ conceptions of general ability always include features which require that students
make some sort of connection in their minds between, for example, one context and another, actuality and possibilities,
or between data items such that a pattern is perceived. The professionals who are in the business of teaching and
learning therefore see connectivity as a central characteristic of smart behaviour. ‘Connectivity’ here is used in the
simple sense of the conscious or unconscious making of connections in the mind between one idea and another. In this
sense, making comparisons, extrapolating, relating causes to effects, or the elucidation of any relationship between
variables all involve connectivity. These characteristics actually accord very well with ideas of intelligence developed
by the founding fathers of the idea. In the classic theory advanced by Spearman (1927), the eduction of relations and
of correlates are the two fundamental attributes of general intelligence. The first refers to first-order direct relations
between objects or concepts and the second refers to second-order relations between relations. For example, in the
statement ‘London is to the UK as Paris is to France’ the eduction of relations enables the thinker to understand the
relation between each pair (that London is the capital of the UK and Paris is the capital of France) and the eduction
of correlates allows the thinker to generalize and reduce the two relations to a general one (the capital of the country). This
is tantamount to saying that intelligence is a process allowing the person to see relations, draw connections between
objects and events in the environment, and extrapolate or go beyond the information given (Bruner, 1968). Perceiving
the relations and connections between aspects of the environment allows one to understand a new concept, solve a
new problem, or formulate a new idea by a systematic transformation of the concepts, problem-solving skills, or ideas
already available. These properties are also considered to underlie fluid intelligence (Cattell, 1971).
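As a toy illustration (our sketch, not Spearman’s formalism), the two processes can be mimicked in code: a first-order process extracts the relation within each pair of terms, and a second-order process abstracts the general relation the pairs share. The knowledge base and function names here are hypothetical:

```python
# Toy sketch of Spearman-style relational reasoning (our illustration).
# The knowledge base below is invented for the example.
CAPITAL_OF = {"London": "UK", "Paris": "France", "Tokyo": "Japan"}

def relation(a, b):
    """First-order process: name the direct relation between two terms, if known."""
    if CAPITAL_OF.get(a) == b:
        return "is-capital-of"
    return None

def shared_relation(pair1, pair2):
    """Second-order process: the general relation the two pairs instantiate."""
    r1, r2 = relation(*pair1), relation(*pair2)
    return r1 if r1 is not None and r1 == r2 else None

# 'London is to the UK as Paris is to France' reduces to one general relation.
print(shared_relation(("London", "UK"), ("Paris", "France")))  # is-capital-of
```

The point of the sketch is only structural: intelligence, on this classic view, is the capacity to perform both steps, extracting relations and then relations between relations.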
In a recent paper about the nature of fluid intelligence, Blair (2006) argued that neurological evidence suggests
that fluid intelligence is distinct from psychometric g traditionally defined as a higher order general factor. According
to Blair (2006) the main components of fluid intelligence are working memory and executive control and these are
related to emotional processes which function as an orienting mechanism for intellectual functioning. This general
conception of fluid intelligence seems to be supported by modern psychometric evidence (Demetriou, 2004), although
scholars add to the list of fluid components some other constructs such as selective attention (Cowan, 2006), planning,
and abstraction (Garlick & Sejnowski, 2006). All of these constructs converge to a common denominator for fluid
intelligence: intellectual plasticity, which is both a developmental and a differential construct. From the point of view
of development, it reflects the brain’s and thus the mind’s increasing ability to construct and use goal-relevant and
appropriate mental structures and skills and the responsiveness of the developing brain to environmental stimuli.
From the differential point of view, it reflects the fact that differences between individuals in plasticity will eventually
demonstrate themselves as differences in actual intellectual constructions and achievements.
Recent neuroscience and genetic research provide strong biological support to this interpretation. Specifically, it is
now well established that a number of areas in the frontal and the parietal lobes of the brain are closely and causally
associated with general intelligence (Jung & Haier, in press). Moreover, there are aspects of brain functioning, such as
energy consumption as indicated by glucose metabolism, that are closely associated with g. That is, more intelligent
individuals use less brain volume and therefore less energy to process a task than individuals of average intelligence
(Jausovec, 2000). Garlick (2002) proposed that the underlying neural correlate of g might be the plasticity of the nervous
system to respond appropriately to stimulation and build the required networks that may be spatially differentiated and
distributed for different kinds of tasks.
It remains the case, however, that a single general intelligence is not sufficient to account for actual performance
and achievements in real life, and this suggests that we need to look also at the role of more specialised abilities that
contribute to overall cognitive functioning.
1.4. General versus special
Just how general is general ability? Does it underlie all functioning of the mind, or just all cognitive functioning,
or is there no such thing: are all mental abilities entirely context dependent? The extreme positions in this debate may
be characterised by, on the one hand, the overarching importance given to general intelligence or ‘g’ by some scholars
(e.g. Jensen, 1998; Schmidt & Hunter, 1998) as the most important determinant of all academic ability and, on the
other hand, the priority given by some others (e.g. Gardner, 1993; Thurstone & Thurstone, 1941) to distinct abilities
which independently frame academic performance in different domains of knowledge and learning. We can find no
support in the current psychological literature for either of these extreme views. On the one hand, the operation of
general constraints on intellectual performance is strongly supported by the fact that significant and often substantial
correlations are obtained between tests of different types of cognitive abilities, for example spatial, numerical and verbal
(Anderson, 1992; Deary, 2000). On the other hand, only a part, no more than 25–30%, of the variance in different domains
of learning and knowledge can be accounted for by measures of general intelligence, such as IQ (Demetriou & Andreou,
2006). To capture and explain performance in different domains of knowledge, one needs to have information about
each of the domains as well as about the general cognitive processor.
What emerges from the vast literature that exists on the structure of general abilities is that both general and specialised
processes are interwoven in the human mind. This is clearly explicated in Carroll’s (1993) massive re-working of factor
analyses of over 450 datasets collected from the 1930s to the 1980s. Carroll’s work offers a three-strata model of abilities,
with general intelligence, g, at the apex, a series of broad abilities at an intermediate level (fluid intelligence, crystallised
intelligence, general memory and learning, broad visual perception, broad auditory perception, broad retrieval ability,
broad cognitive speediness, processing speed) and then many specialised component abilities within each of the
second stratum broad abilities. To take one example, ‘Crystallised Intelligence’ is a common factor in, amongst others,
language development, lexical knowledge, reading comprehension, foreign language aptitude, listening ability and
writing ability.
In a review of relationships between various behavioural, medical, and social phenomena, and individual differences
in general intelligence, Lubinski and Humphreys (1997) concluded that the relations are so strong and consistent that
to ignore the general factor would be to commit the logical fallacy of the “neglected aspect,” by excluding important
relevant evidence. They argue that: “ignoring the possibility that general intelligence holds causal status for a variety
of critically important behaviours is no longer scientifically respectable. We must let general intelligence compete with
other putative causes. We must answer questions concerning its scientific status empirically” (p. 190).
Several other models coming from the psychometric (Gustafsson, 1984; Gustafsson & Undheim, 1996), the cognitive
(Anderson, 1992) and the developmental traditions (Demetriou, 2004; Demetriou, Christou, Spanoudis, & Platsidou,
2002) converge on the delineation of the mind as a hierarchical system that involves both general-purpose and specialised functions and processes that interact dynamically both during on-line problem solving and in ontogenetic time.
Gustafsson (2002) has recently reminded us that with hierarchical models such as this, the most general factor is actually best measured by a heterogeneous set of tests acting together. This result seems to be counter-intuitive, given that
the strength of the relationships between g and the tests of specialised abilities are not particularly high. Nevertheless,
the g-factor is, to some extent, present in every ‘special’ test and using a varied (within certain limits) set of tests allows
the most comprehensive reliable measure of g to emerge as the factor that each individual test has in common. This
means that the search for a ‘pure’ homogeneous test of g (or for any of the special abilities) is actually fruitless.
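The logic of extracting a common factor from a heterogeneous battery can be sketched numerically. The following minimal simulation is ours (the loadings are invented, not from any study cited here): each test loads partly on a latent general factor and partly on a test-specific factor, and the common factor is then recovered as the first principal component of the battery’s correlation matrix.

```python
import numpy as np

# Minimal simulation (ours; the loadings are hypothetical, not from the paper).
rng = np.random.default_rng(0)
n_people = 5000
g = rng.normal(size=n_people)                   # latent general ability
loadings = np.array([0.5, 0.6, 0.4, 0.7, 0.5])  # invented g-loadings for, say,
                                                # spatial, verbal, numerical,
                                                # memory and speed tests
tests = np.column_stack([
    lam * g + np.sqrt(1 - lam ** 2) * rng.normal(size=n_people)
    for lam in loadings
])

R = np.corrcoef(tests, rowvar=False)            # positive manifold: all r > 0
eigvals, eigvecs = np.linalg.eigh(R)            # eigenvalues in ascending order
first_pc = eigvecs[:, -1]                       # leading eigenvector ~ g
first_pc = first_pc * np.sign(first_pc.sum())   # fix the arbitrary sign

# Every test, however 'special', loads positively on the common factor.
print(np.round(first_pc, 2))
```

This also illustrates Gustafsson’s point: no single test in the battery is a pure measure of the common factor, yet the heterogeneous set acting together identifies it.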
We should perhaps note here that the interpretation of factor analysis of human abilities as demonstrating a single
general factor is not universally accepted. Soon after Spearman offered his first hierarchical model, Thomson (1916)
demonstrated that a hierarchy did not necessarily imply a single general factor and, at the other chronological extreme,
Horn (2007) is not the first to make the point that just because a ‘g’ can be extracted from various sets of tests does
not mean that it is the same ‘g’ in every case. We maintain, nonetheless, that for all practical purposes a more or less
unitary general intellectual factor remains the most parsimonious explanation of the factor analyses and the most useful
construct on which to base educational practice, especially cognitive intervention, of which more anon.
Given the strength of the evidence in support of the operation of general constraints on intellectual functioning, it
might be pertinent to ask why the idea of entirely autonomous ‘multiple intelligences’ is apparently so popular with
some educators, in spite of the absence of any convincing evidence for the independence of each of the supposed
intelligences from one another. We can only speculate here that the proposal of independent multiple intelligences
arose as a very understandable reaction against a literature from some high profile American advocates of the role of
general intelligence, such as Herrnstein and Murray (1994) and Jensen (1973), who promoted general intelligence as
not only central to all human performance, but also as largely inherited and responsible for social and ethnic differences
in performance and life success. Quite apart from the fact that this approach has been associated with racist and classist
stances towards individual and group differences, which is properly repugnant to liberal educators, the idea of g as
being essentially fixed by heredity would call into question all efforts of educators to do any more than sort students
by ability and provide a series of curriculum offerings supposedly suitable for each ability band. To be fair, Jensen has
often pointed out that because of the co-occurrence of genetics and upbringing, heritability of intelligence does not
necessarily imply such an interpretation but this point has often been missed. It was natural, therefore, for educators
to welcome a counter-blast to such a view and the idea of multiple intelligences offered, at least, a chance to recognise
that each of their students might have potential in some special areas. We now consider this belief to be unsubstantiated
wishful thinking. However justified it may be to reject the notion of a largely hereditary and immutable ‘g’, to reject
any sort of ‘g’ is not justified. In fact, there is now clear empirical evidence indicating that there is a general factor on
top of Gardner’s multiple intelligences (Visser, Ashton, & Vernon, 2006). Thus, rejecting general intelligence because
it has been misused for some time in the past is tantamount to throwing out the baby with the bath water.
1.5. Modules
Distinct from specialised functions of the mind which develop in complexity over a period of time, are cognitive
‘modules’ which have specific functions but which generally appear quite quickly in fully-fledged form. A module
may be defined as a complex of processes that serves a particular biological or psychological function more or less
automatically and autonomously of other modules. When the input related to the module is present the module-specific
behaviours are released to meet the needs of the moment. An early example of such a module is the general language
processor proposed by Chomsky (1986). He argued that the fact that humans are the only species to be able to acquire
language within a particular time window is strong evidence that it is controlled by a module that specifies and constrains
when and how language is organised and used. Humans acquire language far faster and more efficiently than could
be explained by simple associationist or behaviourist (Skinnerean) learning. Modules offer an explanation for human
competencies which is independent of general cognitive abilities. For example, whatever computational complexity
is involved in perceiving the world as three-dimensional, or in distinguishing an approaching object which is
large and distant from one which is small and near, everyone can do it, and do it almost instantaneously regardless of
general mental ability. It seems plausible to suppose that this ability evolved as an essential survival mechanism.
Modules may provide some explanation for the extraordinary phenomenon of idiots savants, that is, individuals
who show advanced abilities in a particular domain, such as the ability to calculate days of the week for dates far
in the past or future, despite their being severely retarded in all other domains. Because of this bizarre pattern of
their accomplishments (see Sacks, 1985 for some stranger examples), idiots savants challenge the notion of general
intelligence, which implies the necessity of a minimal general level of functioning to produce outputs from any domain.
They are, however, extremely rare.
For Fodor (1983, 2000) an essential feature of cognitive modules is ‘Informational Encapsulation’. They produce
outputs in response to inputs, drawing only on the information particularly available to that module, and not other
information that may generally be available to all parts of the mind. In this they are akin to reflex reactions, outside
the control of general information or processing capacities of the brain. The language processor, for example, draws
on particular syntactical understanding which may remain entirely implicit, and cannot be explicated by the speaker.
Recently Seok (2006) has provided a comprehensive typology of modules, charting the way that the term is variously
used by different traditions within psychology. For our present purposes we need note only the following features:
• Modules may be ‘big box’ or ‘little box’; they may, like the supposed language module of Chomsky, incorporate
many subsystems and cover a broad area of human cognitive processing; or they may be highly specific, such as for
phonetic analysis or line recognition.
• There is disagreement about whether modules require no, or only the most minimal, environmental triggers to start to
function, or whether their appearance and subsequent development may be quite heavily dependent on the strength
and particular features of the environment in which a person finds him or herself. In the former case the environment
provides no more than fine-tuning, while in the latter case modules actually appear as a result of interaction between
the mind and the environment (Karmiloff-Smith, 1991).
• Seok concludes, with Fodor, that Informational Encapsulation is indeed the central, but not the only, characteristic
that distinguishes modules from other hypothesised cognitive processors.
There is some disagreement in the literature about the relative importance of modular, special, and general processing
mechanisms of the mind. Some, such as Cosmides and Tooby (1994) and Pinker (1997), subscribe to what Seok calls
‘massive modularity’, explaining almost all mental function in terms of a large number of modules. Others restrict
modularity to a relatively small number of specialised functions, which have perhaps evolved as efficient ways of
dealing with the particular demands of an organism’s environment and lifestyle (e.g. essentially hunting, or being
hunted, or succeeding with brain-power). But even the most modular models of the mind must concede some sort of
coordination, some functions general enough to orchestrate the modules. Implicit throughout Seok’s comprehensive
analysis of ways in which modules have been conceived is the presence of an underlying general cognitive processor,
whose speed (and perhaps capacity) of processing determines much of the efficiency of input and output of modules.
Indeed, a purely modular view of the mind would face another serious conceptual problem since it cannot explain our
ability to deal with novelty and learn new concepts that do not belong to any of the module-specific domains which
have developed through evolution. Some kind of general process is required to account for our ability to handle the
new and the unexpected by capitalizing on past experience and knowledge. Putting this another way, although there
may be ‘innately specified knowledge’ (Demetriou, 2004; Demetriou & Raftopoulos, 1999; Karmiloff-Smith, 1991)
in particular areas that reflects the sculpture of evolution, so to speak, on our ancestry, this knowledge is not enough. A
disposition to handle novelty is part of the evolutionary sculpture too. Module-specific knowledge and novelty handling
coexist at the starting point of development (Geary, 2005; Kanazawa, 2004).
There certainly is a danger in short-circuiting a deep analysis of the nature of general cognitive ability and its
development by positing ever more modules which save appearances but offer no real explanatory model of how the
mind functions as a whole.
1.6. Development
One very obvious feature of intellectual ability is that it develops with age. Although this truth was established
long before Piaget, it was Piaget, Inhelder, and their co-workers who, through a lifetime of empirical studies and
inspired theorising, uncovered the qualities of thinking at different phases of life and theorised how the qualities of each
subsequent phase emerge from and expand the qualities of the previous phases. According to Piaget’s theory, the mind
is a meaning-making system that actively constructs understanding through mental operations on representations of the
external world. The stages of cognitive development represent qualitatively different structures of mental operations
that provide access to increasingly complex and abstract aspects and relations in the world. For Piaget, an older
child is not just a more clever or more efficient processor of information than a younger person. He or she is able to
see more, deeper, and more complex and intricate relations and reduce them to a more economical and parsimonious
representation. That is, the more we ascend the ladder of developmental stages the more flexible, abstract, and organised
our thinking becomes.
Criticisms of the Piagetian model include accusations of inadequacies in both its conception of the stages of
development and the mechanisms driving ascension along them. Specifically, Piaget’s neglect of individual differences
resulted in a general stage sequence which does not describe many of the children at each age phase. The research
of more differentially and educationally oriented researchers (e.g. Shayer, Küchemann, & Wylam, 1976; Shayer &
Wylam, 1978) was able to go beyond the general Piagetian stages and establish norms and distributions for these
stages in whole populations. These latter studies confirmed what teachers had long suspected: that the range
of stages of thinking which occurred in a representative sample of children was extremely wide. Typically, this was
the intellectual equivalent of having in a year group of 12 year olds both children at a stage typical of 6 year olds and
those typical of 18 year olds. In other words there is something like a 12 year range around the median.
These variations raise two important questions. First, what exactly is it that develops when thinking becomes
more complex and abstract? Second, what are the factors underlying individual differences in the rate of intellectual
development and the cognitive profiles of different individuals? The answers given to these questions vary depending
on what model of the mind is adopted, and this in turn has implications for what, if anything, teachers and parents can
do about the cognitive development of their children. This will be the main theme of the latter part of this paper.
1.7. Plasticity
In the psychometric tradition, the dominant view was that intelligence is largely fixed and impervious to environmental influence (Herrnstein & Murray, 1994), but the tenability of this position has never been strong, and adherence to a ‘fixed intelligence’ model would appear to have a more political than scientific motivation.
Many years ago Dasen (1972) edited a series of studies showing how the apparent cultural variability of the appearance
of formal operations was explicable in terms of environmental opportunities, and that when the environment changed
(as for example in the urbanisation of rural West Africans) so did development of higher level thinking. The ‘Flynn
effect’ (Flynn, 1987, 1994, in press) shows that national mean scores on standard IQ tests have been rising steadily
since the 1930s, by 0.59 points a year for tests of fluid intelligence and 0.38 points a year for tests of crystallised intelligence. Whilst the reason for this rise is unclear, any hypothesis put forward to explain it (for example a generally more
intellectually stimulating environment, better nutrition, or better and longer schooling) requires that the environment
has had an influence on general mental ability, and on a massive scale. Even if, as seems possible now, the effect is
levelling out in liberal democracies with wide access to high quality education (Sundet et al., in press, present evidence
that gains have ceased in Norway; Emanuelsson, Reuterberg, and Svensson (1993) and Teasdale and Owen (2000)
present evidence that gains have flattened since the late 1980s in Sweden and Denmark, respectively), the evidence
that general mental ability is strongly affected by the intellectual environment remains compelling. Indeed the ‘anti-Flynn’ effect
recently reported by Shayer, Coe, and Ginsburg (2007), in which a representative sample of 12 year olds tested on the main
conservation and density concepts performed some 25 percentile points less well than a similar sample tested in 1976,
further supports the argument that a changing environment changes levels of general cognitive functioning.
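To give a sense of scale, the per-year rates quoted above compound into large generational shifts. The following extrapolation over a notional 30-year generation is our own arithmetic, not a figure from Flynn's papers:

```latex
\Delta_{\text{fluid}} \approx 0.59 \times 30 \approx 18 \ \text{IQ points}, \qquad
\Delta_{\text{crystallised}} \approx 0.38 \times 30 \approx 11 \ \text{IQ points}
```

On a scale where one standard deviation is 15 points, this amounts to an environmentally driven shift of the order of a full standard deviation in fluid intelligence within a single generation.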
There is strong evidence at both the individual and the collective level that general intelligence does respond to
environmental influences so that it can increase or decrease with respect to the norm of a given population at a given
time. Garlick (2002), arguing from the perspective of neural networks, presents extensive evidence for “axonal and
dendritic plasticity in response to experience” (p. 120) and goes on to claim that one of the characteristic differences
between the human and other primate brain is its inherent plasticity. In the same line, computational models of cognitive
developmental processes and brain research suggest that mental computations and the underlying neural networks are
tightly intertwined so that the current structure and functional possibilities of neural networks constrain what cognitive
processes can be carried out at a particular moment but, in turn, these processes, once initiated and carried out alter the
structure and functioning of the neural networks in a dynamic loop where different levels of functioning and description
constrain and modify each other (Westermann et al., 2007). The modifiability of general intellectual ability is a central
theme of this paper, indeed its raison d’être, so we will return to it at length after presenting some models of the
mind. At this point we need only make the point that plasticity is one of the characteristics of general ability that any
satisfactory model of the mind needs to be able to explain.
1.8. Models of cognition
So far we have noted the following characteristics of general intellectual ability, for which any explanatory model
proposed will have to offer some account:
• Intelligence as connectivity.
• The development of intelligence.
• General intellectual ability is plastic: its development is amenable to both retardation and stimulation.
• Individual differences in general mental ability.
• The nature, status, and role of modules.
• Specialised abilities.
The models which follow each explain some of these characteristics, although none meets all of them entirely
satisfactorily. They are selected as representative of a range of different types of explanation which have been offered
for intelligence, and as each makes a distinct contribution to the debate. We are aware that an alternative description of
the field of intelligence may be made based on the concept of metaphor (Cianciolo & Sternberg, 2004), but we favour
our ‘scientific’ approach over a ‘literary’ one. We speak, in Sternberg’s terms, of computational, epistemological and
systems metaphors.
1.8.1. Information processing
The critical feature of all information processing models is that they treat the mind as an input – processing – output
device, which incorporates some form of central processor, frequently referred to as Working Memory (Baddeley,
1990). In its classical version, Baddeley’s model depicts working memory as a system that involves a central executive
which is responsible for directing current processing to operate on information according to the current goal and which
filters out irrelevant information (i.e., assesses, selects, compares, collates, and connects information) together with
two slave systems, phonological storage and visual storage, responsible for the immediate storage of verbal and visuospatial information. Current research suggests that Working Memory (WM) traces never last beyond 2 s, and in most
cases all significant brain activity has ceased by about 1200 ms or earlier. WM has a very limited capacity (typically
seven bits of information for intelligent adults) so there must be, in addition to WM (which has to be cleared quickly to
leave attention for fresh stimuli) some temporary buffer, or short-term memory, between WM and long-term memory.
Finally, we know that WM capacity develops with age, from being able to handle just one bit of information at a time
at birth, to the full adult capacity somewhere around 18 years of age. In short, working memory is seen as having three
essential features: information is held for a very short time, it has a very limited capacity, but both the capacity and the
efficiency with which a given capacity is used develops with age and learning experience.
In recent years, executive control has acquired prominence as one of the key features of successful intellectual
performance in general and of intellectual development in particular (Case, 1992; Zelazo & Frye, 1998). This is
supposed to direct attention and to filter out unwanted stimuli (noise) from the signal to which attention is being paid. It
is also said to be responsible for planning and indeed some authors (Das, Naglieri, & Kirby, 1994) claim that attention and
planning are the essential building blocks of intelligence. Unfortunately the precise nature of executive control has not
yet been well explicated, and it falls into the twin dangers of the homunculus (the little man in the machine who himself
requires explanation) and of consciousness, a notoriously intractable problem for philosophers and psychologists. Ill-defined though the executive function may be, the IP model does need some account of attention-direction and filtering,
and for the time being the ‘executive’ will have to fulfil that role.
This notion of cognitive processing accounts very well for the idea of intelligence as connectivity, for individual
differences, and for the development of intelligence. To make connections one must be able to hold different bits of
information in mind at once. Differences between individuals of greater or lesser general mental ability and between
younger and older individuals can be described in terms of different working memory capacities (Kyllonen & Christal,
1990). However, the various formulations of the IP model do not provide any account of the role of special abilities, either
those which develop continuously or those we have described as ‘modules’. Plasticity, the influence of the environment
on promoting intelligence, is not immediately explained by IP models, although the strategies of ‘chunking’ and ‘overlearning’, which collapse separate bits of information into one and which lead to some processes becoming automated,
offer an explanation of the way in which a given working memory capacity can be used more efficiently. This will give
the impression of more intelligent responses in familiar topic areas, but it can hardly be equated to genuine enhancement
of the central processing mechanism, which would imply an actual growth in working memory capacity.
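The efficiency gain from ‘chunking’ mentioned above can be made concrete with a toy sketch. This is our own illustration rather than a model from the IP literature: recoding familiar patterns as single chunks reduces the number of working-memory slots a string of digits occupies, without any growth in capacity itself.

```python
# Illustrative sketch (not from the paper): how 'chunking' lets a fixed
# working-memory capacity of about seven slots hold more raw information.

WM_CAPACITY = 7  # typical adult span, as cited in the text


def slots_needed(items, chunks):
    """Count WM slots used when known patterns are recoded as single chunks."""
    used = []
    i = 0
    while i < len(items):
        for chunk in chunks:                 # try to match a known pattern first
            if items[i:i + len(chunk)] == chunk:
                used.append("".join(chunk))  # the whole pattern fills one slot
                i += len(chunk)
                break
        else:                                # no pattern matched: store raw item
            used.append(items[i])
            i += 1
    return used


digits = list("19411989")                    # 8 digits: exceeds a 7-slot span
known = [list("1941"), list("1989")]         # familiar dates learned previously

raw = slots_needed(digits, [])
chunked = slots_needed(digits, known)
print(len(raw), len(chunked))                # 8 slots vs 2 slots
```

The same eight digits overload an unaided seven-slot span but occupy only two slots once recoded as two familiar dates, which is exactly the sense in which over-learning gives "the impression of more intelligent responses in familiar topic areas".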
1.8.2. Cognitive development
Piaget’s theory is too well known and frequently explicated to bear repetition here, but for our present purposes we
need to note two central features. The mind is seen as a meaning-making system that actively constructs understanding
through mental operations on representations of the external world. The stages of cognitive development through which
it passes represent qualitatively different structures of mental operations that provide access to increasingly complex
and abstract aspects and relations in the world.
One might wonder why the ‘epidemiology’ of formal operational thinking (surveys of its occurrence in populations)
has not followed Piaget’s and Inhelder’s claim for universality. It could be that concrete operational thinking is a universal
form of thinking, based on and formed in everyday situations, but that the formal schemata need to be taught, or rather
formed in a well-designed interventionist way. The interpretation suggested is that Piaget had described, not the norm,
but the actual genetic programme that all are born with (his term was ‘the epistemic subject’). On this view the surveys
such as CSMS (Shayer et al., 1976; Shayer & Wylam, 1978) describe the phenotype, and the overall environment
through which children in Western countries pass is not at all favourable to the realisation of the genotype.
This model is quite compatible with the idea of intelligence as connectivity since all accounts of cognitive development emphasise the increasing number of variables which the mind can handle as it matures. Indeed Pascual-Leone
(1970) proposed a direct correspondence between each Piagetian substage and a value of working memory capacity
such that each of the seven levels of working memory corresponds to the Piagetian stages of prelogical (one unit),
intuitive (two units), early concrete (three units), late concrete (four units), transitional to formal (five units), early
formal (six units), and late formal thought (seven units). On the other hand, Nunes et al. (2007) have recently produced
evidence from a maths intervention with 8 to 9 year-olds that a Piagetian logic test explained variance over and above
that provided by a test of working memory and a general intelligence test. Hence the Piagetian model accounts for more
than WM alone, contrary to the claim of Kyllonen and Christal (1990). Gustafsson’s (2002) argument quoted earlier
that a homogeneous test can never capture the construct of general intelligence is relevant here. While relatively simple
constructs such as working memory or inductive reasoning (Klauer, 1989b; Raven, 1960) are attractive candidates as
the key features of general intelligence, in fact general mental ability is a more subtle and complex construct which
will not be captured by homogeneous tests.
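Pascual-Leone's proposed correspondence, described in the paragraph above, amounts to a simple lookup table. The sketch below is our own coding of it, with the substage labels exactly as given in the text:

```python
# Pascual-Leone's (1970) correspondence between working-memory capacity
# (M-capacity, in units) and Piagetian substage. The mapping follows the
# text; the code around it is purely illustrative.

M_CAPACITY_TO_STAGE = {
    1: "prelogical",
    2: "intuitive",
    3: "early concrete",
    4: "late concrete",
    5: "transitional to formal",
    6: "early formal",
    7: "late formal",
}


def stage_for(working_memory_units: int) -> str:
    """Map a working-memory capacity in units to its Piagetian substage."""
    if working_memory_units < 1:
        raise ValueError("capacity must be at least one unit")
    # capacities beyond seven units all correspond to late formal thought
    return M_CAPACITY_TO_STAGE[min(working_memory_units, 7)]


print(stage_for(4))  # late concrete
```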
The Piagetian account is also entirely consistent with the evidence of plasticity of intelligence. Piaget hypothesised
that the mind develops – progresses through its characteristic stages – partly in response to physiological development
but also importantly as a process of accommodation to events in the physical and social world outside the individual.
The Cognitive Acceleration programmes which will be discussed in more detail later have drawn directly on this
idea and have operationalised the hypothesis that cognitive development can be stimulated by well managed cognitive
conflict and social construction.
1.8.3. Anderson’s model
Although not widely discussed in the education psychology literature, we mention Anderson’s (1992) model of the
mind here simply to illustrate the problems that a largely modular model has in explaining many of the features of
cognitive development which are of interest and importance to educators. The Anderson model incorporates much of
what is understood by working memory within a ‘Basic Processing Mechanism’ (BPM), but it also depends heavily on
modules, which are defined as “functionally independent, complex, computational processes of significant evolutionary
importance, which show no relationship to individual differences in intelligence” (p. 63). The model supposes that
there are two quite distinct routes to knowledge acquisition. The first employs two ‘Specific Processing Mechanisms’
(SPMs) which, in spite of their name, have a rather general function but which are mediated by the BPM. Knowledge
acquisition by this route is determined by the speed of the BPM up to the point that it becomes so fast that the speed of the
SPM becomes the rate determining step. The second route to knowledge acquisition is directly via the modules. While
not attempting to enumerate the modules (for fear of clouding the issue with many individual arguments) Anderson
offers four examples: perception of 3D space, phonological encoding, syntactic parsing, and theory of mind.
This model does provide an explanation for the ‘general plus specialised’ nature of cognitive abilities, and for the fact
that intellectual development and individual differences in intelligence do not have the same underlying cause. That is,
a less intelligent individual is different in important ways from a younger child of average intelligence. “Retardation”
is not just a matter of slow development. Anderson claims that it is the speed of the BPM which explains individual
difference in intelligence, but it is not the increase in this speed which accounts for development. Rather development
is explained by the emergence of new modules as the individual matures.
One problem with this model is that processing speed and processing capacity appear to be treated as the same
thing, or as inevitably entrained with one another. On the simple information processing model described above an
essential aspect of cognitive development is an increase in the number of variables that can be held in mind at once and
so operated upon. The powerful intellect may be able to relate together four or five factors and still have WM space to
hold temporary partial solutions en route to the final solution. It could do this quite slowly. On the other hand, if WM
is limited, it is no good speeding up the process of considering each factor in turn if they cannot remain in working
memory to be operated upon. The distinction has long been made in the psychometric literature between ‘power’ tests
and ‘speeded’ tests. This is not to gainsay the proposition that in practical problem solving there may well be a trade-off
between power and speed, and that different individuals may reach adequate solutions by different means.
1.8.4. Demetriou’s model
Originating from the Piagetian tradition and then trying to account both for on-line processing of information
(the main target of information processing models) and for individual differences (the main target of psychometric
theories of intelligence) Demetriou (Demetriou, 2004; Demetriou, Christou, Spanoudis, & Platsidou, 2002; Demetriou,
Efklides, & Platsidou, 1993; Demetriou & Kazi, 2001, 2006; Kargopoulos & Demetriou, 1998) has developed a model
which involves both central and general mechanisms and a number of specialised capacity systems that deal with
different domains of knowledge and relations in the environment. The model (see Fig. 1) has a three-level hierarchical
structure involving domain-general and domain-specific processes and functions.
Fig. 1. Demetriou’s model of central and specialised cognitive processors.
There is, first, a set of specific systems of thought that specialise in the representation and processing of problems and
relations in distinct domains. The systems of spatial, verbal, numerical, categorical, causal, and social reasoning have
been identified by psychometric, experimental, and developmental methods as autonomous domains of understanding
and problem solving. These are distinct from each other in terms of their function and the mental operations needed
to deal with domain-specific information and relations. Examples of such operations include mental rotation in the
domain of spatial reasoning, truth and validity identification in the domain of verbal reasoning, quantity estimation and
manipulation in the domain of numerical reasoning, and interpersonal understanding in the domain of social reasoning.
Second, there is a domain-general system that involves processes and functions that are responsible for self-awareness
and self-regulation, including monitoring, mapping, regulating, and coordinating all kinds of mental processes and
actions. We use the term “hypercognitive” to refer to these processes in order to stress their main characteristic of
standing over and above the cognitive abilities included in the other systems. This system includes the executive
processor described above under the information processing model. There are five basic components: (i) a directive
function which sets the mind’s current goals; (ii) a planning function that proactively constructs a road map of the steps
to be made in pursuit of the goal; (iii) a comparator or regulatory function which regularly makes comparisons between
the present state of the system and the goal; (iv) a control function which registers discrepancies between the present
state and the goal and suggests corrective actions; (v) an evaluation function which enables the system to evaluate
each step’s processing demands vis-a-vis the available structural possibilities and necessary skills and strategies of the
system. These processes operate recursively, so that goals and sub-goals may be renewed according to every moment’s
evaluation of the system’s distance from its ultimate objective. Conscious awareness and all ensuing functions, such
as a self-concept (awareness of one’s own mental characteristics, functions, and mental states) and a theory of mind
(awareness of others’ mental functions and states) are part of the hypercognitive system. In other words, it
generates knowledge about cognitive functioning that somehow represents the actual state of the various domain-specific modules oriented to the solution of problems posed by the environment. Obviously, other widely used terms,
such as “metacognition” (i.e., reflection on particular mental processes), “executive control processes” (i.e., selection,
orchestration, and planning of actions, overt or mental), “theory of mind”, and Sternberg’s (1985) metacomponents
denote processes that belong to and serve complementary functions in this system (Demetriou, 2000; Demetriou &
Kazi, 2006).
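As a purely illustrative sketch (Demetriou specifies no algorithm, and the numeric state here is our own invention), the five hypercognitive components listed above can be read as a recursive control loop in which a capacity limit bounds each corrective step towards the current goal:

```python
# Toy control loop sketching the five hypercognitive components:
# (i) directive, (ii) planning, (iii) comparator, (iv) control, (v) evaluation.
# An illustrative coding of the prose, not a model from Demetriou's papers.

def hypercognitive_loop(start, goal, capacity, max_cycles=100):
    """Drive `state` towards `goal` in steps bounded by processing capacity."""
    state = start                      # (i) directive: the goal is fixed as `goal`
    trace = [state]
    for _ in range(max_cycles):
        # (iii) comparator: measure distance between present state and goal
        discrepancy = goal - state
        if discrepancy == 0:
            break
        # (ii) planning: the intended move is the full remaining discrepancy
        intended = discrepancy
        # (v) evaluation: scale the step down to the available capacity
        feasible = max(-capacity, min(capacity, intended))
        # (iv) control: apply the corrective action that evaluation allows
        state += feasible
        trace.append(state)
    return state, trace


final, path = hypercognitive_loop(start=0, goal=10, capacity=3)
print(final, path)  # 10 [0, 3, 6, 9, 10]
```

The recursion in the prose ("goals and sub-goals may be renewed according to every moment's evaluation") corresponds to the loop re-measuring the discrepancy on every cycle rather than executing a fixed plan.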
Finally, all of the processes and systems summarised above operate under the constraints of a central processing
mechanism which, as hinted previously, is actually a complex of a number of domain-general processes that limit
the representational and processing capabilities of the mind at a particular point in time. These include (i) speed of
processing, that defines how efficiently mental operations may be executed, (ii) inhibition and attentional capacity,
that defines how flexibly and ably processing needs may be handled at a particular moment, and (iii) working memory
capacity, that defines how much information and how many mental operations may be held and operated on at once (Demetriou et al., 1993, 2002).
Change occurs continuously throughout life. Processing efficiency and working memory improve until early adulthood, remain stable until middle age, and then start to decline. Thought in the various environment-oriented
domains becomes increasingly more complex, versatile, abstract, and ingenious. Hypercognitive processes become
more accurate, refined, and focused. How are all of these courses of change interrelated? A series of studies suggests that
processing efficiency sets the frames for what mental constructions are possible at different age phases, while executive
processes contribute largely to individual differences in how these possibilities are implemented into actual conceptual
and problem-solving constructs in each of the environment-oriented domains (Demetriou et al., 2002). The domains
provide the primary material for the actualization of these potentialities, and the constructions in them define the most
visible aspects of the cognitive profile of individuals and groups. Hypercognitive processes abstract general reasoning
patterns by organising and tagging domain-specific realisations into supra-domain inference patterns (Demetriou &
Kazi, 2006; Demetriou & Raftopoulos, 1999).
Demetriou’s model offers us by far the best account of the ‘general-plus-special’ feature of intelligence, and it
also accounts well for the development of cognition in several ways. First, it shows that general processing constraints
associated with processing efficiency and working memory are related with the condition of problem solving in different
domains in different age phases. These constraints are obviously related to the question of plasticity of intelligence but
the model would have to specify how different values of processing efficiency and working memory would account
for the differential effects of an intervention programme. Second, the assumption about a general hypercognitive
system that monitors and regulates general efficiency and domain-specific processes is actually a general connectivity
agent. Third, the model describes a series of mechanisms of cognitive change (Demetriou & Raftopoulos, 1999;
Demetriou, Raftopoulos, & Kargopoulos, 1999). These mechanisms specify how and why change occurs under the
constraints of the processing potentials available. Specifically, the mechanisms explicate how the mind binds together,
differentiates, refines, and abandons already available concepts, skills, and processes in order to construct new ones
to meet the understanding and problem-solving needs at a given moment. However, the model still has to explicate
how hypercognitive change mechanisms such as metarepresentation enable the person to work out new connections
between concepts and mental operations. Finally, the model accounts for individual differences in the rate and quality
of learning through differential development of the specialised systems and of the central processing mechanism
in different individuals. Overall, the model needs to work out precise mechanisms that would relate the theoretical
constructs to actual educational interventions.

Table 1
The fit of models to characteristics of intelligence.

                        Connectivity  Development  Individual differences  General + special  Modules  Plasticity
Information processing  Good          Good         Good                    No                 No       Possible
Cognitive development   Good          Good         Good                    No                 No       Good
Demetriou               Possible      Good         Good                    Good               No       Possible
Anderson                No            Good         Good                    Possible           Good     No
1.9. Summary part I
We have offered four models of the mind each of which accounts well for some of the characteristics of general
intellectual ability, but none of which does the complete job for us. The adequacy of each model in explaining each of
the characteristics of intelligence that we have highlighted is shown in Table 1. As is common in science, the reality is
too complex to be represented adequately by a single model, but each model we propose will have its uses for particular
purposes.
2. Part II: What educators can do about it
2.1. What develops under cognitive stimulation?
From an educational point of view, it is the plasticity of intelligence which is both most important and most in need
of further explication. After all, if a general intellectual ability which can be brought to operate on all types of learning
can itself be enhanced by appropriate educational strategies, then surely that is a most efficient way for the enterprise
of education to proceed. In this section, we will discuss just what it might be in the mind that changes in response to
cognitive stimulation and then look at some examples of successful stimulation programmes.
Looking through our models of intelligence, both the Piagetian and the more central models taken together do offer
some clear guidance as to what develops and how that development might be promoted. At one level of description,
it is the cognitive structures in the mind which process incoming information that must develop in complexity,
and it is the twin processes of cognitive conflict and reflective abstraction which drive that development. Piaget
makes clear proposals for a mechanism of equilibration by which the cognitive structures accommodate to inputs
which challenge their limits, such that the new more complex information can be assimilated. Then the mechanism
of reflective abstraction restructures mental operations so that they can resolve the conflict. This process generates a
new more refined and flexible structure for which the novel information is no longer novel and conflict-generating.
The challenge for the educator is to manage the provision of cognitive conflict such that it really is productive. The
experiences offered should be neither so easy as to be assimilated without cognitive effort, nor so complex that the
current state of the individual’s cognitive structure fails to make any sense of it whatever. In other words, the instructional
process needs to be set carefully within the learner’s Zone of Proximal Development (what Newman, Griffin, & Cole,
1989 describe as ‘The Construction Zone’). At the practical level, there is also the very real difficulty of making such
provision for 30 children of widely varying levels of development (as any class has).
What can we say about the meaning of cognitive stimulation in terms of the other models we have offered? Since
Demetriou’s model is a post-Piagetian one, the explication given for Cognitive Acceleration in the Piagetian model
may also be applicable. Demetriou has expanded more on the mechanisms by which the elements within his model
(central processor, executive, specialised structural systems) are supposed to develop and
has demonstrated the limitations exerted on this transfer by processing potentials. Specifically, in the context of this
model we can specify how the efficiency of the general processing mechanisms constrains the functioning of reflective
abstraction, and how the general hypercognitive mechanisms, including self-awareness of cognitive processes, executive
control, and selective attention guide reflective abstraction to actualize the reorganization of thought processes into
higher and more complex systems. We explore this further in Section 2.2.2 below (under ‘Cognitive Acceleration’).
For the simple information processing model, ‘Cognitive Acceleration’ might imply the stimulation of growth of
working memory so that the development from say, M = 4 to M = 5 is accelerated as compared with a non-stimulating
environment. But if one considers that the total development of working memory over the whole period of, say, 18
years to maturity must be at an average rate of something like 1 ‘bit’ every 2.5 years, it is difficult to see how even a
sustained intervention such as CASE (Adey & Shayer, 1994; Adey, Shayer, & Yates, 2001) could seriously claim to
‘expand working memory’. The mechanisms of chunking and automating (Case, 1985; Halford, 1993) do make more
efficient use of a given working memory capacity but these are more akin to learning strategies (Adey, Fairbrother, &
Wiliam, 1999; Halford, 1993; Halford, Wilson, & Phillips, 1998). We are now justified in assuming that these changes are
related to the improvement of executive control, attention focussing, goal setting, and representational abilities (Kuhn
& Pease, 2006). In turn, these changes may change the operation of underlying neural networks, leading to changes to
fundamental processes including speed of processing itself (Deary, 2000; Demetriou et al., 2002).
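The rate quoted above follows from the spans mentioned earlier (about one unit at birth, about seven in the adult). On our rough reading, the arithmetic is:

```latex
\frac{7 - 1 \ \text{units}}{15\text{--}18 \ \text{years}} \approx 0.33\text{--}0.4 \ \text{units per year},
\qquad \text{i.e. one unit every } 2.5\text{--}3 \ \text{years}
```

which makes clear why an intervention lasting two or three years could not plausibly claim to add whole units of working memory capacity over and above normal development.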
This interpretation of change is in clear conflict with Anderson’s model which posits that general mechanisms such
as speed do not develop at all and that development comes as a result of change within the modules or the addition
of modules to the repertoire of the developing person. The modules, if they exist at all at the level of thought, may
indeed develop as well. But this would not explain stimulation of the general processor since modules are by definition
specialised for particular purposes. We have to conclude that at this stage Anderson’s model, as a paradigm of the
‘massive modularity’ approach to cognitive structure, does not offer much scope for hypothesising about mechanisms
of cognitive stimulation.
A somewhat different perspective on ‘what develops’ is offered by Garlick (2002) who argues that some individuals
have inherently more plastic neuronal networks than others, and that it is those with higher levels of plasticity, who are
better able to adapt to the environment, whom we call more intelligent. In other words, the more potentially intelligent
individuals are ones who reap rewards from cognitive stimulation, while others who are less naturally plastic are less
able to take advantage of stimulation. We see two problems with the hypothesis. Firstly it is based on an assumption
that there are few environmental differences in different children’s upbringing—an assumption that is patently false.
Secondly our empirical evidence (Adey & Shayer, 1993) shows that while some students certainly benefit more than
others from Cognitive Acceleration, there is no relationship at all between starting levels of cognitive development and
gains made. Garlick seems to have a rather limited familiarity with the literature of cognitive stimulation.
2.2. Evidence for cognitive stimulation
Several methods have been proposed over the last few decades to improve general cognitive abilities. In general,
they aim to provide students with the specific stimuli necessary to optimise cognitive development which are otherwise
missing from their learning environments. Piaget maintained that for a change to count as development,
as opposed to simple learning, it must satisfy three fundamental criteria (Piaget, 1974). First, one must ask whether the
progress obtained is stable or whether, like many things learned in school, it disappears with time. Second, one must
determine whether the observed accelerations are accompanied by deviations from the general developmental trend.
Third, it is necessary to check whether progress obtained independently from natural development can serve as a basis
for new spontaneous constructions or whether the subject who passively receives information from the adult will no
longer learn anything without such help. We believe that each of the programmes described in this section meets these
criteria, showing both mastery of the target concepts and long-term far transfer. These intervention programmes
can be classified into two main groups, employing either Direct methods or Content-Based methods. We will briefly
characterize each type and outline some well-known examples of each. Examples are chosen which: (a) satisfy the
three criteria above, (b) have published evidence for their effects in controlled experimental conditions, and (c) have
plausible theoretical explanatory models. These criteria actually exclude all but a handful of programmes, and so our
‘examples’ may in fact represent all such programmes.
2.2.1. Direct methods
There are many cognitive intervention programmes which are independent of school subjects and which directly
target general mental abilities using specifically designed training exercises. Many of these have shown little interest
in academic respectability in the sense of offering a serious theoretical foundation or reporting valid measures of their
efficacy. In some cases they just improve short-term non-transferable test-taking skills, while others have not even
demonstrated this limited effect, but relied rather on glossy publications and claims based on subjective opinions to
sell their wares.
Among those few that have proven effective, maybe the best known is Feuerstein's Instrumental Enrichment
(Feuerstein, Rand, Hoffman, & Miller, 1980). This programme was originally designed to deal with serious problems faced by children returning to Israel from traumatic experiences or culturally deprived environments. Feuerstein
maintains that cultural deprivation means that the child, through a number of possible factors, may have been deprived
of the necessary Mediated Learning Experience (MLE), an essentially Vygotskian concept. Initially, through the mediation of mother and close family, children learn to accommodate, in the Piagetian sense, to incoming stimuli, and so
develop their cognition. ‘Parents may not provide MLE, through poverty, alienation, indifference or emotional disorders’ (Feuerstein et al., 1980, Chapter 3). Feuerstein’s Instrumental Enrichment (FIE) is a 2 year school intervention,
typically initiated at the beginning of adolescence, designed to increase children’s ability (and belief in that ability) to
benefit from exposure to challenging thinking tasks that cover the spectrum of schemata described by Piaget. Collaborative learning – that is peer–peer mediation – is an important feature of FIE practice. Feuerstein et al. (1980) were
among the earliest workers to provide evidence that IQ was not a fixed value for individuals, but could be boosted.
Arbitman-Smith, Haywood, and Bransford (1994), reporting replications of FIE in the USA, cited effect-sizes of the
order of 0.6 SD on Raven's and other tests of fluid intelligence, and averaging 0.77 SD on tests of crystallised intelligence.
Shayer and Beasley (1987), reporting on the use of FIE with one class in a Special School in England, found effect-sizes of
1.22 SD on a battery of Piagetian tests and 1.07 SD on Raven's.
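The effect sizes quoted here and throughout this section are standardised mean differences: the advantage of the experimental group over the control group expressed in standard-deviation units. A minimal sketch of the calculation, using invented scores (the data, group sizes, and variable names are purely illustrative, not taken from any of the studies cited):

```python
from statistics import mean, stdev

def effect_size(experimental, control):
    """Standardised mean difference: the gap between group means
    in units of the pooled standard deviation (Cohen's d style)."""
    n_e, n_c = len(experimental), len(control)
    s_e, s_c = stdev(experimental), stdev(control)
    # Pool the two sample variances, weighted by degrees of freedom
    pooled_sd = (((n_e - 1) * s_e**2 + (n_c - 1) * s_c**2)
                 / (n_e + n_c - 2)) ** 0.5
    return (mean(experimental) - mean(control)) / pooled_sd

# Invented post-test scores on a hypothetical reasoning battery
exp = [14, 16, 15, 18, 17, 16, 19, 15]
ctl = [12, 13, 14, 12, 15, 13, 14, 12]
d = effect_size(exp, ctl)  # a large effect by Cohen's conventions
```

On this convention an effect of 0.6 SD means the average treated student scores at roughly the 73rd percentile of the control distribution, which is why the figures reported above count as substantial.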
Perhaps the major bar to the wider adoption of FIE is that its originators will not release the published materials except as a package which includes an expensive teacher training programme. Whilst it is understandable to try to
ensure proper implementation (which materials alone can never achieve), schools and individual teachers are likely
to be wary of buying into a scheme without inspecting it first. Another problem, in common with all Direct methods, is that it requires school administrations to find extra time in the timetable, at a time when national curricula
are exerting increasing pressure on the time available in the school day just for the content of the various subject
matters.
As to what it is that is stimulated by the IE programme, it seems quite likely that it is indeed the students’ executive
processing mechanisms that are developed, since the programme specifically promotes attention focussing, goal setting,
and representational abilities.
2.2.1.1. Klauer’s inductive reasoning. A second Direct method is that developed by Josef Klauer in Germany. Inductive
reasoning is often considered to be at the heart of general intelligence and Klauer operationalised the concept, describing
it as the processes of looking for similarities and dissimilarities between objects and relationships (see for example
Klauer, 1990). He developed a series of training programmes for young children (kindergarten and the first school
years) and for mentally retarded adolescents (Klauer, 1989a, 1991, 1993; Klauer & Phye, 1994). Later on, based
on his original theoretical framework of inductive reasoning, several other intervention programs were devised that
used specifically reorganised teaching materials for the training. Meta-analyses of large numbers of evaluations of the
various programmes indicate effect sizes ranging from 0.5 (with older students) to 0.63 (with younger children)
on measures of general intelligence, with effects lasting at least 9 months after the intervention and with good evidence
of transfer to academic achievement. Such results counter the criticism that inductive reasoning is just one component
of general intelligence and exclusive focus on that alone will not develop generally better thinkers. More details of
these results are given in Klauer (1997, 2001).
It is interesting to note that originally both Feuerstein’s, and Klauer’s programmes were developed for children
outside the mainstream of education, and in such cases they were successful. Klauer’s, at least, seems to have transferred
well to mainstream education with young children.
2.2.2. Content based methods
The other type of intervention programme comprises content-based methods, which set their stimulation activities within the
context of school subject content matter. They are also called enriched, infused or embedded methods (McGuinness &
Nisbet, 1991), and they vary in their balance between transmitting content matter and stimulating cognitive development,
or how well they integrate these two sets of goals. They mostly use science or mathematics materials, whose structure
and content appear to be especially suitable for structured exercises. Among these, one of the most widely used and
best tested is Cognitive Acceleration.
2.2.2.1. Cognitive Acceleration. Since 1983 a series of Cognitive Acceleration (CA) programmes has been implemented
from King’s College London. These programmes were set within the contexts of different domains, initially in science
and later mathematics and other subjects. The complexity of concepts included in the programmes is specified with
reference to the Piagetian stage sequence from concrete to late formal thought, such that the students would encounter
cognitive conflict, but in a classroom social atmosphere where the teacher mediates the process of reflective abstraction
and conflict resolution. In other words, change is attempted through guided reflective abstraction à la Piaget, in a
context of Vygotskian-like scaffolded teaching. In the CA literature (e.g. Adey & Shayer, 1994) these principles are
presented as the three main ‘pillars’ of the method: cognitive conflict, social construction, and metacognition.
The effects of CA programmes on students’ cognitive development and subsequent academic achievement have
been widely reported. For example, a CA programme set in a science context used over 2 years with 12–14 year olds
led to significant immediate gains in levels of cognitive development compared with controls, but also to subsequent
gains in national public examinations taken 3 years later, not only in science but also in mathematics and English
(Adey & Shayer, 1993, 1994; Shayer, 1999; Shayer & Adey, 2002), with effect sizes from 0.35 to over 1 SD. Likewise
a cognitive intervention programme set in mathematics has transfer effects to science and English (Shayer & Adhami,
2007). This far transfer across contexts is rare in cognitive psychology, and seems to offer strong support for the idea
of a general intellectual processor which can be enhanced in one context but then becomes available to be deployed in
quite different contexts.
More recently Cognitive Acceleration has been applied to much younger children, and evidence is beginning to
emerge of transfer from programmes set in mathematics and science contexts with 5–7 year olds, again showing an
immediate effect on levels of cognitive development and a subsequent impact on their English scores in a test taken
when the children are 11 years old (Adey, Robertson, & Venville, 2002; Shayer & Adhami, submitted for publication).
It could be said that the CA programmes have worked through all of the gates proposed long ago by Wohlwill (1973)
for a developmental approach to raising levels of general ability.
One problem with CA is that it requires teachers to adopt a pedagogy very different from, and even at odds with,
what is currently recognised as “good teaching”. For example, in a CA lesson there are no clear objectives, there is
no need to finish the activity, students leave the class with no written record of their work, and collaborative work
amongst students is positively encouraged. This requires a rather extensive programme of professional development
(Adey, Hewitt, Hewitt, & Landau, 2004) which can be a barrier for mass adoption of the programme.
In asking precisely what it is that is enhanced by Cognitive Acceleration, the most obvious answer is the cognitive
structures proposed by Piaget as the active meaning-makers of informational input. And yet these ‘structures’ remain
ill-defined, and do not find a specific place within any of the mind models discussed earlier. CA activities each refer
specifically to one or more of the concrete (at primary level) or formal (at secondary level) schemata described by
Piaget and Inhelder and yet the evidence that these in fact form integrated structures (Shayer, 1979; Shayer, Demetriou,
& Pervez, 1988) suggests that they depend on both the central processing and the hypercognitive mechanisms of
Demetriou’s model. Demetriou et al. (2002) and Demetriou and Kazi (2006) have recently shown that as much as
60% of the variance on Piagetian-type tasks is accounted for by these central mechanisms. According to this model,
metarepresentation abstracts general thought patterns from experience within domains. These patterns can then be
brought to bear on the functioning of the domains themselves. General problem-solving management strategies, such
as search of the problem space, search of new solutions based on previous experiences, etc., once formed can be adjusted
for use in different domains. For instance, in the domain of causal thought as used in empirical science, the new
strategies might lead to a more organised search and manipulation of the materials concerned vis-a-vis the hypotheses
to be tested. In the domains of mathematics, the new strategies might imply a more organised search of the logical
consistency between different mathematical concepts or principles so that inconsistencies can be spotted and removed.
This requires building an awareness of the characteristics of the domains, the general patterns abstracted, and the
very metarepresentation process itself. In other words, systematic transfer of effects across domains requires some
kind of meta-metarepresentation that will enable the thinker to tailor and tune skills and strategies to the domains the
thinker is working with (Demetriou, 2004). In fact, Kuhn and Pease (2006) have recently shown that older persons
learn more than young persons precisely because they are more in control of executive and, in the present terms, of
metarepresentation processes. It needs to be noted, however, that the state of two dimensions of processing efficiency,
that is speed of processing and working memory, interacts with the type of learning experience offered to students. That
is, on the one hand, the better the state of these two dimensions, the more students are able to profit from self-directed
learning. On the other hand, students who are weak on these two dimensions are still able to profit from well structured
and directed learning as much as their more efficient colleagues. Therefore, the two general systems interact in complex
ways to frame how learning and transfer are to take place. On the one hand, the hypercognitive system is the highway to
transfer, in Salomon's (Perkins & Salomon, 1989) terms, while on the other hand, the central processing system sets the
limits of moving on this highway. For instance, explicit and directed teaching compensates for possible weaknesses in
mental power (Demetriou & Andreou, 2006). Once again we see that the central cognitive processor is central, and is
general, but is not simple. Describing it just as ‘inductive reasoning’ or as ‘working memory capacity’ is not enough.
Lipman's Philosophy for Children (Lipman, Sharp, & Oscanyan, 1980), established in the early 1970s, has earned
serious consideration through the consistency of its approach, its foundation in well-established principles of human
discourse and rationality, and some evaluation in terms of long-term enhanced student achievement. It is normally
introduced at about grade 5 or 6 and delivered through the English or social studies curricula. Its materials include a
series of ‘novels’ which contain dilemmas of rationality, ethics, morals, aesthetics, science reasoning, and civic values,
and a teachers’ guide which emphasises the role of the teacher as an intelligent questioner. The dilemmas are set in
the context of children’s lives and lead into classroom discussions guided by the teacher. As with CASE, Philosophy
for Children (often known as ‘P4C’) is delivered over a long time span, it is progressive in complexity so that later
materials are suitable for 15 to 16 year-olds, it presents problems which children must puzzle over, the problems are
set in contexts familiar to children, and although different novels in the series have subject-specific foci (reasoning in
science, reasoning in language arts, reasoning in ethics, reasoning in social studies) these are seen as expressions of
domain general reasoning through different contexts rather than as distinctly different types of reasoning. Discussion
between children is encouraged to externalise the reasoning being used so that it can be scrutinised not only by others
but by the user herself. The parallels with CA’s ‘social construction’ and ‘metacognition’ are obvious. In one evaluation,
two groups of about 200 children each in grades 5–8 were followed over a year’s exposure to P4C, at the rate of about
two and one quarter hours per week. Teachers were given 2 h of PD related to the programme every week during
the school year. At post-test experimental groups showed significant gains in aspects of logical reasoning compared
with controls, not surprising after such an intensive course, but also highly significant (p < .001) gains in standard
assessments of reading and mathematics. These argue for a general effect of the programme on reasoning which leads
to better reception of instruction in traditional academic disciplines. Although the investment of time in teacher training
is considerable, it does indicate the potential of a method for raising children's reasoning capability which influences
achievement. Unfortunately there does not appear to have been a long-term follow-up to test the permanency of the
effects and, in the context of this paper, the approach offers no psychological underpinning theory but is, rather, rooted
in a pragmatic philosophical tradition.
Content based methods can be generalised to other content areas and to several abilities. Csapó (1999) describes
a general method for devising cognitive training materials by using the content of teaching and illustrates it with
experimental materials. Some 40 other European training experiments are described in the inventory edited by Hamers
and Overtoom (1997). Whilst many of these lack useful evaluative evidence, others do indeed report significant quantitative gains in measures of general intelligence and/or academic achievement resulting from interventions addressed
to general thinking abilities. Few have had the resources for the type of long-term extensive evaluations reported
above, but looking at the characteristics of many of the interventions it seems likely that they suffer only from Type II
errors: failing to find effects which are there. More theoretical issues are discussed, and further experimental work is
presented, in the volume edited by Hamers, van Luit, and Csapó (1999).
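The Type II point can be made concrete with a small simulation: even when a genuine effect of around 0.4 SD exists, an evaluation with only a dozen students per group will usually fail to reach conventional significance. A hypothetical sketch (the effect size, sample sizes, significance criterion, and function name are all invented for illustration; it uses a crude one-sided z-style test rather than any procedure from the studies cited):

```python
import random
import statistics

def underpowered_miss_rate(true_effect=0.4, n=12, trials=2000, seed=1):
    """Fraction of simulated evaluations that miss a real effect
    (a Type II error), using a crude two-sample z-style criterion."""
    rng = random.Random(seed)
    misses = 0
    for _ in range(trials):
        control = [rng.gauss(0.0, 1.0) for _ in range(n)]
        treated = [rng.gauss(true_effect, 1.0) for _ in range(n)]
        diff = statistics.mean(treated) - statistics.mean(control)
        se = (statistics.pvariance(control) / n
              + statistics.pvariance(treated) / n) ** 0.5
        if diff / se < 1.96:  # fails the conventional 5% criterion
            misses += 1
    return misses / trials

small_groups = underpowered_miss_rate(n=12)   # most runs miss the effect
large_groups = underpowered_miss_rate(n=200)  # few runs miss it
```

With these invented numbers, small groups miss the true effect in the majority of runs while groups of 200 almost always detect it, which is the sense in which under-resourced evaluations "fail to find effects which are there".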
Our conclusion from this accumulated evidence is that the development of general intelligence can indeed be
significantly influenced by long-term intervention programmes in school or in other settings. The intervention research
designed within the developmental tradition suggests strongly that the central meaning-making mechanisms of the
mind can be strengthened, restructured, and developed if appropriately influenced. The research in the psychometric
tradition (and the findings related to the Flynn effect mentioned in Section 1.7) suggest strongly that the standing of
individuals relative to other individuals may also be modifiable if there are enough challenging opportunities for them
to reach towards higher intellectual potentials.
2.3. Educational implications: general ability through subject matter?
If we are to study learning from the point of view of developmental change, we need to combine two types of
thinking. The construct and predictive validity claims of the general intelligence idea from the psychometric tradition
provide a very powerful technical approach, but one needs also an adequate explanatory theory to underlie the formal
or structural model. The g-theory, as such, is not a developmental theory, because its main area of interest is in the
differential tradition (in Cronbach's well-known distinction between the differential and experimental traditions) and it
deals, to put it briefly, with variations. But there is also a need to have a strong theoretical interpretation of the near and
far transfer effects of an effective intervention. This approach of ours could be said to constitute a third metatradition
(as ter Laak, 1995, has suggested) in the psychology of learning.
In the past century, curriculum development has been dominated by the structure of knowledge and the emphasis has
been on efficient instruction for the development of subject-specific concepts. There is no doubt that this ‘instructional’
approach has paid significant dividends in terms of the development of effective teaching strategies within subject
domains (Bransford, Brown, & Cocking, 1999) but it may be that we are reaching the limits of the gains attainable
by such measures. For example, the ‘conceptual change’ approach which dominated educational research literature
from the 1970s until the turn of the century (say, from Driver, Guesne, & Tiberghien, 1985 to Vosniadou, Ioannides,
Dimitrakopolou, & Papademetriou, 2001) was able to offer many valuable insights into children’s misconceptions but
the prescriptions offered for changing those misconceptions are generally speculative, positing what might work in
changing misconceptions but falling short of actually running the experiments to find out whether the prescriptions do
in fact have an effect. One of the best known of such prescriptions is that of Posner, Strike, and Hewson (1982), who
proposed that to be acceptable, new conceptions must be intelligible, plausible, and fruitful, with the implication that if
concepts could be presented to students with these characteristics, then they would understand them. We are not aware
of any empirical research which has supported this hypothesis. One major effort to ‘correct’ children’s misconceptions
was the Children’s Learning in Science Project (Driver, Asoko, Leach, & Scott, 1987) at the University of Leeds in
the late 1970s. The methods were indeed promising, employing the bringing to consciousness of current conceptions,
the design of critical tests which often provoked cognitive conflict, and then a metacognitive phase in which students
reviewed what they had believed, what they believed now, and what had prompted them to change. Sadly (in our
opinion) this work was never quantitatively evaluated to discover whether any significant shifts in conceptions had in
fact occurred, or whether any shifts that did occur were long-lasting. The nearest that the conceptual change literature
came to offering explanatory psychological theories was to draw on Ausubel’s theory of meaningful learning, in terms
of making connections between what was known and new knowledge, but it never recognised Ausubel’s Piagetian
origins:
“. . .it undoubtedly overstates the case to claim that any subject can be taught to children in the preoperational
stage or in the stage of concrete operations provided the material is presented in an informal, intuitive fashion
with the aid of overt manipulation or concrete-empirical props. . . .some ideas simply cannot be expressed without
the use of certain higher-order abstractions. These latter kinds of concepts would be intrinsically too difficult for
pre-school and primary school children irrespective of their method of presentation.” Ausubel (1965), p. 260
Likewise the literature on powerful learning environments, very fashionable in Europe (Boekarts, 2002; De Corte,
2000), is, we believe, under-theorised and short on empirical evidence for effects. Specifically, this approach emphasizes
the metacognitive, self-regulated, and motivational aspects of the learning environment with an emphasis on the
mastering of the mental operations and concepts related to a particular domain. In the so-called powerful learning
environment, students are led to plan their learning or problem-solving acts from the start, reflect on what they know,
what they can do, and what they do not know about the problem and the domain, build relations between the problem
and their prior knowledge, and systematically and continuously monitor and regulate the process from the start through
to the end. There is clearly much value in this approach, especially for more able and/or mature students, as a guide
for students to make full use of their intellectual abilities for learning in specific domains. However, it does not seem
to pay sufficient attention to either the processing or the developmental constraints of learning at particular phases of
life by particular individuals.
We suggest that only by mining more deeply into the insights and models of developmental psychology and paying
attention to the general intellectual processors of the mind (both ‘executive’ and ‘central’) will educators be able to
achieve a new step-change in the educational achievement of their students. We do not need to spell out the value
of such an advance in a modern society whose economic well-being is far more dependent on intellectual than on
manual skills. As we have described, there are several interesting approaches to promoting general cognitive abilities
(or to ‘teaching intelligence’) which are theoretically grounded and systematic in their conceptualisation of the major processes
taking place in learning and which lead to permanent enhanced levels of competence. We have shown, especially
from experiments concerning content based methods, that moving along the scale of “subject matter versus general
ability” towards the direction of developing general abilities actually opens up broad opportunities for raising levels of
traditional academic achievement. It is necessary to go beyond the simple objection to the teaching of general cognitive
abilities that “it is all very well, but we have a content curriculum to deliver” and to realise that the most efficient way
of “delivering content” is by promoting the development of general intelligence.
There remain some questions about whether the promotion of the central processing mechanisms is best achieved
through direct or content-based methods. The direct methods which have subjected themselves to any sort of academic
evaluation (and there are many that have not, such as de Bono, 1987; Rose, 1985) have shown some good effects.
Feuerstein’s Instrumental Enrichment seems to work well with special populations of disadvantaged or learning-disabled
students and is well theorised. Actually, Feuerstein designed IE activities quite specifically to be free of normal school
subject matter contexts, on the grounds that he did not want to remind his learning-disabled students of subject matter
in which they already perceived themselves as failures. Lipman’s Philosophy for Children has shown good transfer
effects with a general population but is essentially a pragmatic, under-theorised approach. Both face the practical
problem in school curriculum terms of requiring scheduled time for “thinking lessons”.
We suggest that the content-based approach offers more promise for large scale implementation, and not just for
practical reasons. After all, Piaget (1974, p. 7) himself commented on the learning experiment of Inhelder, Sinclair, and
Bovet (1974): “The training techniques are essentially aimed at the acquisition of the formal aspects of knowledge; they
emphasize the role of reflective abstraction and bear on operations. However, they are always related to the acquisition
of specific concepts, whose content varies with each experiment”. The point is that you have to have both concrete
content and reflective abstraction. If you teach the specifics with abstraction in mind, the general is learned, but if you
try to teach the general directly, the specifics often are not learned. Klauer (2000) calls this an ‘asymmetry of transfer
effects’. The learning of a task-specific strategy leads, in a good learning situation, to transfer effects which build on,
and are dependent on, general strategies; the reverse, however, is not true: the learning of a general strategy does not
directly lead to transfer effects, which are always dependent upon specific strategies. It makes sense to teach subjects,
but you cannot understand the general benefits of them without a good theory of cognitive (and, in a different paper
altogether we might add, emotional) development. Within Demetriou’s model, remembering that the development of
each domain specific system is somewhat independent but nevertheless constrained by the development of the central
processor, we propose that the route into the central processor must be through one or more of the domain special
systems in interaction with the executive functions involved in the hypercognitive system.
2.4. Conclusion
Long ago Vygotsky claimed that all learning in school from the early years onward should be directed as much
to children’s cognitive development as to their subject learning (Shayer, 2002). The transfer evidence from Cognitive
Acceleration, Instrumental Enrichment, and Philosophy for Children suggests that the technology is now being developed to make that a practical and realisable aim. Each of these interventions clearly stimulates something much deeper
than domain specific systems, and that ‘something’, we would claim, is general mental ability, or general intelligence.
Of all of the features of intelligence that we have discussed, the three most important to focus on in this concluding
section are: (1) its general component, which operates across all contexts and domains; (2) its plasticity, or amenability
to enhancement, or acceleration, in response to appropriate environmental influences; and (3) the fact that this plasticity goes
as far as the brain itself. As long as educators see intelligence as something essentially fixed which predetermines their
students’ ability to process all kinds of data they would be quite right to treat it with great suspicion, as a lurking force
which undermines all of their efforts. But as soon as one accepts the fact that the functioning of the general intellectual
processor in the mind can be improved by education, then the construct of intelligence becomes more acceptable
than in the past. It no longer has ultimate control over our students’ ability to learn, and the tables are turned so that
educators now have it in their power to raise their students’ general cognitive ability, and so raise all of their academic
performance.
This is more than just a matter of losing our shackles. It should actually define the mission of education, its primary
purpose. If we know that we can help our students throughout their formal schooling period continually to raise their
ability to process information – any sort of information – then surely this must be what we should be striving for. It
is by far the most efficient use of schooling time to concentrate on raising intelligence levels so that students become
generally more effective learners. It might be argued that making the promotion of general cognitive ability the first
priority of schooling is to devalue education’s role in emotional and moral education, but the counter argument is that
even efficient resolution of emotional and moral issues is facilitated by higher-powered “general mental ability” (see
for example the review by Schmidt & Hunter, 1998).
This would be our action plan for raising general cognitive ability in terms of the principles developed in this paper:
1. Learning activities must be generative of cognitive stimulation, that is they must have the potential to create challenge
rather than being comfortably within the reach of the learner’s current processing capability.
2. Learning should be collaborative in the sense that learners learn to listen to one another, to argue and to justify,
and become accustomed to changing their positions. Incidentally, this has implications for the cultural norms of the
classroom: are students comfortable about changing their minds publicly without feeling ‘loss of face’? Although
this is an important issue, it is beyond the scope of this paper. We can say only that it presents more or less challenge
to current practice, depending on the particular culture in which it is implemented.
3. We need continually to raise awareness in students of what may be abstracted from any particular domain-specific
learning, such as:
(a) factors in the concept, such as the organisation or the quantity of information, that cause difficulties in representing
and processing;
(b) connecting the present concept to others already in their possession, as they differentiate it from other concepts,
and even as they decide that some concepts need to be abandoned;
(c) control of the thinking and learning processes as such, thereby transferring mental power from the teacher to the
thinker. In other words, the student gradually becomes self-reliant and self-regulating, rather than depending
on the teacher continually to monitor and guide his or her thinking.
4. The present learning experience needs to be connected to the concept space and the learning space of the past. That
is, are similar concepts already mastered? How does the present one relate to them? How was the learning of these
other concepts handled in the past? Are any difficulties specific to this concept, common to all concepts similar to
it, or general to any concept?
In a sense all great teachers from Socrates on have known this, but it is rarely practised because:
1. It is difficult to do. Teaching for cognitive stimulation is far more demanding, and seems far more risky in the
classroom than is efficient instruction in content matter. Amongst others, Adey et al. (2004) have described the
extent of the conceptual-pedagogical change that teachers must make to move from one form of teaching to another.
2. In the United States, in Africa, South Asia and South-east Asia, in Australia and the United Kingdom, and in many
other European countries, education has become an important plank in politicians’ claims about how they will
improve society. This means that schools are typically dictated to by non-professionals who see education primarily
as the transmission of information deemed culturally or vocationally important.
The height of the first hurdle should not be underestimated. Nevertheless in the Cognitive Acceleration work we
have shown that the professional development required for the radical shift in pedagogy which teaching for intelligence
demands can be designed and delivered on a large scale (Adey et al., 2004).
The second hurdle could in principle be cleared by a political system that was prepared to take the arguments of this
paper (and many others like it) seriously, and enter the lists against conservative forces armed with a combination of
evidence, belief, and a story about the economic efficiency of turning over some proportion of the school curriculum to
the specific promotion of general intellectual development. There have been examples of politicians taking this line,
most notably in Venezuela in the late 1970s where Minister of Education Machado made the teaching of intelligence the
major goal of the school curriculum. Adequate resources were allocated and international experts such as the Harvard
Project Zero team brought in to advise. Sadly, the next election in Venezuela swept Machado from power and the brave
experiment was aborted before any useful data could be collected.
The norm, however, is more mundane. In the United States, for example, education has become an ever more important
issue, with international comparisons such as PISA or TIMSS being interpreted as showing the need for more
reforms and innovations. But the law of diminishing returns operates here: when systems have a long history of
reasonably good results, much more radical planning and innovation are required to achieve even small effects, and a
common political response is the emergence of more technical planning by non-professional educators. All educational
institutions and pedagogical solutions have to respond to the problem of the variation within age-cohorts, and because
the quest for better competence is a universal one in developed countries, all advanced educational institutions are
facing the same problem. In this paper we have addressed ourselves to this issue, and our claim is that the combined
positive effects of management of the between-school variation, class size variation, and the use of evidence-informed
intervention programs are bigger than the positive effects of segregated and tracked systems, where high outcomes are
hoped for, but only partly achieved, by early selection.
It may be even more difficult now in an era when all politicians seem intent on conquering the safe middle ground,
following what they see as the popular will rather than leading it. One of us recently found himself sitting next to an
ex-Minister of Education at dinner. When asked why, in spite of knowing of the evidence of the positive effects of
teaching for cognitive acceleration, she had never promoted it, she answered “It’s simple. Newspaper editors, parents
and governors understand the need for better literacy and numeracy, so numeracy and literacy strategies are easy
to sell. Selling the idea of better thinking would strike many of them as being ‘barmy psychology’ and we weren’t
prepared to take the political risk”. And yet in an era when governments seek evidence based education policy and
educational reforms are to be based on scientifically established methods, one might hope that the question of building
the development of general cognitive abilities into the curriculum can be raised again, and the methods that have already
proved to be efficient might proliferate and penetrate education systems throughout Europe and beyond.
If policy makers have the will, we have the way.
References
Adey, P. (2007). The CASE for a general factor in intelligence. In M. Roberts (Ed.), Integrating the mind: Domain general versus domain specific
processes in higher cognition (pp. 369–386). Hove: Psychology Press.
Adey, P., Fairbrother, R., & Wiliam, D. (1999). A review of research on learning strategies and learning styles. London: King’s College London.
Adey, P., Hewitt, G., Hewitt, J., & Landau, N. (2004). The professional development of teachers: Practice and theory. Dordrecht: Kluwer Academic.
Adey, P., Robertson, A., & Venville, G. (2002). Effects of a cognitive stimulation programme on Year 1 pupils. British Journal of Educational
Psychology, 72, 1–25.
Adey, P., & Shayer, M. (1993). An exploration of long-term far-transfer effects following an extended intervention programme in the high school
science curriculum. Cognition and Instruction, 11(1), 1–29.
Adey, P., & Shayer, M. (1994). Really raising standards: Cognitive intervention and academic achievement. London: Routledge.
Adey, P., Shayer, M., & Yates, C. (2001). Thinking science: The curriculum materials of the CASE project (third ed.). London: Nelson Thornes.
Anderson, M. (1992). Intelligence and development: A cognitive theory. London: Blackwell.
Arbitman-Smith, R., Haywood, C., & Bransford, J. (1994). Assessing cognitive change. In P. H. Brooks, R. Sperber, & C. McCauley (Eds.), Learning
and cognition in the mentally retarded. Hillsdale, NJ: Lawrence Erlbaum Associates.
Ausubel, D. (1965). An evaluation of the conceptual schemes approach to science curriculum development. Journal of Research in Science Teaching,
3(3), 255–264.
Baddeley, A. (1990). Human memory: Theory and practice. London: Lawrence Erlbaum.
Blair, C. (2006). How similar are fluid cognition and general intelligence? A developmental neuroscience perspective on fluid cognition as an aspect
of human cognitive ability. Behavioral and Brain Sciences, 29, 1–17.
Boekaerts, M. (2002). Bringing the change into the classroom: Strengths and weaknesses of the self-regulated learning approach. Learning and
Instruction, 12, 589–604.
Bransford, J. B., Brown, A. L., & Cocking, R. R. (Eds.). (1999). How people learn. Washington, DC: National Academy Press.
Bruner, J. (1968). Towards a theory of instruction. New York: W.W. Norton and Co..
Carroll, J. B. (1993). Human cognitive abilities. Cambridge, UK: Cambridge University Press.
Case, R. (1985). Intellectual development: Birth to adulthood. New York: Academic Press.
Case, R. (1992). The role of central conceptual structures in the development of children’s scientific and mathematical thought. In A. Demetriou,
M. Shayer, & A. Efklides (Eds.), Neo-piagetian theories of cognitive development (pp. 52–64). London: Routledge.
Cattell, R. B. (1971). Abilities: Their structure, growth, and action. Boston: Houghton Mifflin.
Chomsky, N. (1986). Knowledge of language: Its nature, origin, and use. Westport, CT: Praeger.
Cianciolo, A. T., & Sternberg, R. J. (2004). Intelligence—A brief history. Oxford: Blackwell.
Cosmides, L., & Tooby, J. (1994). Origins of domain specificity: The evolution of functional organization. In L. Hirschfeld & S. Gelman (Eds.),
Mapping the mind (pp. 85–116). Cambridge, UK: Cambridge University Press.
Cowan, N. (2006). Within fluid cognition: Fluid processing and fluid storage. Behavioral and Brain Sciences, 29, 21–22.
Csapó, B. (1999). Improving thinking through the content of teaching. In J. H. M. Hamers, J. E. H. van Luit, & B. Csapó (Eds.), Teaching and
learning thinking skills (pp. 37–62). Lisse: Swets and Zeitlinger.
Das, J. P., Naglieri, J. A., & Kirby, J. R. (1994). Assessment of cognitive processes: The PASS theory of intelligence. Boston: Allyn and Bacon.
Dasen, P. R. (1972). Cross-cultural Piagetian research: A summary. Journal of Cross-Cultural Psychology, 3(1), 23–40.
de Bono, E. (1987). CoRT thinking program: Workcards and teachers’ notes. Chicago: Science Research Associates.
De Corte, E. (2000). Marrying theory building and the improvement of school practice: a permanent challenge for educational psychology. Learning
and Instruction, 10, 249–266.
Deary, I. J. (2000). Looking down on human intelligence: From psychometrics to the brain. Oxford: Oxford University Press.
Demetriou, A. (2000). Organization and development of self-understanding and self-regulation: Toward a general theory. In M. Boekaerts, P. R.
Pintrich, & M. Zeidner (Eds.), Handbook of self-regulation (pp. 209–251). New York: Academic Press.
Demetriou, A. (2004). Mind, intelligence, and development: A general cognitive, differential, and developmental theory of the mind. In A. Demetriou
& A. Raftopoulos (Eds.), Developmental change: Theories, models and measurement (pp. 21–73). Cambridge: Cambridge University Press.
Demetriou, A., & Andreou, M. (2006). Cognitive, hypercognitive, and emotional processes underlying school performance. In Paper Presented at
the 26th Congress of the International Association of Applied Psychology.
Demetriou, A., Christou, C., Spanoudis, G., & Platsidou, M. (2002). The development of mental processing: Efficiency, working memory, and
thinking. Monographs of the Society for Research in Child Development, 67 (Serial No. 268).
Demetriou, A., Efklides, A., & Platsidou, M. (1993) The architecture and dynamics of developing mind: Experiential structuralism as a frame for
unifying cognitive developmental theories. Monographs of the Society for Research in Child Development, 58 (5–6, Serial No. 234).
Demetriou, A., & Kazi, S. (2001). Unity and modularity in the mind and the self: Studies on the relationships between self-awareness, personality,
and intellectual development from childhood to adolescence. London: Routledge.
Demetriou, A., & Kazi, S. (2006). Self-awareness in g (with processing efficiency and reasoning). Intelligence, 34, 297–317.
Demetriou, A., & Raftopoulos, A. (1999). Modelling the developing mind: From structure to change. Developmental Review, 19, 319–368.
Demetriou, A., Raftopoulos, A., & Kargopoulos, P. (1999). Interactions, computations, and experience: Interleaved springboards of cognitive
emergence. Developmental Review, 19, 389–414.
Driver, R., Asoko, H., Leach, J., & Scott, P. (1987). CLIS in the classroom. Leeds: Centre for Studies in Science and Mathematics Education.
Driver, R., Guesne, E., & Tiberghien, A. (Eds.). (1985). Children’s ideas in science. Milton Keynes: Open University Press.
Emanuelsson, I., Reuterberg, S.-E., & Svensson, A. (1993). Changing differences in intelligence? Comparisons between groups of thirteen-year-olds
tested from 1960 to 1990. Scandinavian Journal of Educational Research, 37, 259–277.
Feuerstein, R., Rand, Y., Hoffman, M., & Miller, M. (1980). Instrumental enrichment: An intervention program for cognitive modifiability. Baltimore:
University Park Press.
Flynn, J. R. (1987). Massive IQ gains in 14 nations: What IQ tests really measure. Psychological Bulletin, 101, 171–191.
Flynn, J. R. (1994). IQ gains over time. In R. J. Sternberg (Ed.), Encyclopaedia of human intelligence (pp. 617–623). New York: Macmillan.
Fodor, J. (1983). The modularity of the mind. Cambridge, MA: MIT Press.
Fodor, J. (2000). The mind doesn’t work that way: The scope and limits of computational psychology. Cambridge MA: MIT Press.
Gardner, H. (1993). Frames of mind. New York: Basic Books.
Garlick, D. (2002). Understanding the nature of the general factor of intelligence: the role of individual differences in neural plasticity as an
explanatory mechanism. Psychological Review, 109(1), 116–136.
Garlick, D., & Sejnowski, T. J. (2006). There is more to fluid intelligence than working memory capacity and executive function. Behavioral and Brain
Sciences, 29, 26–27.
Geary, D. C. (2005). The origin of mind: Evolution of brain, cognition, and general intelligence. Washington, DC: American Psychological
Association.
Gottfredson, L. S. (2003). g, jobs, and life. In H. Nyborg (Ed.), The scientific study of general intelligence: Tribute to Arthur R. Jensen (pp. 293–342).
Amsterdam: Pergamon.
Gustafsson, J.-E. (1984). A unifying model for the structure of intellectual abilities. Intelligence, 8, 179–203.
Gustafsson, J.-E. (2002). Measurement from a hierarchical point of view. In H. I. Braun, D. N. Jackson, & D. E. Wiley (Eds.), The role of constructs
in psychological and educational measurement. Mahwah: Lawrence Erlbaum.
Gustafsson, J. E., & Undheim, J. O. (1996). Individual differences in cognitive functions. In D. C. Berliner & R. C. Calfee (Eds.), Handbook of
educational psychology (pp. 186–242). New York: Macmillan.
Halford, G. S. (1993). Children’s understanding: The development of mental models. Hillsdale, NJ: Erlbaum.
Halford, G. S., Wilson, W. H., & Phillips, S. (1998). Processing capacity defined by relational complexity: Implications for comparative, developmental, and cognitive psychology. Behavioral and Brain Sciences, 21, 803–864.
Hamers, J. H. M., & Overtoom, M. Th (Eds.). (1997). Teaching thinking in Europe. Inventory of European programmes. Utrecht: Sardes.
Hamers, J. H. M., van Luit, J. E. H., & Csapó, B. (Eds.). (1999). Teaching and learning thinking skills. Lisse: Swets & Zeitlinger.
Hautamäki, J., Arinen, P., Eronen, S., Hautamäki, A., Kupiainen, S., Lindblom, B., et al. (2002). Assessing learning-to-learn. A framework. Evaluation
3/2002. Helsinki: National Board of Education.
Herrnstein, R., & Murray, C. (1994). The bell curve: Intelligence and class structure in American life. New York: Free Press.
Horn, J. L. (2007). Understanding human intelligence: Where have we come since Spearman? In R. Cudeck & R. MacCallum (Eds.), Factor analysis
at 100 (pp. 230–255). Mahwah, NJ: Lawrence Erlbaum.
Inhelder, B., Sinclair, H., & Bovet, M. (1974). Learning and the development of cognition. London: Routledge & Kegan Paul.
Jausovec, N. (2000). Differences in cognitive processes between gifted, creative, and average individuals while solving complex problems: An EEG
study. Intelligence, 28, 213–237.
Jensen, A. (1973). Educability and group differences. London: Methuen.
Jensen, A. R. (1998). The g factor: The science of mental ability. New York: Praeger.
Jung, R. E., & Haier, R. J. (in press). The parieto-frontal integration theory (P-FIT) of intelligence: Converging neuroimaging evidence. Behavioral
and Brain Sciences.
Kanazawa, S. (2004). General intelligence as a domain-specific adaptation. Psychological Review, 111, 512–523.
Kargopoulos, P., & Demetriou, A. (1998). What, why, and whence logic? A response to the commentators. New Ideas in Psychology, 16, 125–139.
Karmiloff-Smith, A. (1991). Beyond modularity: Innate constraints and developmental change. In S. Carey & R. Gelman (Eds.), The epigenesis of
mind (pp. 171–198). Hillsdale, NJ: Lawrence Erlbaum Associates.
Klauer, K. J. (1989a). Denktraining für Kinder I. Göttingen: Hogrefe.
Klauer, K. J. (1989b). Teaching for analogical transfer as a means of improving problem solving, thinking and learning. Instructional Science, 18(3),
179–192.
Klauer, K. J. (1990). A process theory of inductive reasoning tested by the teaching of domain-specific thinking strategies. European Journal of
Psychology of Education, 5(2), 191–206.
Klauer, K. J. (1991). Denktraining für Kinder II. Göttingen: Hogrefe.
Klauer, K. J. (1993). Denktraining für Jugendliche. Göttingen: Hogrefe.
Klauer, K. J. (1997). Training inductive reasoning: A developmental programme of higher-order cognitive skills. In J. H. M. Hamers & M. Th.
Overtoom (Eds.), Teaching thinking in Europe. Inventory of European programmes (pp. 77–81). Utrecht: Sardes.
Klauer, K. J. (2000). Das Huckepack-Theorem asymmetrischen Strategietransfers. Ein Beitrag zur Trainings- und Transfertheorie. [The piggyback
theorem of asymmetric strategy transfer. A contribution to the theory of training and transfer]. Zeitschrift für Entwicklungspsychologie und
Pädagogische Psychologie, 32, 153–165.
Klauer, K. J. (2001). Training des induktiven Denkens. In K. J. Klauer (Ed.), Handbuch Kognitives training (pp. 165–209). Göttingen: Hogrefe.
Klauer, K. J., & Phye, G. (1994). Cognitive training for children. A developmental program of inductive reasoning and problem solving. Seattle:
Hogrefe and Huber.
Kuhn, D., & Pease, M. (2006). Do children and adults learn differently? Journal of Cognition and Development, 7, 279–293.
Kyllonen, P. C., & Christal, R. E. (1990). Reasoning ability is (little more than) working memory capacity? Intelligence, 14, 389–433.
Lipman, M., Sharp, A. M., & Oscanyan, F. (1980). Philosophy in the classroom. Philadelphia: Temple University Press.
Lubinski, D., & Humphreys, L. G. (1997). Incorporating general intelligence into epidemiology and the social sciences. Intelligence, 24(1), 159–201.
McGuinness, C., & Nisbet, J. (1991). Teaching thinking in Europe. British Journal of Educational Psychology, 61(2), 174–186.
Murray, C. (2007). Intelligence in the classroom. The Wall Street Journal on the web, January 16, Editorial. www.opinionjournal.com/extra/?id=110009531.
Newman, D., Griffin, P., & Cole, M. (1989). The construction zone: Working for cognitive change in School. Cambridge: Cambridge University
Press.
Nunes, T., Bryant, P., Evans, D., Bell, D., Gardner, S., Gardner, A., et al. (2007). The contribution of logical reasoning to the learning of mathematics
in primary school. British Journal of Developmental Psychology, 25(1), 147–166.
OECD. (2000). Measuring student knowledge and skills. The PISA 2000 assessment of reading, mathematical and scientific literacy. Education and
skills. Paris: OECD.
OECD. (2003). The OECD 2003 assessment framework. Mathematics, reading, science and problem solving knowledge and skills. Education and
skills. Paris: OECD.
Pascual-Leone, J. (1970). A mathematical model for the transition rule in Piaget’s developmental stages. Acta Psychologica, 32, 301–345.
Perkins, D. N., & Salomon, G. (1989). Are cognitive skills context bound? Educational Researcher, 18(1), 16–25.
Piaget, J. (1974). Foreword. In B. Inhelder, H. Sinclair, & M. Bovet (Eds.), Learning and the development of cognition (pp. ix–xiv). London:
Routledge & Kegan Paul.
Pinker, S. (1997). How the mind works. London: Penguin Press.
Posner, G. J., Strike, K. A., Hewson, P. W., & Gertzog, W. A. (1982). Accommodation of a scientific conception: Toward a theory of conceptual change. Science
Education, 66(2), 211–227.
Raven, J. C. (1960). Guide to the standard progressive matrices set A, B, C, D and E. London: H.K. Lewis.
Rose, C. (1985). Accelerated learning. Aylesbury: Accelerated Learning Systems.
Rychen, D. S., & Salganik, L. H. (Eds.). (2001). Defining and selecting key competencies. Seattle: Hogrefe and Huber.
Sacks, O. (1985). The man who mistook his wife for a hat. London: Duckworth.
Schmidt, F. L., & Hunter, J. E. (1998). The validity and utility of selection methods in personnel psychology: practical and theoretical implications
of 85 years of research findings. Psychological Bulletin, 124(2), 262–274.
Seok, B. (2006). Diversity and unity of modularity. Cognitive Science, 30(March–April), 347–380.
Shayer, M. (1979). Has Piaget’s construct of formal operational thinking any utility? British Journal of Educational Psychology, 49, 265–267.
Shayer, M. (1999). Cognitive acceleration through science education II: Its effect and scope. International Journal of Science Education, 21(8),
883–902.
Shayer, M. (2002). Not just Piaget; not just Vygotsky, and certainly not Vygotsky as alternative to Piaget. In M. Shayer & P. Adey (Eds.), Learning
intelligence: Cognitive acceleration across the curriculum from 5 to 15 years. Milton Keynes: Open University Press.
Shayer, M., & Adey, P. (Eds.). (2002). Learning intelligence: Cognitive acceleration across the curriculum from 5 to 15 years. Milton Keynes: Open
University Press.
Shayer, M., & Adhami, M. (2007). Fostering cognitive development through the context of mathematics: Results of the CAME project. Educational
Studies in Mathematics, 64, 265–291.
Shayer, M., & Adhami, M. (submitted for publication). Realising the Cognitive Potential of Children 5 to 7 with a Mathematics focus: Effects of a
two-year intervention. British Educational Research Journal.
Shayer, M., & Beasley, F. (1987). Does Instrumental Enrichment work? British Educational Research Journal, 13, 101–119.
Shayer, M., Coe, R., & Ginsburg, D. (2007). 30 years on—a large anti-‘Flynn effect’? The Piagetian test Volume & Heaviness norms 1975–2003.
British Journal of Educational Psychology, 77, 25–41.
Shayer, M., Demetriou, A., & Pervez, M. (1988). The structure and scaling of concrete operational thought: Three studies in four countries. Genetic,
Social, and General Psychology Monographs, 309–375.
Shayer, M., Küchemann, D., & Wylam, H. (1976). The distribution of Piagetian stages of thinking in British middle and secondary school children.
British Journal of Educational Psychology, 46, 164–173.
Shayer, M., & Wylam, H. (1978). The distribution of Piagetian stages of thinking in British middle and secondary school children. II—14- to 16-year olds and sex differentials. British Journal of Educational Psychology, 48, 62–70.
Spearman, C. (1904). ‘General intelligence’, objectively determined and measured. American Journal of Psychology, 15, 201–293.
Sternberg, R. (1985). Beyond IQ: A triarchic theory of intelligence. Cambridge: Cambridge University Press.
Sternberg, R. J. (1996). Myths, countermyths, and truths about intelligence. Educational Researcher, 25(2), 11–16.
Sundet, J. M., Barlaug, D. G., & Torjussen, T. M. (in press). The end of the Flynn effect? A study of secular trends in mean intelligence test scores
of Norwegian conscripts during half a century. Intelligence.
Teasdale, T. W., & Owen, D. R. (2000). Forty-year secular trends in cognitive abilities. Intelligence, 28, 115–120.
ter Laak, J. J. F. (1995). Psychologische Diagnostiek. Lisse: Swets & Zeitlinger.
Thomson, G. H. (1916). A hierarchy without a general factor. British Journal of Psychology, VIII(3), 271–281.
Thurstone, L. L., & Thurstone, T. G. (1941). Primary mental abilities. Chicago: Science Research Associates.
Visser, B. A., Ashton, M. C., & Vernon, P. A. (2006). Beyond g: Putting multiple intelligences theory to test. Intelligence, 34, 487–502.
Vosniadou, S., Ioannides, C., Dimitrakopoulou, A., & Papademetriou, E. (2001). Designing learning environments to promote conceptual change in
science. Learning and Instruction, 11, 381–419.
Westermann, G., Mareschal, D., Johnson, M. H., Sirois, S., Spratling, M. W., & Thomas, M. S. C. (2007). Neuroconstructivism. Developmental
Science, 10, 75–83.
Wohlwill, J. F. (1973). A study of behavioral development. New York: Academic Press.
Zelazo, P. R., & Frye, D. (1998). Cognitive complexity and control: II. The development of executive function in childhood. Current Directions in
Psychological Science, 7, 121–126.