Numeracy, Literacy and Newman's Error Analysis
Allan Leslie White
Journal of Science and Mathematics Education in Southeast Asia, 2010, Vol. 33 No. 2, 129-148
Background
While most mathematical questions involve the use of words, not all are
classed as word problems. A primary condition of a word problem is the
inclusion of a word description of a context within which the problem resides,
such as in the word problem shown in Figure 1.
Figure 1. Paul went on a bike hike. He rode 402 km on his bicycle over 6 days.
He rode the same distance each day. How far did Paul ride each day?
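For reference, the transformation this problem calls for is a single division: 402 km ÷ 6 days = 67 km ridden each day.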
NEA (Newman, 1977, 1983) was included in the 2007 and 2008 programs to
assist teachers confronted with students who experienced difficulties with
mathematical word problems. It challenged the prevailing practice of giving
such students ‘more of the same’ drill and practice in the hope that they
would rectify their difficulties. NEA provided a framework for considering
the reasons underlying the difficulties and a process for helping teachers
determine where misunderstandings occurred and where to target effective
teaching strategies to overcome them. Moreover, it provided an excellent
professional learning program for teachers and made a clear link between
literacy and numeracy.
NEA was designed as a simple diagnostic procedure. Newman (1977,
1983) maintained that when a person attempted to answer a standard,
written, mathematics word problem, that person had to pass over a number
of successive hurdles: Level 1 Reading (or Decoding), Level 2 Comprehension,
Level 3 Transformation, Level 4 Process Skills, and Level 5 Encoding (see
Table 1 for the interview prompts). Along the way, it was always possible
to make a careless error, and there were some who gave incorrect answers
because they were not motivated to answer to their level of ability. Newman’s
research generated a large amount of evidence highlighting that far more
children experienced difficulty with the semantic structures, the vocabulary,
and the symbolism of mathematics than with the standard algorithms. In
many Newman studies carried out in schools, the proportion of errors first
occurring at the Comprehension and Transformation stages has been large
(Marinas & Clements, 1990; Ellerton & Clements, 1996; Singhatat, 1991). Thus,
studies regularly reported that approximately 70 per cent of errors made by
Year 7 students on typical mathematics questions were at the Comprehension
or Transformation levels. These researchers also found that Reading
(Decoding) errors accounted for less than 5 per cent of initial errors and the
same was true for Process Skills errors, mostly associated with standard
numerical operations (Ellerton & Clarkson, 1996). Also, Newman’s research
consistently pointed to the inappropriateness of many remedial mathematics
programs in schools in which the revision of standard algorithms was
overemphasised, while hardly any attention was given to difficulties
associated with Comprehension and Transformation (Ellerton & Clarkson,
1996). There have been adaptations, and two procedures that modified the
interview procedure used by Newman (1977) will now be briefly described.
Table 1
The Newman’s Error Analysis Interview Prompts
1. Please read the question to me. If you don’t know a word, leave it out.
2. Tell me what the question is asking you to do.
3. Tell me how you are going to find the answer.
4. Show me what to do to get the answer. “Talk aloud” as you do it, so that I can
understand how you are thinking.
5. Now, write down your answer to the question.
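The prompts map one-to-one onto the five hurdles, so an interview can be summarised by the level at which the student's first error occurs. The sketch below is only an illustration of that bookkeeping, written in Python; the level names follow Newman's hierarchy, but the data structure and function are hypothetical rather than part of the Counting On materials.

```python
# Illustrative sketch only (not from the Counting On materials): encodes
# Newman's five hurdles and records the level at which a student's first
# error occurs during the diagnostic interview.
from enum import IntEnum


class NewmanLevel(IntEnum):
    READING = 1          # decoding the words and symbols of the question
    COMPREHENSION = 2    # understanding what the question is asking
    TRANSFORMATION = 3   # choosing the appropriate mathematical operation(s)
    PROCESS_SKILLS = 4   # carrying out the chosen procedure
    ENCODING = 5         # writing the answer in an acceptable form


def first_error_level(passed: dict[NewmanLevel, bool]) -> NewmanLevel | None:
    """Return the first hurdle the student failed, or None if all were passed.

    `passed` maps each level to whether the student negotiated that hurdle,
    as judged from the interview prompts in Table 1.
    """
    for level in NewmanLevel:              # iterates in hierarchy order, 1..5
        if not passed.get(level, False):
            return level
    return None                            # question answered correctly


# Example: a student who reads and understands the question but picks the
# wrong operation is classified at the Transformation level.
interview = {
    NewmanLevel.READING: True,
    NewmanLevel.COMPREHENSION: True,
    NewmanLevel.TRANSFORMATION: False,
}
print(first_error_level(interview).name)   # TRANSFORMATION
```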
relationships. The results indicated a 35% mismatch, with students who gave
correct answers with little or no understanding and others who gave incorrect
answers but possessed some understanding. The authors (Ellerton & Olson,
2005) cast doubt on the use of large-scale testing programs as a means of
making comparisons or as a basis for the allocation of resources.
While there are other theoretical approaches available to teachers, NEA
offers one of the easiest to use and adapt, and it has proven popular among
NSW teachers because of the ease of its diagnostic features. What is also
surprising is how NEA has been used by teachers as a problem-solving
strategy for students and as a classroom pedagogical strategy. In the next
section, data from the 2008 evaluation report (White, 2009) will be used to
examine student learning outcomes and teacher use of NEA.
Student Learning
The 2008 Counting On program was implemented in 99 schools across the
state. An assessment instrument based on the learning framework was
administered by the class teacher as a whole-class assessment schedule
covering place value, addition, subtraction, multiplication, division, and
word problem tasks. The assessment schedule results were used by the
teacher to identify the student target group. The target group completed the
program and was tested at its start and finish. The teachers were asked to
record the results of the target group assessment process, involving a
minimum of 5 students per class, on an Excel spreadsheet supplied to them.
The spreadsheet recorded each student's level on the learning framework
and NEA before the program was implemented and again following 10
weeks of targeted activities.
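To make the later tables easier to follow, a minimal sketch of the kind of per-student record the spreadsheet might have held is given below. The field names and CSV layout are assumptions for illustration only; the actual template was the one supplied by the program.

```python
# Hypothetical per-student record; field names are illustrative, not the
# actual Counting On spreadsheet template.
import csv
from dataclasses import dataclass


@dataclass
class TargetStudentRecord:
    student_id: str
    school_type: str        # e.g. "primary", "secondary" or "central"
    initial_nea_level: int  # NEA level recorded before the program
    final_nea_level: int    # NEA level recorded after the 10 weeks of activities


def load_records(path: str) -> list[TargetStudentRecord]:
    """Read a CSV export of the (hypothetical) spreadsheet, one row per student."""
    with open(path, newline="") as f:
        return [
            TargetStudentRecord(
                student_id=row["student_id"],
                school_type=row["school_type"],
                initial_nea_level=int(row["initial_nea_level"]),
                final_nea_level=int(row["final_nea_level"]),
            )
            for row in csv.DictReader(f)
        ]
```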
The Counting On program is funded under an Australian federal
government program and there is a mandatory evaluation process that
includes instruments and reporting requirements. The Counting On program
also has to report to other NSW state bodies and other data is collected for
these purposes. The author of this paper was given all the data collected
from all the instruments used and was asked to analyse them and construct
an evaluation report. He had no input into the design of these instruments
or into the collection of the data, although he was able to collect further
data. Thus methodological issues arise, such as a concern with the initial
and final student level diagnoses made by teachers. While the facilitators
are trained in the use of NEA, there are concerns with the process involving the other
teachers, and this will be discussed later in this paper. However, as a result
of these concerns with the integrity of some of the data, it was decided to
use only simple statistical tools in the analysis.
In 2008, data was collected from 74 schools: 55 primary schools, 16
secondary schools and three central schools. There were 1213 students:
954 primary students (78.6%) and 259 secondary students (21.4%). Only one
of the two questions involving Newman's Error Analysis in the assessment
instrument was recorded for each student. The NEA scale from 1 to 5 was
used, and a category 6 was added to represent those who could complete
the word problem successfully. Table 2 displays the initial and final NEA
levels for the 2008 cohort and indicates an improvement in the overall levels
from the initial to the final student assessments.
Table 2
The Initial and Final Newman's Error Analysis Levels

NEA Level    Initial Frequency    Initial Percentage    Final Frequency    Final Percentage
1            196                  16.2%                 51                 4.2%
2            452                  37.3%                 234                19.3%
3            399                  32.9%                 477                39.3%
4            101                  8.3%                  220                18.1%
5            37                   3.1%                  134                11.0%
6            28                   2.3%                  97                 8.0%
Total        1213                 100.0%                1213               100.0%
Table 3
The Difference in Newman's Error Analysis Levels (Final Level minus Initial Level)

Difference    Frequency    Percentage
-4            3            0.2%
-3            6            0.5%
-2            14           1.2%
-1            52           4.3%
0             452          37.3%
1             385          31.7%
2             189          15.6%
3             79           6.5%
4             27           2.2%
5             6            0.5%
Total         1213         100.0%
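Tables 2 and 3 are simple frequency summaries of the paired initial and final levels. The sketch below shows one way summaries of this kind, together with a basic paired significance check, could be produced from records like those sketched earlier. The report does not name the statistical test it used, so the Wilcoxon signed-rank test here is only an assumption, chosen because it suits paired ordinal data.

```python
# Illustrative reconstruction of the Table 2 / Table 3 style summaries from
# paired levels; this is not the program's actual analysis code.
from collections import Counter

from scipy.stats import wilcoxon  # assumed test: the report does not name one


def summarise_levels(records):
    initial = [r.initial_nea_level for r in records]
    final = [r.final_nea_level for r in records]
    n = len(records)

    # Table 2: frequency and percentage at each NEA level, before and after.
    initial_freq, final_freq = Counter(initial), Counter(final)
    for level in range(1, 7):
        print(level,
              initial_freq[level], f"{100 * initial_freq[level] / n:.1f}%",
              final_freq[level], f"{100 * final_freq[level] / n:.1f}%")

    # Table 3: distribution of the paired differences (final minus initial).
    diff_freq = Counter(f - i for i, f in zip(initial, final))
    for d in sorted(diff_freq):
        print(d, diff_freq[d], f"{100 * diff_freq[d] / n:.1f}%")

    # One simple check of whether the upward shift is significant; zero
    # differences are dropped under the default 'wilcox' zero handling.
    stat, p = wilcoxon(final, initial)
    print(f"Wilcoxon signed-rank: statistic={stat:.1f}, p={p:.4g}")
```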
While the 2008 data collected for the pre- and post-program student
learning outcomes indicated a statistically significant increase in the NEA
levels, of concern was the group who did not show an increase. In a program
as short as this, it is unrealistic to expect that all students will make great
leaps on the NEA levels. These targeted students have been struggling for
some time with their mathematical and literacy difficulties and have
developed judgements of their own ability. To improve one level, especially
on the NEA scale where this could involve an improvement in reading or
comprehension, is quite remarkable in such a short time frame.
However, there may be other possible explanations for the lack of
improvement in a small group of students or the apparent decline in others.
Vaiyatvutjamai and Clements (2004) analysed the errors made by 231 Form
3 (Year 9) Thai students in two Chiang Mai government secondary schools.
Students completed tasks before and immediately after a series of 13 lessons.
A number of misconceptions were revealed and, although some were clarified
as a result of the lessons, others remained and seemed to be ‘fossilised’.
The term ‘fossilised misconception’ was used to denote the situation
where a student maintains a faulty conception despite having been
specifically taught the ‘official’ defining characteristics of the relevant
concept. Associated with this is the absence of cognitive change over
time, or even resistance to change over time, so that cognitive inertia persists
despite the individual having been taught the ‘proper’ view of the concept.
The implication for the current study is that the strategies and procedures
of the Counting On intervention program should become integrated into
the everyday classroom and continue after the program has finished. These
‘fossilised misconceptions’ may require a longer period of time to be
changed.
Also of interest is that, while the study by Vaiyatvutjamai and Clements
(2004) involved students across the range of abilities, the results for low
performing students challenged the use of the term misconception for many
of the student errors: “A misconception can be regarded as a fairly stable,
but inappropriate, way of thinking ... analysing the errors made by low
performers in this study, was that the word ‘stable’ was not one that could
sensibly be used” (p. 181). Students with ‘unstable’ conceptions will give
different answers at different times, and hence it is possible that their test
scores will decline. Students who have not developed confidence in their
ability to answer a question may revert to guessing. This may well explain
the 35% mismatch, reported in Ellerton and Olson’s (2005) study, between
students who gave correct answers with little or no understanding and
others who gave incorrect answers but possessed some understanding.
While students using a guessing strategy may cause some instability in the
results for the process level in NEA if not identified by the teacher, guessing
would be easily revealed by the second adaptation of NEA (Ellerton &
Clements, 1997), in which correct answers are also interrogated.
This completes the section on NEA and student learning; the next
section discusses the second learning focus, which is on the teacher.
Teachers were shown how to use the NEA prompts (see Table 1) to diagnose
the difficulties that their students were having with mathematical word
problems. The following section of this paper explores how teachers
responded to NEA as an aspect of the Counting On program.
Conclusion
The Counting On program was a success in improving both teacher and
student learning outcomes through the inclusion of NEA. The data revealed
a statistically and educationally significant improvement in student learning
outcomes, involving mathematical problem solving using word problems,
between the start and the completion of the Counting On program.
As well, NEA was used by teachers as a remedial classroom strategy
and as a wider classroom pedagogical strategy. Thus this article concludes
that the inclusion of NEA provided a powerful classroom diagnostic
assessment and teaching tool for assessing, analysing and catering for
students experiencing difficulties with mathematical word problems.
Acknowledgement
The author wishes to acknowledge the support of the New South Wales
Department of Education and Training, particularly Peter Gould, Chris
Francis, Ray MacArthur and Bernard Tola of the Curriculum Support
Directorate. The opinions expressed in this paper are those of the author
and do not necessarily reflect those of the New South Wales Department of
Education and Training or the above members of the Curriculum Support
Directorate.
References
Askew, M. (2003). Word problems: Cinderellas or wicked witches? In I. Thompson (Ed.), Enhancing primary mathematics teaching (pp. 78-85). Berkshire, England: Open University Press.
Beishuizen, M. (1993). Mental strategies and materials or models for addition and subtraction up to 100 in Dutch second grades. Journal for Research in Mathematics Education, 24(4), 294-323.
Carpenter, T. P., & Lehrer, R. (1999). Teaching and learning mathematics with understanding. In E. Fennema & T. A. Romberg (Eds.), Mathematics classrooms that promote understanding (pp. 19-32). Mahwah, NJ: Lawrence Erlbaum Associates.
Casey, D. P. (1978). Failing students: A strategy of error analysis. In P. Costello (Ed.), Aspects of motivation (pp. 295-306). Melbourne: Mathematical Association of Victoria.
Clarkson, P. C. (1980). The Newman error analysis: Some extensions. In B. A. Foster (Ed.), Research in mathematics education in Australia 1980 (Vol. 1, pp. 11-22). Hobart: Mathematics Education Research Group of Australia.
Clarkson, P. C. (1983). Types of errors made by Papua New Guinean students (Report No. 26). Lae: Papua New Guinea University of Technology Mathematics Education Centre.
Clarkson, P. C. (1991). Language comprehension errors: A further investigation. Mathematics Education Research Journal, 3(2), 24-33.
Clements, M. A. (1980). Analysing children's errors on written mathematical tasks. Educational Studies in Mathematics, 11(1), 1-21.
Clements, M. A. (1982). Careless errors made by sixth-grade children on written mathematical tasks. Journal for Research in Mathematics Education, 13(2), 136-144.
Clements, M. A. (1984). Language factors in school mathematics. In P. Costello, S. Ferguson, K. Slinn, M. Stephens, D. Trembath, & D. Williams (Eds.), Facets of Australian mathematics education (pp. 137-148). Adelaide, South Australia: Australian Association of Mathematics Teachers.
Author:
Allan Leslie White; University of Western Sydney;
e-mail: [email protected]