Assessment as Learning?
Harry Torrance
The paper reports on the findings of a Learning and Skills Research Centre (LSRC) funded study
investigating the impact of different modes and methods of assessment on achievement and progress
in post-secondary education and training. Data were collected across Advanced-level academic and
vocational preparation programmes in schools and colleges, work-based training, and adult educa-
tion settings. The paper reports that clarity in assessment procedures, processes and criteria has
underpinned widespread use of coaching, practice and provision of formative feedback to boost
achievement, but that such transparency encourages instrumentalism. It concludes that the practice
of assessment has moved from assessment of learning, through assessment for learning, to assess-
ment as learning, with assessment procedures and practices coming completely to dominate the
learning experience and ‘criteria compliance’ replacing ‘learning’.
Introduction
Claims for the educational value and effectiveness of formative assessment in the
mainstream compulsory school system have been made for a number of years, in the
UK and elsewhere. It is argued that assessment has to move from ‘assessment of
learning’ to ‘assessment for learning’, whereby assessment procedures and practices
are developed to support learning and underpin rather than undermine student
confidence, achievement and progress (Black & Wiliam, 1998; Torrance & Pryor,
1998; Gipps, 1999; Shepard, 2000). Indeed, so commonplace have these claims
become that research and development attention has shifted to the dissemination and
implementation of such practices.
The research
The findings on which this paper is based derive from a Learning and Skills
Research Centre (LSRC) funded research project, ‘The impact of different modes
and methods of assessment on achievement and progress in the learning and skills
sector’ (additional support was provided by City & Guilds and the ‘University for
Industry’, UfI). The project was commissioned to investigate whether, and if so
how, the use of different assessment methods makes a difference to learner
achievement and progress in post-compulsory education and training. A review of
the literature on assessment in post-compulsory education noted the scarcity of
studies in the sector (Torrance & Coultas, 2004) while other recent research
reviews such as Stasz et al. (2004) note a similar paucity of evidence with respect
to what teaching and learning approaches might actually make a difference to
achievement and outcomes in the sector. Thus the project was commissioned to
address the need for a comprehensive overview of assessment procedures and prac-
tices in the post-compulsory Learning and Skills Sector (LSS) in England, espe-
cially at the level of impact on the learner. It sought to compare and contrast the
assessment experiences of learners in different settings and is the first comprehen-
sive study of assessment procedures and practices employed across the full range of
LSS contexts—school sixth-forms, further education colleges, workplaces and adult
learning environments.
Data have been gathered by conducting a series of parallel case studies of assess-
ment ‘in action’ across a wide variety of LSS settings and by a questionnaire distrib-
uted to a larger sample of learners derived from the case study settings. Case studies
were conducted of assessment across the post-compulsory Learning and Skills
Sector.
The boundaries of each case were established with respect to particular qualifica-
tions and/or awards and the contextual and regional factors which influence the
assessment of awards in practice, including awarding body procedures and
processes. Thus the case studies were designed as ‘vertical’ investigations, exploring
a particular qualification such as AVCE or NVQ from awarding body through to
learner, though with the emphasis on learner experience. In total 237 learners/
candidates were interviewed, along with 95 ‘assessors’ (i.e. all those involved in
operating and conducting assessment within the case studies including the full range
of senior awarding body staff, chief and lead verifiers, internal assessors and external
verifiers).
284 H. Torrance
Downloaded By: [Swets Content Distribution] At: 14:29 11 December 2007
students could opt to do one unit of a business studies course as either a coursework
project or a modular test and one student chose both options so that she could pick
her higher mark to go forward for final grading.
ABs (awarding bodies) also came in for considerable criticism over lack of responsiveness
to queries, and this clearly influences choice of AB. In some colleges in the study, curriculum
managers have been returning to BTEC programmes (that is, older vocational prep-
aration programmes), which they feel are more genuinely vocationally oriented,
rather than AVCEs. Additionally, BTEC students are not counted in national achievement
data for ‘Level 3’ (i.e. A-level equivalent for government statistics), so moving
to BTEC and taking vocational students out of college-level data could mask a
college’s attaining fewer higher grades in AVCE than in general A-level.
Savory et al. (2003) similarly report a move away from AVCE towards offering BTEC
Nationals, and note widespread dissatisfaction with ABs’ responses to queries. They
quote one respondent complaining that:
The exam boards have become so big that they are drowning and we find it almost
impossible to find a real person to talk to about new specification problems. (Savory et al.,
2003, p. 16)
Once ABs and syllabuses are selected, detailed grade criteria are articulated for
learners at A-level and AVCE—the breaking down of awarding body guidelines and
instructions into detailed mark schemes, assignment plans, etc., for students to follow.
At A-level [the examining boards] want to hear technical terms and expanded vocabulary
… [students] need to be a little more analytical at A-level … So we have drafted a crib sheet
with words, explanations. (A-level PE tutor)
We have spent a lot of time … coming up with a sort of a template to issue to our students
as a starting point to give them something to work on … writing frames, templates to fill
in, bullet points to follow … . (AVCE Sp&L tutor)
In turn students can draft and redraft assignments, receiving feedback on strengths
and weaknesses and what needs to be done to improve the grade. They can also re-
take unit and modular tests as necessary to improve grades. Sometimes tutors operate
with the ‘straight’ AVCE/BTEC nomenclature of Pass, Merit and Distinction; some-
times they operate with a range of grades which parallel AS and A2 (grades E–A) and
which in turn ‘map onto’ AVCE, thus ‘D/E’ refer to ‘describing and identifying’ and
would correspond with a Pass; ‘B/C’ involve ‘understanding’ and ‘bringing together’
and correspond with a Merit; ‘A’ must include ‘critical evaluation and analysis’
and corresponds with a Distinction. When asked what an A grade meant in AVCE
Business Studies, one student responded:
It’s analyse, evaluate, and stuff like that. You have to explain things but more in-depth.
Instead of just summarizing, you have to extend it and make it relevant and link it to what
you’ve already put, it’s clear and it flows and it’s fully described.
Other students were equally well tuned to the criteria and drafting process:
At the start of each module we get like a sheet of paper, it’s got ‘for an E you have to do
this, for a D you have to do that’ … they tell you what you have to do to get a good mark.
(AVCE student)
Thus ‘good teachers’ are those who can handle this workload and schedule this
formative feedback within realistic time-scales, offering clear guidance that students
feel they can follow. A-level and AVCE alike involve a great deal of criteria-focused
‘coaching’ of students. The potential downside of such activity is that achievement
can come to be seen as little more than criteria compliance in pursuit of grades.
Moreover, while the pressure of coursework assignments can become intense, the
responsibility for putting in the ‘hard work’ of assessment in pursuit of achievement
might now be said to fall as much on the shoulders of tutors as on the learners, and a
great deal of ‘hidden work’ is undertaken according to tutor disposition (cf. also
James & Diment, 2003).
In other workplace settings support can be observed in the way ‘leading questions’
are asked of candidates to help them through observations of workshop practice and
in compiling portfolio evidence. In the example below, where an MVE apprentice is
cleaning and adjusting brake drums and shoes, the interaction is more like a tradi-
tional ‘teacher–pupil’ pedagogic encounter than a workplace assessment.
[Fieldwork observation:]
Assessor: What were the drums like?
MVE trainee: They had a bit of rust on them so we cleaned the drums out and cleaned
the shoes out … [ ] …
A: I’m trying to think of something I haven’t asked you before. Yes, what
causes the rust on a brake pipe?
MVE tr: Corrosion.
A: So you get outside corrosion, yes, from the weather and then what about
the inside corrosion? How does a brake pipe get rusted on the inside which
you can’t see?
MVE tr: The brake fluid gets warm.
A: No.
MVE tr: It’s something in the brake fluid isn’t it?
A: Yes, what causes rust?
MVE tr: Water.
A: So if it’s rusty on the inside, what do we say the brake fluid does? Why do
we have to change the brake fluid? If I say hydroscopic to you, I’m not
swearing. Have you heard of it?
MVE tr: I’ve heard it now.
A: Do you know what it means? Can you remember what it means? It absorbs
moisture. So that’s why you have to change the fluid so that the brake pipes
don’t become rusty on the inside.
MVE tr: I knew it was something in the fluid.
A: Well now you know don’t you. Don’t forget next time will you?
Similarly in social care we observed assessors asking leading questions to help candi-
dates articulate what they (supposedly) already know and can do, as in this assess-
ment of a foster parent:
[Fieldwork observation:]
Assessor: Why is it important to explain these limits to clients [i.e. foster children]?
Learner: I don’t know. My mind’s gone blank.
A: So what if a child came home from school and they’re going on a trip in two
days time and they want you to sign the form?
L: I’d say to him because I’m not your legal guardian, I’m not allowed to sign so
we’d have to try and get in touch with the parent or social worker for them to
sign. I wouldn’t be stopping them from going on the trip. It’s not me that’s
stopping them; it’s that I’m not allowed to sign.
A: So right, they know what their expectations are then. When you’ve got older
children Julie, it’s always important to explain everything to them … [ ] … So
let’s think, say for instance you get a child who’s not going to school at all …
but you managed to work with them and they start going to school and start
going regular, why is it important that you sit down and talk to that child about
what they’ve achieved by going?
L: Well it’s to help them further their education; get a better start in life.
A: How do you think that would make them feel about themselves?
L: I think they’d feel more secure and confident about themselves.
A: That’s what I’m looking for.
L: I finally got there!
Adult learning
In Access courses and adult basic education settings, similar levels of support, coach-
ing and practice were observed. Exercises and assignments were drafted, read,
commented upon and then resubmitted:
I tend to lay out fairly clearly what I want … I’ve broken it down into four discrete sections
and lay out exactly what they have to do for each and then they get handouts which will
support them. (Access tutor)
They do give you a lot of feedback on your assignments … The first assignment in Psychology
I got a [level] 2 and she went through it and she said if you define that a bit better than
that and she gave me another week and I did it all and she gave me a [level] 3. (Access student)
In Basic Skills programmes there are continual efforts made to relate literacy and
numeracy tasks to relevant social and vocational activities, and render the Adult Basic
Skills Unit (ABSU) national curriculum into ‘small chunks’, but with the addition
that in college-based settings as much testing as possible was embedded in ordinary
classroom activities.
We all have small chunks for the students to cover and so they are doing activities in the
classroom and they can self-assess. They can peer-assess as well. They are doing group
work and they are doing listening as well, so they are assessing all these skills as we go
along. (Adult Basic Skills tutor)
Right now I have been working through units, I am just working through the units each
time … I tend to do them one by one, make sure they are OK, and get them out of the way
… you do one paper, and if you get it all right, then you can move on to the next one … It
is the set-up to work through each unit and you take your time going through it, make sure
it is correct, and you get to the end and do the final test. (Basic Skills learner)
In this respect it is a moot point whether some candidates even realized they were taking a
test. A number of learners had progressed from one level to another without being
able to remember that they had taken tests:
… the English teacher is quite clever, she’s given me a few tests without me [realizing]. When
I go on the computer she says ‘well that’s level 1 or level 2’, so she says ‘you passed that’. I
don’t know … which is kind of good because of the psychology of it … . (Basic Skills learner)
Significant levels of tutor ‘hidden work’ were also observed in this sub-sector as in others:
If you’ve got problems you know who to talk to. You haven’t got to go hunting for some-
body … I think [tutor] R feels like the mother of the group. She’s the mother hen that goes
round and worries about everybody … If you need help or you need extra time, you go and
talk to R and it’s sorted. (Access student)
Last year [named tutor] had them from [village] and it’s quite a way out, I mean it’s right
up in the hills really. And they don’t have cars because it’s a poorer area … And she
brought them down in her car. She went up, collected them all, brought them all back.
They did their exams and she took them back again. Because you do worry about them.
Well, I worry about them. I know it’s stupid. (Basic Skills tutor)
It appeared to be a very common practice for MVE apprentices simply to keep their
garage ‘job-sheets’ up to date and filled in with brief descriptions of the jobs under-
taken and completed. The assessor then ‘deconstructs’ this basic information into the
relevant ‘competences’ and maps and transfers the detail into the apprentice’s
evidence portfolio. Such practices also accord with the findings of other recent studies
of portfolio completion in the workplace which indicate that younger workers do not
usually take responsibility for portfolio completion (Kodz et al., 1998; Tolley et al.,
2003). Indeed, Fuller and Unwin (2003) in their study of apprenticeship in various
sectors of the steel industry, note, almost in passing, that:
Responsibility for recording the apprentices’ progress towards the achievements of the
qualifications was taken by the external training provider at the regular review sessions. An
important part of his job was to help apprentices identify how the day-to-day task in which
they were engaged could be used to generate evidence that they were meeting the compe-
tence standards codified in the NVQ… . (Fuller & Unwin, 2003, p. 422)
In turn, the instrumentalism of learners both drives and validates the level of tutor
support. Similarly institutions themselves are target-oriented and instrumentally
driven. Thus learners seek and expect details of assessment specifications, evidence
requirements and so forth. They want support and appreciate it when they get it; but
their instrumentalism reinforces tutor moves to focus on grade criteria, the elucidation
of evidence, etc. As a consequence assignments and portfolios from some institutions
can often look very similar—in structure, format, types of evidence included, etc.—
so it seems that some institutions are becoming very adept at ‘coaching’ cohorts
Conclusion
Previous studies of formative assessment in the compulsory school sector have noted
that broad recommendations to ‘share criteria’ and ‘provide criteria-focused feed-
back’ can be interpreted in different ways. Torrance and Pryor (1998) report that
formative assessment can be ‘convergent’, with teachers focusing on identifying and
reporting whether or not students achieve extant curriculum-derived objectives, or
‘divergent’, which is much more oriented towards identifying what students can do in
an open-ended and exploratory fashion. Similarly Marshall and Drummond (2006)
note that while some teachers in the ‘Learning How to Learn’ study attempted to
implement the ‘spirit’ of formative assessment with respect to developing student
autonomy, most simply followed the ‘letter’ and used formative assessment tech-
niques to facilitate short-term lesson planning and teaching, and promote short-term
grade accomplishment. It would appear that not much has changed in the compul-
sory sector in the eight years between the two studies. Certainly it would seem that
judgements are made and thus should be the level of the system at which most efforts
at capacity building are directed. This may start to reinstate the challenge of learning
in the post-compulsory sector and attend to it as an act of social and intellectual
development rather than one of acquisition and accumulation.
Acknowledgements
I am grateful to the Sponsors, LSRC and City & Guilds, the research manager
Maggie Greenwood, and for the contributions of the other members of the research
team: Helen Colley, Kathryn Ecclestone, Dean Garratt, David James, Janis Jarvis,
Heather Piper, some of whose work is included elsewhere in this issue.
Notes
1. Paper originally presented to BERA symposium, ‘Assessment in Post-compulsory Education
and Training’, University of Warwick, September 2006.
2. Also known in the UK as post-compulsory education and training, since some elements of
provision remain located in secondary school ‘sixth forms’, which students attend after passing
the formal minimum school leaving age of 16 years.
3. Lorna Earl also uses the phrase ‘Assessment as Learning’ (Earl, 2003). However, she argues for
a congruence between learning and assessment which is concerned with student self-assessment
and is very much in the assessment for learning tradition. My use of the term is much more
concerned with the displacement of learning (i.e. understanding) by procedural compliance: i.e.
achievement without understanding.
4. ‘SureStart’ focuses on early learning among pre-school children and community-based support
for new parents.
5. Research project funded 2003–2005 by LSRC and City & Guilds, with support from Ufi. The
full report including a full description of the methodology, sample, etc. is available at: http://
www.lsda.org.uk/cims/order.aspx?code=052284&src=XOWEB.
6. I am grateful to Mary James for raising this query when she read an earlier version of this paper.
7. Two cautionary riders may be added at this point, however. First, the research engaged with learn-
ers who are in the system, rather than outside it, and hence are likely to have a propensity to
continue to comply and achieve, albeit while perhaps avoiding tasks or activities which are
perceived as too difficult. Second, we have encountered evidence of more intrinsic orientations
and rewards, particularly in the adult education sector, with respect to the development of self-
confidence. In training contexts the development of practical competence following school-based
academic failure and/or disinterest has also brought a sense of achievement. By their very nature,
however, such achievements are difficult for learners to identify and articulate in more than a
cursory manner. When asked in the questionnaire to state how they knew they were making
progress, learners’ answers varied from ‘because I’m passing the tests’ and ‘because I’m getting
more knowledge’ through ‘holding my job down at work’ to ‘because I just refitted the brakes on
my car’. Clearly there is a sense of personal achievement linked to developing competence in these
latter responses, but such responses are unusual and hard to ‘call forth’ in the context of assessment
tasks and events where achievement and progress are normally interpreted much more narrowly.
Notes on contributor
Harry Torrance is professor of education and director of the Education and Social
Research Institute, Manchester Metropolitan University, UK.
References
Black, P. & Wiliam, D. (1998) Assessment and classroom learning, Assessment in Education, 5(1),
7–74.
Black, P., McCormick, R., James, M. & Pedder, D. (2006) Learning How to Learn and Assessment
for Learning: a theoretical inquiry, Research Papers in Education, 21(2), 119–132.
Burke, J. (Ed.) (1989) Competency-based education and training (London, Falmer Press).
Earl, L. (2003) Assessment as learning (Thousand Oaks, Corwin Press).
Fuller, A. & Unwin, L. (2003) Learning as apprentices in the contemporary UK workplace: creat-
ing and managing expansive and restrictive participation, Journal of Education and Work,
16(4), 407–426.
Gipps, C. (1999) Socio-cultural aspects of assessment, Review of Research in Education, 24, 355–392.
Hodgson, A. & Spours, K. (2003) Curriculum 2000 (London, Routledge).
James, D. & Diment, K. (2003) Going underground? Learning and assessment in an ambiguous
space, Journal of Vocational Education and Training, 55(4), 407–422.
James, M. & Pedder, D. (2006) Beyond method: assessment and learning practices and values,
The Curriculum Journal, 17(2), 109–138.
James, M., Black, P., McCormick, R., Pedder, D. & Wiliam, D. (2006) Learning How to Learn,
in classrooms, schools and networks: aims, design and analysis, Research Papers in Education,
21(2), 101–118.
Jessup, G. (1991) Outcomes: NVQs and the emerging model of education and training (London,
Falmer Press).
Kodz, J., Dench, S., Pollard, E. & Evans, C. (1998) Developing the key skills of young people: an eval-
uation of initiatives in the former Avon area, Report 350 (Brighton, Institute for Employment
Studies).
Marshall, B. & Drummond, M. J. (2006) How teachers engage with Assessment for Learning:
lessons from the classroom, Research Papers in Education, 21(2), 133–149.
Raffe, D., Croxford, L. & Brannen, K. (2001) Participation in full time education beyond 16: a
‘home international’ comparison, Research Papers in Education, 16(1), 43–68.
Savory, C., Hodgson, A. & Spours, K. (2003) The Advanced Vocational Certificate of Education
(AVCE): a general or vocational qualification? (London, IoE/Nuffield Series Number 7).
Shepard, L. (2000) The role of assessment in a learning culture, Educational Researcher, 29(7), 4–14.
Stasz, C., Hayward, G., Oh, S. & Wright, S. (2004) Outcomes and processes in vocational learning: a
review of the literature (London, Learning and Skills Research Centre (LSRC)).
Tolley, H., Greatbatch, D., Bolton, J. & Warmington, P. (2003) Improving occupational learning:
the validity and transferability of NVQs in the workplace, DfES Research Report RR425
(London, DfES).
Torrance, H. & Coultas, J. (2004) Do summative assessment and testing have a positive or negative
effect on post-16 learners’ motivation for learning in the learning and skills sector? A review of the
research literature on assessment in post-compulsory education in the UK (London, LSRC).
Torrance, H. & Pryor, J. (1998) Investigating formative assessment: teaching, learning and assessment
in the classroom (Buckingham, Open University Press).
Torrance, H., Colley, H., Garratt, D., Jarvis, J., Piper, H., Ecclestone, K. & James, D. (2005) The
impact of different modes of assessment on achievement and progress in the learning and skills sector
(London, LSDA for the LSRC).
Wenger, E. (1998) Communities of practice: learning, meaning and identity (Cambridge, Cambridge
University Press).
Wolf, A. (1995) Competence-based assessment (Buckingham, Open University Press).