Mind the Gap
Evaluation Report and Executive Summary
October 2014
Independent evaluator:
Richard Dorsett, Cinzia Rienzo and Heather Rolfe (NIESR)
Process evaluator:
Helen Burns, Barbara-Anne Robertson, Benjamin Thorpe and Kate
Wall (Durham University)
The Education Endowment Foundation (EEF)
The Education Endowment Foundation (EEF) is an independent grant-making charity dedicated to
breaking the link between family income and educational achievement, ensuring that children from all
backgrounds can fulfil their potential and make the most of their talents.
The EEF aims to raise the attainment of children facing disadvantage by:
• Identifying promising educational innovations that address the needs of disadvantaged children in primary and secondary schools in England;
• Evaluating these innovations to extend and secure the evidence on what works and can be made to work at scale;
• Encouraging schools, government, charities, and others to apply evidence and adopt innovations found to be effective.
The EEF was established in 2011 by the Sutton Trust, as lead charity in partnership with Impetus
Trust (now part of Impetus-The Private Equity Foundation) and received a founding £125m grant from
the Department for Education.
Together, the EEF and Sutton Trust are the government-designated What Works Centre for improving
education outcomes for school-aged children.
For more information about the EEF or this report please contact:
Robbie Coleman
Research and Communications Manager
Education Endowment Foundation
9th Floor, Millbank Tower
21-24 Millbank
SW1P 4QP
p: 020 7802 1679
e: [email protected]
w: www.educationendowmentfoundation.org.uk
About the evaluator
The trial was independently evaluated by a team from the National Institute of Economic and Social
Research (NIESR). The lead evaluator was Richard Dorsett.
The process study was carried out by a team from Durham University, led by Kate Wall.
Contact details:
Richard Dorsett
National Institute of Economic and Social Research
2 Dean Trench Street,
Smith Square,
London SW1P 3HE
p: 020 7654 1940
e: [email protected]
Kate Wall
School of Education
Durham University
Leazes Road
Durham DH1 1TA
p: 0191 3348334
e: [email protected]
Contents
Executive summary .................................................................................................. 2
Introduction............................................................................................................... 5
Methodology ............................................................................................................. 8
Impact evaluation ................................................................................................... 17
Process evaluation ................................................................................................. 35
Conclusion .............................................................................................................. 40
References .............................................................................................................. 44
Appendices ............................................................................................................. 46
Executive summary
The project
Mind the Gap sought to improve the metacognition and academic attainment of pupils in Year 4. There were two aspects to the intervention. The first involved training teachers in how to embed metacognitive approaches in their work, and how to continue to involve parents effectively and strategically. This training took place over a day and was provided by a consultant. The second component focused on parental engagement and offered families the opportunity to participate in a series of facilitated workshops in which children and parents worked together to create an animated film. Sessions were coordinated by a practitioner who helped participants to think about how they were learning, to create learning goals and to reflect on their progress; in short, to be metacognitive about the learning process they were engaged in together. The families were offered 2 hours of workshops per week for 5 weeks (10 hours in total).
The project targeted schools in four areas of England: Birmingham, Devon, London and Manchester. It
was delivered by the Campaign for Learning, with assessments carried out by Durham University.
Delivery started in September 2012 and finished in October 2013.
The project was evaluated using a randomised controlled trial, which compared the interventions to a
‘business-as-usual’ control group. It is important to note that it was eligibility for the animation course,
not participation, that was randomised, so the results must be regarded as estimating the effect of
being offered the animation course (alone or in combination with teacher training, as appropriate)
rather than participating in it.
Key conclusions
1. The headline findings provide no evidence of a statistically significant impact of Mind the Gap on
attainment. There is not sufficient evidence to conclude that any observed effect was caused by
the programme rather than occurring by chance.
2. The estimate of the programme’s impact on pupils’ metacognition was positive and statistically
significant. This improvement in metacognition may in time lead to an impact on academic
attainment.
3. Participating families and staff felt the intervention enhanced home–school relationships and
strengthened the learning relationship between children and parents.
4. To increase the proportion of families who sign up to the animation course, it is important that
schools clearly communicate its aims and promote the potential benefits of participation.
5. The difficulty in recruiting schools suggests that introducing Mind the Gap more generally may be
difficult where schools are not committed to parental engagement or where they have difficulty
delivering activities out of hours.
What impact did it have?
The impact analysis considered the effect of Mind the Gap as a whole, as well as the separate effects
of the teacher training component and eligibility for the parental engagement intervention. InCAS tests
of reading, general maths and mental arithmetic were administered six months after the intervention.
The intervention had no statistically significant effect on the primary outcome, a combined reading and
numeracy score, so we do not have sufficient evidence to conclude that any observed effect was
caused by the programme rather than occurring by chance. This was also true for the subgroup of
pupils eligible for free school meals.
A number of secondary outcomes were also considered. These included reading, general maths and
mental arithmetic scores as well as a measure of how children felt about the relationship with their
parents (constructed using their responses to questions taken from the Self-Description
Questionnaire). These estimates were consistent in showing no statistically significant effect.
However, measures of metacognition (elicited from data collected using Pupil Views Templates)
provided evidence that eligibility for the animation course increased pupils’ ‘productive thinking’ and
‘metacognitive skilfulness’.
| Group | No. of pupils | Effect size (95% confidence interval)* | Estimated months' progress | Is this finding statistically significant? | Evidence strength** | Cost of approach*** |
|---|---|---|---|---|---|---|
| Mind the Gap overall effect (all pupils) | 492 | -0.141 (-0.697, 0.414) | -2 | No |  | ££ |
| Teacher training alone | 278 | 0.009 (-1.119, 1.137) | 0 | No |  |  |
| Parental engagement offer alone | 186 | -0.252 (-0.576, 0.071) | -3 | No |  |  |
| Mind the Gap overall effect (FSM pupils) | 123 | -0.265 (-1.332, 0.801) | -4 | No |  |  |

* Effect sizes with confidence intervals that pass through 0 are not ‘statistically significant’, so we do not have sufficient evidence to conclude that the observed effect was caused by the programme rather than occurring by chance.
** For more information about evidence ratings, see Appendix XIV in the main evaluation report. Evidence ratings are not provided for subgroup analyses, which will always be less secure than overall findings.
*** For more information about cost ratings, see Appendix XV in the main evaluation report.
How secure is this finding?
Overall the evaluation findings are judged to be of low security, largely due to the high degree of attrition.

Mind the Gap was evaluated using a randomised controlled trial. Participating schools were randomised to a ‘business-as-usual’ control group or an intervention group, which received the teacher training element. Within the intervention group, Year 4 classes were then randomised into a further two groups: one that was eligible to participate in the animation course and one that was not. This design allowed three effects to be estimated:
• the overall effect of Mind the Gap
• the effect of teacher training alone
• the effect of eligibility for the animation course.
It was eligibility for participation in the animation course that was randomised. Participation in the
course was very low: 72% of pupils eligible for the animation course did not attend any sessions. The
results must therefore be regarded as estimating the effect of eligibility for the animation course (alone
or in combination with teacher training, as appropriate) rather than participation in it.
In practice, implementation difficulties raised concerns about the security of the results. The main
problem was the high level of dropout and the fact that this was concentrated among control schools.
There was no means of including such schools in the analysis since they provided no outcome data. A
comparison of baseline characteristics suggests that there may be some degree of imbalance across
experimental arms in the sample used for analysis, but these comparisons themselves are made
difficult by the high incidence of missing baseline data. Furthermore, among schools that did not drop
out, there were missing outcome data, compounding the problem caused by dropout.
Another limitation is that, in practice, the treatment status of classes within schools did not agree in all
cases with the randomised status. Such non-compliance is a common feature of randomised
controlled trials and generally one can still adhere to the intent to treat (ITT) principle. However, in
several cases the randomised status became meaningless in practice, as the teachers who were
randomised (in most cases, classes were identified by their teacher) had left the school. The final
analysis used de facto treatment status. This is a deviation from ITT and further reduces the extent to
which the impact estimates for the components of Mind the Gap can be viewed as experimental.
Existing evidence suggests that metacognition and parental engagement interventions have a high
average impact on attainment. However, these interventions can take many forms and the intervention
considered in this project has no directly comparable precedent.
How much does it cost?
For schools that already have the required IT hardware, delivery of the animation course for up to 15
families costs in the region of £1,950. This translates into a per child cost of roughly £130. The cost of
teacher training was typically an additional £195 per delegate.
| Cost item | Cost per pupil |
|---|---|
| Salary costs | £60 |
| Resources* | £47 |
| Administration | £11 |
| Other** | £12 |
| TOTAL | £130 |
*maximum cost: includes software licence and webcams package
**facilitator travel – variable depending on location
1. Introduction
Mind the Gap seeks to create better learners by harnessing the power of effective metacognition
strategies. This is based on the idea that effective learning depends on skills, attitudes and
dispositions to learning, and that these can themselves be learned.
There is evidence to suggest that metacognitive approaches may be effective at raising pupils’
academic attainment. Furthermore, it is not just teachers who can help children to improve their
learning skills. Parental engagement is potentially important; giving parents the skills to help their
children to become confident, motivated learners could make a big difference to their performance at
school.
1.1 Intervention
There are two aspects to the intervention. The first involves one full day of training for school staff
(usually class teachers) in how to embed metacognitive approaches in their work, and how to continue
to effectively and strategically involve parents. The second is the parental engagement component
which offers families the opportunity to participate in a series of facilitated workshops. These
workshops run for 10 hours in total (2 hours per week over 5 weeks) and involve children and their
parents working together to create an animated film. Sessions are coordinated by a practitioner who helps participants to think about how they are learning, to create learning goals and to reflect on their progress.1 As far as possible, the format of the course was standardised across schools.

1 The name of the animation project is Animate Learning.
1.2 Background evidence
Mind the Gap is a family learning project aiming to facilitate intergenerational engagement with
learning and to build learning skills in families. The project builds on a small-scale pilot project
completed by the Campaign for Learning (CfL) in Harrow that showed encouraging results
(unpublished). Among the 40 participating primary school pupils, the percentage achieving the
government’s expected level of progress in English rose from 15% to 73%.
Implicit in the project is reflective and strategic thinking (Moseley et al. 2005) that helps to make the
process of learning explicit, facilitating the development of participants’ metacognitive awareness
(Flavell 1979). The animation project targets children and their fathers or male guardians. It is
accompanied by staff development to promote the same metacognitive strategies (Wall et al. 2010)
across the curriculum and home–school boundaries.
Pedagogies that prioritise the development of metacognitive knowledge and skilfulness have become
increasingly common in English schools (Wall et al. 2010). These pedagogies focus on the key idea of ‘thinking about your thinking’ and draw on a range of theoretical and pedagogic traditions (such as learning to learn, thinking skills, self-regulation, habits of mind, dispositions, self-efficacy and self-esteem in relation to learning); the concepts themselves are fluid, reacting to the pedagogic and policy environment (Wall 2010). In previous projects, run under the heading of ‘Learning to Learn’ and also coordinated by the CfL (Wall et al. 2010), it was found to be very important to find the ‘space’ (time, environment, inspiration and suitably challenging stimuli) for all participants (teachers and students) to voice perspectives on learning
(Wall 2012). In addition, many teachers saw the relevance of extending these pedagogies and
enhancing metacognitive development across home–school boundaries (Hall et al. 2005).
Research on family learning shows that, once socio-economic status is accounted for, the biggest
influence on children and young people’s motivation and attainment is parental support (for example,
Desforges and Abouchaar 2003; Harris and Goodall 2007). But what this parental engagement looks
like is up for debate; there is much variation in the type of intervention and therefore in the impact and process studied.
process that is studied. Most research in the field is concerned with literacy development (for example,
Wade and Moore 2000; Wagner et al. 2002). However, many of these family literacy programmes
have been criticised for having a dominant ‘deficit discourse’ (Anderson et al. 2010: 41) in which
parents and guardians are seen as inadequate in supporting, or even as hindering, learning. In some cases, the process
can even be described as one of surveillance (Crozier 1998). Wolfendale (1996) suggested that
projects should instead support parents to become integrated with school processes, curriculum and
teaching and learning approaches, while others talk about partnership working, which suggests
schools and parents working on an equal footing, pooling their expertise and applying this within a
project.
Involvement in family learning projects can also have a positive effect on parent learning identity
(Swain et al. 2014) and an explicit focus on the parent’s needs can increase impact (van Steensel et
al. 2011). Previous projects have found an increase in parents’ metacognitive awareness (Wall et al.
2009). As a result, in Mind the Gap there has been a strategic refocusing of the initiative away from an
explicit family–school relationship towards a partnership approach which focuses on learning and
involves the child more explicitly. The focus is on the learning experience rather than anything more
school-led or formal, with teachers and school ‘stepping back’ but providing a supportive role which
allows learning to happen and be recognised between parent and child.
Another key feature of Mind the Gap is the focus on fathers and male guardians, who tend to be
reluctant, absent or ignored in the majority of family learning programmes (Macleod 2008; Freeman et
al. 2008). Yet the research shows that when fathers do have high levels of involvement or interest in
their child’s education (Nord et al. 1998; Flouri and Buchanan 2004; Roopnarine et al. 2006) there is
greater progress, more positive attitudes and higher expectations (for example, Hill and Taylor 2004;
Goldman 2005).
The intervention is relevant to the government policy of raising the achievement of disadvantaged
pupils. It also promotes parental engagement and involvement in children’s education, as advocated
by Ofsted. It provides early intervention for pupils who are not achieving their full potential. It contributes to the literacy agenda through its aim to develop the literacy skills of pupils, who are required to complete written work and contribute to verbal discussions. The programme also supports wider digital
literacy and digital inclusion agendas.
As described above, there has been a large amount of research into metacognition and the
importance of parental engagement. However, robust evidence on the effectiveness of discrete
interventions is limited. This evaluation was intended to provide robust evidence on the effectiveness
of one such intervention. As such, it builds on the pilot study in Harrow (mentioned above) and
complements the understanding achieved to date of Mind the Gap (Wall et al., in press). The
evaluation was set up as an effectiveness trial, aiming to test whether the intervention can work at
scale in a large number of schools.
1.3 Evaluation objectives
The evaluation was designed to assess the impact on children’s attainment, their feelings about their
relationship with their parents and their metacognition of:
1. Mind the Gap as a whole (that is, both the Parental Engagement offer and the Teacher
Training components – PETT)
2. Eligibility for the parental engagement element of Mind the Gap (PE)
3. The teacher training element of Mind the Gap (TT).
The process study was intended to assess implementation of the programme in practice and also to
identify conditions required for successful delivery. Qualitative analysis was included to assess
stakeholders’ opinions of the intervention.
1.4 Project team
The intervention was delivered by the CfL, a UK-based education charity which developed the initial
idea for the project. CfL were also responsible for recruiting schools to the study. Durham University
carried out the initial sample size calculations and was responsible for administering all tests. Due to
the pre-existing relationship between Durham University and CfL, it was necessary to appoint an
independent evaluation team. NIESR led the independent evaluation, agreeing the research design,
refining the randomisation approach, randomising the schools and classes and carrying out the impact
analysis. Durham University led the process study and qualitative analysis, with oversight and advice
on research instruments from NIESR.
1.5 Ethical review
The project was considered and approved by the ethics committee at Durham University School of
Education.
CfL were responsible for school recruitment with support from Durham University. They made
individual contact with each school and provided a full explanation of the evaluation during scheduled
set-up meetings. Head teachers were asked to give signed consent for their school to take part
(Appendix 3). In signing up for the project, schools were fully aware that they were giving consent for
the evaluation to take place and what this would involve. Training sessions reinforced and elaborated
on the content of the evaluation, and provided opportunities for further questions, written information
and contact details for the research team. Members of the research team from Durham University
attended these training sessions and ensured they were available to answer ethical queries at any
time via email or telephone.
A frequently asked questions sheet and a letter for parents and carers concerning the evaluation
which schools could choose to use (or not) were produced (Appendix 4). This outlined the nature of
the evaluation, the measures to be used and the data protection standards that would be adhered to.
An initial parent/carer registration form sought photographic and video consent and was accompanied
by safeguarding information, including a disclaimer that data would be stored in accordance with the
Data Protection Act and used only by Durham University and its project partners (Appendix 5). This
form was revised as the project developed and the longitudinal data and links to the NPD became more prominent within the project design. The revised form was sent out to all participant parents, accompanied by an explanatory letter from the University (Appendix 6).
Where families were interviewed they gave specific signed consent for this, having been provided with
a letter explaining the research as well as interview questions in advance of the interview.
2. Methodology
2.1 Trial design
A cluster randomisation approach was adopted since individual-level randomisation was felt to be
impractical, likely to cause contamination and to raise ethical concerns. As shown in Figure 1,
randomisation was at both the school level and the class level. Doing this allowed the three impacts
listed as evaluation objectives to be estimated.
Figure 1. Trial design

[Flowchart: 50 eligible schools were randomised into 25 intervention schools and 25 control schools. Within intervention schools, classes were randomised into PETT classes (N=25) and TT classes (N=25); within control schools, control classes (N=25) were selected and any remaining classes were not used. Baseline and follow-up data collection followed; see Figure 3 for details.]
Schools participating in the trial were randomly assigned to either the intervention group or a control
group. Teachers in intervention schools received training to embed techniques for developing
metacognition in their work. Furthermore, eligibility for the animation course was randomly assigned
among Year 4 classes in intervention schools. Year 4 was chosen as it is a primary age year group
that is not involved in national testing processes, but which normally achieves a good level of literacy
development, thereby allowing appropriate responses to the outcome and process measures used in
this study.
This assignment approach results in three types of Year 4 classes being included in the trial:
1. classes in intervention schools that are eligible for the animation course – PETT classes
2. classes in intervention schools that are not eligible for the animation course – TT classes
3. classes in control schools.
2.2 Eligibility
Eligible schools were from Devon, the London Borough of Haringey, Birmingham and Manchester, and were considered to be located in areas of substantial socio-economic deprivation with a high proportion of children receiving free school meals. Schools with at least two-form entry were targeted but this was
not always possible, particularly in rural areas. It was expected that the majority of schools taking part
would be achieving expected pupil attainment (Key Stage 2 floor standards) in English and maths for
less than 70% of their pupils. However, as some schools were recruited through clusters and
networks, a minority of schools were achieving attainment above these targets. Year 4 pupils were
eligible to take part in the project, with the addition of some Year 3 and Year 5 pupils taking part if the
school had mixed-age classes. Siblings were also eligible to attend family animation sessions.
School-level consent was sought before randomisation on a verbal basis from the head teacher. This
was completed as part of the initial set-up meeting. The project design, the evaluation and its key
structures were made clear at this point as well as the nature of the intervention and data collection.
Once verbal agreement was reached, a written consent form was sent out (with non-return presumed
to be acceptance).
Consent was sought from families via the parent registration form. The key purpose of this document
was to capture socio-economic data relating to the broader family but the form also required the
parent to give consent for participation in the project data collection.
2.3 Intervention
The Mind the Gap intervention included two key elements: eligibility to participate in the animation
project and the staff development input.
Animation project: This consisted of five sessions (10 hours in total, 2 hours per week for 5 weeks)
where children and their parents worked together to create an animated film. The sessions usually
took place in school after the end of the school day. They were coordinated by a practitioner,
employed and trained by the CfL. These practitioners helped participants to consider and reflect upon
how they were learning. Where possible, a class teacher was also present during the sessions. This
requirement was made clear by the CfL in the school’s introduction to the project but was not always
met, due to varying levels of commitment and capacity. The child invited their parent or guardian to
take part in the project through drawing a ‘wanted poster’; thus the contract was immediately
suggested to be between the child and the adult without the school’s explicit involvement. The target
was fathers or male guardians, but where this was not possible the child invited an adult of their
choice.
The animation project itself was designed around three core tools to ensure consistency:
• A presentation leading each facilitator through the five sessions. This included activities, films and prompts to ensure the same focus on metacognition and that the same animation-based activities were covered.
• A family handbook given to all participant family groups, including information and activities for each session linked to the presentation.
• A teachers’ handbook given to the teachers of the participant classes including, as above, information and guidance on each session linked to the presentation.
The five sessions were carefully designed to encompass elements such as story planning, modelling,
trialling the equipment, exploring different animation techniques, filming, and editing. Each element was matched to one of the 5Rs (a learning dispositions framework) developed by CfL2: Readiness, Resourcefulness, Resilience, Responsibility and Reflectiveness (Wall et al. 2010).
through the programme was carefully mapped to ensure that the metacognitive elements and the
animation process were closely associated and therefore maximised opportunities for transfer. This
process was standardised using a number of resources including goal and reflection sheets that
allowed the families to record their learning experiences.
| Session | Metacognition focus | What happens |
|---|---|---|
| 1 | Readiness | Introduction from facilitator; showing of previously made films; introduction to metacognition; use of modelling clay to consider ‘what kind of learner am I?’; experimenting with animation technology; complete goal and reflection sheets |
| 2 | Resourcefulness | Frequently described as the ‘creative’ session by facilitators: story planning using templates; use of craft materials to make sets and characters; complete goal and reflection sheets |
| 3 | Responsibility | The most technical session, in which families need to get to grips with the software: complete model making; start filming; complete goal and reflection sheets |
| 4 | Resilience | Continue to use the software to make the film; adjust models and storyboard to fit practical constraints of filming and project timings; complete goal and reflection sheets |
| 5 | Reflection | Finish films; add sound and credits; share films with the wider group; reflection on what has been learned/achieved; complete goal and course reflection sheets |
Staff development: All intervention schools were invited to send two teachers to a regional training
day led by Jackie Beere, an independent consultant with established expertise in the field of Learning
to Learn, working in partnership with CfL. This day’s training focused on how to embed approaches for
developing metacognition in their work, and how to develop a strategic approach to effective parental
engagement. The training covered why metacognitive development is important, including some of the
theoretical background in this area, as well as a practical consideration of requirements in individual school contexts, enabling teachers to apply their learning to their own work. Teachers were also provided with access to resources and references to use in school.

2 http://www.campaign-for-learning.org.uk/cfl/learninginschools/l2l/5rs.asp
Control schools formed a waiting list and became eligible for both elements of the intervention in the
following academic year. Eligibility for the animation project again comprised Year 4 classes, a cohort
of pupils one year younger than those involved in the trial. This goes some way to ensuring a long-term control group. However, it may still be the case that the longer-term outcomes of trial participants in control schools are affected by the teacher training provided later.
Fidelity was a prime consideration throughout the project. CfL has a negotiated, partnership approach
to working. The standardisation desired for this evaluation was a challenge both in regard to how the
project was implemented in schools and how the schools interacted with the intervention.
Standardisation was negotiated prior to the intervention going live. All facilitators were involved in the
production of the presentation and handbooks for the animation project. They also attended at least
one of the staff training days so that parallels could be drawn wherever possible. This strengthened
fidelity of the intervention as it was rolled out, ensuring all members of the team were up to date on the
different elements of the project.
The Durham team undertook fidelity checks. These included:
• Random spot checks of implementation (staff training days, animation project sessions and data collection elements)
• Facilitator diaries of implementation
• Detailed examination of three schools’ experiences of the intervention, including video recording of all sessions.
The main challenges to fidelity arose from:
• Changes in school staff – difficulties associated with this were largely overcome via consistent use of course materials: handbook, films and course structure
• Specific aspects of different school environments, including:
  o Scheduling of the animation project within or outside the normal school day
  o The role and commitment of the staff member allocated to the project
  o The classroom environment allocated to the project
  o Communication about the project by the lead contact to other staff members, students and families.
2.4 Outcomes
The primary outcome measure was pupil attainment in literacy and numeracy, as captured by InCAS assessments. This was constructed as a combination of three InCAS measures: reading, general maths and mental arithmetic.3 These component measures were treated as secondary outcomes, alongside a measure of pupils’ feelings about their relationship with their parents (recorded using a subset of the Self-Description Questionnaire (SDQ; Marsh 1992)) and their metacognition (captured using a Pupil Views Template, PVT).4 Further details about each of these are provided below.
3 The primary outcome was calculated as (reading score + (general maths score + mental arithmetic score)/2)/2, in order to give equal weight to reading and numeracy.

4 The data collected in the course of the trial contain the required identifiers to link to the National Pupil Database (NPD). Doing so would allow the impact on longer-term outcomes (Key Stage results) to be estimated.
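To make the weighting concrete, the primary outcome calculation in footnote 3 above can be expressed as a short function (a minimal sketch; the function name and the assumption that all three inputs are age-equivalent scores in months are ours, not the report’s):

```python
def primary_outcome(reading, general_maths, mental_arithmetic):
    """Combined score from footnote 3: numeracy is the mean of the two
    maths scores, and reading and numeracy are then weighted equally."""
    numeracy = (general_maths + mental_arithmetic) / 2
    return (reading + numeracy) / 2

# Example: age-equivalent scores of 102, 98 and 100 months
# give (102 + (98 + 100) / 2) / 2 = 100.5
print(primary_outcome(102, 98, 100))
```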
Education Endowment Foundation
11
Methodology
• InCAS5: a diagnostic, computer-adaptive assessment tool used to measure reading (word recognition, word decoding, comprehension and spelling) and mathematics (general mathematics and mental arithmetic) attainment. These tests were chosen due to their suitability for primary aged children, their administration via computer program (which increases fidelity) and their established credibility within practice and research communities. The measure used was the pre-test and 6-month delayed post-test age-equivalent score. The scores are measured in months. Pupils performing as expected for their age will, on average, have an age-equivalent score equal to their chronological age. Pupils performing below (above) expectations will have an age-equivalent score below (above) their chronological age.
• SDQ: one factor was extracted from the SDQ to measure changes in pupils’ beliefs about their relationship with their parents. This is an established instrument for use with primary aged children. The single factor was extracted from the wider questionnaire6 for ease of administration – 9 questions rather than 62. This measure was administered three times: pre-test, immediately post-test and 6 months post-test. The analysis combined component SDQ responses into a Likert score: valid responses were recoded to run from 0 to 3 (instead of 1 to 4) and then summed (see the sketch after this list). Given its negative and different wording, the second SDQ question was excluded from the calculation of the Likert score. Low scores suggest that the child views the relationship with his/her parents as being ‘more problematic/worse’, while higher scores suggest the child views it as ‘less problematic/better’.
• PVTs: pupil views templates were used to measure pupils’ metacognitive awareness. This method – developed by one of the co-authors of this report (Wall and Higgins 2006) – is a visual tool using a cartoon image of a learning scenario the students will recognise (in this case, an adult and child working together with a laptop) and speech and thought bubbles to prompt the child’s reflections on the learning. The tool is well liked by teachers and is established in the research community as eliciting young children’s thinking about learning. PVTs were used twice: pre-test and immediately post-test. The main outcomes for the PVT are represented by five measures: information gathering, building understanding and productive thinking (representing cognitive skills; Moseley et al. 2005) and metacognitive knowledge and metacognitive skilfulness (representing the metacognitive domain; Veenman et al. 2005). For more information on this process see Appendix 11.
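As referenced in the SDQ bullet above, the Likert scoring rule can be sketched as follows (illustrative only; the data structure and function name are our assumptions, not the trial’s code):

```python
def sdq_likert_score(responses):
    """Likert score for the 9-question SDQ subset described above.

    `responses` maps question number (1-9) to a raw response coded 1-4.
    Question 2 is excluded because of its negative and different wording;
    the remaining valid responses are recoded to 0-3 and summed.
    """
    return sum(r - 1 for q, r in responses.items() if q != 2)

# A pupil answering 4 to every question scores 8 * 3 = 24,
# the 'less problematic/better' end of the scale.
print(sdq_likert_score({q: 4 for q in range(1, 10)}))
```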
All measures were administered by the schools under guidance from CfL and Durham University
teams. This guidance included training days run by CfL, Durham University and the CEM Centre (the
latter being the owners and administrators of the InCAS tests). The computer-based and computer-adaptive nature of the InCAS tests helped to ensure that the process for this measure was standardised and effectively blinded (since no pupils received the same test and the marking is automated rather than carried out by the schools). The SDQ and the PVT were sent as email attachments to the schools with clear instructions on how to use them. Members of the project team were available by email and telephone to handle any questions around administration.
The InCAS data were uploaded by the schools to the CEM Centre, where they were processed. The
resulting scores were then passed onto the Durham team to be matched with the pupil record. The
SDQ data were sent in paper form to the Durham team. The scores were entered blind by two
researchers separately, moderated by a team leader. To check the fidelity of the coding process, the
two sets of entered scores were compared against each other and any anomalies checked against the
original paper versions. The PVT data were transcribed and the statements coded blind using a
deductive coding scheme identifying children’s declarative cognitive and metacognitive awareness.
This coding process was checked for inter-rater reliability using Cohen’s Kappa, with an agreement of 84%. In addition, 20% (n=75) of the total sample was double coded by another researcher, and an intra-rater agreement of 98% was achieved. Further information can be found in Appendix 11.
5 http://www.cem.org/incas/introduction

6 See Appendix 1 for the full questionnaire.
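For readers who want to reproduce the kind of reliability check described above, Cohen’s Kappa can be computed from two coders’ category labels, for example (a sketch with invented labels, not the trial’s data):

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical labels two coders might assign to the same PVT statements.
coder_a = ["metacognitive knowledge", "productive thinking", "information gathering",
           "metacognitive skilfulness", "building understanding"]
coder_b = ["metacognitive knowledge", "productive thinking", "building understanding",
           "metacognitive skilfulness", "building understanding"]

# Kappa corrects raw percentage agreement for agreement expected by chance.
print(cohen_kappa_score(coder_a, coder_b))
```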
2.5 Sample size
Sample size calculations were carried out by Durham University. The aim of the project was to recruit 50 schools to the study. A minimum detectable effect size (MDES) of 0.42 was estimated, based on 25 schools randomised into intervention and control, 30 children per cluster, a 0.05 significance level, 0.8 power and a 0.25 intra-cluster correlation. This is a conservative ICC, and the calculation does not factor in stratification, which increases the power. The effect size was predicted to be in the range of 0.35 to 0.45 standard deviations.7
The actual MDES of Mind the Gap as whole pertaining to the analysis sample for the primary outcome
was affected by school dropout and missing data. As reported later, the analysis was based on 22
classes rather than 50, which increased the MDES. On the other hand, the intra-cluster correlation
was smaller (0.165) than had been assumed. Furthermore, taking account of the proportion of
variation accounted for by regressors and blocking also reduces the MDES. At the pupil level, the
adjusted R-squared was 0.70 when including pupil-level regressors, while at the school level the
inclusion of blocking variables resulted in an adjusted R-squared of 0.34. The combined effect of these
factors was a MDES of 0.45, slightly higher than at the design stage.
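For reference, the design-stage figure of 0.42 is reproducible from a standard cluster-randomised MDES approximation (a sketch under our assumptions about the formula used; Durham’s exact calculation is not shown in the report):

```python
import math

def mdes(total_clusters, cluster_size, icc, p_treated=0.5, multiplier=2.8):
    """Approximate MDES for a two-arm cluster-randomised trial.

    multiplier ~ 2.8 corresponds to a two-sided 0.05 significance level
    and 0.8 power (z_0.975 + z_0.80). No allowance is made here for
    stratification or covariates, matching the conservative
    design-stage calculation described above.
    """
    denom = p_treated * (1 - p_treated) * total_clusters
    variance = icc / denom + (1 - icc) / (denom * cluster_size)
    return multiplier * math.sqrt(variance)

print(round(mdes(50, 30, 0.25), 2))  # 0.42, the design-stage estimate
```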
2.6 Randomisation
NIESR randomised schools within blocks defined by area and by the proportion of pupils in each
school shown in the 2011 school performance tables to achieve Level 4 or higher at Key Stage 2 in
both English and maths (low, medium, high – where these thresholds were chosen to achieve equal
sized groups in each area). This resulted in 12 blocks (or strata).
Randomisation of schools (to achieve a 50:50 allocation) was performed as follows (see the sketch after the class-randomisation list below):
• Each school was assigned a randomly generated number
• Schools were sorted by block and, within each block, by the random number
• The first school was randomised
• Each subsequent school was assigned the opposite outcome to the previous school.
Randomisation of classes (also by NIESR) was performed as follows:
• In intervention schools that had a single Year 4 class, that class was assigned to be eligible for the parental engagement course
• In intervention schools that had two Year 4 classes, one class was randomly assigned as the PETT class and the other as the TT class
• In intervention schools that had three Year 4 classes, a randomly selected class was excluded from the trial and randomisation of eligibility then proceeded as in the two-class case
• In control schools that had a single Year 4 class, that class was selected for the trial
• In control schools with multiple Year 4 classes, the class to be included in the trial was selected at random; the other classes played no further role in the trial and were not requested to provide data.
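The school-level procedure can be illustrated as follows (a minimal sketch; the data structures, field names and seed are our own assumptions, not NIESR’s code):

```python
import random

def randomise_schools(schools, seed=2012):
    """Block randomisation by sorted random numbers with alternation.

    `schools` is a list of dicts with 'name' and 'block' keys, where a
    block combines area and KS2 attainment band.
    """
    rng = random.Random(seed)
    for school in schools:
        school["rand"] = rng.random()  # each school gets a random number
    ordered = sorted(schools, key=lambda s: (s["block"], s["rand"]))
    arm = rng.choice(["intervention", "control"])  # first school randomised
    allocation = {}
    for school in ordered:
        allocation[school["name"]] = arm
        # each subsequent school takes the opposite outcome of the previous one
        arm = "control" if arm == "intervention" else "intervention"
    return allocation
```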
In practice, there were two factors that disrupted this experimental design. First, some schools
dropped out after randomisation. The extent of this is shown in Figure 3. Four schools dropped out
before delivery started and the remainder dropped out at some point after the programme started. It is
possible, although not inevitable, that this might cause experimental estimates to be biased. Although we cannot say anything about the extent of any such bias, it is equally true that we can no longer rely on the statistical properties of the RCT to argue that the resulting estimates are unbiased.

7 Note that the EEF Teaching and Learning Toolkit reports a high average effect size for metacognitive approaches (http://educationendowmentfoundation.org.uk/toolkit/meta-cognitive-and-self-regulation-strategies/).
The second factor concerns assignment of classes. Classes were identified according to who their
teacher was intended to be. Randomisation therefore effectively assigned teachers to one of the three
arms of the trial (PETT, TT, Control). This has the benefit that teacher effects are randomised along
with classes, which prevents schools then allocating teachers according to which class is allocated to
which treatment. However, due to teacher turnover, the teacher information that existed at the time of
randomisation had, in some cases, changed by the time of implementation. Where this had occurred, it was not always possible to implement the class randomisation as intended.8 In view of this, we use
class status as delivered in practice. This does not adhere perfectly to the randomised treatment
status but there is agreement in the majority (78.4%) of cases.
2.7 Analysis
The impact analysis was carried out using multilevel regression models to reflect the clustered nature
of randomisation. Apart from the regressors indicating treatment status, models controlled for pupils’
age and sex, their EAL, SEN and FSM status, their baseline (pre-test) outcome and the length of time
between pre-test and post-test. In addition, block dummies were included in all estimations.
The overall effect of Mind the Gap (comparing PETT to control classes) was estimated following the
intent-to-treat principle. However, this was only possible up to a point since schools that dropped out
did not provide any outcome information. Unfortunately, as noted above, the practical circumstances
surrounding class randomisation meant that it was not possible to follow the intent-to-treat principle
when estimating the impact of components of Mind the Gap (comparing PETT classes to TT classes
to isolate the impact of PE eligibility; or comparing TT to control classes to estimate the impact of TT).
Instead, we distinguish PETT classes from TT classes according to whether in practice they were
offered the opportunity to participate in the animation course.
Subgroup analysis on the primary outcome for the overall effect of Mind the Gap was carried out for
FSM pupils and for pupils with low pre-test scores. This latter definition was operationalised by
regarding everyone with a pre-test score in the lowest third of the distribution as having a low pre-test
score for the outcome in question. The protocol for the trial identified some additional intended subgroup analyses. These included: ethnic minority children; children with a low estimated probability of participation in the animation course; sub-floor-standard schools; and faith schools. However, lack of
required information (in the case of the ethnic minority subgroup) and the small size of the achieved
sample made such analyses problematic.
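As an indication of what the multilevel specification described above might look like in practice (a sketch only: the variable names, grouping level, synthetic data and use of statsmodels are our assumptions, not the evaluators’ actual code):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic pupil-level data standing in for the trial dataset.
rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "school_id": rng.integers(0, 20, n),
    "age": rng.normal(108, 4, n),        # age in months
    "female": rng.integers(0, 2, n),
    "eal": rng.integers(0, 2, n),
    "sen": rng.integers(0, 2, n),
    "fsm": rng.integers(0, 2, n),
    "pretest": rng.normal(100, 10, n),   # baseline age-equivalent score
    "gap_days": rng.normal(270, 20, n),  # days between pre- and post-test
    "block": rng.integers(0, 12, n),     # randomisation block
})
df["treated"] = df["school_id"] % 2      # treatment varies at school level
df["posttest"] = df["pretest"] + rng.normal(6, 5, n)

# Multilevel model: a random intercept for school reflects the clustered
# randomisation; treatment status, pupil characteristics, baseline score,
# test gap and block dummies enter as fixed effects.
model = smf.mixedlm(
    "posttest ~ treated + age + female + eal + sen + fsm + pretest"
    " + gap_days + C(block)",
    data=df,
    groups=df["school_id"],
)
print(model.fit().summary())
```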
2.8 Process evaluation methodology
The process evaluation was completed by the Durham University team and used data from three
tools:
• Participant teacher interviews (n=18): interviews were completed with teachers whose class had been offered the animation project. These teachers had also received the metacognition training. Interviews were completed at least 3 weeks after the intervention. Interviews focused on teachers’ experiences of the intervention and explored issues around home–school relationships, the sustainability of the intervention elements and perceived impacts. The interview schedule is included in Appendices 8 and 9.
8 In some of these cases it was still straightforward to adhere to the randomisation outcomes. For instance, no real issue arose with one-form entry schools; in two-form entry schools where only one teacher had changed, it was reasonable to stick with the randomisation outcome for the known teacher and to apply the alternative status to the unknown teacher.
Education Endowment Foundation
15
Methodology
• Family interviews (n=16): interviews with the families who participated in the animation course were carried out 6 months after the intervention. These were semi-structured (see Appendix 10 for the interview schedule), but also used the family’s handbook (which recorded the learning journey through the animation project) and animated film as aides-mémoire. Interviews focused on each family’s experience of the intervention, their relationship with learning, the perceived impacts on family members and the sustainability of these impacts.
• Video observations (n=3): three schools – one each from Birmingham, Devon and London – were followed throughout their experience of the animation project. Video data allowed fidelity checks as well as case study exploration of the process experienced by the schools and families. These observations targeted the intervention itself and participants’ behaviours within the project experience.
A staged model of analysis was used. In the first phase, data collection was observed and team
members discussed initial impressions. Once the data collection was complete, it was viewed and/or
read in its entirety for further impressions and interpretations. Multiple researchers completed these
stages, allowing the team to discuss, confirm and triangulate emerging ideas. From this, themes with
which to structure the analysis stage were identified. In that stage, the data were interrogated in depth
to explore key elements within each of the identified themes. This aimed to capture the bigger picture
as well as the details typifying each theme.
The design of the project was mixed method, in that both qualitative and quantitative data were
collected in a complementary design to answer the research questions. Process data were collected
from a subset of schools. These were analysed in their own right but also, along with data such as
facilitator diaries and evaluations, could triangulate the outcome measures. The research team
undertook an analysis process whereby emerging findings were checked against the outcome
measures and the outcome database was interrogated with findings of the process analysis in mind.
3. Impact evaluation
3.1 Timeline
Schools were recruited to the trial between April 2012 and April 2013. School randomisation was
staggered; Devon and Haringey schools were randomised in July 2012, Birmingham schools were
randomised in November 2012 and Manchester schools were randomised in April 2013.
The teacher training for intervention schools started in September 2012 and finished in October 2013.
This consisted of training at the start of the programme in September/October depending on
geographical area, then ‘on the job’ training shadowing the project in their school when it took place. In
September/October 2013, a post-course training session took place.
Due to the distribution of the schools across different areas, the animation sessions and the tests took
place at different times. The engagement process for the first set of intervention schools to recruit
families onto the programme began in September 2012. The implementation timeline is depicted in Figure 2.9 Schools were divided into five ‘sets’. Within each set, there were intervention schools and
control schools. The animation course was delivered to participating families at Intervention schools.
Pre-tests were carried out at about the same time, although there was some variation across the sets
in the extent to which this was achieved. Testing of pupils in intervention and control schools within
each set followed a similar schedule.
Figure 2. Implementation timeline

[Gantt-style chart showing, for each of the five sets of schools (Set 1 to Set 5), the periods over which the InCAS tests, the SDQ and the animation course were delivered.]
9 In Figure 2, the dates of the animation course were taken from records kept by the delivery team. The assessment dates (InCAS and SDQ) were recorded in the evaluation database. As such, they are subject to missing values.
Education Endowment Foundation
17
Impact evaluation
Depending on the geographical area, the pre-test was administered between October 2012 and
November 2013. Specifically, the InCAS pre-test was administered between October 2012 and July 2013, while the pre-test for the SDQ was administered between October 2012 and November 2013. The post-test was administered between June 2013 and June 2014. Again, this varied across areas.
From Figure 2, one might be concerned about pre-tests taking place after pupils had started their
animation course. The trial was designed so that pre-tests would be carried out prior to the start of the
animation sessions but this was not always possible. Reasons for this were varied and included
school staff changes, technological challenges, teacher illness, teacher pregnancy and child illness. In
fact, the majority of pre-tests for pupils in PETT classes who also provided 6-month post-test InCAS
outcomes (and so featured in the impact analysis for the primary outcome) did take place prior to the
start of the animation sessions; the reading, maths and mental arithmetic pre-tests took place after the
start of the course in only 16, 11 and 16 cases, respectively. Furthermore, the maximum ‘lateness’ in
any case was only 7 days. This means that they would have only experienced the introductory
animation session before being assessed. It seems unlikely that this could have any effect on pre-test
scores. For the SDQ outcome, all pupils who were in a PETT class and who also provided 6-month
outcomes (and so featured in the impact analysis) had their pre-test before the animation course
started, apart from one individual whose pre-test took place two days later.
3.2 Participants
Recruitment
The initial recruitment strategy used to engage schools onto the programme was to target and
approach the schools through intermediaries. This included networks of heads and clusters and local
authority support services.
In Haringey all schools were selected and invited through the School Standards Service. This proved
extremely successful, as all but one of the schools that were approached were recruited to the project.
Sixteen schools were approached through this mechanism.
In Devon the project was promoted through the Devon Association of Headteachers. An invitation was
issued through their weekly newsletter and schools were asked to contact the project team to express
an interest in the project. Twelve of the Devon schools were recruited directly through this route and
the additional schools were engaged by the project team, using recommendations and contact
information provided by the initial ‘self-referral’ school cohort.
In Birmingham, the lead of a large school cluster approached the project team, with a ‘ready-made’
group of schools to join the project. However, take-up was low and, despite direct visits and numerous
emails and telephone calls, fewer than eight schools were recruited through this cohort. The project
team did additional work in the local area, including promoting the project through an introductory
breakfast session for schools and speaking at a local cluster event. Despite this, the team was
unsuccessful in recruiting the required number of Birmingham schools. A decision was therefore
taken, in conjunction with EEF, to extend the geographical reach of the project to include schools in
Manchester. Emails were sent directly or via contacts to 34 Manchester-based schools. Six expressed
interest in receiving the programme.
Once schools expressed an interest in the project, the process was similar in each geographical area.
A telephone call was made to each school to discuss the programme in more detail. This was followed
up by email and, where practical, face-to-face meetings were held with school staff, including senior
leaders wherever possible. At this point schools signed a brief document confirming their decision to
participate in the project and identifying their two nominated key contacts. The individual project officer
assigned to the school then followed this up to introduce themselves and formally welcome the school
to the project. There were no events explicitly designed for recruitment purposes, but there were
subsequent training events, also attended by the Durham research team who described the evaluation
process at length, answered questions and provided contact details for further queries, building on the
information on the evaluation process provided by CfL in their recruitment discussions and meetings.
Numbers in the trial
Figure 3 summarises the number of schools, classes and pupils involved in the trial. Of the 51 schools
that agreed to participate, 25 were allocated to the Intervention group and 26 to the Control group. Of
the 25 Intervention schools, 11 were one-form entry, while 14 were of two- or three-form entry.10 All classes in one-form entry schools were designated PETT classes. In the two- and three-form entry schools, one class was randomly selected as the PETT class and one as the TT class. This means that the overall effect of Mind the Gap can be estimated using data from all schools (there will be a PETT class in all intervention schools) while the effect of TT alone and the effect of PE alone can only be estimated in larger schools (i.e. not one-form entry schools). This is dealt with in the analysis by
restricting the sample to two- and three-form entry schools when estimating the latter two impacts.
However, as noted already, there was non-compliance with the randomised treatment at the class
level (there was no non-compliance at the school level) and, in cases where teachers had changed, it
was not even possible to impose the randomised outcomes. Figure 3 does not therefore identify the
class allocation as random. Instead, it reports the de facto treatment status: 25 PETT classes and 14
TT classes.
School dropout was a problem. Of the Intervention schools, two dropped out, one before and one after
the start of the programme. The reasons given to the CfL team for dropping out were changes in
staffing (resulting in a feeling that the school could no longer commit to the project) or a prior lack of
awareness of the amount of work the tests would involve. Numerous attempts were made to retain
schools – offering additional support, reminding them of their commitment, offering to discuss ways of
helping – but these were not successful. Of the control schools, four dropped out before the
programme had started and five dropped out later. This left 23 and 17 schools involved in the
intervention and control groups, respectively.
There were a number of additional deviations from the design. Two control schools provided data for
two classes rather than one. This resulted in there being 19 classes for 17 schools. In one intervention
school both classes received the animation sessions, resulting in there being one additional PETT class and one fewer TT class. Additionally, four intervention schools provided outcomes for the PETT class
but none for the TT class. This contributed to the unbalanced number of observations between the
PETT and the TT classes – 635 and 377 pupils, respectively – compared to 543 in control classes.
The impact estimates were based on fewer observations still due to missing post-tests. Figure 3
provides details (in the bottom boxes) of the number of observations available for each outcome.
Some missing data arose for pupil-specific reasons (such as absence due to illness) and
may be felt to be unrelated to treatment and outcomes. More problematic was the case of whole
classes missing outcome data. Such cases have a more negative effect on the power of the trial and
may also introduce bias if there is a systematic relationship with treatment status and outcomes. The
bottom boxes in Figure 3 show, for each outcome, two numbers. The first is the number of classes for
which the (6-month) outcome in question is observed for at least one pupil. The second is the number
of pupils, across all classes, for which the outcome is observed.
Figure 3. Flowchart of sampling, allocation and attrition

Note: ns = number of schools; nc = number of classes; np = number of pupils.

Schools invited to participate: N ≈ 200
Schools agreeing to participate and randomised: ns = 51

Intervention (ns = 25):
  One-form entry schools (ns = 11): all classes PE & TT (nc = 11)
  Two- or three-form entry schools (ns = 14): selected classes PE & TT (nc = 14) and TT (nc = 14)
  Allocation: PE & TT nc = 25; TT nc = 14
  Dropout: 2 schools (1 before the programme started, 1 after)
  Remaining schools: ns = 23

Control (ns = 26; nc = 26):
  Dropout: 9 schools (4 before the programme started, 5 after)
  Remaining schools: ns = 17

Observed (non-missing) 6-month outcomes (nc/np):
  PE & TT classes: Reading 12/227; Arithmetic 12/238; General maths 12/233; SDQs 9/170; PVT 17/261
  TT-only classes: Reading 4/83; Arithmetic 4/85; General maths 4/83; SDQs 4/64; PVT 5/76
  Control classes: Reading 14/311; Arithmetic 14/326; General maths 14/338; SDQs 13/287; PVT 13/329
Parental/pupil engagement
In addition to working with schools, the intervention relied on buy-in from parents. An engagement
process lasting approximately 12 weeks was used to recruit families onto the course (there was some
slight variation in timescales due to term dates). The first stage was a letter sent home to all families
within the target class (average of 29 per class, i.e. approximately 1305 families in total). This was
followed approximately six weeks prior to the start of the course by a class taster session working with
the entire target class. This involved explaining more about the project, a short animation group-work
task and the completion of ‘Wanted Posters’ – personal invitations from the children to their parents to
attend the course. These were accompanied by a second letter home. Four weeks prior to the start of
the course, parents were invited to come to a parent and child taster session, to find out more about
the project and complete a short animation task. Three weeks prior, class teachers were encouraged
to speak to parents on the playground to increase engagement. Reminder letters were sent in each of
the two weeks prior to the course start date, reminding families about the project. In some schools this
was supplemented by text messages home to families.
Table 1a summarises the number of animation sessions attended by pupils in the PETT classes. About 72% of pupils (458) did not attend any animation session. Of the 177 pupils who attended at least one session, 58% (102) attended all five. This is lower attendance than had been hoped for and expected. Reasons for this low take-up included the difficulty of timetabling the animation sessions to fit with parents' work patterns and children's after-school activities, and a lack of prioritisation, commitment and support from some schools at the point of recruitment. The animation sessions targeted fathers and male carers because they are known to be more difficult to engage in school activities (for example, see Desforges and Abouchaar 2003), which are frequently identified with the role of the female carer or mother: in 2010–11, 87.7% of those attending family learning were female (see page 39 of http://shop.niace.org.uk/media/catalog/product/n/i/niace_family_learning_report_reprint_final.pdf). However, where male carers and parents were not available, there was an option for a female parent or carer to take part, so as to be inclusive. While targeting male adults may have played a part in the lower-than-hoped-for attendance, this was at times overcome through flexibility towards gender inclusion, depending on how the project was communicated to and understood by the audience. More fundamentally, the targeted schools serve communities facing a variety of challenges, where parental engagement activities will often be viewed with some suspicion. Ways in which recruitment could be improved are discussed more extensively in the process evaluation.
Table 1a. Number of animation sessions attended in PETT classes

Number of sessions attended   Frequency   Percent
0                             458         72.13
1                             13          2.05
2                             15          2.36
3                             14          2.20
4                             33          5.20
5                             102         16.06
Total                         635         100
Additional insight is possible by examining the factors associated with participation among those eligible for the animation sessions. The results of estimating a probit model of participation are presented in the first column of Table 1b. The variables included in the model were selected by stepwise reduction, beginning with the range of school and pupil background characteristics considered in the impact estimates discussed later. To make the coefficients more interpretable, the 'Prob. low' column gives, for continuous variables, the probability of participation at the 25th percentile of the variable in question; the 'Prob. high' column does similarly for the 75th percentile; and the final column reports the change in the probability of participation associated with a move across the interquartile range. The exception to this format is the dummy variable 'Missing reading age at baseline', which takes values zero and one; accordingly, the 'Prob. low' column shows the estimated probability when zero, the 'Prob. high' column shows the estimated probability when one, and the final column shows the change in probability when moving from zero to one.
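To make the mechanics concrete, the sketch below shows how such 'Prob. low'/'Prob. high' figures can be computed from a fitted probit model. This is a minimal illustration, not the evaluators' code; the file and column names are hypothetical.

```python
# Minimal sketch (not the evaluators' code) of the Table 1b calculation:
# fit a probit model of participation, then predict the participation
# probability with one covariate at its 25th or 75th percentile and the
# remaining covariates held at their means. Names are hypothetical.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("pett_pupils.csv")  # hypothetical pupil-level analysis file
covars = ["mental_arith_age", "missing_reading", "school_sen_pct", "school_fsm_pct"]
X = sm.add_constant(df[covars])
fit = sm.Probit(df["participated"], X).fit()

base = X.mean()  # covariate means (const = 1)
for q, label in [(0.25, "Prob. low"), (0.75, "Prob. high")]:
    row = base.copy()
    row["school_fsm_pct"] = df["school_fsm_pct"].quantile(q)
    prob = fit.predict(row.to_frame().T)[0]
    print(f"{label} (school FSM percentage): {prob:.3f}")
```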
Three points are apparent from Table 1b. The first is that the single personal characteristic (mental arithmetic at baseline), while statistically significant, does not have a dramatic effect on participation. (The term 'effect' is used loosely here; clearly, these are associations rather than necessarily causal relationships.) Not having data on reading level at baseline has a bigger effect. This is hard to interpret, but it is relevant to point out that missing data could arise for reasons associated with the pupil or with the class/school. It is conceivable, for instance, that schools less committed to the research may be less likely to provide baseline assessments and also less likely to promote the intervention. In this vein, the second point is that school characteristics appear to be more important than pupil characteristics, in the sense that the differences shown in the final column are much larger for the last two variables than for the first two. Third, participation is higher when the school's SEN percentage is higher, and lower when the FSM percentage is higher. This last effect appears particularly strong and suggests that promoting parental engagement is especially challenging in schools serving poorer areas. It is interesting to note that pupils' individual FSM status was not found to be statistically significant.
Table 1b. Factors associated with participation in the animation course: results of estimating a probit model

                                    Coeff.     Prob. low   Prob. high   Prob. diff
Mental arithmetic age at baseline    0.003*     0.301       0.324        0.023
                                    [0.001]    [0.021]     [0.028]
Missing reading age at baseline      0.432**    0.237       0.368        0.131
                                    [0.152]    [0.020]     [0.037]
School SEN percentage                0.097**    0.228       0.370        0.142
                                    [0.020]    [0.017]     [0.025]
School FSM percentage               -0.030**    0.422       0.171       -0.251
                                    [0.004]    [0.028]     [0.017]
Constant                            -0.940**
                                    [0.247]
N                                    635

* p<0.05, ** p<0.01.
3.3 School and pupil characteristics
Tables 2a to 2c present the characteristics of the 51 schools included in the trial. Table 2a shows that the geographical distribution of the schools was balanced across the intervention and control groups; Fisher's exact test shows no statistically significant difference between intervention and control schools.
Table 2a. Geographical distribution of schools

                          Intervention              Control
Geographical distribution Frequency   Percentage    Frequency   Percentage
Devon                     8           32.0          8           30.8
London                    8           32.0          8           30.8
Birmingham                7           28.0          8           30.8
Manchester                2           8.0           2           7.7
Total                     25          100           26          100

Fisher's exact test p-value = 1.000
Table 2b shows the mean characteristics of the schools recruited for the trial. Again, the intervention and control schools look similar. The average number of pupils is 286 in intervention schools and slightly higher (320) in control schools, though not significantly so. In both intervention and control schools, about 10% of pupils have a Special Educational Needs (SEN) statement or are on School Action Plus, and for about one third of pupils English is not the first language (EAL). About 35% of pupils are eligible for free school meals (FSM) and about 70% achieved Level 4 or higher in reading, writing and maths.
Table 2b. Characteristics of pupils in recruited schools

                                               Intervention   Control    Difference
                                               (N=25)         (N=26)     (N=51)
Number of pupils                               286            320.0      -33.96
                                               (138.8)        (108.3)    [34.787]
Percentage of pupils with SEN statement or     10.18          9.735      0.44
on School Action Plus                          (3.863)        (4.563)    [1.186]
Percentage of pupils with English not as       31.35          36.18      -4.84
first language                                 (28.53)        (30.77)    [8.319]
Percentage of pupils eligible for free         34.84          34.76      0.07
school meals                                   (18.37)        (15.92)    [4.807]
Percentage achieving Level 4 or above in       69.24          70.96      -1.72
reading, writing and maths                     (17.38)        (11.41)    [4.101]
Ofsted rating                                  2.16           2.31       -0.148
                                               (0.624)        (0.736)    [0.191]

Notes: Based on School Performance Tables 2012, Department for Education. Standard deviation in (.). Standard error in [.]. Significance of the differences is on the basis of t-tests; * p<0.05, ** p<0.01.
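For readers replicating these balance checks, the sketch below computes a mean difference, its pooled standard error and a two-sample t-test, as used throughout Tables 2b to 4b. The arrays are simulated stand-ins, seeded to resemble the pupil-numbers row of Table 2b, not the trial data.

```python
# Sketch of the balance checks in Tables 2b-4b: difference in means, pooled
# standard error, and a two-sample t-test. The data are simulated stand-ins
# roughly matching the pupil-numbers row of Table 2b.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
intervention = rng.normal(286, 139, size=25)  # school sizes, intervention arm
control = rng.normal(320, 108, size=26)       # school sizes, control arm

n1, n2 = len(intervention), len(control)
sp2 = ((n1 - 1) * intervention.var(ddof=1)
       + (n2 - 1) * control.var(ddof=1)) / (n1 + n2 - 2)  # pooled variance
se = np.sqrt(sp2 * (1 / n1 + 1 / n2))                     # SE of the difference
t, p = ttest_ind(intervention, control)                   # pooled-variance t-test
print(f"difference = {intervention.mean() - control.mean():.2f}, "
      f"SE = {se:.2f}, p = {p:.3f}")
```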
Table 2c presents the number of PETT, TT and control classes. The smaller number of TT classes
arises from the fact, already discussed, that classes in one-form entry schools were always assigned
to the PETT group.
Table 2c. Distribution of classes by school

                                                          Intervention   Control
Parental Engagement and Teacher Training classes (PETT)   25
Teacher Training classes (TT)                             14
Control classes, of which:                                               26
  one-form entry                                                         7
  two- or three-form entry                                               19
Total                                                     39             26
Baseline pupil characteristics are presented in Tables 3a to 3c. The first three columns of Table 3a (under the overarching heading 'Randomised sample') are based on all pupils. These show that, in PETT and control classes, female pupils represent about 48% of all pupils, and the mean age of pupils is eight and a half years. The percentages of EAL, SEN and FSM pupils in control schools are 29%, 25% and 29%, respectively. In PETT classes, the corresponding proportions are slightly lower, but the differences are never statistically significant. The next three columns (under 'Analysis sample') present analogous results for the subsample of pupils who are included in the eventual analysis for the primary outcome. This is a smaller sample and there is evidence of some mismatch between the PETT and control classes; those in PETT classes are slightly older (in the analysis sample, at least) and considerably less likely to have English as an additional language. The primary outcome itself – literacy and numeracy – does not differ significantly across the two groups at baseline.
Table 3b follows a similar format. However, since classes within one-form entry intervention schools
are all PETT, TT classes can only exist in schools that are two- or three-form entry. Consequently,
Table 3b excludes one-form entry control schools in order to avoid reflecting systematic differences
between larger and smaller schools. Considering the ‘Randomised sample’ first, there is a significantly
higher FSM proportion among the TT classes but the other characteristics appear balanced. For the
‘Analysis sample’, those in TT classes are slightly older than those in control classes but, otherwise,
there are no significant differences.
Table 3a. Baseline pupil-level characteristics, PETT and control classes

                      Randomised sample                  Analysis sample
                      PETT      C         PETT-C         PETT      C         PETT-C
N schools             23        17                       10        12
Female                0.479     0.476     0.003          0.493     0.481     0.012
                      (0.500)   (0.500)   [0.029]        (0.500)   (0.501)   [0.046]
  N                   635       542       1,177          209       283       492
Age                   8.437     8.472     -0.035         8.505     8.456     0.049
                      (0.399)   (0.327)   [0.022]        (0.297)   (0.367)   [0.031]*
  N                   528       527       1,055          209       283       492
EAL                   0.243     0.292     -0.049         0.122     0.224     -0.102
                      (0.429)   (0.455)   [0.028]        (0.328)   (0.418)   [0.034]**
  N                   556       473       1,029          180       250       430
SEN                   0.214     0.252     -0.038         0.206     0.268     -0.062
                      (0.411)   (0.434)   [0.026]        (0.405)   (0.444)   [0.042]
  N                   556       473       1,029          180       250       430
FSM                   0.268     0.288     -0.020         0.272     0.296     -0.024
                      (0.443)   (0.453)   [0.028]        (0.446)   (0.457)   [0.044]
  N                   556       473       1,029          180       250       430
Literacy & numeracy   105.3     106.9     -1.584         108.32    107.17    1.141
                      (17.362)  (17.70)   [1.257]        (18.575)  (18.111)  [1.964]
  N                   360       425       785            139       230       369

Notes: Standard deviation in (.). Standard error in [.]. Significance of the differences is on the basis of t-tests; * p<0.05, ** p<0.01.
Table 3b. Baseline pupil-level characteristics, TT and control classes in two- and three-form entry schools

                      Randomised sample                  Analysis sample
                      TT        C         TT-C           TT        C         TT-C
N schools             10        11                       4         8
Female                0.457     0.469     -0.012         0.443     0.471     -0.028
                      (0.499)   (0.500)   [0.040]        (0.500)   (0.500)   [0.069]
  N                   267       369       636            70        208       278
Age                   8.433     8.445     -0.012         8.535     8.432     0.103
                      (0.391)   (0.344)   [0.033]        (0.293)   (0.384)   [0.050]*
  N                   184       354       538            70        208       278
EAL                   0.245     0.255     -0.010         0.205     0.229     -0.025
                      (0.431)   (0.437)   [0.043]        (0.408)   (0.421)   [0.070]
  N                   147       337       484            44        179       223
SEN                   0.211     0.273     -0.062         0.341     0.296     0.045
                      (0.409)   (0.446)   [0.043]        (0.479)   (0.458)   [0.078]
  N                   147       337       484            44        179       223
FSM                   0.354     0.252     0.102          0.318     0.257     0.061
                      (0.480)   (0.435)   [0.044]*       (0.471)   (0.438)   [0.075]
  N                   147       337       484            44        179       223
Literacy & numeracy   106.44    105.72    0.726          113.346   108.286   5.059
                      (19.412)  (18.691)  [2.043]        (18.004)  (18.421)  [2.903]
  N                   126       269       395            53        160       213

Notes: Standard deviation in (.). Standard error in [.]. Significance of the differences is on the basis of t-tests; * p<0.05, ** p<0.01.
Table 3c compares the characteristics of pupils in the PETT and the TT classes in two- and three-form
entry intervention schools. Here, the only statistically significant difference is that those in PETT
classes have slightly lower attainment at baseline than those in TT classes (in the analysis sample).
Table 3c. Baseline pupil-level characteristics, PETT and TT classes in two- and three-form entry intervention schools

                      Randomised sample                  Analysis sample
                      PETT      TT        PETT-TT        PETT      TT        PETT-TT
N schools             14        10                       6         4
Female                0.470     0.457     0.014          0.431     0.443     -0.012
                      (0.500)   (0.499)   [0.039]        (0.497)   (0.500)   [0.075]
  N                   406       267       673            116       70        186
Age                   8.454     8.433     0.021          8.537     8.535     0.002
                      (0.386)   (0.391)   [0.036]        (0.286)   (0.293)   [0.044]
  N                   313       184       497            116       70        186
EAL                   0.300     0.245     0.055          0.152     0.205     -0.052
                      (0.459)   (0.431)   [0.044]        (0.361)   (0.408)   [0.069]
  N                   367       147       514            92        44        136
SEN                   0.223     0.211     0.013          0.283     0.341     -0.058
                      (0.417)   (0.409)   [0.040]        (0.453)   (0.479)   [0.085]
  N                   367       147       514            92        44        136
FSM                   0.278     0.354     -0.076         0.348     0.318     0.030
                      (0.449)   (0.480)   [0.045]        (0.479)   (0.471)   [0.087]
  N                   367       147       514            92        44        136
Literacy & numeracy   104.39    106.44    -2.053         105.67    113.35    -7.679
                      (17.538)  (19.412)  [2.048]        (18.919)  (18.004)  [3.212]*
  N                   215       126       341            91        53        144

Notes: Standard deviation in (.). Standard error in [.]. Significance of the differences is on the basis of t-tests; * p<0.05, ** p<0.01.
Tables 4a and 4b report baseline levels of the secondary outcome variables for those individuals
where the corresponding post-test outcome is non-missing. Table 4a considers the InCAS scores for
reading, maths and mental arithmetic and the Likert score for the SDQ. Apart from mental arithmetic,
all other outcomes show some significant differences in at least one of the comparisons. Similarly,
Table 4b shows significant differences for three of the five PVT outcomes.
Table 4a. Pre-tests for InCAS and SDQ outcomes

                     1-, 2- or 3-form entry schools    2- or 3-form entry schools
                     PETT     C        PETT-C          PETT     TT       C        TT-C     PETT-TT
Reading              108.1    107.4    0.6             104.5    113.7    107.4    6.3*     -9.1**
                     (22.7)   (21.6)   [2.2]           (23.0)   (21.5)   (22.1)   [3.0]    [3.5]
  N schools          9        12                       5        4        8
  N pupils           157      285                      102      69       207
Maths                109.3    106.2    3.2*            107.1    110.3    106.0    4.3      -3.2
                     (15.8)   (15.7)   [1.5]           (16.4)   (17.0)   (15.8)   [2.2]    [2.5]
  N schools          9        12                       5        4        8
  N pupils           184      326                      114      68       230
Mental arithmetic    106.4    105.2    1.2             103.1    106.5    106.6    -0.1     -3.5
                     (20.1)   (21.4)   [2.0]           (21.8)   (19.9)   (21.2)   [2.9]    [3.2]
  N schools          9        11                       5        4        7
  N pupils           173      277                      108      72       188
SDQ                  19.1     20.2     -1.1            19.1     21.0     20.3     0.7      -1.9
                     (4.3)    (3.8)    [0.4]           (4.6)    (5.0)    (3.6)    [0.8]    [1.0]
  N schools          7        9                        5        2        7
  N pupils           147      237                      105      25       192

Notes: Standard deviation in (.). Standard error in [.]. Significance of the differences is on the basis of t-tests; * p<0.05, ** p<0.01.
Table 4b. Pre-tests for PVT outcomes

                          1-, 2- or 3-form entry schools    2- or 3-form entry schools
                          PETT     C        PETT-C          PETT     TT       C        TT-C     PETT-TT
Building understanding    8.0      12.9     -4.9**          8.8      12.0     13.4     -1.4     -3.2*
                          (9.1)    (13.2)   [1.0]           (9.3)    (12.5)   (13.3)   [1.9]    [1.6]
  N schools               14       11                       11       5        9
  N pupils                215      287                      137      62       243
Information gathering     13.9     14.3     -0.4            12.9     11.8     13.2     -1.3     1.1
                          (8.4)    (11.4)   [0.9]           (8.2)    (9.8)    (10.7)   [1.5]    [1.3]
  N schools               14       11                       11       5        9
  N pupils                215      287                      137      62       243
Productive thinking       4.1      6.5      -2.4*           4.7      7.5      7.3      0.2      -2.8
                          (8.3)    (12.3)   [1.0]           (9.0)    (10.4)   (13.1)   [1.8]    [1.4]
  N schools               14       11                       11       5        9
  N pupils                215      287                      137      62       243
Metacognitive knowledge   1.1      3.2      -2.1**          1.5      2.6      3.7      -1.1     -1.1
                          (3.4)    (9.5)    [0.7]           (4.0)    (5.5)    (10.2)   [1.3]    [0.7]
  N schools               14       11                       11       5        9
  N pupils                215      287                      137      62       243
Metacognitive skilfulness 1.3      2.3      -1.0            1.6      1.4      2.6      -1.3     0.3
                          (4.5)    (6.8)    [0.5]           (5.2)    (5.4)    (7.2)    [1.0]    [0.8]
  N schools               14       11                       11       5        9
  N pupils                215      287                      137      62       243

Notes: Standard deviation in (.). Standard error in [.]. Significance of the differences is on the basis of t-tests; * p<0.05, ** p<0.01.
The baseline imbalances seen in Tables 3a to 4b raise concerns about the extent to which it is reasonable to view the trial as able to deliver unbiased impact estimates. However, it is important to note that the baseline data are themselves imperfectly recorded. In particular, there is a high incidence of missing values. Table 5 shows the proportion of missing baseline outcomes (or pre-tests) among pupils whose post-test outcome is not missing. The fact that there is considerable variation across the arms of the trial means that the imbalances discussed above may reflect both true imbalance and imbalance in the availability of data. Overall, the imbalances raise a concern about the reliability of the impact estimates but are not in themselves sufficient to conclude that the resulting impacts will be biased.
Table 5. Missing pre-tests (proportion missing among pupils with a non-missing post-test)

                     1-, 2- or 3-form entry schools    2- or 3-form entry schools
                     PETT     C                        PETT     TT       C
Reading & maths      0.33     0.19                     0.22     0.24     0.23
Reading              0.31     0.08                     0.23     0.17     0.09
Maths                0.21     0.04                     0.16     0.18     0.04
Mental arithmetic    0.27     0.15                     0.23     0.15     0.20
SDQ                  0.23     0.27                     0.30     0.63     0.23
PVT                  0.18     0.13                     0.06     0.18     0.13
3.4 Outcomes and analysis
The main results are presented in Tables 6a to 6c. For each outcome, there are three estimated
effects:
The overall effect of Mind the Gap (denoted PETT)
The effect of teacher training alone (denoted TT)
The effect of eligibility for the Parental Engagement animation course (denoted PE) for those
in schools where teachers received the training.
Table 6a presents impacts on the primary outcome as effect sizes. These are calculated by dividing
the estimated impact coefficients by the level 1 standard deviation from the respective multilevel
regression, and so control for covariates and the school-level random effect. The estimation results
are shown in detail in Appendices 12 and 13. These detailed results reveal that one of the maths
regressions and two of the SDQ regressions suggest no random effect. As a robustness test, linear
regressions were also estimated, clustering standard errors. These gave results that were essentially
identical to the multilevel results.
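The sketch below illustrates this effect-size construction under stated assumptions: a school random-intercept model fitted with statsmodels, with the treatment coefficient divided by the level-1 residual standard deviation, followed by the clustered-OLS robustness check mentioned above. File, variable and covariate names are hypothetical.

```python
# Sketch of the effect-size calculation described above (names hypothetical):
# fit a multilevel model with a school random intercept, then divide the
# treatment coefficient by the level-1 (pupil) residual standard deviation.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("analysis_sample.csv")  # hypothetical pupil-level file
ml = smf.mixedlm("posttest ~ treated + pretest + fsm + age",
                 data=df, groups=df["school_id"]).fit()
effect_size = ml.params["treated"] / np.sqrt(ml.scale)  # scale = level-1 variance
print(f"effect size = {effect_size:.3f}")

# Robustness check mentioned in the text: a linear regression with standard
# errors clustered at the school level.
ols = smf.ols("posttest ~ treated + pretest + fsm + age", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["school_id"]})
print(ols.summary().tables[1])
```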
Table 6a suggests no significant effect on literacy and numeracy of Mind the Gap as a whole, nor any
significant effect of its individual elements. This is also true for the FSM and low pre-test subgroups.
Similarly for attainment (reading, maths, mental arithmetic) and SDQ secondary outcomes, Table 6b
finds no statistically significant impacts. Table 6c presents the PVT outcomes. For three of the
outcomes – building understanding, information gathering and metacognitive knowledge – there is no
significant effect. However, significant impacts on productive thinking and metacognitive skilfulness
are evident. In both cases, Mind the Gap as a whole is not found to have a statistically significant
impact, nor is there a separate statistically significant effect of the teacher training component. But
eligibility for the animation course acts to increase metacognition as measured by these outcomes.
Table 6a. Estimation results: literacy and numeracy

               Effect   Effect   Standard   P-       95% confidence    N         N
               type     size     error      value    interval          Schools   Pupils
Full sample    PETT     -0.141   0.283      0.618    [-0.697, 0.414]   22        492
               TT        0.009   0.575      0.988    [-1.119, 1.137]   12        278
               PE       -0.252   0.165      0.126    [-0.576, 0.071]   6         186
Subgroups:
FSM subgroup   PETT     -0.265   0.544      0.626    [-1.332, 0.801]   17        123
Low pre-test   PETT      0.129   0.449      0.775    [-0.752, 1.010]   18        110
Subgroups:
Table 6b. Estimation results: reading, maths, mental arithmetic, SDQ

Outcome             Effect   Effect   Standard   P-       95% confidence    N         N
                    type     size     error      value    interval          Schools   Pupils
Reading             PETT     -0.069   0.249      0.782    [-0.558, 0.420]   23        538
                    TT        0.167   0.527      0.752    [-0.866, 1.200]   12        310
                    PE       -0.281   0.160      0.078    [-0.593, 0.032]   6         215
Maths               PETT     -0.167   0.219      0.445    [-0.596, 0.262]   23        571
                    TT       -0.253   0.422      0.549    [-1.079, 0.574]   12        322
                    PE       -0.043   0.152      0.777    [-0.341, 0.255]   6         219
Mental arithmetic   PETT     -0.207   0.201      0.305    [-0.601, 0.188]   23        564
                    TT       -0.161   0.409      0.694    [-0.963, 0.642]   12        321
                    PE       -0.053   0.155      0.734    [-0.358, 0.252]   6         225
SDQ                 PETT     -0.005   0.194      0.979    [-0.386, 0.376]   19        516
                    TT        0.206   0.212      0.332    [-0.210, 0.622]   12        318
                    PE        0.067   0.175      0.700    [-0.276, 0.411]   6         217
Table 6c. Estimation results: PVT outcomes

Outcome                     Effect   Effect   Standard   P-       95% confidence    N         N
                            type     size     error      value    interval          Schools   Pupils
Building understanding      PETT     -0.155   0.249      0.533    [-0.642, 0.332]   28        590
                            TT        0.180   0.329      0.584    [-0.464, 0.824]   15        354
                            PE       -0.129   0.190      0.495    [-0.501, 0.242]   12        222
Information gathering       PETT     -0.109   0.223      0.625    [-0.546, 0.328]   28        590
                            TT       -0.300   0.252      0.233    [-0.793, 0.193]   15        354
                            PE       -0.115   0.187      0.538    [-0.482, 0.252]   12        222
Productive thinking         PETT      0.264   0.216      0.220    [-0.158, 0.687]   28        590
                            TT        0.233   0.214      0.276    [-0.186, 0.652]   15        354
                            PE        0.503   0.193      0.009    [0.126, 0.881]    12        222
Metacognitive knowledge     PETT      0.113   0.171      0.510    [-0.223, 0.448]   28        590
                            TT        0.216   0.218      0.323    [-0.212, 0.644]   15        354
                            PE       -0.046   0.187      0.806    [-0.413, 0.321]   12        222
Metacognitive skilfulness   PETT      0.146   0.184      0.427    [-0.214, 0.505]   28        590
                            TT       -0.013   0.166      0.939    [-0.338, 0.313]   15        354
                            PE        0.583   0.192      0.002    [0.207, 0.960]    12        222

Note: The PVT outcomes are based on counts of specified words. As such, they include a high proportion of pupils scoring zero and, at the other extreme, a small number of pupils with very high scores. To assess the robustness of the findings to the highly non-normal distribution of outcomes, a mixed Poisson model was estimated. This too found significant effects on productive thinking and metacognitive skilfulness. To assess robustness to the high-scoring outliers, a separate analysis Winsorised the pre-tests and post-tests, top-coding at the 95th percentile of the respective distribution. Again, the significant results held.
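A minimal sketch of the two robustness checks described in the note above, under hypothetical variable names; the report's own model was a mixed Poisson, whereas for brevity this sketch fits a plain Poisson regression.

```python
# Sketch of the robustness checks in the note above (names hypothetical):
# (1) Winsorise the count outcomes and pre-test, top-coding at the 95th
#     percentile; (2) refit the count outcome with a Poisson regression
#     (the report used a mixed Poisson; plain Poisson shown for brevity).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("pvt_sample.csv")  # hypothetical PVT analysis file
for col in ["productive_thinking", "metacog_skilfulness", "pretest"]:
    cap = df[col].quantile(0.95)
    df[col + "_w"] = np.minimum(df[col], cap)  # top-code extreme scores

winsorised = smf.ols("productive_thinking_w ~ eligible + pretest_w", data=df).fit()
poisson = smf.poisson("productive_thinking ~ eligible + pretest", data=df).fit()
print(winsorised.params["eligible"], poisson.params["eligible"])
```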
Cost
Animation course
The pupil-level intervention would cost in the region of £1,950 (depending on whether software was
purchased or free software accessed and on the location). This includes recruitment and delivery for
up to 15 pupils and families. An alternative option is for schools to buy in a Train the Trainer course, receiving training and the resources to deliver the intervention themselves, at a cost of £800 plus equipment, courier fees and facilitator transport costs. Additional resources can be accessed electronically free of charge or in hard copy at low cost; for example, the printed Family Handbooks cost £4–£5 per family. Prices quoted are exclusive of VAT.
Computers and webcams, or tablets with built-in cameras, are needed, with animation software or apps installed; both free and paid-for versions are available. Schools also need materials with which to create models to animate, for example plasticine, Lego and craft items. Junk modelling materials can be used to promote creativity and ensure value for money.
Schools wanting to deliver the intervention themselves will need to familiarise themselves with the programme through a one-day training course. Session plans are available, but schools should allow planning time – approximately 12 hours per course – for adapting these to the specific needs of learners, arranging the logistics of running the course and engaging families. If 'buying in' CfL to deliver the programme, schools should still allow approximately 5 hours of staff time to make arrangements with the facilitators, distribute letters and evaluate the programme.
Schools contributed varying amounts of staff time to the programme, and incurred some limited supply
costs to release teaching staff to attend training. There was no capital expenditure required, although
some schools incurred small costs as they provided refreshments and additional craft resources.
Teacher training
The teacher training component of the course was typically £195 per delegate. On a per-child basis, this translates to a small additional cost, in the region of £7 (£195 spread across an average class of 29 pupils).
4. Process evaluation
4.1 Implementation
What are the necessary conditions for success of the intervention?
Many of the conditions for success apply to the set-up of the project, in particular communication of
the commitments involved and the expectations that participation requires. Set-up meetings with
schools require a clear introduction to the rationale and objectives for the project. It is important that
schools do not see the animation project in isolation but rather see it as complementary to other
school agendas, especially parental engagement and learning-focused pedagogy, where the
association between the staff development training and the animation project is crucial.
The partnership between stakeholders – the school, families and the facilitator/CfL – should be grounded in clear communication and shared goals. Identifying an appropriate fit between the different agendas, particularly the school's, will support this.
A named representative, preferably the teacher of the class participating in the animation project,
should lead school involvement and they should be inducted into and understand fully the interwoven
nature of the animation and metacognition elements. They should commit to attend and help out at all
sessions of the animation project. In this way they are a visible presence to the families taking part,
they can support the facilitator where necessary, and they can take the opportunity to observe the
families and the way the child they know is embedded in the family unit.
Family recruitment is an important stage. School commitment and therefore communication of the
project to families is essential at this point. The animation project works better if enough families
attend. We suggest that at least six families are needed to ensure a productive learning environment.
Low attendance reflects a number of factors rather than a single major cause:
Engaging parents in school is notoriously difficult in many schools. It is especially difficult
when the schools targeted are in ‘hard to reach’ areas of socio-economic deprivation, where
‘school’ does not have positive connotations for many adults who did not themselves thrive
there as children. This challenge is compounded by an initial emphasis on targeting male
parents and carers as the preferred audience.
It proved to be impossible to time the sessions perfectly to suit all parents, who had different
working patterns and commitments.
To some degree, engagement was more likely in schools where there was an existing, solid
partnership with parents. However, part of the value of the project was in working with schools
which had identified the need to develop this – as such, these schools may have done a little
less to get parents on board and the barriers which needed to be broken down to permit
engagement were more substantial.
Further factors which impact on success include the following:
There is a tension between the ‘animation’ and ‘metacognition’ aspects of the project, which
occasionally can cause a conflict of expectations.
There is a 'catch-22' situation: until families experience the metacognition aspects of the project, it is difficult for them to understand its place in an animation project. This means that the metacognition elements can sometimes cause puzzlement; however, this tension is balanced by the positive impacts overall, and families come to understand the value of the pedagogy. Sensitive and confident facilitators are able to support families through this process.
These findings have arisen from family and teacher interviews and researcher observations of project
sessions across schools.
Are there any barriers to delivery?
Barriers to delivery tended to be logistical, and some concerned ICT. For project delivery, these included a lack of ICT equipment (for example, laptops and webcams not being delivered to venues by couriers on time, or equipment not working).
Problems were also experienced with ICT support for the InCAS tests (for example, issues
with uploading the software to the school server).
Inevitable staff turnover in schools meant that on occasion named staff who were responsible
for project elements left and, if handover was not strategically managed, then the project could
slip off the school’s radar.
The nature of the project involved complex partnership working between the school, family
and facilitator (from CfL). Good, clear communication was essential at all stages of the project,
and particularly around the project set-up and recruitment of families. Miscommunications at
this stage or a perceived lack of commitment by one partner could jeopardise the project.
Is the intervention attractive to stakeholders?
The majority of participants, families and school staff, were positive about the intervention, indicating
that it helped them to:
Enhance relationships between home and school
Enrich perspectives on learning lifelong and life-wide
Engage children and adults in learning and nurturing that relationship
Combine agendas the school was already pursuing, e.g. skills-based curriculum,
metacognition, enquiry
Develop creative projects – does some of this job for the school and is attractive to
participants
Develop IT projects – does some of this job for the school and is attractive to participants
Enhance the school’s ‘out of school’ offer
Support story-telling with direct translation to literacy
4.2 Outcomes
The discussion below encompasses both elements of the intervention, the teacher training and the
animation project. The families were participants in the animation project and so this was their main
focus. The teachers were from schools that had received both the teacher training and the animation
project and so were able to talk about both elements. However, because the two elements were inter-related and designed to complement one another, it is not possible to separate out responses relating to each.
Positive impact on family relationships
All the schools engaged in the project were keen to develop home–school connections. Teachers
were keen to comment on the insight provided into the working relationships of families: the
cooperative approaches and the way in which the child or parent took a lead (with the latter being the
most negatively viewed). There were many examples from teachers observing development in the
relationships.
Families, on the other hand, often said that their relationships had not changed but frequently talked
about the positive impacts in a less direct way, discussing the development of teamwork and better
understanding of how their children learn, both of which suggest increased closeness between parent
and child. They could identify that the practical challenges inherent in the animation project made it
necessary for them as a partnership to find ways to work together, to negotiate ideas and to perform
physical tasks effectively. Adults often talked about seeing the project as a rare opportunity to work
intensively with one child and the way this had benefits for their relationship. Many families talked
about increased engagement with children afterwards, in helping with homework, in creative activities
and occasionally with animation.
The teachers saw the development of relationships between parents and school as very important
outcomes of the project. They noted increased engagement or communication with the less prominent
parent (often male), giving examples of parents supporting more at school, or communicating with
school better. From the families’ perspective, the parents often valued the opportunity to gain an
insight into their child’s school and classroom context, valuing this added closeness.
An additional area of development the teachers observed was in their relationship with the child. For
example, one teacher discussed how watching the children work helped her in the classroom. She
used examples from practice and talked about how it helped her refine her teaching in response to
seeing the children learning with their parents.
The animation project as a vehicle for family learning
Across the interviews, the animation was seen as a good motivator and some teachers specifically
commented on how it was especially so for fathers. The teachers were very positive about how the
project was structured and the way in which it was challenging enough to allow participants to see that
learning did not always go smoothly. They could see real value in this for the children and the parents.
A few schools talked about how they have continued to use the animation project in school to engage
parents in learning with their child.
The families also reflected on how the technical and creative challenges inherent in the process
presented good learning opportunities. Families usually came to understand the value of the animation
process for supporting the development of metacognition. However, families sometimes saw a
disconnect between the two elements.
A focus on learning
Many families talked positively about coming ‘back to school’, the altered learning environment that
the project enabled and how the intensity of the project provided a window into the detail of the child’s
learning. ‘Creativity’ and ‘teamwork’ were frequently referred to in terms of learning skills which the
project helped nurture. Some families noticed increases in children’s confidence as a result of the
project. The small size and relatively informal nature of the social setting were discussed as factors
that facilitated this, along with the pride children had in their finished work.
The majority of schools in which the project worked well had used similar techniques before; it had a
good fit with skills-based and inquiry-based learning in schools. As a group, the teachers were keen to
mention metacognitive development and resilience as two key concepts embedded in the project.
They also commonly talked about impacts on pupils’ self-esteem and confidence.
Sustainability: impact home, impact school
Many of the schools and teachers were transferring the ideas from the teacher training element into
the classroom. They were also trying out ideas from the animation element of the intervention, for
example the learning language and use of exemplification in prompts such as ‘remember when...’.
Teachers believed that the project had resulted in the involvement of participant families in school
activities, especially family members who had previously been less visible. This was echoed by the
parents who felt that they had had the opportunity to open a window on school activities and the
learning activities in which their child was involved. They felt better able to talk about learning and
teaching with the school and their child’s teacher.
Negative aspects
The teachers were negative about the standardised approach of the research project design. They felt
that lack of flexibility meant that the project was insufficiently inclusive and that fewer families were
recruited than would be the case with a more flexible model. The inflexible aspects of the project were
requirements of the RCT design and were unavoidable in order to evaluate the project.
In addition, the RCT arrangements, the waiting list control, the attainment tests and other outcome
measures were all seen as an added burden on precious learning and teaching time. Some teachers even reported that they felt 'hounded' by the RCT aspect of the project.

Families had limited recall of the metacognition aspects of the project, in comparison to the project content. This does not mean that learning did not happen: some elements of it were recognised but
many appear to have been forgotten. For example, there was little mention of the ‘5Rs’ (Readiness,
Resourcefulness, Resilience, Responsibility and Reflectiveness), whether directly or indirectly, to
describe what took place. ‘Creativity’, ‘teamwork’ and ‘confidence’ were the main areas of learning
discussed, along with ideas around challenge and not giving up. This may also reflect families’ limited
ability to verbalise this thinking rather than the absence of this process. In fact, discussions by the
families of the creativity and teamwork aspects of the project included implicit consideration of the ‘5R’
skills. Overall, there is evidence that the project was successful in facilitating thinking about learning.
4.3 Fidelity
Was the intervention delivered as intended to all the treatment groups?
Fidelity was ensured through establishing a consistent pedagogy and designing tools that supported
that consistency. These tools included specific outcomes for set-up meetings, a clear outline of the
commitment the school was making to the project, set processes for family recruitment and common
structures for the intervention inputs. These inputs were the staff development INSETs and the
animation project elements, which included presentation with embedded discussion points and tasks
complemented by family and teacher handbooks. In fidelity checks these were seen to work well in the
majority of cases.
If there were any issues with fidelity what were the reasons?
Fidelity was affected by individual schools' circumstances: for example, class sizes, interest in and uptake of the animation project, staff changes, and logistics, including technical issues with the hardware and software, the impact of Ofsted inspections and the appropriateness of school
learning environments. It was very challenging to ensure that the project was tailored to schools’
contexts, accommodating current practices and approaches, while also keeping to the core
standardised principles around recruitment, delivery and evaluation (as reported in the section
describing the intervention).
What elements of the intervention are perceived to be adaptable?
The development of metacognitive knowledge and skilfulness can include a variety of techniques and
ideas that aim to make the process of learning explicit to learners. As explained in the Introduction,
pedagogies can draw on theoretical and pedagogic traditions such as learning to learn, thinking skills,
self-regulation, habits of mind, dispositions, self-efficacy and self-esteem in relation to learning. As
such, individual schools and teachers are able to draw on a repertoire that suits them and their
contexts.
The timing of the animation project can be varied depending on the characteristics of the community
within which the school is located. There are affordances and constraints to timetabling the course
(during school, immediately after school or in the evenings) that impact on parents’ ability to attend,
the children’s engagement and the schools’ commitment.
Schools could coordinate the focus of the animations around a theme. Themes could be introduced to fit with school agendas or to support transfer of learning from the course into the school curriculum. A narrower focus, or more structure to the animation brief, would have been supportive for some families, while others would have found it constraining. However, such a connection could be seen by some schools as legitimising the animation course by tying it to more recognisable school learning.
5. Conclusion
5.1 Key conclusions
1. The headline findings provide no evidence of a statistically significant impact of Mind the Gap
on attainment. There is not sufficient evidence to conclude that any observed effect was
caused by the programme rather than occurring by chance.
2. The estimate of the programme’s impact on pupils’ metacognition was positive and statistically
significant. This improvement in metacognition may in time lead to an impact on academic
attainment.
3. Participating families and staff felt the intervention enhanced home–school relationships and
strengthened the learning relationship between children and parents.
4. To increase the proportion of families who sign up to the animation course, it is important that
schools clearly communicate its aims and promote the potential benefits of participation.
5. The difficulty in recruiting schools suggests that introducing Mind the Gap more generally may be
difficult where schools are not committed to parental engagement or where they have difficulty
delivering activities out of hours.
5.2 Limitations
There are two main limitations to the analysis. First, there was substantial dropout and this was
concentrated among control schools. Dropout schools provided no outcome data and so could not
feature in the impact estimates. While it is not inevitably the case that this will introduce a bias into the
resulting estimates, it does mean that the estimates can no longer be regarded as truly experimental.
The problem is exacerbated by a non-negligible degree of missing outcome data among those schools
that did not drop out. This may be a consequence of the burden placed on participating schools. The
number of data collection tools and the time they took to complete was challenging for many teachers
and schools.
The second limitation is that, in practice, the treatment status of classes within schools did not agree in
all cases with the randomised status. Such non-compliance is a common feature of randomised
controlled trials and generally one can still adhere to the intent-to-treat (ITT) principle. However, in
several cases the randomised status became meaningless in practice, most obviously when the
teachers who were randomised (in most cases, classes were identified by their teacher) had left the
school. This made it impossible to adhere to ITT. Instead, de facto treatment status was used for
classes. This further reduces the extent to which the impact estimates for the components of Mind the
Gap can be viewed as experimental.
A practical drawback of the project design is its reliance on technology and technical support to
administer the InCAS tests at a level which some primary schools may not be able to provide. It is
possible that logistical difficulties in delivery reduced the intensity of the input. These were identified in
the process evaluation and included a lack of ICT equipment or support as well as turnover of staff
responsible for project elements in schools.
The schools involved in the trial were drawn from a range of geographical areas within England –
Devon, Birmingham, London and Manchester. This is helpful as a means of capturing some of the
diversity of the pupil population. However, the challenge involved in recruiting a sufficient number of
schools suggests there may be many schools that are not able to participate or, perhaps, are not
interested in doing so. This implies that generalisability can only be viewed as extending to the type of
school that can be reached by this type of approach. It is possible that, without the need to participate
in a trial, there might have been more interest from schools. The randomised controlled design and the resulting data collection were a major barrier for some schools, largely because of the inflexibility that standardisation requires, which for many schools, particularly small ones, did not sit comfortably with everyday practice. Schools allocated to the control group were challenged by the waiting-list design and the prospect of quite intrusive data collection without, as they perceived it, any of the benefits. In addition, the multiple data collection tools to be completed were seen as overly onerous by some schools, and this resulted in non-completion.
5.3 Interpretation
The impact analysis has found no evidence of a statistically significant impact of Mind the Gap on the
primary outcome. Taking the concerns discussed above as read, there is a question of how to
interpret this finding. One consequence of the school dropout and prevalence of missing outcome data
is that the number of schools (and pupils) observed is much smaller than was assumed at the design
stage. Furthermore, the estimated effect sizes are all considerably below those assumed when
designing the trial (0.35 to 0.45 standard deviations). Both these points suggest that the evaluation
may not have had sufficient statistical power to be able to detect the size of effect that prevailed in
practice. Consequently, we cannot rule out the possibility that Mind the Gap has an effect on
attainment but, if it does, it is likely to be smaller than anticipated and to require a larger trial in order to
detect it.
With the secondary outcomes, there was similarly no significant impact on attainment or pupils’ views
of their relationship with their parents. However, significant increases in some aspects of
metacognition were found. Specifically, pupils in intervention schools who were eligible for the
animation classes scored higher on their productive thinking and metacognitive skilfulness than pupils
who were not eligible. In considering this, it is relevant that only slightly more than a quarter of eligible
families participated in the animation course. We have not attempted here to assess the impact of the
course on participants since the trial was not designed to deliver this estimate. However, if we believe
the estimated positive metacognition impacts of eligibility are driven by course participation, we would
expect the effects on participants to be greater than the effects on eligible pupils as a whole. It is also
relevant to remember that these effects in turn reflect varying degrees of participation since, among
those who did participate, only 58% attended all five sessions. It appears that school characteristics
are more important than pupil characteristics in the participation decision. In particular, participation
was low among schools serving poorer areas. This points to the challenge of enabling such an
intervention to reach poorer children.
The qualitative analysis also provided some positive indications. The animation course was valued by
participating families and school staff for a number of reasons. These include its perceived role in
enhancing home–school relationships, nurturing the learning relationship between children and adults
and developing creative ICT projects which enhance the school’s out of hours provision. The focus on
ICT, and animation in particular, was attractive to parents. There were indications that some schools
have continued to run the animation project to engage parents with their child’s learning. Findings from
the process evaluation suggest that projects of this type work well in small group settings where
participants can be given individual support and, if necessary, additional motivation.
However, there are indications that the metacognition component of the project was not understood by
parents as well as it might have been. The process evaluation found that, while families recognised
that they had undergone a learning process, they were not able to fully articulate the ways in which
they had benefited. This may have been because the input was not sufficiently intensive, or
alternatively due to content or delivery. This aspect of the project appeared to work well in schools
which had used similar techniques before, suggesting that it may take some time to have a significant
impact.
5.4 Future research and publications
It is hard to avoid the conclusion that the robustness of the impact analysis has, to some extent, been
compromised by dropout and the practical complications surrounding delivery. There are lessons to be learned from this experience. Partly, it illustrates the difficulties that can be encountered when trying to
achieve strong evidence of programme effectiveness in a school setting. There were indications that
features of the programme that were desirable from the research point of view did not sit comfortably
with the ethos and working practice of some schools. Since the success of studies like this relies on
the full cooperation of participating schools, future evaluations may benefit from trying to factor in
some degree of flexibility and, in any event, to minimise the burden on all stakeholders.
Short of repeating the exercise, there are opportunities for future research using the database that has
been assembled in the course of this evaluation. One aspect that has been left unexplored in this
report is the effect of participation in the animation course – could this affect attainment or children’s
perception of the relationship with their parents? To explore such questions requires non-experimental
techniques since the trial randomised eligibility rather than participation. However, as an informal
exploration, Table 7 shows mean outcomes among PETT classes according to the number of sessions attended. Here we distinguish between zero sessions (non-participation), one to four sessions (partial participation) and five sessions (full participation). The final column in the table gives a p-value for the null hypothesis that there is no variation across these three groups. From this, it appears that variation across the three groups is statistically significant for the primary outcome and the attainment secondary outcomes, with those attending more sessions achieving higher scores. For the other secondary outcomes, there is no evidence of statistically significant variation.
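The report does not name the test behind these p-values; the sketch below shows a one-way ANOVA as one natural implementation of the null of no variation across the three attendance groups, run on simulated stand-in data shaped like the literacy and numeracy row of Table 7.

```python
# Illustrative version of the Table 7 test: p-value for the null of no
# variation in mean outcomes across the three attendance groups. The report
# does not name the exact test; a one-way ANOVA is one natural choice.
# The scores are simulated stand-ins, not the trial data.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(1)
zero = rng.normal(112.0, 20.1, size=139)     # 0 sessions
partial = rng.normal(111.4, 20.7, size=17)   # 1-4 sessions
full = rng.normal(120.2, 15.6, size=53)      # 5 sessions
f_stat, p = f_oneway(zero, partial, full)
print(f"F = {f_stat:.2f}, p = {p:.3f}")
```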
While perhaps intriguing, the possibility that the primary outcome varies with the number of sessions
can be interpreted in more than one way. One possibility is that attending more sessions improves
outcomes. Alternatively, the number of sessions may not be important but the less able pupils may
drop out before the end. Unfortunately, we have no way of distinguishing between these competing
explanations.
Table 7. Mean outcomes by number of animation sessions attended

                            Number of sessions attended
Outcome                     0        1-4      5        p-value
Literacy and numeracy       111.98   111.39   120.19   0.03
  s.d.                      20.06    20.69    15.62
  N                         139      17       53
Reading                     112.08   111.45   121.43   0.03
  s.d.                      23.88    22.46    19.15
  N                         153      20       54
Maths                       112.92   113.35   121.03   0.01
  s.d.                      17.67    20.01    14.85
  N                         157      22       54
Mental arithmetic           110.04   111.28   117.93   0.05
  s.d.                      21.15    24.10    15.60
  N                         160      23       55
SDQ                         19.66    18.22    20.16    0.32
  s.d.                      4.35     5.78     4.27
  N                         136      18       38
Building understanding      7.83     11.44    7.76     0.17
  s.d.                      8.30     14.37    10.00
  N                         172      27       62
Information gathering       14.23    13.81    13.26    0.72
  s.d.                      8.51     6.69     7.45
  N                         172      27       62
Productive thinking         3.96     1.52     3.56     0.29
  s.d.                      8.05     3.25     7.06
  N                         172      27       62
Metacognitive knowledge     0.55     0.00     1.34     0.12
  s.d.                      2.89     0.00     4.32
  N                         172      27       62
Metacognitive skilfulness   1.92     2.11     1.65     0.89
  s.d.                      4.82     4.26     4.42
  N                         172      27       62
References
Anderson, J., Anderson, A., Friedrich, N. and Kim, J.I. (2010) Taking Stock of Family Literacy: some
contemporary perspectives, Journal of Early Childhood Literacy, 10(1): 33-53
Crozier, G. (1998) Parents and Schools: partnership or surveillance? The Journal of Education Policy,
13(1): 125-136
Desforges, C. and Abouchaar, A. (2003) The Impact of Parental Involvement, Parental Support and
Family Education on Pupil Achievements and Adjustments: A literature review, London: DfES.
Available at:
http://bgfl.org/bgfl/custom/files_uploaded/uploaded_resources/18617/desforges.pdf (accessed
18 February 2014)
Flavell, J. H. (1979) Metacognition and cognitive monitoring: a new area of cognitive developmental
inquiry, Cognitive Development, 34: 906–911.
Flouri, E. and Buchanan, A. (2004) Early Father’s and Mother’s Involvement and Child’s Later
Educational Outcomes, British Journal of Educational Psychology, 74: 141-153
Freeman, H., Newland, L.A. and Coyl, D.D. (2008) Father beliefs as a mediator between contextual
barriers and father involvement, Early Child Development and Care, 178(7&8): 803-819
Goldman, R. (2005) Fathers’ involvement in their children’s education, London: National Family and
Parenting Institute
Hall, E., Wall, K., Higgins, S., Stephens, L., Pooley, I. and Welham, J. (2005) Learning to Learn with
Parents: lessons from two research projects. Improving Schools, 8(2): 179-191
Harris, A. and Goodall, J. (2007) Engaging Parents in Raising Achievement: Do parents know they
matter? London: DCSF. Available at: http://dera.ioe.ac.uk/6639/1/DCSF-RW004.pdf
(accessed 18 February 2014)
Hill, N.E. and Taylor, L.C. (2004) Parental School Involvement and Academic Achievement:
pragmatics and issues, Current Directions in Psychological Science, 13(4): 161-164
Macleod, F. (2008) Why fathers are not attracted to family learning groups? Early Child Development and Care, 178(7-8): 773-783
Marsh, H.W. (1992) Self-Description Questionnaire II: Manual. Sydney: University of Western Sydney,
Macarthur, Faculty of Education, Publication Unit.
Moseley, D., Baumfield, V., Elliott, J., Higgins, S., Miller, J. and Newton D. P. (2005) Frameworks for
Thinking: a handbook for teaching and learning Cambridge: Cambridge University Press
Nord, C.W., Brimhall, D. and West, J. (1998) Fathers' involvement in their children's schools (NCES 98-091). Washington, DC: U.S. Department of Education, National Center for Education Statistics. ED 409 125. Available at: http://files.eric.ed.gov/fulltext/ED409125.pdf (accessed 19 February 2014)
Rocha-Schmid, E. (2010) Participatory Pedagogy for Empowerment: a critical discourse analysis of
teacher-parents’ interactions in a family literacy course in London, International Journal of
Lifelong Education, 29(3): 343-358
Roopnarine, J.L., Krishnakumar, A., Metindogan, A. and Evans, M. (2006) Links between parenting
styles, parent-child academic interaction, parent-school interaction, and early academic skills
and social behaviors in young children of English-speaking Caribbean immigrants, Early
Childhood Research Quarterly, 21: 238-252
Swain, J., Brooks, G. and Bosley, S. (2014) The Benefits of family learning provision for parents in
England, Journal of Early Childhood Research, 12(1): 77-91
Van Steensel, R., McElvany, N., Kurvers, J. and Herppich, S. (2011) How Effective are Family Literacy
Programs? Results of a meta-analysis, Review of Educational Research, 81(1): 69-96
Veenman, M.V.J., Kok, R. and Blöte, A.W. (2005) The relation between intellectual and metacognitive
skills in early adolescence. Instructional Science, 33: 193–211
Veenman, M.V.J. and Spaans, M.A. (2005) Relation between intellectual and metacognitive skills: age
and task differences. Learning and Individual Differences, 15(2): 159–170
Wade, B. and Moore, M. (2000) A Sure Start with Books. Early Years, 20: 39-46
Wagner, M., Spiker, D., and Linn, M. (2002) The Effectiveness of the Parents as Teachers Program
with Low-income Parents and Children, Topics in Early Childhood Special Education, 22(2):
67-81
Wall, K. (2010) Learning to Learn: Ten Years on, Learning and Teaching Update, Issue 33: April 2010
Wall, K. (2012) ‘It wasn’t too easy, which is good if you want to learn’: An exploration of pupil
participation and Learning to Learn. The Curriculum Journal, 23(3): 283-305
Wall, K., Burns, H. and Llewellyn, A. (2014) Mind the Gap: an exploratory investigation of a family
learning initiative to develop metacognitive awareness. Paper presented at the European
Conference on Educational Research, September 2014, Porto, Portugal
Wall, K., Hall, E., Baumfield, V., Higgins, S., Rafferty, V., Remedios, R., Thomas, U., Tiplady, L.,
Towler, C. & Woolner, P. (2010) Learning to Learn in Schools Phase 4 and Learning to Learn
in Further Education Projects, London: Campaign for Learning
Wall, K., Hall, E., Higgins, S., Leat, D., Thomas, U., Tiplady, L., Towle, C. and Woolner, P. (2009)
Learning to Learn in Schools Phase 4 Year One Report. London: Campaign for Learning
Wall, K. and Higgins, S. (2006) Facilitating and supporting talk with pupils about metacognition: a
research and learning tool, International Journal of Research & Method in Education,
29(1), 39-53
Wolfendale, S. (1996) The Effectiveness of Family Literacy. In: Wolfendale, S. and Topping, K. (Eds.)
Family Involvement in Literacy: Effective Partnerships in Education, London: Cassell: 131-146
Appendices
Appendix I
Self-description questionnaire
Appendix II
Pupil Views Template
Appendix III
School Memorandum of Understanding
Appendix IV
Frequently asked questions sheet
'Mind the Gap' animation project
Evaluation by Durham University
Frequently Asked Questions for schools
Is the information collected made anonymous?
The data from each child is given a unique code which links their data anonymously. The key which
links codes to individual children is kept securely at Durham University in full compliance with data
protection legislation. Privacy and confidentiality are of primary importance to us. All original copies of
response forms will be held until 2020 (seven years) when they will be destroyed. Any data used in
presentations or publications will be averaged: results will never include any details about any
individual child, in any circumstance. No child will ever be personally identified by or through the
project.
Who else has access to this information? Will the school get to see it, and who are the project's
partners who are looking at it?
The Mind the Gap team at Durham University will be storing and analysing these data. Your child’s
school will have access to the results of InCAS testing as a way to assess each child’s academic
progress and development, identifying any areas a child finds challenging as well as areas of strength
and to complement their own assessment data. The tests are undertaken using a computer and the
children usually enjoy the experience. Our project partners are the Campaign for Learning and the
National Institute of Economic and Social Research (NIESR). The project is funded by the Education
Endowment Foundation, which will also receive a copy of the results and will track the impact of
projects anonymously in national tests, with the permission of the Department for Education.
Should you wish to know more about your project partners, here are their website addresses:
www.campaign-for-learning.org.uk/
www.niesr.ac.uk
www.dur.ac.uk/education/
http://educationendowmentfoundation.org.uk
Will schools get to see the results of the research?
Once this project reaches completion in 2014 we will start the process of collating and analysing these
data. We will be happy to provide schools with details of our overall results, and a summary in a form
which they could circulate to parents. We work with anonymised data, so we would not hold records
on individual children.
How many schools are involved in this project?
There are 50 primary schools involved in the project, from Birmingham, Devon, Manchester and
London.
Who can I contact if I have any questions regarding this project?
Please contact Helen Burns, Project Manager for Mind the Gap, Durham University, School of
Education, DH1 1TA at
[email protected] or on 0191 3348130.
Why have parents and carers not been asked for consent for children to take part in the study
as a whole?
We ask the school to agree to take part in the project as we are not comparing individual children’s
personal data but comparing classes between schools. We therefore ask for consent at school level
so that we know that schools are happy to share their class level attainment data. We do ask parents
for consent for any video or photography, just in case this takes place (although this is very rare due to
the scale of the project). Additionally, researchers will ask for further consent where families are being
filmed for case studies or are involved in a recorded interview. We also offer parents and carers the
opportunity to opt out at any stage, so that any information about them, their child or children is not
included in the project.
This commitment is outlined by the Campaign for Learning in their induction for the school and also at
the training events for schools involved in the programme, delivered by the Campaign and by Durham
University. The project has ethical clearance from Durham University’s School of Education Ethics
Committee and the Education Endowment Foundation have consulted the Department for Education
about both the legal and ethical issues involved in analysing school attainment data, as our project is
one of over 50 which they have funded.
Have schools had a chance to communicate with parents and to ask the Durham University
evaluation team questions about the study?
Almost all schools have had the opportunity to attend a local training session in relation to the project.
Additionally, they have had an induction to the programme from the Campaign for Learning as well as
regular communication from the Durham University evaluation team. All schools have been made
aware of the evaluation as a condition for taking part in the animation project. The Durham University
team are happy to answer any questions you have now, or throughout the course of the study:
[email protected]
How long will the project last, when will it be completed and when will we be able to see the
results?
Data collection will be completed in 2014. We expect to have the results available by 2015 and will
provide schools who have been involved with a summary sheet.
Appendix V
Parental registration form (initial)
Appendix VI
Parental registration form (revised)
Appendix VII Teacher questionnaire
Appendix VIII Teacher interview schedule
Mind the Gap
Participant Teacher Interview
This interview contributes to Durham University’s evaluation of ‘Mind the Gap’.
The research is funded by the Education Endowment Foundation and seeks to
investigate links between the Mind the Gap animation project and attainment,
as well as looking more broadly at the learning which takes place. The data
supplied through the interview will be used to complement data collected via
InCAS assessments, questionnaires and family interviews, in gaining a full
picture of the impact of the project.
Data will be used only by Durham University and will be treated confidentially and
anonymously. All data will be held securely and researchers will comply with
the Data Protection Act (1998). You are free to withdraw consent at any time.
Please feel free to ask for further information by contacting Helen Burns at:
[email protected]
This interview should take around 30-45 minutes. You will be asked to confirm
consent to be recorded at the start of the interview.
1. Could you tell us about the Mind the Gap project as you experienced it?
You might like to comment on the following:
o Introduction and recruitment
o How difficult was it to get parents involved?
o Did you see parents that you didn’t see previously?
o Duration and timing
o Delivery and facilitators
o Structure and content
o Course materials (worksheets etc.)
o End result
o Major impacts
o How well did it fit with school practice: home–school partnership
work, learning to learn approaches already in use?
o Do you feel that the intervention was successful?
2. Were there any particular families that you felt the Mind the Gap process
was particularly useful for? Please explain why.
You might consider different perspectives:
o the child
o the parent
o the school
o parent-child relationships
o home–school relationships
The following types of learning might be useful to think about:
o Curriculum attainment
o Skills
o Soft skills
o Motivation and disposition
o Creative imagination
o Metacognition (thinking about learning)
3. Were there any particular families that you felt the Mind the Gap process
was not particularly useful for? Please explain why.
The prompts from the previous question might be useful
4. Please use the table below to consider the families that took part in the
Mind the Gap project. For each one could you rate how successful you
think their:
Learning experience was?
Animation experience was?
Please give each family a mark out of 10 (with 10 being VERY
SUCCESSFUL and 1 being not successful)
The interviewer will ask you to justify your ratings and talk through your
thinking.
Family name            Learning experience            Animation experience
5. Would you do the project again?
6. On a scale of 1-5, how glad are you that you participated?
Thank you for your time and thought
Appendix IX
Teacher interview schedule at 6 months post project
Mind the Gap
Participant Teacher Interview at 6 months post project
This interview contributes to Durham University’s evaluation of ‘Mind the Gap’. The
research is funded by the Education Endowment Foundation and seeks to
investigate links between the Mind the Gap animation project and attainment, as well
as looking more broadly at the learning which takes place. The data supplied through
the interview will be used to complement data collected via InCAS assessments,
questionnaires and family interviews, in gaining a full picture of the impact of the
project.
Data will be used only by Durham University and will be treated confidentially and
anonymously. All data will be held securely and researchers will comply with the Data
Protection Act (1998). You are free to withdraw consent at any time. Please feel free
to ask for further information by contacting Helen Burns at:
[email protected]
The interview takes place by telephone. You will be asked for consent for this
conversation to be recorded at the start of the interview.
This interview should take around 30 minutes.
1. Given that it has been 6 months since you took part in the project, could you
reflect on the subsequent impact of the Mind the Gap project?
You might like to comment on the following:
o Major, sustained impacts
o Instances where the project has been seen to influence the
behaviour/dialogue/actions of a child
o Evidence of increased/different support from parents and carers for
children’s learning
o Relationships between parents/carers and school
o Attainment
o Metacognition
o Imagination/creativity
o Influence on school practice
o Influence on your personal practice
o Do you feel that the intervention was successful?
2. Were there any particular families that you felt the Mind the Gap process has
had a sustained or delayed impact on, revealed in the 6 months post project?
Please explain why.
You might consider different perspectives:
o the child
o the parent
o the school
o parent-child relationships
o home–school relationships
The following types of learning might be useful to think about:
o Curriculum attainment
o Skills
o Soft skills
o Motivation and disposition
o Creative imagination
o Metacognition (thinking about learning)
3. Were there any particular families that you felt had not benefitted from the
Mind the Gap process at the time but have shown evidence of positive impact
in the last 6 months? Please explain why.
The prompts from the previous question might be useful
4. In our previous interview, you were asked to grade families that took part in
the Mind the Gap project in terms of how successful you thought their learning
and animation experiences were. Please see the table below as a reference to
the grades you allocated for ‘learning experience’:
Family            Grade
Focussing on their learning experience, please consider the families for which
you said the experience had been successful:
How sustained has this success been for these families?
Focussing on the learning experience of families for whom you thought the
experience had been less successful:
Do you feel that they may have gained longer term benefits from being involved in the Mind the
Gap programme?
5. Would you do the project again?
6. On a scale of 1-5, how glad are you that you participated?
THANK YOU FOR YOUR TIME AND THOUGHT
Appendix X
Family interview schedule
Durham University
‘Mind the Gap’
Participant Family Interview
Introduction
The interview will take place with you, the child you made the animation with and one researcher
from Durham University. The interview will be recorded so that the interviewer can look at it again
later. It will be very informal and more like a conversation but with a bit of structure to help things
along. We will treat what you tell us as confidential and neither you nor your child will be identified
in any report. The interview will last about 45 minutes. We would like to use your handbook and the
film you made as a starting point for our conversation. Please could you bring your handbook with
you to the interview? If you don’t have this any more, don’t worry but please come anyway. We will
try to make sure the film which you made is on hand.
Why we are interested in your experiences
Initial findings relating to an animation project similar to Mind the Gap indicated that it improved
children and young people's motivation and performance at school. Funded by the Education
Endowment Foundation (EEF), Durham University, the Campaign for Learning and the National
Institute of Economic and Social Research (NIESR) are undertaking a two-year national evaluation of
Mind the Gap to look more closely at the impact on motivation and performance. We would like to get
your views on the project as part of this.
If you would like to know anything else about the research, please feel free to ask us questions either
before or at the interview. We are more than happy to answer any questions you may have. Contact:
Helen Burns at
[email protected] or on 0191 3348310.
Below are the questions you will be asked during the interview. We give you these in advance
so that you have time to think about your responses and so that there are no surprises!
Introduction to the questions
I have 3 main questions to ask you:
1. What did you do on the project?
2. What were the effects/impacts of the project?
3. What did you learn?
Please use your family handbook to help you to talk about your experiences. We can have a look at
the film that you made too.
Question 1 (asked to both child and adult to give joint answers)
What did you do?
Why did you get involved in the project?
What did you hope to get from the project?
Tell us about your experiences of the project.
What happened each week?
What was at the start, middle and end of the project?
What skills were you using? ICT/creative/ learning?
What did you do with the handbook?
What did you make?
Is there anything you would do differently if you were to go on a similar project together in the
future?
Question 2
Did the project have an effect/impact on you and if so, how and why?
Adult
Your learning?
Did you see any changes in each other (behaviour, feelings)?
Did you notice any changes in yourself (behaviour, feelings)?
Have you used any of the knowledge/ skills (relationship/ ICT/ creative/ learning) from the
project since?
Your relationships with school?
Child
Your learning?
Did you see any changes in each other (behaviour, feelings)?
Did you notice any changes in yourself (behaviour, feelings)?
Have you used any of the knowledge/ skills (relationship/ ICT/ creative/ learning) from the
project since?
Family
Your family relationships?
Your learning?
Have you used any of the knowledge/ skills (relationship/ ICT/ creative/ learning) from the
project since?
Would you recommend this project to other families? Why?
Question 3
What did you learn?
Adult
How do you feel about yourself as a learner? Has the project impacted on this at all?
Do you have any examples of how you’ve applied what you learned during/after the project?
How would you rate what you learned? Give a mark out of 10 and why?
How would you rate your film? Give a mark out of 10 and why?
Child
How do you feel about how well you do at learning? Has the project impacted on this at all?
Do you have any examples of how you’ve applied what you learned during/after the project?
How would you rate your learning? Give a mark out of 10 and why?
How would you rate your film? Give a mark out of 10 and why?
Family
Did you learn well together?
Thank you for answering the questions. Is there anything else you would like to add which we’ve not
discussed?
Appendix XI
Pupil Views Template coding scheme
Pupil Views Templates are a qualitative tool; at the analysis stage the data were coded using a
deductive coding scheme to measure the declarative cognitive and metacognitive awareness
of the children. The written content of each Pupil Views Template was transcribed and imported
into NVivo 11 for analysis using a deductive coding procedure (described below). A code was
applied based on the sense and meaning of a pupil's response, with a judgement made by the
researchers as to the intended meaning. A category could therefore be applied to a single
word, a sentence fragment, a full sentence or a paragraph. Results are presented in terms of
total words coded, as the most sensitive output of NVivo (both proportionally and in relation to
the research aims).
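As a toy illustration of the "total words coded" reporting unit, the tallying step might look like the sketch below. The coded segments are example quotes from Table A11.1, but the category labels and the Python tallying itself are illustrative stand-ins for what was actually done in NVivo.

```python
# Toy sketch of reporting results as total words per coding category.
# Segments and labels are illustrative; the study's coding was done in NVivo.
segments = [
    ("Shall we use the camera?", "Information gathering"),
    ("But Daddy how do we play games?", "Building understanding"),
    ("You need to do a little more practice for yourself", "Metacognitive skilfulness"),
]

totals = {}
for text, code in segments:
    totals[code] = totals.get(code, 0) + len(text.split())  # words per category

print(totals)
# {'Information gathering': 5, 'Building understanding': 7, 'Metacognitive skilfulness': 10}
```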
The statements were categorised using Moseley et al.'s (2005) model of thinking (Figure
A11.1). This model was chosen as it represents a comprehensive synthesis of learning and
thinking theory. The statements were categorised according to whether they were predominantly
evidence of cognitive skills (information gathering, building understanding, or productive
thinking) and/or evidence of metacognitive thought (strategic and reflective thinking in
Moseley et al.'s model).
Figure A11.1: Frameworks for Thinking model used as a basis for the coding scheme (adapted from
Moseley et al., 2005)
The following definitions based on this analysis were used:
Information gathering: Characterised by recall of ideas and processes and recognition or
basic comprehension of information they have been told or have read.
Building understanding: This required some organisation of ideas and recollections,
some idea of relationships or connections, with some development of meaning about
implications and/or patterns which could be applied or interpreted.
Productive thinking: These comments tended to show more complex thinking such as
reasoning, problem solving and some movement of understanding beyond the
concrete and towards the abstract. Ideas that were generalisable or creative were also
placed in this category.
Strategic and reflective thinking: Comments represented an awareness of the process of
learning, including a reflective or strategic element to the statement or explicit thinking
about learning (metacognitive awareness of learning).
The statements coded as strategic and reflective, and therefore indicative of metacognition,
were then further analysed for evidence of metacognitive knowledge and metacognitive
skilfulness (Veenman et al. 2005). These categories were defined in the following ways:
Metacognitive knowledge: Comments in this category showed an understanding that the
learner could think about learning, and could talk about some of the processes which
supported their own learning (declarative knowledge).
Metacognitive skilfulness: Comments within this category involved the procedural
application and translation of thinking and learning skills across different contexts or
for different purposes (for definitions see also Veenman & Spaans (2005), p 160).
The coding was checked for reliability using Cohen's kappa. Inter-rater agreement was 84%:
20% of the total sample (n = 75) was double-coded by a second researcher. Intra-rater
agreement was 98%: 20% of the total sample was coded again at the end of the coding process
and compared with the initial coding. Examples of each coding category are given in Table
A11.1.
Table A11.1: Table exemplifying the different coding groups

Code                              Example quote
Information gathering             "Shall we use the camera?" (representative of recall of events)
Building understanding            "But Daddy how do we play games?" (making connections)
Productive thinking               "Dad I don't do it like that. I do it like Mr Bowles taught me." (using ideas from beyond the current context)
Strategic & reflective thinking
- Metacognitive knowledge         "Can you give me a hand, I don't get this, can you help?" (awareness of a learning process)
- Metacognitive skilfulness       "You need to do a little more practice for yourself" (ability to apply knowledge of the learning process)
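The double-coding reliability check described above can be sketched as follows. The two coders' label sequences are invented for illustration (the report gives only the summary agreement figures), and scikit-learn's cohen_kappa_score is used as one standard implementation of the statistic.

```python
# Minimal sketch of an inter-rater reliability check with Cohen's kappa.
# The labels below are invented; in the study, 20% of the sample (n = 75)
# was double-coded by a second researcher, giving 84% inter-rater agreement.
from sklearn.metrics import cohen_kappa_score

coder_a = ["info", "building", "meta_know", "productive", "meta_skill", "building"]
coder_b = ["info", "building", "meta_know", "productive", "meta_skill", "info"]

print(f"Cohen's kappa: {cohen_kappa_score(coder_a, coder_b):.2f}")
```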
Appendix XII Detailed estimation results
The tables in this appendix present the full estimation results for all of the outcomes considered. In
each case, there are three columns of results. The first column (PETT vs Control) shows the overall
impact of Mind the Gap. The second column (TT vs Control) shows the impact of the teacher training
element of Mind the Gap. The third column (PETT vs TT) shows the impact of the parental
engagement element of Mind the Gap over and above the impact of the teacher training element.
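The models behind these tables are two-level (pupils within schools) random-intercept regressions, as indicated by the school-level and pupil-level variance components reported below. As a hedged sketch of how such a model might be fitted, assuming hypothetical column names for the (non-public) trial data:

```python
# Illustrative sketch of the kind of multilevel model reported in these tables:
# a school-level random intercept with pupil-level covariates. All column
# names (posttest, treat, ..., school_id) are hypothetical stand-ins.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("pupil_outcomes.csv")       # hypothetical dataset
df["age_sq"] = df["age_months"] ** 2         # age enters linearly and squared

model = smf.mixedlm(
    "posttest ~ treat + age_months + age_sq + female + eal + sen + fsm + pretest",
    data=df,
    groups=df["school_id"],                  # pupils clustered within schools
)
result = model.fit()
print(result.summary())  # fixed effects correspond to the coefficient rows below
```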
Table A12a. Literacy and numeracy, full estimation results

                                   PETT vs Control      TT vs Control        PETT vs TT
Impact                             -1.429 [2.866]       0.086 [5.497]        -2.662 [1.742]
Age (months)                       -7.272 [4.140]       -8.316 [4.311]*      5.702 [17.161]
Age squared                        0.036 [0.020]        0.042 [0.021]**      -0.028 [0.084]
Female                             1.815 [0.947]        3.302 [1.205]***     -0.917 [1.653]
EAL                                -0.533 [1.507]       0.821 [1.944]        -2.021 [2.664]
SEN                                -5.908 [1.370]**     -3.857 [1.700]**     -8.286 [2.363]***
FSM                                -2.566 [1.228]*      -4.055 [1.589]**     -3.941 [2.082]*
EALSENFSM missing                  -8.193 [2.970]**     -5.99 [4.374]        -22.033 [7.679]***
Pre-test                           0.835 [0.035]**      0.839 [0.044]***     0.814 [0.059]***
Pre-test missing                   83.164 [4.017]**     85.431 [4.984]***    83.894 [6.752]***
Blocking variables                 y                    y                    y
Constant                           391.405 [210.088]    429.674 [217.461]*   -253.705 [879.982]
Log likelihood                     -1,819.42            -1,003.12            -678.27
Chi-squared test statistic of RE   19.28                9.95                 0.00
Chi-squared p-value                0.00                 0.00                 1.00
N                                  492                  278                  186
Random effects
- school-level variance            20.227 [11.516]      29.888 [27.313]      0 [0]
- pupil-level variance             102.254 [6.733]      91.297 [8.055]       111.214 [12.028]
N schools                          22                   12                   6
N observations per school
- min per school                   4                    14                   13
- max per school                   39                   39                   48
- mean per school                  22                   23                   31
ICC                                0.165                0.247                0
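The ICC row in each column follows directly from the two reported variance components; for example, for the PETT vs Control column of Table A12a:

```python
# The ICC is the share of total variance at school level:
# ICC = school-level variance / (school-level + pupil-level variance).
school_var, pupil_var = 20.227, 102.254   # Table A12a, PETT vs Control column
icc = school_var / (school_var + pupil_var)
print(round(icc, 3))                      # 0.165, matching the table
```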
Table A12b. Reading score, full estimation results

                                   PETT vs Control      TT vs Control        PETT vs TT
Impact                             -0.885 [3.203]       1.944 [6.140]        -3.646 [2.072]*
Age (months)                       -6.549 [5.108]       -6.117 [5.153]       -6.827 [19.331]
Age squared                        0.032 [0.025]        0.031 [0.026]        0.032 [0.094]
Female                             1.191 [1.151]        2.342 [1.395]*       -0.416 [1.891]
EAL                                -0.514 [1.802]       0.176 [2.121]        -4.11 [3.066]
SEN                                -6.633 [1.636]**     -5.752 [1.958]***    -7.968 [2.677]***
FSM                                -1.545 [1.495]       -4.524 [1.766]**     -3.325 [2.436]
EALSENFSM missing                  -4.763 [3.449]       1.725 [4.922]        -25.77 [9.446]***
Pre-test                           0.794 [0.033]**      0.812 [0.039]***     0.826 [0.055]***
Pre-test missing                   75.412 [11.682]**    94.782 [17.410]***   133.232 [22.621]***
Time between tests                 -0.028 [0.048]       0.044 [0.074]        0.204 [0.091]**
Blocking variables                 y                    y                    y
Constant                           367.016 [259.541]    315.458 [260.062]    348.938 [991.300]
Log likelihood                     -2,116.93            -1,180.72            -832.66
Chi-squared test statistic of RE   15.31                7.87                 0.73
Chi-squared p-value                0.00                 0.00                 0.20
N                                  538                  310                  215
Random effects
- school-level variance            23.83 [14.085]       29.109 [27.253]      38.767 [86.190]
- pupil-level variance             164.868 [10.356]     135.677 [11.298]     168.718 [16.957]
N schools                          23                   12                   6
N observations per school
- min per school                   1                    17                   20
- max per school                   44                   44                   49
- mean per school                  23                   26                   36
ICC                                0.126                0.177                0.187
Table A12c. Maths score, full estimation results

                                   PETT vs Control      TT vs Control        PETT vs TT
Impact                             -1.503 [1.970]       -2.126 [3.551]       -0.396 [1.400]
Age (months)                       -1.853 [3.462]       -4.3 [3.678]         -3.406 [13.241]
Age squared                        0.009 [0.017]        0.022 [0.018]        0.018 [0.065]
Female                             0.686 [0.780]        -0.264 [0.980]       -0.55 [1.365]
EAL                                -1.95 [1.228]        0.366 [1.524]        -1.567 [2.045]
SEN                                -3.022 [1.128]**     -2.553 [1.401]*      -6.67 [1.835]***
FSM                                -1.127 [1.013]       -2.801 [1.251]**     1.495 [1.684]
EALSENFSM missing                  -4.495 [2.329]       -1.337 [3.413]       -12.571 [5.577]**
Pre-test                           0.862 [0.029]**      0.878 [0.037]***     0.895 [0.050]***
Pre-test missing                   84.549 [8.560]**     102.816 [15.032]***  112.132 [11.322]***
Time between tests                 -0.019 [0.034]       0.051 [0.065]        0.091 [0.042]**
Blocking variables                 y                    y                    y
Constant                           118.986 [175.838]    214.416 [186.170]    166.245 [678.110]
Log likelihood                     -2,052.22            -1,127.51            -778.24
Chi-squared test statistic of RE   9.99                 4.01                 0
Chi-squared p-value                0.00                 0.02                 1.00
N                                  571                  322                  219
Random effects
- school-level variance            8.601 [5.506]        9.683 [11.303]       0 [0.000]
- pupil-level variance             81.121 [4.937]       70.883 [5.791]       84.795 [8.417]
N schools                          23                   12                   6
N observations per school
- min per school                   3                    18                   16
- max per school                   48                   48                   50
- mean per school                  25                   27                   37
ICC                                0.096                0.12                 0
Table A12d. Mental arithmetic score, full estimation results

                                   PETT vs Control      TT vs Control        PETT vs TT
Impact                             -2.975 [2.899]       -2.276 [5.789]       -0.8 [2.351]
Age (months)                       -2.663 [5.084]       -5.943 [6.140]       -0.556 [21.664]
Age squared                        0.013 [0.025]        0.03 [0.030]         0.004 [0.106]
Female                             1.058 [1.253]        2.173 [1.655]        -3.531 [2.188]
EAL                                2.458 [1.946]        1.568 [2.523]        2.522 [3.274]
SEN                                -8.865 [1.827]**     -3.618 [2.324]       -11.214 [3.035]***
FSM                                -5.309 [1.614]**     -3.613 [2.100]*      -4.764 [2.820]*
EALSENFSM missing                  -5.735 [3.606]       -4.405 [5.558]       -13.584 [9.150]
Pre-test                           0.648 [0.039]**      0.709 [0.049]***     0.602 [0.063]***
Pre-test missing                   60.682 [13.509]**    62.04 [22.979]***    91.015 [22.568]***
Time between tests                 -0.023 [0.054]       -0.056 [0.098]       0.134 [0.090]
Blocking variables                 y                    y                    y
Constant                           181.38 [257.842]     335.75 [310.390]     45.914 [1,108.623]
Log likelihood                     -2,280.86            -1,281.15            -904.42
Chi-squared test statistic of RE   7.68                 2.29                 0.07
Chi-squared p-value                0.00                 0.07                 0.40
N                                  564                  321                  225
Random effects
- school-level variance            17.626 [12.614]      22.929 [30.249]      10.507 [51.133]
- pupil-level variance             207.185 [12.697]     199.908 [16.355]     228.721 [22.428]
N schools                          23                   12                   6
N observations per school
- min per school                   3                    19                   25
- max per school                   50                   50                   48
- mean per school                  25                   27                   38
ICC                                0.078                0.103                0.044
Table A12e. SDQ Likert score, full estimation results

                                   PETT vs Control      TT vs Control        PETT vs TT
Impact                             -0.06 [0.651]        0.749 [0.767]        0.244 [0.633]
Age (months)                       0.026 [0.046]        0.002 [0.066]        0.132 [0.079]
Age squared                        0 [0.000]            0 [0.001]            -0.001 [0.001]
Female                             0.947 [0.325]**      0.202 [0.417]        0.655 [0.519]
EAL                                0.014 [0.459]        0.719 [0.525]        -1.844 [0.758]*
SEN                                1.23 [0.414]**       0.516 [0.512]        0.869 [0.627]
FSM                                -0.66 [0.375]        -1.41 [0.480]**      0.53 [0.599]
EALSENFSM missing                  -0.966 [1.514]       -0.89 [1.443]        -0.794 [1.697]
Pre-test                           0.523 [0.047]**      0.485 [0.068]**      0.682 [0.073]**
Pre-test missing                   9.034 [1.333]**      8.715 [1.937]**      11.635 [1.755]**
Time between tests                 -0.007 [0.012]       -0.024 [0.021]       0.006 [0.020]
Time between tests missing         -0.264 [3.033]       -4.272 [5.152]       3.163 [4.966]
Blocking variables                 y                    y                    y
Constant                           9.105 [3.057]**      14.384 [5.034]**     2.24 [5.315]
Log likelihood                     -1,393.80            -853.39              -587.28
Chi-squared test statistic of RE   0.000                0.15                 0.000
Chi-squared p-value                1                    0.35                 1
N                                  516                  318                  217
Random effects
- school-level variance            0.000 [0.000]        0.182 [.564]         0.000 [0.000]
- pupil-level variance             12.997 [0.827]       12.661 [1.042]       13.079 [1.305]
N schools                          19                   12                   6
N observations per school
- min per school                   18                   7                    27
- max per school                   51                   51                   49
- mean per school                  27                   27                   36
ICC                                0.000                0.014                0.000
Table A12f. PVT measure 1: Building understanding, full estimation results

                                   PETT vs Control      TT vs Control        PETT vs TT
Impact                             -1.497 [2.403]       1.822 [3.329]        -1.177 [1.726]
Age (months)                       -0.161 [0.121]       -0.067 [0.163]       -0.171 [0.175]
Age squared                        0.001 [0.001]        0 [0.002]            0.002 [0.002]
Female                             3.724 [0.839]**      4.55 [1.144]**       4.28 [1.294]**
EAL                                -0.329 [1.326]       -0.363 [1.606]       0.874 [2.108]
SEN                                -1.646 [1.114]       -1.647 [1.495]       1.323 [1.884]
FSM                                -0.228 [1.110]       -2.182 [1.429]       -1.707 [1.939]
EALSENFSM missing                  -2.083 [3.775]       -7.871 [4.602]       3.413 [3.379]
Pre-test                           0.159 [0.041]**      0.125 [0.049]*       0.235 [0.068]**
Pre-test missing                   0.225 [1.673]        1.21 [2.282]         -1.757 [2.426]
Blocking variables                 y                    y                    y
Constant                           8.976 [3.150]**      14.05 [3.641]**      1.284 [3.911]
Log likelihood                     -2,167.52            -1,312.26            -779.67
Chi-squared test statistic of RE   27.39                10.9                 1.84
Chi-squared p-value                0                    0                    0.09
N                                  590                  354                  222
Random effects
- school-level variance            19.202 [9.390]       17.821 [12.83]       8.654 [11.962]
- pupil-level variance             93.461 [5.605]       102.593 [7.973]      82.927 [8.262]
N schools                          28                   15                   12
N observations per school
- min per school                   1                    1                    1
- max per school                   52                   52                   47
- mean per school                  21                   24                   19
ICC                                0.17                 0.148                0.094
Table A12g. PVT measure 2: Information gathering, full estimation results

                                   PETT vs Control      TT vs Control        PETT vs TT
Impact                             -1.016 [2.076]       -3.197 [2.682]       -1.036 [1.684]
Age (months)                       0.071 [0.116]        0.217 [0.168]        0.149 [0.171]
Age squared                        -0.001 [0.001]       -0.002 [0.002]       -0.001 [0.002]
Female                             0.3 [0.794]          0.117 [1.168]        -0.562 [1.249]
EAL                                1.04 [1.272]         0.815 [1.671]        3.086 [2.081]
SEN                                -0.222 [1.063]       0.079 [1.540]        0.621 [1.850]
FSM                                0.099 [1.070]        0.93 [1.505]         1.168 [1.902]
EALSENFSM missing                  7.616 [3.526]*       5.308 [4.298]        -0.4 [3.321]
Pre-test                           0.152 [0.046]**      0.03 [0.063]         0.22 [0.085]**
Pre-test missing                   2.951 [1.635]        -1.518 [2.251]       2.801 [2.513]
Blocking variables                 y                    y                    y
Constant                           13.409 [2.793]**     11.374 [3.374]**     11.033 [3.550]**
Log likelihood                     -2,144.67            -1,327.25            -776.45
Chi-squared test statistic of RE   22.75                2.7                  0.85
Chi-squared p-value                0                    0.05                 0.18
N                                  590                  354                  222
Random effects
- school-level variance            13.11 [6.683]        8.261 [8.761]        4.071 [6.999]
- pupil-level variance             86.784 [5.198]       113.54 [8.839]       80.931 [8.052]
N schools                          28                   15                   12
N observations per school
- min per school                   1                    1                    1
- max per school                   52                   52                   47
- mean per school                  21                   24                   19
ICC                                0.131                0.068                0.048
Table A12h. PVT measure 3: Productive thinking, full estimation results

                                   PETT vs Control      TT vs Control        PETT vs TT
Impact                             1.711 [1.396]        1.555 [1.428]        3.714 [1.421]**
Age (months)                       -0.106 [0.081]       -0.188 [0.104]       -0.029 [0.143]
Age squared                        0.001 [0.001]        0.002 [0.001]        0 [0.001]
Female                             1.814 [0.556]**      1.265 [0.744]        1.893 [1.048]
EAL                                -1.254 [0.883]       -0.687 [1.037]       -3.197 [1.709]
SEN                                -0.902 [0.738]       -1.307 [0.954]       -2.182 [1.544]
FSM                                -0.661 [0.743]       -0.377 [0.938]       -0.136 [1.583]
EALSENFSM missing                  -1.787 [2.398]       -1.099 [2.505]       1.822 [2.759]
Pre-test                           0.134 [0.029]**      0.143 [0.033]**      0.33 [0.064]**
Pre-test missing                   -2.199 [1.062]*      -0.28 [1.285]        1.693 [2.001]
Blocking variables                 y                    y                    y
Constant                           3.266 [1.913]        3.609 [2.033]        0.986 [3.504]
Log likelihood                     -1,937.13            -1,168.13            -737.06
Chi-squared test statistic of RE   18.16                0.88                 4.34
Chi-squared p-value                0                    0.17                 0.02
N                                  590                  354                  222
Random effects
- school-level variance            5.7 [3.159]          1.607 [2.378]        9.78 [10.973]
- pupil-level variance             41.96 [2.516]        44.533 [3.465]       54.424 [5.422]
N schools                          28                   15                   12
N observations per school
- min per school                   1                    1                    1
- max per school                   52                   52                   47
- mean per school                  21                   24                   19
ICC                                0.12                 0.035                0.152
Table A12i. PVT measure 4: Metacognitive knowledge, full estimation results

                                   PETT vs Control      TT vs Control        PETT vs TT
Impact                             0.376 [0.571]        0.89 [0.901]         -0.184 [0.748]
Age (months)                       -0.052 [0.040]       -0.14 [0.064]*       -0.021 [0.076]
Age squared                        0.001 [0.000]        0.001 [0.001]*       0 [0.001]
Female                             0.27 [0.285]         0.545 [0.454]        0.2 [0.558]
EAL                                -0.663 [0.449]       -0.092 [0.641]       -1.603 [0.925]
SEN                                0.502 [0.376]        0.711 [0.591]        -0.51 [0.822]
FSM                                -0.121 [0.379]       0.709 [0.581]        1.106 [0.843]
EALSENFSM missing                  -0.587 [1.103]       0.746 [1.558]        0.36 [1.482]
Pre-test                           0.085 [0.021]**      0.091 [0.026]**      0.132 [0.066]*
Pre-test missing                   -0.522 [0.515]       -0.435 [0.785]       -0.302 [1.024]
Blocking variables                 y                    y                    y
Constant                           -0.056 [0.848]       -0.159 [1.269]       1.202 [1.500]
Log likelihood                     -1,555.91            -1,005.44            -610.46
Chi-squared test statistic of RE   5.85                 1.11                 0.64
Chi-squared p-value                0.01                 0.15                 0.21
N                                  590                  354                  222
Random effects
- school-level variance            0.695 [.504]         0.678 [.933]         0.773 [1.453]
- pupil-level variance             11.152 [.668]        17.019 [1.324]       15.989 [1.592]
N schools                          28                   15                   12
N observations per school
- min per school                   1                    1                    1
- max per school                   52                   52                   47
- mean per school                  21                   24                   19
ICC                                0.059                0.038                0.046
Table A12j. PVT measure 5: Metacognitive skilfulness, full estimation results

                                   PETT vs Control      TT vs Control        PETT vs TT
Impact                             0.661 [0.833]        -0.062 [0.805]       2.589 [0.852]**
Age (months)                       -0.09 [0.056]        -0.209 [0.073]**     0.055 [0.086]
Age squared                        0.001 [0.001]        0.002 [0.001]**      -0.001 [0.001]
Female                             1.146 [0.389]**      0.756 [0.534]        1.381 [0.634]*
EAL                                -0.921 [0.613]       -0.271 [0.735]       -1.572 [1.030]
SEN                                -0.027 [0.513]       -0.265 [0.678]       -0.108 [0.920]
FSM                                -0.481 [0.518]       -0.85 [0.677]        -0.98 [0.956]
EALSENFSM missing                  -1.298 [1.561]       -2.301 [1.599]       0.616 [1.656]
Pre-test                           0.168 [0.037]**      0.133 [0.042]**      0.22 [0.065]**
Pre-test missing                   -1.808 [0.715]*      -0.328 [0.815]       -0.447 [1.152]
Blocking variables                 y                    y                    y
Constant                           2.444 [1.203]*       3.795 [1.363]**      0.268 [2.173]
Log likelihood                     -1,731.49            -1,057.92            -632.99
Chi-squared test statistic of RE   6.87                 0                    5.66
Chi-squared p-value                0                    0.49                 0.01
N                                  590                  354                  222
Random effects
- school-level variance            1.677 [1.164]        0.013 [.483]         4.179 [4.408]
- pupil-level variance             20.595 [1.236]       23.513 [1.82]        19.688 [1.961]
N schools                          28                   15                   12
N observations per school
- min per school                   1                    1                    1
- max per school                   52                   52                   47
- mean per school                  21                   24                   19
ICC                                0.075                0.001                0.175
Appendix XIII Subgroup results

Table A13a. Full estimation results – FSM subsample, attainment outcomes

                                   Literacy & numeracy   Reading               General maths        Mental arithmetic
Impact (PETT vs Control)           -2.821 [5.788]        -1.632 [8.919]        6.682 [3.242]*       -0.841 [8.597]
Age (months)                       7.867 [13.707]        12.27 [17.700]        17.143 [9.617]       17.944 [20.826]
Age squared                        -0.038 [0.067]        -0.06 [0.087]         -0.084 [0.047]       -0.089 [0.102]
Female                             3.172 [2.218]         3.493 [2.799]         -1.913 [1.581]       0.53 [3.276]
EAL                                -0.85 [2.649]         1.267 [3.444]         -1.407 [1.906]       -2.445 [3.815]
SEN                                -11.899 [2.875]**     -14.498 [3.619]**     -6.087 [2.021]**     -14.803 [4.329]**
Pre-test                           0.783 [0.076]**       0.687 [0.075]**       0.778 [0.061]**      0.532 [0.091]**
Pre-test missing                   66.472 [7.893]**      104.03 [28.139]**     57.84 [19.894]**     16.233 [37.349]
Time between tests                                       0.192 [0.120]         -0.058 [0.082]       -0.131 [0.153]
Blocking variables                 y                     y                     y                    y
Constant                           -374.595 [697.502]    -631.097 [903.620]    -836.161 [489.515]   -812.186 [1,060.461]
Log likelihood                     -434.77               -524.13               -455.37              -538.73
Chi-squared test statistic of RE   2.39                  3.17                  0                    3.49
Chi-squared p-value                0.06                  0.04                  1                    0.03
N                                  123                   136                   136                  136
Random effects
- school-level variance            26.887 [29.628]       65.599 [67.536]       0 [0]                53.402 [51.843]
- pupil-level variance             113.153 [16.055]      212.979 [28.68]       70.846 [9.223]       279.387 [36.995]
N schools                          17                    17                    17                   17
N observations per school
- min per school                   1                     1                     1                    1
- max per school                   17                    17                    17                   17
- mean per school                  7                     8                     8                    8
ICC                                0.192                 0.235                 0                    0.16
Table A13b. Full estimation results – low pre-test subsample, attainment outcomes

                                   Literacy & numeracy   Reading               General maths        Mental arithmetic
Impact (PETT vs control)           1.099 [3.837]         -0.837 [4.522]        1.999 [2.446]        5.076 [4.966]
Age (months)                       -6.562 [6.279]        -2.533 [8.224]        -9.886 [5.286]       0.041 [8.454]
Age squared                        0.031 [0.031]         0.011 [0.041]         0.049 [0.026]        -0.004 [0.043]
Female                             0.784 [1.823]         3.862 [2.357]         -2.572 [1.602]       -0.827 [2.986]
EAL                                1.441 [2.525]         2.754 [3.274]         -3.236 [2.181]       7.912 [4.130]
SEN                                -3.909 [2.278]        -7.048 [2.750]*       -2.287 [1.872]       -7.771 [3.586]*
FSM                                -3.475 [2.047]        -3.187 [2.610]        -0.537 [1.749]       -9.432 [3.248]**
EAL SEN FSM missing                1.349 [6.849]         5.862 [8.893]         2.069 [4.677]        10.307 [10.247]
Pre-test                           0.673 [0.093]**       0.523 [0.085]**       0.69 [0.081]**       0.429 [0.085]**
Time between tests                                       0.179 [0.099]         0.072 [0.057]        0.103 [0.110]
Blocking variables                 y                     y                     y                    y
Constant                           371.108 [316.622]     140.68 [417.167]      517.454 [267.090]    63.658 [423.148]
Log likelihood                     -358.1                -429.74               -387.65              -450.57
Chi-squared test statistic of RE   2.51                  1                     0.05                 0.24
Chi-squared p-value                0.06                  0.16                  0.41                 0.31
N                                  110                   121                   120                  120
Random effects
- school-level variance            14.15 [14.889]        14.974 [21.409]       1.514 [7.366]        11.044 [27.297]
- pupil-level variance             72.892 [11.072]       133.121 [19.127]      62.909 [9.139]       219.61 [31.726]
N schools                          18                    19                    20                   19
N observations per school
- min per school                   1                     1                     1                    1
- max per school                   14                    16                    14                   16
- mean per school                  6                     6                     6                    6
ICC                                0.163                 0.101                 0.023                0.048
Appendix XIV Security rating
Security rating summary: Mind the Gap

Rating  1. Design                                      2. Power (MDES)  3. Attrition  4. Balance                     5. Threats to validity
5       Fair and clear experimental design (RCT)       < 0.2            < 10%         Well-balanced on observables   No threats to validity
4       Fair and clear experimental design (RCT, RDD)  < 0.3            < 20%
3       Well-matched comparison (quasi-experiment)     < 0.4            < 30%
2       Matched comparison (quasi-experiment)          < 0.5            < 40%
1       Comparison group with poor or no matching      < 0.6            < 50%
0       No comparator                                  > 0.6            > 50%         Imbalanced on observables      Significant threats

The final security rating for this trial is 1. This means that findings are of low security.
The trial was designed as a cluster randomised efficacy trial, with 51 schools recruited and entered into
the randomisation. This resulted in a minimum detectable effect size of just under 0.3 at randomisation
for attainment, meaning the trial could still have achieved a maximum rating of 4. However, there was
very high attrition of 57%, with only 22 schools entering the primary analysis. There was some limited
evidence of imbalance at baseline, but this was not statistically significant, and observable
characteristics were controlled for in the analysis. There are some threats to the validity of the results
because there was some evidence of non-compliance with treatment allocation (with control classes
implementing the intervention) and because tests were delivered by schools.
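As a rough illustration of the power calculation, the minimum detectable effect size for a two-arm cluster randomised trial is often approximated with the formula sketched below. The ICC, cluster size and design parameters here are assumptions for illustration, not the evaluator's actual inputs, so the result only loosely echoes the "just under 0.3" quoted above.

```python
# Rough MDES approximation for a two-arm cluster RCT with equal allocation:
# MDES ~ (z_{1-alpha/2} + z_{power}) * sqrt((4 / J) * (icc + (1 - icc) / n)).
# All parameter values below are assumptions for illustration only.
from scipy.stats import norm

J, n, icc = 51, 25, 0.10                 # schools, pupils per school, assumed ICC
alpha, power = 0.05, 0.80
m = norm.ppf(1 - alpha / 2) + norm.ppf(power)

mdes = m * ((4 / J) * (icc + (1 - icc) / n)) ** 0.5
print(f"MDES ~ {mdes:.2f}")              # ~ 0.29 under these assumptions
```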
Appendix XV
Cost Rating
Cost ratings are based on the approximate cost per pupil of implementing the intervention over one
year. Cost ratings are awarded using the following criteria.
Cost     Description
£        Very low: less than £80 per pupil per year.
££       Low: up to about £170 per pupil per year.
£££      Moderate: up to about £700 per pupil per year.
££££     High: up to £1,200 per pupil per year.
£££££    Very high: over £1,200 per pupil per year.
You may re-use this document/publication (not including logos) free of charge in any format or
medium, under the terms of the Open Government Licence v2.0.
To view this licence, visit www.nationalarchives.gov.uk/doc/open-government-licence/version/2
or email:
[email protected]
Where we have identified any third party copyright information you will need to obtain permission from
the copyright holders concerned. The views expressed in this report are the authors’ and do not
necessarily reflect those of the Department for Education.
This document is available for download at www.educationendowmentfoundation.org.uk.
The Education Endowment Foundation
9th Floor, Millbank Tower
21–24 Millbank
London
SW1P 4QP
www.educationendowmentfoundation.org.uk