CASE STUDY I: Predictive Analytics at Nottingham Trent University
- The NTU Student Dashboard measures students' engagement with their course; the University has found engagement to be a stronger predictor of success than background characteristics
- Engagement scores are calculated from VLE access, library usage, card swipes and assignment submissions
- Tutors are prompted to contact students when their engagement drops off; staff find this a valuable resource
- The provision of the Dashboard has helped to build better relations between students and personal tutors
- The Dashboard is having positive impacts on behaviour: tutors discuss engagement with their students, and some learners find that seeing their own engagement is a positive spur to stay engaged
- The Dashboard has been developed to achieve key institutional academic objectives; the project is led by the PVC Academic and delivered by a project team comprising academics, students, educational developers, IT staff and the supplier, Solutionpath
- Transparency and a close partnership approach have been critical to the success of the initiative, and have reduced ethical concerns about the use of student data
- The provision of the Dashboard is now expected by staff and students, and the project has helped to extend a culture of data-driven decision making across the University
Rationale
For NTU, the key strategic driver is enhancing the academic experience for its 28,000 students, particularly
their engagement with their course. Three more detailed goals were developed from this initial aim:
- to enhance retention [1]
- to increase a sense of belonging within the course community, particularly with tutors
- to improve attainment
Earlier institutional research by Ed Foster, Student Engagement Manager at NTU, had identified that, in
line with other studies, up to a third of students had considered withdrawing at some point during their
first year. These doubters were less confident about coping with their studies, less engaged with their
course, formed weaker relationships with peers and tutors, and were ultimately more likely to withdraw
early. This apparent detachment from teaching staff meant that tutors were at risk of putting effort into
assisting those who requested support, rather than those who most needed it.
Focussing on overall student engagement rather than the risk of withdrawal has shaped the Dashboard's design and development in four ways:
1. The Dashboard presents engagement data to both staff and students [2]. For each student it shows:
   i. background information [4]
   ii. most recent engagement with the data sources (for example VLE use)
   iii. an overall engagement rating
   This means that while staff can use the data to initiate conversations with students, the students can be their own agents of change. Seeing that their engagement is lower than their peers' may prompt them to improve it.
2. Because a key goal is to build relationships between tutors and students, the Dashboard sends an email alert to tutors when student engagement stops for two weeks. These alerts encourage staff to make early contact with students who are disengaging from their studies rather than waiting for more traditional warning signs such as assessment deadlines not being met.
3. The Dashboard was designed to be used by tutors as they work directly with students; they can make notes about agreed actions in the Dashboard.
4. NTU specifically chose not to make use of demographic data in the analytics model, choosing only to use data relating to student engagement with their course.
[1] However, retention is not a particular problem at NTU, with around 93% of students progressing from the first to the second year.
[2] Importantly, students only see information about themselves, not their fellow students.
[3] Professor Chris Pole led the pilot phase; his successor, Professor Eunice Simmons, led the full institutional implementation.
Solutionpath was selected as it was the most successful at making sense of, and gaining insight from, some very complex data. This proved to be an effective partnership: the technical work was carried out mainly by Solutionpath, the University coordinated that work, and various IT, academic and research staff were deployed as required.
The Dashboard was initially tested with four courses, forty tutors and over 500 first-year students. There were some concerns from within the institution about student privacy, potential duplication with existing systems and value for money; however, by the first pilot evaluation point (Christmas 2013), positive feedback from both students and staff had largely allayed these anxieties. The pilot provided useful information about integrating systems, student and staff expectations, and ways to incorporate the Dashboard into normal working practices.
The pilot successfully demonstrated the value of learning analytics, and the Dashboard was implemented
throughout the whole University in September 2014. It was launched using a range of communication
activities including roadshow presentations in all academic schools. For the most part these briefings were
used to introduce learning analytics, discuss the ethical and process implications and explore good
practice. The tool itself is straightforward to use, requiring only limited training, and the impact on the academic time needed to make use of it was minimal; this was important for its adoption.
[4] This information is not used in the analytics algorithm, but is used to provide staff with contextual information and to conduct cohort-level analysis.
Figure 3: A student's cumulative engagement rating compared with the course mean. This view provides students with a good overall view of their engagement for the year, though it is less immediately responsive.
Automated email alerts are sent to tutors when students have had no engagement for a fortnight. The University made the decision to send these to the students' tutors, not directly to the students themselves. This was done specifically to increase opportunities for dialogue between tutors and students, and to help build students' sense of belonging to the course community. Carefully worded, these emails acknowledge that the tutors may already know the learner's circumstances, but ask them to consider contacting them. Ed says: "We're trying to initiate conversations rather than replace them." Mike Day, Director of IS, adds: "However, we are interested in finding ways that we can use the technology to improve the dialogue. For example, we might develop the system to generate more personalised emails to tutors: where engagement drops from good to partial, where students don't use the library by a particular cut-off date, or for students with a history of late submission. Although it would be necessary to avoid spamming tutors with too many alerts."
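A minimal sketch of how such alert rules might be expressed is given below. The record fields, thresholds and rule wording are all hypothetical illustrations; Solutionpath's actual trigger logic is not published in this case study.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class EngagementRecord:
    """Hypothetical per-student record; not Solutionpath's actual schema."""
    student_id: str
    tutor_email: str
    last_activity: date      # most recent event from any data source
    rating: str              # current rating, e.g. "good" or "partial"
    previous_rating: str     # rating in the previous period
    used_library: bool       # any library activity so far this year
    late_submissions: int    # count of late assignment submissions

def alerts_for(rec: EngagementRecord, today: date, library_cutoff: date) -> list[str]:
    """Collect the alert messages a tutor would receive for one student."""
    alerts = []
    # The rule NTU uses now: no recorded engagement for a fortnight.
    if today - rec.last_activity >= timedelta(weeks=2):
        alerts.append("no recorded engagement for two weeks")
    # Personalised rules of the kind Mike describes as future options:
    if rec.previous_rating == "good" and rec.rating == "partial":
        alerts.append("engagement has dropped from good to partial")
    if not rec.used_library and today >= library_cutoff:
        alerts.append("no library use by the cut-off date")
    if rec.late_submissions >= 3:  # illustrative threshold
        alerts.append("history of late submission")
    return alerts
```

Grouping the resulting alerts by tutor_email and sending a periodic digest, rather than one email per trigger, would be one way to avoid the "spamming" Mike mentions.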
Figure 4: Week-by-week engagement score (for the same student as in Figure 3) compared with the course mean. This gives a more responsive view, and students can see dramatic changes to their engagement within a week.
[5] This policy can be found in the NTU Quality Handbook, Section 14: Learning and Teaching. http://ntu.ac.uk/adq/document_uploads/quality_handbook/150956.pdf
Figure 5: The student view of the Dashboard, showing engagement over the previous five days along with an explanation of what the engagement ratings mean.
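The case study does not publish the scoring algorithm itself. As a rough sketch of the general approach, assuming a simple weighted count of events from the four data sources and illustrative band thresholds (none of which are NTU's actual values):

```python
# Weights and band thresholds are illustrative assumptions; the actual
# Solutionpath algorithm is not described in this case study.
WEIGHTS = {"vle_access": 1.0, "library_use": 2.0,
           "card_swipes": 0.5, "submissions": 5.0}

BANDS = [(75.0, "high"), (50.0, "good"), (25.0, "satisfactory")]

def weekly_score(events: dict[str, int]) -> float:
    """Weighted sum of one week's activity counts across the data sources."""
    return sum(weight * events.get(source, 0)
               for source, weight in WEIGHTS.items())

def rating(score: float) -> str:
    """Map a weekly score onto the rating bands shown on the Dashboard."""
    for threshold, band in BANDS:
        if score >= threshold:
            return band
    return "low"

def cumulative_scores(weekly: list[float]) -> list[float]:
    """Running mean of weekly scores: the smoother, less immediately
    responsive view of Figure 3, versus the weekly view of Figure 4."""
    out, total = [], 0.0
    for week, score in enumerate(weekly, start=1):
        total += score
        out.append(total / week)
    return out

print(rating(weekly_score({"vle_access": 40, "submissions": 2,
                           "card_swipes": 20})))  # -> "good" (score 60.0)
```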
Using data from the pilot year, a positive relationship between student engagement and both progression and attainment has been identified [6]. Although only four courses took part in the pilot year, it was possible to extract anonymised daily engagement data for all students from the start to the end of the 2013-14 academic year. Analysis of this data showed that less than a quarter of students with a low average engagement progressed from the first to the second year, whereas over 90% of students with a good or high average engagement did so. Interestingly, students with a good average engagement had a marginally higher rate of progression than those with a high average engagement; this will require further analysis. Overall, a student's engagement was a far more important predictor of progression than data such as background characteristics or entry qualifications.
Outcome | Low engagement | Satisfactory engagement | Good engagement | High engagement
Other (n=76) | 3.1% | 1.2% | 0.7% | 0.2%
Withdrawn (n=310) | 18.5% | 4.0% | 2.8% | 4.0%
Transfer (n=143) | 5.3% | 2.2% | 1.1% | 1.3%
Repeating (n=364) | 12.4% | 6.2% | 2.0% | 2.5%
 | 36.5% | 3.1% | 0.8% | 0.4%
Progressed (n=5,836) | 24.2% | 83.4% | 92.5% | 91.7%
Table 1: The association between average engagement [7] and student progression in 2013-14 (Foster, Kerrigan & Lawther, 2016, forthcoming).
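For readers wanting to reproduce this kind of analysis on their own data, the table above is essentially a normalised cross-tabulation of engagement band against progression outcome. A minimal sketch in pandas, with hypothetical column names and made-up input rows:

```python
import pandas as pd

# Hypothetical extract: one row per student, with the year's average
# engagement band and the end-of-year progression outcome.
students = pd.DataFrame({
    "engagement": ["low", "low", "good", "high", "satisfactory", "good"],
    "outcome": ["withdrawn", "progressed", "progressed",
                "progressed", "progressed", "repeating"],
})

# Percentage of students in each engagement band reaching each outcome;
# columns sum to 100%, matching the layout of Table 1.
table = pd.crosstab(students["outcome"], students["engagement"],
                    normalize="columns") * 100
print(table.round(1))
```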
The same team analysed the relationship between average engagement and academic attainment for final year students. Once again there was a strong association between engagement and final degree classification: 81% of students with a high average engagement graduated with a 2:1 or first-class honours degree, whereas only 42% of students with a low average engagement achieved the same.
In a survey of first year students (460 respondents), 27% said that they had changed their behaviour in response to data on the Dashboard, for example increasing their presence on campus, taking out more books and accessing the VLE more [8]. Interestingly, some students also became better engaged with academic activities such as doing more independent learning, even though these factors are not measured in the Dashboard. The researchers also found a number of examples of students using the system to challenge themselves and their peers. For example, one student described how they used the Dashboard to benchmark their own engagement: "When you can track and see how you are doing it gives you more of a push if you are dropping down and you don't want to drop out."
The project team also saw examples of students competing with one another to have the highest engagement score [9]. Ed acknowledged that there is a chance that the University might discover that the data helps those students who are already articulate and engaged. NTU will be exploring these issues in much more detail in the coming years.
Engagement classification | First | 2:1 | 2:2 | Third | Other
low | 13.2% | 28.6% | 28.6% | 11.0% | 18.7%
satisfactory | 15.2% | 47.0% | 30.5% | 4.9% | 2.4%
good | 20.9% | 55.2% | 21.5% | 1.4% | 1.1%
high | 28.0% | 52.5% | 17.0% | 1.8% | 0.6%
Table 2: The association between average engagement and final degree classification.
Another impact is that tutors are changing how they work with students because of the Dashboard. They find that the improved information on individuals allows them to target their interventions more appropriately. Most tutors in the initial pilot used the Dashboard once per week with a minimal impact on their workload; one third of tutors contacted students as a result of viewing their engagement data in the Dashboard. Typical tutor comments were:
"It saves so much time by having the information to hand in one place for a student."
"[It is] Useful to get a sense of the student's history. Useful to be prompted to send out an email to a disengaged student, and useful to have a template for guidance on this."
As with all projects aimed at rolling out technology across an institution, the use of learning analytics has required accompanying changes to culture and practice. The staff and students who have used the Dashboard would not give it up. Professor Eunice Simmons, the PVC Academic with overall responsibility for the initiative, feels "it's becoming business as usual".
Future plans
The approach that is being taken now is closely aligned with the new institutional strategy. Improving the student-staff academic experience was key in the earlier strategy. However, the emphasis now is on "creating opportunity": a broader set of experiences than just the academic one, which aims to help students achieve their overall ambitions. Personalisation and tailoring education to individual students are stronger objectives in the new strategy, so learning analytics will be crucial in facilitating this over the next five years.
During the pilot and initial year of implementation, students and staff made dozens of suggestions about how the Dashboard could be improved. The project team has prioritised the most immediate goals and is working with the vendor on the next phases of work. The first is integrating additional data sources into the Dashboard, the most important of which is an attempt to capture attendance data accurately. In those buildings where students already use their cards to swipe in, further attendance data may not make much difference to the model. However, as some staff feel that the engagement ratings cannot be accurate unless they include some aspect of attendance monitoring, this is important.
A separate project is currently looking at e-books. There are around one million physical books in the library and approximately six million e-books and e-journals; student engagement with these electronic resources is not currently measured by the system and should prove a useful additional data source. NTU is also integrating electronic feedback from tutors into the Dashboard so that both tutor and student can see all the relevant information in one place.
The second phase is focussed much more on operation in the real world. NTU believes that the Dashboard provides an accurate picture of student engagement, but more work is needed to explore how to turn this data into operational practice. The University is leading an Erasmus+ research project with KU Leuven and the University of Leiden; the Achieving Benefits from Learning Analytics (ABLE) project will explore how to use learning analytics data to improve academic and student support.
While Mike considers the tools available to be more than adequate in their capabilities for the time being, there is a desire to experiment with the way that students and staff use the resource. Presenting the data differently might help change students' behaviour. For example, including options for setting personal engagement targets, or seeing the ratings for the most engaged students on the course, might work with particular groups of students. Meanwhile Solutionpath, the developers, are working on a mobile app version of the Dashboard.
Further work is needed in areas such as management information about student engagement, the potential to test different types of engagement alerts or reports to tutors, and even experiments with direct messages to students. Many staff would like the Dashboard to provide a wide range of information. While the project team is keen to try to meet these needs, they are also keen to keep the design of the Dashboard clean and uncluttered. The visualisations need to remain simple for staff and students to understand, and must not add to tutor workload.
The model is currently more accurate for full-time students because their patterns of engagement are easier to model. Part-time students participate in blocks, so they might work quite intensively for a month and then much less the next. Solutionpath are recalibrating the model to represent more accurately the spread of engagement typical for part-time students. However, the University will continue to explore whether or not the algorithms need further customisation to suit different types of academic engagement.
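One plausible way to express such a recalibration (not necessarily Solutionpath's) is to lengthen the window over which part-time engagement is accumulated, so that a quiet month between study blocks does not read as disengagement. A sketch, with window lengths that are purely illustrative:

```python
import pandas as pd

def rolling_engagement(daily_events: pd.Series, part_time: bool) -> pd.Series:
    """Rolling total of daily activity counts.

    Part-time students, who study in intensive blocks, are scored over a
    longer window so that a quiet month between blocks does not register
    as disengagement. Window lengths are illustrative assumptions, not
    Solutionpath's actual calibration.
    """
    window = 56 if part_time else 14  # days
    return daily_events.rolling(window, min_periods=1).sum()

# A part-time pattern: an intensive month of study, then a quiet month.
days = pd.date_range("2015-10-01", periods=60, freq="D")
activity = pd.Series([5] * 30 + [0] * 30, index=days)
print(rolling_engagement(activity, part_time=True).iloc[-1])   # 130.0: still engaged
print(rolling_engagement(activity, part_time=False).iloc[-1])  # 0.0: looks disengaged
```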
Although overall student retention is high at NTU, there is a recognition that, as is the case across the sector, there are attainment gaps for some student groups (for example widening participation students). Whilst demographic data is not included in the learning analytics model, it has been used to better understand the engagement of different student groups. The Dashboard is starting to provide valuable data that will be used to develop future interventions for the University's Success for All strategy.
The cross-institutional perspective that has been developed at NTU as a result of this work is considered
very valuable. Analytics is giving the University insight into ways to continually improve processes in many
aspects of institutional business, and colleagues are now thinking about how they can apply those
techniques to other things they do and potentially other stages of the student lifecycle.
NTU is not claiming that what they have done is fully comprehensive, but they do feel that they are far ahead of where they were two years ago when they commenced. There is now general acceptance by staff
and students that use of the Dashboard is helping to enhance the student experience, evidence that
interventions as a result of the analytics are having an impact on student engagement, and a
transformation in the way that data is being used across the institution to inform decision making.
The development of learning analytics at NTU has been effective because of an integrated partnership
approach. The work was framed by research into student needs, was placed within an overall academic
governance structure and developed by a project team of academics, students, IT staff and educational
developers working with the external supplier. Ed Foster and Mike Day suggest a number of fundamental
lessons for any institution undertaking analytics projects:
- Learning analytics resources are only as useful as the action they initiate.
- It's important to ask the question: "Is what we're doing in the best interests of each individual student?" When NTU did that it helped them take the right path.
- The end users need to be involved in the development of the resource. The project goals need to be transparent and comprehensible to them, and effective communication with them is time consuming.
- There is still a lot to do. This is not primarily about the technology or the data but how it shapes institutional practice in the best interests of all stakeholders.
Acknowledgements
Jisc is grateful to Mike Day and Ed Foster from Nottingham Trent University for their input to this case
study which has been developed from the research publications of Ed and his colleagues (see References),
and from interviews with and presentations given by Mike and Ed over the past year.
Mike Day is an experienced CIO, having fulfilled senior leadership, education management and IT roles with Nottingham Trent University, University of Lincoln, National College for School Leadership and the Royal Air Force. He has a strong
interest in digital technology as it applies to higher education and has a reputation
for innovation, leading the team that won the Times Higher Award for Outstanding
Student Support in 2014 for its work on learner analytics. He has also collaborated
on award-winning flipped learning and online digital initiatives. His experience
includes developing business-focused digital and IT strategies, managing change
programmes, building high performing teams, project and programme management, business
intelligence, and system design and development. He is a graduate of the University of Nottingham.
References
Day, Mike (2015, Feb 20). NTU Student Dashboard. Learning Analytics Network, University of East London.
http://analytics.jiscinvolve.org/wp/files/2015/02/Jisc-LA-Network-Mike-Day.pdf
Foster, Ed (2015, June 24). What have we learnt from implementing learning analytics at NTU? Jisc Learning
Analytics Network, Nottingham Trent University.
http://analytics.jiscinvolve.org/wp/files/2015/07/jisc-la-network-ed-foster-ntu.pdf
Foster, E., Borg, M., Lawther, S., McNeil, J. and Kennedy, E., 2014. Using Student Engagement Research to Improve the First Year Experience at a UK University. In: C. Bryson, ed., Student Engagement in Challenging Times. UK: Routledge.
Foster, E., Kerrigan, M. and Lawther, S., 2016 (forthcoming). What Matters is Performance, not Predicted
Performance: the impact of student engagement on retention and attainment. Journal of Learning
Development in Higher Education.
Foster, E., Lawther, S., Lefever, R., Bates, N., Keenan, C. and Colley, B., 2012. HERE to stay? An exploration of student doubting, engagement and retention in three UK universities. In: I. Solomonides, A. Reid and P. Petocz, eds, Engaging with Learning in Higher Education. UK: Libri Publishing, pp. 95-114.
Lawther, S., Foster, E., Mutton, J. and Kerrigan, M., 2015 (forthcoming). Can the use of learning analytics encourage positive student behaviours? In: Jane, G., Nutt, D. and Taylor, P., eds, Student behaviour and positive learning cultures. UK: SEDA.
Sclater, Niall (2014, November). Learning analytics: the current state of play in UK higher and further
education. Jisc. http://repository.jisc.ac.uk/5657/1/Learning_analytics_report.pdf
Sclater, Niall (2015, July 1). Notes from UK Learning Analytics Network event in Nottingham. Effective Learning Analytics. Jisc. http://analytics.jiscinvolve.org/wp/2015/07/01/notes-from-uk-learning-analytics-network-event-in-nottingham/