CASE STUDY I: Predictive Analytics at Nottingham Trent University


Learning Analytics in Higher Education

A review of UK and international practice

One of the most prominent learning analytics initiatives in the UK is at Nottingham Trent University
(NTU). The institution-wide rollout of the NTU Student Dashboard to facilitate dialogue between students,
their personal tutors and support staff has seen widespread uptake, positive impacts on student
engagement, and a change in organisational culture towards a more data-driven approach across the University's business. The initiative was awarded the 2014 Times Higher Education award for Outstanding
Student Support.

Key takeaway points

The NTU Student Dashboard measures students' engagement with their course; the University has found engagement to be a stronger predictor of success than background characteristics

Engagement scores are calculated from VLE access, library usage, card swipes and assignment
submissions

Tutors are prompted to contact students when their engagement drops off; staff find this a valuable
resource

The provision of the Dashboard has helped to build better relations between students and personal
tutors

The Dashboard is having positive impacts on behaviour: tutors discuss engagement with their
students and some learners find that seeing their own engagement is a positive spur to stay
engaged

The Dashboard has been developed to achieve key institutional academic objectives; the project is
led by the PVC Academic and delivered by a project team comprising academics, students,
educational developers, IT staff and the developers

Transparency and a close partnership approach have been critical to the success of the initiative, and have reduced ethical concerns about the use of student data

The provision of the Dashboard is now expected by staff and students, and the project has helped to
extend the culture of data-driven decision making across the University


Rationale
For NTU, the key strategic driver is enhancing the academic experience for its 28,000 students, particularly
their engagement with their course. Three more detailed goals were developed from this initial aim:

to enhance retention[1]

to increase a sense of belonging within the course community, particularly with tutors

to improve attainment

Earlier institutional research by Ed Foster, Student Engagement Manager at NTU, had identified that, in
line with other studies, up to a third of students had considered withdrawing at some point during their
first year. These "doubters" were less confident about coping with their studies, less engaged with their
course, formed weaker relationships with peers and tutors, and were ultimately more likely to withdraw
early. This apparent detachment from teaching staff meant that tutors were at risk of putting effort into
assisting those who requested support, rather than those who most needed it.
Focussing on overall student engagement rather than the risk of withdrawal has shaped the Dashboard's design and development in four ways:

1. Students[2] and staff see precisely the same information:
   i.   background information
   ii.  most recent engagement with the data sources (for example VLE use)
   iii. engagement compared to their peers on the same course
   This means that while staff can use the data to initiate conversations with students, the students can be their own agents of change. Seeing that their engagement is lower than that of their peers may prompt them to improve it.

2. Because a key goal is to build relationships between tutors and students, the Dashboard sends an email alert to tutors when student engagement stops for two weeks. These alerts encourage staff to make early contact with students who are disengaging from their studies rather than waiting for more traditional warning signs such as assessment deadlines not being met.

3. The Dashboard was designed to be used by tutors as they work directly with students; they can make notes about agreed actions in the Dashboard.

4. NTU specifically chose not to make use of demographic data in the analytics model, choosing only to use data relating to student engagement with their course.

[1] However, retention is not a particular problem at NTU, with around 93% of students progressing from the first to the second year.
[2] Importantly, students only see information about themselves, not their fellow students.


Figure 1: Student Dashboard, staff login view

The initial project


NTU already had a strong track record of evidence-based quality management, good strategic information
for senior managers and, importantly, held a significant amount of relevant data in a mature Cognos data
warehouse implementation. However, data was primarily used for operational and quality management
purposes, not student/staff interactions. In 2013 the University initiated a pilot to test the feasibility of using learning analytics. A project group was sponsored by the PVC Academic[3], with representatives from academic departments, Information Systems, the Centre for Academic Development & Quality, and the Students' Union. Members included experts in data analysis, diversity and student support.
A procurement process was initiated, and each interested vendor was offered the opportunity to crunch
five years of anonymised data from seven data sources to see what insight could be obtained.

[3] Professor Chris Pole led the pilot phase; his successor, Professor Eunice Simmons, led the full institutional implementation.


Solutionpath was selected as it was the most successful at sorting out and gaining insight from some very complex data. This proved to be an effective partnership: the technical work was carried out mainly by Solutionpath; the University coordinated that work, and various IT, academic and research staff were deployed as required.
The Dashboard was initially tested with four courses, forty tutors and over 500 first-year students. There were some concerns from within the institution about student privacy, potential duplication with existing systems and value for money; however, by the first pilot evaluation point (Christmas 2013), positive feedback from both students and staff had largely allayed these anxieties. The pilot provided useful information about integrating systems, student and staff expectations, and ways to incorporate the Dashboard into normal working practices.
The pilot successfully demonstrated the value of learning analytics, and the Dashboard was implemented
throughout the whole University in September 2014. It was launched using a range of communication
activities including roadshow presentations in all academic schools. For the most part these briefings were
used to introduce learning analytics, discuss the ethical and process implications and explore good
practice. The tool itself is straightforward to use, requiring only limited training. The impact on academic
time required to make use of it was minimal; this was important for its adoption.

Figure 2: A student profile as viewed by their tutor


Data sources and indicators of engagement


Data comes from a number of sources and is collected in an NTU data warehouse. Biographical data such as the student's enrolment status is drawn from the student information system[4]. Actual engagement data comes from four separate systems: the virtual learning environment (VLE), the card access database, the assessment submission system and the library system. The indicators used in the analytics are calculated from door swipes into academic buildings, visits to the VLE, the submission of assignments and a count of library loans. These were selected from a range of variables by the Solutionpath algorithm, which identified them as the most significant indicators of student engagement. Each student receives one of five engagement ratings: High, Good, Partial, Low and Not Fully Enrolled.
The analytics model at the core of the system is the intellectual property of Solutionpath. NTU does not know the core algorithms driving the resource. However, as part of the procurement process, the developers demonstrated the accuracy of the tool to the University by correctly predicting the students most at risk of early departure at two census points. Furthermore, in the 2013-14 pilot year the project team analysed the relationship between engagement as measured in the Dashboard and academic attainment (see Findings and outcomes below).
The team chose not to include student demographics in the predictive algorithms. In early discussions the developers found some evidence to show an impact based on who students were. However, as Ed says, "The Dashboard is about student engagement. We didn't want to be in a situation where two students might be engaging in precisely the same way with their studies but we gave one a lower score because they were male, or came from a poorer socio-economic background. Students can't do anything about their background characteristics, but they can do something about the way that they engage with their course. Remember students can see their own Dashboard and we want this to be a tool that motivates them."
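The scoring model itself is Solutionpath's intellectual property and is not published, so any calculation shown here can only be illustrative. The minimal Python sketch below shows the general shape of the mapping described above, from counts drawn from the four engagement sources to one of the five ratings; the class, weights and thresholds are invented for the example and are not NTU's or Solutionpath's.

# Purely illustrative: the production model is Solutionpath's proprietary IP and
# its real weights and thresholds are not known to NTU, so everything below is
# an invented example of the general idea, not the actual algorithm.
from dataclasses import dataclass

@dataclass
class WeeklyActivity:
    """Hypothetical weekly counts from the four engagement data sources."""
    vle_visits: int      # accesses to the virtual learning environment
    door_swipes: int     # card swipes into academic buildings
    library_loans: int   # items borrowed from the library
    submissions: int     # assignments submitted

def engagement_rating(activity: WeeklyActivity, fully_enrolled: bool = True) -> str:
    """Map raw activity onto one of the five ratings used in the Dashboard:
    High, Good, Partial, Low and Not Fully Enrolled. The weighted sum and the
    cut-off points are toy values chosen only to make the example run."""
    if not fully_enrolled:
        return "Not Fully Enrolled"
    score = (1.0 * activity.vle_visits
             + 0.5 * activity.door_swipes
             + 2.0 * activity.library_loans
             + 5.0 * activity.submissions)
    if score >= 30:
        return "High"
    if score >= 15:
        return "Good"
    if score >= 5:
        return "Partial"
    return "Low"

# Example: modest VLE use, some campus presence and one submission this week.
print(engagement_rating(WeeklyActivity(vle_visits=8, door_swipes=6,
                                       library_loans=1, submissions=1)))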

Dashboards and interventions


The Dashboard was developed both for students to sense-check their engagement against their peers and for tutors to discuss the student's performance with them. Two visualisation charts show a progress line indicating engagement, comparing individual students with the rest of the cohort, and showing the student's engagement rating. The tutor can drill down to the level below the rating to gain a better understanding of the student's recent engagement with the course.

[4] This information is not used in the analytics algorithm, but is used to provide staff with contextual information and to conduct cohort-level analysis.


Figure 3: A student's cumulative engagement rating compared with the course mean. This view provides students with a good overall view of their engagement for the year; however, it is less immediately responsive.

Automated email alerts are sent to tutors when students have had no engagement for a fortnight. The University made the decision to send these to the students' tutors, not directly to the students themselves. This was done specifically to increase opportunities for dialogue between tutors and students, and to help build students' sense of belonging to the course community. Carefully worded, these emails inform the tutors that they may already know the learner's circumstances but ask them to consider contacting them. Ed says, "We're trying to initiate conversations rather than replace them." Mike Day, Director of Information Systems, adds, "However, we are interested in finding ways that we can use the technology to improve the dialogue. For example, we might develop the system to generate more personalised emails to tutors, for example where engagement drops from good to partial, where students don't use the library by a particular cut-off date, or for students with a history of late submission. Although it would be necessary to avoid spamming tutors with too many alerts."
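The alert rule described above is simple enough to sketch: if a student has recorded no engagement for a fortnight, their personal tutor (not the student) receives a carefully worded prompt. The record layout and the send_email stand-in below are assumptions made for illustration rather than a description of NTU's actual system.

# Illustrative sketch of the fortnight rule: alert the tutor, not the student,
# when no engagement has been recorded for 14 days. The record layout and the
# send_email helper are assumptions, not NTU's implementation.
from datetime import date, timedelta
from typing import Iterable, NamedTuple

class StudentRecord(NamedTuple):
    name: str
    tutor_email: str
    last_engagement: date  # most recent VLE visit, door swipe, loan or submission

ALERT_THRESHOLD = timedelta(days=14)

def send_email(to: str, subject: str, body: str) -> None:
    """Stand-in for whichever mail service the institution actually uses."""
    print(f"To: {to}\nSubject: {subject}\n{body}\n")

def alert_tutors(students: Iterable[StudentRecord], today: date) -> None:
    """Email each personal tutor whose tutee has been inactive for a fortnight."""
    for s in students:
        if today - s.last_engagement >= ALERT_THRESHOLD:
            send_email(
                to=s.tutor_email,
                subject=f"Engagement alert: {s.name}",
                body=(f"{s.name} has recorded no engagement since "
                      f"{s.last_engagement:%d %B}. You may already know their "
                      "circumstances, but please consider contacting them."),
            )

alert_tutors([StudentRecord("A. Student", "tutor@example.ac.uk", date(2015, 10, 1))],
             today=date(2015, 10, 20))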


Figure 4: Week-by-week engagement score (for the same student as in Figure 3) compared with the course mean. This gives a more responsive view and students can see dramatic changes to their engagement within a week.

Ethics, legal issues and presentation


Overall, staff and students can see that there are benefits from the use of learning analytics. There are, of course, some concerns that there is the potential to misuse the data; these appear to be expressed more frequently by staff than students. At each stage of the Dashboard's development the project team made great efforts to communicate with both students and staff. The ideas were tested with students, including the Students' Union elected officers and the student representatives of the Information Services Student Committee. On the legal side, NTU is using the same core systems as it was previously, so existing computing use policies and student enrolment conditions already provide a sufficiently robust legal framework to cover the Dashboard's operation. However, in September 2015 the University implemented a policy for the "Use of learning analytics to support student success"[5]. The Director of Academic Development, Jane McNeil, stressed that "It was important that the University articulate the academic context for learning analytics and acknowledge the limitations of such data."
How the Dashboard is portrayed to students and staff was thought through carefully by the project team. Staff involved in the project have deliberately used positive and inclusive language, and have talked about "attainment" rather than "retention" and about "success for all" instead of "narrowing the gap".

[5] This policy can be found in the NTU Quality Handbook, Section 14, Learning and Teaching: http://ntu.ac.uk/adq/document_uploads/quality_handbook/150956.pdf


Figure 5: Student view of the Dashboard showing their engagement over the previous five days, along with an explanation of what the engagement ratings mean.

Findings and outcomes


The NTU project team has found two key operational outcomes from this work. Firstly, there is the knowledge and understanding that comes from the analytics. Secondly, the Dashboard itself is a valuable resource for staff seeking to build relationships with students. Previously tutors could access the information now available in the Dashboard, but doing so was time-consuming and, in some cases, extremely difficult.


Using data from the pilot year, a positive relationship between student engagement and both progression and attainment has been identified[6]. Although only four courses took part in the pilot year, it was possible to extract anonymised daily engagement data for all students from the start to the end of the 2013-14 academic year. Analysis of this data showed that fewer than a quarter of students with a low average engagement progressed from the first to the second year, whereas over 90% of students with good or high average engagement did so. Interestingly, students with good average engagement had a marginally higher rate of progression than those with high average engagement; this will require further analysis. Overall, a student's engagement was a far more important predictor of progression than data such as background characteristics or entry qualifications.

Progression status by engagement rating (year 1 full-time UG students)

                           Low          Satisfactory  Good         High
                           engagement   engagement    engagement   engagement
Other (n=76)               3.1%         1.2%          0.7%         0.2%
Withdrawn (n=310)          18.5%        4.0%          2.8%         4.0%
Transfer (n=143)           5.3%         2.2%          1.1%         1.3%
Repeating (n=364)          12.4%        6.2%          2.0%         2.5%
Academic failure (n=281)   36.5%        3.1%          0.8%         0.4%
Progressed (n=5,836)       24.2%        83.4%         92.5%        91.7%

Table 1: The association between average engagement[7] and student progression in 2013-14 (Foster, Kerrigan & Lawther, 2016, forthcoming)
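The cohort-level analysis behind Table 1 amounts to cross-tabulating progression outcome against average engagement rating and expressing each engagement band as column percentages. The snippet below shows one way such a table could be produced with pandas; the field names and the tiny made-up dataset are illustrative, since the anonymised NTU data is not public.

# Sketch of the cross-tabulation behind Table 1, using a tiny made-up dataset.
# Field names and values are illustrative; the anonymised NTU data is not public.
import pandas as pd

students = pd.DataFrame({
    "avg_engagement": ["Low", "Low", "Satisfactory", "Good", "Good", "High"],
    "outcome": ["Academic failure", "Progressed", "Progressed",
                "Progressed", "Withdrawn", "Progressed"],
})

# Percentage of each engagement band ending in each progression outcome,
# mirroring the column-wise percentages shown in Table 1.
table = (pd.crosstab(students["outcome"], students["avg_engagement"],
                     normalize="columns")
         .mul(100)
         .round(1))
print(table)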

The same team analysed the relationship between average engagement and academic attainment for final-year students. Once again there was a strong association between engagement and final degree classification: 81% of students with a high average engagement graduated with a 2:1 or first-class honours degree, whereas only 42% of students with a low average engagement achieved the same.
In a survey of first-year students (460 respondents), 27% said that they had changed their behaviour in response to data on the Dashboard, for example increasing their presence on campus, taking out more books and accessing the VLE more[8]. Interestingly, some students also became better engaged with academic activities such as doing more independent learning, even though these factors are not measured in the Dashboard. The researchers also found a number of examples of students using the system to challenge themselves and their peers. For example, one student described how they used the Dashboard to benchmark their own engagement: "When you can track and see how you are doing it gives you more of a push if you are dropping down and you don't want to drop out."
The project team also saw examples of students competing with one another to have the highest engagement score[9]. Ed acknowledged that there is a chance that the University might discover that the data helps those students who are already articulate and engaged. NTU will be exploring these issues in much more detail in the coming years.

[6] See Foster, Kerrigan & Lawther, 2016, forthcoming.
[7] The four engagement ratings listed in the two tables are those used in 2014-15. In 2015-16, "satisfactory" was replaced by "partial" and a new heading, "not fully enrolled", was added.

Final degree awards by engagement classification

                                 Low     Satisfactory  Good    High
1st Class Honours                13.2%   15.2%         20.9%   28.0%
2nd Class Honours, 1st Division  28.6%   47.0%         55.2%   52.5%
2nd Class Honours, 2nd Division  28.6%   30.5%         21.5%   17.0%
3rd Class Honours                11.0%   4.9%          1.4%    1.8%
Other (Ordinary Degree)          18.7%   2.4%          1.1%    0.6%

Table 2: The association between average engagement and final degree classification (percentages within each engagement classification).

Another impact is that tutors are changing how they work with students because of the Dashboard. They
find that the improved information on individuals allows them to target their interventions more
appropriately. Most tutors in the initial pilot used the Dashboard once per week with a minimal impact on
their workload; one third of tutors contacted students as a result of viewing their engagement data in the
Dashboard. Typical tutor comments were:

"It saves so much time by having the information to hand in one place for a student."

"[It is] Useful to get a sense of the student's history. Useful to be prompted to send out an email to a disengaged student and useful to have a template for guidance on this."

"Tutors can immediately have a conversation and build on it."

[8] See Lawther, Foster, Mutton & Kerrigan, 2015, forthcoming.
[9] Students could not see one another's dashboards, so groups of friends and peers would sit next to one another in computer labs, or compare screens on their phones to compare ratings.

As with all projects aimed at rolling out technology across an institution, the use of learning analytics has required accompanying changes to culture and practice. The staff and students who have used the Student Dashboard would not give it up. Professor Eunice Simmons, the PVC Academic with overall responsibility for the initiative, feels: "it's becoming business as usual."

Future plans
The approach that is being taken now is closely aligned with the new institutional strategy. Improving the student-staff academic experience was key in the earlier strategy. However, the emphasis now is on creating opportunity: a broader set of experiences than just the academic one, which aims to help students achieve their overall ambitions. Personalisation and tailoring education to individual students are stronger objectives in the new strategy, so learning analytics will be crucial in facilitating this over the next five years.
During the pilot and initial year of implementation, students and staff have made dozens of suggestions about how the Dashboard could be improved. The project team has prioritised the most immediate goals and is working with the vendor on the next phases of work. The first is integrating additional data sources into the Dashboard; the most important of these is an attempt to capture attendance data accurately. In those buildings where students already use their cards to swipe in, further attendance data may not make much difference to the model. However, as some staff feel that the engagement ratings cannot be accurate unless they include some aspect of attendance monitoring, this is important.
There is also a project currently looking at e-books. There are around one million physical books in the library and approximately six million e-books and e-journals; student engagement with these electronic resources is not currently measured by the system and should prove a useful additional data source. NTU is also integrating electronic feedback from tutors into the Dashboard so that both tutor and student can see all the relevant information in one place.
The second phase is focussed much more on operation in the real world. NTU believe that the Dashboard
provides an accurate picture of student engagement, but more work is needed to explore how they turn
this data into operational practices. The University is leading an Erasmus+ research project with KU
Leuven and the University of Leiden; the Achieving Benefits from Learning Analytics (ABLE) Project will
explore how to use learning analytics data to improve academic and student support.
While Mike considers the tools available to be more than adequate for the time being, there is a desire to experiment with the way that students and staff use the resource. Presenting the data differently might help change students' behaviour. For example, including options for setting personal engagement targets, or seeing the ratings for the most engaged students on the course, might work with particular groups of students. Meanwhile Solutionpath, the developers, are currently working on a mobile app version of the Dashboard.


Further work is needed in areas such as management information about student engagement and the potential to test different types of engagement alerts or reports to tutors, or even to experiment with direct messages to students. Many staff have a strong desire to use the Dashboard to provide a wide range of information. While the project team is keen to try to meet these needs, they are also keen to keep the design of the Dashboard clean and uncluttered. The visualisations need to remain simple for staff and students to understand, and must not add to tutor workload.
The model is currently more accurate for full-time students because their patterns of engagement are easier to model. Part-time students participate in blocks, so they might work quite intensively for one month and much less the next. Solutionpath are currently recalibrating the model to represent more accurately the spread of engagement typical of part-time students. However, the University will continue to explore whether or not the algorithms need further customisation to suit different types of academic engagement.
Although overall student retention is high at NTU, there is a recognition that, as is the case across the sector, there are attainment gaps for some student groups (for example widening participation students). Whilst demographic data is not included in the learning analytics model, it has been used to better understand the engagement of different student groups. The Dashboard is starting to provide valuable data that will be used to develop future interventions for the University's Success for All strategy.
The cross-institutional perspective that has been developed at NTU as a result of this work is considered
very valuable. Analytics is giving the University insight into ways to continually improve processes in many
aspects of institutional business, and colleagues are now thinking about how they can apply those
techniques to other things they do and potentially other stages of the student lifecycle.
NTU is not claiming that what they have done is fully comprehensive, but they do feel that they are far ahead of where they were two years ago when they started. There is now general acceptance by staff and students that use of the Dashboard is helping to enhance the student experience, evidence that interventions resulting from the analytics are having an impact on student engagement, and a transformation in the way that data is being used across the institution to inform decision making.
The development of learning analytics at NTU has been effective because of an integrated partnership
approach. The work was framed by research into student needs, was placed within an overall academic
governance structure and developed by a project team of academics, students, IT staff and educational
developers working with the external supplier. Ed Foster and Mike Day suggest a number of fundamental
lessons for any institution undertaking analytics projects:

Learning analytics resources are only as useful as the action they initiate.

It's important to ask the question: "Is what we're doing in the best interests of each individual student?" When NTU did that, it helped them take the right path.

The end users need to be involved in the development of the resource. The project goals need to be transparent and comprehensible to them, and effective communication with them is time-consuming.


There is still a lot to do. This is not primarily about the technology or the data but how it shapes
institutional practice in the best interests of all stakeholders.

Acknowledgements
Jisc is grateful to Mike Day and Ed Foster from Nottingham Trent University for their input to this case study, which has been developed from the research publications of Ed and his colleagues (see References) and from interviews with and presentations given by Mike and Ed over the past year.
Mike Day is an experienced CIO, having fulfilled senior leadership, education management and IT roles with Nottingham Trent University, the University of Lincoln, the National College for School Leadership and the Royal Air Force. He has a strong
interest in digital technology as it applies to higher education and has a reputation
for innovation, leading the team that won the Times Higher Award for Outstanding
Student Support in 2014 for its work on learner analytics. He has also collaborated
on award-winning flipped learning and online digital initiatives. His experience
includes developing business-focused digital and IT strategies, managing change
programmes, building high performing teams, project and programme management, business
intelligence, and system design and development. He is a graduate of the University of Nottingham.

Ed Foster is the Student Engagement Manager at Nottingham Trent University. His remit is to find ways to improve students' engagement with their studies, primarily through research-informed institutional change. Ed is the institutional lead for the NTU Student Dashboard. He also heads the Erasmus+ funded ABLE project (Achieving Benefits from LEarning analytics) in partnership with colleagues from the Katholieke Universiteit Leuven and Universiteit Leiden. The ABLE Project explores how students can be better supported once learning analytics have identified that they are potentially at risk. Ed is an expert in the fields of student transition and retention.

References
Day, Mike (2015, Feb 20). NTU Student Dashboard. Learning Analytics Network, University of East London.
http://analytics.jiscinvolve.org/wp/files/2015/02/Jisc-LA-Network-Mike-Day.pdf
Foster, Ed (2015, June 24). What have we learnt from implementing learning analytics at NTU? Jisc Learning
Analytics Network, Nottingham Trent University.
http://analytics.jiscinvolve.org/wp/files/2015/07/jisc-la-network-ed-foster-ntu.pdf
Foster, E., Borg, M., Lawther, S., McNeil, J. and Kennedy, E., 2014. Using Student Engagement Research to Improve the First Year Experience at a UK University. In: C. Bryson, ed., Student Engagement in Challenging Times. UK: Routledge.


Foster, E., Kerrigan, M. and Lawther, S., 2016 (forthcoming). What Matters is Performance, not Predicted
Performance: the impact of student engagement on retention and attainment. Journal of Learning
Development in Higher Education.
Foster, E., Lawther, S., Lefever, R., Bates, N., Keenan, C. and Colley, B., 2012. HERE to stay? An exploration of student doubting, engagement and retention in three UK universities. In: I. Solomonides, A. Reid and P. Petocz, eds, Engaging with Learning in Higher Education. UK: Libri Publishing, pp. 95-114.
Lawther, S., Foster, E., Mutton, J. and Kerrigan, M., 2015 (forthcoming). Can the use of learning analytics encourage positive student behaviours? In: Jane, G., Nutt, D. and Taylor, P., eds, Student behaviour and positive learning cultures. UK: SEDA.
Sclater, Niall (2014, November). Learning analytics: the current state of play in UK higher and further
education. Jisc. http://repository.jisc.ac.uk/5657/1/Learning_analytics_report.pdf
Sclater, Niall (2015, July 1). Notes from UK Learning Analytics Network event in Nottingham. Effective
Learning Analytics. Jisc. http://analytics.jiscinvolve.org/wp/2015/07/01/notes-from-uk-learninganalytics-network-event-in-nottingham/
