Nand et al. Smart Learning Environments (2019) 6:6  https://doi.org/10.1186/s40561-019-0085-2

RESEARCH Open Access

Engaging children with educational content via gamification

Kalpana Nand1,4, Nilufar Baghaei1,2*, John Casey1, Bashar Barmada1, Farhad Mehdipour2 and Hai-Ning Liang3

* Correspondence: nilufar.baghaei@op.ac.nz
1 Department of Computer Science, Unitec Institute of Technology, Auckland, New Zealand
2 Department of Information Technology, Otago Polytechnic Auckland International Campus (OPAIC), Auckland, New Zealand
Full list of author information is available at the end of the article

Abstract
Gamification is the application of game mechanisms in non-gaming environments with the objective of enhancing user experience. In this paper, we investigate the effectiveness of gamification in an educational context, i.e. teaching numeracy at the primary school level. We study the appealing characteristics of engaging computer games from children's point of view, and investigate whether embedding the proposed characteristics into an educational tool enhances children's learning. The main characteristics we identify are levels of difficulty, feedback on the current level, and graphical presentation. They were then embedded into a Java-based open source programme based on the "Who wants to be a millionaire" TV show, with the aim of teaching children numeracy (level 5 of the New Zealand curriculum). Two versions were created: a feature enriched game (FEG) with all the features enabled and a feature devoid game (FDG) with no extra features. We present the results of an evaluation study conducted with primary school children (n = 120) over a period of two weeks. The effectiveness of the educational tool was measured using a pre-test and a post-test, as well as other indicators such as the frequency and duration of interaction. The results show that the FEG version was more effective in enhancing children's learning and that children found it more engaging.

Keywords: Designing educational tool, Computer game, Children, Learning, Engagement, Gamification

Introduction
Concerns over computer-based games having negative impacts on children and their interaction with society are increasing, especially regarding the effect of violent themes contained in a large proportion of games and the effect of extended periods of game playing and overstimulation of children (Walsh & Gentile, 2008; Shokouhi-Moqhaddam et
al. 2013). Other studies have looked at the positive outcomes of games when they are
used as a source of information and for enhancing children’s learning (Mitchell and
Saville-Smith, 2004; Chen et al. 2011; Baghaei et al. 2016; Yusoff et al. 2018). Computer
games have intrigued a lot of researchers because of their potential to entice and engage
the player’s attention for extended periods of time (de Freitas 2018).
There are certain attributes of computer games which contribute to how well they
are received by the players. Designers of educational tools can aim to integrate these
attributes to maximise the tool’s effectiveness in increasing learning outcomes, level of
engagement and motivation. Gamification is the application of game-design elements
and principles in non-game contexts (De-Marcos et al. 2014; Robson et al. 2015; Nehring et al. 2018). It employs game design elements to achieve different goals, including improving user engagement (Hamari, 2015; Nehring et al. 2018), user behaviour (Reddy, 2018), organizational productivity (Zichermann & Cunningham, 2011) and learning (Denny, 2015; Gooch et al., 2016; Cózar-Gutiérrez et al. 2016). Gamification has been shown to increase learners' engagement with course materials and improve their motivation, learning participation and collaboration (Denny, 2015; Dicheva et al. 2015; Nehring et al. 2018).
According to Prensky (2001), a prerequisite of successful learning is motivation. He argues that a lot of what is in the curriculum is not motivating for students these days, yet the same children are motivated and excited to play video games for long periods. What is notable, according to Prensky (2001), is that some children's attitude toward video games is the opposite of the attitude they have toward learning in school. One
way of getting children motivated is to design educational tools which are as engaging
and motivating as popular commercial games. These tools can be integrated with the
curriculum to enhance children’s motivation and learning.
The aim of this research project is to extract the characteristics of popular commercial games which are able to engage and motivate players, and to embed those characteristics in an engaging educational tool. Our main research question is "what are the main characteristics of effective computer games that engage a player for such long periods?" This paper seeks to explore and examine those characteristics and to design engaging educational tools based on them. The proposed educational tools are intended to be used as part of the primary school curriculum.
We begin by examining the relevant literature on the characteristics of engaging computer games and by collecting the opinions of 120 children, aged between 9 and 10, enrolled in an Auckland primary school. We then apply our findings to design an educational tool that incorporates those characteristics. We believe our research paves the way for the systematic design and development of full-fledged engaging educational tools.
The remainder of this paper is organized as follows. Section 2 reports on the current
literature. Section 3 outlines the research questions followed by methodology in Section
4. The modified game is presented in Section 5. We then describe the evaluation study
and the results in Sections 6 & 7 respectively, followed by conclusions and future work.

Related work
Learning is conventionally defined as the "process of acquiring competence and understanding" (Zhu et al., 2016), with competence being described as possessing specific skills and understanding as possessing specific knowledge. The use of technology, such as computer games, to enhance students' learning in the classroom is a timely topic that permeates a lot of educational literature today. This is particularly important in today's society, where students can access new games easily via platforms beyond desktop computers and gaming consoles. Video and computer game design has been studied by various researchers interested in finding out how different aspects of game design could be utilised in developing educational tools (e.g., Malone, 1981; Dickey, 2003; Dondlinger, 2007; Pinelle et al. 2008). Researchers have been interested in figuring out not just how games function but what features make them engaging.

A lot of studies over the years have shown that children's learning increased as a result of playing computer games. Research (e.g., Csikszentmihalyi, 1990; Provost, 1990;
Rogoff, 1990; Clark et al. 2018) has shown that game playing makes up a vital element
of a child’s cognitive and social development. These studies assert that children learn
more from playing and carrying out “hands-on” activities than by being simply asked to
“recite” information from books.
According to Vygotsky (1976), children learn by playing with others, creating and improving their zone of proximal development; as they play, they are more involved in
carrying out complex activities. Fisch (2005) has noted that children have learnt about
diverse subjects such as prehistory and asthma education by playing computer games.
The learning aspect of computer games has been further endorsed by Baghaei et al. (2016, 2017). In these studies, a set of design guidelines was proposed that can ideally be applied to any game aimed at teaching children how to manage their diabetes. The results showed that children enjoyed playing the game and their knowledge of diabetes and
healthy lifestyle increased as a result of playing the game. Other examples of educational and health-related games include (Halim et al. 2018; Hinds et al. 2017; Consolvo et al., 2006; Fujiki et al. 2008; Alankus et al. 2010; Berkovsky et al. 2010; Whitten et al. 2017; Baranowski et al. 2016). A lot of games stimulate thinking and curiosity, and the desired outcome, i.e. the desire to win, is what attracts players to a game. For any game to be successful, it must be able to engage the player and attract their attention.
Based on our experience (the first author works in a primary school), most of the educational games available in New Zealand schools are not motivating enough for students and lack the fun factor. Children are not as motivated to play these games as they are to play commercial computer and video games at home. There is a need to design and develop more useful and engaging educational games, which are relevant to the current New Zealand curriculum and can be integrated into day-to-day learning.
Parents are progressively accepting the notion of using computer games as an educational tool (Brand, 2012). A study commissioned by the Interactive Games & Entertainment Association (IGEA) found that "79 per cent of parents with children under the age of 18 play video games, and a further 90 per cent of this group do so together with their children" (Brand, 2012, p. 13). Furthermore, the report found that "92 per cent of parents believe video games are educational, with three-in-four actively using games as an educational tool with their children" (Brand, 2012, p. 13). The report showed that video games are increasingly embraced as teaching tools not only by parents but also by teachers in schools and tertiary environments.
There have been some studies on the systematic mapping of gamification applied to education (de Sousa Borges et al. 2014; Caponetto et al. 2014). Gamification, in an educational context, can be applied at elementary education, lifelong education, and higher education levels. In a practitioner's guide to gamification of education (Huang and Soman, 2013), the authors outline a five-step process: 1) understanding the target audience and the context, 2) defining the learning objectives, 3) studying the experience, 4) identifying the resources, and 5) applying gamification elements. When considering gamification, some key criteria to be considered are the duration of the learning programme, the location of the learning (for example, classroom, home, or office), the nature of the learning programme (for example, one-on-one or group), and the size of the class (or of the groups). Olsson et al. (2015) pointed out that in virtual learning environments users
usually feel lonely and puzzled in their learning journey; visualization and gamification may therefore be applied as solutions, although in their study the former worked better than the latter. They suggest that the effects of gamification are worth studying more deeply and widely across various learning styles. Urh et al. (2015) analysed the use of gamification in the e-learning process, including its advantages and disadvantages, and argued that there are opportunities to apply gamification in higher education. They stated that the application of gamification should be designed to meet project objectives, and thus different types of education, as well as different learning styles and personalities of learners, would affect the system development (Nehring et al., 2018). Gamification facilitates smart learning environments, which Merrill (2013) defines as environments that are effective, efficient and engaging.
Gamification has a lot of potential, but some effort is still required in the design and implementation of the user experience in order to further enhance participants' motivation and engagement with the platform.

Research questions
Our research questions, in this study, are as follows:

• What are the main characteristics of engaging and popular computer games for children?
• Can adding those characteristics to an educational tool enhance children's learning?

As the first step, we decided to explore those characteristics by collecting feedback from primary school children. We then designed an educational tool based on the feedback we collected.

Collecting users’ input


We selected a group of 120 children aged between 9 and 10 at Glen Eden Primary
School in Auckland, New Zealand. They were given a questionnaire and were asked to
choose 3 features (from a given list) of computer games that they found most
appealing.
As shown in Fig. 1, the following game attributes were most appealing:

• Challenges (CH): having different levels in the game
• Feedback (FB): knowing how many points were scored
• Graphics (GH): having realistic graphics

In order to dig deeper into realistic graphics, a further questionnaire was designed
and given to children. In this questionnaire children were asked to select three features
which stood out for them when describing what realistic graphics were.
As shown in Fig. 2, the children identified the following attributes as the three aspects of graphics they liked the most in a game:

• Colorful images
• Real life characters
• High definition

Fig. 1 Number of responses corresponding to each of the game features surveyed

Furthermore, children were asked to select the curriculum area in which they preferred a game to be designed. The topic-related part of the questionnaire included Science, Social Studies, Technology and Te Reo (Māori language). Results are shown in Fig. 3.
As shown above, a large number of children were interested in playing numeracy games. Some of the reasons given as to why they wanted a numeracy game developed included: "I want to get better at maths", "I want to learn my multiplication facts", "Learning maths in a game will be a fun way to learn" and "I don't like maths so playing a game and learning will be better".

Fig. 2 Number of responses showing the detailed attributes corresponding to realistic graphics

Fig. 3 Children's preferred curriculum areas

Game design
Driven by the three main characteristics identified by the target group and described in the previous section (i.e., CH, FB and GH), a variety of open source games were examined. We felt that JQuizShow, a Java-based open source game (http://quizshow.sourceforge.net/download.html), was a suitable option for the preliminary evaluation. The game is based on a television show ("Who wants
to be a Millionaire”) in which the participants are offered cash prizes for correctly
answering a series of multiple-choice questions in the order of increasing difficulty
levels. This game can be configured easily to include any content. New content
can be added by including the questions at various levels as a text file. Choosing
an incorrect answer at any point in the game ends the session, with a feedback
message saying the game can be played again from the beginning. Depending on
when the incorrect answer is given, the player can leave with either no money or a
certain amount. The amount a player can leave with depends on the level reached.
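As an illustration of how level-tagged content of this kind could be loaded, the sketch below assumes a hypothetical pipe-separated text format (level|question|four options|index of the correct option); this is an assumption for illustration and not JQuizShow's actual file layout.

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.ArrayList;
    import java.util.List;

    // Illustrative only: assumes a hypothetical "level|question|optA|optB|optC|optD|correctIndex"
    // line format, not the real JQuizShow content file.
    class Question {
        final int level;
        final String text;
        final String[] options;
        final int correctIndex;

        Question(int level, String text, String[] options, int correctIndex) {
            this.level = level;
            this.text = text;
            this.options = options;
            this.correctIndex = correctIndex;
        }
    }

    class QuestionLoader {
        static List<Question> load(String path) throws IOException {
            List<Question> bank = new ArrayList<>();
            for (String line : Files.readAllLines(Paths.get(path))) {
                if (line.isBlank()) continue;              // skip empty lines
                String[] f = line.split("\\|");
                bank.add(new Question(
                    Integer.parseInt(f[0].trim()),         // difficulty level (1-3)
                    f[1].trim(),                           // question text
                    new String[] {f[2], f[3], f[4], f[5]}, // four answer options
                    Integer.parseInt(f[6].trim())));       // index of the correct option
            }
            return bank;
        }
    }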
The game designed for this study has three guaranteed levels, indicated by amounts written in white font, in contrast to the rest of the amounts, which are written in yellow font (see Fig. 4). Once a player passes a level indicated by the amounts $1000, $32,000 and $1 million, the player can leave at any time with the money associated with the highest guaranteed level reached. This applies both when a player voluntarily chooses to leave the game and when the player gives an incorrect answer.
There are five chances for the player to leave with nothing: giving an incorrect answer to any of the five questions asked before the first guaranteed amount of $1000 is reached. After reaching $1000, this amount is guaranteed and subsequent questions are played for increasingly large sums (roughly doubling at each turn). The complete sequence of prizes is as follows: $100, $200, $300, $500, $1000, $2000, $4000, $8000, $16,000, $32,000, $64,000, $125,000, $250,000, $500,000 and $1,000,000. Note that incorrectly answering an intermediate-level question, e.g., the $4000 question, does not let the player leave with $4000, but only with the amount for the last guaranteed level reached, that is, $1000.
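A minimal sketch of the prize ladder and walk-away rule as described above (our reading of the stated rules, not the authors' implementation):

    // Sketch of the prize ladder and the guaranteed "walk-away" amounts described above.
    class PrizeLadder {
        static final int[] PRIZES = {
            100, 200, 300, 500, 1_000,
            2_000, 4_000, 8_000, 16_000, 32_000,
            64_000, 125_000, 250_000, 500_000, 1_000_000
        };
        static final int[] GUARANTEED = {1_000, 32_000, 1_000_000};

        // Amount the player leaves with after `answered` correct answers followed by
        // either quitting or an incorrect answer: the highest guaranteed amount at or
        // below the last prize reached (0 if no guaranteed level has been reached).
        static int walkAwayAmount(int answered) {
            if (answered <= 0) return 0;
            int lastPrize = PRIZES[Math.min(answered, PRIZES.length) - 1];
            int amount = 0;
            for (int g : GUARANTEED) {
                if (g <= lastPrize) amount = g;
            }
            return amount;
        }
    }

For example, walkAwayAmount(6) returns 1000, matching the $4000-question scenario above, and walkAwayAmount(3) returns 0.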
Fig. 4 a and b Screenshots from the game "Who wants to be a millionaire?"

For this game, the New Zealand Numeracy Curriculum was used in order to determine the level of question suitable for the children selected for the study. In order for the game to be enjoyable and engaging, it was necessary that the players
were given questions about which they had prior knowledge and which were neither extremely difficult nor "boringly" easy (a good solution was to provide a progressive level of difficulty). Their teachers were consulted and the numeracy levels of the children were taken into consideration. It was established that the children in the target group were at level 5 of the New Zealand Numeracy Curriculum. Three levels of questions were therefore developed at level 5 on the following topics:

• Level 1: Addition/Subtraction
• Level 2: Multiplication/Division
• Level 3: Combining all of the above operations

We developed two versions. The first version was a feature enriched game (FEG), which made extensive use of the three identified features (i.e., CH, FB & GH), and the second version, a feature devoid game (FDG), had an overt absence of these features.

Feedback feature implementation (FB)


In order to study the impact of feedback in the game, the feature was used multiple times, almost after every stage in the game. Feedback was implemented using floating dialogue boxes as well as permanent fixtures of the game screen. Apart from the transient feedback, permanent feedback based on the level of the question being answered and the amount of money in the bag is provided on the score screen in the top right-hand corner.
The "help" options were in the form of fifty-fifty, phone a friend and ask the audience. These were slightly different from those in the actual TV game the computer game is based on. The help options were made available via the three icons shown in Fig. 4 and their implementation is described below:

Fifty-fifty
The player can choose to have the computer randomly eliminate two of the incorrect answer choices, leaving a choice between the correct answer and one incorrect option. The player then selects an answer from these two choices.
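A short sketch of how the fifty-fifty elimination could work (illustrative only; it assumes option indices 0 to 3 and that the correct index is known to the game):

    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.List;

    // Sketch of the fifty-fifty lifeline: randomly remove two of the three
    // incorrect options, keeping the correct answer and one distractor.
    class FiftyFifty {
        static List<Integer> remainingOptions(int correctIndex, int optionCount) {
            List<Integer> wrong = new ArrayList<>();
            for (int i = 0; i < optionCount; i++) {
                if (i != correctIndex) wrong.add(i);
            }
            Collections.shuffle(wrong);              // pick one random distractor to keep
            List<Integer> remaining = new ArrayList<>();
            remaining.add(correctIndex);
            remaining.add(wrong.get(0));
            Collections.sort(remaining);             // preserve the on-screen option order
            return remaining;
        }
    }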

Phone-A-friend
In the television game, the player can phone one of three pre-arranged friends. Since this is not possible in a classroom setting, the player could instead ask one of three classmates, arranged before commencing the game, for an answer. The conversation between the friend and the player is timed, with a configurable duration; a value of 60 s was used. If the time expires, the game ends.
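A sketch of the timed lifeline, assuming a Swing timer and the configurable 60 s duration described above; `onTimeUp` is a hypothetical callback standing in for whatever ends the session:

    import javax.swing.JOptionPane;
    import javax.swing.Timer;

    // Sketch of the timed phone-a-friend lifeline: the player has a configurable
    // number of seconds to get an answer from a pre-arranged classmate; when the
    // timer fires, the session-ending callback runs, as described above.
    class PhoneAFriend {
        static void start(int seconds, Runnable onTimeUp) {
            Timer countdown = new Timer(seconds * 1000, e -> onTimeUp.run());
            countdown.setRepeats(false);             // fire once, after the full duration
            countdown.start();
            JOptionPane.showMessageDialog(null,
                "Ask your classmate now - you have " + seconds + " seconds.");
        }
        // Example (hypothetical callback): PhoneAFriend.start(60, () -> endSession());
    }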

Ask the audience


In the television game, the players get to ask the audience for help. In a classroom setting, we chose to let the players ask any of their classmates. This can involve shouting the question over to a friend in another corner of the room, building up even more excitement in the game.
In the FDG version, feedback was minimal. When a player selects an option, the answer is highlighted with a white box around it. If the option selected is correct, the answer is highlighted again in plain white and a prompt with the dollar amount won is shown; the player is not given any feedback about what to do next. If a player selects an incorrect answer, a prompt appears with $0 displayed on it, again with no indication of what to do next. On the score screen in the top right-hand corner of the main screen, the dollar amounts are displayed, but there is no indication during the game of how much the player has won, nor of what the guaranteed amounts are. Additionally, the FDG did not have any of the "help" options available in the FEG version, i.e., fifty-fifty, phone a friend or ask the audience.

Challenge feature implementation (CH)


In the FEG version, challenge exists in the form of the difficulty level of the questions. The number of levels is configurable and, for the purpose of this study, three levels were used. Each level contained a set of 5 questions, with level 1 being the easiest set. The game starts with 5 level 1 questions, after which the $1000 level is reached in terms of the money earned. At this point $1000 becomes a guaranteed take-home amount. The next set of 5 questions is then asked from level 2, after which $32,000 becomes a guaranteed take-home amount. Finally, the most difficult set of questions is asked from level 3, after which the player takes home $1,000,000. The increasing level of difficulty challenges the students to come back and play the game again if they get an answer incorrect, in order to achieve a higher level. There is a catalogue of questions stored in the game so that different questions are asked each time a player interacts with the game. New questions can easily be added to the catalogue.
It was difficult to design a version of the game without a challenge feature as the core
part of the game is to win increasing amounts of money, which in itself is a challenge
feature. However, in the FDG version, the challenge in terms of the difficulty level of
the questions was minimised. This was done by randomising the difficulty level of
questions instead of a gradual increase. Hence a player could encounter a level 3 (most
difficult) question to start with and get a relatively easy question towards the crucial
part closer to the end of the game dealing with winning a large sum of money. The
randomization of the level of questions was based on the premise that players encoun-
tering difficult questions at the start would feel discouraged and abandon the game in
the early stages and after a while stop playing altogether. Conversely, players answering
a relatively easy question at the point of winning a major prize would not feel the same
sense of achievement as they would if they won the same money by answering a diffi-
cult question.
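The two question orderings could be produced along these lines (a sketch reusing the Question type from the earlier loading example; the study's actual implementation is not published):

    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.Comparator;
    import java.util.List;

    // Sketch of the two conditions: the FEG asks five level-1, then five level-2,
    // then five level-3 questions; the FDG shuffles the questions so a hard one
    // may appear first and an easy one may decide the top prize.
    class QuestionOrdering {
        static List<Question> progressive(List<Question> bank) {   // FEG
            List<Question> ordered = new ArrayList<>(bank);
            ordered.sort(Comparator.comparingInt((Question q) -> q.level));
            return ordered;
        }

        static List<Question> randomised(List<Question> bank) {    // FDG
            List<Question> shuffled = new ArrayList<>(bank);
            Collections.shuffle(shuffled);
            return shuffled;
        }
    }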

Graphics feature implementation (GH)


The graphics feature covers both colour and sound. In terms of colour, the FEG version had a lot of attractive colours in all parts of the persistent screen as well as in the transient dialogues. The main screen has a black background and the questions appear in a blue-framed box. The questions are displayed in white font while the optional answers are displayed in yellow font against a black background. When a player selects an answer, the selection gets highlighted in orange. If the option selected is correct, then the correct answer gets highlighted in green. If the option selected is incorrect, then the correct answer gets highlighted in green while the incorrect answer remains highlighted in orange. The dollar amounts that appear on the score screen in the top right-hand corner of the main screen appear in a yellow font. The guaranteed amounts of $1000, $32,000 and $1,000,000 appear in a white font. When a player wins an amount of money, this amount is highlighted in bright red. In contrast to the use of
the bright colours, the FDG version was done in the two basic colours of black and
white.
In terms of sound, there is a soft, continuous background tune played while the FEG
version of the game is being played. When a correct answer is selected, a short, high
musical note is played to indicate that this is the correct answer. If an answer selected
is incorrect, then a short, low musical note is played to indicate that this is the incorrect answer. At the completion of the game, a clapping sound is played to congratulate
the player. In the FDG version all music was muted.
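A compact sketch of how the two presentation profiles could be toggled; the colour values follow the description above, while a system beep merely stands in for the high/low notes and clapping sound, whose implementation the paper does not specify:

    import java.awt.Color;
    import java.awt.Toolkit;

    // Sketch of the two presentation profiles described above: the FEG highlights
    // answers in colour and plays audio cues, the FDG stays black-and-white and silent.
    class PresentationProfile {
        final boolean richGraphics;   // true for FEG, false for FDG

        PresentationProfile(boolean richGraphics) {
            this.richGraphics = richGraphics;
        }

        Color highlightFor(boolean correct) {
            if (!richGraphics) return Color.WHITE;           // FDG: plain white highlight
            return correct ? Color.GREEN : Color.ORANGE;     // FEG: green / orange as described
        }

        void playAnswerCue(boolean correct) {
            if (!richGraphics) return;                       // FDG: all sound muted
            // A full implementation would pick a high or low note based on `correct`;
            // a system beep is used here as a placeholder.
            Toolkit.getDefaultToolkit().beep();
        }
    }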

Evaluation study
The study was conducted with 120 children aged between 9 and 10 at Glen Eden Primary School in Auckland. The study was approved by the Unitec Institute of Technology Ethics Committee. The participants were divided into a Control group and a Test group of 60 students each. Both groups were first pre-tested on the numeracy learning outcomes. The Test group was given the FEG version to play over a period of two weeks and the Control group was given the FDG version to play over the same period. Both groups were then given post-tests on the numeracy learning outcomes.
Both the FEG and FDG versions of the game were installed on the 12 available computers in the school library and, as time permitted, pupils in groups of 12 were given the games to play in a separate room with the computers. The Control and Test groups played at different times and were not able to see which version of the game the other group was playing; there was a deliberate attempt to keep the two groups' playing times separate. The students were allowed to play the game for about 20 min without any interference from the researcher or any of the other teachers. After a maximum of 30 min the students were stopped and allowed to go back to their classrooms.

Analysis of the results


Children's learning was our main dependent variable. In order to measure it, we used a pre-test, a post-test and interaction logs. The pre-test was conducted to measure student knowledge before using the educational tool and the post-test was used to measure the learning outcome after using the educational tool. The questions in the tests were similar to the ones used by teachers in assessing their students in numeracy. The pre-test and the post-test for each of the curriculum areas used the same questions. This gave us a direct measurement of the change in the learning outcome. The results are reported in Table 1.
As we can see in Table 1, the average scores increased after playing both versions of the game. The average for the Control group increased from 12.12 to 12.97 and for the Test group it went up from 12.87 to 14.77.
Table 1 Statistics for the pre-test and post-test scores

Statistic                  Control Group            Test Group
                           Pre-test    Post-test    Pre-test    Post-test
Count                      60          60           60          60
Average                    12.12       12.97        12.87       14.77
Std. Dev.                  4.30        4.21         4.55        3.51
Relative Std. Dev. (%)     35.5        32.5         35.4        23.8
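For reference, the relative standard deviation reported in Table 1 is simply the standard deviation expressed as a percentage of the mean (a standard definition, not spelled out in the paper):

    \mathrm{RSD} = 100 \times \frac{s}{\bar{x}}\ \% ,
    \qquad \text{e.g. Control group pre-test: } 100 \times \frac{4.30}{12.12} \approx 35.5\% .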

In absolute terms, these averages correspond to an increase of 0.85 (7%) for the Control group and 1.9 (14.8%) for the Test group. Thus, the percentage increase in the mean score is roughly twice as large for the FEG as for the FDG. A comparison of the post-test scores of the Control and Test groups (12.97 versus 14.77) also shows that the FEG was more effective in raising the performance level of the students. The t-test values are 3.63 × 10⁻¹⁰ for the Control group and 1.31 × 10⁻³¹ for the Test group. Both values are orders of magnitude smaller than 0.05, showing that the change in the learning outcome (post-test vs. pre-test) was statistically significant for both groups. Additionally, the t-test value for the Test group is orders of magnitude smaller than that of the Control group, implying a stronger effect of the FEG.
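The paper does not state which t-test variant was used; assuming a paired t-test on each student's pre- and post-test scores, the statistic whose p-value is compared against the 0.05 threshold would be

    t = \frac{\bar{d}}{s_d / \sqrt{n}}, \qquad d_i = \mathrm{post}_i - \mathrm{pre}_i, \quad n = 60,

where \bar{d} and s_d are the mean and standard deviation of the individual score differences.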
Also, as seen in Table 1, the standard deviation figures show a consistent decrease from pre-test to post-test in both the Control and Test groups. The standard deviation for the Control group decreased from 4.30 to 4.21 and for the Test group it decreased from 4.55 to 3.51. This shows that the scores are more closely clustered near the mean; however, the mean has also increased in value. Hence the decrease in the standard deviation value in combination with the increase in the mean value shows that playing the game between the pre-test and the post-test had the effect of increasing the scores of the participants. The relative changes in the standard deviation values of the Control and Test groups show that the effect was comparatively more pronounced for the Test group, indicating the effectiveness of the FEG version of the game.
Table 2 shows the average values of some of the other attributes of the experiment that were extracted from the log files. The participants in the Test group attempted more questions on average, provided more correct answers, spent more time playing the game and reached higher levels compared with the Control group. This indicates that the FEG version was better utilised than the FDG.
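The log-file attributes summarised in Table 2 could be captured with a per-session record along these lines (field names are illustrative; the study's actual log format is not published):

    // Sketch of a per-session interaction log capturing the measures summarised in
    // Table 2: questions attempted, correct responses, minutes played, level reached.
    class SessionLog {
        final String participantId;
        int questionsAttempted;
        int correctResponses;
        double minutesPlayed;
        int levelReached;

        SessionLog(String participantId) {
            this.participantId = participantId;
        }

        void recordAnswer(boolean correct, int level) {
            questionsAttempted++;
            if (correct) correctResponses++;
            levelReached = Math.max(levelReached, level);
        }
    }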
The bar graph in Fig. 5 shows the maximum level reached by the participants, rather than the average shown in the last row of Table 2. These results further illustrate that the game features integrated into the learning tool were effective: the Test group attempted higher levels of questions than the Control group. The higher levels attained indicate that the students effectively learned more by staying at the learning task for longer. Conversely, the participants in the Control group were not able to progress as much, probably because of a lack of motivation.
The results show that the FEG version significantly improved learning outcomes for numeracy; however, it could be further improved by adapting the game for more fact manipulation or cognitive-based curricula.

Table 2 Interaction logs for the Test and Control groups (numeracy; average per participant)

Measure                              Control    Test
No. of participants                  60         60
No. of questions attempted           9          11.5
No. of correct responses             8          10.5
Time spent playing game (mins)       9.19       10.44
Level reached                        1.8        2.3

Fig. 5 Maximum Game Levels Attained by Test and Control Group

Cognitive learning is defined as "learning that is concerned with acquisition of problem-solving abilities and with intelligence and conscious thought" (Cognitive Learning, 2018). Numeracy learning is based on problem solving, and the game used in this research did not give children an opportunity to practice problem-solving skills. To learn mathematics, students must be engaged in exploring, estimating and thinking rather than recall-based learning. Numeracy learning involves understanding the concepts and meanings underlying the operations, as opposed to merely applying rules. The most important premise of numeracy learning is that when students understand the concepts and reasoning underlying a process, they are more likely to be able to apply that process correctly. The game reinforces previously introduced skills and concepts, but it does not teach players new concepts. Constructivist theorists such as Piaget (1970), Vygotsky (1978) and Bruner (1960) assert that when students construct personal knowledge derived from meaningful experiences, they are much more likely to retain and use what they have learned. Hence any learning tool, such as the game designed for this study, should be able to suitably support this.
As part of defining a research framework for smart education, Zhu et al. (2016) propose a four-tier architecture of smart pedagogies: 1) Class-based differentiated instruction (covering basic knowledge & core skills), 2) Group-based collaborative learning (covering comprehensive ability), 3) Individual-based personalised learning (covering personalised expertise) and 4) Mass-based generative learning (covering collective intelligence). Our contribution in this paper covers the first layer, i.e. class-based differentiated instruction, with the potential to expand to layer 2 (group-based collaborative learning) and layer 3 (individual-based personalised learning).

Conclusion & Future Work


In this paper we identified the most appealing characteristics of computer games by studying the related literature as well as surveying 120 primary school children. The key features identified were graphics, feedback and challenge. We then embedded those three characteristics into an educational tool to find out whether the modified version could enhance children's learning. A second version with minimal features was used by the control group.
The results showed that the FB, CH and GH features embedded into the learning tool were effective in improving learning outcomes. The main dependent variable used was the amount of learning that took place, measured with the use of a pre-test, a post-test and user interaction data. The t-test results on the learning outcome scores also showed that the improvement in learning outcomes was unlikely to have occurred by chance, confirming the effectiveness of the learning tool. The t-test values for the FEG version were orders of magnitude smaller than those for the FDG version, although both were below the accepted critical value of 0.05, implying that while learning outcomes were influenced by both game versions, the FEG version was more influential.
An immediate item of future work identified from this study is to adapt the game for more cognitive-based learning tasks. Numeracy learning tasks, for example, involve fact manipulation operations with various intermediate steps needed to arrive at the final answer. Support for such intermediate steps was not fully implemented in the version used for this study. In addition, a more comprehensive set of questions, with intermediate questions guiding the user towards a final answer, could be developed for the game. It would be interesting to see whether the effectiveness of the feature enriched educational tool also holds in other scenarios, such as for secondary school children and in other curriculum areas. We also plan to personalise the questions based on learners' understanding of the concepts and to conduct a long-term (6-month) study to find out whether there is a significant increase in learning outcomes and enjoyment.

Abbreviations
CH: Challenges; FB: Feedback; FDG: Feature Devoid Game; FEG: Feature Enriched Game; GH: Graphics; IGEA: Interactive
Games & Entertainment Association

Acknowledgements
The first author would like to acknowledge Dr. Parma Nand for his help in implementing the tool and for
proofreading the work.

Authors' contributions
The corresponding author confirms that all authors made a scientific contribution to this manuscript, with the first and second authors contributing the most to the design, implementation, analysis and write-up. All authors read and approved the final manuscript.

Funding
No funding was received for this project.

Availability of data and materials


The authors do not have ethics approval to make the raw student data or the tool available to anyone outside the organisation in which the experiment was conducted. Going back to the ethics committee to request this is not a feasible
option at this stage, as the experiment is already completed. The authors will keep this in mind for future submissions
and will explicitly request this in the ethics application.

Competing interests
The corresponding author can confirm that none of the authors have any competing interests in the manuscript.

Author details
1 Department of Computer Science, Unitec Institute of Technology, Auckland, New Zealand. 2 Department of Information Technology, Otago Polytechnic Auckland International Campus (OPAIC), Auckland, New Zealand. 3 Department of Computer Science & Software Engineering, Xi'an Jiaotong-Liverpool University, Xi'an, China. 4 Glen Eden Intermediate School, Auckland, New Zealand.

Received: 15 April 2019 Accepted: 21 June 2019

References
G. Alankus, A. Lazar, M. May, C. Kelleher, in Proc. Int. Conference on Human Factors in Computing Systems (CHI 2010). Towards
customizable games for stroke rehabilitation (ACM Press, Atlanta, GA, USA, 2010)
N. Baghaei, J. Casey, D. Nandigam, A. Sarrafzadeh, R. Maddison, Engaging Children in Diabetes Education through Mobile
Games. P. Isaías (Ed.), Proceedings of the 13th International Conference on Mobile Learning (ML) (Budapest, Hungary, 2017)
N. Baghaei, D. Nandigam, J. Casey, A. Direito, R. Maddison, Diabetic Mario: Designing and evaluating Mobile games for
diabetes education. Games for Health Journal: Research, Development, and Clinical Applications 5(4), 270–278 (2016)
T. Baranowski, F. Fran Blumberg, R. Buday, A. DeSmet, L.F. Fiellin, C. Green, P.M. Kato, A.S. Lu, A.E. Maloney, R. Mellecker, B.A.
Morrill, W. Peng, R. Shegog, M. Simons, A.E. Staiano, D. Thompson, K. Young, Games for health for children—Current
status and needed research. Games for Health Journal 5(1), 1–12 (2016)
Berkovsky, S., Freyne, J., Coombe, M., Bhandari, D., Baghaei, N., & Kimani, S. (2010). Exercise and play: Earn in the physical,
spend in the virtual. Int. Journal of Cognitive Technology. 14(2), 22–31
Brand, J. E. (2012). Digital New Zealand 2012. http://igea.net/wp-content/uploads/2011/10/DNZ12FinalLinkVideo.pdf. Accessed
Dec 2018
J. Bruner, The Process of Education (Harvard University Press, Cambridge, MA, 1960)
I. Caponetto, J. Earp, M. Ott, in European Conference on Games Based Learning. Gamification and education: A literature review
(Academic Conferences International Limited, 2014), p. 50
G. Chen, N. Baghaei, A. Sarrafzadeh, C. Manford, S. Marshall, G. Court, in Proc. Australian Computer-Human Interaction
Conference (OZCHI 2011). Designing games to educate diabetic children (ACM, New York, 2011)
D.B. Clark, E.T. Smith, A. Hostetler, A. Fradkin, V. Polikov, Substantial integration of typical educational games into extended
curricula. Journal of Learning Sciences 27(2), 265–318 (2018)
S. Consolvo, K. Everitt, I. Smith, J.A. Landay, in Proc. Int. Conference on Human Factors in Computing Systems (CHI 2006). Design
requirements for technologies that encourage physical activity (ACM Press, Montreal, Quebec, Canada, 2006)
R. Cózar-Gutiérrez, J.M. Sáez-López, Game-based learning and gamification in initial teacher training in the social sciences: An
experiment with Minecraft. Int. J. Educ. Technol. High. Educ. 13(1), 2 (2016)
M. Csikszentmihalyi, Flow: The Psychology of Optimal Experience (Harper Perennial, London, 1990)
S. De Freitas, Are games effective Learning tools? A review of educational games. J. Educ. Technol. Soc. 21(2), 74–84 (2018)
S. de Sousa Borges, V.H. Durelli, H.M. Reis, S. Isotani, in Proceedings of the 29th Annual ACM Symposium on Applied Computing.
A systematic mapping on gamification applied to education (2014), pp. 216–222
L. De-Marcos, A. Domínguez, J. Saenz-de-Navarrete, C. Pagés, An empirical study comparing gamification and social
networking on e-learning. Comput. Educ. 75, 82–91 (2014)
P. Denny, in Proceedings of the 46th ACM Technical Symposium on Computer Science Education. Generating practice questions
as a preparation strategy for introductory programming exams (ACM, Kansas City, Missouri, USA, 2015)
D. Dicheva, C. Dichev, G. Agre, G. Angelova, Gamification in education: A systematic mapping study. J. Educ. Technol. Soc. 18,
75 (2015)
M.D. Dickey, in Proc. An investigation of computer gaming strategies for engaged Learning (American Educational Research
Association, Chicago, IL, 2003)
M.J. Dondlinger, Educational video game design: A review of the literature. Journal of Applied Educational Technology 4(1),
21–31 (2007)
S.M. Fisch, Making Educational Computer Games "Educational" (Conference on Interaction design and children, Boulder, CO,
2005), p. 2005
Y. Fujiki, K. Kazakos, C. Puri, P. Buddharaju, I. Pavlidis, J. Levine, NEAT-o-games: Blending physical activity and fun in the daily
routine. ACM Comput. Entertain. 6(2), Article 21 (2008)
D. Gooch, V. Asimina, L. Benton, R. Khaled, in Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems.
CHI '16. Using Gamification to motivate students with dyslexia (New York, NY, 2016)
I. Halim, J. Casey, N. Baghaei, Designing a Virtual Reality Flight Simulator. 26th International Conference on Computers in
Education (ICCE 2018) (Manila, Philippines, 2018)
J. Hamari, Do badges increase user activity? A field experiment on the effects of gamification. Comput. Hum. Behav. 71, 469–
478 (2015). https://doi.org/10.1016/j.chb.2015.03.036
M. Hinds, N. Baghaei, P. Ragon, J. Lambert, T. Dajakaruna, T. Houghton, S. Dacey, J. Casey, in 9th International Conference on
Computer Supported Education. Designing a novel educational game for teaching C# programming (Bruce M. McLaren,
Porto, Portugal, 2017)
W.H.-Y. Huang, D. Soman, in Gamification of Education. Research Report Series: Behavioural Economics in Action, ed. by Rotman
School of Management, University of Toronto. (2013)
Cognitive Learning. (2018). In the Free Dictionary online edition. http://medical-dictionary.thefreedictionary.com/cognitive+learning. Accessed Oct 2018
T.W. Malone, Toward a theory of intrinsically motivating instruction. Cogn. Sci. (4), 333–369 (1981)
M.D. Merrill, First principles of instruction: Identifying and designing effective, efficient and engaging instruction (Wiley, San Francisco, 2013)
A. Mitchell, C. Saville-Smith, The Use of Computer and Video Games for Learning. A Review of the Literature (The Learning and
Skills Development Agency, London, 2004)
N. Nehring, N. Baghaei, S. Dacey, in International Conference on Computer Supported Education (CSEDU), ed. by B. McLaren.
Improving students’ performance through Gamification: A user study (2018), pp. 213–218
M. Olsson, P. Mozelius, J. Collin, Visualisation and Gamification of e-Learning and programming education. Electronic Journal
of e-Learning 13(6), 441–454 (2015)
J. Piaget, The Science of Education and the Psychology of the Child (Grossman, New York, NY, 1970)
D. Pinelle, N. Wong, T. Stach, Heuristic Evaluation for Games: Usability Principles for Video Game Design. In Proc. Conference on
Human Factors in Computing Systems (CHI 2008), ACM (Florence, Italy, 2008)

M. Prensky, Digital Game-Based Learning (McGraw Hill, New York, 2001)


J.A. Provost, Work, Play and Type: Achieving Balance in your Life (Consulting Psychologists Press, Palo Alto, CA, 1990)
L. Reddy, Persuasion Via Gamification: Mobile Applications for Supporting Positive Behaviour for Learning (PB4L) Pedagogy. An
Unpublished Thesis Submitted in Partial Fulfilment of the Requirements for the Degree of Master of Applied Practice (Unitec
Institute of Technology, Auckland, New Zealand, 2018) https://unitec.researchbank.ac.nz/handle/10652/4497
K. Robson, K. Plangger, J. Kietzmann, I. McCarthy, L. Pitt, Is it all a game? Understanding the principles of gamification.
Business Horizons. 58(4), 411–420 (2015). https://doi.org/10.1016/j.bushor.2015.03.006
Rogoff, B. (1990). Apprenticeship in Thinking: Cognitive Development in Social Context. New York, NY: Oxford University Press
S. Shokouhi-Moqhaddam, N. Khezri-Moghadam, Z. Javanmard, H. Sarmadi-Ansar, M. Aminaee, M. Shokouhi-Moqhaddam, M.
Zivari-Rahman, A study of the correlation between computer games and adolescent behavioral problems. Addiction &
health 5(1–2), 43–50 (2013)
M. Urh, G. Vukovic, E. Jereb, The model for introduction of gamification into e-learning in higher education. Procedia Soc.
Behav. Sci. 197, 388–397 (2015)
L. Vygotsky, Play: Its Role in Development and Evolution, Chapter Play and its Role in the Mental Development of the Child
(Penguin Books, New York, NY, 1976), pp. 537–554
L. Vygotsky, Mind and Society: The Development of Higher Mental Processes (Harvard University Press, Cambridge, MA, 1978)
D. Walsh, D. Gentile, 13th Annual Mediawise Video Game Report Card (National Institute on Media and the Family,
Minneapolis, Minnesota, 2008).
Whitten, P., Aulakh, B., Verma, K., Cross, B., Hsu, I.-H., Stoner, J., Copelen, K., Hall, K., Kluesner, J., Greenawalt, K., Barrows, S.,
Farmer, J., Davis, R. (2017). Gamifying Tobacco-Free Kids: Employing Medical Visualization and Games in Youth Anti-
Tobacco Outreach. American Thoracic Society International Conference, May 18–23, 2018 - San Diego, CA
Z. Yusoff, A. Kamsin, S. Shamshirband, A survey of educational games as interaction design tools for affective learning:
Thematic analysis taxonomy. Educ. Inf. Technol. 23(1), 393–418 (2018)
Z. Zhu, M. Yu, P. Riezebos, A research framework of smart education. Smart Learning Environments 3(4) (2016)
Zichermann, G., Cunningham, C. (2011). Gamification by Design: Implementing Game Mechanics in Web and Mobile Apps
(1st ed.). Sebastopol, California: O’Reilly Media. p. xiv. ISBN 978-1-4493-1539-9

Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
