
International Journal of Medical Informatics 141 (2020) 104226

Contents lists available at ScienceDirect

International Journal of Medical Informatics


journal homepage: www.elsevier.com/locate/ijmedinf

A comparative case study of 2D, 3D and immersive-virtual-reality applications for healthcare education

Omar López Chávez a, Luis-Felipe Rodríguez a, J. Octavio Gutierrez-Garcia b,*
a Instituto Tecnológico de Sonora, Cd. Obregón, Sonora 85000, Mexico
b ITAM, Río Hondo 1, Ciudad de México 01080, Mexico

ARTICLE INFO

Keywords:
Healthcare education
Virtual reality
Comparative case study

ABSTRACT

Background and objective: The workings of medical educational tools are implemented using a myriad of approaches, ranging from presenting static content to immersing students in gamified virtual-reality environments. The objective of this paper is to explore whether and how different approaches to designing medical educational tools affect students' learning performance.
Materials and methods: Four versions of an educational tool for the study of clinical cases were implemented: a 2D version, a gamified 2D version, a gamified 3D version, and a gamified immersive-virtual-reality version, all complying with the same functional requirements. Each version was used and evaluated by an independent group of students. The participants (n = 78) evaluated the applications regarding usefulness, usability, and gamification. Afterward, the students took an exam to assess their retention of information on the clinical cases presented.
Results: One-sample Wilcoxon signed-rank tests confirmed that the participants perceived it as at least quite likely that gamification helped improve their learning. In addition, based on the participants' perception, the gamification of the immersive-virtual-reality version helped the most to improve learning performance in comparison with the gamified 2D and 3D versions.
Conclusions: Even when different versions of a medical educational tool (complying with the same functional requirements) are perceived as equally useful and usable, the design approach (2D, 3D, or immersive virtual reality, with or without gamification) affects students' retention of information on clinical cases.

1. Introduction

Effective and efficient training of medical students is critical to ensure a reliable future healthcare workforce capable of providing high-quality healthcare services, see [1–3]. In this context, a variety of medical educational tools exist to complement traditional learning methods [4–6]; these tools are designed to help students enhance acquired theoretical knowledge and support practical training by providing technology-based learning resources [7,8]. Designers of computer-assisted medical educational tools take advantage of diverse technological approaches, ranging from presenting 2D static content (e.g., [9]), offering fully immersive-virtual-reality environments (e.g., [10]), and using social networks (e.g., [11]), to incorporating gamification elements such as badges, rankings, challenges, and rewards in serious games (e.g., [12,13]). More recently, medical educational tools have adopted innovative methods based on immersive virtual reality to provide simulated realistic environments, see [14–16].

Medical educational tools, regardless of their underlying design approach, are developed to engage medical students and improve their learning performance. In this regard, most medical educational tools reported in the literature have been positively evaluated as tools that promote and improve learning, see [4,10,14]. As an illustration, in [17] a game-based prototype was positively evaluated by nursing students as a useful, usable, and satisfying educational tool. In [18–20], medical educational tools based on simulated virtual environments and 3D models were evaluated by medical students and found to be a realistic, easy-to-navigate, and useful approach that enhances learning and is more effective than traditional methods. Nevertheless, although most medical educational tools reported in the literature present evaluations, there is still a need for comparative studies analyzing and identifying

* Corresponding author.
E-mail addresses: [email protected] (O. López Chávez), [email protected] (L.-F. Rodríguez), [email protected] (J.O. Gutierrez-Garcia).

https://doi.org/10.1016/j.ijmedinf.2020.104226
Received 7 April 2020; Received in revised form 25 May 2020; Accepted 16 June 2020
Available online 17 June 2020
1386-5056/ © 2020 Elsevier B.V. All rights reserved.


design approaches and technological elements (e.g., gamification and virtual reality) that are perceived by medical students as more usable, useful, and effective toward a given set of learning goals. Moreover, the benefits of identifying such design approaches and technological elements go beyond the medical domain, since computer-based educational tools are commonly implemented and used across multiple disciplines to improve learning performance, see [29–31].

This paper presents a comparative case study implementing, evaluating, and analyzing four versions of a medical educational tool for training medical students (hereafter referred to as MediTool). The primary objective is to explore whether and how different approaches to designing medical educational tools affect students' learning performance. To attain this objective, a group of 78 participants evaluated the usefulness, usability, and gamification elements of different versions of MediTool. Afterward, the participants took a paper-based exam to assess their learning performance. It is important to note that learning performance in the present comparative case study was measured as the ability of the individual to retain information, which is linked to learning [21,22]. In this sense, the paper-based exam assessed the participants' retention of information on the clinical cases presented in MediTool. The four versions of MediTool (a non-commercial software) implemented by the authors were (i) a 2D desktop version presenting static content; (ii) a gamified 2D version including badges, challenges, and other gamification elements; (iii) a gamified 3D version using 3D avatars and non-immersive virtual reality; and (iv) a gamified immersive-virtual-reality version.

2. Related work

In [17], a serious game software prototype was designed to help nursing students develop clinical reasoning and decision-making skills associated with the care of patients with chronic obstructive pulmonary disease in home healthcare settings. The prototype applied gamification components such as challenges, interaction, and objectives. It was a single-player online game that provides nursing students with a realistic video-based simulated scenario of a home healthcare clinical practice, information about the patient, and quiz-based tasks. The perceived usability of the tool was evaluated by six participants using questionnaires, interviews, and cognitive walkthroughs. In general, the results showed that participants perceived the serious game as useful, usable, and satisfying.

In [18], a virtual coach in a mixed-reality simulator was designed and evaluated to guide users during functional endoscopic sinus surgery. The virtual coach assists users through visual and auditory cues in an immersive-virtual-reality operating room. This study involved 17 participants grouped into novice and expert surgeons. Participants evaluated the virtual coach using a Likert-type questionnaire. Both groups regarded the virtual coach as a useful and beneficial educational tool that enhances learning in a surgical mixed-reality environment.

In [19], an anatomy-augmented virtual simulation training module was evaluated to determine whether it contributes to the development of students' abilities to place a nasogastric tube. This tool is supported by video and 3D computer graphics to simulate organ systems and visualize anatomical structures. The study involved 69 nursing students divided into control and experimental groups. The former group was trained on how to place a nasogastric tube using traditional methods, whereas the latter was trained using the anatomy-augmented virtual simulation module. After training, participants of both groups practiced the placement of a nasogastric tube on a manikin and were evaluated by a group of experts using a 17-item competency checklist. Results showed that, in general, competency scores were better for participants trained with the anatomy-augmented virtual simulation module. In addition, the participants regarded the simulation training module as realistic, easy to use, and useful.

In [20], a web-based interactive application was developed to train healthcare students in the use of medical instruments. The application presents 3D virtual models of medical devices for manipulation, as well as self-assessment activities for promoting learning. The perceived effectiveness of the application was evaluated by 50 participants using a 10-item questionnaire. Participants considered the application simple to use, interactive, and more effective than traditional methods. In addition, the results of [20] showed that the effectiveness of the designed learning tool held regardless of the participants' age and level of computer skills.

As seen in this section, most medical educational tools reported in the literature present evaluations of the proposed tools in terms of usability, effectiveness, usefulness, and satisfaction. However, there is still a need for comparative studies of medical educational tools reported in the literature, as well as for analyses of the extent to which a particular technological element, such as gamification and/or virtual reality, contributes to end users' perceived usability and perceived effectiveness.

3. Materials and method

3.1. Sample of participants

A sample of 78 undergraduate medical students was recruited from a local university: 8 first-year students, 6 third-year students, 31 fourth-year students, and 33 fifth-year students. The only restriction for selection was that students were familiar with the use of computers. The median age was 22 years (range 18–26) with a standard deviation of 1.65 years. With respect to gender, there were 36 female participants and 42 male participants. See Table 1 for a detailed description of the participants in each evaluation group of MediTool.

3.2. Instruments

Three types of instruments were designed to conduct the present study: (i) four versions of MediTool for the study of clinical cases; (ii) a traditional paper-based, multiple-choice exam to evaluate participants' learning performance after studying clinical cases using one version of MediTool; and (iii) a formal questionnaire to evaluate perceived usefulness, perceived ease of use, and the impact of gamification elements on learning performance.

3.2.1. A medical educational tool for the study of clinical cases: MediTool

Four versions of a medical educational tool for the study of clinical cases were designed and implemented by the authors (see Fig. 1): (i) a 2D version; (ii) a gamified 2D version; (iii) a gamified 3D version; and (iv) a gamified immersive-virtual-reality version. All the versions comply with the same functional requirements (see Table 2) and presented the same five clinical cases to participants, organized into three difficulty levels: three clinical cases for beginners, one case for intermediates, and one for advanced students.

The five clinical cases are based on the Mexican Clinical Practice Guides [23], which draw on the best national and international clinical practices adapted to the health profile of Mexicans. The information presented for each clinical case was organized into three parts: (i) the patient's antecedents; (ii) the patient's vital signs (e.g., body temperature and blood pressure) and appearance; and (iii) the patient's personal information (e.g., gender and age). A series of multiple-choice questions was associated with each clinical case; these questions were the same across the four versions of MediTool. It is important to note that some of the questions associated with the intermediate and advanced clinical cases included lab examinations to present additional information about the patient's findings. Once a participant answered a question, MediTool provided feedback according to the selected answer, along with a link to the clinical practice guide so that the participant could access additional information.

The graphical user interfaces, interactive elements, and navigation options were different for each version of MediTool (see Table 3). The


Table 1
Description of the participants for each evaluation group of MediTool.
2D version Gamified 2D version Gamified 3D version Immersive-virtual-reality version

Id Gender Age Year Id Gender Age Year Id Gender Age Year Id Gender Age Year

1 Male 22 4 18 Female 22 4 38 Female 22 5 58 Female 22 4


2 Male 22 4 19 Male 22 5 39 Female 24 5 59 Female 22 4
3 Female 22 4 20 Male 24 5 40 Male 23 4 60 Male 23 4
4 Female 21 4 21 Female 22 5 41 Male 22 5 61 Male 21 4
5 Male 21 4 22 Female 23 5 42 Female 24 5 62 Male 22 4
6 Female 22 4 23 Male 21 4 43 Male 21 4 63 Male 21 4
7 Male 25 4 24 Male 24 5 44 Male 23 5 64 Male 22 4
8 Female 22 4 25 Male 21 4 45 Female 22 5 65 Male 21 3
9 Male 24 5 26 Female 26 4 46 Male 21 4 66 Male 21 3
10 Female 22 5 27 Female 22 4 47 Male 22 5 67 Female 21 3
11 Male 23 5 28 Male 24 5 48 Female 22 5 68 Male 21 3
12 Male 22 5 29 Male 23 5 49 Male 23 5 69 Male 20 3
13 Female 23 5 30 Female 21 4 50 Female 24 5 70 Male 22 3
14 Male 22 5 31 Male 22 5 51 Female 21 4 71 Male 18 1
15 Male 22 5 32 Male 22 4 52 Male 23 4 72 Female 19 1
16 Female 22 5 33 Male 26 5 53 Male 21 4 73 Female 19 1
17 Male 23 5 34 Female 23 5 54 Female 21 4 74 Female 18 1
35 Female 23 4 55 Female 22 5 75 Female 18 1
36 Female 21 4 56 Female 23 5 76 Female 18 1
37 Male 22 5 57 Female 22 5 77 Female 18 1
78 Male 18 1

Median 22 5 Median 22 5 Median 22 5 Median 21 3

3D and immersive-virtual-reality versions offered virtual scenarios that simulate a consulting room, along with interactive 3D objects that participants used to (de)activate the description of clinical cases and the associated examinations. These versions also included 3D avatars playing the role of a patient and allowed participants to navigate the virtual scenario. In particular, in the immersive-virtual-reality version, participants navigated the virtual scenario by moving in a physical real-world space, using a virtual reality headset and hand controllers to interact with virtual objects. To promote gamification, the gamified versions of MediTool included avatars, rewards, comments, rankings, badges, and challenges (e.g., resolving a given number of clinical cases).

3.2.2. Exam to evaluate learning performance

A traditional paper-based exam was designed to evaluate the participants' learning performance after studying the clinical cases using a particular version of MediTool. The exam consisted of the same five clinical cases incorporated into MediTool and their associated multiple-choice questions.

3.2.3. Questionnaire to evaluate usefulness, ease of use, and gamification

The four versions of MediTool were evaluated in terms of perceived usefulness and perceived ease of use by participants using a 12-item Likert-type questionnaire based on the Technology Acceptance Model (TAM) [25], see Table 4, questions Q1–Q12. The TAM is a universal, standardized questionnaire [26] that can be adapted to the domain of the application and is widely accepted [24,27] as a robust tool for measuring usefulness and ease of use. This questionnaire was complemented with six questions to evaluate the participants' perception of whether incorporating gamification elements helped improve their overall learning performance, see Table 4, questions Q13–Q18. These questions were asked only of participants who utilized the gamified versions of MediTool.

3.3. Procedure

Each version of MediTool was evaluated by an independent group of participants. In particular, the 2D version, the gamified 2D version, the gamified 3D version, and the immersive-virtual-reality version were evaluated by 17, 20, 20, and 21 participants, respectively. The group size varied because each evaluation group corresponded to either one or two class groups of a medical school, which usually have different sizes.

The evaluation sessions consisted of four phases. In the first phase, the objective of the experiment was explained and participants were asked to read and sign a consent form. Afterward, the functionality of the corresponding version of MediTool was explained by directly exploring how it works, and all participants' questions were answered. The instruments used to evaluate MediTool were provided to participants and explained. In particular, all the questions of the evaluation instruments (see Table 4) and related concepts (such as learning performance) were explained. This phase lasted approximately 20 min. In the second phase, participants were requested to register on MediTool and were allowed to freely utilize the corresponding version of MediTool for 20 min. Assistance was provided to participants as requested. In the third phase, the participants evaluated the corresponding version of MediTool in regard to usefulness, ease of use, and gamification using the questionnaire reported in Table 4. In the fourth phase, the participants were handed a (traditional) multiple-choice exam to determine whether and to what extent they retained information about the clinical cases. There was no time limit for this last phase. On average, the participants took 15 min to complete the exam.

4. Results

For this analysis, statistical tests were performed using R, and a p-value less than 0.05 was considered significant. Raw results are reported in Tables 5 and 6 and shown in Figs. 2–4.

4.1. Results on the participants' scores on the exams

Shapiro–Wilk tests confirmed the normality of the scores obtained by the participants on the traditional exam (Table 5) after using the 2D version (W = 0.91564, p-value = 0.1246), the gamified 2D version (W = 0.93928, p-value = 0.2324), the gamified 3D version (W = 0.96008, p-value = 0.5454), and the immersive-virtual-reality version (W = 0.9243, p-value = 0.1058). In addition, two-sample F-tests for equal variances confirmed the equality of the variances of the scores obtained by the participants on the traditional exam after using


Fig. 1. Different versions of MediTool.

Table 2
A representative selection of functional requirements of the four versions of MediTool.
Functional requirements

Req 1 MediTool should show a user registration screen (requires user name, password, participant's name, and email).
Req 2 MediTool should show a login screen to enter the system (requires user name and password).
Req 3 MediTool should show a user tutorial.
Req 4 MediTool should include five clinical cases organized into three difficulty levels: beginner, intermediate, advanced.
Req 5 Level 1 (beginner) should include three clinical cases at random.
Req 6 Level 2 (intermediate) should include one clinical case.
Req 7 Level 3 (advanced) should include one clinical case.
Req 8 The description of clinical cases should be composed of patient's antecedents, vital signs, and personal information.
Req 9 MediTool should include quizzes with multiple-choice questions corresponding to each clinical case.
Req 10 MediTool should present feedback to the user once a response is provided.
Req 11 MediTool should present the user's score.
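The clinical-case structure prescribed by Reqs 4–11 can be sketched as a small data model. This is an illustrative sketch, not the authors' implementation; all names (Question, ClinicalCase, answer) and field choices are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class Question:
    """A multiple-choice question tied to a clinical case (Req 9)."""
    text: str
    options: list        # possible answers
    correct: int         # index of the correct option
    feedback: dict       # per-option feedback shown after answering (Req 10)
    guide_url: str = ""  # link to the clinical practice guide

@dataclass
class ClinicalCase:
    """A clinical case as described in Req 8."""
    level: str            # "beginner", "intermediate", or "advanced" (Req 4)
    antecedents: str
    vital_signs: dict     # e.g., {"body_temperature": 38.5, "blood_pressure": "120/80"}
    personal_info: dict   # e.g., {"gender": "F", "age": 45}
    questions: list = field(default_factory=list)

def answer(question: Question, choice: int) -> str:
    """Return the feedback for the selected option (Req 10)."""
    return question.feedback.get(choice, "No feedback available.")

# Minimal usage example with made-up content.
q = Question(
    text="What is the most likely diagnosis?",
    options=["A", "B", "C", "D"],
    correct=2,
    feedback={2: "Correct: consistent with the vital signs.",
              0: "Incorrect: review the antecedents."},
)
case = ClinicalCase(level="beginner", antecedents="...", vital_signs={},
                    personal_info={}, questions=[q])
print(answer(q, 2))  # prints the feedback for the correct option
```

The same model serves all four versions, which differ only in presentation (2D screens, 3D scenes, or immersive virtual reality), matching the paper's requirement that every version comply with the same functional requirements.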


the 2D version and (i) the gamified 2D version (p-value = 0.876), (ii) the gamified 3D version (p-value = 0.7339), and (iii) the immersive-virtual-reality version (p-value = 0.4767). Based on these tests' results, t tests were used to assess the statistical significance of the results on learning performance.

From the data reported in Table 5, a two-tailed two-sample t test (with equal variances) showed that the scores on the traditional exams for the 2D version and the scores for the gamified 2D version are significantly different (t = −3.0518, p-value = 0.004322). In addition, a two-tailed two-sample t test (with equal variances) showed that the scores on the traditional exams for the 2D version and the scores for the gamified 3D version are significantly different (t = −2.153, p-value = 0.03829). However, a two-tailed two-sample t test also showed that the scores for the 2D version and the immersive-virtual-reality version are not significantly different (t = −0.84511, p-value = 0.4036). Also, as shown in Table 5, the highest average scores on the traditional exams were obtained for the gamified 2D version and the gamified 3D version, whereas the lowest average scores were obtained for the 2D version and the immersive-virtual-reality version.

Table 3
A representative selection of particular characteristics of each version of MediTool.

2D version. GUI: 2D interfaces. Navigation: user input and output based on keyboard, mouse, and screen devices. Interactive elements: 2D icons and 2D images. Gamification elements: N/A.
Gamified 2D version. GUI: 2D interfaces. Navigation: user input and output based on keyboard, mouse, and screen devices. Interactive elements: 2D icons and 2D images. Gamification elements: badges, challenges, rankings, score table.
Gamified 3D version. GUI: 3D consulting room. Navigation: user input and output based on keyboard, mouse, and screen devices. Interactive elements: 3D interactive objects and 3D virtual agents. Gamification elements: badges, challenges, rankings, score table.
Gamified immersive-virtual-reality version. GUI: 3D consulting room. Navigation: user input and output based on virtual reality hand controller and headset. Interactive elements: 3D interactive objects, 3D virtual agents, and avatar. Gamification elements: badges, challenges, rankings, score table.

Table 4
Questionnaire based on the TAM [24] used to evaluate the four versions of MediTool.

Perceived usefulness
Q1 Using MediTool in my studies would enable me to accomplish learning goals more quickly.
Q2 Using MediTool in my studies would improve my learning performance.
Q3 Using MediTool in my studies could increase my learning efficiency.
Q4 Using MediTool in my studies would enhance my learning effectiveness.
Q5 Using MediTool would make it easier to study.
Q6 I would find MediTool useful in my studies.

Perceived ease of use
Q7 Learning to operate MediTool was easy for me.
Q8 I found it easy to get MediTool to do what I want it to do.
Q9 My interaction with MediTool was clear and understandable.
Q10 I found MediTool to be flexible to interact with.
Q11 It would be easy for me to become skillful at using MediTool.
Q12 I found MediTool easy to use.

Questions about gamification elements
Q13 Including badges helped improve my learning performance.
Q14 Including rankings helped improve my learning performance.
Q15 Including challenges helped improve my learning performance.
Q16 Including comments helped improve my learning performance.
Q17 Including 2D/3D avatars helped improve my learning performance.
Q18 Including rewards helped improve my learning performance.

Values and interpretation for possible answers are as follows: 7: extremely likely, 6: quite likely, 5: slightly likely, 4: neither, 3: slightly unlikely, 2: quite unlikely, and 1: extremely unlikely.

Table 5
Descriptive statistics of the participants' scores on a traditional multiple-choice exam.

Median | Std dev | Min | Max | 25th percentile | 50th percentile | 75th percentile

Scores obtained by the participants using the 2D version of MediTool
395 | 43.13 | 330 | 470 | 350 | 395 | 430

Scores obtained by the participants using the gamified 2D version of MediTool
440 | 45.16 | 320 | 500 | 413.75 | 440 | 463.75

Scores obtained by the participants using the gamified 3D version of MediTool
415 | 40.03 | 340 | 500 | 390 | 415 | 442.50

Scores obtained by the participants using the immersive-virtual-reality version of MediTool
400 | 51.79 | 265 | 480 | 390 | 400 | 425

Possible scores for the traditional exam ranged from 0 to 500.
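The test sequence of Section 4.1 (Shapiro–Wilk normality checks, a two-sample F-test for equal variances, then two-tailed two-sample t tests with equal variances) was run in R by the authors. A minimal Python/SciPy sketch of the same pipeline is shown below; the score vectors are made up for illustration, since the raw per-participant exam data is not published:

```python
import numpy as np
from scipy import stats

# Hypothetical exam scores (0-500 scale); NOT the study's raw data.
scores_2d = np.array([330, 350, 360, 380, 395, 395, 400, 410, 420, 430, 440, 450, 470])
scores_g2d = np.array([320, 400, 415, 430, 440, 440, 450, 455, 460, 465, 470, 480, 500])

# 1) Shapiro-Wilk normality test for each group.
w1, p1 = stats.shapiro(scores_2d)
w2, p2 = stats.shapiro(scores_g2d)

# 2) Two-sample F-test for equality of variances (two-tailed).
f = np.var(scores_2d, ddof=1) / np.var(scores_g2d, ddof=1)
dfn, dfd = len(scores_2d) - 1, len(scores_g2d) - 1
p_f = 2 * min(stats.f.sf(f, dfn, dfd), stats.f.cdf(f, dfn, dfd))

# 3) If normality and equal variances hold, a two-tailed two-sample
#    t test with equal variances (Student's t), as used in the paper.
t, p_t = stats.ttest_ind(scores_2d, scores_g2d, equal_var=True)

print(f"Shapiro p-values: {p1:.3f}, {p2:.3f}; F-test p: {p_f:.3f}; t = {t:.3f}, p = {p_t:.4f}")
```

A negative t statistic, as reported in the paper (e.g., t = −3.0518 for the 2D vs. gamified 2D comparison), simply reflects that the first group's mean score is lower than the second group's.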


Table 6
Descriptive statistics of the evaluation results of MediTool.
Construct Median Mode Min Max 25th percentile 50th percentile 75th percentile

Evaluation of the 2D version


Perceived usefulness 6 5 2 7 5 6 6
Perceived ease of use 7 7 3 7 6 7 7

Evaluation of the gamified 2D version


Perceived usefulness 6 6 1 7 6 6 7
Perceived ease of use 7 7 1 7 6 7 7
Gamification 6 7 2 7 5 6 7

Evaluation of the gamified 3D version


Perceived usefulness 6 6 5 7 6 6 7
Perceived ease of use 6 6 3 7 6 6 7
Gamification 6 7 1 7 5 6 7

Evaluation of the immersive-virtual-reality version


Perceived usefulness 6 6 5 7 6 6 7
Perceived ease of use 7 7 3 7 6 7 7
Gamification 7 7 3 7 6 7 7

Values and interpretation for possible answers are as follows: 7: extremely likely, 6: quite likely, 5: slightly likely, 4: neither, 3: slightly unlikely, 2: quite unlikely, and
1: extremely unlikely.
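Descriptive statistics of the kind reported in Table 6 (median, mode, min, max, quartiles) can be computed from raw Likert responses with the Python standard library; the response vector below is made up for illustration:

```python
import statistics

# Hypothetical 7-point Likert responses for one construct (7 = extremely likely).
responses = [6, 7, 6, 5, 7, 6, 7, 6, 6, 7, 5, 6, 7, 2, 6, 7, 7, 6, 5, 6]

summary = {
    "median": statistics.median(responses),
    "mode": statistics.mode(responses),          # most frequent response
    "min": min(responses),
    "max": max(responses),
    "q25": statistics.quantiles(responses, n=4)[0],  # 25th percentile
    "q75": statistics.quantiles(responses, n=4)[2],  # 75th percentile
}
print(summary)
```

Because Likert responses are ordinal, medians, modes, and percentiles (rather than means) are the appropriate summaries, which is consistent with how Table 6 reports the evaluation results.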

4.2. Results on usefulness and ease of use

For all the results on the evaluation constructs (namely, usefulness, ease of use, and gamification) reported in Table 6, one-sample Wilcoxon signed-rank tests showed that the constructs' medians are significantly greater than the specified comparison medians (p-values < 0.00001), which confirms the significance of the medians. The one-sample Wilcoxon signed-rank test was used because Likert scales are ordinal and the evaluation results do not follow a normal distribution according to Shapiro–Wilk tests (p-values < 0.00000001).

The median of the perceived ease of use construct for the 2D version, the gamified 2D version, and the immersive-virtual-reality version was 7 (i.e., extremely likely to be usable), whereas, for the same construct, the median for the gamified 3D version was 6 (i.e., quite likely to be usable). With respect to the perceived usefulness construct, the median for all the versions was 6 (i.e., quite likely to be useful). As for the gamification construct, the median for the gamified 2D version and the gamified 3D version was 6 (i.e., quite likely that gamification helped improve learning), whereas the median for the immersive-virtual-reality version was 7 (i.e., extremely likely that gamification helped improve learning).

5. Discussion

5.1. Discussion about the participants' scores on the exams

The highest average scores on the traditional exams were obtained by the participants using either the gamified 2D version or the gamified 3D version (Fig. 2), whereas the lowest average scores were obtained by the participants using the immersive-virtual-reality version and the 2D version. In fact, the lowest average score was obtained by the participants using the 2D version. This may suggest that gamification elements play a positive role in students' learning performance, as discussed in [28] and as indicated by the participants of this study (see Fig. 3).

Fig. 2. Participants’ scores on a traditional multiple-choice exam after using MediTool.
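The one-sample Wilcoxon signed-rank tests described in Section 4.2 check whether a construct's median exceeds a specified value. A Python/SciPy sketch is shown below, with hypothetical Likert responses and 4 ("neither") as an illustrative comparison value (the authors ran the tests in R against their own specified medians):

```python
from scipy import stats

# Hypothetical 7-point Likert responses for one construct (7 = extremely likely).
responses = [6, 7, 6, 5, 7, 6, 7, 6, 6, 7, 5, 6, 7, 6, 6, 7, 7, 6, 5, 6]

# One-sample Wilcoxon signed-rank test: is the median greater than the
# specified value? Implemented as a test on the differences from that value.
specified_median = 4  # illustrative comparison value, "neither"
diffs = [r - specified_median for r in responses]
stat, p = stats.wilcoxon(diffs, alternative="greater")
print(f"W = {stat}, p = {p:.6f}")  # a small p suggests the median exceeds 4
```

This nonparametric test is appropriate here because, as the paper notes, Likert responses are ordinal and non-normally distributed.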


Fig. 3. Evaluation results of MediTool.

In general, the participants (across all the gamified versions) perceived that it was at least quite likely that the gamification helped improve their learning performance. However, the average score on the traditional exams of the participants using the immersive-virtual-reality version (which also included gamification elements) was not as high as the average score of the participants using the other gamified versions. This contrasts even more with the fact that the immersive-virtual-reality version was perceived as quite likely to be useful (the same as the other versions) and extremely likely to be usable (among the most usable versions). Moreover, in terms of gamification, the immersive-virtual-reality version received the highest rating: the participants indicated that it was extremely likely that the gamification of the immersive-virtual-reality version helped improve their learning performance. A potential explanation for this result is that the participants who evaluated the immersive-virtual-reality version had fewer years of medical school preparation (with a median of 3) compared to the other evaluation groups (with a median of 5), see Table 1. Another factor that may have affected the average exam score of the participants who used the immersive-virtual-reality version is that, while using MediTool, some of them moved hesitantly because they could not see the physical real-world environment and may not have been aware of its boundaries. It is hypothesized that this uncertainty may have distracted students from learning. Another potential explanation (based on the authors' observations) is that the participants who used the immersive-virtual-reality version spent some time exploring the environment instead of studying the clinical cases. This may have been because, for the majority of these participants, it was the first time they had used an immersive-virtual-reality application.

It is acknowledged that, in this study, the assessment of learning performance was limited to administering an exam on the clinical cases taught in MediTool and grading the participants' responses. This assessment method mainly evaluates the participants' short-term retention (or working memory), which plays a role in learning [22] and, according to [21], even underpins learning. Nonetheless, to completely determine participants' learning performance, additional metrics should be involved and more elaborate assessment methods are required.

5.2. Discussion about usefulness and ease of use

In general, all the versions of MediTool were perceived as extremely usable, except the gamified 3D version, which was perceived as quite usable (Fig. 3). In this regard, according to comments from two participants, the gamified 3D version may not have been perceived as extremely likely to be usable because it was difficult to interact with the environment using avatars. In spite of this difference in usability, all the versions of MediTool were perceived as quite useful (Fig. 3). Nonetheless, it should be noted that, for the gamified 3D version and the immersive-virtual-reality version, the most rigorous participants considered them only slightly likely to be useful. However, in the case of the 2D version and the gamified 2D version, there were two participants (one in each evaluation) who indicated that the corresponding version of MediTool was quite unlikely to be useful and extremely unlikely to be useful, respectively. This may suggest that including 3D components, either through non-immersive virtual reality or immersive virtual reality, may have helped improve the perception of the potential usefulness of

Downloaded for Anonymous User (n/a) at University of Sharjah from ClinicalKey.com by Elsevier on September 22,
2023. For personal use only. No other uses without permission. Copyright ©2023. Elsevier Inc. All rights reserved.
O. López Chávez, et al. International Journal of Medical Informatics 141 (2020) 104226

Fig. 4. Detailed evaluation results of MediTool. [Figure: three bar charts showing the distribution of responses, as counts per answer (1) to (7), for the 2D, gamified 2D, gamified 3D, and immersive-virtual-reality versions. The first chart covers Questions 1 to 6; "Responses regarding ease of use" covers Questions 7 to 12; "Responses regarding gamification" covers Questions 13 to 18. Values and interpretation for possible answers are as follows: (7) extremely likely, (6) quite likely, (5) slightly likely, (4) neither, (3) slightly unlikely, (2) quite unlikely, and (1) extremely unlikely.]
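The perception ratings discussed in this section come from 7-point Likert items, which the study analyzes with one-sample Wilcoxon signed-rank tests. Purely as an illustration (the actual response data are not reproduced in this excerpt, so the sample below is hypothetical), a self-contained sketch of an exact one-sided version of that test against the scale midpoint of 4 ("neither"):

```python
from itertools import product

def ranks_with_ties(values):
    """1-based ranks of values, averaging the ranks of tied values."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of the 1-based ranks i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def wilcoxon_one_sample(sample, hypothesized):
    """Exact one-sample Wilcoxon signed-rank test (one-sided, 'greater').

    Returns (W+, p-value) for H1: population location > hypothesized.
    Zero differences are discarded, following Wilcoxon's rule; the exact
    p-value enumerates all 2**n sign patterns, so n must be small.
    """
    diffs = [x - hypothesized for x in sample if x != hypothesized]
    ranks = ranks_with_ties([abs(d) for d in diffs])
    w_plus = sum(r for r, d in zip(ranks, diffs) if d > 0)
    n = len(diffs)
    count = sum(
        1
        for signs in product((0, 1), repeat=n)
        if sum(r for r, s in zip(ranks, signs) if s) >= w_plus
    )
    return w_plus, count / 2 ** n

# Hypothetical 7-point Likert responses (7 = extremely likely) for one question,
# tested against the scale midpoint of 4.
responses = [7, 6, 7, 5, 6, 7, 6, 7, 6, 7]
w_plus, p = wilcoxon_one_sample(responses, 4)
print(w_plus, round(p, 5))  # prints: 55.0 0.00098
```

The exhaustive enumeration is only feasible for small samples; for group sizes like those in this study, a library routine such as scipy.stats.wilcoxon (which also offers a normal approximation) would be the practical choice.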

MediTool regardless of whether it may be difficult to interact with 3D environments using avatars (as indicated by a couple of participants in relation to the gamified 3D version).

6. Conclusions and future work

The novelty of this paper resides in the comparative nature of the analysis enabled by implementing and evaluating four versions of an educational tool for the study of clinical cases using different approaches (combining gamification with 2D and 3D virtual environments and fully immersive virtual reality).
The results of this study show that using different design approaches (either 2D, 3D, or immersive virtual reality, with or without gamification) to implement a medical educational tool may affect students' retention of information on clinical cases. In addition, regardless of whether the medical educational tool is perceived as highly useful and highly usable and complies with the expected functional requirements, there were significant differences in the average exam scores obtained by each group of participants. Finally, it should be noted that, in terms of gamification and based on the perception of the participants, the immersive-virtual-reality version received the highest rating, i.e., it was extremely likely that its gamification helped improve their learning performance.
An immediate future research direction is to explore how learning efficiency and effectiveness evolve with respect to usage time when medical students use either 2D, 3D, or immersive-virtual-reality medical educational tools.
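The conclusions report significant differences in the average exam scores across the four evaluation groups, although this excerpt does not state which omnibus test produced that result. As an illustration only, the following sketch computes a one-way ANOVA F statistic over hypothetical per-group scores (the group values are invented for the example and are not the study's data):

```python
def one_way_anova_f(groups):
    """One-way ANOVA F statistic for k independent groups.

    F = (between-group mean square) / (within-group mean square),
    with k - 1 and N - k degrees of freedom.
    """
    all_scores = [x for g in groups for x in g]
    n_total, k = len(all_scores), len(groups)
    grand_mean = sum(all_scores) / n_total
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand_mean) ** 2 for g, m in zip(groups, means))
    ss_within = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n_total - k))

# Hypothetical exam scores, one list per evaluation group (one group per
# version of the tool); the real scores are not reproduced in this excerpt.
scores = [
    [80, 82, 78],  # 2D version
    [90, 92, 88],  # gamified 2D version
    [70, 72, 68],  # gamified 3D version
    [85, 87, 83],  # immersive-virtual-reality version
]
print(round(one_way_anova_f(scores), 4))  # prints: 54.6875
```

With small groups and ordinal-leaning exam data, a rank-based omnibus test such as Kruskal-Wallis would be an equally reasonable choice; the sketch above only shows the general shape of a between-group comparison.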


Summary table

What was already known on the topic:

• The workings of medical educational tools are implemented using a myriad of approaches ranging from presenting static content to immersing students in gamified virtual-reality environments.
• Current medical educational tools have been positively evaluated as tools that engage students and improve learning; however, there is a lack of studies comparing and discussing their design approaches and technological elements in relation to learning performance.

What this study added to our knowledge:

• Regardless of whether different versions of a medical educational tool (complying with the same functional requirements) are perceived as equally useful and usable, the design approach (either 2D, 3D, or immersive virtual reality, with or without gamification) affects students' retention of information on clinical cases.
• Based on the perception of the participants, the gamification of an immersive-virtual-reality version of a medical educational tool helped the most to improve learning performance in comparison with gamified 2D and 3D versions.

Authors' contributions

OL developed the four versions of the medical educational tool. OL, LR, and JG designed the experiments. OL performed the experiments. OL, LR, and JG conducted the data analysis and interpretation. LR and JG conceived and designed the research project. All authors have contributed to the manuscript and have read and approved the final version.

Conflicts of interest

None declared.

Acknowledgments

This work was supported by PROFEXCE 2020 and PROFAPI 2020. J. O. Gutierrez-Garcia gratefully acknowledges the financial support from the Asociación Mexicana de Cultura, A.C.
