
International Journal of Industrial Ergonomics 68 (2018) 110–117


Comparing learning outcomes in physical and simulated learning environments

Myrtede Alfred, David M. Neyens∗, Anand K. Gramopadhye
Department of Industrial Engineering, Clemson University, Clemson, SC, 29634, USA

Keywords: Simulated learning environments; 2D environments; 3D environments

ABSTRACT

The use of 2D and 3D simulated learning environments in education and training has increased significantly in the past decade. Simulated learning environments provide several advantages over physical learning environments, including increased safety and accessibility. Simulated learning environments can also be utilized in an online setting, increasing the efficiency of delivery and access and supporting greater personalization of the learning process. Despite a long history of use in workforce education, researchers have questioned whether simulations provide learners with the same quality of education as physical learning environments. This research investigated how learning to construct electrical circuits using a 2D simulation, a 3D simulation, or a physical breadboard impacted learning outcomes. Additionally, this study examined the influence of learner characteristics, cognitive ability and goal orientation, on the relationship between the simulated learning environments and learning outcomes. The study utilized a pretest-posttest between subjects design and included 48 participants. Results suggest that learning to construct a circuit with physical components results in higher self-efficacy, faster construction times, and higher odds of correct construction than learning in a 2D or 3D simulation. Participants in the three conditions achieved comparable results in terms of cognitive outcomes; the differences identified were based on cognitive ability and goal orientation. There were no significant differences in outcomes achieved between participants in the 2D and 3D simulations. Implications for the design of simulated learning environments and the potential impact for online technical curriculum are discussed.

Relevance to industry: This study supports the evaluation of using online educational technology to learn technical skills. This is relevant to workforce education, especially with a diverse and distributed workforce that requires technical training.

1. Introduction

Technical education has been slower than other disciplines in adopting online delivery for course and laboratory instruction (Bernard et al., 2004). This is, in part, due to the belief that laboratory education for technical skills requires hands-on, classroom-based instruction that simulated environments cannot provide (Bourne et al., 2005; Zacharia and Olympiou, 2011). This perspective is supported by concerns that the adoption of simulations has occurred more rapidly than the empirical evidence supporting their effectiveness (Goode et al., 2013) and by recognition that offering technical courses, specifically those requiring a lab component, in an online setting requires the development of pedagogies that support course adaptation and effective evaluation (Bernard et al., 2004). However, simulated learning environments provide several advantages over physical learning environments, including a safer learning environment that allows learners to practice at their own pace, on their own schedule, and until the point of proficiency (Krueger, 1991; Zacharia, 2007). Simulations can also be delivered in an online setting, allowing increased access and efficiency of delivery, and greater personalization of the learning process (Henderson et al., 2015; Kim et al., 2013). Developing effective simulations for technical courses, including simulated learning environments to support laboratory-based instruction, is instrumental for increasing educational access and opportunities for students and for fully exploiting the benefits of online education. This research sought to evaluate the influence of simulated learning environments, both 2D and 3D, on learning outcomes for a technical course with a corresponding laboratory-based activity.


Corresponding author. Clemson University, 100 Freeman Hall, Clemson, SC, 29634, USA.
E-mail address: [email protected] (D.M. Neyens).

https://doi.org/10.1016/j.ergon.2018.07.002
Received 15 December 2016; Received in revised form 16 June 2018; Accepted 2 July 2018
0169-8141/ © 2018 Elsevier B.V. All rights reserved.

2. Background

2.1. Physical and simulated laboratory instruction

Laboratory instruction is a key educational feature for technical disciplines as well as science, engineering, and math. These learning environments were developed with the belief that understanding how to apply science to solve real world problems requires both theory and practice (Auer et al., 2003). In laboratories, students may study proper laboratory technique, develop analytical thinking, connect theory to practice, and gain hands-on experience (Woodfield et al., 2005). Students also engage in active learning, conduct experiments, and employ problem-solving skills that facilitate the application of theory in practical situations (Feisel and Rosa, 2005). Laboratory instruction has typically occurred in a classroom environment where students work individually or in teams and are guided by an instructor or teaching assistant. Using physical equipment and materials during laboratory instruction represents the highest level of fidelity. Physical learning environments allow students to experience the sensory characteristics of the equipment and experiments and to gain familiarity with the environment within which they will be used (Zacharia, 2007; Zacharia and Olympiou, 2011).

Simulations are designed to model the core principles of a particular system (Jaakkola et al., 2011). Simulated learning environments include 2D, desktop 3D, and immersive virtual environments. In addition to providing increased accessibility, these environments can foster the attention and engagement of students more readily than traditional methods (Adams et al., 2008; Stone, 2001). Simulations have the ability to "make the invisible visible" (e.g., showing the current flow of an electric circuit), which can help students learn complex relationships (Finkelstein et al., 2005; Jaakkola et al., 2011). Simulations also have the added benefit of helping students to learn in an ideal environment where they can focus on exploring concepts without the complications associated with equipment and device reliability (Finkelstein et al., 2005). However, simulations limit students from experiencing hands-on manipulation of real materials, may lack the necessary detail and realism to effectively teach scientific techniques, and can distort reality (Scheckler, 2003; Woodfield et al., 2005). Simulations also lack physicality, which is "the actual and active touch of concrete material" and which is believed to be important for learning (Zacharia and Olympiou, 2011, p. 318). Other researchers have suggested that it is the active manipulation, rather than the physicality, that is the most important element of laboratory instruction (Resnick, 1998) and that physicality may only be necessary for perceptual-motor skills (Triona and Klahr, 2003).

Several studies have evaluated using simulated environments in laboratory instruction as a supplement, a substitute, or in some combination with physical instruction. Research has found both positive and negative effects of simulation-based instruction on learning outcomes (Lee, 1999; Sitzmann, 2011). Simulations have been found beneficial for helping students prepare for lab (Dalgarno et al., 2009; Martinez-Jimenez et al., 2003), and students learning in simulated environments can outperform students learning in physical environments (Campbell et al., 2002; Finkelstein et al., 2005). A meta-analysis by Lee (1999) also found that simulations had a positive effect on learning but reported a negative effect on students' affect for technology-based learning. Combined simulation and physical instruction was found to result in superior learning outcomes compared to learning solely in a physical environment (Campbell et al., 2002; Jaakkola et al., 2011; Zacharia, 2007), and simulations were effective for both presentation and practice when used with other instructional methods (Lee, 1999).

Simulations, however, can also vary greatly in their level of fidelity. Instruction using a 2D simulation might be less effective, as 2D representations may be inherently deficient relative to 3D representations and translating the representation from 2D to 3D may result in additional cognitive load for learners (Regian et al., 1992; Richards and Taylor, 2015). The use of 3D representations provides more flexibility and realism; however, the increased complexity can make it difficult for inexperienced users to navigate and attend to all of the information being conveyed, resulting in degraded performance (Gillet et al., 2013; Sampaio et al., 2010; Stuerzlinger and Wingrave, 2011). Technical issues like poor resolution and lag in a 3D virtual environment can also lead to performance deficiencies (Kenyon and Afenya, 1995). Prior research has suggested that higher levels of fidelity are not necessary, and are sometimes even detrimental, to learning and transfer (Alexander et al., 2005). Additional research is needed to understand which aspects of 2D and 3D virtual representations of tasks are beneficial for learning, as well as which tasks, contexts, and domains may be best suited for these types of technologies (Richards and Taylor, 2015).

2.2. Learner characteristics in simulated learning environments

Learner characteristics influence instructional effectiveness and learning outcomes (Anderson, 1982; De Raad and Schouwenburg, 1996; Noe, 1986; Snow, 1989; Shute and Towle, 2003). Personality features are believed to impact affect, overlay features influence domain knowledge, and cognitive features influence students' information processing (Kim et al., 2013). This study focused on two learner characteristics, goal orientation and cognitive ability, previously found to influence learning outcomes. Goal orientation, commonly conceptualized as performance goal orientation (PGO) and learning (or mastery) goal orientation (LGO), describes the way an individual approaches an achievement task (Button et al., 1996; Elliott and Dweck, 1988). A PGO leads learners to focus on a narrow set of concepts, impeding the learning of more involved task relationships, which results in good initial performance but poor ability to apply the material in other contexts (Kozlowski et al., 2001). A LGO fosters a desire to explore relationships in greater depth and to acquire the knowledge and skills required for competency while building task-specific self-efficacy (Kozlowski et al., 2001). Cognitive ability describes an individual's capacity to perform higher-order mental processes such as critical thinking and problem-solving (Clark and Voogel, 1985). Higher cognitive ability is associated with learning, retention, and application of skills and knowledge (Busato et al., 2000; Clark and Voogel, 1985). Lower cognitive ability individuals may need a more structured learning environment (Snow, 1989), suggesting that the less structured and more autonomous nature of 2D or 3D environments may be detrimental to low cognitive ability learners, particularly in online settings. Currently, the authors are unaware of research investigating moderation effects of goal orientation and cognitive ability on fidelity (e.g., 2D simulation, 3D simulation, or physical labs) for learning outcomes.

2.3. Purpose of this study

Although previous research has identified value in using simulations as a supplement or in combination with laboratory education, little research outside of workforce training has specifically investigated the differences in outcomes between 2D and 3D simulations as well as the influence of learner characteristics (Kim et al., 2013; Richards and Taylor, 2015). The current study aimed to explore the role of the fidelity of the learning environment by comparing learning outcomes associated with learning in a 2D, 3D, or physical environment. This research also aimed to investigate the roles of goal orientation and cognitive ability on learning outcomes for participants learning in those different environments.

3. Methods

3.1. Participants


Participants were recruited using word of mouth, flyers, and email blasts. To be eligible, participants could not have been currently enrolled in or taken a circuits-based class during the previous academic year. Additionally, each participant must have been able to self-report an ACT or SAT score. This study was approved by the Clemson University IRB (# IRB 2015-001).

3.2. Experimental design and measures

This study utilized a pretest-posttest between subjects design. The fidelity of the learning environment (with three levels: physical, 2D simulation, and 3D simulation) was the between subjects variable and primary independent variable (IV). The covariates included pretest scores, cognitive ability, and goal orientation. Pretest scores were used to control for individual differences in baseline knowledge and any exposure to electrical circuits that was not restricted due to the study design. Participants' SAT or ACT scores were used as a proxy for cognitive ability. Past research has demonstrated that the SAT (r = 0.82) and the ACT (r = 0.77) both have a strong correlation with cognitive ability (Koenig et al., 2008; Noftle and Robins, 2007) and a strong correlation (r = 0.87) with one another (Dorans, 1999). For consistency, ACT composite scores were converted to total SAT scores for the analysis using the conversion chart developed by Dorans (1999). Learning and performance goal orientations were each assessed using an eight question instrument developed by Button et al. (1996). The reliability of these questionnaires, indexed by Cronbach's alpha, was 0.72 for PGO and 0.78 for LGO. These instruments used a five-point Likert scale anchored by strongly disagree and strongly agree.

In order to facilitate a holistic evaluation of learning, the dependent measures included affective, cognitive, and skill-based outcomes (Kraiger et al., 1993). The affective measure was self-efficacy (SE), which describes a participant's belief in their ability to perform a task (Guthrie and Schwoerer, 1994). It was measured post instruction, to assess participants' confidence that they had learned and could perform the instruction-related task, on a five-point Likert scale using a six question instrument with a reliability of α = .82 (Guthrie and Schwoerer, 1994). The cognitive outcomes were gain score and circuit design score. Gain score describes the improvement from the pretest score to the posttest score and was calculated by subtracting the pretest score from the posttest score. Circuit design score was graded on a three level scale: no errors, minor errors, and major errors. Major errors included mistakes such as designing a series circuit instead of a parallel circuit, and minor errors included using incorrect symbols. The skill-based outcomes were construction time, measured in minutes, and circuit construction score, which followed the same scale used for circuit design score. With respect to circuit construction grades, major errors included mistakes such as an inability to close the circuit properly, while minor errors included incorrect placement of the switch.
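As a rough illustration of how the questionnaire reliabilities and gain scores described above can be computed (the data file and item column names below are hypothetical placeholders, not taken from the study), a minimal Python sketch might look like this:

```python
# Minimal sketch of the reliability and gain-score calculations described above.
# Column names (pgo_1..pgo_8, lgo_1..lgo_8, pretest, posttest) are hypothetical.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of Likert items (rows = respondents)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

df = pd.read_csv("responses.csv")  # hypothetical file, one row per participant

pgo_alpha = cronbach_alpha(df[[f"pgo_{i}" for i in range(1, 9)]])  # eight PGO items
lgo_alpha = cronbach_alpha(df[[f"lgo_{i}" for i in range(1, 9)]])  # eight LGO items

# Gain score: posttest minus pretest, both expressed as proportions correct.
df["gain"] = df["posttest"] - df["pretest"]

print(f"alpha(PGO) = {pgo_alpha:.2f}, alpha(LGO) = {lgo_alpha:.2f}")
print(df["gain"].describe())
```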
3.3. Procedures

After completing the consent form, participants completed a five question multiple choice, paper-based pretest examining their knowledge of basic electrical concepts. The pretest included questions about defining electrical concepts (e.g., voltage, resistance, and current), identifying circuit diagram symbols (e.g., switches, resistors, battery, and light emitting diodes), designing a circuit diagram, demonstrating understanding of breadboard functionality, and applying Ohm's law. Each question had four answer options. Next, participants were set up at the study workstation, which included a dual-monitor desktop, mouse, and keyboard. Participants completed the demographic survey, where they reported their SAT/ACT scores, and the goal orientation instruments. They then watched a 28 min video lecture on circuit analysis and basic circuit construction. The video lecture contained three sections, and each section included learning objectives and practice exercises.

Following this instruction, students were randomly assigned to one of the experimental conditions (2D, 3D, or physical environment) and watched two videos demonstrating how to construct a series and a parallel circuit. Because students in the 2D and 3D conditions also had to learn to use the software, the instructional videos for each of the conditions varied in length. In total, each video ranged from 7 min to 17 min. These videos were designed specifically to teach participants how to construct circuits in their assigned condition. The practice circuits constructed were identical. Participants in the physical condition practiced constructing circuits using an 800-point solderless breadboard (Fig. 1). Participants in the 2D condition practiced using a 2D breadboard simulation (Fig. 2) and participants in the 3D condition practiced using a 3D breadboard (Fig. 3). Participants in these two conditions navigated the 2D and 3D environments using a mouse and keyboard. All participants used comparable circuit components, including light emitting diodes (LEDs), switches, batteries, and resistors.

[Fig. 1. 800 point solderless breadboard.]
[Fig. 2. Arduino 2D breadboard (123d.circuits.io).]
[Fig. 3. National Instruments Multisim 3D breadboard.]

During these videos, participants were shown how to use Ohm's law to calculate the resistance values needed for their circuit, how to design their circuit diagram, and how to construct their circuit. Participants were given three practice activities to complete. One of these practice activities instructed participants to complete a series circuit using a three prong switch, the second activity had participants construct a parallel circuit, and the last activity demonstrated how to construct a circuit with parallel and series connections. During these practice sessions, they were provided with feedback concerning the accuracy of their calculations and the construction of the circuit and were referred to the appropriate video for review if they made errors. Participants were not allowed to continue the experiment until they had successfully completed the practice activities. Although this requirement led to varying practice times, it was essential that participants demonstrate a minimum level of proficiency before continuing. The study set-up included a computer workstation with two monitors so that participants could watch the video on one screen while constructing their practice circuits on the second screen. Students in all conditions had access to the instructional videos during their practice session.


Following these practice sessions, the participants completed a survey assessing their post-training SE and a 5-question multiple choice, paper-based posttest. The posttest was of the same structure and length, and used the same types of questions, as the pretest. Finally, the participants from all conditions constructed a circuit on a physical breadboard without access to the video lectures. Participants had to first design the circuit and use Ohm's law to determine the correct amount of resistance needed based on the voltage source they selected (a 9 V battery or 1.5 V AA batteries). The circuit that needed to be designed was a simple circuit that included a switch and 3 LEDs. The circuit needed to be constructed such that two LEDs were connected in series and powered by a switch and the third LED was connected in parallel. Construction time started once participants received the directions and ended when participants submitted their final circuit. While completing this construction task, they were video recorded using a GoPro Hero4 Black camera. The camera was positioned above the participant to record an aerial view of their work surface without being intrusive.
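As a concrete illustration of the Ohm's law step in this task, the short calculation below sizes a series resistor for the 9 V source; the LED forward voltage and target current are assumed values chosen for the example and were not specified in the study.

```python
# Hypothetical worked example of the Ohm's law calculation participants performed.
# Component values (LED forward voltage, target current) are assumed, not from the study.
V_SOURCE = 9.0        # volts, the 9 V battery option
V_LED = 2.0           # assumed forward voltage drop per LED, volts
N_SERIES_LEDS = 2     # the series branch of the task circuit
I_TARGET = 0.020      # assumed LED current, amps (20 mA)

# Ohm's law (R = V / I) applied to the voltage remaining after the LED drops.
v_resistor = V_SOURCE - N_SERIES_LEDS * V_LED
r_series = v_resistor / I_TARGET
print(f"Series resistor for the two-LED branch: {r_series:.0f} ohms")  # 250 ohms
```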

3.4. Analysis

The data analysis was conducted using SPSS 22. ANOVAs were used to analyze the effects of the predictor variables on self-efficacy, gain scores, and construction time. An ordered logistic regression was used to analyze the effects of the predictor variables on circuit design grade and circuit construction grade. All of the models were evaluated at the alpha = 0.05 level. Prior to the analysis, the data was evaluated to ensure it met the assumptions for an ANOVA as well as the requisite assumptions for an ordered logistic regression. For the ordered logistic regressions, the continuous variables were dichotomized into high and low values based on a median split.
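The analysis itself was run in SPSS 22. As a rough, non-authoritative sketch of the same modeling steps in Python (the data file and column names below are hypothetical), the workflow might look like the following; exponentiating the ordered-logit coefficients gives odds ratios of the kind reported in the results.

```python
# Illustrative re-creation of the analysis pipeline described above (the study
# used SPSS 22; the data file and column names here are hypothetical).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf
from statsmodels.miscmodels.ordinal_model import OrderedModel

df = pd.read_csv("study_data.csv")  # one row per participant (hypothetical file)

# ANOVA for a continuous outcome, e.g., self-efficacy with fidelity, LGO, and PGO.
se_model = smf.ols("self_efficacy ~ C(fidelity) + lgo + pgo", data=df).fit()
print(sm.stats.anova_lm(se_model, typ=3))

# Median split of the continuous predictors, as used for the ordered logits.
for col in ["cognitive_ability", "lgo", "pgo"]:
    df[col + "_hi"] = (df[col] > df[col].median()).astype(int)

# Ordered (proportional-odds) logistic regression for circuit design grade,
# ordered from major errors to no errors.
df["design_grade"] = pd.Categorical(
    df["design_grade"], categories=["major", "minor", "none"], ordered=True
)
design_fit = OrderedModel(
    df["design_grade"],
    df[["cognitive_ability_hi", "lgo_hi", "pgo_hi"]],
    distr="logit",
).fit(method="bfgs")
print(design_fit.summary())
print(np.exp(design_fit.params))  # odds ratios (threshold terms included)
```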
4. Results

Participants for this study included 48 undergraduate and graduate students from a public mid-sized Southeastern University. Engineering students represented approximately 33% of the participants, undergraduates accounted for 50%, and females comprised 62.5%. Most participants (79%) reported that they were in the 18–27 year old category and the remaining 21% were 28 years old or older. The majority of the participants (92%) reported having little to no prior experience working with circuits (Table 1).

Table 1
Participant demographics.
Age: 18–27, 38 (79%); 28+, 10 (21%)
Gender: Female, 30 (62.5%); Male, 18 (37.5%)
Major/Program: Engineering, 16 (33.3%); Non-engineering, 32 (66.7%)
Circuit experience: Little to none, 44 (92%); More than a little, 4 (8%)
Classification: Undergraduate, 24 (50%); Graduate, 24 (50%)

The data for one participant, who was in the physical condition, was removed due to failure to report SAT or ACT scores as required by the study. Furthermore, three additional participants, all in the 3D environment, withdrew from the study, resulting in a different sample size for the circuit design and circuit construction activities. The total number of participants in each condition for all of the dependent measures was 15 in the physical condition, 16 in the 2D condition, and 13 in the 3D condition. A one-way ANOVA found no significant differences, F(2,44) = .123 (p = .884), in the scores from the pretest assessing knowledge of circuit theory and construction for the participants in the three conditions. This suggests that there were no detectable differences in the pre-existing knowledge of participants in the three conditions.

4.1. Main effects of fidelity and co-variates on learning outcomes

Four participants did not complete the SE survey, resulting in a total of 43 observations. The predictor variables included in the ANOVA model for SE were fidelity, LGO, and PGO. Only fidelity, F(2,39) = 3.809 (p = .031), was a significant predictor of SE (Table 2). The mean SE was 4.36 (SD = .58) for participants in the physical condition, 3.76 (SD = .67) for participants in the 2D condition, and 3.93 (SD = .75) for participants in the 3D condition (Fig. 4). Post hoc analysis completed using the least significant difference (LSD) test revealed significant differences in SE between participants in the physical condition and participants in the 2D condition (p = .014) and also between participants in the physical condition and the 3D condition (p = .038). Fidelity had a unique effect size of sr2 = .378, accounting for 37.8% of the variation in participants' self-rated SE.

Table 2
ANOVA for participants' self-efficacy following instruction and practice.
           Sum of Squares   DF   Mean Square   F      P-value
Fidelity   3.16             2    1.58          3.81   0.031
PGO        1.35             1    1.35          3.26   0.079
LGO        0.666            1    0.666         1.61   0.212
Error      16.16            39   0.414
Total      21.2             43
R Squared = .237 (Adjusted R Squared = .159)

[Fig. 4. Means with 95% confidence interval of self-efficacy by condition.]

The average gain score for all conditions was 0.24 (SD = .21), based on a maximum score of one. The pretest scores ranged from 0.10 to 0.80 and the posttest scores ranged from 0.45 to 1.00. The ANOVA model for gain score included the predictor variables of LGO, PGO, cognitive ability, and pretest scores. LGO, F(1,40) = 5.02 (p = .031), cognitive ability, F(1,40) = 6.49 (p = .015), and pretest scores, F(1,40) = 31.09 (p < .001), were significant predictors of gain score. Pretest scores accounted for the largest percentage of the variation in gain scores, sr2 = .378, with cognitive ability and LGO also contributing small unique effects, sr2 = .06 and sr2 = .057, respectively. Fidelity and PGO were not significant predictors of gain score (Table 3).

Table 3
ANOVA for participants' gain score from the pre-test to the post-test.
                    Sum of Squares   DF   Mean Square   F      P-value
Fidelity            0.07             2    0.034         1.52   0.232
LGO                 0.11             1    0.113         4.86   0.031
PGO                 0.00             1    0.005         0.23   0.886
Cognitive ability   0.15             1    0.118         5.10   0.015
Pretest score       0.70             1    0.741         32.0   < 0.001
Error               0.90             40   0.023
Total               1.98             46
R Squared = 0.544 (Adjusted R Squared = 0.476)

Table 4
Frequency of errors in participants' circuit design task.
Condition   No errors   Minor errors   Major errors   Total
Physical    9           5              1              15
2D          7           6              3              16
3D          7           5              2              14
Total       23          16             6              45

Circuit design was graded on a scale ranging from no errors to major errors (Table 4). As one participant completed the circuit diagram prior to withdrawing from the study, there were a total of 45 observations for this model. The majority of participants (51%) were able to correctly design the circuit (Table 4). An ordered logistic regression was used to analyze the effects of the IVs – cognitive ability, LGO, and PGO – on circuit design grades. The test of parallel lines for the ordered logistic model was found to be insignificant, suggesting the proportional odds assumption was met (p = .161). For circuit design, only cognitive ability was found to be a significant predictor, Wald χ2 (1, N = 45) = 5.51 (p = .019). The odds of designing the circuit correctly were 4.57 times higher [95% CI: 1.32, 17.15] for participants with high cognitive ability compared to participants with low cognitive ability. Fidelity, PGO, and LGO were not significant predictors.
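For readers less familiar with proportional-odds output, an odds ratio and its confidence interval of this kind are obtained by exponentiating the logit coefficient and its Wald interval; the expression below is the general formula, not the specific coefficient estimates from this study, which are not reported here.

```latex
\mathrm{OR} = e^{\hat{\beta}}, \qquad
95\%\ \mathrm{CI} = \left[\, e^{\hat{\beta} - 1.96\,\mathrm{SE}(\hat{\beta})},\; e^{\hat{\beta} + 1.96\,\mathrm{SE}(\hat{\beta})} \,\right]
```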
Only 44 participants attempted construction, as three participants withdrew and the data for one participant was removed from the analysis. The predictor variables included in the ANOVA model for construction time were fidelity, goal orientation, and cognitive ability. Fidelity was a significant predictor of construction time, F(2,33) = 4.87 (p = .014) (see Table 5). The mean construction time differed among the three conditions, with participants in the physical condition taking 15.47 min (SD = 12.39 min), participants in the 2D simulation condition taking 29.88 min (SD = 14.76 min), and participants in the 3D condition taking 30.43 min (SD = 16.91 min). Post hoc analysis using LSD found significant differences between the physical condition and the 2D condition (p = .018) as well as between the physical condition and the 3D condition (p = .019). There were no significant differences in mean construction times between the 2D and 3D conditions (p = .620). Fidelity accounted for 19% of the variation in participants' construction time, sr2 = .19. LGO, PGO, and cognitive ability did not have any main effects on construction time.

Table 5
ANOVA for participants' construction time on the physical breadboard.
                             Sum of Squares   DF   Mean Square   F       P-value
PGO                          12.35            1    12.35         0.056   0.815
LGO                          1.06             1    1.06          0.005   0.945
Cognitive ability            254.46           1    254.46        1.150   0.292
Fidelity                     2161.65          2    1080.83       4.870   0.014
Fidelity*Cognitive ability   103.67           2    51.84         0.233   0.793
Fidelity*LGO                 1513.34          2    756.67        3.410   0.045
Fidelity*PGO                 322.13           2    161.07        0.725   0.492
Error                        7328.61          33   222.08
Total                        11288.31         44
R Squared = .351 (Adjusted R Squared = .134)

Circuit construction, like circuit design, was also scored on a scale ranging from no errors to major errors (Table 6). Of the 44 participants who attempted construction, 52% were able to correctly construct the circuit. An ordered logistic regression was used to analyze the effects of all the IVs – fidelity, cognitive ability, LGO, and PGO – as well as circuit design score on circuit construction. The proportional odds assumption for this model was also met, as the test of parallel lines was found to be insignificant (p = .77). For circuit construction, circuit design score, Wald χ2 (1, N = 44) = 5.32, p = .024, and fidelity, Wald χ2 (2, N = 44) = 2.93, p = .021, were found to be significant predictors. The odds of constructing the circuit correctly for participants who made major errors in their circuit design were .04 times [95% CI: .003, .617] the odds for participants who made no errors. Additionally, the odds for participants in the 3D condition were .064 times [95% CI: .003, .617] the odds for participants in the physical condition.

Table 6
Frequency of errors in participants' circuit construction grades.
Condition   No errors   Minor errors   Major errors   Total
Physical    11          2              2              15
2D          8           2              6              16
3D          4           2              7              13
Total       23          6              15             44

4.2. Moderating effects of learner characteristics on outcomes

LGO was found to moderate the relationship between fidelity and construction time, F(2, 33) = 3.41 (p = .045) (Table 5). The moderation effect uniquely accounted for 13.4% of the variation in construction time, sr2 = .134. In the 3D condition, participants who had a higher than average LGO constructed their circuit faster (24.11 min, SD = 9.61 min) than those with a lower than average LGO (41.8 min, SD = 22.21 min). In the 2D condition, participants with a higher than average LGO took longer to construct their circuit (33.86 min, SD = 17.32 min) than participants with a lower than average LGO (26.78 min, SD = 12.61 min) (Fig. 5). In the physical condition, participants with a higher than average LGO also constructed their circuit more slowly (18.22 min, SD = 15.24 min) than those with a lower than average LGO (11.43 min, SD = 4.89 min). Further analysis found that this pattern was consistent even after removing participants who gave up or were ultimately unsuccessful in their construction attempt.

[Fig. 5. LGO and physical fidelity interaction on construction time.]
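To make the moderation test above concrete, the sketch below shows how the fidelity × goal-orientation interaction terms in Table 5 could be fit and probed; it is an illustrative Python equivalent of the SPSS analysis, with a hypothetical data file and column names.

```python
# Hypothetical sketch of the moderation (interaction) analysis summarized above;
# the study ran this in SPSS, and the column names here are illustrative only.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("study_data.csv")

# Construction-time ANOVA with fidelity crossed with LGO, PGO, and cognitive
# ability, mirroring the terms listed in Table 5.
model = smf.ols(
    "construction_time ~ C(fidelity) * (lgo + pgo + cognitive_ability)",
    data=df,
).fit()
print(sm.stats.anova_lm(model, typ=3))

# Probe the fidelity x LGO interaction with cell means for a median split of LGO.
df["lgo_group"] = (df["lgo"] > df["lgo"].median()).map(
    {True: "high LGO", False: "low LGO"}
)
print(df.groupby(["fidelity", "lgo_group"])["construction_time"].agg(["mean", "std", "count"]))
```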
5. Discussion

Self-efficacy (SE) is an important learning outcome as it influences effort, persistence, and emotional response (Zimmerman, 2000). Participants in the physical condition had a higher mean SE than the participants in both the 2D and 3D conditions. This result contradicted the findings of the meta-analysis conducted by Sitzmann (2011), which found that the SE of participants in simulated environments was 20% higher than that of participants in traditional instructional environments. However, the studies in that meta-analysis replaced traditional classroom methods, lectures and discussion, with simulated learning, whereas this study focused on the technical task, circuit construction, and only varied the practice portion. In this regard, participants in the physical condition had the advantage of fidelity for the circuit construction task, as it was completed on a physical breadboard, although they were unaware of this fact beforehand because the SE instrument was completed before the construction task. Potentially, participants in the simulated environments recognized that constructing a circuit is a hands-on task and that what they learned would inherently be different from how the task would be performed in the real world and, as a result, had a lower self-efficacy. Prior research has found that participants learning in a simulated environment may doubt the real-world application of the phenomena experienced in the simulation (Couture, 2004).

Fidelity did not significantly predict the cognitive outcomes, gain scores or circuit design grades. This finding is in line with prior research that has found no differences in cognitive outcomes between physical and simulated environments when the instructional method is controlled, as it was in this study (Clark, 1994; Jaakkola and Nurmi,
2008; Triona and Klahr, 2003; Zacharia and Olympiou, 2011). Participants in all of the conditions watched the same video lecture and completed the same practice exercises and activities; therefore, it was not anticipated that fidelity would impact the cognitive outcomes. LGO was a significant predictor of gain score, and cognitive ability was a significant predictor of both gain score and circuit design. Both of these characteristics are associated with better educational performance (Button et al., 1996; Clark and Voogel, 1985). Specifically for circuit design, while most participants knew how to construct a diagram, those with a higher cognitive ability were likely better able to design a circuit that was different than what had been designed during practice.

Fidelity was a significant predictor of the skill-based outcomes, construction time and circuit construction. Participants in the physical condition were able to construct the circuit twice as fast as participants in either the 2D or 3D condition and were more likely to construct the circuit correctly. The identical elements theory may explain this difference in construction time between participants in the three conditions, as it posits that there will be higher positive transfer when the instruction environment is identical to the performance environment (Goldstein and Ford, 2002). Participants who practiced in the physical condition had the benefit of a higher level of fidelity, a situation which likely contributed to their ability to construct the circuit much faster than participants in the other two conditions. Participants in the 2D and 3D conditions likely exerted additional effort (and thus time) to acclimate to working with physical components. Furthermore, there were characteristics of the software design, rather than innate characteristics of 2D and 3D learning environments, that may have been detrimental to participants' performance when they transitioned from the simulated environment to the physical environment. Various abstractions, such as keying in a resistance value versus reading it from a resistance color code sheet, and feedback mechanisms, such as displaying blown LEDs or incorrect connections, simplified circuit construction and provided a level of support unavailable to participants when they transitioned to working with the physical breadboard. These findings provide insights regarding how the design of simulated environments can be improved to support learning and transfer. Certain abstractions may need to be removed to facilitate transfer or may need to be introduced only after learners have established proficiency. Similarly, feedback should be reduced as learners gain proficiency so they do not become too dependent on it, which can hinder performance (Goodman and Wood, 2004).

These differences in the 2D and 3D environments potentially influenced the interaction found between LGO and fidelity on construction time. For participants in the 3D condition, having a high LGO resulted in a lower construction time, but in the 2D and physical conditions, having a higher LGO resulted in a higher mean construction time. The 3D software provided immediate feedback about circuit connectivity, and thus participants with a low LGO may have depended more heavily on this feedback than those with a high LGO, and it was detrimental to their performance when they transitioned to the physical environment (Goodman and Wood, 2004). Interestingly, this moderation effect was the only significant difference detected in the outcomes achieved between participants in the 2D and the 3D simulations, although participants in the 3D condition demonstrated lower SE, higher construction times, and lower odds of correctly constructing their circuit. Existing literature has suggested that increasing the level of fidelity does not necessarily improve learning outcomes, as higher levels of fidelity are more difficult to navigate and may increase the cognitive load of participants (Alexander et al., 2005; Gillet et al., 2013; Paas and Sweller, 2014; Stuerzlinger and Wingrave, 2011). The poor results for 3D participants may have been exacerbated by the fact that the majority of the participants (n = 30) were female, and prior research has identified gender differences in spatial ability (Feng et al., 2007). The value of 3D environments over 2D is likely dependent on the task being studied (Richards and Taylor, 2015).

5.1. Limitations

Although both undergraduate and graduate students were used to create a more diverse group of participants, a more representative sample would have included non-traditional students, as they are more likely to enroll in online courses and technical curricula than traditional students (Allen and Seaman, 2007). Participants without any prior experience working on circuits would have been preferable; however, the majority of the participants in the study reported having little to no prior experience. The exclusion criterion for participation, not having taken a circuits course in the past year, is supported by prior research which has found that unused skills and knowledge decay quickly following instruction and then more slowly until they reach pre-instruction levels after several months (Arthur et al., 1998; O'Hara, 1990). The engineering students in the study may have only completed a survey circuits course for non-electrical engineers or may not have taken a circuits course at all, depending on their year.

The sample size for the study was relatively low (16 per condition) and was reduced further due to the withdrawal of several participants. As a result, the power of the analysis was not ideal. Despite the low power, significant effects of the fidelity of the environment or learner characteristics were identified on all of the measured DVs. Dichotomizing continuous variables for the logistic regression facilitated interpretation of the results but also reduced the sensitivity of the analysis. Since both the 2D and 3D software were purchased off the shelf, there were differences in these environments besides fidelity, such as the level of feedback provided, that potentially contributed to the results. Lastly, while circuit construction may be compared to other cognitive procedural tasks, additional research should evaluate the extent to which the results from this study are generalizable across tasks and other types of laboratory-based instruction.


6. Conclusions

Identifying how technical and hands-on tasks can be effectively learned in simulated environments is an important question for expanding course offerings in online education and subsequently increasing access and educational equity. While these findings suggest that instruction using physical components was superior, there was evidence of transfer for participants who learned to construct a circuit in a simulated environment. Improvements in the design of both the 2D and 3D software and better incorporation of instructional design principles could help address the issues participants faced when they transitioned and reduce the time needed to acclimate to the physical environment. The 3D condition appeared to offer no significant advantages over a 2D simulation for helping participants learn to construct a circuit on a breadboard. Therefore, it may not be necessary to devote the time and resources to develop and maintain 3D environments, and, for learners, using 2D environments likely reduces the technological requirements to run the simulation. This finding may be specific to the task, as prior research has identified value in using 3D representations for other tasks, such as navigation (Regian et al., 1992). This research also highlighted the importance of identifying and evaluating learner characteristics when selecting the learning environment, as characteristics such as cognitive ability and goal orientation can influence performance. Future research should continue to identify which learner characteristics are most important and how they impact various learning outcomes in simulated environments.

Acknowledgements

This work was supported by the National Science Foundation under Grant No. DUE-1104181. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation. This work was also supported through a Clemson University Diversity Fellowship, the Bowen Graduate Fellowship, and the Southern Regional Education Board Fellowship. The authors would also like to thank the Ergonomics and Applied Statistics Lab at Clemson University for their review of this manuscript.
References

Adams, W.K., Reid, S., LeMaster, R., McKagan, S.B., Perkins, K.K., Dubson, M., Wieman, C.E., 2008. A study of educational simulations part 1 – engagement and learning. J. Interact. Learn. Res. 19 (3), 397.
Alexander, A.L., Brunyé, T., Sidman, J., Weil, S.A., 2005. From gaming to training: a review of studies on fidelity, immersion, presence, and buy-in and their effects on transfer in PC-based simulations and games, vol. 5. DARWARS Training Impact Group, pp. 1–14.
Allen, I.E., Seaman, J., 2007. Making the Grade: Online Education in the United States, 2006. Sloan Consortium, Newburyport, MA.
Anderson, J.R., 1982. Acquisition of cognitive skill. Psychol. Rev. 89 (4), 369.
Arthur Jr., W., Bennett Jr., W., Stanush, P.L., McNelly, T.L., 1998. Factors that influence skill decay and retention: a quantitative review and analysis. Hum. Perform. 11 (1), 57–101.
Auer, M., Pester, A., Ursutiu, D., Samoila, C., 2003, December. Distributed virtual and remote labs in engineering. In: Industrial Technology, 2003 IEEE International Conference on, vol. 2. IEEE, pp. 1208–1213.
Bernard, R.M., Brauer, A., Abrami, P.C., Surkes, M., 2004. The development of a questionnaire for predicting online learning achievement. Dist. Educ. 25 (1), 31–47.
Bourne, J., Harris, D., Mayadas, F., 2005. Online engineering education: learning anywhere, anytime. J. Eng. Educ. 94 (1), 131.
Busato, V.V., Prins, F.J., Elshout, J.J., Hamaker, C., 2000. Intellectual ability, learning style, personality, achievement motivation and academic success of psychology students in higher education. Pers. Indiv. Differ. 29 (6), 1057–1068.
Button, S.B., Mathieu, J.E., Zajac, D.M., 1996. Goal orientation in organizational research: a conceptual and empirical foundation. Organ. Behav. Hum. Decis. Process. 67 (1), 26–48.
Campbell, J.O., Bourne, J.R., Mosterman, P.J., Brodersen, A.J., 2002. The effectiveness of learning simulations for electronic learning environments. J. Eng. Educ. 91 (1), 81–87.
Clark, R.E., 1994. Media will never influence learning. Educ. Technol. Res. Dev. 42 (2), 21–29.
Clark, R.E., Voogel, A., 1985. Transfer of training principles for instructional design. ECTJ 33 (2), 113–123.
Couture, M., 2004. Realism in the design process and credibility of a simulation-based virtual laboratory. J. Comput. Assist. Learn. 20 (1), 40–49.
Dalgarno, B., Bishop, A.G., Adlong, W., Bedgood, D.R., 2009. Effectiveness of a virtual laboratory as a preparatory resource for distance education chemistry students. Comput. Educ. 53 (3), 853–865.
De Raad, B., Schouwenburg, H.C., 1996. Personality in learning and education: a review. Eur. J. Pers. 10 (5), 303–336.
Dorans, N.J., 1999. Correspondences between ACT™ and SAT® I scores. ETS Res. Rep. Ser. 1999 (1), i–18.
Elliott, E.S., Dweck, C.S., 1988. Goals: an approach to motivation and achievement. J. Pers. Soc. Psychol. 54 (1), 5.
Feisel, L.D., Rosa, A.J., 2005. The role of the laboratory in undergraduate engineering education. J. Eng. Educ. 94 (1), 121–130.
Feng, J., Spence, I., Pratt, J., 2007. Playing an action video game reduces gender differences in spatial cognition. Psychol. Sci. 18 (10), 850–855.
Finkelstein, N.D., Adams, W.K., Keller, C.J., Kohl, P.B., Perkins, K.K., Podolefsky, N.S., Reid, S., LeMaster, R., 2005. When learning about the real world is better done virtually: a study of substituting computer simulations for laboratory equipment. Phys. Rev. ST Phys. Educ. Res. 1 (1), 010103.
Gillet, D., De Jong, T., Sotirou, S., Salzmann, C., 2013, March. Personalised learning spaces and federated online labs for STEM education at school. In: Global Engineering Education Conference (EDUCON), 2013 IEEE. IEEE, pp. 769–773.
Goldstein, I.L., Ford, J.K., 2002. Training in Organizations. Wadsworth, Belmont, CA.
Goode, N., Salmon, P.M., Lenné, M.G., 2013. Simulation-based driver and vehicle crew training: applications, efficacy and future directions. Appl. Ergon. 44 (3), 435–444.
Goodman, J.S., Wood, R.E., 2004. Feedback specificity, learning opportunities, and learning. J. Appl. Psychol. 89 (5), 809.
Guthrie, J.P., Schwoerer, C.E., 1994. Individual and contextual influences on self-assessed training needs. J. Organ. Behav. 15 (5), 405–422.
Henderson, M., Selwyn, N., Aston, R., 2015. What works and why? Student perceptions of 'useful' digital technology in university teaching and learning. Stud. High Educ. 1–13.
Jaakkola, T., Nurmi, S., 2008. Fostering elementary school students' understanding of simple electricity by combining simulation and laboratory activities. J. Comput. Assist. Learn. 24 (4), 271–283.
Jaakkola, T., Nurmi, S., Veermans, K., 2011. A comparison of students' conceptual understanding of electric circuits in simulation only and simulation-laboratory contexts. J. Res. Sci. Teach. 48 (1), 71–93.
Kenyon, R.V., Afenya, M.B., 1995. Training in virtual and real environments. Ann. Biomed. Eng. 23 (4), 445–455.
Kim, J., Lee, A., Ryu, H., 2013. Personality and its effects on learning performance: design guidelines for an adaptive e-learning system based on a user model. Int. J. Ind. Ergon. 43 (5), 450–461.
Koenig, K.A., Frey, M.C., Detterman, D.K., 2008. ACT and general cognitive ability. Intelligence 36 (2), 153–160.
Kozlowski, S.W., Gully, S.M., Brown, K.G., Salas, E., Smith, E.M., Nason, E.R., 2001. Effects of training goals and goal orientation traits on multidimensional training outcomes and performance adaptability. Organ. Behav. Hum. Decis. Process. 85 (1), 1–31.
Kraiger, K., Ford, J.K., Salas, E., 1993. Application of cognitive, skill-based, and affective theories of learning outcomes to new methods of training evaluation. J. Appl. Psychol. 78 (2), 311.
Krueger, M.W., 1991. Artificial Reality II. Addison-Wesley, Reading, MA.
Lee, J., 1999. Effectiveness of computer-based instructional simulation: a meta analysis. Int. J. Instr. Media 26 (1), 71.
Martínez-Jiménez, P., Pontes-Pedrajas, A., Climent-Bellido, M.S., Polo, J., 2003. Learning in chemistry with virtual learning environments. J. Chem. Educ. 80, 346.
Noe, R.A., 1986. Trainees' attributes and attitudes: neglected influences on training effectiveness. Acad. Manag. Rev. 11 (4), 736–749.
Noftle, E.E., Robins, R.W., 2007. Personality predictors of academic outcomes: big five correlates of GPA and SAT scores. J. Pers. Soc. Psychol. 93 (1), 116.
O'Hara, J.M., 1990. The retention of skills acquired through simulator-based training. Ergonomics 33 (9), 1143–1153.
Paas, F., Sweller, J., 2014. Implications of cognitive load theory for multimedia learning. In: The Cambridge Handbook of Multimedia Learning, vol. 27, pp. 42.
Regian, J.W., Shebilske, W.L., Monk, J.M., 1992. Virtual reality: an instructional medium for visual spatial tasks. J. Commun. 42 (4), 136–149.
Resnick, M., 1998. Technologies for lifelong kindergarten. Educ. Technol. Res. Dev. 46 (4), 43–55.
Richards, D., Taylor, M., 2015. A comparison of learning gains when using a 2D simulation tool versus a 3D virtual world: an experiment to find the right representation involving the Marginal Value Theorem. Comput. Educ. 86, 157–171.
Sampaio, A.Z., Ferreira, M.M., Rosário, D.P., Martins, O.P., 2010. 3D and VR models in civil engineering education: construction, rehabilitation and maintenance. Autom. Constr. 19 (7), 819–828.
Scheckler, R.K., 2003. Virtual labs: a substitute for traditional labs? Int. J. Dev. Biol. 47 (2–3), 231–236.
Shute, V., Towle, B., 2003. Adaptive e-learning. Educ. Psychol. 38 (2), 105–114.
Sitzmann, T., 2011. A meta-analytic examination of the instructional effectiveness of computer-based simulation games. Person. Psychol. 64 (2), 489–528.
Snow, R.E., 1989. Aptitude-treatment interaction as a framework for research on individual differences in learning.
Stone, R.J., 2001. Haptic feedback: a brief history from telepresence to virtual reality. In: Haptic Human-Computer Interaction. Springer, Berlin Heidelberg, pp. 1–16.
Stuerzlinger, W., Wingrave, C.A., 2011. The value of constraints for 3D user interfaces. In: Virtual Realities. Springer, Vienna, pp. 203–223.

Triona, L.M., Klahr, D., 2003. Point and click or grab and heft: comparing the influence of physical and virtual instructional materials on elementary school students' ability to design experiments. Cognit. Instruct. 21 (2), 149–173.
Woodfield, B.F., Andrus, M.B., Andersen, T., Miller, J., Simmons, B., Stanger, R., ... Bodily, G., 2005. The virtual ChemLab project: a realistic and sophisticated simulation of organic synthesis and organic qualitative analysis. J. Chem. Educ. 82 (11), 1728.
Zacharia, Z.C., 2007. Comparing and combining real and virtual experimentation: an effort to enhance students' conceptual understanding of electric circuits. J. Comput. Assist. Learn. 23 (2), 120–132.
Zacharia, Z.C., Olympiou, G., 2011. Physical versus virtual manipulative experimentation in physics learning. Learn. Instruct. 21 (3), 317–331.
Zimmerman, B.J., 2000. Self-efficacy: an essential motive to learn. Contemp. Educ. Psychol. 25 (1), 82–91.
