Students' Perception
DOI 10.1007/s10984-008-9047-2
ORIGINAL PAPER
Abstract Across several decades, educational researchers have investigated the contribution of the learning environment to the attainment of educational goals, such as improving academic achievement and motivation to learn. The term learning environment includes not only physical elements of the classroom (e.g. experiment kits, computers), but also teaching methods, the type of learning in which pupils are engaged, and assessment methods. In this research, we refined an approach to measuring the impact of a variety of classroom features on many different learning outcomes through the lens of students' perception. A new instrument, the Design-Based Learning Environment Questionnaire (DBLEQ), was field-tested in an eighth-grade USA science classroom setting. This study examined pre-post changes for two different curricula, one emphasising design-based learning (n = 464) and another emphasising scripted inquiry (n = 248). The value of the instrument and ways of analysing its data are illustrated through the range of differences that were found between conditions over time.
Introduction
What is the science classroom like in a particular setting? This is a common question in
learning environment research. Previous research has investigated pupils' perceptions of their learning environment using questionnaires that assess various classroom characteristics.
Present Address:
Y. Doppelt
194 Hadror Street, Segev 20170, Israel
Learning Environ Res
The term learning environment relates to the psychology, sociology and pedagogy of the
contexts in which learning takes place and their influence on pupils’ achievement in the
cognitive and affective domains.
Researchers have previously created numerous questionnaires based upon the classic definition of the behaviour of the learner as a function of the environment and the learner's personality (Lewin 1936). Three of these questionnaires are summarised in Table 1. These questionnaires have been used in many studies across the world, and they include various items divided into the scales that are presented in Table 1.
These questionnaires have enabled researchers to study and build a broad perspective on the different factors that are involved in the learning process. However, these questionnaires do not differentiate between learning activities and the outcomes of the learning process, nor do they refer directly to the influence of a particular learning environment characteristic on a particular learning outcome.
Various characteristics of the learning environment have been found to influence
learning outcomes (Fraser 1998; Fraser et al. 1995; Henderson et al. 2000), including class
arrangements, computers, laboratory experiment kits, teaching methods, learning styles
and assessment methods (Doppelt and Barak 2002).
Learning environment research has examined academic achievement and other learning
outcomes in the cognitive and affective domains (Doppelt 2004; Doppelt and Barak 2002)
to build a broad understanding of the variables involved. According to Fraser (1998, p. 528): ''Defining the classroom or school environment in terms of the shared perception of the pupils and teachers has the dual advantage of characterising the setting through the eyes of the participants themselves and of capturing data, which an external observer could miss or consider unimportant.''
(Note to Table 1: Moos (1974) proposed the three types of dimensions: Relationship, Personal development, and System maintenance and change.)
Finally, several researchers have recommended combining quantitative and qualitative
tools for learning environment research in order to achieve wide and deep perspectives on
the research field (Fraser and Tobin 1991; Wong and Fraser 1996).
Design-based learning
Method
This research identified the most important classroom features and learning aspects in DBL
through an analysis of the combined responses of pupils using the DBLEQ, which was
specifically developed for this purpose.
In this research, an experimental group of 587 eighth-grade pupils (14–15 years old) from nine different schools studied electronics using DBL. A contrast group of 466 pupils from six different schools in the same school district learned electronics using scripted inquiry. The participants completed the DBLEQ before they began and after they finished studying electronics. Four hundred and sixty-four participants from the experimental group and 248 from the contrast group completed both pretest and posttest questionnaires.
The current study was conducted at middle schools in a mid-size, urban public school
district in the USA. In collaboration with district leaders and teachers, university
researchers developed the Electrical Alarm System module in the winter of 2003–2004
(Doppelt et al. 2005). The reform curriculum was designed to supplement and partially
replace the first 4–6 weeks of instruction in the established curriculum with an open-ended
design project. In 2004–2005, the district officially adopted the reform curriculum and
encouraged all teachers to implement it in their classrooms.
The structure and content of the DBLEQ were based on similar questionnaires used in
studies of design-based learning (Doppelt 2004, 2006; Doppelt and Barak 2002). The first
study (Doppelt and Barak 2002) dealt with the learning environment of design-based
learning in technology education in high schools in Israel. The second and the third studies
dealt with the learning environment of a science curriculum in middle schools in Israel.
In the current study, a combined questionnaire was developed to explore classroom
features and learning aspects of a learning environment that integrated design and science.
From our observations and other data analysed elsewhere (Doppelt et al. 2006), we
found that both inquiry and design-based settings have similar kinds of classroom features
and learning outcomes. As these environment features can have different effects on student
learning in the two settings, it is possible to explore pupils’ perspectives on the most
influential classroom features and learning aspects using a common instrument.
In this study, the term ‘classroom features’ refers to the learning environment charac-
teristics and the term ‘learning aspects’ refers to the learning outcomes. The DBLEQ is
based on the question that is described in Fig. 1, which serves as a mapping sentence that
provides a flexible structure for researchers and teachers to construct and use in classes
(Waks 1995). In this structure of 14 learning environment characteristics and 14 learning
aspects, the questionnaire could have 196 items. In order to reduce the number of items that
a given student is required to complete, we created four versions. Each version includes 49
items (7 learning environment characteristics with 7 learning aspects). Each student was
randomly assigned to receive one of the four versions. Each questionnaire asks students to
score 49 items on a scale ranging from 5 (Very High Importance) to 1 (Very Low
Importance).
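The four-version split can be sketched as follows. This is a minimal illustration under the assumption that the 14 features and 14 aspects are simply halved and crossed; the article does not specify the exact item-to-version assignment, and all names below are placeholders:

```python
import random

# Placeholder names; the real DBLEQ uses 14 named classroom features
# and 14 named learning aspects.
features = [f"feature_{i}" for i in range(1, 15)]
aspects = [f"aspect_{j}" for j in range(1, 15)]

def make_versions(features, aspects):
    """Cross each half of the features with each half of the aspects,
    yielding four blocks of 7 x 7 = 49 (feature, aspect) items."""
    halves_f = [features[:7], features[7:]]
    halves_a = [aspects[:7], aspects[7:]]
    return [[(f, a) for f in fs for a in al]
            for fs in halves_f for al in halves_a]

versions = make_versions(features, aspects)
# Four versions of 49 items each.
assert len(versions) == 4 and all(len(v) == 49 for v in versions)

# Each student is randomly assigned one version.
student_version = random.choice(versions)
```

Crossing halves in this way guarantees that the four versions jointly cover all 196 feature-aspect combinations exactly once.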
All pupils in this study responded to the DBLEQ as a pretest before teachers started to teach electronics. Pupils responded to it as a posttest in the first week of June
at the end of the school year. Teachers were asked to devote one period (45 min) to the
completion of the questionnaire by their students.
Two types of questionnaire reliability were examined: stability and Cronbach's α. Pupils were also asked to justify one high score and one low score, to serve as a criterion for examining the validity of the questionnaire.
Figure 2 presents the final questionnaire.
Findings
First, we present an analysis of the most influential characteristics and major outcomes. Second, we apply factor analysis to reduce the number of characteristics and outcomes to be considered, and then compare pretest and posttest results on the questionnaire.
Every characteristic can influence several learning outcomes in the affective and cognitive
domains. Here we compare perceptions of pupils in the two research groups in terms of the
most influential characteristics and major outcomes of the learning environment.
We begin with the basic question of whether ratings were higher overall in one group than in the other, which would indicate either a halo effect on ratings or generally more positive ratings in one group. Because the overall ratings of the experimental
group were not significantly different from those of the contrast group (using a t-test), we need not worry about overall differences in the use of the rating scale across groups as we examine more specific differences.

Fig. 2 The final questionnaire—rating the importance of each classroom feature on each learning aspect.
Classroom features (rows): Independent Learning Activities; Class Discussions; Thinking Activities; Constructing Activities; Using the Computer; Different Learning Experiences; Homework; Team Projects; Designing Activities; Performing Experiments; Acceptance of Mistakes; Tolerance of Different Opinions; Freedom to Choose the Learned Topics; Instructional Worksheets.
Learning aspects (columns): Good relationship with the teacher; Desire to learn; Self-confidence; Being challenged; Thinking skills; Success in the science class; Desire to come to class; Motivation to work at home; Make me feel responsible for my learning; Make me feel independent; Make me curious; Interest in science topics; Understanding the ideas in class; Teamwork skills.
Figures 3 and 4 present the mean scores from the two groups for the full matrix of
influence of each characteristic on each outcome. The highest means in each group are
bolded. Although there are general similarities, there are also some differences.
We next examine trends in differences between groups for each particular learning
environment characteristic (across outcomes) and for each particular learning outcome
(across learning environment characteristics).
Focusing on the learning environment characteristics, the highest mean scores for both the experimental group (Me) and the contrast group (Mc) were for Homework (Me = 2.79, Mc = 3.43) and Instructional Worksheets (Me = 2.64, Mc = 3.38). Additionally, the experimental group gave high ratings to Using the Computer (Me = 2.62).
Turning to outcomes, the highest mean scores were for Motivation to Work at Home (Me = 2.68, Mc = 3.49), Interest in Science Topics (Me = 2.57, Mc = 3.45) and Making me Curious (Me = 2.57, Mc = 3.39). Additionally, the experimental group highlighted Good Relationships with the Teacher (Me = 2.50) more than the contrast group did, and the contrast group highlighted Making me Feel Independent (Mc = 3.37) more than the experimental group did.
Figures 5 and 6 show that the experimental group rated many of the characteristics
lower, despite its curriculum being more engaging and demanding overall (from analyses of
concept learning and classroom observations of engagement). This trend in the DBLEQ is
perhaps explained by the fact that the developed module engaged the experimental group in
activities that encourage critical thinking, which also could have influenced their critical
evaluations. From our classroom observations, we suspect that the high scores for Home-
work and Instructional Worksheets could be explained by the fact that, for both teaching
methods, the class and homework assignments were short and focused. In practice, the computer was rarely used either in the newly-developed DBL curriculum or in the traditional scripted inquiry used by the contrast group, and therefore we cannot explain these data.
The large number of potential variables was reduced using factor analysis to enable pre-
post comparison without obtaining too many differences just by chance, as well as to focus
attention on common trends. The 14 original characteristics can be reduced to eight and the
14 original learning outcomes can be reduced to six. Table 2 provides a comparison of
scale scores obtained by the experimental group on the pretest and the posttest.
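The article uses factor analysis for this reduction. As a loose stand-in that illustrates only the idea of collapsing correlated items into fewer scales (this is not the method actually used, and the items and correlation values below are hypothetical), one can greedily group items by pairwise correlation:

```python
def corr_groups(items, corr, threshold=0.6):
    """Greedily group items whose correlation with a group's seed item
    exceeds the threshold; each resulting group becomes one scale."""
    groups, used = [], set()
    for seed in items:
        if seed in used:
            continue
        group = [seed] + [i for i in items
                          if i not in used and i != seed
                          and corr[(seed, i)] >= threshold]
        used.update(group)
        groups.append(group)
    return groups

# Toy symmetric correlation table for three hypothetical items.
corr = {("a", "b"): 0.8, ("b", "a"): 0.8,
        ("a", "c"): 0.1, ("c", "a"): 0.1,
        ("b", "c"): 0.2, ("c", "b"): 0.2}
groups = corr_groups(["a", "b", "c"], corr)  # [["a", "b"], ["c"]]
```

Proper factor analysis additionally estimates loadings and allows an item to contribute to more than one factor, which this greedy sketch does not.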
The results show that Homework (M = 2.87) and Instructional Worksheets (M = 2.71) were the most influential characteristic scales. From the outcomes viewpoint, Science Interest (M = 2.66), Teamwork (M = 2.61) and Independence (M = 2.60) were found to be the most impacted outcome scales.
Students were requested to score their interest in taking science classes in high school. The
scale ranged from 5 (Very Interested) to 1 (Very Uninterested). Figure 7 presents a
comparison of the experimental group with the contrast group.
Fig. 3 Mean scores for the experimental group. Note: The highest means in each group are bolded in the original figure. Columns follow the learning aspects in this order: Good relationship with the teacher; Desire to learn; Self-confidence; Being challenged; Thinking skills; Success in the science class; Desire to come to class; Motivation to work at home; Make me feel responsible for my learning; Make me feel independent; Make me curious; Interest in science topics; Understanding the ideas in class; Teamwork skills; the final value is the row mean.

Independent Learning Activities: 2.36 2.09 1.88 2.17 1.86 2.15 2.17 2.64 1.99 1.88 2.43 2.70 2.40 3.29 | 2.29
Class Discussions: 2.16 2.27 2.10 2.58 2.26 2.24 2.37 2.89 2.67 2.95 2.70 2.59 2.02 2.03 | 2.42
Thinking Activities: 2.61 2.31 2.20 2.08 1.85 2.19 2.26 2.73 2.29 2.42 2.25 2.55 2.44 2.59 | 2.34
Constructing Activities: 2.34 2.07 2.32 2.05 2.24 2.17 2.13 2.37 2.36 2.71 2.46 2.68 2.43 2.22 | 2.33
Using the Computer: 2.97 2.38 2.57 2.59 2.57 2.59 2.35 2.46 2.74 2.35 2.78 2.84 2.60 2.83 | 2.62
Different Learning Experiences: 2.34 2.25 2.07 2.17 1.89 2.03 2.22 2.60 2.37 2.60 2.29 2.64 2.20 2.44 | 2.29
Homework: 3.30 2.90 2.52 2.50 2.33 2.63 3.23 2.64 2.35 2.33 2.94 3.20 2.73 3.49 | 2.79
Team Projects: 2.58 2.13 2.70 2.35 2.02 2.06 1.83 2.98 2.43 3.14 2.46 2.18 2.18 1.64 | 2.33
Designing Activities: 2.36 2.14 2.21 1.96 2.25 2.01 2.27 2.46 2.07 2.35 2.34 2.25 2.05 2.09 | 2.20
Performing Experiments: 2.11 2.04 2.05 2.14 2.00 2.19 2.08 2.51 2.25 2.36 2.22 2.07 1.99 2.03 | 2.14
Acceptance of Mistakes: 2.42 2.21 2.26 2.38 2.52 2.25 2.38 2.84 2.17 2.44 2.86 2.70 2.70 2.45 | 2.47
Tolerance of Different Opinions: 2.31 2.35 2.50 2.33 2.45 2.15 2.52 3.04 2.52 2.75 2.66 2.61 2.66 2.12 | 2.50
Freedom to Choose the Learned Topics: 2.44 2.30 2.46 2.55 2.31 2.18 2.25 2.49 2.22 2.10 2.64 2.24 2.46 2.43 | 2.36
Instructional Worksheets: 2.75 2.61 2.80 2.74 2.37 2.32 2.93 2.81 2.42 2.59 2.99 2.70 2.34 2.63 | 2.64
Column mean: 2.50 2.29 2.33 2.33 2.21 2.23 2.36 2.68 2.35 2.50 2.57 2.57 2.37 2.45
Fig. 4 Mean scores for the contrast group. Note: The highest means in each group are bolded in the original figure. Columns follow the same learning-aspect order as Fig. 3; the final value is the row mean.

Independent Learning Activities: 3.24 2.48 2.56 2.72 2.65 3.01 2.95 3.50 2.55 2.67 3.32 3.50 2.99 3.70 | 2.99
Class Discussions: 2.64 2.59 3.12 3.23 2.72 2.95 3.08 4.03 3.49 3.79 3.21 3.35 2.97 2.93 | 3.15
Thinking Activities: 3.30 2.55 3.04 2.85 2.56 2.86 3.10 3.44 3.29 3.27 3.18 3.57 2.94 3.11 | 3.08
Constructing Activities: 2.91 2.91 3.00 3.04 2.92 2.62 3.13 3.26 3.10 3.49 3.27 3.32 3.04 2.78 | 3.06
Using the Computer: 3.68 2.99 3.29 3.19 3.09 3.04 3.12 2.81 2.96 2.99 3.35 3.36 2.98 3.06 | 3.14
Different Learning Experiences: 2.88 2.60 2.93 2.91 2.86 2.49 2.82 3.42 3.11 3.36 3.32 3.38 2.89 2.97 | 3.00
Homework: 3.72 3.14 3.24 3.32 3.19 3.41 3.73 3.27 3.22 3.28 3.55 3.97 3.43 3.59 | 3.43
Team Projects: 3.14 2.62 3.18 2.84 3.31 2.77 3.10 3.73 3.37 4.30 3.49 3.46 3.09 2.90 | 3.23
Designing Activities: 3.14 3.30 3.11 3.07 3.13 3.07 3.26 3.44 3.08 3.06 3.26 3.09 3.10 3.19 | 3.16
Performing Experiments: 3.08 3.19 2.96 3.21 3.02 3.12 3.24 3.53 3.06 3.03 3.37 3.01 3.18 3.29 | 3.16
Acceptance of Mistakes: 3.19 3.18 3.36 3.28 3.22 3.24 3.16 3.63 3.23 3.32 3.30 3.58 3.27 3.62 | 3.33
Tolerance of Different Opinions: 3.19 3.06 3.13 3.08 3.18 3.05 3.15 3.60 3.74 3.93 3.54 3.49 3.33 3.36 | 3.34
Freedom to Choose the Learned Topics: 3.09 3.05 3.19 3.35 3.15 3.13 3.04 3.57 2.89 3.09 3.52 3.47 2.94 3.37 | 3.21
Instructional Worksheets: 3.09 3.07 3.23 3.25 3.27 3.26 3.18 3.64 3.24 3.65 3.83 3.77 3.37 3.42 | 3.38
Column mean: 3.16 2.91 3.10 3.10 3.02 3.00 3.15 3.49 3.17 3.37 3.39 3.45 3.11 3.24
Fig. 5 Mean ratings (with SE bars) for each learning environment characteristic for experimental and
contrast groups
Fig. 6 Mean ratings (with SE bars) for each learning outcome for experimental and contrast groups
When a t-test was conducted for differences between the two groups in interest, no
significant difference was found. There was a significant improvement in interest between
pretest and posttest for the experimental group (Fig. 7). However, there was no significant
difference between pretest and posttest interest for the contrast group. In addition, there
was a significant difference between the experimental and contrast groups in terms of
posttest interest scores (Fig. 7). Of the students in the experimental group, 19% showed an
increase in interest, compared with only 9% in the contrast group.
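Pre-post comparisons of this kind can be sketched with a paired t statistic and a simple count of pupils whose rating rose. This is a generic illustration on synthetic ratings, not the study's data or its exact analysis:

```python
import math

def paired_t(pre, post):
    """t statistic for paired pre/post ratings: mean difference divided
    by the standard error of the differences."""
    d = [b - a for a, b in zip(pre, post)]
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((x - mean_d) ** 2 for x in d) / (n - 1)  # sample variance
    return mean_d / math.sqrt(var_d / n)

def share_increased(pre, post):
    """Fraction of pupils whose posttest rating exceeds their pretest rating."""
    return sum(b > a for a, b in zip(pre, post)) / len(pre)

# Synthetic illustration: four pupils rated interest (1-5) before and after.
pre = [3, 3, 3, 3]
post = [4, 3, 3, 3]
```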
Two types of reliability were examined for the questionnaire: stability reliability and Cronbach's α coefficient. The stability reliability was determined by analysing the correlation between pretest and posttest ratings. This Pearson correlation was r = 0.4 (p < 0.01). Cronbach's α coefficient for the questionnaire was found to be 0.78. These levels of stability and internal consistency are quite high for a questionnaire administered in an urban middle-school setting.

Table 2 Comparison of pretest and posttest means for learning environment characteristics for the experimental group. Each cell gives pretest/posttest means for the outcome scales, in this order: Cognitive; Affective; Desire to come to class; Independence; Science interest; Teamwork; row mean (M).

Science-design activities: 2.02/2.20; 2.27/2.34; 2.09/2.25; 2.50/2.63; 2.33/2.43; 1.89/2.25; 2.18/2.35
Tolerance: 2.43/2.46; 2.41/2.43; 2.50/2.47; 2.73/2.79; 2.70/2.73; 2.46/2.60; 2.54/2.58
Freedom to choose: 2.29/2.35; 2.35/2.44; 2.38/2.33; 2.26/2.42; 2.48/2.49; 2.49/2.66; 2.37/2.45
Instructional worksheets: 2.45/2.53; 2.64/2.61; 2.83/2.66; 2.64/2.74; 2.84/2.95; 2.71/2.74; 2.69/2.71
Cognitive activities: 2.13/2.17; 2.41/2.34; 2.28/2.37; 2.57/2.64; 2.53/2.53; 2.58/2.57; 2.42/2.43
Construction activities: 2.14/2.23; 2.43/2.54; 2.19/2.35; 2.39/2.59; 2.42/2.53; 2.29/2.22; 2.31/2.41
Using the computer: 2.55/2.63; 2.59/2.85; 2.32/2.57; 2.53/2.42; 2.49/2.66; 2.91/2.77; 2.56/2.65
Homework: 2.50/2.62; 2.82/2.82; 2.96/3.15; 2.56/2.58; 2.81/2.94; 3.03/3.10; 2.78/2.87
Column mean: 2.31/2.40; 2.49/2.55; 2.44/2.52; 2.52/2.60; 2.57/2.66; 2.55/2.61

Fig. 7 Mean interest in science (with standard error bars) for pretest and posttest for both groups, rated as interest in majoring in science on a 0–5 axis. Contrast group (N = 292): pretest 2.98, posttest 3.08. Experimental group (N = 368): pretest 3.23, posttest 3.49.
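Both indices can be computed directly from raw ratings. A minimal standard-library sketch (the formulas are the standard ones; the function names and toy inputs are ours, not the study's data):

```python
def _var(xs):
    """Sample variance."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(scores):
    """scores: one row per respondent, one column per item.
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = len(scores[0])
    item_vars = [_var([row[i] for row in scores]) for i in range(k)]
    total_var = _var([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

def pearson_r(xs, ys):
    """Stability index: correlation between pretest and posttest ratings."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / ((sum((x - mx) ** 2 for x in xs)
                   * sum((y - my) ** 2 for y in ys)) ** 0.5)
```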
The questionnaire was simple to answer, as students considered only one repeated
question: ‘‘How important is each classroom feature for each learning aspect?’’ Yet
sometimes students did not reflect carefully while doing Likert ratings. Therefore, students
were asked to pick one high score and one low score and to give a reason for why they rated the importance of a specific classroom feature to a specific learning aspect as they did. Using students' qualitative justifications, we can discover whether they understood the content of the questionnaire. Table 3 presents some of the common reasons that students used to justify their scoring.

Table 3 Examples of common pupils' justifications (rows: learning environment characteristics; columns: learning outcomes)

Class discussions
- Good relationship with the teacher: ''When dealing with class discussions, you have to have a good relationship with the teacher. This way, you can agree or disagree in a good way.''
- Desire to learn: ''The desire to learn is very important for classroom discussions. If you don't want to learn, then nobody participates and the discussion cannot happen.''
- Self-confidence: ''You have to have the self-confidence to ask the question to know the answer.''
- Motivation to work at home: ''I don't like science very much. I don't like to come to school.''

Constructing activities
- Good relationship with the teacher: ''Constructing activities in the class always goes more smoothly with good relationships with your teacher. Mostly they can help you and guide you with things you need help with.''
- Desire to learn: ''This is the most important because, when you construct something, you have to think.''
- Self-confidence: ''Constructing activities is a very good way to boost self-confidence. I believe this because you can look at what you have done and be proud.''
- Motivation to work at home: ''It is important because constructing activities helps children understand more things about science.''

Team projects
- Good relationship with the teacher: ''When we have to do projects, the teachers are there to help.''
- Desire to learn: ''You need to have a desire to learn in order to do a team project. If you don't, you won't be motivated to do well.''
- Self-confidence: ''Working in groups without so many guides is very challenging.''
- Motivation to work at home: ''Taking responsibility to finish work at home prepares me for high school.''

Acceptance of mistakes
- Good relationship with the teacher: ''The relationship with my teacher and acceptance of mistakes are very important to me.''
- Desire to learn: ''Acceptance of mistakes is good for a desire to learn because it makes you think and mistakes can be fixed.''
- Self-confidence: ''Acceptance of mistakes is good for yourself and a healthy life.''
- Motivation to work at home: ''If I make a mistake, I can change it instead of being laughed at.''

Freedom to choose the learned topics
- Good relationship with the teacher: ''If we got to learn what we want, we would like class way more. We would be learning what we want to learn and not this boring stuff.''
- Desire to learn: ''The difference between desire to learn and freedom to choose could make a successful class.''
- Self-confidence: ''It is important because you need self-confidence because you need to know that confidence is in learning. Freedom to choose the learned topics because we have to learn.''
- Motivation to work at home: ''The class should be able to pick the classroom topic.''
The reasons that pupils used to justify their scores varied in appropriateness. Three assessors rated the quality of each pupil's response regarding a specific classroom feature and a specific learning aspect on a scale from 5 to 1. If the response fitted both the classroom feature and the learning aspect, the assessor rated it 5 or 4; if the response fitted only one of them, the assessor rated it 3 or 2; and if it did not fit at all, the assessor rated it 1.
We used Cohen’s Kappa to measure the agreement between the evaluations of each of
the peers of three individual evaluations. Each assessor rated the same objects. A value of 1
indicated perfect agreement. A value of 0 indicated that agreement is no better than chance.
The agreement (Cohen’s Kappa) was 0.90 between assessor 1 and assessor 2, 0.81 between
assessor 1 and assessor 3, and 0.79 between assessor 2 and assessor 3. These Kappa values
are quite high, showing that our ratings were highly reliable.
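Two-rater agreement of this kind can be computed with the standard Cohen's kappa formula. The sketch below is generic, and the example ratings in the test are hypothetical:

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters over the same items:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(r1)
    p_obs = sum(a == b for a, b in zip(r1, r2)) / n
    c1, c2 = Counter(r1), Counter(r2)
    # Chance agreement from each rater's marginal category frequencies.
    p_chance = sum((c1[cat] / n) * (c2[cat] / n) for cat in set(c1) | set(c2))
    return (p_obs - p_chance) / (1 - p_chance)
```

With three assessors, kappa is computed once per pair, as reported above; a single multi-rater index such as Fleiss' kappa would be an alternative.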
The average rating was 3.54 (SD = 1.33), suggesting that pupils generally understood the ratings task.
This study had two main goals. The first goal was to identify the characteristics of design-based learning environments. The second goal was to investigate pupils' perceptions of the impact of learning environment characteristics on learning outcomes.
The DBLEQ developed in this study could be used by educators and researchers to explore
different groups and different learning environments. We used the DBLEQ to investigate
the impact of learning environment characteristics on learning outcomes from the pupils’
perspectives. The questionnaire results showed that pupils had differentiated views of the
impact of learning environment characteristics upon different learning outcomes.
According to pupils’ perceptions, the most influential characteristics of design-based
learning environments were Homework and Instructional Worksheets. We cannot explain
why Homework was perceived to be an important feature, because the DBL curriculum did
not specifically instruct teachers to give pupils homework assignments. However, the DBL
curriculum did have instructional sheets especially for documenting the design. This might
explain why pupils rated this feature as very influential. The most impacted learning outcomes were Motivation to Work at Home, Interest in Science Topics and Making me Curious. These
outcomes might signal that both curricula motivated pupils’ learning. Although both
groups scored the influence of the characteristics on outcomes similarly, the experimental
group reported a significantly higher interest in choosing science as a major in high school
after they finished the design-based learning module.
The high quality of pupils’ justifications for their ratings suggest that the questionnaire
can serve as a valid instrument for investigating pupils’ perceptions regarding the impor-
tance of the classroom features to the learning aspects. The DBLEQ might be used to assist
educators and researchers in exploring different groups and different learning environments.
Our findings are in contrast to those of previous studies in different settings. In a previous
study that explored pupils’ perceptions of their learning environment in an Israeli com-
prehensive high school (Doppelt and Barak 2002), the overall learning environment
involved designing authentic team projects in LEGO-Logo. That study revealed that pupils
rated Team Projects and Constructing Activities as the most influential characteristics upon
the learning outcomes. The most impacted outcomes were: Understanding the Ideas in
Class, Interest in Science Topics and Good Relationships with the Teacher. The differing findings might be explained not only by cultural differences, but also by contextual differences (such as the length of the intervention) and by the fact that those students had elected to study science-technology, rather than being required to take a science class as in middle school.
In another research study, middle-school science pupils involved in a design-based
learning experience of longer duration than the current study rated Team Projects, Class
Discussions and Performing Experiments as the most influential characteristics upon their
learning environment. The most impacted learning outcomes according to pupils’ rating
were Understanding the Ideas in Class, Interest in the Science Topics, Independent
Learning Activities, and Desire to Learn (Doppelt 2004). Again, the differences might have resulted from the different lengths of the interventions, in that longer design-based learning experiences may produce very different styles of learning and perceptions of the learning. These two examples suggest that further research needs to be undertaken to
explore pupils’ perceptions of their learning environment.
This article focused on differences between pupils’ perceptions of the impact of learning
environment characteristics on learning outcomes across different learning environments
and across time. Teachers and researchers can gain insights into how to further improve
learning environments by exploring pupils’ perceptions of the learning environment.
Further research is needed into factors underlying pupil perceptions regarding the impact
of learning environment characteristics on learning outcomes.
References
Barak, M., & Doppelt, Y. (1999). Integrating the Cognitive Research Trust (CoRT) programme for creative
thinking into a project-based technology curriculum. Research in Science & Technological Education,
17(2), 139–151. doi:10.1080/0263514990170202.
Barak, M., & Doppelt, Y. (2000). Using portfolios to enhance creative thinking. Journal of Technology
Studies, 26(2), 16–24.
Barak, M., Eisenberg, E., & Harel, O. (1995). 'What's in the calculator?' An introductory project for technology studies. Research in Science & Technological Education, 12, 147–154. doi:10.1080/0263514950130204.
Barak, M., & Raz, E. (1998). Hot air balloons: Project centered study as a bridge between science and technology education. Science Education, 84, 27–42. doi:10.1002/(SICI)1098-237X(200001)84:1<27::AID-SCE3>3.0.CO;2-8.
Barlex, D. (1994). Organising project work. In F. Banks (Ed.), Teaching technology (pp. 124–143). London:
Routledge.
Barlex, D. (2002). The relationship between science and design & technology in the secondary school
curriculum in England. In I. Mottier & M. J. de Vries (Eds.), Technology education in the curriculum:
Relationships with other subjects. Proceedings PATT-12 Conference (pp. 3–12). Eindhoven, The
Netherlands: Eindhoven University of Technology. Available from: http://www.iteaconnect.org/
Conference/PATT/PATT12/PATT12.pdf.
De Vries, M. J. (1996). Technology education: Beyond the ‘‘Technology is applied Science’’ paradigm.
Journal of Technology Education, 8(1), 7–15.
Doppelt, Y. (2003). Implementing and assessing project-based learning in a flexible environment. Inter-
national Journal of Technology and Design Education, 13, 255–272. doi:10.1023/A:1026125427344.
Doppelt, Y. (2004). Impact of science-technology learning environment characteristics on learning out-
comes: Pupils’ perceptions and gender differences. Learning Environments Research, 7, 271–293. doi:
10.1007/s10984-004-3297-4.
Doppelt, Y. (2006). Science-technology learning environment: Teachers and pupils’ perceptions. Learning
Environments Research, 9, 163–178. doi:10.1007/s10984-006-9005-9.
Doppelt, Y. (in press). Assessing creative thinking in design-based learning. International Journal of Technology
and Design Education. Retrieved July 23, 2008, from http://www.springerlink.com/content/102912/
?Content?Status=Accepted&sort=p_OnlineDate&sortorder=desc&v=condensed&o=20 (OnlineFirst).
Doppelt, Y., & Barak, M. (2002). Pupils identify key aspects and outcomes of a technological learning
environment. Journal of Technology Studies, 28(1), 12–18.
Doppelt, Y., Mehalik, M. M., & Schunn, C. D. (2005, April). A close-knit collaboration between researchers and teachers for developing and implementing a design-based science module. Paper presented at the annual meeting of the National Association for Research in Science Teaching, Dallas, TX.
Doppelt, Y., Mehalik, M. M., Schunn, C. D., & Silk, E. (2004). Alarm system—Design, construction and reflection: Teacher guide, student module and embedded assessment. Pittsburgh, PA: Learning Research and Development Center (LRDC), University of Pittsburgh.
Doppelt, Y., Silk, E., Mehalik, M. M., Schunn, C. D., Reynolds, B., & Ward, E. (2006, April). Evaluating the impact of a facilitated learning community approach to professional development on student achievement. Paper presented at the annual meeting of the National Association for Research in Science Teaching, San Francisco, CA.
Ennis, R. H. (1989). Critical thinking and subject specificity: Clarification and needed research. Educational
Researcher, 18(3), 4–10.
Fraser, B. J. (1998). Science learning environments: Assessment, effects and determinants. In B. J. Fraser &
K. G. Tobin (Eds.), International handbook of science education (pp. 527–564). Dordrecht, The
Netherlands: Kluwer.
Fraser, B. J., Anderson, G. J., & Walberg, H. J. (1982). Assessment of learning environments: Manual for
Learning Environment Inventory (LEI) and My Class Inventory (MCI) (3rd vers.). Perth, Australia:
Western Australian Institute of Technology.
Fraser, B. J., Giddings, J. G., & McRobbie, J. C. (1995). Evolution and validation of a personal form of an instrument for assessing science laboratory classroom environments. Journal of Research in Science Teaching, 32, 399–422. doi:10.1002/tea.3660320408.
Fraser, B. J., & Tobin, K. (1991). Combining qualitative and quantitative methods in classroom environment
research. In B. Fraser & H. Walberg (Eds.), Educational environments: Evaluation, antecedents and
consequences (pp. 271–292). Oxford, UK: Pergamon.
Glaser, R. (1993). Education and thinking: The role of knowledge. In R. McCormick, P. Murphy & M.
Harrison (Eds.), Teaching and learning technology (pp. 91–111). Wokingham, UK: Addison-Wesley.
Henderson, D., Fisher, D., & Fraser, B. (2000). Interpersonal behavior, laboratory learning environments and student outcomes in senior biology classes. Journal of Research in Science Teaching, 37, 26–43. doi:10.1002/(SICI)1098-2736(200001)37:1<26::AID-TEA3>3.0.CO;2-I.
Lewin, K. (1936). Principles of topological psychology. New York: McGraw-Hill.
Mehalik, M. M., Doppelt, Y., & Schunn, C. D. (2008). Middle-school science through design-based learning versus scripted inquiry: Better overall science concept learning and equity gap reduction. Journal of Engineering Education, 97(1), 71–85.
Moos, R. H. (1974). The social climate scales: An overview. Palo Alto, CA: Consulting Psychologist Press.
Nicaise, M., Gibney, T., & Crane, M. (2000). Toward an understanding of authentic learning: Student
perceptions of an authentic classroom. Journal of Science Education and Technology, 9(1), 79–94. doi:
10.1023/A:1009477008671.
Resnick, M., & Ocko, S. (1991). LEGO/Logo: Learning through and about design. In I. Harel & S. Papert (Eds.), Constructionism (pp. 141–150). Norwood, NJ: Ablex.
Taylor, P. C., Fraser, B. J., & Fisher, D. L. (1997). Monitoring constructivist classroom learning environments.
International Journal of Educational Research, 27, 293–302. doi:10.1016/S0883-0355(97)90011-2.
Waks, S. (1995). Curriculum design: From an art towards a science. Hamburg, Germany: Tempus
Publications.
Wong, F. L. A., & Fraser, B. J. (1996). Environment-attitude associations in the chemistry laboratory classroom. Research in Science & Technological Education, 14, 91–102. doi:10.1080/0263514960140107.
Zohar, A., & Tamir, P. (1993). Incorporating critical thinking within a regular high school biology cur-
riculum. School Science and Mathematics, 93, 136–140.