Rubric for Observing Classes - STEM
What is DoS?
The Dimensions of Success observation tool, or DoS, pinpoints twelve indicators of
STEM program quality in out-of-school time. It was developed and studied with funding
from the National Science Foundation (NSF) by The PEAR Institute: Partnerships in
Education and Resiliency (PEAR), along with partners at Educational Testing Service
(ETS) and Project Liftoff. The DoS tool focuses on understanding the quality of a STEM
activity in an out-of-school time learning environment and includes an explanation of
each dimension and its key indicators, as well as a 4-level rubric with descriptions of
increasing quality (see p.4 for sample rubric).
What are the dimensions?
DoS measures twelve dimensions that fall into four broad domains:

• Features of the Learning Environment: Organization, Materials, Space Utilization
• Activity Engagement: Participation, Purposeful Activities, Engagement with STEM
• STEM Knowledge and Practices: STEM Content Learning, Inquiry, Reflection
• Youth Development in STEM: Relationships, Relevance, Youth Voice
The first three dimensions look at features of the learning environment that make it
suitable for STEM programming (e.g., do kids have room to explore and move freely,
are the materials exciting and appropriate for the topic, is time used wisely and is
everything prepared ahead of time?).
The second three dimensions look at how the activity engages students: for example,
they measure whether or not all students are getting opportunities to participate,
whether they are doing activities that are engaging them with STEM concepts or
something unrelated, and whether or not the activities are hands-on, and designed to
support students to think for themselves versus being given the answer.
The next domain looks at how the informal STEM activities are helping students
understand STEM concepts, make connections, and participate in the inquiry practices
that STEM professionals use (e.g., collecting data, using scientific models, building
explanations, etc.).
Finally, the last domain assesses the student-facilitator and student-student interactions
and how they encourage or discourage participation in STEM activities; whether the
activities make STEM relevant and meaningful to students' everyday lives and
experiences; and whether students have a voice in shaping their learning experience.
Together, these twelve dimensions capture the key components of a STEM activity in an
informal afterschool or summer program.
Planning to use DoS
Step 1: What are your goals for assessment/evaluation?
• Do you want to help individual afterschool science program sites pinpoint their
strengths and weaknesses?
• Do you want data about entire programs (e.g., Boys and Girls Clubs or YMCAs)?
• Do you want external evaluators to use DoS to report quality across the state?
How do you get certified to use DoS?
To use DoS, a potential observer must complete a certification process. First, he/she
must attend a 2-day training (in-person or online) to learn how to define and observe
quality in each dimension. Next, potential observers must complete a set of video
simulation exercises to practice their understanding of the tool. PEAR will then review
their ratings and evidence from these exercises, and will provide customized feedback at
a one-hour calibration session (phone conference). At this session, PEAR trainers will
help to address any questions and to provide additional examples that might be needed
to clarify use of the tool. Finally, potential observers will then arrange to practice using
DoS in the field at afterschool sites in their local area. This step allows them to use the
tool in the field and to incorporate the feedback they received on the video simulations.
Upon successful completion of all these requirements, observers will be DoS certified
for 2 years and can use the tool as often as they would like during that period. After 2
years, there are opportunities for re-certification if needed.
For pricing and registration for an upcoming training, please contact Rebecca Browne at
[email protected]
The certification pathway:

2-day live training (online or in-person) → Video exercises to establish rater reliability →
One-hour calibration call → 2 practice field observations → Certification & ongoing
technical assistance
What if we need help?
Technical Assistance will be provided by the PEAR team during the training and
afterwards as you start using the tool. You will also receive updates about possible
professional development opportunities or resources you can use to improve particular
dimensions where you are identifying weaknesses.
Overall, DoS can empower afterschool and summer STEM program staff to embrace
their role in inspiring the next generation to do STEM, be interested in STEM, and
understand important STEM ideas that they can take with them throughout their lives.
The tool helps to provide the common language that program/state administrators, staff,
evaluators, etc. can use to describe their activities and where they excel and where they
can improve.
Overview of DoS Dimensions
FEATURES OF THE LEARNING ENVIRONMENT

Organization
• Are the activities delivered in an organized manner?
• Are materials available and do transitions flow?

Materials
• Are the materials appropriate for the students, aligned with the STEM learning goals,
and appealing to the students?

Space Utilization
• Is the space utilized in a way that is conducive to OST learning?
• Are there any distractions that impact the learning experience?

ACTIVITY ENGAGEMENT

Participation
• Are students participating in all aspects of activities equally?
• Are boys participating more than girls? Are some students dominating group work?

Purposeful Activities
• Are the activities related to the STEM learning goals?

Engagement with STEM
• Are students doing the cognitive work while engaging in hands-on activities that help
them explore STEM content?

YOUTH DEVELOPMENT IN STEM

Relationships
• Are there positive student-facilitator and student-student interactions?

Relevance
• Is there evidence that the facilitator and students are making connections between the
STEM content and activities and students' everyday lives and experiences?

Youth Voice
• Are students encouraged to voice their ideas/opinions?
• Do students make important and meaningful choices that shape their learning
experience?
Sample Rubrics
Inquiry Rubric
1 — EVIDENCE ABSENT: There is minimal evidence that students are engaging in or
taught about STEM practices during activities.

2 — INCONSISTENT EVIDENCE: Students are taught about STEM practices during
activities but are not engaging in STEM practices themselves.

3 — REASONABLE EVIDENCE: Students are engaging in STEM practices during the
activities, but the engagement is superficial.

4 — COMPELLING EVIDENCE: There is consistent evidence that students are engaging
in STEM practices during the activities.
Level 1: Students do not have any opportunities to engage in STEM practices.

Level 2: Students observe STEM practices (by the facilitator, a guest presenter, or a
peer), but do not have opportunities to engage in them on their own. For example, they
may watch the activity leader or a student do an experiment or demonstration, or watch
the teacher make and explain a scientific model.

Level 3: Students use some STEM practices; however, they are used superficially and
do not help students deeply engage in the thinking and reasoning of STEM
professionals. For example, they may do an investigation, but by following a cookbook-
style, step-by-step set of instructions. Participation in STEM practices is scripted or
inauthentic.

Level 4: Students have opportunities to use STEM practices by pursuing scientific
questions, tackling engineering design issues, or creating mathematical arguments.
They are supported to use the practices in authentic ways, where they are trying to
actually solve a problem or gather data to answer a question.
Engagement with STEM Rubric
Level 1: The activities mostly leave students in a passive role, where they are observing
a demonstration or listening to the facilitator talk (minimal hands-on opportunities).

Level 2: Students engage in hands-on activities; however, there is limited evidence that
the hands-on activities encourage students to engage with STEM content in meaningful
ways ("hands-on, minds-off").

Level 3: There are some opportunities for students to engage in hands-on activities that
allow them to actively explore STEM content. Some parts of the activities still leave
students as passive observers while the facilitator does all the cognitive work.

Level 4: There are consistent opportunities for students to actively explore STEM
content by engaging in hands-on activities, where students do the cognitive work
themselves and the facilitator maintains the role of facilitator versus teller.
• Feedback to program: Have a few big questions to guide students' observations of the
different animal tanks. For example, "What do you observe on these animals that might
help them survive under water?" or "How are the legs on this animal different from those
on this other animal, and how are they similar to or different from yours, and why?" This
way the students are observers with the purpose of gathering information to answer
these questions.
Sample Reports
(reports will be customized to the needs of each region/state/network)
Year Average (includes data across all four quarters)

Dimension                Q1    Q2    Q3    Q4    Year Average
Engagement with STEM     2.5   3     3     3.5   3
STEM Content Learning    2     2.5   2     2.8   2.3
* Includes data for Boys and Girls Club of Cityville, Cityville Community Center, Science Center of
Cityville, and Cityville Afterschool STEM Project
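The year-average column in the sample report above is simply the mean of the four quarterly ratings, rounded to one decimal place. A minimal sketch in Python of that arithmetic (the `year_average` helper and the quarter ordering are illustrative, not part of the DoS tool itself):

```python
# Sketch of the year-average calculation shown in the sample report.
# The dimension names and quarterly values come from the table above;
# the function name is hypothetical, not a DoS API.

def year_average(quarterly_ratings):
    """Mean of the quarterly DoS ratings, rounded to one decimal."""
    return round(sum(quarterly_ratings) / len(quarterly_ratings), 1)

ratings = {
    "Engagement with STEM": [2.5, 3.0, 3.0, 3.5],
    "STEM Content Learning": [2.0, 2.5, 2.0, 2.8],
}

for dimension, quarters in ratings.items():
    print(dimension, year_average(quarters))
# Engagement with STEM -> 3.0
# STEM Content Learning -> 2.3
```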
Module Report (allows for comparisons across different science units)
[Chart: Average Ratings for Modules — Robotics, Animal Adventures, Space Exploration]
Contact Information: