Educ 6 Module 1


Colegio de Santa Catalina de Alejandria

Bp. Epifanio B. Surban Street, Dumaguete City


College of Liberal Arts - Education

Module in Education 6

Course Code: Education 6
Course Title: Assessment of Learning 1
Credit: 3 units
Prerequisite: None
Course Description: The course introduces students to the concepts of measurement, testing, assessment and evaluation. It focuses on the development and utilization of assessment tools to improve the teaching-learning process.

Course Learning Objectives: At the end of the course, students are expected to acquire the necessary knowledge and skills critical to the efficient performance of their roles and responsibilities as prospective teachers. Specifically, the objectives are:
Knowledge:
 To identify the role of assessment in making instructional decisions
 To apply the basic concepts and principles of high quality
assessment
 To match the different targets and their appropriate assessment
techniques
Skills/Abilities:
 To construct classroom assessment tools in measuring knowledge
and thinking skills
 To construct periodic/standardized tests
 To demonstrate skills in utilizing and reporting test results
 To apply statistical procedures in standardizing tests
 To grade students’ achievement
Values/Attitudes:
 To demonstrate an appreciation of assessment in improving
teaching and learning
 To manifest accuracy in computation, intellectual honesty, and responsibility

Course Requirements: Submission of activity notebook every 2 weeks (e.g., Friday)


On-line class (scheduled)
Projects: Table of Specification and Sample Report Card
Major Exams
COSCA Philosophy of Education: The total formation of the human person in the quest for knowledge and truth and the search for meaning, enlightened by Faith, for the good of the human family and in view of man's ultimate end
COSCA Vision: We envision COSCA as a Christ-Centered, Premier Diocesan Catholic Educational Institution, Transformative in Christian Leadership and Service, Enhancing the Family, Church, and Society in the Midst of a Fast-Changing World.
COSCA Mission: We provide and impart a Catholic education that is Christ-centered and competency-based, holistic and transformative, through appropriate use of relevant pedagogy and technology.

We equip students with globally-responsive knowledge, attitudes, and skills that are based on Christian principles and values.

We engage in research-based community projects for people empowerment and nation-building.

We advocate and promote the protection and preservation of Mother Earth, our common home, through responsible stewardship.
COSCA Core Values 1. Fides, Spes, Caritas – We believe, We hope, We love
2. Truth and Wisdom
3. Mission and Evangelization
4. Discipleship
5. Stewardship
6. Empowerment
7. Christian Service
Course Outline Topics

Prelim Grading Period

Module 1: Nature and Roles of Assessment
1.1 Concepts and Relevance of Assessment
1.2 Roles of Assessment

Module 2: Principles of High Quality Assessment
2.1 Appropriateness and Alignment of Assessment Methods to Learning Outcomes
2.2 Validity and Reliability
2.3 Practicality and Efficiency
2.4 Ethics

Midterm Grading Period

Module 3: Development of Tools for Classroom-Based Assessment
3.1 Planning the Test
 Test Development Process
 Identifying the Purpose of the Test
 Specifying the Learning Outcomes
 Preparing a Table of Specification
3.2 Selecting and Constructing Test Items and Tasks
 Categorizing Test Types
 Relating Test Types with Levels of Learning Outcomes
 Constructing Objective Supply Type of Items
 Constructing Non-Objective Supply Type
 Constructing Selected-Response Types

Final Grading Period

Module 4: Interpretation of Assessment Results
4.1 Utilization of Assessment Data
4.2 Grading and Reporting of Assessment Results

Module 5: Large-Scale Student Assessment
5.1 Understanding Large-Scale Student Assessment
5.2 Development of Large-Scale Student Assessment

Grading System:
CS - 60%
Major Exam - 40%
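As a worked illustration (assuming CS here denotes class standing, i.e., quizzes, activities, and projects combined): a student with a class standing of 88 and a major exam score of 82 would earn a final grade of 0.60(88) + 0.40(82) = 52.8 + 32.8 = 85.6.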

References:

Chatterji, M. (2003). Designing and using tools for educational assessment. Boston: Teachers College, Columbia University.

De Guzman, E. S., & Adamos, J. L. (2015). Assessment of learning 1. Quezon City: Adriana Publishing Co., Inc.

De Guzman-Santos, R. (2007). Assessment of learning 1. Quezon City: Lorimar Publishing, Inc.

Macaranang, M., & Vega, V. (2009). Assessment of learning 1.


Module Presentation

PRETEST
Multiple Choice: Select the correct answer. Write the letter of your choice in your activity notebook.
1. Assessment is important to teachers because of the decisions they will make about their students
when teaching them. For teachers, which of the following is NOT among the functions of assessment?
A. Identify learner’s needs and abilities
B. Monitor and improve the teaching-learning process
C. Make decisions about how to implement learning activities
D. Make policy decisions regarding what is and what is not appropriate for learners
2. Mr. Reyes uses evidence of student learning to make judgments about students' achievement against
goals and standards. He does this at the end of a unit or period. Which purpose does assessment serve?
A. Assessment as learning B. Assessment for learning
C. Assessment of learning D. Assessment strategy
3. During his first meeting, Mr. Montecino gave a readiness test to determine the prerequisite skills and
degree of mastery his students possess in relation to the course objectives or learning outcomes. He
intends to use the results to determine where he will begin in his lesson plan and decide on the best
mode of learning. Which form of assessment did he employ?
A. Diagnostic B. Formative
C. Placement D. Summative
4. Test results were utilized to make decisions about the school’s curriculum. What function does the
assessment serve?
A. Administrative B. Guidance
C. Instructional D. Research and evaluation
5. Which of the following guides the teacher in ensuring that achievement domains are achieved and a
fair and representative sample of questions appear on the test? It provides evidences of content validity.
A. Assessment matrix B. Lesson plan
C. Table of contents D. Table of specification
6. A Science teacher administered the same test twice. Which statistical tool is used in a test-retest
method?
A. Cronbach alpha B. Kuder-Richardson 20/21 C. Spearman Brown D. Pearson r
7. Which of the following is true about validity and reliability?
A. For an instrument to be valid, it must be reliable B. For an instrument to be reliable, it must be valid
C. Both (A) and (B) D. None of these
8. A Grade 7 Science teacher grouped the students for an oral presentation about mitigation and
disaster risk reduction. She forgot to discuss with the class how they would be graded. What ethical
concern did the teacher fail to consider?
A. Confidentiality B. Fairness C. Relevance D. Transparency
9. What is a basic feature of summative assessment?
A. It is used for evaluating student learning and effectiveness of instruction
B. It is basically utilized for monitoring learning of students
C. It marks student performance before and after instruction
D. It is used for making instructional decisions
10. Multiple-choice type is considered a versatile format since it can be used for assessing
A. learners of different ability levels B. lower and higher-level learning outcomes
C. knowledge, processes and performance D. all levels of assessment
11. What is the only requirement of a test to be considered an objective test?
A. Appropriateness of difficulty level of items to test takers
B. Observance of fairness in writing the items
C. Presence of only one correct or nearly best answer in every item
D. Comprehensibility of test instructions
12. Performance assessment tasks are utilized for assessing complex understanding. Select the learning
outcome appropriate to this kind of assessment.
A. Making weather predictions based on available data
B. Choosing the best paraphrase of a proverb
C. Associating significant inventions with their inventors
D. Composing an original design based on a theme
13. The median score in a test is 34. What does this indicate?
A. The average is 34 B. Half of the test scores are above/below 34
C. Majority of the students received a score of 34 D. The cut-off score is 34
14. Mean is to central tendency while standard deviation is to
A. difficulty B. discrimination C. reliability D. variability
15. When interpreting test results, which of the following types of scores is least meaningful?
A. Raw score B. Percentage correct score C. Percentile rank D. Standard score
16. Joseph’s grade in Mathematics did not progress despite a high-quality curriculum and group support.
What can the teacher do?
A. Give makeup work to cope with lessons
B. Implement a personalized action plan that targets skill deficits
C. Provide more assignments for practice
D. Transfer the student to another class
17. What must be satisfied for a large-scale assessment to serve its purpose of comparison across countries?
A. consistency of findings for schools over time B. positive attitude of participating schools
C. score comparability among schools D. fairness of items used in schools
18. The purpose of field testing as a step in the test development process is to find out if the test items
A. can be answered only by students who have the knowledge to answer them
B. can be answered by all the students in the participating schools
C. are understandable to all types of students
D. are worded appropriately
19. To ensure fairness, it is important that a large-scale test be pilot-tested on
A. samples of students from high-performing and low-performing schools
B. random samples representing the age and gender grouping of the target population
C. selected samples belonging to the same grade level as the target population
D. random samples of the same grade level as the target group coming from different socio-economic
and cultural conditions
20. Which of the statements below is true?
A. Classroom-based and large-scale testing can be both norm-referenced and criterion-referenced
B. Classroom-based testing is more norm-referenced than criterion-referenced
C. Large-scale assessment testing can be better used for criterion-referenced interpretation
D. Neither classroom testing nor large-scale testing is appropriate for norm-referenced purposes.

Module 1: Lesson 1.1

Title (Topic): Concepts and Relevance of Assessment

Intended Learning Outcome/s:
1. Compare and contrast measurement, testing, assessment, and evaluation
2. Determine the types of test
3. Identify the functions of testing
4. Describe the nature, purpose and relevance of assessment

Discussion/Presentation
Measurement. Comes from the Old French word mesure, which means "limit or quantity". Basically, it is a quantitative description of an object's characteristic or attribute. When we measure, there are appropriate measuring tools to gather numerical data on variables such as height, mass, time, and temperature, among others. In the field of education, what do teachers measure and what instruments do they use?
Teachers are particularly interested in determining how much learning a student has acquired. They measure particular elements of learning like students' readiness to learn, recall of facts, or ability to analyze and solve problems. Teachers use tools like tests, oral presentations, written reports, portfolios and rubrics to obtain pertinent information. Among these, tests are the most pervasive.
Testing. A formal, systematic procedure for measuring a learner's knowledge, skills, or abilities, administered under certain conditions (manner and duration). A test is a tool or an instrument used in testing. Educational tests may be used to measure the learning progress of a student. Tests are the most dominant form of assessment.
Types of Test
According to mode of response:
 Oral test – answers are spoken
 Written test – activities wherein students either select or provide a written response to a
question
 Performance tests - activities that require students to demonstrate their skills or ability to
perform specific actions. More aptly called performance assessments, they include
problem-based learning, inquiry tasks, demonstration tasks, exhibits, presentation tasks
and capstone performances.
According to ease of quantification:
 Objective test can be corrected and quantified quite easily. The test items have a
single or specific convergent response.
 Subjective test – elicits varied responses.
According to test constructors:
 Standardized tests – prepared by experts who are versed in the principles of
assessment. They are administered to a large group of students or examinees under
similar conditions. Scoring procedures and interpretations are consistent. There are
available guides and manuals to aid in the administration and interpretation of results.
Results of standardized tests serve as an indicator of instructional effectiveness and a
reflection of the school’s performance.
 Non-standardized tests – prepared by teachers who may not be adept at the principles of
test construction
According to nature of the answer or construct being measured:
 Personality tests – initially developed in the 1920s to aid in the selection of personnel in the
armed forces. A personality test has no right or wrong answers; it measures one's personality
and behavioural style. It is used in the recruitment of personnel, in career guidance, in
individual and relationship counselling, and in diagnosing personality disorders. In schools,
personality tests determine personality strengths and weaknesses. Personality development
activities can then be arranged for students.
 Achievement and Aptitude Tests
Achievement tests measure students' learning as a result of instruction and training experiences. When used summatively, they serve as a basis for promotion to the next grade.
Aptitude tests determine a student's potential to learn and do new tasks. A career aptitude test aids in choosing the best line of work for an individual based on his/her skills and interests.
 Intelligence tests – measure a learner's innate intelligence or mental ability.
 Sociometric tests – measure interpersonal relationships in a social group. The test allows learners to express their preferences in terms of likes and dislikes for other members of the group.
 Trade or vocational tests – assess an individual's knowledge, skills, and competence in a particular occupation. Upon successful completion of the test, the individual is given a certification of qualification. Trade tests can likewise be used to determine the effectiveness of a training program.
Functions of Testing
A. Instructional Functions
 Tests facilitate the clarification of meaningful learning objectives.
 Tests provide a means of feedback to the instructor and the student.
 Tests can motivate learning.
 Tests can facilitate learning.
 Tests are a useful means of overlearning. Overlearning means continued study, review, interaction or practice of the same material even after concepts and skills have been mastered.
B. Administrative Functions
 Tests provide a mechanism of quality control. Through tests, a school can determine the strengths and weaknesses of its curricula.
 Tests facilitate better classification and placement decisions.
 Tests can increase the quality of selection decisions.
 Tests can be a useful means of accreditation, mastery or certification.
C. Research and Evaluation Functions
Tests are utilized in studies that determine the effectiveness of new pedagogical techniques. Evaluators also utilize assessment data to determine the impact and success of their programs.
D. Guidance Functions
Tests can be of value in diagnosing an individual’s special aptitudes and abilities. The aim of guidance is to
enable each individual to understand his/her abilities and interests and develop them so that he/she can take
advantage of educational, vocational and personal opportunities.

ASSESSMENT
The word assessment comes from the Latin word assidere, which means "to sit beside a judge". This implies that assessment is tied up with evaluation. Assessment is a process of collecting information about a learner's performance using a variety of tools and methods. Tests are a form of assessment. However, there are also non-test assessment techniques like portfolios, observation, oral questioning and case studies for authentic assessment.
Categories:
 Maximum Performance – determines what learners “can do”. It is achieved when learners are
motivated to perform well. In this category, students are encouraged to aim for a high score.
 Typical Performance – shows what learners "will do" or "choose to do". It is more focused on the learner's level of motivation rather than his/her optimal ability.
Purposes:
 Assessment for Learning (AfL). Pertains to diagnostic and formative assessment tasks which are
used to determine learning needs, monitor academic progress of students and guide instruction.
Examples: Pre-tests, written assignments, quizzes, concept maps, focused questions
 Assessment as Learning (AaL). Employs tasks or activities that provide students with an
opportunity to monitor and further their own learning – to think about their personal learning
habits and how they can adjust their learning strategies to achieve their goals. Examples: self
and peer assessment rubrics
 Assessment of Learning (AoL). Summative and done at the end of a unit, task, process or period.
Its purpose is to provide evidence of a student’s level of achievement in relation to curricular
outcomes. Examples: unit tests, final projects
Relevance of Assessment
Assessment is needed for continued improvement and accountability in all aspects of the education system. In order to make assessment work for everyone (students, teachers, and other players in the education system), one should have an understanding of what assessment provides and how it is used to explain the dynamics of student learning.
EVALUATION

This comes in after data have been collected from assessment tasks. Evaluation is a process of judging the quality of a performance or course of action. As its etymology indicates (from the French word evaluer), evaluation entails finding the value of an educational task. This means that assessment data gathered by the teacher have to be interpreted in order to make sound decisions about students and the teaching-learning process. Evaluation is carried out both by the teacher and students to uncover how the learning process is developing.
Task/ Activity 1 Directions: Briefly answer the following questions in your notebooks.
1. Briefly discuss the differences and similarities of measurement, testing,
assessment and evaluation.
2. Rectify the following misconceptions. Explain in two to three sentences
why they are incorrect:
a. Assessment and evaluation are one and the same
b. Assessment is completed once every grading period.
c. Assessment is one-way. Only teachers are involved in assessment.
d. Assessment is ultimately for grading purposes.
e. Student’s work should always be given a mark or grade.
f. Assessment is the responsibility of program coordinators/ supervisors.
g. Assessment is imposed on teachers by the school and accrediting
agencies.
h. Formative assessment is a kind of test teachers use to find out what their
students know.
i. Instruction informs assessment but not the other way around.
j. Assessment is an average of performances across a teaching period.
3. As a college student, you underwent several assessments in basic education. Recall from your own personal experience an assessment that you think was truly meaningful to you. Explain why it is so. Explain also the nature and purpose of that particular assessment.
4. Discuss the concepts of measurement, assessment and evaluation in the following scenarios.

a. Mrs. Vidal, a Grade 5 Science teacher, was done with her lessons on rocks and soil erosion. She gave a test that required students to discuss the types of rocks, how rocks turn into soil, and the effect of soil erosion on living things and the environment. She noted that Ellah scored 43 out of 50 in the test. She informed Ellah that she satisfactorily attained the intended learning outcome and deserved a grade of 86, with a remark that Ellah showed proficiency in the said topic.
b. Ms. Mendoza measured the size of her classroom and found that the floor area is only 20 square meters. She reported to the principal that the classroom is too small for a class of 40 students.
5. Explain briefly the relevance of assessment to learners, teachers, parents
and other stakeholders.

Module 1: Lesson 1.2

Topic: Roles of Assessment

Intended Learning Outcome/s: Explain the various roles of assessment in the instructional process.
Discussion/Presentation
ROLES OF ASSESSMENT
 Placement Assessment
Basically used to determine a learner's entry performance. Done at the beginning of instruction. If prerequisite skills are insufficient, then the teacher can provide learning experiences to help students develop those skills. If students are ready, then the teacher can proceed with instruction as planned.
 Formative Assessment
Mediates the teaching and learning processes. It is learner-centered and teacher-directed. It
occurs during instruction. It is context-specific since the context of instruction determines the
appropriate classroom assessment technique. Types of formative assessment include: question
and answer during discussion, assignments, short quizzes and teacher observations. Results of
formative assessments are recorded for the purpose of monitoring students' learning progress.
However, these are not used as bases for students’ marks.
Attributes of an Effective Formative Assessment:
 Learning progressions. These should clearly communicate the subgoals of the ultimate
learning goals.
 Learning Goals and Criteria for Success. These should be clearly identified and
communicated to students.
 Descriptive Feedback. Students should be provided with evidence-based feedback that
is linked to the intended instructional outcomes and criteria for success. Discrepancies or
gaps in the students’ current actual performance and desired goal attainment can be
reduced by both teacher and students through effective feedback that answers three
vital questions: Where am I going? How am I going? Where to next?
 Self and Peer Assessment. These are important for providing students an opportunity to
think metacognitively about their learning.
 Collaboration. A classroom culture in which teachers and students are partners in
learning should be established.
 Diagnostic Assessment
Intends to identify learning difficulties during instruction. Contrary to what others believe, diagnostic tests are not merely given at the start of instruction. They are used to detect causes of persistent learning difficulties despite the pedagogical remedies applied by the teacher. Results are not used as part of students' marks of achievement.
 Summative Assessment
This is done at the end of instruction to determine the extent to which the students have
attained the learning outcomes. It is used for assigning and reporting grades or certifying
mastery of concepts and skills. Example: written examination at the end of the school year to
determine who passes or who fails.
Task/Activity 1 Directions/Instructions: All answers should be written in your activity
notebooks.
1. What is the purpose of assessment shown in the following assessment
settings? Select from the options below. Write a short explanation why it is
so.
a. Assessment as selection or placement
b. Assessment as instruction and providing feedback
c. Assessment as determining what learners need to learn next
d. Assessment as diagnosing learner’s difficulties and misconceptions
e. Assessment as determining progress along a developmental continuum
f. Assessment as program evaluation or accountability
A. A twelve-year-old out-of-school youth who stopped schooling during the fourth grade took a test given by the Department of Education to go back to formal schooling. The test determines the grade or year level appropriate for the learner.
B. Every year, a national assessment is given to grade 3 pupils in English,
Math and Science.
C. A teacher gives a test towards the end of the unit. He/she will use the test items as a starting point for discussion of conceptual problems revealed by the test.
by the test.
D. A Math teacher at the start of instruction gives a test to determine the
weaknesses of students on performing operations with signed numbers.
E. An English teacher regularly assesses students’ skills by using probes
which are brief, easily administered measures. The teacher then graphs
changes in the number of correct words per minute (reading) and compares
each student’s growth to the rate of improvement needed to meet learning
goals.

2. Synthesize and integrate the information and ideas you have obtained in Module 1: Lesson 1.2. Create a concept map that would show the relationship between and among the roles of assessment.
Note: Concept maps are tools for organizing and presenting knowledge. They are visual representations that show relationships among concepts.

MODULE 2: Lesson 2.1

Topic: Appropriateness and Alignment of Assessment Methods to Learning Outcomes

Intended Learning Outcome: Match learning outcomes with the appropriate assessment method.
Discussion
Assessment methods and tools should be parallel to the learning outcomes or targets to provide
learners with opportunities that are rich in breadth and depth and promote deep understanding. For
example, if a learning outcome in an English subject states that students should be able to communicate
their ideas verbally, then assessing their skill through a written essay will not allow learners to
demonstrate that stated outcome.

Learning outcomes are statements of performance expectations in three domains: COGNITIVE, AFFECTIVE AND PSYCHOMOTOR.

A. COGNITIVE (Knowledge-based)

Table 1. Cognitive Levels and Processes (Anderson et al., 2001)
(Each level is listed with its cognitive processes, action verbs describing learning outcomes, and a sample learning competency.)

Remembering – Retrieving relevant knowledge from long-term memory
Processes: Recognizing, Recalling
Verbs: define, describe, identify, label, list, match, name, outline, reproduce, select, state
Sample competency: Define the 6 levels of mental processes by Anderson et al.

Understanding – Constructing meaning from instructional messages, including oral, written and graphic communication
Processes: Interpreting, Exemplifying, Classifying, Summarizing, Inferring, Comparing, Explaining
Verbs: convert, describe, distinguish, estimate, extend, generalize, give examples, paraphrase, rewrite, summarize
Sample competency: Explain the purpose of Anderson's taxonomy of the cognitive domain.

Applying – Carrying out or using a procedure in a given situation
Processes: Executing, Implementing
Verbs: apply, change, classify, compute, demonstrate, discover, modify, operate, predict, prepare, relate, show, solve, use
Sample competency: Write a learning objective for each level of Anderson's cognitive system.

Analyzing – Breaking material into its constituent parts and determining how the parts relate to one another and to an overall structure or purpose
Processes: Differentiating, Organizing, Attributing
Verbs: analyze, arrange, associate, compare, contrast, infer, organize, solve
Sample competency: Compare and contrast the thinking levels in the revised Bloom's taxonomy and Anderson's.

Evaluating – Making judgments based on criteria and standards
Processes: Checking, Critiquing
Verbs: appraise, compare, conclude, contrast, criticize, evaluate, judge, justify, verify
Sample competency: Judge the effectiveness of writing learning outcomes using Anderson's taxonomy.

Creating – Putting elements together to form a coherent or functional whole; reorganizing elements into a new pattern or structure
Processes: Generating, Planning, Producing
Verbs: classify, construct, create, extend, formulate, generate, synthesize
Sample competency: Design a classification scheme for writing learning outcomes using Anderson's levels of the cognitive system.
The cognitive domain involves the development of knowledge and intellectual skills. It answers the question, "What do I want learners to know?" The first three levels are lower-order, while the next three levels promote higher-order thinking.

B. PSYCHOMOTOR (Skills-Based)

Table 2. Taxonomy of the Psychomotor Domain
(Each level is listed with its action verbs describing learning outcomes and a sample learning competency.)

Observing – Active mental attending of a physical event
Verbs: describe, detect, distinguish, differentiate, relate, select
Sample competency: Relate music to a particular dance step.

Imitating – Attempted copying of a physical behavior
Verbs: begin, display, explain, move, proceed, react, show, state, volunteer
Sample competency: Demonstrate a simple dance step.

Practicing – Trying a specific physical activity over and over
Verbs: bend, calibrate, construct, differentiate, dismantle, fasten, fix, grasp, grind, handle, measure, mix, organize, operate, manipulate, mend
Sample competency: Display several dance steps in sequence.

Adapting – Fine-tuning; making minor adjustments in the physical activity in order to perfect it
Verbs: arrange, combine, compose, construct, create, design, originate, rearrange, reorganize
Sample competency: Perform a dance showing a new combination of steps.

The psychomotor domain focuses on physical and mechanical skills involving coordination of the brain
and muscular activity. It answers the question, "What actions do I want learners to be able to perform?"

C. AFFECTIVE (Values, Attitudes and Interests)

Table 3. Taxonomy of the Affective Domain
(Each level is listed with its action verbs describing learning outcomes and a sample learning competency.)

Receiving – Being aware of or attending to something in the environment
Verbs: asks, chooses, describes, follows, gives, holds, identifies, locates, names, points to, selects, sits erect, replies, uses
Sample competency: Listen attentively to a volleyball introduction.

Responding – Showing some new behaviors as a result of experience
Verbs: answer, assist, comply, conform, discuss, greet, help, label, perform, practice, present, read, recite, report, select, tell, write
Sample competency: Assist voluntarily in setting up volleyball nets.

Valuing – Showing some definite involvement or commitment
Verbs: complete, describe, differentiate, explain, follow, form, initiate, invite, join, justify, propose, read, report, select, share, study, work
Sample competency: Attend optional volleyball matches.

Organizing – Integrating a new value into one's general set of values, giving it some ranking among one's general priorities
Verbs: adhere, alter, arrange, combine, compare, complete, defend, explain, generalize, identify, integrate, modify, order, organize, prepare, relate, synthesize
Sample competency: Arrange his/her own volleyball practice.

Internalizing values – Characterization by a value or value complex; acting consistently with the new value
Verbs: act, discriminate, display, influence, listen, modify, perform, practice, propose, qualify, question, revise, serve, solve, use, verify
Sample competency: Join intramurals to play volleyball twice a week.

The affective domain emphasizes emotional knowledge. It tackles the question, "What actions do I want learners to think or care about?"

TYPES OF ASSESSMENT METHODS

1. Selected-Response Format. Students select from a given set of options to answer a question or a
problem. Categories: true-false, matching type, multiple-choice formats
 Only one correct or best answer
 Objective
 Efficient
 Items are easy to grade

2. Constructed-Response Format. This format is more useful in targeting higher levels of cognition. It
is subjective and demands that students create or produce their own answers in response to a
question, problem or task. Categories: brief-constructed response, performance tasks, essay items,
oral questioning

3. Teacher observations. A form of on-going assessment, usually done in combination with oral
questioning. Teachers regularly observe students to check on their understanding. This assessment
method can also be used to assess the effectiveness of teaching strategies and academic interventions.

4. Student Self-Assessment
Self-assessment is one of the standards of quality assessment. It is a process where students are given a chance to reflect on and rate their own work and judge how well they have performed in relation to a set of assessment criteria.

MATCHING LEARNING TARGETS WITH ASSESSMENT METHODS

A learning target is a description of performance that includes what learners should know and be able
to do. Matching the assessment method with the learning outcomes requires an examination of the
evidence of learning needed and the targeted levels of knowledge, understanding, reasoning, skills,
product/performance and affect as manifested in the learning targets.
Knowledge and simple understanding. Covers the lower-order thinking skills of remembering,
understanding and applying. These can be assessed through pencil-and-paper tests, essays and oral
questions.

Deep understanding and reasoning. Involves the higher-order thinking skills of analyzing, evaluating and
synthesizing. Assessed through essays, performance tasks and oral questioning.

To assess skills, performance assessment is the superior assessment method. When used in real-life and
meaningful context, it becomes an “authentic assessment”.
Student affect can be assessed best by self-assessment. Oral questioning may also work in assessing
affective traits. Affect pertains to attitudes, interests, and values students manifest.
Tasks/Activities: All answers should be written in your activity notebooks.

1. Determine which domain and level of learning are targeted by the following
learning competencies.

Learning Competencies (write the Domain and Level of each):

1. Identify parts of a microscope.
2. Employ analytical listening to make predictions.
3. Exhibit correct body posture.
4. Perform jumping over a stationary object several times in succession, using forward-and-back and side-to-side movement patterns.
5. Recognize the benefit of patterns in special products and factoring.
6. Differentiate linear inequalities in two variables from linear equations in two variables.
7. Follow written and verbal directions.
8. Compose musical pieces using a particular style of the 20th century.
9. Describe movement skills in response to sound.
10. Design an individualized exercise program to achieve personal fitness.

2. SEQUENCING. Arrange the learning competencies using the hierarchy, from lowest to highest.
Domain: Cognitive
Topic A. Quadratic Equations
______ (a) Solve quadratic equations by factoring
______ (b) Describe a quadratic equation
______ (c) Compare the four methods of solving quadratic equations
______ (d) Differentiate a quadratic equation from other types of equations
______ (e) Formulate real-life problems involving quadratic equations
______ ( f) Examine the nature of roots of a quadratic equation.

Domain: Psychomotor
Topic B. Basic Sketching
______ (a) Watch how tools are selected and used in sketching
______ (b) Create a design using combinations of lines, curves and shapes.
______ (c) Draw various lines, curves and shapes.
______ (d) Set the initial drawing position.

Domain: Affective
Topic C. Short Story
______ (a) Write down important details of the short story pertaining to
character, setting and events.
______ (b) Share inferences, thoughts and feelings based on the short story.
______ (c) Relate story events to personal experience
______ (d) Read carefully the short story.
______ (e) Examine thoughts on the issues raised in the short story.

3. Assessment Scenarios
For each of the following scenarios, indicate which assessment method provides the best match. In determining the appropriate method, apply the matching of learning targets with assessment methods discussed above. Justify your choice in one or two statements.
1. Mrs. Duque wants to know if her students can identify the different parts of a
flower.
2. Ms. De la Cruz wants to determine if her Grade 1 pupils can write smoothly and legibly.
3. Ms. Uy wants to check if her students can subtract two-digit numbers.
4. Mr. Bogo wants to find out if his students can examine the quality of education in
the country.
5. Mrs. Dayao wants to see if her students have grasped the important elements of
the story before continuing on to the next instructional activity.

Module 2: Lesson 2.2

Topic: Validity and Reliability

Intended Learning Outcomes:
1. Define validity
2. Enumerate information that can be used to establish validity
3. Identify the different threats to validity
4. Define reliability
5. Identify sources of reliability evidence
Discussion/Presentation
Validity is a term derived from the Latin word validus, meaning strong. An assessment is valid if it
measures a student’s actual knowledge and performance with respect to the intended outcomes, and
not something else.
Sources of Evidence in Establishing Validity
1. Content-Related Evidence – pertains to the extent to which the test covers the entire domain of
content. If a summative test covers a unit with four topics, then the assessment should contain
items from each topic. A test that appears to adequately measure the learning outcomes and
content is said to possess face validity. Another consideration related to content validity is
instructional validity – the extent to which an assessment is systematically related to the
nature of instruction offered. This is also closely related to the degree to which students' performance on a test accurately reflects the quality of instruction provided to promote students' mastery of what is being assessed. An instructionally valid test is one that registers differences in the amount and kind of instruction to which students have been exposed. Inclusion of test items not taken up in class reduces validity because students had no opportunity to learn the knowledge or skill being assessed. To improve the validity of assessments, it is recommended that the teacher construct a Table of Specifications (ToS).
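A common way to draft a ToS is to allocate test items to each topic in proportion to the instructional time spent on it. The short sketch below only illustrates that idea; the topics, hours, and test length are hypothetical, not taken from this module.

# Hypothetical sketch: allocating items in a Table of Specifications
# in proportion to the instructional hours spent on each topic.
topics = {                      # topic -> hours of instruction (assumed)
    "Types of rocks": 4,
    "How rocks turn into soil": 3,
    "Effects of soil erosion": 5,
}
total_items = 50                # planned number of items in the test
total_hours = sum(topics.values())

for topic, hours in topics.items():
    # Items per topic are proportional to the time spent teaching the topic.
    items = round(total_items * hours / total_hours)
    print(f"{topic}: {items} item(s)")   # prints 17, 12, and 21 items

Here 4 of the 12 instructional hours went to the first topic, so it receives roughly a third of the 50 items; this keeps the test a fair and representative sample of what was actually taught.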

2. Criterion-Related Evidence. Refers to the degree to which test scores agree with an external criterion. It examines the relation between an assessment and another measure of the same trait.
Types:
Concurrent Validity – provides an estimate of a student's current performance in relation to a previously validated or established measure. It is important to mention that data from the two measures are obtained at about the same time. If the statistical analysis reveals a strong correlation between the two sets of scores, then there is high criterion validity.

Predictive Validity – pertains to the power or usefulness of test scores to predict future performance. For instance, can scores in the entrance/admission test (predictor) be used to predict college success (criterion)? If there is a significantly high correlation between entrance examination scores and first-year grade point averages assessed a year later, then the test has predictive validity.
In testing correlations between two data sets for both concurrent and predictive validity, the Pearson coefficient of correlation (r) or Spearman's rank-order correlation may be used.
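As a rough illustration of how such a correlation might be computed, here is a plain implementation of the standard Pearson formula; the scores are hypothetical, not actual admission data.

def pearson_r(x, y):
    # Pearson's r = covariance(x, y) / (sd(x) * sd(y))
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    var_x = sum((a - mean_x) ** 2 for a in x)
    var_y = sum((b - mean_y) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical data: entrance test scores vs. first-year GPA a year later
entrance_scores = [78, 85, 62, 90, 71, 88]
first_year_gpa = [2.8, 3.4, 2.1, 3.7, 2.5, 3.5]
print(round(pearson_r(entrance_scores, first_year_gpa), 2))
# Prints 0.99 for this sample; a coefficient near +1 would suggest
# strong predictive validity.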

3. Construct-Related Evidence
A construct is an individual characteristic that explains some aspect of behaviour. Construct-
related evidence of validity is an assessment of the quality of the instrument used. It measures
the extent to which the assessment is a meaningful measure of an unobservable trait or
characteristic.
Threats to Validity
These factors are defects in the construction of assessment tasks that would render
assessment inferences inaccurate. The first four apply to traditional tests and performance
assessments. The remaining factors concern brief-constructed response and selected-response
items:
 Unclear test directions
 Complicated vocabulary
 Ambiguous statements
 Inadequate time limits
 Inappropriate difficulty level of test items
 Poorly constructed test items
 Inappropriate test items for outcomes being measured
 Short test
 Improper arrangement of items
 Identifiable pattern of answers
For a test to be valid, it has to be reliable.

RELIABILITY
Refers to reproducibility and consistency in methods and criteria. An assessment is said to be reliable if
it produces the same results if given to an examinee on two occasions. It is expressed as a correlation
coefficient. A high reliability coefficient denotes that if a similar test is readministered to the same
group of students, test results from the first and second testing are comparable.
Types of reliability:
 Internal Reliability – assesses the consistency of results across items within a test
 External Reliability – gauges the extent to which a measure varies from one use to another.
Sources of Reliability Evidence:
 Stability. The test-retest reliability correlates scores obtained from two administrations of the
same test over a period of time. It assumes that there is no considerable change in the
construct between the first and second testing.
 Equivalence. In this method, two different versions of an assessment tool are administered to the same group of individuals. However, the items are parallel, i.e., they probe the same construct, base knowledge or skill. The two sets of scores are then correlated in order to evaluate the consistency of results across alternate versions.
 Internal Consistency. Implies that a student who has mastered the learning will answer all or most of the items correctly, while a student who knows little or nothing about the subject matter will get all or most of the items wrong.
Ways to establish internal consistency:
 Spearman-Brown formula
 Cronbach's alpha
 Kuder-Richardson Formula (KR) 20/21
For internal consistency, reliability coefficients are interpreted as follows:
Less than 0.50 – reliability is low
0.50 to 0.80 – reliability is moderate
Greater than 0.80 – reliability is high
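To make these formulas concrete, here is a small illustrative sketch. The item scores are hypothetical, and the functions implement the standard textbook formulas for Cronbach's alpha and the Spearman-Brown step-up applied to a split-half correlation.

def variance(values):
    # Sample variance (n - 1 in the denominator)
    m = sum(values) / len(values)
    return sum((v - m) ** 2 for v in values) / (len(values) - 1)

def cronbach_alpha(scores):
    # scores: one row per student, one column per item (1 = correct, 0 = wrong)
    k = len(scores[0])                                    # number of items
    item_vars = [variance([row[i] for row in scores]) for i in range(k)]
    total_var = variance([sum(row) for row in scores])    # variance of totals
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

def spearman_brown(r_half):
    # Steps a split-half correlation (e.g. odd vs. even items) up to full length
    return 2 * r_half / (1 + r_half)

# Hypothetical 5-item test answered by 6 students
scores = [
    [1, 1, 1, 1, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 1, 0, 0],
    [1, 1, 1, 1, 1],
    [0, 1, 0, 0, 0],
    [1, 1, 1, 0, 1],
]
print(round(cronbach_alpha(scores), 2))   # 0.68: moderate by the ranges above
print(round(spearman_brown(0.62), 2))     # a split-half r of 0.62 steps up to 0.77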
 Scorer or Rater Consistency
Certain characteristics of the raters contribute to errors such as bias, halo effect, mood, and fatigue, among others.
 Decision Consistency
Describes how consistent the classification decisions are rather than how consistent the scores are.

Tasks/Activity: Answers should be written in the activity notebooks.

1. Differentiate validity from reliability.
2. Is the following statement true or false? Explain.
"For a test to be valid, it has to be reliable."
3. Search for the specific formulas of the following:
 Spearman-Brown formula
 Cronbach's alpha
 Kuder-Richardson Formula (KR) 20/21
4. For each of the following situations, determine whether the assessment is valid. Explain your answer in two or three sentences, citing the type of validity:
4.1 Test constructors in a secondary school designed a new measurement procedure to measure intellectual ability. Compared to well-established measures of intellectual ability, the new test is shorter, to reduce the arduous effect of a long test on students. To determine its effectiveness, a sample of students accomplished two tests – a standardized intelligence test and the new test – with only a few days' interval. Results from both assessments revealed a high correlation.
4.2 A science teacher gave a test on volcanoes to Grade 9 students. The test
included the types of volcanoes, volcanic eruptions and energy from
volcanoes. The teacher was only able to cover extensively the first two topics.
Several test items were included on volcanic energy and how energy from
volcanoes may be tapped for human use. The majority of the students got low marks.
5. For each of the following scenarios, determine the source of reliability evidence and the type of reliability coefficient. Explain your answer in two or three statements.
5.1 For a sample of 150 Grade 10 students, a Science test on living things and their environment was tested for reliability by comparing the scores obtained on the odd-numbered and even-numbered items.
5.2 Scores from 100 Grade 7 students were obtained from a November
administration of a test in Filipino about panghalip na panao or personal
pronouns. These were compared to the scores of the same group from a
September administration of the same year.
Module 2: Lesson 2.3

Topic: Practicality and Efficiency

Intended Learning Outcomes:
1. Describe how testing can be made more practical and efficient
2. Identify the factors on practicality and efficiency that contribute to high quality assessment

Discussion/Presentation
Classroom testing should be "teacher friendly". Test development and grading should be practical and efficient. Practical means "useful", that is, it can be used to improve classroom instruction and for outcomes assessment purposes. It likewise pertains to judicious use of classroom time. Efficient, in this context, pertains to development, administration and grading of assessment with the least waste of resources and effort.

Factors on Practicality and Efficiency

 Familiarity with the Method
While it is best that teachers utilize various methods of assessment, it would be a waste of time and resources if the teacher is unfamiliar with the method of assessment. The assessment may not satisfactorily realize the learning outcomes. The teacher may commit mistakes in the construction or format of the assessment. Consequently, the results would not be credible. A rule of thumb is that simple learning targets require simple assessment; complex learning targets demand complex assessment. It is, therefore, critical that teachers learn the strengths and weaknesses of each method of assessment and how each is developed, administered and marked.
 Time Required
A desirable assessment is short yet able to provide valid and reliable results. It is best that
assessments are quick to develop but not to the point of reckless construction. Assessments
should allow students to respond readily but not hastily. Assessments should also be scored
promptly but not without basis.
 Ease in Administration
To avoid questions during the test or performance task, instructions must be clear and complete.
 Ease of Scoring
Selected-response formats are the easiest to score, compared to restricted-response and, more so, extended-response essays. It is also difficult to score performance assessments like oral presentations, research papers, among others. Performance assessments, however, make use of rubrics, and while this facilitates scoring, care must be observed in rating to ensure objectivity.
 Ease of Interpretation
Oftentimes, a score is meaningless if it is not interpreted. Objective tests are the easiest to interpret. By establishing a standard, the teacher is able to determine right away if a student passed the test. By matching the score with the level of proficiency, teachers can determine if a student has reached mastery or not. In performance tasks, rubrics are prepared to objectively and expeditiously rate students' actual performance or product. Rubrics should be shared with students so that they understand the meaning of the scores or grades given to them.
 Cost
It is recommended that one choose the most economical assessment. At the school level, multiple-choice tests are still very popular, especially in entrance tests, classroom tests, and simulated board examinations. However, high quality assessments focus on deep learning, which is essential if students are to develop the skills they need for a knowledge society. Therefore, schools must support performance assessment.

Tasks/Activities: All answers should be written in the activity notebook.


1. Among the given assessment methods, choose which one is most practical or efficient for each factor. Justify your answer.

Factor: Familiarity with the method – selected-response, constructed-response, teacher observation, self-assessment
Factor: Time required – brief-constructed response, essay, performance assessment, oral questioning
Factor: Ease in administration – selected-response, constructed-response, teacher observation, self-assessment
Factor: Ease of scoring – selected-response, constructed-response, teacher observation, self-assessment
Factor: Ease of interpretation – selected-response, constructed-response, teacher observation, self-assessment
Factor: Cost – selected-response, constructed-response, teacher observation, self-assessment

For each factor, write your chosen assessment method and the reason for your choice.
2. As an assessment expert, you were asked for advice on the following matters.
Scenario 1.
Mrs. Loreto handles 6 Grade 5 classes in English. She would like to test their skills in spelling. She has two options:
a. Oral spelling test. The teacher pronounces each word out loud and the students write down each word.
b. Spelling bee-type test. Each student is asked individually, one at a time, to spell specific words out loud.
In terms of practicality and efficiency, which one would you suggest? Why?
Scenario 2.
Mr. Chua is preparing a final examination for his third-year college students in Political Science, scheduled two weeks from now. He is handling six classes. He has no teaching assistant. He realized that after the final examinations, he has three days to calculate the grades of his students. He is thinking of giving extended-response essays.
What advice would you give him? Which aspect of practicality and efficiency should
he prioritize? What type of assessment should he consider?
Module 2: Lesson 2.4

Topic: Ethics

Intended Learning Outcomes:
1. Enumerate other aspects of fairness
2. Identify ethical issues in testing
3. Recommend actions to observe ethical standards in testing
Discussion/Presentation

Teachers’ assessments have important long-term and short-term consequences for students; thus
teachers have an ethical responsibility to make decisions using the most valid and reliable information
possible. Validity and reliability are aspects of fairness. Other aspects of fairness include the following:
 Students’ Knowledge of Learning Targets and Assessments. This aspect of fairness speaks of
transparency. Transparency is defined here as disclosure of information to students about
assessments. This includes what learning outcomes are to be assessed and evaluated,
assessment methods and formats, weighting of items, allocated time in completing the
assessment and grading criteria or rubric. By informing students regarding the assessment
details, they can adequately prepare and recognize the importance of assessment. They become
part of the assessment process. By doing so, assessment becomes learner-centered.
In regard to written tests, it is important that students know what is included and excluded in the test. The scoring criteria should also be known. As for performance assessments, the criteria should be divulged prior to assessment so that students will know what the teacher is looking for in the actual performance or product.
 Opportunity to Learn. Fair assessments are aligned with instruction that provides adequate time
and opportunities for all students to learn. Discussing an extensive unit in an hour is obviously
insufficient. Inadequate instructional approaches would not be just to the learners because they
are not given enough experiences to process information and develop their skills.
 Prerequisite Knowledge and Skills. Students may perform poorly in an assessment if they do
not possess background knowledge and skills. The teacher must identify early on the
prerequisite skills necessary for completing an assessment. The teacher may also provide clinics
or reinforced tutorials to address gaps in students’ knowledge and skills. He/she may also
recommend reading materials or advise students to attend supplemental instruction sessions
when possible. These are forms of remediation.
 Avoiding Stereotyping
A stereotype is a generalization about a group of people based on inconclusive observations of a small sample of this group. Common stereotypes are racial, sexual and gender remarks. Stereotyping is caused by preconceived judgments of people one comes in contact with, which are sometimes unintended. Teachers should avoid terms and examples that may be offensive to students of different gender, race, religion, culture or nationality. Stereotypes can affect students' performance without their even being aware of it.
 Avoiding Bias in Assessment Tasks and Procedures
Assessment must be free from bias. There are two forms of assessment bias: offensiveness and
unfair penalization.
Offensiveness happens if test-takers get distressed, upset, or distracted about how an individual
or a particular group is portrayed in the test.
Unfair penalization harms student performance due to test content, not because the items are offensive, but because the content caters to particular groups from the same economic class, race, gender, etc., leaving other groups at a loss or a disadvantage.
 Accommodating Special Needs
Teachers need to be sensitive to the needs of students. Certain accommodations must be given
especially for those who are physically or mentally challenged. The legal basis for
accommodation is contained in Sec. 12 of Republic Act 7277, entitled "An Act Providing for the Rehabilitation, Self-Development and Self-Reliance of Disabled Persons and Their Integration into the Mainstream of Society and for Other Purposes". The provision talks about access to quality
education – that learning institutions should consider the special needs of learners with
disabilities in terms of facilities, class schedules, physical education requirements and other
related matters. Another is Sec. 32 of CHED Memorandum 09, s. 2013, on "Enhanced Policies and Guidelines on Student Affairs and Services", which states that higher education institutions
should ensure that academic accommodation is made available to persons with disabilities and
learners with special needs.
 Relevance
This can be thought of as an aspect of fairness. Irrelevant assessment would mean short-changing students of worthwhile assessment experiences.
Additional criteria for achieving quality assessment:
 Assessment should reflect the knowledge and skills that are most important for students
to learn.
 Assessment should support every student’s opportunity to learn things that are
important
 Assessment should tell teachers and individual students something about what has been learned.

Ethical Issues
Asking pupils to answer sensitive questions, such as questions about their sexuality or problems in the family, is unwarranted, especially without the consent of the parents. The following are other ethical issues in testing:
 Possible harm to the participants
 Confidentiality of results
 Deception regarding the purpose and use of the assessment
 Temptation to assist students in answering the test
Tasks/Activities Answer the following in your activity notebooks.
A. Suppose you are the principal of a public high school. You received
complaints from students concerning their tests. Based on their
complaints, you decided to talk to the teachers and offered advice based
on ethical standards. For each scenario, write down your
recommendations citing specific aspects of ethics or fairness discussed in
the module.
Scenario 1. Eighth-grade students complained that their music teacher uses
only written tests as the sole method of assessment. They were not
assessed on their skills in singing and creating musical melodies.
Scenario 2. Grade 7 students complained that they were not informed that
there is a summative test in Algebra.
Scenario 3. Grade 9 students complained that there were questions in their Science test on the last unit which had not been discussed in class.
Scenario 4. Grade 7 students complained that they were not told what to
study for the mastery test. They were simply told to study and prepare for
the test.
Scenario 5. Students were tested on an article they read about Filipino
ethnic groups. The article depicted Ilocanos as stingy. Students who hail
from Ilocos were not comfortable answering the test.
