Journal Pre-proof
Quality assurance of test blueprinting
Ghada Eweda, Zakeya Abdulbaqi Bukhary, Omayma Hamed
PII: S8755-7223(19)30148-6
DOI: https://doi.org/10.1016/j.profnurs.2019.09.001
Reference: YJPNU 1263
To appear in: Journal of Professional Nursing
Received date: 15 February 2019
Revised date: 14 August 2019
Accepted date: 4 September 2019
Please cite this article as: G. Eweda, Z.A. Bukhary and O. Hamed, Quality assurance
of test blueprinting, Journal of Professional Nursing(2019), https://doi.org/10.1016/
j.profnurs.2019.09.001
This is a PDF file of an article that has undergone enhancements after acceptance, such
as the addition of a cover page and metadata, and formatting for readability, but it is
not yet the definitive version of record. This version will undergo additional copyediting,
typesetting and review before it is published in its final form, but we are providing this
version to give early visibility of the article. Please note that, during the production
process, errors may be discovered which could affect the content, and all legal disclaimers
that apply to the journal pertain.
© 2019 Published by Elsevier.
Quality Assurance of Test Blueprinting
GHADA EWEDA a, ZAKEYA ABDULBAQI BUKHARY* b,c & OMAYMA HAMED* d,e

a Exam Center, College of Nursing, Taibah University, Medina, Saudi Arabia
b College of Medicine, Taibah University, Medina, Saudi Arabia and c Prince Mohammad Bin Abdul Aziz Hospital, National Guard Health Affairs, Saudi Arabia
d King Abdulaziz University, Jeddah, Saudi Arabia and e Cairo University, Cairo, Egypt

*These authors contributed equally to this work.

Correspondence: Dr. Ghada Eweda, Assistant Professor of Microbiology and Immunology, and Head of the Exam Center Committee at the College of Nursing, Taibah University, PO Box 344, Medina 41411, Saudi Arabia. T: +966530269920. [email protected]

Acknowledgement: Special appreciation goes to the administrative academic staff members at the College of Nursing, Taibah University (Medina, Saudi Arabia), who support an inclusive system for improvement of the College assessment process.

Declaration of interest: none.

Notes on Contributors:
GHADA EWEDA, PhD, is an Assistant Professor of Microbiology and Immunology, and the Head of the Nursing College Exam Center Committee at Taibah University (Medina, Saudi Arabia). She conceived the study, designed the matrices and equations, and participated in manuscript writing, revision and final copy preparation.

ZAKEYA A. BUKHARY*, a Senior Fellow in the Higher Education Academy for Teaching and Learning, is an Associate Professor of Internal Medicine and an Infectious Diseases Consultant at PMBAH-NGHA, Saudi Arabia. She participated in manuscript writing, editing, and final review and revisions. She was the Vice Dean of the College of Nursing at Taibah University, where she managed and led the Exam Center and coordinated the team-based work and committees for education. PO Box 42477, Medina 41541, Saudi Arabia. [email protected] ORCID: http://orcid.org/0000-0002-5639-5975

OMAYMA A.E. HAMED*, MBBCh, MSc, MD, JMHPE, FAIMER, is a Professor in the Medical Education Department and Chairman of the Quality & Academic Accreditation Unit at King Abdulaziz University (Jeddah, Saudi Arabia). She participated in manuscript writing, revision, final copy preparation and providing consultation.
Revised Version (Ms. Ref. No.: JPN-D-19-58)
Title: Quality Assurance of Test Blueprinting
Abstract

Tests offer scores that measure student learning and program outcomes. Valid examinations are needed to accurately reflect scores related to dimensions of knowledge, analysis and different competencies in health education. The primary step in the process of exam development should be the construction of a test blueprint, and the degree of alignment of a test with its blueprint is a critical element of content validity. However, the availability of a published blueprint does not ensure that instructors adhere to it when developing their tests, and a method to evaluate and monitor content validity is required. Usually, this is done in a subjective manner by asking students whether the material taught has been fairly represented on the exam. Though useful, subjective evaluation can lack accuracy and can be biased and misleading. This study aims to present a tool for quantitative determination of the degree of consistency between the actual test and the developed blueprint. Such an objective tool for assuring the quality of the test blueprinting process allows the content validity of students' assessments to be better evaluated, monitored and increased.
Keywords: Blueprint; Alignment; Verification; Quality assurance; Content validity; Students' assessment
Introduction

Test scores are the basis of making influential decisions on students and programs (Bridge, Musial, Frank, Roe, & Sawilowsky, 2003). Invalid examinations that lack true reflection of the course content and objectives could lead to misjudging students' knowledge and abilities (Newble & Cannon, 2001). An important approach to developing content-valid examinations is the test blueprint (Kane, 1992; Dauphinee, 1994; Fowell, Southgate, & Bligh, 1999; Hamdy, 2006). A test blueprint represents a systematic approach to assessment construction, linking the course content used for teaching and learning to the test items, which increases the content validity of the assessment and the alignment of test items with objectives (Nunally, 1978; Bridge et al., 2003; Coderre, Woloschuk, & McLaughlin, 2009; Siddiqui & Ware, 2014). A test blueprint can be considered a map that shows the test structure, with the content areas and/or learning outcomes usually represented on the left side of the table and the measured cognitive skills across the table's first row (Suskie, 2009). A blueprint guides item writers to develop sufficient items that cover important content areas and objectives at the suitable cognitive level, and helps instructors to be accountable for the learning outcomes they produce (Oermann & Gaberson, 2017). Designing examinations based on a blueprint can overcome challenges such as over- or under-representation of some topics of the material taught (Jolly, 2010; Malau-Aduli, Walls, & Zimitat, 2012). By allowing representative sampling of learning outcomes and instructional content, a test blueprint ensures that a test measures what it is intended to measure and allows teachers to reach valid judgements about students' test scores (Billings & Halstead, 2016; Oermann & Gaberson, 2017). Combining the results of item analysis statistics with the developed blueprint helps evaluate and revisit the learning outcomes and teaching processes (Abdellatif & Al-Shahrani, 2019).
A crucial element of content validity is the degree of congruency between a test and its blueprint (Sireci, 1998). The more congruent a test is with its blueprint, the higher its content validity will be (McDonald, 2018). Despite being crucial to test development and construction, the application of the test blueprint has been disregarded in health education (American Educational Research Association, American Psychological Association, & National Council on Measurement in Education, 2001). Only 15 percent of medical school administrators, out of 144 American and international medical schools, found the test blueprint to be very important and required course directors or instructors to develop assessment blueprints (Bridge et al., 2003). None of the nursing course coordinators at an Associate Degree Nursing Program consistently documented a test blueprint or required the submission of a test blueprint by individual faculty members for review before each examination (Siegel, 2015). Moreover, the availability of a published blueprint does not necessarily ensure that instructors adhere to it when writing their exams, and a method to monitor and evaluate exam alignment with the blueprint is still necessary. Usually this is done in a subjective manner by asking the students and/or faculty, after summative examinations, questions that reflect the extent of content validity (Coderre et al., 2009; Suskie, 2009).
The authors believe that introducing a tool to objectively verify the quality of a test blueprinting process adds value to the process and encourages its implementation, by providing the ability to objectively judge and follow up the degree of consistency of the final test with what is stated in the blueprint and thus to monitor and ensure the content validity of students' assessment. In this article the authors present a two-matrix tool that allows quantitative evaluation of the extent of exam content validity by calculating the degree of test alignment with the test blueprint.
Materials and Methods

Two matrices were constructed for stepwise quality assurance of test blueprinting, to reach a numerical value that expresses the degree of consistency between the test and the test blueprint.

Start with your own test blueprint

It is important to note that there is no single ideal way to construct a test blueprint. The choice of a blueprint template is based on how suitable it is for achieving the learning outcomes of the course. The quality assurance process of test blueprinting presented in this study can be adapted to a wide variety of blueprint models, as will be discussed later.

The test blueprint model presented in this paper (Table 1) is used as an example to show how the alignment matrices can be applied to a test blueprint. This blueprint model shows the distribution of the test items over the course topics, the cognitive domains and the assessment tools.
Table 1. A test blueprint model, as an example, for a 20-question exam*

Column No.: 1 = Course topics; 2 = Contact hours allocated to the topic; 3 = Weight of the topic; 4 = Number of test items on each topic; 5 = Cognitive skill (assessment tool: MCQ, EMQ, SAQ), with sub-columns Remember/Understand, Apply/Analyze and Evaluate/Create; 6 = TOTAL number of items (MCQ, EMQ, SAQ).

| Course topics | Contact hours | Weight | Items on topic | Remember/Understand | Apply/Analyze | Evaluate/Create | TOTAL (MCQ, EMQ, SAQ) |
|---|---|---|---|---|---|---|---|
| Topic 1 | 2 | 2/8 = 0.25 | 5 | 3 (1, 2, 0) | 2 (2, 0, 0) | 0 (0, 0, 0) | 5 (3, 2, 0) |
| Topic 2 | 1 | 1/8 = 0.125 | 2 | 0 (0, 0, 0) | 1 (0, 0, 1) | 1 (0, 0, 1) | 2 (0, 0, 2) |
| Topic 3 | 3 | 3/8 = 0.375 | 8 | 2 (2, 0, 0) | 5 (1, 2, 2) | 1 (0, 0, 1) | 8 (3, 2, 3) |
| Topic 4 | 2 | 2/8 = 0.25 | 5 | 0 (0, 0, 0) | 4 (1, 2, 1) | 1 (0, 0, 1) | 5 (1, 2, 2) |
| Total | 8 | 1 | 20 | 5** (3, 2, 0) | 12** (4, 4, 4) | 3** (0, 0, 3) | 20* (7, 6, 7)* |

* A test blueprint for a 20-question exam distributed as seven multiple choice questions (MCQs), six extended matching questions (EMQs) and seven short answer questions (SAQs). ** Five, twelve and three of the exam questions target the (Remember/Understand), (Apply/Analyze) and (Evaluate/Create) cognitive skills, respectively.

This test blueprint model is the one used at …….. University College of Nursing.
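The weighting arithmetic behind this blueprint model (topic weight = contact hours / total contact hours; items per topic = weight × total test items) can be sketched as follows. This is an illustrative sketch, not part of the article's materials: the article does not state a rounding rule, and the largest-remainder rounding used here is an assumption that reproduces the Table 1 allocation (2.5 → 2 for Topic 2, 7.5 → 8 for Topic 3) while keeping the 20-item total.

```python
import math

def allocate_items(contact_hours, total_items):
    """Largest-remainder rounding so the per-topic items sum to total_items."""
    total_hours = sum(contact_hours.values())
    # Raw (possibly fractional) item counts: weight x total number of items.
    raw = {t: h / total_hours * total_items for t, h in contact_hours.items()}
    items = {t: math.floor(v) for t, v in raw.items()}
    shortfall = total_items - sum(items.values())
    # Hand the leftover items to the topics with the largest fractional parts,
    # breaking ties in favour of the heavier topic (so 7.5 rounds up to 8
    # before 2.5 does).
    for t in sorted(raw, key=lambda t: (raw[t] - items[t], raw[t]),
                    reverse=True)[:shortfall]:
        items[t] += 1
    return items

hours = {"Topic 1": 2, "Topic 2": 1, "Topic 3": 3, "Topic 4": 2}
print(allocate_items(hours, 20))
# prints {'Topic 1': 5, 'Topic 2': 2, 'Topic 3': 8, 'Topic 4': 5}
```

Any rounding scheme that preserves the grand total would serve; the point is that the per-topic counts in Column 4 follow mechanically from the contact hours in Column 2.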
Steps for alignment of the actual test with the test blueprint during writing the exam, and for verification of alignment:

Columns 1, 2 and 3 of Alignment Matrix-I (Table 2) are filled by the assigned faculty while writing the test, which helps align the test with the test blueprint. In Column 1, the assigned faculty writes the ordinal number of each test item as it appears on the test paper. In Column 2, the faculty writes the topic number represented by the corresponding item. In Column 3, the assigned faculty puts a check mark in the sub-column that represents the cognitive skill measured by the designated test item. For the review process, the assigned faculty submits the test blueprint, the Alignment Matrix-I and the final test to the reviewer/s.

Columns 4 and 5 of Alignment Matrix-I are filled by the reviewer/s of the blueprinting process, to verify the degree of alignment between the actual test and the presented test blueprint. For each test item, the reviewers write the topic number represented and the cognitive level measured by that item, from their point of view, in Columns 4 and 5, respectively.

In the model example illustrated in Alignment Matrix-I, the misaligned test items are flagged with asterisks and explained in the table footnote.
Table 2. Alignment Matrix-I: Alignment of the actual test with the test blueprint; a 20-question exam model

Column No.: 1 = Item number (on the exam paper); 2 = Topic number and 3 = Cognitive skill (a check mark under Remember/Understand, Apply/Analyze or Evaluate/Create), filled by the faculty member(s) writing the exam; 4 = Topic number and 5 = Cognitive skill, filled by the reviewer. The topic entries of Columns 2 and 4 are shown below; the check-mark entries of Columns 3 and 5 are summarized in the footnote for the items where faculty and reviewer disagree.

| Item number | Topic number (faculty) | Topic number (reviewer) |
|---|---|---|
| 1 | 1 | 1 |
| 2 | 2 | 2 |
| 3* | 2 | 2 |
| 4 | 1 | 1 |
| 5 | 3 | 3 |
| 6 | 1 | 1 |
| 7 | 3 | 3 |
| 8 | 1 | 1 |
| 9 | 4 | 4 |
| 10 | 1 | 1 |
| 11 | 4 | 4 |
| 12 | 3 | 3 |
| 13*** | 3 | 4 |
| 14 | 3 | 3 |
| 15* | 4 | 4 |
| 16 | 4 | 4 |
| 17*** | 3 | 2 |
| 18** | 4 | 4 |
| 19** | 3 | 3 |
| 20 | 3 | 3 |

In this model, the reviewers find six misaligned items and record their findings in Columns 4 and 5: *items number (3) and (15) on the exam paper actually target the (Remember/Understand), not the (Apply/Analyze), cognitive skill; **items number (18) and (19) target (Apply/Analyze), not (Evaluate/Create); ***item number (13) actually covers Topic 4, not Topic 3, and item number (17) covers Topic 2, not Topic 3.
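The reviewer's item-by-item comparison recorded in Alignment Matrix-I can be sketched as a simple per-item check. The row structure and the sample data below are hypothetical illustrations, not data from the article; only items 3 and 13 of the sample reproduce discrepancies flagged in Table 2.

```python
def misaligned_items(matrix_rows):
    """Return the exam-paper numbers of items where the reviewer's entry
    (Columns 4-5) disagrees with the faculty's entry (Columns 2-3)."""
    flagged = []
    for row in matrix_rows:
        if (row["topic_faculty"] != row["topic_reviewer"]
                or row["skill_faculty"] != row["skill_reviewer"]):
            flagged.append(row["item"])
    return flagged

# Hypothetical sample rows; the cognitive-skill labels are illustrative.
rows = [
    {"item": 3, "topic_faculty": 2, "topic_reviewer": 2,
     "skill_faculty": "Apply/Analyze", "skill_reviewer": "Remember/Understand"},
    {"item": 13, "topic_faculty": 3, "topic_reviewer": 4,
     "skill_faculty": "Apply/Analyze", "skill_reviewer": "Apply/Analyze"},
    {"item": 14, "topic_faculty": 3, "topic_reviewer": 3,
     "skill_faculty": "Apply/Analyze", "skill_reviewer": "Apply/Analyze"},
]
print(misaligned_items(rows))  # prints [3, 13]
```

An item counts as misaligned if either its topic or its cognitive skill differs between the two entries, which mirrors how the asterisked items above are flagged.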
Alignment Matrix-II (Table 3) is a framework filled by the reviewer/s to reach a numerical value expressing the degree of test alignment with the blueprint. The reviewers fill Columns 1, 2 and 3 of Alignment Matrix-II based on what is stated in the blueprint matrix submitted by the assigned faculty (Table 1), and fill Columns 4 and 5 based on the data they entered in Columns 4 and 5 of Alignment Matrix-I.
Table 3. Alignment Matrix-II: Degree of alignment of the test with the test blueprint; a 20-question exam model*

Column No.: 1 = Topic number (as stated in the blueprint); 2 = Number of test items representing this topic as stated in the test blueprint submitted by the faculty, with sub-columns Remember/Understand (R/U), Apply/Analyze (A/A) and Evaluate/Create (E/C); 3 = Total (blueprint); 4 = Actual number of test items representing this topic on the exam paper as observed by the reviewers, with the same sub-columns; 5 = Total (actual).

| Row No. | Topic number | Blueprint R/U | Blueprint A/A | Blueprint E/C | Total (blueprint) | Actual R/U | Actual A/A | Actual E/C | Total (actual) |
|---|---|---|---|---|---|---|---|---|---|
| 1 | 1 | 3 | 2 | 0 | 5 | 3 | 2 | 0 | 5 |
| 2 | 2 | 0 | 1 | 1 | 2 | 1 | 1 | 1 | 3 |
| 3 | 3 | 2 | 5 | 1 | 8 | 2 | 4 | 0 | 6 |
| 4 | 4 | 0 | 4 | 1 | 5 | 1 | 5 | 0 | 6 |
| 5 | Total | 5 | 12 | 3 | 20 | 7 | 12 | 1 | 20 |

The degree of alignment = (the number of congruent items / the total number of test items) × 100

For the 20-question exam model represented in Tables 1, 2 and 3:
AlT = (5 + 2 + 6 + 5) / 20 × 100 = 90%.
AlCS = (5 + 12 + 1) / 20 × 100 = 90%.

* The reviewers transfer what is stated in the blueprint (model example in Table 1) to Columns 1, 2 and 3 of Alignment Matrix-II (Table 3), and transfer their findings from Columns 4 and 5 of Alignment Matrix-I (Table 2) to Columns 4 and 5 of Alignment Matrix-II (Table 3). This chart reflects a comparative and calculative process for quantitative interpretation of findings, to objectively reflect the degree of alignment of the actual test with the test blueprint.
The reviewer uses Alignment Matrix-II to calculate the degree of test alignment. The degree of test alignment with the blueprint is simply expressed by the formula:

The degree of alignment = (the number of congruent items / the total number of test items) × 100

The degree of alignment can be expressed for various aspects:

1. The degree of test alignment with the test blueprint regarding the total number of test items representing each topic (AlT):

For each row (topic) of Alignment Matrix-II, compare the number in Column 3 (the total number of test items representing this topic as stated in the blueprint) with the number in Column 5 (the total number of test items representing this topic on the actual test), then choose the smaller of the two numbers. This number represents the number of items appearing on the test that are congruent with the blueprint regarding the representation of this topic. Repeat this step for all the rows (topics), add the resulting numbers together and substitute in the equation to get the percentage of test alignment with the blueprint with respect to the total number of test items representing each topic (calculations illustrated in Table 3).
2. The degree of test alignment with the test blueprint regarding the total number of test items measuring each cognitive skill (AlCS):

In the last row of Alignment Matrix-II (Row 5 in Table 3), compare the number in each sub-column of Column 2 (the total number of test items measuring the designated cognitive skill as stated in the blueprint) with the number in the corresponding sub-column of Column 4 (the total number of test items measuring the designated cognitive skill on the actual test as judged by the reviewers). For each pair of corresponding sub-columns, choose the smaller of the two numbers, then add the resulting three numbers together to get the number of items in the test that are congruent with the blueprint. Substitute in the equation to get the percentage of test alignment with the blueprint with respect to the total number of test items measuring each cognitive skill (calculations illustrated in Table 3).
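The minimum-and-sum calculation behind AlT and AlCS can be sketched in a few lines. This is an illustrative implementation, not part of the article's materials; the counts used are the worked example from Table 3.

```python
def degree_of_alignment(blueprint_counts, actual_counts, total_items):
    """Sum of the per-category minima, as a percentage of all test items."""
    congruent = sum(min(b, a) for b, a in zip(blueprint_counts, actual_counts))
    return 100 * congruent / total_items

# Items per topic (Topics 1-4): blueprint (Column 3) vs. actual (Column 5).
al_t = degree_of_alignment([5, 2, 8, 5], [5, 3, 6, 6], 20)

# Items per cognitive skill (Remember/Understand, Apply/Analyze,
# Evaluate/Create): blueprint (Row 5, Column 2) vs. actual (Row 5, Column 4).
al_cs = degree_of_alignment([5, 12, 3], [7, 12, 1], 20)

print(al_t, al_cs)  # prints 90.0 90.0, matching Table 3
```

The same function covers both aspects because each is just a different partition of the 20 items: by topic for AlT and by cognitive skill for AlCS.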
After the review process is complete, a copy of both Alignment Matrices I and II is sent to the course instructors as feedback on the degree of consistency of their assessments with what they state in the submitted blueprints, and on which items show discrepancies. This objective feedback can effectively be used by the course instructors to undertake guided corrective measures.

Discussion
Publishing a test blueprint does not necessarily reflect its adoption. Monitoring and evaluating the extent of content validity is usually carried out by asking the students and/or faculty whether the material taught and the course learning outcomes have been fairly represented on the exam (Coderre et al., 2009; Suskie, 2009), which, though useful, is not accurate and could be biased by many factors, for example, the level of a student's performance on the exam. This study presents a tool for quantitative determination of the percentage of test consistency with the blueprint that makes it possible to objectively judge the test blueprinting process and thus enhance the content validity of assessments. The degree of consistency with respect to various categories can give important insights into flaws in the education system or assessment process and how to take corrective measures. For example, a low degree of consistency between the test and the blueprint regarding the total number of test items measuring each cognitive domain (AlCS) could reflect a problem with the instructors' skills in writing items at the correct cognitive level, which would call for running a training workshop for the faculty on how to write test items that correctly target each of the different cognitive domains. Moreover, expressing the degree of alignment numerically makes it possible to objectively follow up the improvement in alignment of subsequent course exams with the course blueprint.
There are several ways to construct a test blueprint for written exams and objective structured clinical examinations (OSCEs). The rows of a blueprint matrix may represent the course or module learning outcomes (Patil, Gosavi, Bannur, & Ratnakar, 2015; Billings & Halstead, 2016), the course units/systems/topics (Bridge et al., 2003; Gaffas, Sequeira, Al Namia, & Al-Harbi, 2012; Siddiqui & Ware, 2014; Becker & Vassar, 2015; Patil et al., 2015) or clinical presentations (McLaughlin et al., 2005; Coderre et al., 2009). The columns may represent the cognitive skills at various levels (Bridge et al., 2003; Ahmad & Hamed, 2014) or the clinical tasks such as diagnosis, investigation and treatment (McLaughlin et al., 2005; Coderre et al., 2009; Siddiqui & Ware, 2014). The weighting of a topic can be based on the contact hours assigned to each topic (Bridge et al., 2003; Ahmad & Hamed, 2014) or the impact of a topic and the frequency of clinical presentations (McLaughlin et al., 2005; Coderre et al., 2009; Patil et al., 2015). Teaching staff can have different opinions on which design of a blueprint matrix is more practicable, and a teacher can make changes in a blueprint matrix to make it more applicable to the course content and objectives (Izard, 2005; Oermann & Gaberson, 2017). The quality assurance process of test blueprinting presented in this paper can be adapted to whatever blueprint template is used; the cells of Alignment Matrices I and II are modified to match the alternatives used in the test blueprint.
Conclusion

Subjective evaluation of the extent of exams' content validity through questionnaires, though useful, can lack accuracy or be biased. This study presents a tool for numerical verification of the alignment between the actual test and the test blueprint, which makes it possible to objectively monitor and evaluate the blueprinting process and the compatibility of exam questions with the learning outcomes intended to be assessed. The presented process of quality assurance of test blueprinting is expected to make blueprint application more practicable and valuable in guiding faculty development programs to improve the validity of students' assessment results. The tool presented in this study is applicable to a wide variety of blueprint templates.
References:
Abdellatif, H., & Al-Shahrani, A. M. (2019). Effect of blueprinting methods on test difficulty, discrimination, and reliability indices: Cross-sectional study in an integrated learning program. Adv Med Educ Pract, 10, 23–30.

Ahmad, R. G., & Hamed, O. A. E. (2014). Impact of adopting a newly developed blueprinting method and relating it to item analysis on students' performance. Med Teach, 36(1), S55–S61.

American Educational Research Association, American Psychological Association, & National Council on Measurement in Education. (2001). Standards for educational and psychological testing. Washington, DC: American Educational Research Association.

Becker, L. R., & Vassar, M. (2015). An assessment blueprint for the Advanced Medical Life Support two-day prehospital emergency medical services training program in the United States. J Educ Eval Health Prof, 12(1), 43.

Billings, D. M., & Halstead, J. A. (2016). Teaching in nursing: A guide for faculty (5th ed.). St. Louis, MO: Elsevier/Saunders.

Bridge, P., Musial, J., Frank, R., Roe, T., & Sawilowsky, S. (2003). Measurement practices: Methods for developing content-valid student examinations. Med Teach, 25(4), 414–421.

Coderre, S., Woloschuk, W., & McLaughlin, K. (2009). Twelve tips for blueprinting. Med Teach, 31, 359–361.

Dauphinee, D. (1994). Determining the content of certification examinations. In D. Newble, B. Jolly, & R. E. Wakeford (Eds.), The certification and recertification of doctors: Issues in the assessment of clinical competence (pp. 92–104). Cambridge: Cambridge University Press.

Fowell, S. L., Southgate, L. J., & Bligh, J. G. (1999). Evaluating assessment: The missing link? Med Educ, 33, 276–281.

Gaffas, E. M., Sequeira, R. P., Al Namia, R. A., & Al-Harbi, K. S. (2012). Test blueprints for psychiatry residency in-training written examinations in Riyadh, Saudi Arabia. Adv Med Educ Pract, 3, 31–46.

Hamdy, H. (2006). Blueprinting for the assessment of health care professionals. Clin Teach, 3, 175–179.

Izard, J. (2005). Overview of test construction. In K. N. Ross (Ed.), Quantitative research methods in educational planning (Module 6). Paris, France: UNESCO International Institute for Educational Planning.

Jolly, B. (2010). Written examinations. In T. Swanwick (Ed.), Understanding medical education: Evidence, theory and practice (pp. 208–231). Oxford: Wiley-Blackwell.

Kane, M. T. (1992). The assessment of professional competence. Eval Health Prof, 15(2), 163–182.

Malau-Aduli, B. S., Walls, J., & Zimitat, C. (2012). Validity, reliability and equivalence of parallel examinations in a university setting. Creative Educ, 3, 923–930.

McDonald, M. E. (2018). The nurse educator's guide to assessing learning outcomes (4th ed., p. 25). Burlington, MA: Jones & Bartlett Learning.

McLaughlin, K., Coderre, S., Woloschuk, W., Lim, T. H., Muruve, D., & Mandin, H. (2005). The influence of objectives, learning experiences and examination blueprint on medical students' examination preparation. BMC Med Educ, 5, 39.

Newble, D., & Cannon, R. (2001). A handbook for medical teachers (4th ed., pp. 126–129). Dordrecht, Netherlands: Kluwer Academic Publishers.

Nunally, J. C. (1978). Psychometric theory. New York: McGraw-Hill.

Oermann, M. H., & Gaberson, K. B. (2017). Evaluation and testing in nursing education (5th ed.). New York: Springer Publishing Company.

Patil, S. Y., Gosavi, M., Bannur, H. B., & Ratnakar, A. (2015). Blueprinting in assessment: A tool to increase the validity of undergraduate written examinations in pathology. Int J App Basic Med Res, 5, S76–S79.

Siddiqui, I., & Ware, J. (2014). Test blueprinting for multiple choice questions exams. J Health Spec, 2, 123–125.

Siegel, T. J. (2015). Assessment practices at an Associate Degree Nursing Program. Retrieved from http://scholarworks.waldenu.edu/cgi/viewcontent.cgi?article=1602&context=dissertations

Sireci, S. G. (1998). The construct of content validity. Social Indicators Research, 45, 83–117.

Suskie, L. (2009). Assessing student learning: A common sense guide (2nd ed.). John Wiley & Sons.
Highlights

The construction of a test blueprint is the primary step in the process of test development to ensure content validity.

A crucial element of content validity is the degree of congruency between a test and its blueprint.

A published blueprint does not ensure that instructors adhere to it when writing their exams, and a method to monitor and evaluate exam alignment with the blueprint is necessary.

Subjective evaluation of the extent of content validity through surveys can be biased, leading to inaccurate judgement.

The authors present a tool for numerical verification of the alignment between the actual test and the test blueprint, which is useful for objective monitoring and evaluation of the degree of adherence to the blueprint when writing exams, to ensure the compatibility of exam questions with the learning outcomes intended to be assessed.