Source: https://www.researchgate.net/publication/361094060 (uploaded by Belay Hagos Hailu on 4 June 2022).


Report on
University Exit Examination in Ethiopia:
Strategies for Institutionalization and Implementation

(Revised)

Submitted to the Ministry of Science and Higher Education

February 2019
Addis Ababa, Ethiopia
List of Acronyms and Abbreviations

AASTU Addis Ababa Science and Technology University


AAU Addis Ababa University
EAS Ethiopian Academy of Sciences
EC Ethiopian Calendar
ESC Education Strategy Center
ESDP-V Education Sector Development Program V
EthERNet Ethiopian Educational Research Network
FGD Focus Group Discussion
HERQA Higher Education Relevance and Quality Agency
HESC Higher Education Strategy Center
HPLED Health Professionals Licensure Examinations Directorate
IER Institute of Educational Research
MCQ Multiple Choice Questions
MOE Ministry of Education
NEAEA National Educational Assessment and Examinations Agency
OSCE Objective Structured Clinical Examination
UEE University Entrance Examinations

Table of Contents

List of Acronyms and Abbreviations ............................................................................................................. i


Acknowledgments......................................................................................................................................... v
Executive Summary ...................................................................................................................................... 1
1. Introduction ........................................................................................................................................... 6
1.1. Preamble ....................................................................................................................................... 6
1.2. Objectives of the Exit Exam Strategic Document......................................................................... 8
1.3. Approaches in developing the strategic document........................................................................ 8
1.3.1. Review of existing documents .................................................................................................. 8
1.3.2. Key informant interviews.......................................................................................................... 9
1.3.3. Consultative meetings and workshops with stakeholders ......................................................... 9
1.3.4. Expert review and assessment................................................................................................... 9
2. Rationale ............................................................................................................................................. 15
3. Review of existing policy documents and proclamations ................................................................... 16
4. International and National Experiences of Universities on Exit Exam................................ 21
4.1. International experiences ............................................................................................................ 21
4.1.1. State Exam in Germany ...................................................................................................... 21
4.1.2. Exit exam in the USA ......................................................................................................... 21
4.1.3. Exit Exam in India .............................................................................................................. 22
4.1.4. Exit Exam in the United Arab Emirates .............................................................................. 25
4.2. Ethiopian experience on Exit Exam ............................................................................................ 31
4.2.1. Exit exam in Ethiopian Law Schools .................................................................................. 31
4.2.2. Exit exam as a licensure examination for health professionals ........................................... 32
4.2.3. The experience of Testing Center at Addis Ababa University............................................ 40
5. Instituting Exit Examinations in Ethiopia ........................................................................................... 41
5.1. Procedures for Implementing University Exit Exam .................................................................. 41
5.2. Use of University Exit Examination ........................................................................................... 43
5.3. Implementation Time .................................................................................................................. 46
5.4. Content of Exit Examinations ..................................................................................................... 47
5.5. Contributions of exit exam to quality education ......................................................................... 49
5.6. Consequences of the results of exit exam ................................................................................... 50

5.7. Prioritization of fields for examination ....................................................................................... 52
5.8. Technical and administrative preparations.................................................................................. 54
5.9. Accommodation of Students with disability ............................................................................... 55
5.10. Re-exam opportunity............................................................................................................... 56
5.11. Credibility of exit exams to stakeholders ................................................................................ 57
5.12. Challenges ............................................................................................................................... 59
5.13. Legal mandate of exiting institutions ...................................................................................... 60
5.14. Possible arrangements in instituting exit examinations .......................................................... 61
5.15. Cost implication of the exit exam ........................................................................................... 63
6. Guidelines on the Technical Aspects of Exit Exam ............................................................................ 65
6.1. Characteristics of a good test ...................................................................................................... 65
6.2. Purpose of testing ........................................................................................................................ 67
6.3. Guidelines for Test Blue Print or Table of Specification ............................................................ 67
6.4. Guideline for writing multiple choice questions (MCQs)........................................................... 73
6.5. Guidelines for Reviewers of Test Items ...................................................................................... 75
6.6. Guidelines for Psychometricians ................................................................................................ 76
6.7. Pretesting, Item Analysis, and Item Banking .............................................................................. 77
6.8. Database for Exit Exam .............................................................................................................. 80
6.9. Guidelines for Test Booklet Compilation and Booklet Formation ............................................. 81
6.10. Guidelines for University Exit Exam Standard Setting: Using the Modified Angoff Method 82
6.11. Guidelines for University Exit Exam Security........................................................................ 87
7. Summary of findings........................................................................................................................... 89
8. Scenarios for Instituting Exit Examination ......................................................................................... 93
Annex 1. Discourse over the Need for Exit Exam in the USA ................................................................... 97
Annex 2. Organogram for Instituting Exit Examination .......................................................................... 102
References ................................................................................................................................................. 103

List of Tables

Table 1: Enrollment in Undergraduate Programs by Six Bands for the year 2010 EC............................... 10
Table 2: The trend of graduates from 2005 to 2009 EC.............................................................................. 11
Table 3: Trends of Exit Exam Results of Prospective Graduates of Law Schools in Ethiopia ...................... 32
Table 4: Results of Prospective Graduates of Health Professionals in Ethiopia (2016/17) ........................ 35
Table 5: Ratio of Estimated Difficulty Level by domain of cognitive outcomes ......................................... 70
Table 6: A Template for Developing Test Blue Print or Table of Specification ........................................... 71
Table 7: Revised Bloom’s Taxonomy Verbs: Cognitive Domain (from lower to higher skills) ................... 72
Table 8: Test Item Rating Form ................................................................................................................... 85
Table 9: Expert Rating Spreadsheet ............................................................................................................ 86
Table 10: Advantages and disadvantages of scenarios for instituting exit examination ............................ 93

Acknowledgments

The Ethiopian Academy of Sciences gratefully acknowledges the Ministry of Education for
commissioning the study. The Academy also acknowledges the following members of the team
of experts for their industrious efforts in conducting the study and producing this report:

• Belay Hagos (PhD), Team Leader
• Girma Lemma (PhD), Team Member
• Tamirie Andualem (PhD), Team Member

The Academy would also like to thank the institutions that cooperated by providing the required
information, and the individuals who provided their comments and views as key informants,
focus group members, and reviewers of the first draft report. The constructive support and
valuable feedback from the officials of the Ministry of Science and Higher Education are also
very much appreciated.

Ethiopian Academy of Sciences

Executive Summary

The quality of education in Ethiopia has been a concern at all levels of the education system.
Whether university graduates have attained the required minimum competencies has been
questioned, and relevant stakeholders would like to know. One recommended measure to redress
this problem is the implementation of a university exit examination for prospective graduates.
The implementation of an exit examination relates to the regulation and improvement of the
quality of higher education, and the feedback from its results is instrumental for institutional
improvement. A university exit exam is a standardized, comprehensive, curriculum-based test
designed to assess whether students have achieved the minimum competence stipulated in the
graduate profile.

The purpose of this study is twofold: 1) to document strategies for implementing a university exit
examination and 2) to identify the technical requirements of implementing it. To achieve these
objectives, the study team explored local and international experiences, critically reviewed
relevant documents, and interviewed key stakeholders such as directors from the Ministry of
Science and Higher Education, the Education Strategy Center, the Higher Education Relevance
and Quality Agency, the Testing Center of Addis Ababa University, and university presidents. In
addition, the team obtained relevant feedback from consultative meetings with stakeholders and
from specialized expert reviewers.

This study revealed the following major findings:

• Procedures required for the implementation of the exit examination were described in detail.
Four steps were identified: 1) preparation; 2) item development and standard setting; 3) exam
duplication, sorting, packing, transporting and administration; and 4) exam scoring and
reporting.

• The use of the exit examination was discussed as either a requirement for graduation or as a
licensure examination. The idea of emphasizing, in the long term, a rigorous university
entrance examination rather than focusing on an exit examination was also discussed. Either
controlling the quality of potential candidates at entry through a college entrance examination
or testing the competence of candidates just before graduation through an exit examination is
required.

• The implementation time for the exit examination was also discussed, and two options
emerged: the first is two-time testing, where students sit for the exam at the mid-point (or
halfway) and at the end of the academic program; the second is sitting for the exit exam only
once, at the end of the academic program.

• The introduction of the exit examination was identified as a contributing factor to the
improvement of the quality of higher education, owing to the institutional feedback generated
by its results.

• Whether the exit examination should be implemented for all fields of study at once or phase
by phase was also explored. To start with, the phase-by-phase approach was found feasible.
As students enrolled in engineering constitute 35% of total undergraduate enrollment, starting
the implementation of the exit examination with this field was recommended.

• Test accommodation for students with disabilities was also considered, to fairly remove any
barriers in testing. For instance, the JAWS screen-reader software could be used as one
accommodation strategy.

• Re-exam opportunities for those who could not attain a pass mark were also discussed as a
major issue. Four options emerged in this regard:

  o Open opportunity for re-examination, indefinite or unlimited;
  o Second re-examination chance: fee-free, with face-to-face and online university support;
  o Third re-examination chance: the candidate pays the fees, with no face-to-face university
    support, only online support and self-support;
  o Fourth and last chance of re-examination: similar to the third chance (the candidate pays
    the fees, with no face-to-face university support, only online support), but with additional
    measures, i.e.:
    - Down-grading the level of qualification following unsuccessful re-examination, i.e., a
      diploma or advanced diploma instead of a degree;
    - Linking candidates with further skills training, including TVET.

• Working towards enhancing the credibility of the exit exam to stakeholders, including
students and parents, was also discussed. Raising stakeholder awareness and selling the
rationale behind the exit examination to the public in general, and to key stakeholders in
particular, was underlined.

• Universities have the legal mandate to implement exit examinations. However, due to the
potential variability among universities and the need to regulate standards centrally, they
could delegate an institution that represents them to implement the exit examination, which
would be considered a requirement for graduation, in the same way that the Addis Ababa
University Testing Center was officially delegated for the university entrance examination.

• The cost implications of instituting the exit exam were also assessed. Given the large number
of candidates, i.e., prospective graduates, the estimated cost of implementation is immense
and requires resources and a commitment to allocate budget.

This study has also produced several technical guidelines that would help implementing
organization(s) follow standardized procedures of testing, scoring and reporting, which include
the following:

• Guidelines for test blue print or table of specification
• Guidelines for writing multiple choice questions
• Guidelines for reviewers of test items
• Guidelines for psychometricians
• Pre-testing, item analysis and item banking
• Guidelines for test booklet compilation and booklet formation
• Guidelines for standard setting and cut-off scores
• Guidelines for exam security

Finally, the study did not make recommendations; rather, it presented various scenarios, with
their respective advantages and disadvantages, for further discussion and decision.

1) Regarding the purpose of the exit exam, two scenarios emerged: the first option was the
university exit exam as a requirement for graduation, and the second option was the exit exam
as a licensure examination.

2) Concerning the institutionalization of the exit examination, three scenarios came up: the first
option was establishing an independent exit exam institution accountable to the Office of the
Prime Minister; the second was using existing institutions by forming a consortium (NEAEA,
IER/AAU, HERQA, HESC, EthERNet, etc.) accountable to the Ministry of Science and
Higher Education; and the third was establishing a directorate within the Ministry of Science
and Higher Education to implement the exit examination.

3) Three scenarios were generated for the testing format: the first option was implementing the
exit exam in paper-and-pencil format; the second was implementing it in electronic format,
i.e., computer-based testing; and the third was implementing it using a combination of
paper-and-pencil and computer-based testing formats.

4) Concerning the phases of administering the exit examination, two scenarios emerged: the first
option was administering the exit exam twice, at the mid-term and end-term of the academic
program; and the second was administering it only once, at the end of the academic program.

5) With regard to re-examination opportunities, five options emerged:

a) Indefinite opportunity for re-examination;
b) Only one more chance for re-examination, free of any fees, with face-to-face and online
academic support;
c) Two more chances for re-examination, with fees covered by the candidate and with
face-to-face and online academic support;
d) Three more chances for re-examination, with fees covered by the candidate and no
face-to-face academic support; and
e) A last, fourth additional chance for re-examination, followed by down-graded
certification (i.e., an advanced diploma or diploma instead of a degree), including denial
of certification for those who still do not reach the required level. In addition, such
students might be linked to skills training institutions, including TVET.

These scenarios were open for further discussion and ultimate decision by the relevant
stakeholders, although the study team has indicated some suggestions based on the advantages
and disadvantages outlined and on the feasibility of implementation in the short and long term.

1. Introduction

1.1. Preamble
The rapid expansion of higher education, with unprecedented growth in the student population
and program diversification that is not matched by quality, has become a concern of the
institutions themselves and of the public at large. Currently, there are 45 public and 4 private
universities and over 170 private colleges, institutes and university colleges in Ethiopia
(HERQA, 2018). There is considerable pressure on higher education institutions to demonstrate
not simply that they are training young graduates in various fields of study, but also that these
new graduates have developed what might be called general competencies in their areas of
specialization. This places assessment at the heart of the undergraduate experience. To
demonstrate that universities are on the right track and are producing graduates with the
necessary skills and qualities, which graduates can display in their lives, there is a need to ensure
that the curriculum embodies the purpose of higher education. Sound assessment has become a
pressing issue, whether the profession is medicine, law, engineering or any other field. It
provides information for better learning and teaching, and assessment reform is necessary if
universities are to ensure that graduates have the competencies embedded in the respective
curricula and desired by government, employers, funding bodies, taxpayers, and the students
themselves. Brown and Knight (2004) described the situation as follows:

In these times of public accountability and consumer power, we shall explain that
assessment is central to universities’ practices for total quality management,
practices which are going to become increasingly important in financially
strained and highly competitive times. We also provide a range of assessment
methods to encourage universities to assess in ways that are fit for the intended
purpose (p.13)
The current assessment practices in higher education institutions, however, are marked by
discrete procedures and practices that do not actually encourage the learning of higher-level
skills such as applying theoretical knowledge to a given context or analyzing, synthesizing and
evaluating new components of learning. Such practices may not maximize the learning benefits
students can achieve and do not help them to look at what they have learned from a new
perspective. As Brown (2003) suggested, “the sheer volume may instead trivialize the nature of
assessment tasks and result in a poor quality of attention by students, markers and examiners
alike” (p.39). For learning to take place in its true sense, it is absolutely crucial to place
assessment at the center of the teaching-learning process. To enhance the quality of education at
higher education institutions and influence the teaching and learning process positively, teachers
and the system at large have to shift towards designing a fit-for-purpose assessment strategy. As
Boud (1994, cited in Brown, 2003) suggested, “students can escape bad teaching: they can’t
avoid bad assessment. The conventional ways by which we choose how to assess our students
are just not good enough to achieve what we want, so we need to radically review our
assessment strategies to cope with changing conditions we have to face in higher education
internationally” (p.4).

System-wide assessment, therefore, is central to meeting the pressure upon higher education
institutions and to ascertaining the fulfillment of their missions and visions. There is a need to
look at new assessment practices and procedures. A systems approach to assessment must be
introduced, and strategies worked out and institutionally managed, to change the ways in which
students are assessed in higher education institutions. When institutions tackle the problem of
assessment strategically and make it the result of conscious decisions based on informed choice,
assessment can be motivating and productive for students, helping them to know how well they
are doing and what else they need to do. It also lets staff know how they and their students are
doing and gives them the performance indicators that they need.

One way towards that end is introducing a system-wide, high-stakes, standardized and
comprehensive exit examination that encourages students to achieve their best and teachers to
provide instruction that will assure high levels of competency and learning. Exit examinations
are system-based assessment programs administered to undergraduate students after completion
of their training in their respective areas of specialization. They are high-stakes testing programs
meant to serve different purposes, such as creating awareness about public accountability in
education, providing feedback to institutions, and providing evidence to teachers and parents
that students have acquired the necessary skills and knowledge during their stay on university
campuses.

1.2. Objectives of the Exit Exam Strategic Document

This strategic document has two major objectives:

1) To technically guide how exit examination should be designed, developed, standardized,


administered, scored and reported, and
2) To propose strategies of institutionalization of exit examination in Ethiopia.

1.3. Approaches in developing the strategic document

To develop the strategic document for instituting an exit examination for prospective graduates
of higher education institutions in Ethiopia, the team conducted a desk review of existing
documents, including international and national experiences, interviewed key stakeholders, and
solicited feedback from experts and relevant stakeholders.

1.3.1. Review of existing documents

An exhaustive desk review of relevant documents was conducted to synthesize existing
knowledge, experiences and lessons with regard to exit examinations for prospective university
graduates. The documents reviewed included the following:
• Revised LL.B Exit Examination Guidelines, Education Strategy Center and Consortium of
  Ethiopian Law Schools (Final Version), March 2017;
• Guidelines for Health Professionals Licensure Examination, Ministry of Health, 2017;
• Proclamations (NEAEA, ESC & HERQA);
• Guidelines for developing item banks and item pools (NEAEA);
• Reports on law exit exams and health professionals’ licensure examinations; and
• Reviews of international experiences.

1.3.2. Key informant interviews
The desk review was complemented by data from interviews with key informants. The
interviews focused on identifying existing experiences in planning, developing test items,
administering and scoring. In addition, through interviewing the key stakeholders, the team tried
to figure out the organizational arrangements, structures and operational procedures. The
following data sources were considered for identifying key persons for interview:
• Ministry of Science and Higher Education (5 directors);
• National Educational Assessment and Examinations Agency (NEAEA);
• Higher Education Relevance and Quality Agency (HERQA);
• Ethiopian Higher Education Strategy Center (HESC);
• Ministry of Health, National Health Professionals’ Licensure Examination Directorate;
• Testing Center at the Institute of Educational Research of Addis Ababa University;
• Representative professionals of the six bands, and deans and staff representatives; and
• Higher officials in three public universities from three different generations (Addis Ababa
  University, Addis Ababa Science and Technology University, and Injibara University).

1.3.3. Consultative meetings and workshops with stakeholders

Consultative meetings and workshops with relevant and key stakeholders were conducted to
sharpen the approach and enrich the draft document. The perspectives and relevant inputs of key
stakeholders were considered in order to ensure that our approach was inclusive and relevant.

1.3.4. Expert review and assessment

The core expert team produced the draft inception report for developing a strategic document for
the implementation and institutionalization of exit examinations in Ethiopian higher education
institutions. To maintain a high standard of quality, the document was reviewed by five
professionals in the area, and their constructive comments have been accommodated in this
document.

Landscape of Higher Education in Ethiopia
Although basic education is the prime agenda of the nation, the expansion of higher education is
not a last resort in addressing the issues of equity and equality in education. This is evidenced by
the rapid expansion of the sector in terms of access, as measured by gross and net enrollment
ratios. Table 1 shows the share of undergraduate enrollment across the six bands.

Table 1: Enrollment in Undergraduate Programs by Six Bands for the year 2010 EC

Band                                  No. of undergraduate students        %
Engineering and Technology                          201,427            35.19%
Natural and Computational Sciences                   52,230             9.12%
Health and Medical Sciences                          60,132            10.50%
Agriculture & Life Sciences                          54,378             9.50%
Business and Economics                              130,835            22.86%
Social Sciences & Humanities                         73,417            12.83%
Total                                               572,419           100.00%

(Source: EMIS_MOE, 2010)
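The percentage column in Table 1 follows directly from the enrollment counts, which doubles as a consistency check on the table. A minimal sketch in Python, using only the figures from the table above:

```python
# Recompute the band shares in Table 1 (2010 EC) from the raw enrollment counts.
enrollment = {
    "Engineering and Technology": 201_427,
    "Natural and Computational Sciences": 52_230,
    "Health and Medical Sciences": 60_132,
    "Agriculture & Life Sciences": 54_378,
    "Business and Economics": 130_835,
    "Social Sciences & Humanities": 73_417,
}

total = sum(enrollment.values())
assert total == 572_419  # matches the Total row of Table 1

# Each band's share of total undergraduate enrollment, rounded to two decimals.
shares = {band: round(100 * count / total, 2) for band, count in enrollment.items()}

print(shares["Engineering and Technology"])  # 35.19
```

The engineering share of roughly 35% recovered here is the figure the report relies on when recommending that the phased rollout of the exit examination begin with that band.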
The goal of higher education is explicitly stated in the Education Sector Development Programs
that the Ministry of Education has put in place over the last five consecutive periods. In the
Education Sector Development Program V (ESDP-V), the goal of the sector is stated as follows:

“to produce competent graduates who have appropriate knowledge, skills and
attitudes in diverse fields of study; to produce research which promotes
knowledge and technology transfer based on national development and
community needs; and to ensure that education and research promote the
principles of freedom in exchange of views and opinions based on reason,
democratic and multicultural values” (p.102)

Universities are expected to transcend the bounds of their campuses, and to move beyond locally
based knowledge transfer to preparing graduates to be part of the global academic, research and
market networks. In order to meet the demands of contemporary societies marked by rapid
economic, political, cultural and technological upsurges, universities have both local and
international responsibilities in producing competent graduates armed with the necessary
knowledge and skills. To be competitive nationally, regionally and globally, Ethiopian higher
education institutions in general, and universities in particular, need to mobilize their human and
material resources towards producing citizens who can not only assimilate to the existing
environment but also make a difference, build on the past and create a new future.

Regardless of their year of establishment and program diversity, all Ethiopian universities have
spelled out in their respective senate legislations three core missions of engagement, namely
teaching and learning, research, and community service. This being the case across all higher
education institutions, the landscape has changed dramatically for the better in the last two and a
half decades. In terms of known parameters such as access, equity enhancement, relevance and
quality, research and technology transfer, and community engagement, the sector has undergone
transformation. Access seems well addressed, as indicated by the fast growth of the sector,
where thousands of new graduates are produced.

Table 2: The trend of graduates from 2005 to 2009 EC

                          2005 EC     2006 EC     2007 EC     2008 EC     2009 EC
Total enrolled students   553,848     593,574     729,028     778,766     788,033
Graduates                  79,073      96,981     107,567     127,275     141,700

(Source: EMIS_MOE, 2009)
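The growth trend in Table 2 can be made concrete with year-over-year rates; a short sketch using only the graduate counts from the table above:

```python
# Year-over-year growth in the number of graduates, 2005-2009 EC (Table 2).
graduates = {2005: 79_073, 2006: 96_981, 2007: 107_567, 2008: 127_275, 2009: 141_700}

years = sorted(graduates)
for prev, curr in zip(years, years[1:]):
    growth = 100 * (graduates[curr] / graduates[prev] - 1)
    print(f"{prev}->{curr} EC: {growth:+.1f}%")

# Cumulative change over the period: annual graduate output grew by
# roughly 79% between 2005 EC and 2009 EC.
overall = 100 * (graduates[2009] / graduates[2005] - 1)
print(f"2005->2009 EC overall: {overall:+.1f}%")
```

The steadily growing graduate counts quantified here are what make the cost and logistics of a national exit examination, discussed later in the report, a first-order concern.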

The expansion has yielded a cluster of four generations of universities, depending on their years
of establishment over a period of two decades. Whether these four generations differ in their
performance in terms of the core values and areas of engagement ascribed to them by society
and the ministry is not clear. One could, however, hypothesize that the first and second
generations are likely to outperform the later ones in terms of intake capacity, facilities and
engagements.

The fact that these early-comers are better resourced in facilities and staff mix does not
necessarily guarantee the provision of quality education in these institutions. Yet the academic
culture that has accumulated in these early generations is undoubtedly an asset for relevance and
quality enhancement, research and community engagement, and it obviously trickles down to the
profile of new graduates coming from each stratum of generations. There are now 45 functional public
universities established in three ‘generations’, distributed equitably across the country. As
part of the equity enhancement strategy, emerging regions and zonal administrative structures
have been reached through this expansion process. There are also four private universities and
over 94 private university colleges engaged in higher-level human development endeavors across
the country, as
indicated in the ESDP-V document. The prime triggering factor for this expansion might be
attributed to the federal system of government which eventually entailed decentralization and
perpetuated regionalization of universities. In the ESDP-V program document it is stated that
“The placement of new institutions will be guided by local demand as well as a preference for
equitable access across all regions” (p.102). By law all universities are accountable to the Federal
Ministry of Education. In practice, regional governments seem to exercise their autonomy and
extend their political influence over universities within their sphere of influence. In the new
governance landscape, the role of regional governments, and their contribution to the efficiency
and effectiveness of education in the respective higher education institutions, is not a neatly
delineated issue. To the extent that regional governments are autonomous in the overall economic
and social activities of their regions, they should, in parallel, be held accountable for
results. Accountability requires a clear delineation of authority and responsibility, and
transparent, understandable information on results, both educational and financial.

The relevance and quality enhancement strategy in higher education institutions is to produce
demand-driven, competent graduates through relevant academic programs, sufficient qualified
staff, and proper educational resources and facilities. On the other hand, various stakeholders
have voiced concerns that, given the disproportionate allocation of human and material resources
across the generations, this is unlikely to be attained equally or even comparably across all
universities. There is wide variation among the generations with regard to teachers'
qualification mix (first, second and third degrees) as well as educational resources.

This does not mean that qualifying exit examinations cannot contribute to quality education in
the later generations. Rather, one has to ensure a minimum competency threshold among graduates
despite variations in when the universities were established. All-round preparation and readiness
for the exit examination seem appropriate, especially in universities labeled as third or fourth
generation, before undertaking such high-stakes testing programs. This is partly an ethical issue
and partly a matter of contextualizing the purpose of exit examinations to the prevailing
conditions in the respective universities.

Designing and implementing a national exit examination system in priority fields is a strategy
adopted by the MoE to ascertain the minimum required competencies of graduates and to assure
quality in higher education institutions.

In view of the wide disparity among the four generations of universities in terms of input and
process variables, implementing a uniform testing program such as the exit examination is no easy
task. Unless it is handled properly and the groundwork is laid to bring stakeholders on board,
the aftermath could be damaging. A sound implementation strategy has to be devised, and the
merits and demerits of the options spelled out and properly discussed. Hence, data have to be
solicited from stakeholders on options such as introducing the exit examination phase by phase,
simultaneously for all graduates across all generations, or starting from a few institutions
through a pilot project and gradually building on best experiences over time.

The following are overall strategies for putting the exit examinations in place:

 Clearly explain the purpose of the exit examinations to all stakeholders so that the
process is open, transparent, and sound;
 Provide students with meaningful and useful information and feedback on the objectives
and results of the exit exam that can help them enhance their preparation, performance
and deeper learning during their stay in the undergraduate program;
 Provide institutions with the data they require for the certification of achievement;
 Make the exit exam an integral part of curriculum design, so that meaningful changes
and improvements are made to all competencies and learning outcomes embedded in the
respective disciplines;
 Establish criteria that are clear, explicit and public so that students and staff know what
constitutes threshold and higher standards for achievement, graduation and certification;
 Ensure standards and appropriateness for staff and students; and
 Ensure and demonstrate the psychometric properties of the examination, such as validity,
reliability and consistency.
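Of the psychometric properties mentioned above, internal-consistency reliability is commonly estimated with Cronbach's alpha. The sketch below is purely illustrative — the item scores are invented, and the formula is the standard alpha coefficient, not a procedure prescribed by this report:

```python
# Illustrative sketch: estimating internal-consistency reliability
# (Cronbach's alpha) from item-level exam scores. Data are made up.

def cronbach_alpha(scores):
    """scores: list of examinee rows, each a list of per-item scores."""
    k = len(scores[0])                      # number of items

    def variance(values):                   # sample variance (n - 1 denominator)
        mean = sum(values) / len(values)
        return sum((v - mean) ** 2 for v in values) / (len(values) - 1)

    item_vars = [variance([row[i] for row in scores]) for i in range(k)]
    total_var = variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical scores of five examinees on four items (0/1 marking)
sample = [
    [1, 1, 1, 0],
    [1, 0, 1, 1],
    [0, 0, 1, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
]
print(f"Cronbach's alpha: {cronbach_alpha(sample):.2f}")
```

Values above roughly 0.7 are conventionally taken to indicate acceptable internal consistency, though the acceptable threshold rises with the stakes of the examination.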

Hence, creating awareness among all stakeholders is vitally important. The client groups served
by a system-wide exit examination are diverse. They include the students themselves, the
employers who will hire graduates, and other significant stakeholders in higher education, such
as the teachers who work in the system, the subject reviewers who review it, the parents who
increasingly pay for it, and society in general, towards which the system should be geared.
Although many actors have a stake in higher education programs, students are the major
beneficiaries of well-planned and well-executed exit exams. Such exams enhance and enrich
students' learning experience and lead to qualifications that are valid and appropriate for their
future lives and careers. Research findings show that the extent to which students were cued in
to assessment demands was a strong predictor of their overall performance, and that system-based
assessment was the dominant influence on how students learned, how much effort they put in, and
where they allocated that effort. According to MacFarlane (1992), as cited in Gibbs (2003):

The increase in the extent to which students behave strategically is in part a
cultural and economic phenomenon. Students who work in the evenings to pay off
debts, and who worry about the competition for employment after graduation, tend
to make very careful use of their time and effort. Faced with contexts as powerful
as this, teachers have little choice but to get with the tide and use assessment
strategically. If it is going to have a profound influence on what, how and how
long students study, then it might as well be designed to have an educationally
sound and positive influence rather than leaving the consequences to chance. (p.43)

Moreover, at present students tend to approach assessment by trial and error rather than making
a sustained effort and preparing themselves for system-wide assessment programs. If system-wide
assessment procedures such as exit examinations are introduced, students will exert maximum
effort to succeed and will think prospectively about their future development in their respective
fields.

Employers are stakeholders with a vested interest in such system-wide, institutionally based
assessment procedures. Students' achievements are the principal basis of selection for
employment. Employers wish to make well-informed decisions when short-listing candidates for
interview. They also wish to be able to tell, from assessment data, whether students are going
to turn out to be good employees, and not just subject experts.

Students and parents allege that assessment in higher education institutions is prone to bias,
with faculties tending to be discriminatory in assessing their students. The introduction of
system-wide assessment practices such as exit examinations, led by an accountable institution,
would give parents and students assurance that the processes and practices are fair, equitable
and valid; moreover, students would be well informed about the rules of the assessment
procedures.

The exit examination is by no means a substitute for teacher-made classroom tests. Rather, it is
a means to further empower teachers in their professional practice. Current assessment practices
in higher education institutions are not up to standard, and there is a need to ensure that
assessment, as a major component of the system, serves what it is intended to serve. Faculty
members have a vested interest in making the assessment system robust and valid at a time when
society at large is questioning the standards associated with graduate profiles. With the
increase in the student population in higher education institutions, assessment is becoming an
intolerable burden on staff, and it is not surprising that the assessment innovations most
attractive to lecturers are those which make their work more efficient. Hence, instructors in
higher education institutions stand to be major beneficiaries of system-wide assessment programs
such as exit examinations.

2. Rationale

Why do we need comprehensive, curriculum based exit examinations in Ethiopian universities?


Although the expansion of higher education institutions in Ethiopia, by and large with
harmonized curricula, is an opportunity to create access to higher learning, the adequacy and
comparability of the competence of graduates in the same fields of study across universities
has not been ensured. Introducing a comprehensive exit examination at the end of the academic
program for undergraduate students, or for prospective graduates, would contribute immensely
to quality monitoring. The justifications for introducing the exit examination include, among
others, the following:

1) There is a concern as to whether university graduates are fit for the purpose for which
they were trained, and whether they fulfill the minimum expected learning competencies;
2) Through exit examinations, stakeholders would find it valuable to determine the
equivalence of the competence of graduates in the same field of specialization from
various universities;
3) Universities do not have national-level feedback on how well they are performing,
especially in terms of the quality of their graduates. Providing feedback to universities
on how effective they are in empowering their graduates to be competent and prepared for
the world of work will strengthen the higher education system's checking and monitoring
mechanisms;
4) The introduction of exit examinations in universities will serve a motivational purpose
for both students and teachers, since it enhances student readiness and teacher preparation.

In addition, the study for the Ethiopian education roadmap described the existing law exit exam
and the health professionals’ licensure examination as “excellent practice” and recommended
applying the approach to all other fields, stating that “it is important to institutionalize exit
exam for all first-degree graduates and put as a requirement accreditation of graduate programs
in five years' time after establishment by an independent body” (HESC, 2017). Furthermore, the
introduction of an exit examination for all undergraduate students was planned in the fifth
Education Sector Development Plan (ESDP-V, 2015).

3. Review of existing policy documents and proclamations

The National Educational Assessment and Examinations Agency (NEAEA) was established by Council
of Ministers Regulation No. 260/2012 and mandated to administer national examinations and
assessments in general education. In Article 2.2, the regulation defines administration as “the
process of registration of examinees, preparation, printing, conducting and correction of exam,
consolidation of exam results and declaring and certifying same” (p.6290). Under the
establishment section, the regulation authorizes the agency as an autonomous federal government
office with its own legal personality, accountable to the Ministry of Education.

In Ethiopia, national educational examinations are administered at grades ten and twelve,
marking, respectively, the completion of the first cycle of secondary education and, through the
university entrance examination, the end of the preparatory program. The NEAEA is mandated by law
to administer these two exams, except that the Institute of Educational Research at Addis Ababa
University is delegated by the Council of Universities to develop camera-ready university
entrance examinations, based on the preparatory program curriculum, for students heading to
higher education institutions. Based on the exam results, the agency identifies the students
eligible for promotion to the next educational level.

Pursuant to this regulation and the mandate given to the agency, the question here is whether the
exit exam, being nationwide, is meant to serve national educational goals in conjunction with the
goals of the national education and training policy. The answer is yes. Exit exams are
comprehensive, curriculum-based national learning assessment tools administered to all students
upon completion of the undergraduate program across all public and private higher education
institutions. Hence, unless this task is delegated to a section within the MoE or to a separate
institution, the NEAEA has a stake in the administration of the exit exam. Its wealth of
experience in administering nationwide examinations is an asset that can be extrapolated to this
new setting and help in devising a sound implementation strategy for executing exit examinations.

Education Strategy Center (ESC)

The Education Strategy Center (ESC) is another potential government structure for administering
the exit exam. The Council of Ministers established the center through Regulation No. 727/2012.
In the regulation, under the objectives section, the mandate of the center is stated as follows:

The center shall, with a view to transforming the quality and standards of education and
training of the country to a higher level, have the objectives to carry out research and
studies on education and training and thereby initiate policy and strategy proposals on
the administration of education, education programs, research, and teachers and human
resource development, and serve as reliable national center for data base.

In the regulation, on page 6727, under the section on the powers and duties of the center,
sub-articles 2, 3 and 4 state the following:

 Prepare national qualifications framework for education and training; evaluate its
implementation and report to the ministry;
 Advise the ministry on quality and standards of the national education system and
prepare implementation directives;
 Arrange forums for discussion and public debate among stakeholders on findings of
research and study, and current issues of public concern related to education, and collect
public opinion and submit summary reports to the ministry;

In view of the above-quoted mandates of the center, though not explicitly mentioned, the purpose
and contributions of exit exams are linked to the center's duties and responsibilities.
Large-scale assessments such as exit exams are broad-based assessment procedures that help in
monitoring and providing policy- and practitioner-relevant information on overall performance
levels in the system, changes in those levels, and related or contributing factors. Current
measures of learning will need to expand beyond basic competencies in order to better assess the
relevance of education systems to the world of work and to life in general. Hence, the purpose
of the exit examination falls within the bounds of the mandates ascribed to the center by the
regulation.

The Higher Education Proclamation, proclamation No. 650/2009

The Higher Education Proclamation No. 650/2009 has laid down a legal system to enable higher
education institutions to effect institutional transformation, and thus to serve as dynamic
centers of capacity building consistent with the aspirations of the peoples of Ethiopia in the
context of globalization. The Proclamation spells out articles and sub-articles that have
important bearings on the purpose and implementation of exit exams. These include: the profile
of would-be graduates; institutional autonomy and accountability; the award of academic
qualifications to graduates; delivery of the curriculum; and assessment and the feedback
obtained from it to improve the teaching-learning process in particular and the quality of
education in general. One of the objectives of exit exams is to certify undergraduate students
after completion of a program. Under Article 4, sub-article 1, and Article 8, sub-article 2, the
Proclamation states the objectives and responsibilities of higher education as follows:

1/ prepare knowledgeable, skilled, and attitudinally mature graduates in numbers with
demand-based proportional balance of fields and disciplines so that the country shall
become internationally competitive (Page 4979)

2/ develop programs of study and provide higher education; prepare and supply qualified
graduates in knowledge, skills, and attitudes on the basis of needs of the country; and
award academic qualifications in accordance with its programs (Page 4981)

Institutional autonomy and accountability is one of the objectives addressed by exit exams. The
outcome of any assessment program is to ensure student learning. Whether or not students have
achieved the minimum learning competencies in major learning areas is ascertained through the
feedback obtained from high-stakes assessment procedures of national significance. Accordingly,
under Article 7, Guiding Values of Institutions, sub-article 4, institutional autonomy and
accountability is clearly stated.

Assessment systems tend to be made up of three main activities that correspond to three
information needs: classroom assessments, which provide real-time information to support
teaching and learning in individual classrooms; examinations, which inform high-stakes decisions
about individual students at different stages of the education system (e.g., certification or
selection of students); and large-scale assessments, which monitor and provide information on
overall system performance levels and contributing factors. With regard to these three
activities, the Higher Education Proclamation makes important statements under Article 21,
sub-articles 1 and 6, that have important bearings on curriculum delivery, as follows:

1/ Curricular design, delivery, and assessment of learning outcomes in any institution
shall aim at enabling the learner to acquire pertinent scientific knowledge,
independent thinking skills, communication skills and professional values that
together prepare him to become a competent professional. (Page 4987)

6/ Curriculum delivery shall be research and study-based and up-to-date and the
teaching learning process shall be continuously updated in its design, delivery
methods, and instruments of assessment. (Page 4988)

The degree of congruence between assessment activities and system learning goals, standards,
curriculum, and pre- and in-service teacher training is important for enhancing the quality of
education. System alignment refers to the extent to which the assessment system is aligned with
the rest of the education system. Such alignment matters because it ensures that the information
assessments provide is useful for improving the quality of education in the system, and because
it allows synergies to be created. Exit exams serve this purpose: data obtained from a
high-stakes exit exam are evidence for bringing change to the higher education system and,
ultimately, improving the quality of education.

The Higher Education Relevance and Quality Agency (HERQA)

The Higher Education Relevance and Quality Agency (HERQA), institutionalized through the
Proclamation, is meant to consistently monitor the programs of higher education institutions and
support them in achieving high standards. The relevant provisions are stated under Article 22,
sub-articles 1, 2, and 7, as follows:

1/ Without prejudice to other provisions of this Proclamation and the relevant
regulations and directives, every institution shall have a reliable internal system for
quality enhancement that shall be continuously improved. (Page 4988)

2/ The internal system of quality enhancement of every institution shall provide for clear
and comprehensive measures of quality covering professional development of
academic staff, course contents, teaching-learning processes, student evaluation,
assessment and grading systems, which shall also include student evaluation of
course contents together with the methods and systems of delivery, assessment,
examinations and grading. (Page 4988)

7/ The Ministry, the Centre, and the Agency shall also guide institutional quality
enhancement efforts as well as curricula development through a national qualifications'
framework that shall, as the case may be, determine or indicate core learning outcomes
or graduate competencies. (Page 4989).

4. International and National Experiences of Universities with Exit Exams

4.1. International experiences

4.1.1. State Exam in Germany

The system of exit exams in Ethiopian law schools was adopted from the experience of Germany
(Law Exit Examination in Ethiopia, June 2015). The exit exam in Germany is known as the “State
exam”. German lawyers must take two state exams to be fully licensed: one after completing
university, the second after two years rotating through low-paid internships with law firms,
prosecutors, judges, etc. (Law Exit Examination in Ethiopia, June 2015).

4.1.2. Exit exam in the USA

Experience with exit exams differs from country to country. For instance, the exit exam is not
managed centrally in the USA, as it is conceived and practiced in Ethiopia. At the University of
Alaska, the Department of Psychology requires students to take an exit exam for graduation.
Missouri State University requires seniors to take GEN 499 University Exit Exam. Other
universities in the United States of America using exit examinations for their graduates include
Alcorn State University, Augusta University, Morehead State University, the University System of
Ohio, and Crowder College (graduation exit examination). The following excerpt is informative
about the current discourse on the exit exam as a graduation requirement (see also the Annex for
more on this discourse).

“There is a groundswell from the public about whether a college degree is worth
what people are paying for it,” said Stephanie Davidson, vice chancellor for
academic affairs at the University System of Ohio. “People are asking for
tangible demonstrations of what students know.”

Source: http://time.com/2187/the-new-college-exam-a-test-to-graduate/

4.1.3. Exit Exam in India

India has 398 colleges providing the MBBS degree in the privately run, government- and trust-
run programs. These produce 52,105 graduates every year. Of these, 27,170 are produced by 215
private medical colleges, while almost 25,000 are produced by government colleges, as reported
by the Medical Council of India (MCI). In India, many regulatory bodies in the field of higher
education act as decision making organizations for the approval, accreditation and regulation of
higher education. The Bar Council of India (BCI) conducts an eligibility test for law graduates to
register in the association. The Medical Council of India (MCI) is another regulatory body
responsible for ensuring the quality of medical graduates. The duty of these regulatory bodies is
to ensure a supply of efficient manpower to different systems for the provision of better and
more effective services in the domain concerned. The Medical Council of India, the country's
apex statutory body for medical education, proposed an exit examination to improve the quality
of medical graduates and to ensure better medical services throughout the country. The
deterioration in the quality of medical professionals was the major driving force behind strict
regulation through an exit examination for medical practitioners. The MCI proposed a radical
plan for an exit exam for graduating MBBS doctors that would also test their suitability to
pursue postgraduate studies. Passing this examination would further enable practitioners to
register anywhere in India without being limited to a specific region. The opportunity to
practice across the country was considered an incentive for meritorious candidates. Candidates
who failed the examination would be entitled to practice only within the jurisdiction of the
state in which they were registered earlier.

There were arguments both for and against the MCI proposal. Some argued that the new system
would succeed only in bringing a large number of unqualified candidates into the states, simply
because merely sitting for an exit exam cannot guarantee that graduates have the competency
required to practice the profession. Others argued that measuring graduates’ competency with a
centrally administered exit examination would force training institutions to follow a uniform
syllabus, implement similar pedagogical systems, and teach to the test rather than focus on
important learning outcomes. They argued that the exit examination would compromise the
independence of universities and boards, forcing higher education institutions to abandon
innovative teaching practices in favor of a uniform syllabus that prepares their undergraduate
medical students to qualify for the exit examination and thereby preserve their prestige.
Because medical practice requires multiple skills and the capacity to serve across different
cultures and contexts, it is very difficult to assess graduates’ competencies in such a wide
range of areas using exit examinations. According to Dehuri & Samal (2017):

The quality of manpower is not necessarily measured by uniformity in approach,
which is what the exit examination is likely to breed. Rather, diversity plays an
important role in the measurement of the parameters of quality. Medical education
is not only a matter of training students to treat patients or for specific tasks. It has
a great role to play in solving the challenges of public health, creating social
values, as well as research and development. Hence, we must give due
consideration to the option of developing an open system that encompasses the
various emerging areas related to the medical sciences (p.2)

Medical education in India is imparted mostly by the state governments and the private sector;
very few institutes are run by the Central government. Although there is a uniform mechanism for
assessing the minimum quality of medical professionals across medical colleges in order to earn
recognition by the MCI, there is no provision for output-based efficiency measures. There are
wide differences in infrastructure and manpower across the various medical colleges. Central
government colleges enjoy much better funding than the provincial ones. Medical students
enrolled in colleges with teaching hospitals have better opportunities for exposure to the
clinical management of medical conditions and to demonstration and clinical practice, which
invariably reflects on the quality of these graduates. An overall evaluation shows that
government-run medical colleges with teaching hospitals in India provide a wider range of
medical exposure than privately managed ones.

The medical exit examination differentiates between medical graduates on the basis of cut-off
marks. One criticism is that such cut-off grading techniques may indirectly encourage students
to cram and reproduce information in the qualifying examination rather than internalize
knowledge by understanding concepts. Rather than relying entirely on the results of exit exams,
researchers suggested that it is advisable to enhance the quality of education by setting a high
standard at the entry level and enforcing it throughout the medical curriculum.

There is wide disparity in the performance of students across the medical and technology
institutes. The distribution of successful students is skewed against some states. There is a
similar trend in the case of the rural–urban divide and the state boards versus the Central
boards. Hence, before introducing the exit examination there is a need to normalize these
anomalies, and solutions must be sought to bridge the gaps. Professionals suggested that instead
of selecting candidates on the basis of absolute scores in entrance examinations, account should
be taken of gender, the deprivation index, rural–urban factors and state board weightage. As a
solution, expert panels were set up to draft the methodology for the inclusion of such factors.

The MCI’s exit exam was meant to be used as a qualifying standard for pursuing postgraduate
degree courses approved by the Council, and it followed the pattern of the postgraduate entrance
examination. However, it was criticized for its weak orientation towards the clinical and
practical approach necessary to serve the public. As mentioned earlier, the examination was also
criticized for encouraging the memorization of facts rather than the internalization of the
skills required for treatment.

On the basis of various studies and observations, independent agencies concluded that the MCI
could not achieve its goal, given its present structure. Assessments showed that the MCI
approach may not work for the very reason that there is a dire shortage of doctors in rural
areas. The Government of India instead considered the establishment of a National Medical
Commission to replace the MCI, and opinions were sought on the Draft National Medical Commission
Bill, 2016. The focus is on improving the quality of doctors through strict monitoring at the
time of their entry into medical colleges and during the study period, irrespective of the
regulatory authority. Moreover, the quality of medical education can be improved by avoiding the
privatization and commercialization of medical education, by helping training institutes to work
on ethical grounds, and by focusing on the quality of manpower.

4.1.4. Exit Exam in the United Arab Emirates

All degree-seeking graduates are required to take the exit exam at the United Arab Emirates
University (UAEU). The College of Engineering (COE), in accordance with standards established by
the Accreditation Board for Engineering and Technology (ABET), has established
assurance-of-learning educational standards and specific performance indicators that evaluate
how well the college fulfills its educational objectives. In order to implement this initiative,
the COE has introduced an exit exam requirement for all its running programs. The purpose of the
exam is to demonstrate accountability (through ABET accreditation) and to assist faculty members
in improving programs and courses.

At the COE, the exit exam carries a weight of 5% of the grade in the final graduation
(final-year) project. Each semester the exit exam is taken by approximately 200 students from
the COE's five departments, namely the Department of Architectural Engineering; the Department
of Chemical and Petroleum Engineering; the Department of Civil and Environmental Engineering;
the Department of Electrical Engineering; and the Department of Mechanical Engineering. These
departments follow several procedures to prepare students for the exit exam.

Most faculty members contribute to the exam by suggesting helpful questions. Generally
speaking, the exit exam is made up of a balance of questions, usually collated from the past
years' courses of study in the specific department. The problems allocated to each department
must be relevant to its area of expertise. The UAEU engineering curriculum focuses on a balance
of knowledge and skills that prepares graduates to analyze and design engineering systems and
to become technical leaders in their fields, and it provides students multiple opportunities to
demonstrate these skills through hands-on laboratory experience, project-based learning relevant
to their disciplines, design challenges, and professional activities. The engineering applications
are founded on a solid core of scientific and engineering knowledge that instills engineering
sense, and the teaching philosophy emphasizes cooperative and collaborative teaching and
learning, with emphasis on individual and group activities, effective communication, professional
responsibility, self-directed and lifelong learning, and teamwork.

The ABET program outcomes are defined as:


a. an ability to apply knowledge of mathematics, science and engineering
b. an ability to design and conduct experiments, as well as to analyze and interpret data
c. an ability to design a system, component, or process to meet desired needs within
realistic constraints such as economic, environmental, social, political, ethical, health
and safety, manufacturability, and sustainability
d. an ability to function on multidisciplinary teams
e. an ability to identify, formulate, and solve engineering problems
f. an understanding of professional and ethical responsibility
g. an ability to communicate effectively (orally and in writing)
h. the broad education necessary to understand the impact of engineering solutions in a
global, economic, environmental, and societal context
i. a recognition of the need for, and an ability to engage in life-long learning
j. a knowledge of contemporary issues
k. an ability to use the techniques, skills, and modern engineering tools necessary for
engineering practice.

In line with the worldwide trend, computer-based online exams were opted for, for the following
reasons:
1) Multiple versions of the exam can be distributed without having to manually track
which students received which tests.
2) The performance of the group can be evaluated quickly.
3) Less time and effort are needed.
4) Question styles, including graphics, can be mixed and made more interactive than
on paper-based exams.
5) Human errors in grading can be eliminated.
6) Paper is saved.

The online exam has been conducted under the following conditions:
a. The exam can only be taken once.
b. All exiting students take the exam simultaneously.
c. The total number of questions ranges from 50 to 100, depending on the program.
d. Once started, the test must be completed in one sitting.
e. The time limit is two hours.
f. A password is required to access the test.
g. One question is normally presented at a time.
h. Changing the answer to a question that has already been submitted is not allowed.
i. Questions are randomized.
j. No smartphones or other electronic devices are allowed in the examination hall.
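Conditions (g) and (i) above, one question at a time in randomized order, are standard features of computer-based delivery. The following is a minimal illustrative sketch, not the UAEU/Blackboard system itself; the student IDs, seed and question pool are invented for the example:

```python
import random

def randomized_order(question_ids, student_id, exam_seed):
    """Return a per-student randomized question order.

    Seeding the generator with the exam seed and the student ID makes
    each student's ordering reproducible for auditing, without manually
    tracking which student received which version of the exam.
    """
    rng = random.Random(f"{exam_seed}-{student_id}")
    order = list(question_ids)
    rng.shuffle(order)
    return order

# Hypothetical five-question pool and two candidates.
pool = ["Q1", "Q2", "Q3", "Q4", "Q5"]
a = randomized_order(pool, student_id=1001, exam_seed=2019)
b = randomized_order(pool, student_id=1002, exam_seed=2019)

assert sorted(a) == sorted(b) == sorted(pool)   # everyone sees the same questions
assert a == randomized_order(pool, 1001, 2019)  # ordering is reproducible
```

Randomizing only the order, not the content, keeps the exam identical for all students while making answer copying harder.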

The exit examination tests students at the end of their program of study for attainment of the
program's intended learning outcomes. It covers one or more program-level outcomes, not
course-level outcomes. Because the test is used to determine whether the COE has met ABET
accountability standards, all students are required to take the exam. A set of questions is
normally identified covering the fundamentals and highlighting key concepts in engineering
from the past years' courses. These questions stress the basic and fundamental knowledge that
any engineer should possess before starting to practice. A smaller number of more advanced
questions is included to distinguish between strong and average students. The number of
questions varies from department to department but is normally kept consistent across years.
Some exams are reused from past years, on the condition that no question has been revealed to
the next batch of students. The exam is conducted on the same day, and the same questions are
given to everyone on campus in the presence of proctors. The score on this exam should reflect
the student's performance. The exam tests the knowledge, skills and attitudes students have
gained throughout the courses of a specific program. Students are expected not to study
specifically for the exam, but to put a good-faith effort into doing their best.
The graduate exit exam is a requirement for degree completion. It is mandatory and is
administered on campus. It is offered twice a year, in the fall and spring terms, and is
normally scheduled at least three weeks before the graduation deadline.

Immediately after the exit exam, students' feedback about the exam and their reflections were
collected. Many students do not see any benefit in the exam, since it carries only 5% of their
final grade in the final graduation project course. Some students believe that the exam should
first be taken by the instructors themselves, to see how many of them can pass. Others believe
that the exam is a good chance to be reminded of what they have studied in their last four years.
Because the exam is computer-based and has been designed and integrated into the
Blackboard-UAEU online education system, the built-in grading system can display each
question with its multiple-choice options along with the percentage weight of each option. This
helps in detecting how many students chose the right answer and how many chose each of the
other options.

The exam provides departments, colleges and universities with detailed, up-to-date feedback,
which helps in developing the program and its courses. Once the students take the exam, the
results are analyzed, examined and discussed extensively to identify strengths and weaknesses
and to pinpoint areas for development in the academic programs, or even for the introduction of
new programs and courses. As a consequence, these detailed results allow colleges and
departments to identify domains in which students excel and those which need improvement and
better assessment. The expansion of exit exams can be attributed mostly to standards-based
reform, and college administrators are the main drivers. Standards have provided a solid,
reliable foundation for the concept of exit exams by setting out what students should know and
be able to do by the time they graduate from engineering schools. College administrators have
an ongoing responsibility to closely monitor the implementation of exit exams. They must
understand the effects of these tests, including any negative or unexpected consequences, so
that they can address problems or adjust policies. Results of the exit exams could be used by
universities in the evaluation of their instructors and faculty members over a given batch. This
evaluation may affect the academic promotion of faculty members, which would push instructors
to do their best to explain course content well, significantly improving students' performance
and proficiency. Questions on the exit exam are mapped to the corresponding courses within
which the material was covered.

Needless to say, a good instructor is one who helps students retain as much of the necessary
basic knowledge as possible. The exit exam aims at measuring students' attainment of the
program learning outcomes as well as their performance in the individual courses/domains
relevant to their specialization. This is highly important in higher education, as it serves as a
point of reference for program enhancement. Students realize the importance of the exam when
it is taken seriously. Exams of this kind motivate students to work harder and help teachers
identify and address students' weaknesses.

ABET accreditation now requires students at engineering schools to pass an exit exam before
graduating, a key element of standards-based accountability reforms. The objectives of this test
are as follows:
1) To make sure that all courses are completed expeditiously and are taken before
other courses that may have them as prerequisites; if the time limit is not met,
the student may be disenrolled.
2) To use the exit exam results to introduce necessary adjustments to teaching and
learning processes.
3) To assess functions to improve student learning, and to discover course-embedded
assessment models and contemporary approaches to curriculum design, teaching
methods, and assessment.
4) To benchmark measures of excellence, which will help to improve department
services and operations by tracking several measurable parameters over the years.
5) To measure the quality of the engineering programs.
6) To provide data and information for decision-making processes.
7) To learn how assessment strategies can form the groundwork for improved
"assessment".
8) To map and provide "backwards" feedback for curriculum design and development.

The study by Ahmad et al. (2014) investigated the impact of the implementation of exit exams
on the educational and learning process, and assessed the importance of exit exams as a quality
indicator for academic program reviews and for benchmarking. To improve the exit exam
results, the study suggested the following:
a) The contribution percentage of the exit exam should be increased from 5% to 10%.
b) Students must repeat the exit exam until they pass.
c) A minimum score level for passing the exit exam should be defined.
d) After the exam, a statistical analysis of the results should be used to compare the new
test to the benchmark set.
e) It is necessary to rewrite the exam periodically to maintain security.
f) The exam scores of students should appear on their transcripts.

The study concludes that exit exams can be vital to improving the quality and effectiveness of
academic programs. These benefits can help improve the quality of programs across colleges at
the UAEU as well as at other institutions within the UAE. Moreover, this creates a good area
for cooperation between academic departments, since they can compare results and work
jointly to improve the quality of higher education as a whole. Coaching or preparing students
specifically for the test is a problem and should be avoided. In the end, ABET and
degree-accreditation agencies have strong reasons to keep track of the impacts of exit exams.
It is worth adding that in any program development cycle evaluation is ongoing, and one should
not wait for the completion of the program or the course to introduce changes. Other elements,
such as the teaching skills of faculty members, may need to be considered as part of the success
of the test. The personal worries and anxieties that students may have at the end of the course
are also factors that should be taken into account.

4.2. Ethiopian experience on Exit Exam

4.2.1. Exit exam in Ethiopian Law Schools

The idea of an exit exam in Ethiopia was initiated by Tsegaye Regassa’s Concept Note in 2009
and has been implemented since the 2010/11 academic year. The objectives of the exit exam in
Law schools are stated in a study (Evaluation of the Law Exit Examination System in Ethiopia,
June 2015) as
“monitoring whether the graduate profile of LL.B curriculum has been achieved; monitoring
levels of achievement in the learning outcomes of courses under the LL.B curriculum;
facilitating the efforts of students to revise the core learning outcomes of the courses they have
taken during their years of legal education; safeguarding prospective employers from loose and
inflated grading systems that might lead to graduation at a standard below the threshold required
in the graduate profile and learning outcomes of an LL.B Program; and creating a constructive
competitive spirit among law schools in Ethiopia with a view to encouraging them to give due
attention to the quality and standards of legal education; imposing some standards on the way
law instructors teach individual courses.” The above study includes the following observations in
its background:
 The representativeness, clarity and quality of the questions have not helped the
legitimacy of the exit exam as an institution or of the organ administering it. Many
stakeholders have criticized the exit exam questions for their excessive linguistic and
conceptual complexity and for their lack of representativeness of the core courses, as
well as of the topics within each course.
 A considerable number of law schools in the country also give little attention to
preparing their students for the exit exam.
 The quality of extension, summer and distance students, and of the education they
receive compared to regular program students, leaves much to be desired.

Table 3: Trends¹ of Exit Exam Results of Prospective Graduates of Law Schools in Ethiopia

Year (EC)   Detained           Passed             Total
2006        1,956 (59.5%)      1,329 (40.5%)      3,285 (100.0%)
2007        1,361 (52.8%)      1,215 (47.2%)      2,576 (100.0%)
2008        1,519 (48.8%)      1,596 (51.2%)      3,115 (100.0%)
2009        1,751 (58.4%)      1,246 (41.6%)      2,997 (100.0%)
Total       6,587 (55.0%)      5,386 (45.0%)      11,973 (100.0%)

The figures in the preceding table show that, of the nearly twelve thousand Law school students
who sat for the exit examinations in the years 2006-2009 (EC), only about 45% achieved pass
marks; the majority, 55%, did not. The implications for the envisaged exit exam, in terms of the
expected pass rate, are immense.
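The pass rates in the table can be recomputed directly from the counts. The short check below is only an arithmetic sketch using the figures reported above:

```python
# Detained/passed counts for the years 2006-2009 (EC), from Table 3.
results = {
    2006: (1956, 1329),
    2007: (1361, 1215),
    2008: (1519, 1596),
    2009: (1751, 1246),
}

total_passed = sum(p for _, p in results.values())
grand_total = sum(d + p for d, p in results.values())

# Per-year pass rates.
for year, (d, p) in results.items():
    print(f"{year}: {100 * p / (d + p):.1f}% passed")

# Overall pass rate across the four years (about 45%).
print(f"Overall: {100 * total_passed / grand_total:.1f}% passed")
```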

4.2.2. Exit exam as a licensure examination for health professionals

The Federal Ministry of Health (FMOH) launched the National Licensing Examination for health
professionals for the first time on July 8, 2015. It was initially introduced for graduates of
four selected cadres, i.e., Medicine, Health Officer, Midwifery and Anesthesia, to independently
verify the competence of all graduates in their respective fields, and it gradually came to
include other cadres as well. In the 2018/19 testing program, it is planned to administer the
examination in seven different health specialization areas to about 13,000 new graduates from
public and private higher education institutions.

¹ Source: Belay Hagos (2018). Trends of Results in Exit and Licensure Exams in Ethiopia.
Unpublished, Institute of Educational Research, Addis Ababa University.

The national licensure exam initiative was established with the
mission of identifying minimally competent health professionals, to ensure the safety of the
public through standardized assessment. The health professionals’ licensure examination can be
considered a requirement for graduation, since candidates take the test before they graduate.
The specific objectives are to:

 Provide objective assurance that all those joining the health work-force have reached a
common standard, and thereby help to protect patients.
 Bring consistency to the assessment of outcomes of all graduates from Ethiopian Medical
Schools (both accredited private institutions and public training institutions).
 Serve as a requirement for both Ethiopian and foreign graduates for registration, for
issuing license to practice or, employment in the health system.
 Determine the suitability of a candidate to be trained as a future specialist-consultant
capable of practice in his/her specialty at the highest competency level.
 Drive up standards in Ethiopian medical schools, as failure rates on the licensure
examination will be a significant driver for quality improvement.
 Give particular assurance to key interest groups, notably members of the public who
express concerns about aspects of the competence or behavior of new graduates.

The major tasks in this high-stakes testing program included rigorous review of the exam
blueprint and exam items by testing experts, psychometricians and subject specialists;
administering the exams; and scoring and reporting the outcomes to different stakeholders. The
expected outcomes of this nationwide high-stakes testing program include capacity development
through workshops and training forums; the development of high-quality test items that measure
higher-order learning outcomes in the specified testing areas; the preparation of camera-ready
examination booklets; the setting of standardized and uniform administration procedures in
selected testing centers; item analysis and item banking; and the communication of feedback to
higher education institutions, which would serve to improve the quality of education and
support research.

The examination is designed in such a way that:

 It measures comprehensive medical knowledge, clinical competency and performance
in a multidisciplinary and integrated manner.
 The MCQ examination focuses on basic and applied medical knowledge across a wide
range of topics and disciplines, involving understanding of disease process, clinical
examination, diagnosis, investigation, therapy and management, as well as on the
candidate’s ability to exercise discrimination, judgment and reasoning in distinguishing
between the correct diagnosis and plausible alternatives.
 The clinical examination aims at assessing the clinical competence and performance of
the candidate in terms of his or her medical knowledge, clinical skills and professional
attitudes for the safe and effective clinical practice of medicine in Ethiopia. Specifically,
the Objectively Structured Clinical Examination (OSCE) assesses a candidate’s capacity in
such areas as history taking, physical examination, diagnosis, ordering and interpreting
investigations, and clinical management and communication with patients, their families and
other health workers, although this may not be applicable for 2010 EC testing.

The Ethiopian National Licensure Examination for Health Professionals is a collaborative project
jointly designed by the Health Professionals Licensure Examinations Directorate (HPLED) and
the Institute of Educational Research (IER) of Addis Ababa University, based on the
Memorandum of Understanding signed between the two institutions. The two institutions
mobilized a huge amount of material and human resources in implementing the licensure
examination, and useful experiences have been documented, despite problems related to
infrastructure and stringent financial regulations that constrained optimum implementation of
the project. It is believed, however, that the wealth of experience these two institutions have
accumulated in planning and administering this high-stakes examination will help to
institutionalize the higher education exit examination across all disciplines and
universities in the nation.

Table 4: Results of Prospective Graduates of Health Professionals in Ethiopia (2016/17)
(N = 9,128; Public HEIs n = 3,991; Private HEIs n = 5,137)

Cadre            Cut-off     Public: Number   Pass rate    Private: Number   Pass rate
Anesthesia       45.90%      96               89.58%       ---               ---
Health officer   52.92%      1,304            66.56%       2,902             1.14%
Medicine         54.00%      403              45.69%       ---               ---
Midwifery        51.56%      1,055            75.92%       39                84.62%
Nursing          53.47%      1,133            19.86%       2,196             0.0%

(Source: Belay Hagos (2018). Trends of Results in Exit and Licensure Exams in Ethiopia.
Unpublished, Institute of Educational Research, Addis Ababa University)

The data above indicate that a very high number of failures is possible in the anticipated exit
examinations.
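The cut-off column in Table 4 functions as a simple threshold on each candidate's percentage score. A minimal sketch follows, assuming a pass means meeting or exceeding the cut-off; the cut-offs are the published 2016/17 values, but the candidate scores are invented for illustration:

```python
# Cut-off scores per cadre (2016/17), from Table 4.
CUTOFFS = {
    "Anesthesia": 45.90,
    "Health officer": 52.92,
    "Medicine": 54.00,
    "Midwifery": 51.56,
    "Nursing": 53.47,
}

def passed(cadre, score_percent):
    """Assumed rule: a candidate passes when the score meets or exceeds the cadre's cut-off."""
    return score_percent >= CUTOFFS[cadre]

# Invented example scores.
print(passed("Medicine", 61.0))   # above the 54.00% cut-off -> True
print(passed("Nursing", 48.5))    # below the 53.47% cut-off -> False
```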

The experience of introducing exit exams in Law schools and licensure examinations for health
professionals showed, first, that candidates from public institutions and regular undergraduate
programs performed better than students enrolled in private higher education institutions and in
distance, evening and summer programs. Second, student performance on exit examinations has
improved over the years, showing that instructors and students are responding to the demands
imposed by exit examinations, thereby contributing to improved practice. On the other hand,
the large number of students who could not achieve the pass mark is a burden on the system,
society and the individual student, which requires a systemic solution rather than reliance on
the exit examination alone.


Steps and procedures followed in the National Health Professionals Licensing Examination

The Ethiopian Federal Ministry of Health has been working to administer a National Licensing
Examination that aims to protect the public from inadequately trained health professionals. The
National Licensing Examination is designed as a comprehensive test of medical knowledge,
clinical competency and performance. Both multiple-choice questions (MCQ) and clinical
assessments (OSCE) are used to implement the examination. One of the critical steps taken by
the National Health Professionals Licensing Examination Directorate (NHPLED) was to produce
implementation guidelines covering all activities, from the planning stage through test
administration, scoring and interpretation of the licensure examination.

The guidelines were prepared and organized by different experts selected from the National
Health Professionals Licensing Examination Directorate, the Legal and Justice Directorate, the
Human Resource Development Directorate, different universities, Centers of Competence, the
Ministry of Education and professional associations. The draft guidelines consist of seven
types, namely: the Implementation Directive (rules and regulations); Exam Development; Exam
Administration and Management; Registration; Examiner; Candidate; and Assessment Center
guidelines, along with the duties and responsibilities of individuals. The methodology adopted
in producing the guidelines included awareness-creation presentations, the formation of
taskforces for developing the components of the guidelines, group presentations and panel
discussions. The guidelines were comprehensive in the sense that they elaborate general
guidelines and binding principles, as well as the duties and responsibilities of the
implementing bodies, especially exam developers and administrators.

Guidelines for blueprint developers, item developers, exam reviewers, invigilators, and the
expert team that determines the cut-off passing grade, as well as the duties and responsibilities
of local test center coordinators, supervisors and timekeepers, are included. Also included are
procedures for determining the content areas to be covered, the disciplines or areas of training,
the candidates eligible to sit for the licensure examination, the examination types, and the
issuance of licenses; these and other implementation guidelines were prepared and endorsed.

The exam development guideline was meant to assure the psychometric properties of the exams,
such as reliability and validity, by following important steps such as job analysis, exam
specifications and the exam development process, applying an expert-made blueprint (table of
specifications). This is an important document that defines the purpose and focus points of the
licensure examination, the roles and responsibilities of exam development stakeholders, the
exam development process, the exam review process, pilot testing, the development of the exam
bank, exam assembly, standard setting and exam security procedures. The exam administration
guideline spells out detailed procedures to be followed during the administration phase, such as
pre-exam administration activities, MCQ administration, OSCE administration, post-exam
administration procedures, quality assurance mechanisms, and the roles and responsibilities of
the exam development team.

The registrar manual gives direction on the standardized registration process of the NHPLE,
including topics such as releasing results and appeals, documentation and certification, and the
communication and code of conduct to be followed in due process. The examiner manual is
another document developed with the purpose of providing the necessary guidance particularly
to the examiner during the implementation of National Licensing Examination. Roles and
responsibilities of the examiner, roles and responsibilities of the MCQ Examiner, roles and
responsibilities of the OSCE examiners, examiners’ code of conduct and certification are
delineated.

The candidate manual provides candidates with information on eligibility requirements, the
application process, ongoing responsibilities to maintain certification, and a variety of other
important topics. OSCE and MCQ exam application and registration, preparation for
assessment, absenteeism or lateness for an appointment, withdrawal, conduct, dress code,
arrangements for candidates with disabilities, general exam and test center information,
pregnant or breastfeeding (baby under 26 weeks of age) candidates, and grounds for dismissal
or cancellation of exam results are addressed in this guideline. Protocols in the event of
suspected cheating, the passing standard (pass/fail), appeal procedures and the soliciting of
feedback, re-exam procedures, and the disclosure of exam results are discussed in the
assessment center manual. The purpose of the assessment manual is to identify the different
job-related competencies of candidates so that competent health care workers can be selected.
The contents of this manual include assessment center standards, center rules and regulations,
the accreditation of assessment exam centers, and the duties and responsibilities of assessment
centers.

The preparation of the National Licensure Examinations is a process comprising different tasks
to be accomplished in different phases of the project. This is a joint project administered by the
Ministry of Health and the Institute of Educational Research, Addis Ababa University; each of
the collaborating partners has defined roles and responsibilities based on the memorandum of
understanding signed. As per the project proposal, the major activities to be accomplished in
implementing the project are summarized and presented below.

 Carrying out task analysis
 Preparing the exam blueprint
 Organizing candidate registration data
 Preparing training manuals and guidelines
 Training item writers, assessors and the standard-setting team
 Developing exam items
 Editing and psychometric review of items
 Standard setting
 Training exam supervisors/invigilators
 Exam administration
 Scoring and item analysis
 Certifying for a license
 Giving feedback to the health science colleges and stakeholders
 Developing test administration manuals and guidelines
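The "scoring and item analysis" step listed above typically computes, for each item, a difficulty index (the proportion of candidates answering correctly) and a discrimination index (the correlation between the item score and the total score on the remaining items). The sketch below illustrates these two standard statistics on an invented 0/1 response matrix; it is not the actual IER procedure:

```python
def point_biserial(x, y):
    """Pearson correlation between a 0/1 item score and a numeric total."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    sx = (sum((a - mx) ** 2 for a in x) / n) ** 0.5
    sy = (sum((b - my) ** 2 for b in y) / n) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

def item_analysis(responses):
    """responses: list of per-candidate lists of 0/1 item scores.

    Returns (difficulty, discrimination) per item, where discrimination
    is the correlation between the item and the corrected total score
    (the total on the remaining items).
    """
    n_items = len(responses[0])
    totals = [sum(r) for r in responses]
    stats = []
    for j in range(n_items):
        item = [r[j] for r in responses]
        p = sum(item) / len(item)                     # difficulty index
        rest = [t - i for t, i in zip(totals, item)]  # corrected total
        stats.append((p, point_biserial(item, rest)))
    return stats

# Invented 4-candidate, 3-item example.
resp = [[1, 1, 1],
        [1, 0, 1],
        [0, 1, 0],
        [0, 0, 0]]
for j, (p, rpb) in enumerate(item_analysis(resp), 1):
    print(f"Item {j}: difficulty {p:.2f}, discrimination {rpb:+.2f}")
```

In this toy data, item 2 has zero discrimination: high and low scorers are equally likely to answer it correctly, which is the kind of flag item analysis is meant to raise.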

A Steering Committee composed of professionals drawn from the Ministry of Health and the
Institute of Educational Research of Addis Ababa University oversees the implementation of the
aforementioned activities. Major activities in undertaking the project include the procurement of
testing-related materials and the preparation of manuals, guidelines and directives, such as
exam directives, the test administration manual, test security guidelines and agreements, the
test development manual, and the item review and selection manual; formats applicable to
associate supervisors and center supervisors and for irregularities, attendance, test security
and test material inventory are all in place. This is followed by the revision/development of
test blueprints for the specified cadres (areas); the development of these tables of
specifications is carried out by experienced subject specialists drawn from higher education
institutions that run health-related programs.

Test developers in the respective areas are recruited based on their merit and experience in
teaching in higher education institutions and health centers. Twenty professionals for medicine
and ten professionals for the other cadres (disciplines) are selected and brought together on
the premises of the Institute of Educational Research. High-caliber health professionals and
psychometricians working in the Institute of Educational Research work alongside these item
developers, and every item is evaluated against a checklist developed for this purpose. This is
a crucial stage in item development, and a rigorous psychometric assessment of the quality of
the items is carried out. Test developers are encouraged to write as many items as possible,
based on the test blueprint in their respective areas, and an item pool is created. It is from
this pool that items with better psychometric properties are selected and become candidates for
inclusion in the final version of the exam.

At least five professionals in each cadre (knowledgeable in measurement and evaluation) take
on the assignment of reviewing and selecting/assembling the items. The next step is item layout
and booklet formation, in which two persons per testing discipline are engaged.

Prior to activities related to test blueprint preparation and item development, a series of
workshops is organized for these professionals. For example, a validation workshop with
stakeholders on the suitability of the manuals, guidelines and directives; a workshop for task
analysis professionals; a workshop for the subject specialists revising/developing the test
blueprints; and a workshop for test item development experts are organized so that the experts
involved in these tasks clearly understand their roles and responsibilities.

4.2.3. The experience of Testing Center at Addis Ababa University

For the past several years, the Testing Center of the Institute of Educational Research (IER)
of Addis Ababa University has been providing testing services to many governmental and
non-governmental organizations (NGOs), including Addis Ababa University's Department of
Human Resource Management (HRM). The services of the Center include the development,
administration, scoring and reporting of results for personnel selection, training and
recruitment tests; the administration of international tests such as the Test of English as a
Foreign Language (TOEFL), the Graduate Record Examination (GRE), the Association of
Chartered Certified Accountants (ACCA) examinations, the Scholastic Aptitude Test (SAT) and,
very recently, the National Licensure Examination for Health Professionals; and research and
training in the area of testing and measurement. Moreover, the Center has coordinated the
development of the Ethiopian University Entrance Examinations (EUEE), previously named the
Ethiopian School Leaving Certificate Examination (ESLCE), since August 2002, based on the
mandate given to the Institute of Educational Research, Addis Ababa University, by all higher
learning institutions and the Ministry of Education.

IER, in collaboration with the Ethiopian Federal Ministry of Health, has been working to administer a National Licensing Examination designed as a comprehensive test of medical knowledge, clinical competency and performance. Multiple-choice questions (MCQ) have been used to implement the licensure examination. The National Health Professionals Licensing Examination is meant to ensure that new health professional graduates from all Ethiopian higher education institutions have the minimum competency before they engage in care services in the country. The overall purpose of the licensing examination is to identify persons who possess the minimum basic knowledge and experience necessary to perform tasks on the job safely and competently, not to select the top candidates. In this nationwide high-stakes program, IER has played a pivotal role in test development, printing, packaging, and the deployment of the workforce for administering the examination, and moreover in carrying out all administrative routines. The institute was able to administer the licensing examination to 10,100 graduates across five disciplines (i.e., anesthesia, health officer, medicine, midwifery, and nursing) trained in both public and private higher education institutions.

The Testing Center of IER at Addis Ababa University has been rendering quality testing services to customers and stakeholders locally and to international agencies such as the Educational Testing Service (ETS) and ACCA. Toward this end, the center has demonstrated test integrity with the utmost ethical standards in administering tests and in shipping test materials in a timely and confidential manner. Accountability is its guiding principle in delivering quality service to customers.

The Testing Center's contributions since its establishment in 1983, its wealth of experience in test development and administration, the satisfaction of the various stakeholders that have received testing and measurement services from the unit over the last decades, and the extent to which the Testing Center and IER are engaged in community outreach services, as stipulated in the Addis Ababa University 2011 Senate Legislation and Strategic Plan, would enable it to take the leading role in the development and administration of the exit examination for undergraduate students.

5. Instituting Exit Examinations in Ethiopia

5.1. Procedures for Implementing University Exit Exam

The following four phases are required for the implementation of the exit examination:

Phase one: preparation

The preparation phase is a very crucial step in implementing the exit examination.

1) Organize awareness creation forums for stakeholders;
2) Set up a Steering Committee composed of professionals drawn from MoE, universities, and professional associations;
3) Have all those involved in the testing process sign a TEST SECURITY agreement;
4) Establish learning/educational standards and specific performance indicators, including task analysis as appropriate;
5) Develop test blueprints (tables of specification);
6) Produce implementation guidelines that foresee all activities, from the planning stage through test administration to the scoring and interpretation of the exit exam;
7) Procure testing-related materials, including stationery;
8) Select test developers in the respective areas based on their merit and experience teaching in higher education institutions; and
9) Select high-caliber professionals and psychometricians and align them with the item developers.

Phase two: item development and standard setting

1) Provide training to relevant stakeholders on key concepts in exam development, administration, scoring, reporting and test security;
2) Organize a series of item development workshops and develop and compile as many items as planned;
3) Build a repertoire of test items produced by the item writers, i.e., an item pool;
4) Accumulate items with acceptable psychometric characteristics in the item bank;
5) Select items to form an exam booklet;
6) Work out the item layout and finalize the booklet formation;
7) Conduct standard-setting workshops to determine the cut-off points for each field.
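Standard-setting workshops commonly derive cut-off points with judge-based procedures; the modified Angoff method is one widely used option. The report does not prescribe a particular method, so the sketch below is illustrative only (the judge labels and ratings are invented). Each panelist estimates the probability that a minimally competent graduate would answer each item correctly, and the cut score is the mean of those estimates:

```python
# Minimal Angoff-style sketch (illustrative data, not from the report).
# Each judge rates the probability that a borderline (minimally
# competent) candidate answers each item correctly.
ratings = {
    "judge_1": [0.6, 0.7, 0.4, 0.8, 0.5],
    "judge_2": [0.5, 0.8, 0.5, 0.7, 0.6],
    "judge_3": [0.7, 0.6, 0.3, 0.9, 0.5],
}

def angoff_cut_score(ratings):
    """Mean expected score of a borderline candidate, as a percentage."""
    per_judge = [sum(r) / len(r) for r in ratings.values()]
    return 100 * sum(per_judge) / len(per_judge)

print(round(angoff_cut_score(ratings), 1))  # -> 60.7
```

In practice the workshop iterates: judges discuss items on which their estimates diverge and re-rate them before the cut score for the field is fixed.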

Phase three: exam duplication, sorting, packing, transporting and administration


1) Duplicate the exam booklets in a very secure location dedicated to testing;
2) Define the personnel authorized to enter the rooms used for exam duplication, sorting and packing;
3) Store exam materials, including scratch paper, in safe boxes;
4) Transport exam materials to the testing locations in vehicles dedicated to this purpose, and escort the exam packages and the crew with the necessary security guards;
5) Follow the strict guidelines for exam administration to optimally protect the exam from cheating, and the exam booklets and answer sheets from leakage in any form, be it physical or electronic;
6) Return all question papers and answer sheets, including the attendance sheets, to a single center and lock them in the safe boxes prepared for this purpose.

Phase four: exam scoring and reporting

1) Scan each answer sheet in a scoring machine for scoring;
2) Use the scored data for each candidate to produce a post-administration item analysis that evaluates the psychometric characteristics of the test and of any additional piloted test items;
3) Produce a report card for each candidate as an automatic output based on a pre-programmed template;
4) Produce a summary performance report for each institution by field of study.
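The post-administration item analysis in step 2 typically reports, for each item, a difficulty index (the proportion of candidates answering correctly) and a discrimination index. The sketch below uses the classical upper-lower group method on invented 0/1 response data; all values are illustrative, not from the report:

```python
# Illustrative item analysis (invented data): rows are candidates,
# columns are items; 1 = correct, 0 = incorrect.
responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
    [0, 0, 1, 0],
]

def item_analysis(responses, group_fraction=1/3):
    """Per-item (difficulty, discrimination) via the upper-lower method."""
    n = len(responses)
    k = max(1, int(n * group_fraction))        # size of upper/lower groups
    ranked = sorted(responses, key=sum, reverse=True)
    upper, lower = ranked[:k], ranked[-k:]
    stats = []
    for j in range(len(responses[0])):
        difficulty = sum(row[j] for row in responses) / n
        discrimination = (sum(row[j] for row in upper)
                          - sum(row[j] for row in lower)) / k
        stats.append((round(difficulty, 2), round(discrimination, 2)))
    return stats

for j, (p, d) in enumerate(item_analysis(responses), start=1):
    print(f"item {j}: difficulty={p}, discrimination={d}")
```

Items with very low or negative discrimination (such as the third item here, on which top and bottom scorers perform alike) would be flagged for revision rather than accumulated in the item bank.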

5.2. Use of University Exit Examination

The Ethiopian university exit examination is a curriculum-based comprehensive examination developed from the graduate profile stated in the harmonized undergraduate curriculum. Its primary purpose is to ensure that undergraduate students complete their studies with at least the minimum competencies stipulated in the curriculum's graduate profile. The following quotations from interviews with stakeholders support this claim:

HERQA is built around four pillars. The exit examination is one of the missions of HERQA as mandated by the proclamation. One of the major purposes of the exit examination is to contribute to improving the quality of education. The feedback obtained from the analysis and interpretation of exit exam results should be well organized and must be used as an input toward higher education quality improvement. The exit examination has two edges. One is as a measure of graduates' competence. The other is its contribution toward increasing accountability. (HERQA, Vice Director)

First and foremost, the purpose should be seen in terms of improving the quality of education in higher education institutions. The remaining issues are subordinate to this major purpose. The administration of the LLB exit exam to law graduates of public and private institutions exposed issues related to the quality of education and the competences of graduates. Results of this examination showed that students enrolled in the private higher education institutions performed far below the minimum requirements. And yet, they continued issuing degrees to those who had not fulfilled the minimum competency (ESC, Director).

I think the exit exam could be used as a requirement for graduation and at the same time as a licensure examination. … Especially for undergraduate programs, checking whether the programs meet the required standard could be done using exit examinations. The second element is checking whether graduates have the required competence for the labor market. It is pertinent to consider the purpose of the exit exam in terms of meeting the curricular standard and ensuring the competence of graduates for employment (AAU, top management).

Hence, the university exit examination could serve as a graduation requirement. In other words, those who obtain a pass mark on the exit examination, having fulfilled the minimum required competencies, would be allowed to graduate and receive their first degree, while those who could not make it would be given another chance to sit for a re-examination. Here are some quotes from our interviewees:

I support the view that the exit exam should serve as a graduation requirement, for various reasons. To ensure quality standards and improve teaching and learning, the exit exam is necessary. One of the expectations is ensuring our graduates fulfill the minimum requirements for graduation, which should be checked using the exit exam. … Equivalence of the competencies of graduates from different universities could be established …. We have to focus on ensuring that our graduates fulfill the minimum competencies before they graduate. Although graduation is the mandate of universities, ensuring that the graduates have the minimum competencies is the responsibility of the government. We need to balance the mandate of the universities, the expectations of the government and the requirements of the employers. That is why we claim we could strike this balance through a curriculum-based exit exam grounded in each graduate profile.

This requires the consensus of universities and their delegating an implementing organ on their behalf. The immediate task should be including the exit exam in each curriculum, with a grade value of pass or fail. If universities accept and own this, they will start to prepare themselves and support their students by providing them with model exams (FGD, participants from MOE).

I agree with the view that graduation is the mandate of universities…. Universities can delegate and run the exit examination. I think the exit exam should be a requirement for graduation... Since we are using a harmonized curriculum, the exit exam could serve to measure the quality of higher education, in terms of determining each university's pass rate and the variations among universities…. To make the exit exam a graduation requirement, it should be reflected in the curriculum with a grade value of pass or fail, and such a decision needs to be endorsed by each university's senate (FGD, participant from MOE).

On the other hand, some interviewees argued that universities should be allowed to graduate their students or grant degrees irrespective of exit exam results, and that licensure examinations should instead be introduced to distinguish those who fulfill the minimum requirements for doing the job from those who do not.

Is it fair to impose exit examinations on universities? Universities have the mandate to graduate their students as per Higher Education Proclamation No. 650 and their respective Senate Legislations. I think the examination should be for licensure, which is directly linked to employment. Only those who meet the minimum requirement to do the job will be given a license to do the job (Interviewees with a specialization in Engineering).

Contrary to the above view, the feasibility of a licensure examination was questioned by some respondents. For instance, consider the following excerpt:

… as there is no regulated policy on employment, a licensure examination may not be feasible in Ethiopia at the moment. The private sector might employ any graduate with a degree, which is difficult to regulate at this moment. It may also have negative effects on the creation of job opportunities (Interviewee from MOE).

Although respondents varied on whether the exit exam should be used as a graduation requirement or as licensure for employment, the following points might be considered given the logical arguments provided:
 The exit exam could be used as a requirement for graduation.
 Universities have to revise their curricula to include the exit examination, with a grade value of pass or fail, and get this approved by the university Senate.
 Since graduation is the mandate of the universities, the Council of University Presidents needs to arrive at a consensus on the idea of the exit exam being a graduation requirement and delegate an implementing institution (or institutions) on its behalf.

The reflections of the authors might be necessary in this regard. That is, as long as the admission requirements of the universities are fair enough in predicting student performance in university studies, one might assume universities are producing competent graduates, and licensure examinations might be recommended to check whether they are fit for the job. On the other hand, as universities have little influence on the admission criteria for undergraduate students, an exit examination serving as a requirement for graduation would help filter those with the minimum required competencies in their field of training.

5.3. Implementation Time

Although the end of the academic program, preferably in April of every year, was expected to be the exit exam schedule, the interviewed stakeholders suggested a two-phase approach to its administration: a mid-way schedule and an end-of-program schedule.

If we are measuring the competencies and if we want to empower the students, I propose two administrations over the training duration: one mid-way through and one at the end of the program. This will help the students and the universities get feedback and provide them an opportunity to improve in the next round of the exam (FGD, participant from MOE).

I expect the exit exam at two levels: mid-term and end of the academic program. 1) Mid-term, when 50% of the courses are completed: providing the exit exam at mid-term serves as feedback to institutions, instead of a one-time final exit exam. 2) End of the program, usually the end of June, with enough time for students to get prepared (AAU, top management).

Regarding April-May 2019 (2011 EC) as the kick-off year for the exit examination, respondents' views appear controversial. Some prefer to delay the commencement of the exit exam by at least one more year, for different reasons:

There are several reasons why rushing is not recommended: 1) the academic year did not start until the end of October 2018, and this has already had a negative impact on course coverage; 2) the contextual factor, especially the feared high probability that some groups will politicize the exit exam; 3) the time for the technical and administrative preparation of the examination is too short; and 4) the time is too short for the students to prepare themselves. Due to these factors, implementation of the exit exam in 2011 EC is not feasible; rather, it is going to be counterproductive. However, if implementation of the exit exam in the current academic year is inevitable, it might be acceptable to start with the field of engineering, which accounts for more than 35% of the student population (Interview with Professors, Science and Technology).

5.4. Content of Exit Examinations

The content of the exit exam should emanate from the curriculum-based graduate profile. The coverage is expected to be comprehensive, including both the core subjects and the basic courses that build the expected graduate profile.

[The content of the exit exam should include] integrated courses rather than focusing only on major courses; it should be more comprehensive. The content should address the test taker's skills, knowledge and attitude. The focus should be on major courses, without disregarding other relevant courses (Interviewee, higher official of AAU).

To start with, the exit exam could consist of multiple-choice questions (MCQ), with performance-based assessment introduced step by step in due course.

I have a reservation on the applicability of MCQs for all subjects. The major gap we have is in hands-on skills. Hence, performance-based testing is highly required (FGD participant, MOE).

Why do we rely on paper-and-pencil testing? We have to shift to digital testing. Digital testing using tablets or computers could be economical and efficient. If some of the tests require a laboratory, we may think of using a digital laboratory to assess skills (Interviewee, higher official of AAU).

Again, the LLB exit exam practice gives us important insights with regard to content and format. It is not entirely MCQ: some law and judiciary cases are part of the format, and candidates are expected to explain and interpret them. Although carefully developed MCQs can measure higher-order learning outcomes, there is a need to include knowledge-, attitude- and skill-based items in the pool. When the law exit exam was introduced in 2003, there were true/false and fill-in-the-blank items. The demerits of such items were learnt from experience, and the format was changed to MCQs and case-related interpretive items. A one-size-fits-all approach is not applicable in this context. We need to customize our test formats depending on the competencies and standards identified in the curriculum materials. Let's leave aside the nitty-gritty and focus on the major competences that may lead the graduates to the world of work (ESC, Director).

5.5. Contributions of exit exam to quality education

The exit exam contributes positively to the quality of education through institutional feedback and by motivating student and teacher engagement. The following excerpts best describe how the exit exam relates to quality education.

If quality is defined as 'fitness for purpose', then the exit exam checks whether graduates are fit for purpose. Test results are immediate indicators. The test items need to be analyzed to obtain feedback, identify gaps and build the institution. If quality is defined as a standard, the exit exam could serve as a proxy indicator for quality assessment. Exit exam results could also be used as feedback for the institutions, and as a basis for institutional accountability (Interviewee, higher official of AAU).

[There should be] encouragement and support for all private and public universities. The Education Strategy Center is currently developing a draft strategic document on formula-based budgeting for universities, i.e., performance-based allocation of resources. This strategy will encourage and motivate universities. Every institution will engage energetically in capacitating its students, which will eventually push the less performing ones out of the market: their enrollments will be depleted as students opt for the better ones. Hence, exit examinations will contribute substantially to improving the quality of education in higher education institutions (ESC, Director).

The introduction of the exit exam will contribute to improving the quality of education: 1) it motivates students to be engaged; 2) it increases the engagement and motivation of teachers to empower students; 3) it enhances the commitment and engagement of institutions in empowering their students; 4) it boosts the confidence of students as certified competent graduates; 5) industries will also have confidence in employing graduates who passed exit exams; and 6) it discourages the fetishization of certification, i.e., the aspiration for certification irrespective of the associated competencies (FGD participant, MOE).

HERQA as an institution understands the exit examination as a means to improve the quality of education in higher education institutions. HERQA's interest is to maximize the public interest in education and to make education outcome-based. The COC administered to TVET graduates, to ensure the competency of graduates from colleges, is an exemplary practice. This best practice can be extended to using the exit examination to improve the quality of education in HEIs. The AAU Testing Center experience can also be used as a benchmark in this exercise. The exit exams are founded on the competencies set out in the respective curricula. Hence, based on the results of exit exams, curriculum revision activities can also be undertaken to improve all aspects of the teaching-learning process in HEIs (HERQA, Vice Director).

5.6. Consequences of the results of exit exam

One of the effects of the exit exam results is determining who fulfills the minimum competencies and achieves the minimum required standard. Accordingly, a bachelor's degree shall be awarded to those who obtain the pass mark on the exit exam, and those who do not shall be allowed to sit for re-examinations.

Universities should sign contract to produce competent graduates and they should
be held accountable according to the results they deliver (Interviewee, higher
official of AAU).

Whether universities should be held accountable based on exit exam results did not reach consensus.

I think it is controversial to evaluate universities by the share of their graduates who attain the pass mark on the exit exam. There are several factors that make a university successful on the exit exam. Placement of students is one of them: higher scorers join Addis Ababa University or universities close to Addis Ababa, such as Debre Berhan University, while low scorers have to go to remote universities. Because of such differences, universities should not be held accountable based on exit exam results. Rather, the exit exam results could be used as institutional feedback. In fact, institutions could be held accountable based on their patterns of performance over the years, if there is no improvement or if they show declining trends. We have to consider the three missions of the universities (teaching, research and community services) to evaluate a university. Considering only one of these, i.e., student learning outcomes, would make our evaluation of the performance of the universities incomplete. We need to be cautious (FGD, MOE).


The inherent meaning of accountability is not to purge low-performing universities out of the market. HERQA is essentially meant to support universities, and its strategy is to introduce and enhance quality assurance mechanisms. Hence, based on the results of exit exams, the strengths and shortcomings of institutions should be identified, assessed and evaluated, and support systems devised to help institutions achieve the standards set for specific programs. We need to exhaust the whole range of support packages and scale up universities into centers of excellence before making them accountable based on the performance of undergraduate students on the exit exams (HERQA, Vice Director).

Exit exams will help higher education institutions to look inward. Exit exams are prepared and administered in packages, and the results will show areas of deficiency. In light of the interpreted results, universities will identify their gaps in terms of inputs and processes, including facilities and staff profile. Rather than aggregating students' results over the entire set of competencies, it is advisable to analyze and interpret disaggregated scores in specific areas of competency. Accountability and deliverables should be seen in terms of disaggregated scores (ESC, Director).

According to the respondents, the focus should be on providing feedback to universities to help them improve their performance, rather than relying solely on exit exam results for accountability purposes. However, patterns of increasing or decreasing institutional performance on the exit exam could be considered one of the criteria for institutional evaluation and accountability.

5.7. Prioritization of fields for examination

Which fields of study to prioritize was one of the questions raised during the interviews and discussions. Respondents held that the health fields are vital, and as a result, those fields are already implementing the examination in the form of certification for employment. Law is another field that runs an exit examination like that of the health fields. What is at issue now is which fields to start with in 2011 EC.

According to a respondent from HERQA, "the JHPIEGO experience with MoH can be taken as a best practice in identifying disciplines along the six bands." A respondent from ESC said that engineering, followed by the business fields, should practice the exit exam. Many of the respondents agreed that the exit exam should be introduced phase by phase due to capacity problems (limited resources and the extensive preparation required). Engineering should be prioritized, respondents said, because Ethiopia is in dire need of infrastructure (construction and technology).

Kicking off the exit exam in 2011 EC (i.e., 2019) was already planned. Universities know this, and students are also aware of it. We have to focus on one subject from each college representing each of the six bands: Band 1, Engineering; Band 2, Applied Sciences; Band 3, Health Sciences; Band 4, Agriculture; Band 5, Business and Economics; Band 6, Social Sciences and Education (MoE respondents).

Setting priorities is dictated by the government's policy, i.e., the current 70/30 policy that gives due emphasis to science and technology. The government places emphasis on health, agriculture and civil engineering in order to achieve the Growth and Transformation Program. In newly opened universities we find 40 to 50 different programs. Hence, we need to start in the areas prioritized by the government. Another strategy is to sort out those programs that are well established in the universities, take off gradually, build capacity, and then cover the rest of the programs. The JHPIEGO experience with MoH can be taken as a best practice in identifying disciplines along the six bands (HERQA, Vice Director).

The priority should be the area of health. For one thing, the health status of a given population is a determinant factor. Second, the newly opened health institutions affiliated with hospitals, such as St. Paul's Hospital Millennium Medical College, are at the forefront of higher education goals. MoH is also in full support of these newly emerged health programs. The first-generation medical schools, such as the Black Lion and Gondar medical schools, were not happy with the opening of such medical schools; there was serious doubt about their capacity to offer health programs, and there was serious confrontation. However, experience has demonstrated that these newly opened medical schools and programs achieved better results in terms of graduate profile. There is also the possibility of building on the experiences of the licensure examination administered by MoH. Next to health comes engineering: all universities have engineering programs, and numerous programs and students are enrolled in the engineering field. Third comes Business and Economics. This is an area where fraudulent practices occur, especially in the private institutions, which are heavily engaged in these areas. In general, my suggestion is that this new approach has to be implemented in phases, gradually building best practices (ESC, Director).

One of the respondents from one of the universities countered the prioritization of fields. The explanation he gave was that "it may cause weaker students to oppose and lead others to an overall student strike". The implication is that while students in fields without an exit exam (even those in the lower range of performance) retain a chance of employment, students who fail the exit exam in the priority fields may not be allowed to obtain employment even though they completed their courses at the university.

5.8. Technical and administrative preparations

Most respondents held similar views on the technical and administrative preparations for the exit examinations. A response from an interviewee from ESC is an example:

Active involvement of stakeholders is where we should start. At several points along the continuum, there is a need to hold discussion forums with all stakeholders (university management, students, parents and professional associations) in which the purpose and their roles are clearly defined. Institutionalizing the exit exams should precede all other suggested activities. … Professional associations are especially helpful in setting standards in the respective programs, and we should make use of their experiences.

We need to get ready along three dimensions before we actually engage in the implementation of the exit exams: human, financial, and organizational. There is a need to mobilize professional associations such as the civil engineers' association, the public health association, and the medical practitioners' association. These associations are especially helpful in setting standards in the respective programs, and we should make use of their experiences (HERQA, Vice Director).

Respondents added that the uproar at Bahir Dar University over a comprehensive examination is a clear sign that serious preparation and discussion among stakeholders are needed.

Regarding effective implementation of the program, an official from AASTU said, "a separate, autonomous and accountable institution that would facilitate the process and create an enabling environment in all higher education institutions has to be in place." This idea was also supported by other respondents, including officials from NEAEA and Enjibara University.

Explaining the purpose of institutionalization, an official at AAU said, "If the purpose of the exit exam is ensuring educational standards, an independent institution should be set up to run the exit exam in particular and educational quality assurance in general. The independent institution should ensure standards and support the training institutions after analyzing the data."

In relation to technical and administrative preparation, almost all respondents suggested that starting the exam this year (2018/19) would be very difficult. For instance, an official from AAU said the following: "This year is unique; there is unrest. The current academic calendar has already been pushed back, which created delays in academic programs, and the content might not be covered adequately; this academic year might not be advisable for launching the exit exam." An official who is involved in managing national exams in the country, and who also had experience in the preparation of the law exit exam, stressed that starting the exam this year is unrealistic: technical and administrative preparations take time and should be completed beforehand.

5.9. Accommodation of Students with disability

An official from AAU remarked that "accommodation for students with disabilities, such as accessible exam rooms and using the JAWS program for the blind, is important". He and other respondents agreed that there should not be a double standard in deciding the passing mark: students with disabilities have to meet the standard, but there should be a suitable testing environment and facilities for them.

An official from Enjibara pointed out that giving failing students more chances of re-testing can be one form of accommodation. Accommodation should not be limited to students with special needs; it should also extend to the developing regions of the country. A respondent from ESC noted that "special attention should be given to the marginalized regions and disadvantaged groups."

5.10. Re-exam opportunity

Regarding arrangements for those who could not pass the exit exam, an official from AAU said, "re-examination should be the right of the test takers, who should be able to re-sit every three months." While this official emphasized that there should be a limit (a maximum of three attempts) on the number of exit exams for an individual, other officials, including from AAU and NEAEA, said that the cost should be covered by the students and that limiting the number of attempts is not important. Regarding the number of times the re-exam is given in a year, an official from AAU said that three times a year is preferable if the exam is paper-and-pencil.
The official from AAU added the following:

“If test takers repeatedly take the exam but cannot make it, then there should be a mechanism for certifying the test takers at a level lesser than the degree for which the program was designed, such as 12+4, 12+3, 12+2, etc. In that case, 12+4 or 12+3 would not be equivalent to a Bachelor's degree; rather, it would mean the student did not complete the requirements for a Bachelor's degree but reached 12+4 or 12+3, i.e., is not qualified.”

Regarding the re-exam and the quality of education, a respondent from ESC stated the
following:

“Every candidate who sits for the exit examination should at least get a pass mark, though
it is difficult to determine the pass-mark cutoff point at the start. When the LLB exit
examination was administered in 2003, the cutoff point was 25%. In 2010 it was raised
to 40%. Theoretically the pass grade is 50%. Rather than emphasizing the cutoff point,
precautionary measures that improve the quality of education should precede it.”

Regarding the roles of the universities and other stakeholders, an official from AAU noted:

• Provide tutorial programs: using MoE TV, a digital tutorial support system, and the
educational mass media of the regions; and allow re-sitting students to use libraries.
• Avoid unnecessary burden on the institutions: those who must re-sit have to prepare
themselves, with the exception of those who are marginally disadvantaged, and there should
be more pressure on students to complete their studies before they leave the university.
• Universities should open library and tutorial services, including digital tutorial
services, to those recognized as preparing for a re-sit; the TV by MoE and the regional
educational mass media could be engaged in academic support.

5.11. Credibility of exit exams to stakeholders

One means of creating credibility of the exit exam among stakeholders is developing awareness. A
respondent from HERQA suggested that three important stakeholders (government, universities,
and students) should come to the forefront.
All three partners should take their share of responsibility and execute policies in
harmony. Be it MoE or any institution, exam security must be first and foremost a
concern. Right from the entry point, when course outlines are distributed, students,
teachers, departments, and faculties should enter into a contractual agreement. The support
system in its various forms, including tutorials, faculty-student advisement, monitoring
and evaluation of departments, and effective implementation of continuous assessment,
should target ensuring mastery of the courses on a semester basis. Credibility is
not a one-time exercise; rather, it is a result of transforming classroom practices and
student-teacher interactions in general. Credibility of the exit examination can also be
achieved by redefining the procedures followed in opening new programs in the
universities. One possibility is tailoring programs to different universities. Certain
areas like health and civil engineering that require huge resources to run can be
undertaken by first- and second-generation universities that are relatively well
established. Identifying those important programs across universities, backing them
with adequate resources, gradually building capacity, and moving on to the rest of the
disciplines will add to the credibility of exit examinations. (Interview from HERQA)

In creating credibility, a respondent from ESC suggested giving emphasis to the following:
“matters related to exam preparation, administration, selecting professionals from universities
for item development and psychologically preparing students from the outset need to be taken
seriously.”

Consensus building is a point raised in the discussion with experts and officials in MOE.

Consensus building through various media is a priority issue rather than focusing on
a top-down approach. Empowering students over the years and enabling them to be
competent and up to the standard should be part of this. Such a mission should be shared
among the training institutions and the students. We need to make an effort to get buy-in
on the importance of the exit exam. And institutions should sign a contract in this regard.

Unless awareness and a legal framework are in place, the implementation of the exit exam would
be at risk. On this point, a discussion with officials and experts in MOE underlined the following:

Last year, there was an awareness-raising attempt and the reaction of university students
was noisy and dangerous. We need to be careful about how to design the dissemination
part. Unless students and other relevant stakeholders are convinced, it is difficult to
implement it. We have to be ready to answer questions such as: exit exam for what
purpose, in which field, how to implement it, who does what and when, and what would
be the consequence of the exam results. Once we are ready with answers to these and
related questions, we can plan on raising the awareness of stakeholders.

Another respondent from AASTU voiced his concern about introducing the exit exam before making
the necessary preparation:

If we are really on the verge of introducing exit examinations in Ethiopian Higher
Education Institutions, the top-down approach to policies and educational reforms
has to be discouraged. All individuals, groups, institutions, and professional
associations should be given the opportunity to voice their views on the subject. This
is a high-stakes program, and adequate media coverage is also a prerequisite. Forums
have to be organized so that all stakeholders can entertain different ideas. The
experience of other nations has to be captured, and best practices and benchmarks have
to be adapted to the Ethiopian context. All potential factors that may entail unrest
have to be curbed and kept under full control. Moreover, a separate and independent
institution, possibly accountable to the prime minister, parliament or MoE, has to be
established by law.

All respondents agreed that the necessary preparation, including creating awareness through
both face-to-face discussion and the media among key stakeholders such as students, parents,
and professional associations, must be made.

5.12. Challenges
It was learned in the interviews as well as the discussions that students who think they may
not pass the test may challenge the process. For instance, some students have resisted by going
to court to challenge the legality of the Law exit exam. This may also be expected in the future.

The private universities especially may not consent, for the very reason that the
results of the exit examination are going to challenge their quality-compromising
practices. Early preparation would help to reduce the challenges. Lack of experience in
planning and administering exit examinations is likely to be another challenge.
(Interview from AASTU)

Preparations should necessarily be made to minimize the challenges related to the exit exam. An
official from AAU raised the following concerns to be addressed:

• Does the exit exam measure the required competencies? Constructing a test that measures
knowledge, skill and attitude is a real challenge.
• Managing a huge number of students: administration, institutional set-up and demanding
manpower arrangements.
• Gaps in stakeholder awareness: public acceptance may not be achieved immediately.
• University-level challenges: universities may claim they trained adequately or that their
graduates are competent.
The challenge that may come from new universities (for instance, third- or fourth-generation
universities) could be countered, as an official from Injibara University remarked, by pointing
to previous achievements of relatively new universities compared to AAU. He cited that Wollo
University achieved better results than AAU in the Law exit exam. In the new-generation
universities, students and teachers can give more time to learning and teaching compared to
senior universities.

5.13. Legal mandate of existing institutions


Different stakeholders within the ministry were interviewed on the legal mandates of existing
institutions in administering the exit examination. The Higher Education Relevance and Quality
Agency (HERQA), the Higher Education Strategic Center (HESC), and a Focus Group Discussion
with experts drawn from the various departments of the Federal Ministry of Education were
consulted on the feasibility of reorganizing the already existing institutions and agencies
for the purpose of institutionalizing the exit exams. Although HERQA seems to be mandated by
law, as stated in its establishment proclamation, to handle this nationwide high-stakes
assignment, it lacks the institutional capacity to do so, according to its Director. The
interviewees, however, underlined that the experience of these existing institutions is an
asset in instituting exit exams, and that if these institutions are to be used they have to be
reorganized and restructured, backed by a legal framework. An excerpt from the transcribed
interview substantiates the discussion.

NEAEA is not mandated to administer such a high-stakes examination meant to serve higher
education institutions. Its mandate is limited to administering only the nationwide grade 10
examination and the university entrance examination after completion of the preparatory
program. Otherwise, there is a need to redefine the status of NEAEA, HERQA, and ESC if they
are going to be likely candidates to administer this examination. (HESC Director)

Focus group discussants drawn from MoE argued that HERQA and NEAEA are not legally
mandated to ensure the competence of graduates. Accordingly, there is no quality assurance
mechanism in place to ensure the competencies of graduates after completion of their programs.
They suggested instead that there should be a unit under the newly established Ministry of
Science and Higher Education that can administer the exit exam in the short term and that has
to evolve into an independent institution. Alternatively, the experts suggested that the
University Exit Exam should be
the mandate of the universities, like the Ethiopian University Entrance Examination (EUEE),
with the Council of University Presidents delegating the task for practical purposes.
In general, taking into account the scope of the work, interviewees seem to opt for an
independent institution accountable to the upcoming higher education ministry, as proposed
in the new education roadmap, with a strategy document that clearly delineates the autonomy
and accountability of this envisaged institution.

5.14. Possible arrangements in instituting exit examinations


Another issue posed to the interviewees concerned the possible arrangements for instituting
exit examinations. In response, the HERQA Director stated that by restructuring one of the
several directorates in the ministry there is a possibility to carry out this nationwide
testing program. The respondent maintains that HERQA is a likely candidate to discharge this
responsibility, provided that the agency is reconstituted in such a way that it would have
the status of an independent body or agency. He described the prospective status of HERQA in
exercising the mandate of administering the exit exam in the following way:

Based on suggestions coming from different stakeholders, the agency has drafted a
proposal to be submitted to the parliament so that it may stand as an independent
institution not influenced by MoE. Exam security problems aside, the COC model can be
extrapolated in establishing and structuring the new institution.

The HERQA Director also suggested how to organize this independent institution. A quote
taken from the interview transcript shows what the would-be institution should look like.

If at all a new institution is the last option, I see two broad functions that shape the
organogram of the institution: the testing wing and the item banking wing; the former
being responsible for carrying out all activities ranging from the planning stage to the
administration of the examination, the latter for keeping test security and integrity.

The ESC Director seems to share the HERQA Director’s line of thought with regard to the
mandates of the existing institutions and the advance arrangements to be made in instituting
exit examinations. Accordingly, the institute to be established has to be accountable to the
new higher education ministry within MoE. He does not favor making this new institution
accountable to the prime minister’s office, in view of the multiple responsibilities and
tasks that office is carrying out. In his view, HERQA is fully authorized and autonomously
in control of quality enhancement practices at both private and public universities, and it
can also manage and administer exit examinations. Yet new regulations and guidelines have to
be worked out before mandating HERQA. New programs that do not meet standards should not be
allowed to admit students; this measure by itself is a prerequisite for making exit
examinations effective, efficient, and worthwhile. If HERQA is chosen, then it must be
accountable to MoE: MoE and HERQA are working towards a common goal and should not be
considered two independent organs within the system.

The NEAEA Director shares a similar view with the HERQA and ESC Directors. He is of the opinion
that exit examinations have to be administered by an independent testing center, for the very
reason that such an institution would enable critical planning and help keep exam setting and
exam administration confidential and up to standard. Letting each institution take
responsibility for giving its own exam may lead to corruption, as outcomes may depend on the
capacity and resources of each institution. Citing the experience of South Africa, where
accountability is directed to the president’s office, the Director suggested that such an
institution in the Ethiopian context should be accountable to the Prime Minister’s office.

The NEAEA Director also suggested how to organize this independent institution.
According to him, the institution can have board members representing different stakeholders.
As operational units, its internal structure should have an academic unit and an administrative
unit. A research wing responsible for analyzing and interpreting exam results, which would
serve as an input for policy direction, has to be set up and shown on the organogram of the
institution.

Data obtained from the Focus Group Discussion (FGD) with experts at the MoE showed that all
examinations administered at the exit stage should be aligned and administered by a separate
and independent body. The Law exit exam and the health licensure examination should be aligned
and streamed into this new high-stakes nationwide examination. Accordingly, this will bring
all stakeholders on board to discuss and work towards a common mission.

Furthermore, reviewers of the draft version of this report indicated that there should be an
independent regulatory institution that can run exit examinations:

In my opinion, one way of assuring quality is to create a system of checks and
balances in which there is no major stakeholder and other stakeholders are
represented fairly. If we are now proposing that exit/licensure exams be placed
under one of the implementing arms of the government (repeating what is going on at
the moment), it would be very difficult to institute the desired checks and
balances. (One of the reviewers)

5.15. Cost implication of the exit exam

The cost implication of the exit exam is substantial when program diversity and the number of
higher education institutions are taken into account. Interviews with different stakeholders
showed that various sources of funding have to be sought in order to facilitate the
implementation of exit exams. One possibility for mitigating the cost, as suggested by
respondents, is an IT-based nationwide testing program, which would reduce the huge amount of
resources consumed by paper-and-pencil testing. The Focus Group Discussion (FGD) with experts
at the MoE likewise suggested the need to be cost-effective. The experts commented that the
traditional way of exam preparation and administration is going to be extremely costly and
that one has to resort to a technology-supported testing program. The experts suggested that
the first-round exam-related costs have to be covered by the government; gradually,
universities may take over the responsibility and cover the cost of the exit exam with the
knowledge of the Ministry of Finance.

The private institutions have to take their share as long as they benefit from this high-stakes
examination. Private institutions are beneficiaries of HERQA’s attempt to foster quality
improvement packages in all higher education institutions, be they private or public. So far,
the contribution of the private institutions in supporting HERQA’s quality improvement endeavor
is absent or negligible. According to the interviewee from HERQA, this trend has to be stopped.
A quote taken from the interview transcription supports the role of the private sector in
financing the exit exams.

When private institutions are accredited by HERQA, all expenses that lead to
accreditation are covered by the agency. I think this should not continue in the
future. The private institutions need to contribute and share the administrative
expenses that the agency is covering. Professional associations are also potential
sources of funding; they have the capacity. At the start the program should be
government funded, and gradually cost-reduction strategies have to be worked out.
I don’t see student cost sharing as a plausible option. The upcoming Road Map
suggested student cost sharing be raised to 30%. Where higher education is
free, student cost sharing as a source of finance for implementing exit exams is not a
viable alternative. Rather, there must be strong financial backing from the private
sector to augment government funding. (HERQA, Vice Director)

Although the ESC Director acknowledged the tremendous cost implication of this high-stakes
examination, he is in favor of the cost-sharing scheme. Currently, student cost sharing
accounts for only 15% of the cost of the teaching and learning process; otherwise, most
expenses per student in higher education institutions are covered by the government. In a real
sense, higher education in Ethiopia is more or less free. For the law exit examination, for
example, 10 ETS is set in the curriculum and students are registered for the exit examination,
yet law graduates do not pay anything when they sit for it. The interviewee suggested the
cost-sharing scheme as a viable approach, with differential treatment based on students’
socio-economic background.

Student cost sharing for the teaching-learning process currently accounts for only 15%. I
suggest this be increased to 30%, as per the suggestions stated in the upcoming road map.
A loan scheme is another strategy. There are students who can pay for sitting exit
examinations: let the haves pay, and students who cannot afford it can have the option of
a loan scheme. Upfront payment, as for international tests such as TOEFL and SAT, is also
another option. (ESC Director)

With regard to cost estimation, the experience of the Institute of Educational Research of
Addis Ababa University in administering the health professionals’ licensure examination, which
is equivalent to an exit examination, puts the cost between 1,750 and 2,400 birr per test taker
in the first and second rounds of testing, or 2,075 birr on average. The cost might increase to
about 2,300 birr per test taker if the cost of vehicles and fuel is included. As long as the
exit examination is paper-and-pencil, its cost will be more or less similar to that of the
licensure examination, which consists only of multiple choice questions (MCQs) (IER, Testing
Expert). On the other hand, the cost of the exit examination could be substantially reduced by
moving from paper-and-pencil to computer-based testing, though the feasibility of the latter
is seriously in question.
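The arithmetic behind these per-test-taker figures can be sketched as follows. The 1,750 and 2,400 birr figures are the ones cited by the IER testing expert; the 225-birr logistics margin is derived from the reported jump to roughly 2,300 birr, and the national cohort size is an invented number used only to show how the estimate scales.

```python
def average_cost(first_round: float, second_round: float) -> float:
    """Average per-test-taker cost across two testing rounds (birr)."""
    return (first_round + second_round) / 2

# Figures cited for the health professionals' licensure examination.
per_taker = average_cost(1750, 2400)   # 2075.0 birr on average
with_logistics = per_taker + 225       # vehicle and fuel bring it to ~2,300 birr

# Scaling to a hypothetical national cohort (cohort size is illustrative only).
graduates = 150_000
print(f"per test taker: {per_taker:.0f} birr ({with_logistics:.0f} with logistics)")
print(f"illustrative national total: {with_logistics * graduates:,.0f} birr")
```

Such a back-of-the-envelope calculation makes clear why the respondents saw paper-and-pencil administration at national scale as extremely costly.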

6. Guidelines on the Technical Aspects of Exit Exam

For exit exams to be reliable and valid, curriculum experts, subject matter experts, and
psychometricians all have to be involved in the preparation. Besides, trying out the items on a
representative sample of candidates, determining their psychometric properties, and retaining
those items with better properties will increase the reliability and validity of the exam.
Hence, it is important to develop guidelines by which these professionals can regulate the
entire process, from item development to the reporting of exit exam results.

6.1. Characteristics of a good test

A test is a systematically organized set of questions made for the purpose of measuring or
quantifying a sample of behavior, with a clearly described scoring procedure. In the testing
literature it is used as a broad term covering examinations, quizzes and other related
instruments.

Tests can be categorized as standardized (expert-made) versus non-standardized (teacher-made),
or as norm-referenced (interpretation is based on group scores) versus criterion-referenced
(interpretation is based on a fixed score or criterion). This exit exam can be grouped as a
standardized, criterion-referenced test.

There are features of good tests which need to be considered when developing a test, in this
case the exit examination. The following are necessary conditions for any test to be
considered good.

a) Integrity of the test. A good test has integrity provided that it is free from
test-developer bias and from threats to test security, including theft and cheating. The
panel of test developers and reviewers needs to demonstrate high integrity to enhance the
integrity of the test.

b) Validity of the test. Validity is the extent to which the test measures what it intends to
measure. In this section, we focus on three types of validity: content validity,
construct validity, and predictive validity.

• Content validity is the extent to which the test adequately represents the subject
matter and the instructional objectives. A test that does not proportionally
represent the various contents is not valid content-wise.
• Construct validity is the extent to which the test measures the concept, or construct,
it intends to measure. For instance, if we intend to measure graduates’ competence
and we are sure we do, then our test has construct validity.
• Predictive validity is the extent to which scores on a given test predict scores on
another measure. For instance, if scores on the university exit exam are highly
correlated with employers’ evaluation results, then one could say the exit exam has
high predictive validity.

c) Reliability of the test. Reliability is the extent to which a test is internally consistent
and yields stable results over time. There are various approaches to measuring the
reliability of a test: the split-half method measures internal consistency, while the
test-retest method measures stability over time. Either or both approaches could be used
depending on the intended purpose. A good test needs to demonstrate reasonably high
reliability.
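As a sketch of the split-half approach just described (not a procedure prescribed by this report), the following splits a 0/1 item-score matrix into odd- and even-numbered items, correlates the half scores, and applies the standard Spearman-Brown correction; the simulated examinee data are purely illustrative.

```python
import random

def pearson(x, y):
    """Pearson correlation between two score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def split_half_reliability(item_scores):
    """Split-half internal consistency of a test.

    item_scores: rows are examinees, columns are 0/1 item scores. The test
    is split into odd- and even-numbered items, the two half scores are
    correlated, and the Spearman-Brown formula steps the coefficient up
    to full test length: r_full = 2r / (1 + r).
    """
    n_items = len(item_scores[0])
    odd = [sum(row[0:n_items:2]) for row in item_scores]
    even = [sum(row[1:n_items:2]) for row in item_scores]
    r = pearson(odd, even)
    return 2 * r / (1 + r)

# Simulated 200 examinees x 40 items: each examinee's ability drives correct answers.
random.seed(1)
scores = []
for _ in range(200):
    ability = random.random()
    scores.append([1 if random.random() < ability else 0 for _ in range(40)])
print(round(split_half_reliability(scores), 2))
```

Because ability varies widely across the simulated examinees, the resulting coefficient is high; on real exit-exam data the value would depend on test length and item quality.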

d) Sufficient variance in test scores. A good test is supposed to positively discriminate the
well-prepared student from the poorly prepared one, so that sufficient variability of
scores is achieved. On the other hand, if the test is so difficult that all test takers
score low marks, or so easy that every test taker gets a high score without
discrimination, then the test does not serve its intended purpose.

e) Standardization. Setting in advance a standard or benchmark that serves as a criterion for
meeting the required minimum level of performance or competence makes a test a good test.
For criterion-referenced assessment, setting the standard, usually with the recommended
Modified Angoff method, is necessary for standardization.
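As an illustration of how a Modified Angoff cut score is commonly computed (a simplified sketch, not the full standard-setting procedure, and with invented panel ratings): each panelist estimates, for every item, the probability that a minimally competent candidate answers it correctly; the item means are then averaged into a recommended pass mark.

```python
def angoff_cut_score(ratings):
    """Cut score (as a percentage) from Modified Angoff panel ratings.

    ratings[j][i] is panelist j's estimated probability that a minimally
    competent (borderline) candidate answers item i correctly.
    """
    n_items = len(ratings[0])
    # Average the panelists' estimates item by item...
    item_means = [sum(p[i] for p in ratings) / len(ratings) for i in range(n_items)]
    # ...then the borderline candidate's expected score, as a percentage,
    # becomes the recommended pass mark.
    return 100 * sum(item_means) / n_items

# Hypothetical panel of three judges rating a five-item test.
panel = [
    [0.6, 0.4, 0.8, 0.5, 0.7],
    [0.5, 0.5, 0.9, 0.4, 0.6],
    [0.7, 0.3, 0.7, 0.6, 0.8],
]
print(round(angoff_cut_score(panel), 1))  # 60.0
```

In practice the "modified" variants add discussion rounds and impact data between rating rounds; the averaging step shown here is the arithmetic core.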

6.2. Purpose of testing

In designing a test, it is recommended first to define the purpose or rationale for the test,
in this case the exit examination. The purpose of the curriculum-based exit exam is to ensure
that university graduates have acquired at least the minimum competencies specified in the
graduate profile of the respective curriculum, which will help them perform satisfactorily in
the world of work in their areas of expertise. The intended exit exam is a criterion-referenced
assessment, in which a test taker’s performance is compared against a pre-determined standard
or benchmark. In this case, the knowledge and skills of graduates of different universities
would be reasonably comparable, provided that all graduates from the same field of
specialization meet the minimum standard, that is, obtain a pass mark on the common
curriculum-based exit exam. From these premises, the preparation of the exit exam assumes:
1) a curriculum-based graduate profile, and 2) comprehensive contents that cover the major
courses from all years.

6.3. Guidelines for Test Blue Print or Table of Specification

A table of specification, or test blue print, is a fundamental design-stage product of test
development: a two-way chart or grid that crosses the type of instructional objective with
the instructional content covered. According to Bloom et al. (1971), "… it [is] useful to
represent the relation of content and behaviors in the form of a two dimensional table with
the objectives on one axis, the content on the other.” The number of questions to be developed
depends on the weight of the subject matter content and the level of cognitive domain required
to process the content. The test blue print, which is normally prepared before any test is
constructed, not only ensures coverage of the required subject matter contents but also
identifies which level of cognitive domain is being assessed. In Benjamin Bloom’s Taxonomy of
Instructional Objectives, there are cognitive, affective and psychomotor domains; in this
paper, we focus only on the cognitive domain. The initially proposed six major levels of
Bloom’s cognitive domain have been revised to emphasize creative thinking³ as the highest-level
cognitive outcome, as shown below.

Revised Taxonomy of Instructional Objectives by Bloom’s student Lorin Anderson

1) Creating – Putting ideas or elements together to develop an original idea or engage in
creative thinking
2) Evaluating – Judging the value of ideas, materials and methods by developing and applying
principles and criteria
3) Analyzing – Breaking information down into its component elements
4) Applying – Implementing abstractions in concrete circumstances
5) Understanding – Understanding of given information
6) Remembering – Recall or recognition of definite information

Tips for developing table of specification

For preparing a multiple choice question (MCQ) test, the following tips are suggested for
developing a test blue print or table of specification:

³ Source: http://www.kurwongbss.qld.edu.au/thinking/Bloom/blooms.htm
a) Define the purpose of the test. In this case, the MCQ test is prepared to discriminate
university graduates who have the minimum competencies as specified in the graduate
profile of each curriculum;
b) Select major topics or content areas (A, B, C, … etc), as indicated in the template, from
the syllabus which are assumed to be covered during the training duration and contribute
to the required competencies of the learners;
c) Define the sub-topics or sub-content areas (A1, A2, A3, …; B1, B2, B3, … etc) within
each major topic;
d) Estimate the number of days required to learn each topic. For instance, if there are 8
major topics to be covered in 50 days, the first topic, in this example A, might be
estimated to require 6 learning days; we continue estimating the number of days for
all eight topics.
e) Estimate the weight of each topic based on how much time was spent learning it. This
can be done once all content areas and the total estimated time required to complete the
course are known. For instance, the first topic, A, requires 6 of the 50 days of
learning, which is a 12% weight allocation. We do the same for each topic.
f) Decide the total number of questions required for the test. If we decide to develop 200
multiple choice questions (MCQs), then the first topic, A, with its weight of 12%, gets
24 questions. We apply the same weighting scheme not only to the major topics but also
to the sub-topics within each major topic.
g) Prepare test items that are roughly 20% EASY, 50% MODERATE, and 30% DIFFICULT (i.e.,
with a subjectively rated difficulty ratio of 2:5:3). Although determining the difficulty
level of each question is a professional’s subjective rating, it also depends on the
level of cognitive domain being assessed. In this guideline the following suggestion is
presented for establishing the proportional weight of each cognitive domain by estimated
level of difficulty:

Table 5: Ratio of Estimated Difficulty Level by Domain of Cognitive Outcomes

Cognitive        E = Easy   M = Moderate   D = Difficult   Total     Proportional
domain           (20%)      (50%)          (30%)           (100%)    weight
Remembering       8%          -              -               8%       .08
Understanding    12%         10%             -              22%       .22
Applying          -          25%            12%             37%       .37
Analyzing         -          15%             8%             23%       .23
Synthesizing      -           -              6%              6%       .06
Creating          -           -              4%              4%       .04

h) Determine the number of questions to be developed for each domain by considering the
level of difficulty. For instance, for the first topic A, 24 questions are planned. To
determine the specific number of questions for each cognitive domain, multiply the
weight by 24 (e.g., 24 x .08 = 1.92, which rounds to 2 easy questions focusing on
remembering).

i) Test blue print developers will have to reach professional consensus before finalizing the
template.
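The allocation arithmetic in steps d) through h) can be sketched as follows, using the worked example from the text (topic A gets 6 of 50 days, with 200 total items) and the proportional domain weights of Table 5. Note that because each domain count is rounded, the counts for a topic may not sum exactly to that topic's item total.

```python
def blueprint(topic_days, total_items, domain_weights):
    """Allocate items to topics by instructional time (steps d-f), then to
    cognitive domains by proportional weight (step h)."""
    total_days = sum(topic_days.values())
    plan = {}
    for topic, days in topic_days.items():
        n_items = round(total_items * days / total_days)   # step f
        plan[topic] = {dom: round(n_items * w)             # step h
                       for dom, w in domain_weights.items()}
    return plan

# Domain weights from Table 5; days per topic from the worked example.
domain_weights = {"R": .08, "U": .22, "AP": .37, "AN": .23, "S": .06, "C": .04}
topic_days = {"A": 6, "B": 9, "C": 7, "D": 4, "E": 5, "F": 8, "G": 5, "H": 6}
plan = blueprint(topic_days, 200, domain_weights)
print(plan["A"])  # {'R': 2, 'U': 5, 'AP': 9, 'AN': 6, 'S': 1, 'C': 1}
```

For topic A this reproduces the row of the template: 24 items split as 2, 5, 9, 6, 1 and 1 across the six cognitive domains.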

Table 6: A Template for Developing Test Blue Print or Table of Specification

Note: the weight of each cognitive-domain column combines the difficulty proportions of
Table 5 (E = 20%, M = 50%, D = 30%): R = 8% (E); U = 12% (E) + 10% (M); AP = 25% (M) +
12% (D); AN = 15% (M) + 8% (D); S = 6% (D); C = 4% (D).

Topic  Sub-topics      No. of  Weight  No. of  R     U     AP    AN    S     C
                       days            items   .08   .22   .37   .23   .06   .04
A      A1, A2, A3, …   6       0.12    24      2     5     9     6     1     1
B      B1, B2, B3, …   9       0.18    36      3     8     13    8     2     1
C      C1, C2, C3, …   7       0.14    28      2     6     10    6     2     1
D      D1, D2, D3, …   4       0.08    16      1     4     6     4     1     1
E      E1, E2, E3, …   5       0.10    20      2     4     7     5     1     1
F      F1, F2, F3, …   8       0.16    32      3     7     12    7     2     1
G      G1, G2, G3, …   5       0.10    20      2     4     7     5     1     1
H      H1, H2, H3, …   6       0.12    24      2     5     9     6     1     1
Total                  50      100%    200     16    44    74    46    12    8

Key: R = Remembering/knowledge; U = Understanding/comprehension; AP = Applying/application;
AN = Analyzing/analysis; S = Synthesizing/synthesis; C = Creating/evaluation.

Table 7: Revised Bloom’s Taxonomy Verbs: Cognitive Domain (from lower- to higher-order skills)

This table focuses only on the cognitive domain of instructional objectives.

Definitions
I. Remembering: Exhibit memory of previously learned material by recalling facts, terms,
basic concepts, and answers.
II. Understanding: Demonstrate understanding of facts and ideas by organizing, comparing,
translating, interpreting, giving descriptions, and stating main ideas.
III. Applying: Solve problems in new situations by applying acquired knowledge, facts,
techniques and rules in a different way.
IV. Analyzing: Examine and break information into parts by identifying motives or causes.
Make inferences and find evidence to support generalizations.
V. Evaluating: Present and defend opinions by making judgments about information, validity
of ideas, or quality of work based on a set of criteria.
VI. Creating: Compile information together in a different way by combining elements in a
new pattern or proposing alternative solutions.

Verbs
I. Remembering: Choose, Define, Find, How, Label, List, Match, Name, Omit, Recall, Relate,
Select, Show, Spell, Tell, What, When, Where, Which, Who, Why
II. Understanding: Classify, Compare, Contrast, Demonstrate, Explain, Extend, Illustrate,
Infer, Interpret, Outline, Relate, Rephrase, Show, Summarize, Translate
III. Applying: Apply, Build, Choose, Construct, Develop, Experiment with, Identify,
Interview, Make use of, Model, Organize, Plan, Select, Solve, Utilize
IV. Analyzing: Analyze, Assume, Categorize, Classify, Compare, Conclusion, Contrast,
Discover, Dissect, Distinguish, Divide, Examine, Function, Inference, Inspect, List,
Motive, Relationships, Simplify, Survey, Take part in, Test for, Theme
V. Evaluating: Agree, Appraise, Assess, Award, Choose, Compare, Conclude, Criteria,
Criticize, Decide, Deduct, Defend, Determine, Disprove, Estimate, Evaluate, Explain,
Importance, Influence, Interpret, Judge, Justify, Mark, Measure, Opinion, Perceive,
Prioritize, Prove, Rate, Recommend, Rule on, Select, Support, Value
VI. Creating: Adapt, Build, Change, Choose, Combine, Compile, Compose, Construct, Create,
Delete, Design, Develop, Discuss, Elaborate, Estimate, Formulate, Happen, Imagine,
Improve, Invent, Make up, Maximize, Minimize, Modify, Original, Originate, Plan, Predict,
Propose, Solution, Solve, Suppose, Test, Theory

6.4. Guideline for writing multiple choice questions (MCQs)


The general rules used for writing multiple-choice items are described below. Recognize that
these are general rules; not all rules will be applicable to all types of testing.
1) Avoid humorous items. Achievement testing is very important and humorous items may
cause students to either not take the exam seriously or become confused or anxious.
2) Items should measure only the construct of interest or the content of the subject matter.
3) Write items to measure what students know, not what they do not know.
4) The stem should contain the problem and any qualifications. The entire stem must always
precede the alternatives.

5) Each item should be as short and verbally uncomplicated as possible. Give as much context
as is necessary to answer the question, but do not include superfluous information. Be careful
not to make understanding the purpose of the item a test of reading ability.
6) Avoid negatively stated items. If you have to use this kind of item, emphasize the fact by
underlining the negative part, putting it in capital letters or using italics. (For test construction
purposes, if possible, put all such items together in a single section and indicate this with
separate directions.)
7) Keep each item independent from other items. Don't give the answer away to another item. If
items require computation, avoid items that are dependent on one another.
8) If one or more alternatives are partially correct, ask for the "best" answer.
9) Try to test a different point in each question. If creating item clones (i.e., items designed to
measure the exact same aspect of the objective), be certain to sufficiently change the context,
vocabulary, and order of alternatives, so that students cannot recognize the two items as
clones.
10) If an omission occurs in the stem, it should appear near the end of the stem and not at the
beginning.
11) Use a logical sequence for alternatives (e.g., temporal sequence, length of the choice). If two
alternatives are very similar (cognitively or visually), they should be placed next to one
another to allow students to compare them more easily.
12) Make all incorrect alternatives (i.e., distracters) plausible and attractive. It is often useful to
use popular misconceptions and frequent mistakes as distracters. In the foreign languages,
item distracters should include only correct forms and vocabulary that actually exists in the
language.
13) All alternatives should be homogeneous in content, form and grammatical structure.
14) Use only correct grammar in the stem and alternatives.
15) Make all alternatives grammatically consistent with the stem.
16) The length, explicitness and technical information in each alternative should be parallel so as
not to give away the correct answer.
17) Use 4 or 5 alternatives in each item.
18) Avoid repeating words between the stem and key. It can be done, however, to make
distracters more attractive.

19) Avoid wording directly from a reading passage or use of stereotyped phrasing in the key.
20) Alternatives should not overlap in meaning or be synonymous with one another.
21) The correct choice should appear about an equal number of times in each response position.
22) Do not use any pattern of correct responses, e.g., ABCDE, etc.
23) Avoid terms such as "always" or "never," as they generally signal incorrect choices.
24) To test understanding of a term or concept, present the term in the stem followed by
definitions or descriptions in the alternatives.
25) Avoid items based on personal opinions unless the opinion is qualified by evidence or a
reference to the source of the opinion (e.g., According to the author of this passage, . . .).
26) Do not use "none of the above" as a last option when the correct answer is simply the best
answer among the choices offered.
27) Try to avoid "all of the above" as a last option. If an examinee can eliminate any of the other
choices, this choice can be automatically eliminated as well.

6.5. Guidelines for Reviewers of Test Items


The following guidelines are recommended for reviewing individual test items. When you
review an item, write your comments on a copy of the item indicating your suggested changes. If
you believe an item is not worth retaining, suggest it be deleted.
1) Consider the item as a whole and whether
a) it measures knowledge or a skill component which is worthwhile and appropriate;
b) it is appropriate for the examinees who will be tested;
c) there is a markedly better way to test what this item tests;
d) it is of the appropriate level of difficulty for the examinees who will be tested.

2) Consider the stem and whether it


a) presents a clearly defined problem or task to the examinee;
b) contains unnecessary information;
c) could be worded more simply, clearly or concisely.

3) Consider the alternatives and whether


a) they are parallel in structure;

b) they fit logically and grammatically with the stem;
c) they could be worded more simply, clearly or concisely;
d) any are so inclusive that they logically eliminate another more restricted option from
being a possible answer.

4) Consider the key and whether it


a) is the best answer among the set of options for the item;
b) actually answers the question posed in the stem;
c) is too obvious relative to the other alternatives (i.e., should be shortened, lengthened,
given greater numbers of details, made less concrete).

5) Consider the distracters and whether


a) there is any way you could justify one or more as an acceptable correct answer;
b) they are plausible enough to be attractive to examinees who are misinformed or ill-
prepared;
c) any one calls attention to the key (e.g., no distracter should merely state the reverse of
the key or resemble the key very closely unless another pair of choices is similarly
parallel or involves opposites).
6.6. Guidelines for Psychometricians
 Make sure that all instructions are clear and precise
 Ensure that item stems are clearly written
 Make sure that each question has one correct answer
 Check that each item addresses one problem at a time and is stated clearly
 Blank spaces must be equal in length across all questions
 Ensure that a question and its options are printed on the same page and do not extend to the
next page
 Check that each item stands by itself
 Check that each item is as specific as possible
 Make sure that options are placed sequentially and have equal length
 Check that items are not exceptionally long
 Check that all distracters are plausible and enhance the reliability of the test
 Avoid "cute" distracters that are extremely unlikely to be chosen
 Check the visibility of diagrams, pictures, standard notations, etc.
 Check spelling and grammar
 Avoid redundant items that serve a similar purpose
 Avoid double-barreled items that convey two or more ideas at the same time
 Ensure that positively and negatively worded items are mixed by including items that are
worded in the opposite direction
 Evaluate the optimum difficulty level for each item (halfway between 100% of the
respondents getting the item correct and the level of success expected by chance)
 Check that test items have a variety of difficulty levels so that they discriminate at many
levels
 Because the exit examination is meant to select qualified graduates after completion of their
programs, include difficult items to make fine discriminations
 Check that a greater concentration of easier items is in the test to accommodate students
with special needs
 Ensure that the test formats used fit the purpose of the test (dichotomous vs. polytomous
format)
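The "optimum difficulty level" rule above (halfway between a perfect score and the success rate expected by chance) can be computed directly. A minimal sketch; the function name is ours:

```python
def optimal_difficulty(n_options: int) -> float:
    """Halfway between a perfect score (1.0) and the chance success
    rate (1 / n_options), per the guideline above."""
    chance = 1.0 / n_options
    return chance + (1.0 - chance) / 2.0

# For a 4-option multiple-choice item, chance success is .25,
# so the optimum difficulty index is .625.
print(optimal_difficulty(4))  # 0.625
```

For true/false items the same rule gives .75, which is why dichotomous formats tolerate higher proportions correct than four-option items.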

6.7. Pretesting, Item Analysis, and Item Banking

a) Pretesting
On standardized tests, items that have survived the initial screening are combined into one or
more pre-tests and administered to a sample of people similar to those who will take the
completed test. For example, for an achievement test, pre-tests would be administered to a sample
of students of the same grade level and educational background as the students who will take the
completed test. The goal of pretesting is to obtain information on how test takers respond to the
items. This analysis will include both qualitative comments by test takers, such as perceived
ambiguities in items, and quantitative analyses.

b) Item analysis
When constructing a test, we must be concerned with the quality of the items. The statistical
analysis of test items is called item analysis. Item analysis is among the most valuable tools that
test developers can apply in attempting to improve the quality of their tests. Item analysis is
generally conducted for three general purposes:
(a) to select the best available items for the final form of a test;
(b) to identify structural or content defects in the items; and
(c) to identify general content areas or skills that need to be reviewed by the test developer.

There are three main elements in an item analysis. One is evaluation of the difficulty level of the
items, that is, the percentage of students responding correctly to each item in the test. Another is
determination of the discriminating power of each item. Item discrimination in its simplest form
usually, but not always, refers to the relation of performance on each item to performance on the
total test. A third element in item analysis, if multiple-choice or matching items are used in the
test, is examination of the effectiveness of the distracters (foils, or alternative answers).

Steps to follow in gathering and recording of data for an item analysis:


1. Arrange the answer sheets in order from high to low. This ranking is usually based on the
individual's total score on the test.
2. High-and low-scoring groups are identified. For purposes of item analysis, these two extreme
sets of examination papers are called criterion groups. Each subgroup will generally contain
from 25 to 50 percent of the total number of people who took the test. The high and low
groups must contain the same number of individuals.
3. Record separately the number of times each alternative was selected by individuals in the high
and low groups.
4. Add the number of correct answers to each item made by the combined high and low groups.
5. Divide the total number of correct responses by the maximum possible, that is, the total
number of students in the combined high and low group, and multiply the result by 100. This
percentage is an estimate of the difficulty index. If the goal of measurement is to maximize
differences between students, items with difficulties near .50 (between .30 and .70) are best. For
criterion assessments, a more restricted range is generally desired, with a mean difficulty value
of .80.
6. Subtract the number of correct answers made by the low group from the number of correct
answers by the high group.
7. Divide this number (the difference, H-L) by the number of individuals contained in the
subgroup (i.e., the number in the high [or low] group). This decimal number is the
discrimination index. Discrimination values can range from -1.00 to 1.00, with positive values
indicating items that favor the upper scoring group and negative values showing items that
favored the lower scoring group. Items are considered to be performing well when they have
relatively large positive discrimination values, meaning a larger portion of the high scoring
group responded correctly compared to the low scoring group.

An ideal item, at least from a statistical item-analysis standpoint, is one that all students in the
high group answer correctly and all students in the low group miss. In addition, the responses of
the low group should be evenly distributed among the incorrect alternatives.
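The seven steps above can be sketched in a few lines of Python. This is an illustrative sketch: the data layout (records of total score plus chosen options, and an answer key) is our own, and the 27% criterion-group fraction is one conventional choice within the 25-50 percent range mentioned in Step 2.

```python
# Minimal sketch of the item-analysis steps above. Names and data
# layout are illustrative assumptions, not a prescribed format.

def item_analysis(records, key, fraction=0.27):
    """Compute difficulty and discrimination indices per item."""
    # Steps 1-2: rank answer sheets by total score; take equal-sized
    # high- and low-scoring criterion groups.
    ranked = sorted(records, key=lambda r: r[0], reverse=True)
    n = max(1, int(len(ranked) * fraction))
    high, low = ranked[:n], ranked[-n:]

    results = {}
    for item, correct in key.items():
        # Step 3: count correct responses in each criterion group.
        high_c = sum(1 for _, answers in high if answers.get(item) == correct)
        low_c = sum(1 for _, answers in low if answers.get(item) == correct)
        results[item] = {
            # Steps 4-5: percent correct among the combined groups.
            "difficulty": 100.0 * (high_c + low_c) / (2 * n),
            # Steps 6-7: (H - L) / subgroup size, in the range -1.00..1.00.
            "discrimination": (high_c - low_c) / n,
        }
    return results

# Hypothetical data: ten examinees answering one four-option item.
records = [(95, {"q1": "A"}), (90, {"q1": "A"}), (80, {"q1": "A"}),
           (70, {"q1": "B"}), (60, {"q1": "A"}), (50, {"q1": "B"}),
           (40, {"q1": "A"}), (30, {"q1": "B"}), (20, {"q1": "B"}),
           (10, {"q1": "A"})]
stats = item_analysis(records, {"q1": "A"})
print(stats["q1"])  # difficulty 75.0, discrimination 0.5
```

A full implementation would also tally how often each distracter was chosen in the low group, to support the distracter-effectiveness check described above.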

c) Item Banking
The definition of item banking ranges from the simple to the more detailed and less generic. The
simple definition, according to Ward and Murray-Ward (1994), is a "collection of test items that
may be easily accessed for use in preparing exams". A more detailed definition, according to
McCallon and Schumacker (2002), is a "collection of test items that can be readily accessed for
use in preparing examinations ... normally computerized for ease of item storage and to facilitate
the generation of new tests. Each item ... is coded according to competency area and instructional
objective, as well as empirically derived data such as measures of item difficulty and
discrimination" (cited in Sclater & MacDonald, 2004).

An item bank is a repository of the test items that belong to a testing program, together with all
information pertaining to those items. In most testing applications the items are in multiple-choice
format, but any format can be used. Items are pulled from the bank and assigned to test forms for
use either as a paper-and-pencil test or in some form of e-assessment.

An item bank will not only include the text of each item, but also extensive information
regarding test development and psychometric characteristics of the items. Examples of such
information include: Item author, Date written, Item status (e.g., new, pilot, active, retired),
Angoff ratings, Correct answer, Item format, CTT statistics, IRT statistics, Linkage to test
blueprint, Item history (e.g., usage date(s) and reviews), User-defined fields.

The following steps could be used for item bank replenishment:

Step 1: Items needing to be replaced are identified.
Step 2: Item writers write new items.
Step 3: New items are reviewed. Approved items are passed into a pre-test item bank.
Step 4: During the operational process, pre-test items are selected and administered.
Step 5: After the examinee finishes the test, the item parameters of the administered pre-test
items are updated.
Step 6: Newly calibrated items are analyzed and, if approved, added to the operational
item bank.
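The metadata fields listed above can be sketched as a simple record type. This is an illustrative sketch only; the field names and status values are our assumptions, not a prescribed schema.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

# Minimal sketch of an item-bank record carrying the metadata fields
# mentioned above (author, date, status, statistics, blueprint linkage).

@dataclass
class BankItem:
    item_id: str
    text: str
    correct_answer: str
    author: str
    date_written: str
    status: str = "new"                    # new -> pilot -> active -> retired
    difficulty: Optional[float] = None     # CTT difficulty from pretesting
    discrimination: Optional[float] = None
    blueprint_cell: Optional[str] = None   # e.g. "Topic B x Applying"
    history: List[Tuple[str, str]] = field(default_factory=list)

    def promote(self, new_status: str) -> None:
        """Record a status change, e.g. after review or calibration."""
        self.history.append((self.status, new_status))
        self.status = new_status

# Hypothetical lifecycle mirroring the replenishment steps above.
item = BankItem("IT-001", "Sample stem ...", "C", "writer-1", "2019-02-01")
item.promote("pilot")   # Step 3: approved into the pre-test bank
item.promote("active")  # Step 6: calibrated into the operational bank
```

Keeping the status history on the record itself makes the "item history (usage dates and reviews)" field queryable without a separate audit log.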

6.8. Database for Exit Exam


The exit examination generates a great deal of data: banked test items, test scores, and detailed
background information on test takers. There should be a systematic electronic record of exit
exam data for historical and research purposes. The following suggestions are important in
relation to the exit database:
 Prepare a server computer that can store the large volume of data related to the exit exam;
 Generate a unique identification number for each test taker that will never duplicate that
of any other candidate in the history of the exit exam;
 Register test takers online, collecting detailed background information and a few relevant
research questions;
 Connect each test taker's unique identification number with the detailed item-level
responses and summarized scores on the exit examinations; this is required to conduct
research.
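One way to implement the unique-identifier suggestion above can be sketched as follows. The format (an "EXIT"-year prefix plus a random UUID fragment) is our assumption, not a mandated scheme.

```python
import uuid

# Sketch of generating a candidate identifier that will not repeat
# across exam years. The prefix and fragment length are illustrative.

def new_candidate_id(exam_year: int) -> str:
    """Return an identifier of the form EXIT-<year>-<12 hex chars>."""
    return f"EXIT-{exam_year}-{uuid.uuid4().hex[:12].upper()}"

print(new_candidate_id(2019))  # e.g. EXIT-2019-3F2A9C1B7D40
```

The 12-hex fragment carries 48 random bits, so collisions are vanishingly unlikely; a production database should nevertheless enforce uniqueness with a constraint on the identifier column.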

6.9. Guidelines for Test Booklet Compilation and Booklet Formation
 Keep a copy of the attendance roster in each envelope
 Include standby test materials
 An attendance list of candidates' registration numbers by examination type must be
included in the packaging
 Test booklets and other materials should be sealed within plastic bags
 Sign-in sheets showing candidates' names by examination, used to verify candidates'
signatures, must be prepared and included in the packaging of booklets
 All answer sheets must be packaged separately from the examination booklets
 Handle test booklets carefully; edges must not be damaged
 If it is necessary to use more than one envelope, number them consecutively
 Every supervisor must confirm that the correct examination center is indicated on each
package before departure
 Prepare a checklist to ascertain that all test centers have received the necessary documents
 The total number of test booklets must be the same as the number of test takers
 A list indicating candidates' names, registration numbers, and the number of booklets
sent for each examination must be prepared
 Include a supervisor's report form in every envelope
 Candidates' registration numbers must be kept confidential and should not be used for
purposes other than the exit exam
 A checklist detailing all the items must be completed and returned to the office
 Include a template form for the summary report of question papers, answer sheets, and
test irregularities

6.10. Guidelines for University Exit Exam Standard Setting: Using the Modified Angoff
Method

For high-stakes examinations like university exit examinations, a justified and legally defensible
cut score is required. To avoid an arbitrarily determined cut score, guidelines are needed for
establishing a cut score that yields a valid passing mark. Although there is a variety of
standard-setting approaches, the Modified Angoff Method is a well-researched, widely used, and
highly recommended procedure for setting standards in most educational assessments (Horn,
Ramos, Blumer, & Madaus, 2000; Hurtz & Auerbach, 2003).

The Angoff Method was first recommended by William H. Angoff (1971). Over the course of
nearly half a century since the initial proposal, several researchers have contributed to and
improved the originally proposed method, which is now known as the Modified Angoff Method.
These contributions have focused on refining the standard-setting procedure by improving the
reliability of the raters, minimizing errors, and improving the validity and defensibility of the cut
scores (Kane, 1994; Mehrens & Popham, 1992).

On theoretical and psychometric grounds, the cut-off score is considered more likely to
resemble the ‘true’ performance standard if there is consensus among experts on what
the cutoff score should be. Consensus does not guarantee accuracy, but if careful
procedures are used it is assumed to increase the probability of accuracy. Therefore, any
systematic effects of different procedural modifications on the degree of consensus
among judges will have implications for the desirability of those modifications (Hurtz &
Auerbach, 2003, pp. 585-586).

Why do we need a valid standard-setting procedure? 1) On ethical grounds, we need an accurate
standard. It is unethical to arbitrarily assign a cutoff score, which might result in failing those
who already possess the minimum competence and/or passing those who do not fulfill the
required minimum competence. 2) For valid decision making, we need a scientific procedure of
cutoff determination that is closer to reality and convincing to test takers and decision makers.

Definition
There are several terms used to refer to performance standards: cutoff score, passing scores, cut
scores, performance levels, achievement levels, mastery levels, proficiency levels, thresholds,
and standards (Hambleton, 2001). In this guideline, we shall use the term "cutoff score"
throughout to avoid any possible confusion, although the alternative terms mean much the same
thing. The cutoff score indicates the minimum standard of required knowledge or skill that every
test taker must meet. In other words, the cutoff score is defined as the average probability that a
minimally competent candidate correctly answers the test items; this average is taken as the pass
mark.

Brief overview of the procedure


The Modified Angoff Method assumes that the consensus reached among subject matter experts
comes close to reality in setting standards. The essence of the procedure is that subject matter
experts rate the probability that a minimally competent person would answer each item correctly.
The average of these probabilities across judges is the cutoff score. The following steps shall be
used to determine the cutoff score.

Step 1. Prepare Subject Matter Experts


Select 10-15 or more subject matter experts as raters or judges and convene them in a common
syndicate where they can work both individually and in groups. The more subject matter experts
are involved in the ratings, the more accurate the cutoff score will be. Subject matter experts
with the following characteristics need to be selected:
 Familiar with the tasks that the test will assess,
 Knowledgeable about the set of competencies of persons who will perform those tasks,
 Ability to pass the existing test at the current cutoff score,
 Ability to edit the test items for clarity, accuracy, spelling, and grammar,
Besides, the subject matter experts need to be an assembly of diverse groups in terms of gender,
age, ethnicity, and educational level.

Step 2. Have Subject Matter Experts take the Test
Before starting the rating of each item, the judges should take the test, and feedback should be
collected, especially on the objectives, design, and wording of each item. This feedback should
be used to revise the test items before the ratings begin.

Step 3. Define the Minimally Competent Candidate


Who is the minimally competent or borderline candidate? The subject matter experts (SMEs) are
guided to imagine a hypothetical person who has minimal knowledge and skill after completing
a given training. The SMEs need to reach consensus on the conceptualization of the minimally
competent candidate before starting the ratings. A minimally competent candidate is one who
 has the least amount of education necessary to perform a task4 for which he or she is
trained,
 performs the tasks in the field (not as a student),
 meets standards although barely,
 is borderline in terms of task performance but is acceptable,

Step 4. Start Rating Each Item in Round 1


The original proposal asks subject matter experts to estimate the probability that a minimally
competent candidate would answer an item correctly. Because thinking in terms of probability
can be understood differently by different people, a modified version was suggested. In the
modified approach, the subject matter experts, as judges, are asked to imagine a pool of 100
minimally competent or borderline candidates and to estimate the number of these candidates
who would answer the item correctly. They proceed in this way until all items are rated.

The following Test Item Rating Form could be used to record the estimated percentage of
minimally competent candidates (for instance, "how many out of 100 minimally competent
candidates would answer the question correctly?").

4
The tasks or jobs need to be defined after deciding for which minimum competence is required.

Table 8: Test Item Rating Form

Course Name: ____________            Test Name: ____________
Rater Name: ____________             Date: ____________

Instructions: Review each test item and estimate how many of 100 minimally competent
candidates would answer the item correctly. Do not provide ratings greater than 95 or less
than 25 (a perfect score of 100% correct is unlikely, and correctly guessing a four-option item
would succeed about one time in four). Increase or decrease ratings in steps of 5 percentage
points (e.g., 60, 65, 70, 75).

TEST ITEM #    PERCENTAGE (%) CORRECT
1
2
3
etc.

Here are some tips to consider; the raters should:

 periodically think of the minimally competent candidate during their ratings,
 receive both the Test Item Rating Form and the test items,
 be provided with the answer keys so that they do not underestimate item difficulty,
 rate each item independently of the other raters,
 be allowed about 2 hours for a set of 100 items.

Step 5. Discuss and Review Item Ratings to Identify Disagreements


After completing the individual ratings of each item, the subject matter experts have to convene
to review the ratings.
 Collect the forms with ratings and enter the results in Microsoft Excel for all raters (you
may use their names or any other convenient codes); the Expert Rating Spreadsheet could
be used to collect data from each individual rater.
 Calculate the average or mean of the ratings for each item,
 Find the standard deviation of the subject matter experts' ratings for each item,

 Watch the standard deviation for each item. A lower standard deviation means more
agreement; a high standard deviation, usually above 10, means more disagreement.
 Discuss the reasons for variation among raters and re-evaluate the estimates.

Step 6. Re-rate items on Round 2


In this second round the panel convenes and discusses the degree of agreement as shown by the
standard deviations calculated on the Expert Rating Spreadsheet. The rationale behind re-rating
the items is to reach a consensus with high inter-rater reliability, i.e., a standard deviation below
10 for each item. If an item has a standard deviation greater than 10, there will be a second
chance to independently revisit and re-rate it. You may exclude a rater's ratings 1) if the judge
gives the same value or rating for all items, or 2) if the rater's scores are outliers in contrast to
all other raters.

Table 9: Expert Rating Spreadsheet

Course Name: ____________              Test Name: ____________
Angoff Facilitator Name: ____________  Date: ____________

Test Item #   Average % Correct   Expert 1 (name)   Expert 2 (name)   ...   Expert n (name)   Standard Deviation
1
2
3
...
n

The bottom row of the spreadsheet holds each expert's mean rating (Mean 1, Mean 2, ...) and
the overall mean of the item averages, i.e., the cutoff score.

Step 7. Determine the Cutoff Score

Once consensus is reached on the expert ratings, meaning a low standard deviation and high
inter-rater reliability for each item, we can proceed to determine the cutoff score. We can use the
Microsoft Excel spreadsheet to calculate it:
 Add the average percentage correct for all items,
 Divide the overall sum by the number of test items to get the overall mean,
 The rounded overall mean is the cutoff score.

Note: if an item is revised or modified, whether because it is defective or for any other reason,
the Angoff Method should be re-applied to that item before the cutoff score is determined.
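Steps 5-7 above can be sketched in a few lines of Python. This is an illustrative sketch with invented ratings; the function name and data layout are our own.

```python
import statistics

# Sketch of the spreadsheet arithmetic from Steps 5-7. Each entry in
# `ratings` holds one item's judge estimates (percent of 100 minimally
# competent candidates expected to answer correctly).

def angoff_summary(ratings):
    """Return (cutoff score, items flagged for Round 2 re-rating)."""
    flagged = []
    item_means = {}
    for item, judge_scores in ratings.items():
        item_means[item] = statistics.mean(judge_scores)
        # Step 5: a standard deviation above 10 signals disagreement.
        if statistics.stdev(judge_scores) > 10:
            flagged.append(item)
    # Step 7: the rounded overall mean of the item means is the cutoff.
    cutoff = round(sum(item_means.values()) / len(item_means))
    return cutoff, flagged

ratings = {
    1: [60, 65, 70, 60, 65],   # close agreement
    2: [40, 75, 50, 80, 45],   # wide disagreement -> re-rate in Round 2
    3: [70, 70, 75, 80, 70],
}
cutoff, flagged = angoff_summary(ratings)
print(cutoff, flagged)  # 65 [2]
```

In practice the flagged items would go back to the panel for discussion and re-rating before the final cutoff is computed, exactly as Step 6 describes.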

6.11. Guidelines for University Exit Exam Security

Exam security is priority number one. If examination materials are lost, stolen, copied, leaked,
or otherwise exposed to a breach, everything is lost. This guideline is therefore very important
for protecting the integrity and security of the university exit examinations through meticulous
planning that ensures test security. Its purpose is to guard against any possible breaches and
risks through careful planning and preparation, involving all stakeholders in that common
purpose.

Principles of test security


 Test security demands that there should be no conflict of interest among personnel
involved in the testing process.
o Clearly display that only authorized personnel can access the exam planning,
development, storage, scoring, and reporting rooms. Even among those involved
in the testing process, resources should be accessible or access-denied
depending on who is authorized to access what. Such authorization should be
defined in detail after the structure and human resources are set up.

o Declare and sign a test security agreement that stipulates not only the declaration
of the non-existence of any conflict of interest with the testing but also showing
compliance to the guidelines on test security by all personnel.
o Conflict of interest arises when personnel working in the testing office have any
possibility of breaching test security, including:
 Having a test-taking family member (spouse, child, parent, or sibling) or
close extended family member (grandchild, aunt, uncle, cousin, nephew,
or niece) during the exam year.
 Talking about, sharing, or exposing the process or content of the exam
through oral, written, electronic, or any other media.
 Testing demands the integrity of personnel. Anyone involved in the testing process needs
to be recommended not only for the required competencies but also for their integrity
and work ethics. This should be emphasized during recruitment, although it needs to be
checked throughout the testing cycle (i.e., from the design stage all the way through to
reporting test scores).

Test security is a priority at all stages of testing, i.e., during the planning, development,
compilation, transportation, and administration phases.
 The development of the test blueprint needs to be as secure as the development of the test
items. All personnel involved in the design and write-up of the test blueprint must have
not only the caliber to prepare it but also the integrity and professionalism to maintain the
security of the blueprint material they are responsible for developing, upgrading, or
amending.
 No one is allowed to access the test blueprint or table of specification except those
authorized to develop, revise/update, or use it for test item development. Sharing the test
blueprint with other testing personnel, experts in the field, test takers, or any other
unauthorized person is strictly prohibited.
 Any recording of discussions, which could be through paper and pencil or electronic
devices or otherwise, must be kept in secure locations in such a way that the contents of
the designing of the test should not be exposed to unauthorized persons even among the
personnel in the testing office.

 Sharing the contents of the test blueprint with colleagues, spouses, family members, or
any other third party is strictly forbidden.
 Computers used for developing the test items must be offline from any internet
connection. Sharing of exam contents through emails is not allowed.
 Prepare to protect the exit exam papers and answer sheets, including any scraps, from
fire, flood, theft, copying, or robbery during exam transportation, administration, scoring,
reporting, and storage.
 Deploy security guards and cameras to support the test security efforts as appropriate.
 Test takers should not be allowed to bring electronic devices such as smart phones, smart
watches, smart eye glasses or any smart ornaments to the examination hall.

7. Summary of findings

A technical team drawn from Addis Ababa University was assigned to produce a strategic
document for instituting exit examination for prospective graduates of higher education
institutions in Ethiopia. The strategic document was meant to serve two major objectives. These
are to develop guidelines that would give direction on how the exit examination could be
designed, developed, standardized, administered, scored, reported, and to propose strategies for
institutionalization of the examination in Ethiopia. In order to achieve these two broad
objectives, the team conducted desk review of existing documents including international and
national experiences, and interviewed key stakeholders and conducted Focus Group Discussions
with experts drawn from the Federal Ministry of Education and other stakeholders. The
following are major points that came out from data obtained from different sources.

 A review of relevant documents, such as the proclamations establishing institutions
working in the area of higher education and the national testing program, reports and
experiences compiled from the administration of the LL.B Exit Examination and the
Health Professionals Licensure Examination, and reviews of international experiences,
showed that exit examinations contribute to quality education in higher education
institutions and serve as a means of ensuring minimum competencies among university
graduates, despite variations in how the universities were established.

 Results showed that existing institutions such as HESC, HERQA, NEAEA and the Addis
Ababa University Testing Center are potential candidates to administer such a high-stakes
testing program, provided that they are supported by policy and by the necessary human
and material resources.

 Research participants underlined that all-round preparation and readiness are needed
before launching this high-stakes undertaking, and that it is essential to implement the
program with high ethical standards and integrity, taking note of the prevailing conditions
in the respective universities and in the nation at large.

 In order to achieve a high standard in the implementation of the exit examination, it is
essential to be transparent, to have a shared vision of its objectives, to make the exit
examination an integral component of the curriculum, and to map out activities from
planning through test administration and interpretation of results. This would increase the
credibility of the examination among all stakeholders, including students and professional
associations.

 Introducing an exit examination into higher education institutions would likely enhance
institutional autonomy and accountability, and thereby improve the teaching-learning
process and ascertain the achievement of the minimum learning competencies in major
learning areas.

 Review of international experiences showed that exit examinations in some countries are
used for licensing professionals after completion of university education or as a
graduation requirement, depending on legislation and country contexts.

 A review of the Ethiopian Law exit exam showed that the examination generally lacked
adequate coverage and clarity in its objectives, and that the items were of poor quality,
with linguistic and conceptual problems; moreover, the examination suffered from
inadequate attention paid by the respective law schools to preparing students for this
high-stakes testing program.

 On the primary purpose of the Ethiopian university exit examination, some interviewees
argued that it must be used as a requirement for graduation, while others noted that
licensure examinations need to be introduced to distinguish those who fulfill the
minimum requirements for doing the job from those who do not. Although respondents
varied on whether the exam should serve as a graduation requirement or as a licensure for
employment, universities have to revise their curricula to include the exit examination in
their programs, with a grade value of pass or fail, and have it approved by the respective
university Senate.

 With regard to testing time, interviewees suggested a two-phase approach to the
administration of the exit examination: one sitting mid-way through and one at the end of
the academic program. The coverage of the content should be comprehensive, including
both core subjects and basic courses, and the format of the examination should be
multiple choice questions (MCQ) at the start, with performance-based assessment
introduced gradually in due course.
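A pass/fail grade value for an MCQ examination presupposes a defensible cut score. One common standard-setting procedure is the Angoff method, which appears in this report's references. The Python sketch below, using invented judge ratings and candidate scores purely for illustration (none of the numbers come from this report), shows the basic arithmetic of deriving a cut score from judges' item-level estimates and applying it to raw scores.

```python
# Illustrative sketch only: deriving a pass/fail cut score for an MCQ exam
# with the Angoff standard-setting method. All judge ratings and candidate
# scores below are invented for demonstration; nothing here reproduces the
# report's own data or procedure.

def angoff_cut_score(judge_ratings):
    """Average each judge's per-item estimates, then average across judges.

    judge_ratings: one list per judge of estimated probabilities (0.0-1.0)
    that a minimally competent candidate answers each item correctly.
    Returns the cut score as a proportion of items correct.
    """
    judge_means = [sum(items) / len(items) for items in judge_ratings]
    return sum(judge_means) / len(judge_means)

def classify(candidate_scores, cut_proportion, n_items):
    """Map raw MCQ scores to 'pass'/'fail' against the cut score."""
    cut_raw = cut_proportion * n_items
    return {name: ("pass" if score >= cut_raw else "fail")
            for name, score in candidate_scores.items()}

# Three hypothetical judges rating a 10-item test.
ratings = [
    [0.8, 0.6, 0.7, 0.5, 0.9, 0.6, 0.7, 0.8, 0.5, 0.6],
    [0.7, 0.5, 0.6, 0.6, 0.8, 0.7, 0.6, 0.7, 0.6, 0.5],
    [0.9, 0.6, 0.8, 0.5, 0.9, 0.5, 0.7, 0.8, 0.4, 0.6],
]
cut = angoff_cut_score(ratings)  # about 0.66, i.e. roughly 6.6 of 10 items
results = classify({"A": 8, "B": 6}, cut, n_items=10)
```

A real standard-setting exercise would involve trained judges, multiple rounds informed by impact data, and validation of the resulting standard (see Kane, 1994, in the references); this sketch shows only the averaging arithmetic.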

 There is general consensus among research participants on the role of exit exams in
improving the quality of education. They stated that, through feedback obtained from test
results, institutions would identify gaps, motivate students and teachers, and build the
confidence of the private sector when employing graduates.

 Although it is difficult to introduce double standards, participants agreed that special
provisions have to be in place for students with different forms of disability. In addition,
interviewees and FGD participants were all of the opinion that re-sit opportunities have to
be given to students who are unable to attain the minimum pass grade on their first
attempt, and that support programs have to be planned to capacitate those students unable
to achieve the minimum standard.

 It was learned from the interviews as well as the discussions that the program may face
some resistance from students, from teachers working in different higher education
institutions, and even from parents. Accordingly, making the necessary psychological and
physical preparations ahead of time would minimize the challenges related to its
implementation.

 On the legal mandate of the existing institutions to administer the exit examination,
research participants stated that HESC, HERQA and NEAEA are neither legally entitled
nor institutionally capable of shouldering this high-stakes responsibility in their present
condition. They opted for a separate, independent and autonomous institution accountable
to the government.

8. Scenarios for Instituting Exit Examination

Based on the various interviews conducted, the following scenarios, with their merits and
demerits, are presented for further discussion and ultimate decision:

Table 10: Advantages and disadvantages of scenarios for instituting exit examination

1. Purpose of the exit exam

Option 1. Graduation requirement: fulfilling the minimum standard or required competence
 Advantages: provides feedback for institutional improvements; makes universities accountable for lack of efficiency; enhances the standardization and comparability of university outcomes; motivates the achievement of high standards.
 Disadvantages: burdens universities in handling re-sit examinations; many students may not get their degree.
 Researchers' comment: Option 1 could be implemented in the short term.

Option 2. Licensure examination: fit for the required job, or fit for purpose
 Advantages: provides feedback to institutions and assures the quality of graduates to employers; enhances university-industry linkage; enhances the credibility of programs with associations and employers.
 Disadvantages: focuses on the job rather than on improving the quality of training; some graduates will be out of the market; may diminish the credibility of HEIs.
 Researchers' comment: Option 2 could be a long-term issue; there will be an issue of mandate if universities implement it.

Option 3. Focus on rigorous university admission and placement criteria, in contrast to the current loose approach
 Advantages: ensures the admission of highly competent students to a university, and thus the quality of would-be graduates; increases the efficiency and credibility of the universities with employers.
 Disadvantages: decline in the gross enrollment ratio of university students; equitable access to higher education enrollment will be challenged.
 Researchers' comment: Option 3 could be implemented in the intermediate term.

2. Instituting the exit exam

Option 1. Establish an independent exit exam implementing institution accountable to the Office of the Prime Minister
 Advantages: enhances accountability; enhances the quality of the testing programs; reduces conflict of interest between MOE and the independent implementing institution.
 Disadvantages: requires the establishment of a new institution, which demands more time; may incur high cost in establishing the new institution.
 Researchers' comment: long term.

Option 2. Use existing institutions by forming a consortium (NEAEA, IER, HERQA, HESC, ICT, etc.) for the implementation of the exit exam, accountable to the Ministry of Science and Higher Education
 Advantages: can be implemented immediately; would enable use of the existing resources and accumulated knowledge; may enable a new blend and better synergy.
 Disadvantages: challenges in coordinating the various institutions; may entail diffusion of responsibilities.
 Researchers' comment: Option 2 seems possible for the short term or immediate implementation.

Option 3. Establish one directorate within the Ministry of Science and Higher Education to implement the exit examination
 Advantages: contains the agenda on the exit exam within the Ministry's programs.
 Disadvantages: requires a new institutional establishment; potential conflict of interest (trainer vs. evaluator); makes the testing program highly bureaucratic.
 Researchers' comment: there will be an issue of mandate, since the exit exam is the mandate of universities.

3. Testing format

Option 1. Implement the exit exam in paper-and-pencil format
 Advantages: familiar approach.
 Disadvantages: bulky duplication of papers, and huge transportation and logistics issues; difficult to guarantee test security and integrity.
 Researchers' comment: costly.

Option 2. Implement the exit exam in electronic format, i.e., computer-based testing
 Advantages: more efficient and cost-effective for administering the exam; increases the security and integrity of the testing process.
 Disadvantages: doubts about feasibility (unreliable computers, internet and power supply); may require HEIs to have high-tech IT infrastructure.
 Researchers' comment: long term.

Option 3. Implement the exit exam using a combination of paper-and-pencil and computer-based testing formats
 Advantages: cost-effective approach which can lead to a gradual transition to full-fledged computer-based testing.
 Disadvantages: lack of uniformity in test administration; may threaten test security and integrity when administered in two modalities.
 Researchers' comment: short term.

4. Exit exam administration phases

Option 1. Administer the exit exam twice: at mid-term and at the end of the academic program
 Advantages: provides a better opportunity for test takers to adjust themselves for the second half of the test; enables tracking the performance of test takers; decreases the fail rate.
 Disadvantages: doubles the cost of test administration; no international experience in this regard.
 Researchers' comment: costly but useful.

Option 2. Administer the exit exam only once, at the end of the academic program
 Advantages: cost-effective.
 Disadvantages: gives no further chance for improvement while in the system; a one-time testing program may not adequately measure student performance.
 Researchers' comment: cost-effective, but gives no further chance.

5. Re-sit or re-examination opportunities

Option 1. Open opportunity for re-examination, indefinite or unlimited
 Advantages: assures the right of candidates to sit for re-examination at any time.
 Disadvantages: costly and inefficient; burdens universities and the regulatory agency.
 Researchers' comment: long term.

Option 2. Second re-sit chance: fee-free, with face-to-face and online university support
 Advantages: reduces the burden on universities and the regulatory agency; the support could be face-to-face and/or online.
 Disadvantages: costly and inefficient.
 Researchers' comment: short term.

Option 3. Third re-sit chance: the candidate pays the fees and receives no face-to-face university support, only online support and self-support
 Advantages: broadens chances for candidates to improve their performance independently.
 Disadvantages: candidates may find it difficult to pay, or to prepare themselves for the test on their own.
 Researchers' comment: long term.

Option 4. Fourth and last chance of re-examination, comprising: (a) the candidate pays the fees and receives no face-to-face university support, only online support; (b) down-grading of the level of qualification following unsuccessful re-examination, i.e., a diploma or advanced diploma instead of a degree; (c) linking such candidates with further skills training, including TVET; and (d) denial of certification for those who still do not reach the required level
 Advantages: reduces the inefficiency of universities carrying large numbers of candidates with re-sit status; mitigates the wastage of trained manpower through down-graded certification.
 Disadvantages: inefficiency; mismatch between curriculum and certification.
 Researchers' comment: recommended as a short-term and immediate intervention.

Annex 1. Discourse over the Need for Exit Exam in the USA

The following excerpts were taken from http://time.com/2187/the-new-college-exam-a-test-to-graduate/. The discourse justifies the importance of exit examinations as a requirement for graduation from universities in the USA:

Ohio this year started testing candidates for education degrees before they
graduate. The Wisconsin Technical College System requires its graduating
students to take tests, or to submit portfolios, research papers or other
proof of what they know. And all undergraduates at the University of Central Missouri have to
pass a test before they are allowed to graduate. Such activity is up
“significantly,” according to a new report from the National Institute for
Learning Outcomes Assessment.
The trend unmasks a flabbergasting reality: that those expensive university
degrees may not actually prove a graduate is sufficiently educated to
compete in the workforce. And it advances the seemingly obvious
proposition that students should be made to show they are before they get
one.
“Isn’t it amazing that the newest and most brilliant idea out there is that
students should achieve particular skills and prove it?” Marsha Watson,
president of the Association for the Assessment of Learning in Higher
Education, asked wryly. “Wow.”

Faculty grades fail to do this, advocates for testing say.
Forty-three percent of grades given out by college faculty are A’s,
according to research published by Teachers College at Columbia
University. Yet one-half of students about to graduate from four-year
colleges and 75 percent at two-year schools fall below the “proficient”
level of literacy, according to a survey by the American Institutes for
Research. That means they’re unable to complete such real-world tasks as

comparing credit-card offers with different interest rates, or summarizing
the two sides of an argument.
“It’s really bad news, and it’s gotten worse,” said Margaret Miller, a
professor at the University of Virginia’s Center for the Study of Higher
Education and an expert on assessing learning.

A separate survey of employers by an association of universities found that
more than 40 percent don’t think colleges are teaching students what they
need to know to succeed. One-third say graduates aren’t qualified for even
entry-level work.

“I had a syllabus, and I had what the outcomes of the course would be, and
those included critical thinking,” said Julie Carnahan, who taught public
policy and organizational management before she took her current job
working on assessment at the State Higher Education Executive Officers
Association. “In retrospect, now that I know so much more, I didn’t ever
test students to determine if they could actually demonstrate critical
thinking.”
So most students pass their courses, said Watson, the former director of
assessment at the University of Kentucky, and “are given degrees because
they accumulate credits. Each credit hour is assumed to be a metric of
learning, and it’s not. The only thing it’s a metric of is that your butt was in
a seat.”

Many of the exit tests now being tried are similar to the kinds of practical
licensing exams that candidates for nursing degrees have to take, which
require them to prove in the real world that they can apply what they've
learned in a classroom. “Nobody wants a nurse who’s only taken written
tests,” Watson said. “What you want is a nurse who has some experience
before they jab a needle into your arm.”

In Ohio, for example, candidates for education degrees have to write a
lesson plan and make videos of themselves teaching.
But introducing new ways of measuring what students learn is time-
consuming, complicated and expensive—not to mention resisted by
universities fearful the results will be used to compare them with
competitors. And the ones that have the most at stake are universities and
colleges already assumed to be among the best.

“They hate it. They hate it,” Miller said. “They already have the reputation
for educating students well, so they can only lose.”

The more selective an institution's admission standards, the National
Institute for Learning Outcomes Assessment report found, the less likely it
tests what its students know.

“They have everything to lose and nothing to gain,” Watson said.
That’s one reason why the move to establish exit tests is starting slowly,
and the standards remain comparatively low. The cutoff score in the
University of Central Missouri exit exam is below the lowest level of
proficiency and exemptions are made for students with learning disabilities
or whose native language is something other than English. No one there in
the last three years, or ever in Wisconsin, has been blocked from
graduating because of a poor exit-test score.

In other places, students are being tested not to determine whether or not
they should be allowed to graduate, but to check for strengths and
weaknesses within specific majors or campuses. Some colleges and states,
including Ohio, let students and their families see the results. Others don’t,
or make them hard to find. In all, only about a third of colleges and
universities make assessment results public, according to the National
Institute for Learning Outcomes Assessment report.

More and more states, including Missouri, Pennsylvania, and South
Carolina, have approved using student exit-test results to determine how
institutions are doing as one of the measures on which funding is based.
Almost 50 colleges and universities in nine states are trying to develop a
way to test students, before they graduate, in written communication and
quantitative literacy. So far these metrics are only used for evaluating their
own programs, not to judge individual students or decide whether they’ve
earned degrees.
“We want to be very careful,” said Carnahan, who is coordinating the
project. “We don’t want this process to end up where states are being
ranked. What we hope to do in the short term is to only look at the data by
sector across states and not identify institutions. That’s really critical until
we can be sure that this paradigm we’re looking at is valid.”
Rather than wait for that to happen, some students and employers are
taking things into their own hands.
“We can see that in the portfolios that are coming, where it’s not just,
‘Here’s my GPA,’ but, ‘Here’s my work as well, and what I’ve learned from
my internships and classes,’” said Angela Taylor, director of quality
enhancement at Texas Christian University. And some employers are now
testing job applicants themselves to see if they know what their college
degrees say they do.
All of the 42,000 students at Western Governors University, an online
university founded by the governors of 19 states, have to prove what they
know, by getting at least a B on an assessment test, not only before
receiving a degree, but before completing any course.
“We want to be sure that they leave with more than they started with,” said
Joan Mitchell, the university’s spokeswoman.
“At some point in our world,” she said, “we’re going to have to look at,
‘Do you know it? Have you mastered it?’ It’s a whole cultural shift.”
This story was produced by The Hechinger Report, a nonprofit, nonpartisan
education news outlet based at Teachers College, Columbia University


Annex 2. Organogram for Instituting Exit Examination

[Organogram, reconstructed from the original layout: a National Exit Examination Management Board, whose accountability is marked "to be decided", oversees a Director of Testing, supported by an Office Manager. Under the Director sit the Testing, Technical, Administrative and Research Departments, comprising a test preparation and development unit, a test administration and management unit, a test analysis and reporting unit, a publications unit, and an administration and finance unit. Specific job titles are left for later decision (marked TBD, or To Be Decided).]

References

Al Ahmad, M., Al Marzouqi, A. H., & Hussien, M. (2014). Exit exam as academic performance
indicator. TOJET: The Turkish Online Journal of Educational Technology, 13(3).
Angoff, W. H. (1971). Scales, norms and equivalent scores. In R. L. Thorndike (Ed.),
Educational measurement (pp. 508-600). Washington, D.C.: American Council on
Education.
Belay Hagos (2018). Trends of results in exit and licensure exams in Ethiopia. Unpublished
manuscript, Institute of Educational Research, Addis Ababa University.
Boud, D. (1994). Keynote speech at the SEDA Conference on Assessment, Telford, May 1994.
Brown, S. (2003). Institutional strategy for assessment. In S. Brown & A. Glasner (Eds.),
Assessment matters in higher education: Choosing and using diverse approaches. The
Society for Research into Higher Education & Open University Press.
Brown, S., & Knight, P. (2004). Assessing learners in higher education. London:
RoutledgeFalmer, Taylor & Francis Group.
Dehuri, R. K., & Samal, J. (2017). "Exit exams" for medical graduates: A guarantee of quality?
Indian Journal of Medical Ethics, 2(3), 190-193. DOI 10.20529/IJME.2017.037
Gibbs, G. (2003). Using assessment strategically to change the way students learn. In S. Brown
& A. Glasner (Eds.), Assessment matters in higher education: Choosing and using diverse
approaches. The Society for Research into Higher Education & Open University Press.
Horn, C., Ramos, M., Blumer, I., Madaus, G., Peter, S., & Carolyn, A. (2000). Cut scores:
Results may vary. National Board on Educational Testing and Public Policy (NBETPP)
Monographs, 1(1).
Hurtz, G. M., & Auerbach, M. A. (2003). A meta-analysis of the effects of modifications to the
Angoff method on cutoff scores and judgement consensus. Educational and
Psychological Measurement, 63(4), 584-601.
Kane, M. (1994). Validating the performance standards associated with passing scores. Review
of Educational Research, 64, 425-461.
Mehrens, W. A., & Alexander, R. A. (1992). How to evaluate the legal defensibility of high-
stakes tests? Applied Measurement in Education, 5, 265-283.
MoE (2007). Education Statistics Annual Abstract 2000 E.C. (2007-2008). Addis Ababa:
Ministry of Education, Education Sector Development Program Planning and Policy
Analysis Department.
Winkler, D. R., & Gershberg, A. I. (2003). Education decentralization in Africa: A review of
recent policy and practice. Research Triangle Institute & The New School.
