OpenEMIS Jordan Evaluation 2017
The OpenEMIS project was highly relevant to the beneficiaries' needs and the project model was contextually appropriate - although the time and resources needed to train key MoE stakeholders and build capacity on the management and use of the system were underestimated. The project made a strong effort to adapt and respond to changes in the internal and external context, some of which negatively influenced the implementation of the core OpenEMIS elements of the project, resulting in delays. The UNESCO core project team's resources appear to have been used efficiently, but there were some inefficiencies in how CSF resources were deployed. The amount of work delivered under the project is considerable, and as such cost efficiency is assessed to be good. UNESCO's planning, management and coordination of the project was adequate overall, although certain areas could have been improved - in particular the project's M&E strategy. Beneficiaries were overall satisfied with the services delivered, but recognize the need for additional capacity building if the project's achievements are to be sustainable.
The evaluation identified the following key lessons learned:
1. The design of the project underestimated the magnitude of implementing OpenEMIS in a country of Jordan's size and the corresponding capacity building needs of MoE.
2. A disconnect between UNESCO/MoE's initial conception of OpenEMIS and the subsequent vision of other key MoE stakeholders for the system's purpose and functionality delayed and complicated OpenEMIS implementation.
3. A lack of clearly defined roles and responsibilities within MoE in relation to the EMIS negatively impacted project implementation and sustainability.
4. The project adapted well to unanticipated external developments, but these had an unavoidable negative impact on project implementation.
5. Top-level MoE engagement and support is crucial for the successful implementation and sustainability of a large-scale technical assistance project such as OpenEMIS.
The following recommendations are made for any future iteration of the project:
1. Continue building QRC's capacity to maintain and manage the core IT aspects of the OpenEMIS platform, but identify a more suitable MoE counterpart for strengthening the link between EMIS data and its utilization for evidence-based policy formulation.
2. Incorporate broader institutional support for building a culture of evidence-based policy making in MoE.
3. Conduct a comprehensive stakeholder consultation to ensure cross-MoE consensus on the primary features and functionalities of the OpenEMIS platform, and clarify institutional roles and responsibilities in relation to the EMIS.
5. Ensure the project is coordinated and aligned with other donor support to MoE relating to planning, M&E, research and evidence-based policy making.
2. EVALUATION PURPOSE & SCOPE
UNESCO Amman in November 2017 commissioned Altai Consulting to conduct an End-of-Project
Evaluation of the European Union-funded project 'Technical Assistance to Enhance Accessibility and
Use of the Jordanian MoE EMIS for Evidence-based Policy Formulation .' The project began in February
2014 and concluded on 30 November 2017 .
According to the Terms of Reference (ToR), the main purpose of the evaluation is to assess the performance (activities, outputs, outcomes) of the project and to generate lessons and recommendations for UNESCO to improve the planning, implementation, management, and monitoring and evaluation of future similar interventions.
In addition, according to the ToR the evaluation should:
• Assess the achievement of the project outputs and target indicators and their contribution to the outcomes
• Assess the effectiveness and efficacy of the project in meeting the stipulated results
• Draw lessons for improving the design and management of similar projects in the future, including identifying issues encountered during implementation and the level of engagement and role of the programme stakeholders
• Formulate recommendations for strengthening accountability, including exit strategy and sustainability mechanisms
• Consider the relevance, effectiveness, efficiency, impact and sustainability of the overall project
• Assess the extent to which gender considerations were mainstreamed in the project
3. EVALUATION METHODOLOGY
3.1. OVERVIEW OF EVALUATION METHODOLOGY
The methodology for this final evaluation comprises:
• Review of secondary data. This includes project documentation (including progress reports); EU, UNESCO and Government of Jordan strategic documentation; evaluations and assessments already carried out; and implementing partner surveys. A list of the secondary data reviewed as part of the evaluation is provided in Annex 1.
• Field observation visits and interviews with beneficiaries at project implementation sites (selected Ministry of Education Field Directorates and schools) in Jordan. A list of the Field Directorates and schools where field observation visits and interviews were conducted is provided in Annex 3. The interview guidelines that were used to structure interviews with beneficiaries are provided in Annex 4.
• Other key interviews with stakeholders (UNESCO, EU, partners).
• Review of secondary data.
For each evaluation area, the same data sources and methods were used: data sources comprised MoE key project counterparts (Planning & Research Department, QRC, ETC), beneficiaries (field directorates and schools), other key stakeholders (UNESCO, EU, partners) and secondary data; data collection comprised qualitative interviews with key MoE stakeholders, qualitative interviews with other key stakeholders (UNESCO, EU, partners), field observation visits and interviews with beneficiaries, and review of secondary data; analysis comprised qualitative analysis of interview output and assessment of secondary data.
The evaluation matrix maps the project's outputs and activities to the following indicators:
• Outputs/deliverables (an operational OpenEMIS with current data on institutions, students, teachers and staff; tools and products developed to assist key stakeholders in decision support with existing education data; an integrated education decision support system established and used by the education system in the country): Indicator 3: OpenEMIS operational and owned by the government; Indicator 4: OpenEMIS administration team with the capacity to sustain the use of the system; Indicator 5: Launch of an integrated education decision support system with data from OpenEMIS and other relevant public-sector databases.
• Year 1 Activities: Indicator 6: OpenEMIS loaded with existing education data; Indicator 7: OpenEMIS used with other data visualization and reporting tools to generate key planning and management reports; Indicator 8: A team of key MoE officers trained in use of education data for better planning and management.
• Year 2 & 3 Activities: Indicator 9: OpenEMIS piloted in 50+ schools; Indicator 10: OpenEMIS scaled up to cover all schools, students, teachers and staff.
• Year 3 & 4 Activities: Indicator 11: OpenEMIS integrated with other public databases to establish an education decision-support system; Indicator 12: Education simulation modelling capacity building and training provided to government officers; Indicator 13: Availability of a reviewed National Teacher Professional Standards; Indicator 14: Availability of generic frameworks for teacher professional development and teacher induction and in-service programmes; Indicator 18: Availability of a GIS platform to map education inequities, support decision-making for rationalization of the school network and teacher allocation across schools; Indicator 20: Availability of a MoE VE work plan for implementation of the decisions stemming from the National Conference on Human Resource Development Strategy.
• Overall assessment against the evaluation criteria, including effectiveness, impact and sustainability.
• Mainstreaming of gender considerations: gender considerations adequately assessed and mainstreamed in project design and implementation.
4. MAIN FINDINGS
4.1. DEVELOPMENT OBJECTIVE: TO IMPROVE EDUCATION PLANNING AND
MANAGEMENT IN JORDAN
Indicator 1: Timely, relevant, integrated education data available on demand for decision
support
1 Interview with Costanza Farina, Director of UNESCO Amman Office, 27 December 2017
Basis for assessment
• Secondary data
• Interviews with key project stakeholders and implementation and sub-contracted partners
• Field observation visits and interviews with beneficiaries at project implementation sites
Assessment
This indicator has been partially achieved: timely, relevant and integrated education data is available through the enhanced EMIS, although its systematic exploitation for decision support and evidence-based policy-making is yet to be fully realized.
The education data available through the OpenEMIS platform is timely in that the system is updated with data from the 2016-2017 school year (and data collection for the 2017-2018 school year appears to be progressing well); relevant in that an extensive scoping phase and continuous customization of the system ensured the type and format of data is coherent with the Jordanian education system and meets MoE requirements; and integrated in that the OpenEMIS system is linked to a series of other relevant public-sector databases.
That said, the core MoE EMIS team's limited capacity for producing custom reports and indicators for other MoE departments means that much of this data cannot in practice be said to be available on demand for the MoE at large. Consequently, the extent to which it is used to systematically support decision-making - beyond responding to ad hoc requests for data and the generation of a limited number of simple reports - is limited.
• Secondary data
• Interviews with key project stakeholders and implementation and sub-contracted partners
• Field observation visits and interviews with beneficiaries at project implementation sites
Assessment
This indicator has been partially achieved: there are many examples of the OpenEMIS platform being used to respond to ad hoc requests for data and to generate a limited number of basic reports, as well as of workflows being used to facilitate some MoE administrative procedures; however, there is little evidence of the EMIS being used systematically for evidence-based policy formulation.
The Head of the MoE Strategic Planning Department reported that the most important ways in which his department uses EMIS data are to determine whether a single-shift school should be upgraded to a double-shift school (through an indicator derived from the number of students in relation to the number of teachers and the sizes of classrooms), and to determine whether more schools need to be built in a particular area (through observing the number of double-shift schools in that area).3
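As an illustration of the kind of indicator described above, the sketch below derives a simple shift-upgrade flag from per-school student, teacher and classroom counts. It is a hypothetical example for this report only: the file name, column names and thresholds are assumptions, not the actual OpenEMIS data model or the MoE's decision rule.

```python
# Hypothetical sketch: flag schools that may need a second shift or new classrooms.
# The column names, thresholds and the exported CSV are illustrative assumptions,
# not the actual OpenEMIS schema or the MoE's decision rule.
import pandas as pd

schools = pd.read_csv("openemis_school_export.csv")  # assumed columns used below

# Derived indicators similar to those described by the Strategic Planning Department
schools["students_per_classroom"] = schools["students"] / schools["classrooms"]
schools["students_per_teacher"] = schools["students"] / schools["teachers"]

# Assumed thresholds for illustration only
CLASSROOM_LIMIT = 40
TEACHER_LIMIT = 35

schools["consider_double_shift"] = (
    (schools["students_per_classroom"] > CLASSROOM_LIMIT)
    & (schools["students_per_teacher"] > TEACHER_LIMIT)
)

# Areas with many double-shift candidates may indicate where new schools are needed
by_directorate = (
    schools.groupby("field_directorate")["consider_double_shift"]
    .sum()
    .sort_values(ascending=False)
)
print(by_directorate.head(10))
```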
The CSF EMIS Programme Advisor and Developer provided a number of examples of how EMIS data is regularly used to respond to ad hoc requests for data, including requests from the MoE Infrastructure Department for data about school buildings and infrastructure when they receive maintenance requests from schools, and regular requests from the MoE Donor Coordination Unit for data on Syrian refugees in schools. He also described how the QRC EMIS team had recently responded to a request from the Ministry of Youth for data on running water in school buildings in order to inform planning for holding youth camps in public school facilities. In terms of reporting, the CSF Developer noted that whenever the Minister for Education or a senior MoE official visits a school, EMIS data is used to generate an 'Educational Status Report' compiling various data on the particular school.4 The decision to use the OpenEMIS system as the repository for official data on Syrian refugees in Jordanian schools and the OpenEMIS system's role in assisting the Ministry of Finance in its disbursal of JD 20 to every schoolchild on behalf of the King are further examples of the OpenEMIS system being used to respond to ad hoc data requests, albeit on a country-wide scale.
At the Field Directorate level, staff interviewed reported using EMIS data to facilitate decision-making around allocating scholarships and aid to students, as well as around when to build new classrooms and school buildings.5 At the school level, OpenEMIS does not appear to play a prominent role in planning or management, although several principals reported that the system helps them to communicate with parents by making it easy to look up any student's grades. Some principals also reported using EMIS data to recommend students for scholarships.6
2 Interview with Nilhan Siam, Jordan EMIS Programme Advisor/CSF, 29 November 2017
3 Interview with Dr Abdullah Hassoneh, Head of MoE Strategic Planning Department, 28 November 2017
4 Interview with Nilhan Siam, Jordan EMIS Programme Advisor/CSF, 29 November 2017
Yet numerous key project stakeholders interviewed reported that the EMIS is not being used systematically for evidence-based policy formulation, planning and management at the central level. This is partly due to a lack of capacity among the core MoE EMIS team. While core MoE EMIS team members have been given training in how to build custom reports, dashboards and visualizations, capacity is still relatively weak in this regard and the MoE EMIS team is not yet able to go beyond the limited number of pre-defined reports and dashboards to create custom reports for each MoE department.7 The same is true for the development of custom indicators and KPIs that would facilitate the monitoring of sector performance at a more systematic and strategic level.
The weak link between the core technological aspects of the OpenEMIS system and the use of educational data to drive evidence-based decision-making in MoE may be related to the decision to transfer responsibility for the EMIS system from the Policy and Planning Department to the QRC. QRC is focused on data collection (for which it had responsibility under the previous EDUWAVE system) and the technical IT aspects of the OpenEMIS system, but largely lacks the capacity and perspective for generating custom reports and indicators and linking EMIS data to planning and decision-making processes.8 At the same time, due to a combination of staff turnover, lack of capacity and lack of engagement with the OpenEMIS project among certain key stakeholders, the Policy and Planning Department is also not able to generate reports and drive the integration of EMIS data in MoE planning and decision-making processes.9 A reported reluctance in QRC and the Policy and Planning Department to work with each other only serves to weaken the link between the core technological aspects of the system and the raw data on one hand, and reporting and the strategic use of data for evidence-based decision-making on the other.
5 Interview with Liwa Bani Obeid Field Directorate, 28 November 2017; interview with Madaba Field Directorate, 29 November 2017
6 Interview with Principal, Um Tofayl Secondary School, Amman, 25 November 2017; interview with Principal, Al Hoson Secondary School, Irbid, 28 November 2017
7 Interview with UNESCO Amman Project Team, 19 November 2017; interview with Dr Omar Al-Jarrah, Chair of OpenEMIS Technical Steering Committee, 29 November 2017
8 Interview with UNESCO Amman Project Team, 27 November 2017; ROM Monitoring Report (June 2017); Support to Jordanian EMIS System - Phase 2 (September 2017 - August 2020): Project Document (May 2017)
4.3.1.1. ER1: Comprehensive scoping of MoE needs for a more robust Education
Decision Support System
This Expected Result has been achieved. An initial needs assessment included consultations with a wide range of MoE stakeholders (MoE Planning, Research and Development Directorate, Queen Rania Center, Educational Training Center, Examinations and Tests, Vocational Education, General Education, and selected field directorates and schools), as well as a wide range of external stakeholders. It also comprised a technical review consisting of a comprehensive IT infrastructure assessment from the central (ministry) to the school level. The EMIS project team continued to accommodate MoE's evolving needs for the EMIS throughout the lifetime of the project.
4.3.1.2. ER2: Open and user-friendly access to education data and indicators for
better informed decision-making at all levels
This Expected Result has been achieved in principle, but is not yet fully realized in practice. The OpenEMIS system holds extremely rich data on a range of education sector components, along with the tools for accessing, analyzing and sharing this data. Yet because these tools are not being utilized to their full potential, the link between education data and better-informed decision-making at all levels is relatively weak (see Indicator 7). Moreover, the core MoE EMIS team lacks the capacity to transform data into useful indicators - including the development of KPIs.10
As such, education data from OpenEMIS is currently largely used to generate a limited number of reports and to respond to ad hoc requests for information, rather than being used systematically to drive better-informed decision-making at all levels.
4.3.1.3. ER3: New effective, easy to upgrade EMIS pilot-tested and rolled out
across all MoE institutions
This Expected Result has been achieved. The EMIS was successfully pilot-tested in 64 schools, with feedback from the pilot test used to refine and further customize the system (see Indicator 9). Following MoE approval, the enhanced EMIS was rolled out throughout the entire country during the second semester of the 2015-2016 school year - resulting in the eventual training of more than 110,000 OpenEMIS users (see Indicator 10). The roll-out also included the training of key MoE staff in various aspects of OpenEMIS administration, management and usage.
The extensive consultation process during the scoping stage and protracted customization of the system to be coherent with the Jordanian education system and meet the needs of various MoE stakeholders ensured that the resulting EMIS is an effective tool for supporting evidence-based policy formulation. The system is easy to upgrade both in the sense that custom fields, workflows, reporting templates, visualizations and connections to other systems can easily be added, and in the sense that the OpenEMIS platform is constantly being updated with improvements and new features.
10 Interview with Dr Omar Al-Jarrah, Chair of OpenEMIS Technical Steering Committee, 29 November 2017; interview with Dr Mahmoud Smadi, Statistics Consultant, 29 November 2017; Support to Jordanian EMIS System - Phase 2 (September 2017 - August 2020): Project Document (May 2017)
4.3.1.4. ER4: Fully-integrated Decision Support System with data from EMIS and
other relevant public-sector databases in place
This Expected Result has been achieved. At the close of the project MoE is in possession of a fully-operational education decision support system (see also Indicator 5). This includes pre-existing education datasets from the legacy EDUWAVE system as well as up-to-date educational data for the 2016-2017 school year (see Indicator 6).
The OpenEMIS system has also been successfully integrated with other relevant public-sector databases, including the Civil Status and Passports Department (CSPD) database and the Examinations Data Management Information System (EXAMIS) (see Indicator 11). The development of an OpenEMIS Generic Application Programming Interface (API) allows OpenEMIS to connect with other public-sector databases and systems in the future.
4.3.1.5. ER5: An EMIS system fully compliant to beneficiaries ' needs and
international standards in place
This Expected Result has been achieved. An extensive initial scoping, consultation and needs assessment phase ensured that the OpenEMIS platform was fully customized to the Jordanian education system and met the requirements of key MoE stakeholders. This was reinforced through the iterative pilot testing phase and by the EMIS project team's accommodation of requests for additional features and customization throughout the lifetime of the project. Moreover, the open and customizable nature of the OpenEMIS platform means that the system can easily be updated to comply with MoE's future needs - from the creation of new KPIs or dashboards to linking OpenEMIS with other public-sector databases.
The project is replete with examples of the latest international standards being adhered to in the design and implementation of the OpenEMIS system - from the technical specifications of the servers that host the system to the protocols for programming and software development. There are also examples of consultancy services rendered under the 'Complementary Technical Advisory Services' project component drawing on international best practice, such as the utilization of UNESCO Institute for Education Planning and UNESCO Institute for Statistics methodology in the setting up of the National Education Account in Jordan.
Indicator 3: OpenEMIS operational and owned by the government
• Secondary data
• Interviews with key project stakeholders and implementation and sub-contracted partners
• Field observation visits and interviews with beneficiaries at project implementation sites
Assessment
This indicator has been partially achieved: OpenEMIS is fully operational (see Indicator 10) and is owned by the government in the sense that all the required hardware and software systems, data and data collection processes (see Indicator 6), linkages with other systems (see Indicator 11) and reporting functionalities (see Indicator 7) are in place.
At the central level, MoE is assessed to possess the necessary hardware for supporting the system, including 'optimum' servers (although the provision and quality of IT infrastructure at the field directorate and school level is less consistent - a fact that was observed during field visits to four schools as part of this evaluation).11 The system can also be said to be owned by the government in the sense that commitment to using the system appears to be strong at all levels, from the Minister of Education to the field directorates and schools interviewed as part of this evaluation. This observation is also confirmed by an EU monitoring exercise carried out in June 2017.12
However, the government cannot be said to have ownership of OpenEMIS in the sense of MoE being able to independently sustain the system following the closure of the project and withdrawal of UNESCO support (see Indicator 4).
Indicator 4: OpenEMIS administration team with the capacity to sustain the use of the
system
• Secondary data
• Interviews with key project stakeholders and implementation and sub-contracted partners
• Field observation visits and interviews with beneficiaries at project implementation sites
Assessment
This indicator has been partly achieved: the administration team can manage some components of OpenEMIS independently, but MoE does not have the capacity to autonomously sustain the use of the system.
QRC is reported as being able to competently manage OpenEMIS core, which is the component of the system related to data and workflows pertaining to schools.13 QRC is also driving and closely monitoring the ongoing data collection activities.14 The QRC EMIS core team is also able to use the system to produce basic reports and respond to ad hoc data requests (see Indicator 2 above).
That said, the secondary data consulted as part of this evaluation as well as interviews with key project stakeholders indicate that MoE lacks the capacity to autonomously sustain the use of the system from both a technical and practical perspective.
From a technical perspective, an internal UNESCO capacity assessment of the core QRC EMIS team, documented in 'Support to Jordanian EMIS System - Phase 2 (September 2017 - August 2020): Project Document' (May 2017), found that the team had a generally weak command of tools for system development, system design for linkages to other relevant data systems, system design for data reporting, data utilization for key education management fields, and database administration - as well as of the Linux operating system on which the whole system runs (although it should be noted that selected members of the core QRC EMIS team subsequently received additional training in several of these areas under the Accelerated Capacity Development Program).
11 Support to Jordanian EMIS System - Phase 2 (September 2017 - August 2020): Project Document (May 2017)
13 Interview with Nilhan Siam, Jordan EMIS Programme Advisor/CSF, 29 November 2017
From a practical perspective, MoE is not yet self-sufficient in fully exploiting the system's various reporting functionalities - including the generation of custom reports and the use of dashboards. The UNESCO Amman Project Team reported that the QRC EMIS team has not yet been able to go beyond the limited number of pre-defined report templates to generate custom reports for each department,19 while the Chair of the EMIS Technical Steering Committee noted that MoE is currently only using one of the many dashboards that are built into the system.20 Both of these observations are echoed in the internal UNESCO capacity assessment of the core QRC EMIS team conducted in May 2017. The capacity of MoE to sustain the use of the system in MoE decision-making processes is therefore limited.
Indicator 5: Launch of an integrated education decision support system with data from
OpenEMIS and other relevant public-sector databases
• Secondary data
• Interviews with key project stakeholders and implementation and sub-contracted partners
• Field observation visits and interviews with beneficiaries at project implementation sites
Assessment
This indicator has been achieved: at the close of the project MoE is in possession of a fully-operational education decision support system. This includes pre-existing education datasets from the legacy EDUWAVE system as well as up-to-date educational data for the 2016-2017 school year (see Indicator 6). As is usual with any large primary data set, the 2016-2017 data is reported to contain some gaps, errors and outliers.21 A statistics consultant has been retained by UNESCO
15 Support to Jordanian EMIS System - Phase 2 (September 2017 - August 2020): Project Document (May 2017)
16 Interview with Nilhan Siam, Jordan EMIS Programme Advisor/CSF, 29 November 2017
17 Interview with Marwan Turman, Head of EMIS and E-Learning Division, QRC, 5 December 2017
19 Interview with Dr Omar Al-Jarrah, Chair of OpenEMIS Technical Steering Committee, 29 November 2017
20 Interview with Dr Omar Al-Jarrah, Chair of OpenEMIS Technical Steering Committee, 29 November 2017
Indicator 6: OpenEMIS loaded with existing education data
• Secondary data
• Interviews with key project stakeholders and implementation and sub-contracted partners
Assessment
This indicator has been achieved: relevant historical datasets were migrated from EDUWAVE to
OpenEMIS, and the data was also cleaned. Historical Statistical Yearbooks were also migrated to
OpenEMIS. Reports on data migration and cleaning were submitted to the EMIS Steering
Committee and endorsed.
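To illustrate the kind of checks involved in cleaning migrated records, the sketch below flags gaps, duplicates and simple outliers in a hypothetical extract of enrolment data. The file name, column names and rules are assumptions for illustration only; they do not reflect the actual EDUWAVE or OpenEMIS data structures or the cleaning procedures used by the project.

```python
# Illustrative data-quality checks on a hypothetical extract of migrated enrolment data.
# File name, columns and rules are assumptions, not the project's actual cleaning procedure.
import pandas as pd

records = pd.read_csv("eduwave_migrated_enrolment.csv")

# Gap check: records missing key identifiers or values
missing = records[records[["school_id", "grade", "enrolment"]].isna().any(axis=1)]

# Duplicate check: the same school/grade/year reported more than once
duplicates = records[records.duplicated(subset=["school_id", "grade", "school_year"], keep=False)]

# Simple outlier check: enrolment figures far outside the typical range for a grade
grade_stats = records.groupby("grade")["enrolment"].agg(["mean", "std"])
records = records.join(grade_stats, on="grade")
outliers = records[(records["enrolment"] - records["mean"]).abs() > 3 * records["std"]]

print(f"{len(missing)} records with gaps, {len(duplicates)} duplicates, {len(outliers)} outliers")
```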
A precursor to OpenEMIS being loaded with existing education data was the completion of a comprehensive needs assessment (see the first key activity below). The project logframe does not include an indicator related to this, but the activity is highly relevant to the achievement of Expected Result 1: Comprehensive scoping of MoE needs for a robust Education Decision Support System. Four relevant reports were submitted to the EMIS Steering Committee and endorsed (EMIS Needs Assessment Report, Refining Key Performance Indicators Report, EMIS IT Infrastructure Report, Staff Training Plan).
Key activity: Policy and planning activities including policy development, project scoping, detailed needs assessment and gap analysis, and data collection analysis to assess the needs and capabilities of government institutions for improved planning and management of the education system.
This activity corresponds largely to 'Project Component 1: Technical reviews, needs assessments and inventories of human, technical, institutional capacities of procedural issues'. The technical review component comprised a comprehensive IT infrastructure assessment from the central (ministry) to the school level, including a network connectivity assessment.
The needs assessment included consultations with a wide range of stakeholders (MoE Planning, Research and Development Directorate, Queen Rania Center, Educational Training Center, Examinations and Tests, Vocational Education, General Education, selected field directorates and schools, as well as a wide range of external stakeholders). The scope of the needs assessment included data quality and tools, data quality issues, education sector KPIs and data gaps, analysis of MoE business processes in relation to EMIS, and reporting requirements.22
Indicator 7: OpenEMIS used with other data visualization and reporting tools to generate key
planning and management reports
• Secondary data
• Interviews with key project stakeholders and implementation and sub-contracted partners
• Field observation visits and interviews with beneficiaries at project implementation sites
Note that this indicator corresponds closely to 'Project Component 2: Consolidation and expansion of the Jordan OpenEMIS indicators automation, reporting and data visualization features.'
Assessment
This indicator has been achieved in principle: all of the processes and enabling factors for OpenEMIS to be used with other data visualization and reporting tools to generate key planning and management reports are in place. MoE staff and beneficiaries are using EMIS data visualization and reporting tools to answer requests for information and generate reports in certain contexts. However, capacity gaps among MoE staff mean OpenEMIS data visualization and reporting tools are not being used to their full extent to generate key planning and management reports.
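As an illustration of the kind of planning and management report such tools could produce, the sketch below aggregates a hypothetical OpenEMIS export into a directorate-level summary. The file, column names and the chosen measures are assumptions for this report, not the actual OpenEMIS export format or MoE report templates.

```python
# Illustrative directorate-level planning summary from a hypothetical OpenEMIS export.
# File name, columns and measures are assumptions, not the actual export format.
import pandas as pd

schools = pd.read_csv("openemis_school_export.csv")

summary = (
    schools.groupby("field_directorate")
    .agg(
        schools=("school_id", "nunique"),
        students=("students", "sum"),
        teachers=("teachers", "sum"),
        double_shift_schools=("is_double_shift", "sum"),
    )
)
summary["student_teacher_ratio"] = summary["students"] / summary["teachers"]

# A table like this could be exported for each directorate or ministry department
summary.to_csv("directorate_planning_summary.csv")
print(summary.sort_values("student_teacher_ratio", ascending=False).head())
```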
Key activity: Use of the OpenEMIS system to generate advocacy materials for dissemination and visibility of the education data.
The EMIS team began developing school profiles that are connected to real-time OpenEMIS datasets, which will allow stakeholders (the ministry, schools and families) to access a series of pertinent education sector indicators. OpenEMIS data also contributed to an Education Sector Assessment (ESA) conducted in early 2017, from which a variety of shorter policy briefs will be developed.26
Indicator 8: A team of key MoE officers trained in use of education data for better planning and management
• Secondary data
Assessment
This indicator has been achieved in principle: the project included extensive training and capacity-building of key MoE officers in the use of education data for better planning and management. That said, there still appears to be a capacity gap in the MoE - both in relation to the ownership and management of the system, and in its use for informing and driving better planning and decision-making.
Indicator 9: OpenEMIS piloted in 50+ schools
• Secondary data
• Interviews with key project stakeholders and implementation and sub-contracted partners
Assessment
This indicator was achieved: the EMIS was successfully pilot-tested in 64 schools (before this, a closed pilot test was conducted in two schools). Feedback from the pilot test was used to further refine and customize the system.
Extensive customization of the EMIS platform to fit the context of the Jordanian education system
as well as to meet the evolving priorities and requirements of the MoE took place throughout the
pilot phase and beyond.
Key activity: OpenEMIS piloted in a representative sample of schools.
As Indicator 9.
Indicator 10: OpenEMIS scaled up to cover all schools, students, teachers and staff
• Secondary data
• Interviews with key project stakeholders and implementation and sub-contracted partners
• Field observation visits and interviews with beneficiaries at project implementation sites
Assessment
This indicator was achieved: the Minister of Education approved the roll-out of the enhanced EMIS
throughout the country during the second semester of the 2015-2016 school year.
The roll-out took place in three phases. In Phase 1, 40 regional-level trainers trained 608 field-level trainers. In Phase 2, these field-level trainers trained more than 13,000 staff at the school level (two teachers from each school). In Phase 3, the two teachers from each school trained all other staff in their school, resulting in the training of more than 110,000 OpenEMIS users.
Key MoE staff were also trained in numerous aspects of system administration, management and usage. OpenEMIS data validation and quality assurance processes were developed and implemented, and an OpenEMIS helpdesk was established.
Data collection for the 2016-2017 school year was launched in August 2016. Some difficulties were
encountered during data collection, due in large part to the decision to switch responsibility for the
EMIS (and hence data collection activities) from the Central Planning Department to Queen Rania
Center for Education Technology.
Key activity: An OpenEMIS helpdesk established within the MoE for support to users within the country.
An online Support Service Desk was established to provide support to the central MoE team working on OpenEMIS.
Both of the indicators for Years 3 and 4 for the remaining activities relating to the core OpenEMIS roll-out and implementation were achieved: the OpenEMIS platform was successfully integrated with a series of other relevant public databases and key MoE staff received training in data analysis and the use of the enhanced EMIS. In spite of this, the capacity of key MoE staff to manage and utilize the enhanced EMIS to its full potential continued to lag behind in Years 3 and 4 due to the knock-on effect of factors that emerged in Years 2 and 3 that detracted from the capacity-building and knowledge transfer aspects of the project. The addition of the 'Complementary Technical Advisory Services' project component, while highly useful to MoE and well-received in its own right, also drew time and resources away from knowledge transfer and capacity building activities relating to the core OpenEMIS workstream.
Indicators related to the 'Complementary Technical Advisory Services' were largely achieved: MoE received the requested technical support and associated deliverables in the areas of teacher professional standards and development, education sector finance, implementing a GIS platform to support school mapping and planning, and implementing the MoE ICT strategy. One indicator relating to the development of a vocational education work plan for implementing decisions stemming from the National Conference on Human Resource Development Strategy (HRD) could not be achieved due to the postponement of the HRD Strategy; instead, UNESCO provided a consultant who prepared on-demand technical notes for the Minister and provided strategic advice on TVET. Finally, it should be noted that while a GIS platform for supporting school mapping and planning was established and is fully operational, MoE appears to lack the capacity and resources to utilize the tool to its full potential to support decision-making.
29 Note that some activities implemented under the ACDP drew on a different funding source
Indicator 11: OpenEMIS integrated with other public databases to establish an education
decision-support system
• Secondary data
• Interviews with key project stakeholders and implementation and sub-contracted partners
• Field observation visits and interviews with beneficiaries at project implementation sites
Assessment
This indicator has been achieved: OpenEMIS is integrated with a series of other relevant public databases.
Discussions are ongoing with the Ministry of Finance on the linking of OpenEMIS with the Government Finance Management Information System (GFMIS).
An Application Programming Interface (API) was developed to allow for the future integration of
OpenEMIS with other public databases and platforms.
Indicator 12: Education simulation modelling capacity building and training provided to
government officers
• Secondary data
• Interviews with key project stakeholders and implementation and sub-contracted partners
Assessment
This indicator has been achieved: a Jordan Education Simulation Model (JESM) was delivered to MoE.
Key activity: Education data analysis and management training to senior education staff in the use of the education decision-support system.
Eight end-users of the JESM were trained and a restitution workshop was held in December 2016 involving key management staff from MoE.
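Education simulation models of this kind typically project enrolment forward using grade-to-grade transition rates. The sketch below shows the general idea with made-up rates and starting numbers; it is not the JESM itself, whose structure and parameters are not described in this report.

```python
# Illustrative grade-transition (cohort flow) projection, not the actual JESM.
# Intake, promotion and repetition rates below are made-up values for demonstration.

GRADES = 12
promotion_rate = [0.95] * GRADES      # share of pupils promoted to the next grade
repetition_rate = [0.03] * GRADES     # share repeating the same grade
new_intake_grade1 = 200_000           # assumed annual intake into grade 1

enrolment = [150_000.0] * GRADES      # assumed current enrolment per grade

def project_one_year(current: list[float]) -> list[float]:
    """Advance the enrolment vector by one school year."""
    nxt = [0.0] * GRADES
    nxt[0] = new_intake_grade1 + current[0] * repetition_rate[0]
    for g in range(1, GRADES):
        nxt[g] = current[g - 1] * promotion_rate[g - 1] + current[g] * repetition_rate[g]
    return nxt

for year in range(2018, 2023):
    enrolment = project_one_year(enrolment)
    print(year, [round(e) for e in enrolment[:3]], "... total:", round(sum(enrolment)))
```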
Indicator 13: Availability of a reviewed National Teacher Professional Standards
• Secondary data
Assessment
This indicator has been achieved: 'Teacher status and professional standards policies' were delivered by SOFRECO to MoE in March 2017.
Indicator 14: Availability of generic frameworks for teacher professional development and
teacher induction and in-service programmes
• Secondary data
Assessment
This indicator has been achieved: 'Teacher continuous professional development policies' were delivered by SOFRECO to MoE in March 2017. A 'Teacher evaluation and appraisal system' and a 'Career paths and ranking system for teachers' were delivered in April 2017.
Key activity: Technical Advisory Services to support operationalization of the MoE Teacher Policy Framework.
As for key activities under Indicator 13.
• Secondary data
Assessment
This indicator has been achieved: an 'Analysis of Government Expenditure' report was delivered by an international expert to MoE in August 2016.
• Secondary data
Assessment
This indicator has been achieved: the foundations for a National Education Account were set up.
• Secondary data
Assessment
This indicator has been achieved: a Jordan Education Simulation Model (JESM) was delivered to MoE.
Indicator 18: Availability of a GIS platform to map education inequities, support decision-making for rationalization of the school network and teacher allocation across schools
Assessment
This indicator has been partially achieved: a GIS platform - the Jordan Spatial Education Decision Support System (JSEDSS) - is available and fully operational, although MoE appears to lack the capacity to utilize the tool to its full potential to support decision-making.30
Key activity: Technical Advisory Services and Capacity Development Tools in GIS-based school mapping for equity and rationalization of the education facilities network across Jordan.
The JSEDSS supports a wide variety of workflows in relation to school mapping and school planning, providing MoE with a tool to make evidence-based decisions on the rationalization of the school network.
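A GIS-based school mapping workflow of this kind typically combines school coordinates with catchment populations to flag under- or over-served areas. The sketch below shows a minimal version using a hand-written haversine distance; the input files, columns and the 3 km threshold are assumptions and do not describe the actual JSEDSS.

```python
# Minimal illustration of a GIS-style school access check, not the actual JSEDSS.
# Input files, columns and the 3 km threshold are assumptions for demonstration.
import math
import pandas as pd

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two points in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

schools = pd.read_csv("schools_geocoded.csv")    # assumed columns: school_id, lat, lon
communities = pd.read_csv("communities.csv")     # assumed columns: community, lat, lon, pupils

# For each community, find the distance to the nearest school
def nearest_school_km(row) -> float:
    return min(
        haversine_km(row["lat"], row["lon"], s.lat, s.lon)
        for s in schools.itertuples()
    )

communities["nearest_school_km"] = communities.apply(nearest_school_km, axis=1)
underserved = communities[communities["nearest_school_km"] > 3.0]
print(underserved[["community", "pupils", "nearest_school_km"]])
```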
• Secondary data
• Interviews with key project stakeholders and implementation and sub-contracted partners
Assessment
This indicator has been achieved: an 'ICT Strategy Implementation Plan' was delivered to MoE.
Key activity: Technical Advisory Services on the operationalization of the MoE ICT strategy.
Four National Experts provided technical advisory services to MoE on the operationalization of the recently approved ICT Strategy in the teaching and learning process - with a particular focus on e-learning.
Indicator 20: Availability of a MoE VE work plan for implementation of the decisions
stemming from the National Conference on Human Resource Development Strategy
• Secondary data
Assessment
This indicator was not achieved due to the postponement of the National Human Resources
Development Strategy, although a consultant still provided MoE with support and strategic advice
on TVET-related issues.
Indicator 21: 80% of MoE EMIS officers at central and regional level satisfactorily trained on
their EMIS tasks
• Secondary data
• Interviews with key project stakeholders and implementation and sub-contracted partners
Assessment
This indicator cannot be verified. However, interviews with numerous stakeholders, including with the Director of the MoE EMIS and E-Learning Division, indicate that at the close of the project MoE EMIS officers at the central level require further training and capacity building support to be able to satisfactorily carry out their EMIS tasks.31
4.7.1. RELEVANCE
The project was highly relevant to the beneficiaries' needs, originating from a direct request from MoE to UNESCO in September 2011 for technical assistance in identifying gaps in the existing EMIS system and improving it. The previous EDUWAVE system contained numerous technical weaknesses and bugs, had very weak reporting functionality, was not able to generate indicators, and was licensed to MoE by the vendor under commercial terms whereby MoE had to pay up to JD 400,000 per year in licensing and maintenance costs as well as additional fees whenever modifications or upgrades were required.32 Previous missions by GOPA (2008), USAID (2011) and the World Bank (2011) identified EDUWAVE as an Achilles' heel in MoE, and as partly responsible for MoE's extremely weak capacity for evidence-based policy formulation and decision-making. In contrast to EDUWAVE, the UNESCO OpenEMIS system is license- and royalty-free, fully customizable, has strong reporting functions, and is designed to be fully owned by countries adopting the system.
31 Interview with Marwan Turman, Head of EMIS and E-Learning Division, QRC, 5 December 2017
The project model was contextually appropriate and responsive to the needs of beneficiaries, although the time and resources needed to train key MoE stakeholders and build capacity on the management and use of the system were underestimated. An extensive consultation, scoping, design and pilot-testing phase ensured that the OpenEMIS platform reflected the context and structure of the Jordanian education system, while customization of the EMIS to meet the various data collection, workflow and reporting requirements of MoE took place continuously throughout the lifetime of the project. The basic project model - from design to implementation to capacity building - was appropriate to the context. However, the project model underestimated the time and resources that would need to be dedicated to the capacity-building component of the project. Delays in the implementation phase meant that the capacity-building component was further squeezed. Attempts to accelerate capacity building and knowledge transfer in the final six months of the project significantly boosted MoE's capacity to manage and use various aspects of the system. However, at the close of the project MoE does not have the capacity to autonomously manage the OpenEMIS platform and use it to its full potential.
The project made a strong effort to adapt and respond to changes in the internal and external context. The most significant internal change was the decision by the then-Minister for Education, following the conclusion of the pilot-testing phase, to transfer responsibility for the OpenEMIS project from the Policy and Planning Department to QRC. The project team responded to this change by advocating for the transfer of four Policy and Planning Department employees to the QRC to smooth this transition.
A second important change in the internal context was the mismatch between UNESCO and MoE's initial conception of the EMIS as primarily a tool for storing and reporting on educational data and statistics, versus the subsequent requirement from key MoE stakeholders that the OpenEMIS platform also incorporate numerous Enterprise Resource Planning (ERP) features to facilitate various internal MoE processes. The project team responded to this requirement by building a series of custom workflows into OpenEMIS - including around student and teacher transfers and examinations administration.
The most significant change in the external context was the strong focus that emerged on modifying OpenEMIS to be able to collect, aggregate and disseminate data on Syrian refugees in Jordanian schools. The project team responded to this change by dedicating time and resources to addressing this issue in Project Years 2 and 3. The MoE's requirement for technical advice and support in a series of areas outside of the core EMIS workstream - including GIS, education sector financing and teachers' professional development - was accommodated through the incorporation of the additional 'Complementary Technical Advisory Services' project component.
4.7.2. EFFICIENCY
A series of internal and external factors negatively influenced the implementation of the core EMIS elements of the project, resulting in delays. Yet given the high importance of addressing these factors, such delays would seem to be reasonable. The project was originally forecast to last 33 months, from 1 February 2014 to 31 October 2016, but was eventually extended by 13 months. Due to delays with the implementation and customization of the EMIS, the main capacity building component did not start until around six months before the original project end date - leading to the extension of the overall project timeline and the implementation of the 'Accelerated Capacity Development Program' component.
32 Interview with Nilhan Siam, Jordan EMIS Programme Advisor/CSF, 29 November 2017. Further details on the legacy EDUWAVE system's weaknesses can be found in the Context and Justification section of the EU Original EMIS Project Document (2013)
The most significant internal factor was the decision by the then-Minister for Education, following the conclusion of the pilot-testing phase, to transfer responsibility for the OpenEMIS project from the Policy and Planning Department to QRC. The disruption caused by this uprooting of the project was compounded by the subsequent lack of clarity on individual and departmental roles and responsibilities in relation to the EMIS in MoE, as well as a lack of capacity in QRC in key skills required for managing and using OpenEMIS to its full potential. Second, the mismatch between UNESCO and MoE's initial conception of OpenEMIS and the subsequent requirements of key MoE stakeholders in relation to the system (outlined above) also negatively affected the project's implementation. The additional time and resources spent customizing OpenEMIS to accommodate a variety of different workflows had a detrimental effect on the capacity building component, delayed project delivery overall, and resulted in a platform that is more complex and difficult to maintain and manage than initially envisioned. The external factors referred to above - namely the co-opting of the EMIS to provide data on Syrian refugees and the addition of the 'Complementary Technical Advisory Services' module - also negatively influenced the project's implementation by drawing resources away from the core EMIS implementation and capacity building.
The UNESCO core project team's resources appear to have been used efficiently, but there were some inefficiencies in how CSF resources were deployed. The CSF developers being based in Singapore meant that opportunities for knowledge transfer to key QRC staff and on-the-job capacity building were limited.33 Contact between CSF and QRC appears to have been largely through UNESCO, resulting in inefficient communications processes and limited on-the-job training and knowledge transfer.34 QRC staff praised the quality and efficiency of project activities delivered by PALCO, a Jordanian company contracted by UNESCO on various aspects of the installation and maintenance of the hardware associated with OpenEMIS.35
The amount of work delivered under the project is considerable, and as such cost efficiency is assessed to be good. The recent EU ROM report is worth quoting in this regard: "The customization of OpenEMIS to the Jordanian education system and the sheer amount of data that was processed from 43 field directorates, approximately 8,000 schools, 100,000 schoolteachers and more than 1 million pupils is remarkable and required substantial resources from both the Implementing partner UNESCO and the Beneficiary MoE."36 A consultant contracted by the EU who has been familiar with the OpenEMIS project since its inception also remarked on the wide scope of what has been delivered.37
4.7.3. EFFECTIVENESS
UNESCO's planning, management and coordination of the project was adequate overall, although certain areas could have been improved. The initial design of the project appears to have underestimated the sheer scale of the task of implementing OpenEMIS in a country of Jordan's size and the corresponding capacity building needs. It is also an open question whether the UNESCO project
33 Interview with UNESCO Amman Project Team, 19 November 2017; interview with Marwan Turman, Head of EMIS and E-Learning Division, QRC, 5 December 2017
34 Interview with UNESCO Amman Project Team, 19 November 2017; ROM Monitoring Report (June 2017)
35 Interview with Marwan Turman, Head of EMIS and E-Learning Division, QRC, 5 December 2017
36 ROM Monitoring Report (June 2017)
The project M&E strategy suffered from some weaknesses. The project logframe is very weak. The majority of the 21 indicators are expressed more as aspirations or aims than as measurable and verifiable indicators of success. The link between activities, indicators, expected results/outputs and overall objectives is also unclear and often overlapping. The result is that the logframe is not an effective tool for project management or for monitoring and assessing progress towards expected results and outcomes. Assumptions and risks are also weakly formulated and appear not to have been properly assessed. Due to these deficiencies, the logframe appears to have been largely discarded and replaced as the key project management tool by a detailed action plan consisting of over 60 activities. As such, project implementation seems to have become very activity-driven rather than objective-driven. Monitoring was therefore mainly focused on the implementation of activities rather than on the achievement of objectives.40 MoE also appears to have been a weak counterpart for the project's M&E strategy and did not systematically produce expected progress reports on OpenEMIS implementation.41
Beneficiaries were overall satisfied with the services delivered, but recognize the need for additional capacity building. All key MoE stakeholders interviewed felt that the implementation of OpenEMIS had been a success, yet all emphasized that additional capacity building and knowledge transfer is necessary to enable MoE to manage and use the system to its full potential. Key MoE stakeholders also reported that they were satisfied with the services delivered under the 'Complementary Technical Advisory Services' module, although the previous EU ROM report, the UNESCO EMIS project narrative report and a key EU stakeholder reported that quality had been variable across these ad hoc consultancies. A key QRC stakeholder, while generally satisfied with the way the project had been managed and delivered, highlighted two key weaknesses: the delays in implementing and customizing the system to meet MoE requirements, and the fact that training and knowledge transfer was carried out sporadically in an ad hoc manner rather than implemented systematically with predetermined curricula, written materials, and time allocated for direct supervision and on-the-job training.42
40 ROM Monitoring Report (June 2017); interview with Jacob Arts, EU Programme Manager, Education and Youth Cooperation Section, 5 December 2017
41 Interview with Jacob Arts, EU Programme Manager, Education and Youth Cooperation Section, 5 December 2017. This issue was also raised in the ROM Monitoring Report (June 2017)
42 Interview with Marwan Turman, Head of EMIS and E-Learning Division, QRC, 5 December 2017
data, the urgency of meeting these demands meant the UNESCO EMIS project team often had to step
in as the primary point of contact for EMIS data and reporting requests .43 An EU consultant who
periodically required EMIS data in order to monitor education sector performance also reported that, as QRC were not able to respond to his requests for EMIS data, the UNESCO project team became his primary interlocutors for EMIS data requests.44
4.7.4. IMPACT
The project is well-positioned to generate meaningful change in the future. With the implementation of the EMIS complete and the system fully operational, future support can focus strongly on building MoE capacity to autonomously manage and utilize the system to its full potential. An important impact of the project in the medium term will be the utilization of the EMIS to generate and monitor KPIs during the implementation of the upcoming Education Sector Plan. The EMIS should also support MoE in continuing a positive and constructive relationship with donors by allowing it to respond effectively to requests for data and demonstrate evidence-based decision-making.
4.7.5. SUSTAINABILITY
The core IT and data collection aspects of the EMIS are likely to be sustainable into the future, although use of the EMIS for evidence-based policy-making needs to be further reinforced before it is sustainable. The OpenEMIS platform is now successfully established and data collection procedures across field directorates and schools are in place and likely to be sustainable. Further support will be required to strengthen the link between the core IT and data collection activities and reporting, such that the systematic incorporation of EMIS data into decision-making becomes sustainable.
4.8. GENDER
A significant proportion of central MoE staff who have received training and capacity-building on OpenEMIS are women. At the school level, most principals and teachers who have received training on OpenEMIS and are largely responsible for OpenEMIS data collection are women.
" 4 Interview w,til Dr Joachim (Joe) Friedricl1 Pfaffe EU consultant, G December 2017
5. LESSONS LEARNED
1. The design of the project underestimated the magnitude of implementing OpenEMIS in a country of Jordan's size and the corresponding capacity building needs of MoE. The full customization of OpenEMIS to the Jordanian context and MoE requirements, roll-out across the country and linking of the system to other public-sector databases took much longer than initially anticipated. MoE's institutional capacity to absorb the OpenEMIS technology was also lower than expected. Efforts towards the end of the project to intensively build MoE capacity and accelerate knowledge transfer were not sufficient to ensure that MoE was able to independently manage and utilize OpenEMIS to its full potential by the close of the project.
2. A disconnect between UNESCO/MoE's initial conception of the EMIS and the subsequent
vision of other key MoE stakeholders for the system's purpose and functionality delayed
and complicated OpenEMIS implementation. The initial idea behind OpenEMIS was that it would
be primarily a tool for aggregating, analyzing and reporting on educational data and statistics to
provide a basis for and strengthen evidence-based policy- and decision-making. However, from the
pilot-testing phase onwards the UNESCO project team had to accommodate repeated requests
from MoE that OpenEMIS be customized to support various MoE internal processes and workflows
- from student and teacher transfers to examinations administration. The expansion of the scope
of OpenEMIS to accommodate numerous Enterprise Resource Planning (ERP) features led to the
creation of a system that is more complex to manage and use than initially anticipated, and drew
time and resources away from building MoE capacity on the core data analysis and reporting
aspects of OpenEMIS.
3. A lack of clearly defined roles and responsibilities within MoE in relation to the EMIS
negatively impacted project implementation and sustainability. MoE's decision early in the
project to transfer responsibility for OpenEMIS from the Policy and Planning Department to QRC
caused delays to the project timeline and a lack of clarity around departments' and individuals' roles
and responsibilities in relation to OpenEMIS within MoE. The decision also served to weaken the
link between the core data storage and management aspects of OpenEMIS and the use of
educational data to drive evidence-based policy-making: QRC is focused on data collection and
management of the IT aspects of OpenEMIS, but lacks the capacity and perspective for generating
custom reports and indicators and linking EMIS data to planning and decision-making processes.
At the same time, due to a combination of staff turnover, lack of capacity and lack of engagement
with the OpenEMIS project among certain key stakeholders, the Policy and Planning Department
is also not able to generate reports and drive the integration of EMIS data in MoE planning and
decision-making processes.
4. The project adapted well to unanticipated external developments, but these had an
unavoidable negative impact on project implementation. The need to respond to out-of-scope
external requests in relation to the EMIS - most notably the requirement to modify OpenEMIS to
collect and disseminate data on Syrian refugees in Jordanian schools and to support the Ministry
of Finance in the one-off disbursal of government funds to every Jordanian student in a public
school - drew time and resources away from core OpenEMIS implementation and capacity-building
activities. Yet meeting these requests served to increase OpenEMIS visibility and credibility among
key MoE and donor stakeholders. The same is true for the addition of the 'Complementary
Technical Advisory Services' project component in the final year of the project - which provided
important support to MoE in a series of key strategic areas and reinforced UNESCO's credibility at
the policy level, but impacted negatively on the core EMIS workstream.
5. Top-level MoE engagement and support is crucial for the successful implementation and
sustainability of a large-scale technical assistance project such as OpenEMIS. The decision
by the former Minister for Education to transfer responsibility for the EMIS from the Policy and
Planning Department to QRC left the EMIS institutionally adrift and without clear ownership and
oversight of the project within MoE. This negatively impacted on the project timeline and left the
UNESCO project team without the most suitable interlocutors for building capacity on the data
utilization and reporting functions of OpenEMIS . The current Minister for Education's reportedly
stronger direct engagement with the EMIS provided a boost to the implementation of OpenEMIS
activities in the final year of the project (notably data collection for the 2017-2018 school year), and
provides a positive foundation for the next phase of the project.
6. RECOMMENDATIONS
Any future iteration of the project should:
1. Continue building QRC's capacity to maintain and manage the core IT aspects of the
OpenEMIS platform, but identify a more suitable MoE counterpart for strengthening the link
between EMIS data and its utilization for evidence-based policy formulation. QRC will
continue to house the physical OpenEMIS hardware and technological infrastructure, and will
require additional capacity building to be able to independently maintain and manage the system.
QRC is also likely to have primary responsibility for driving EMIS data collection. However, QRC
lacks the capacity to use EMIS data to develop and monitor indicators and generate custom reports
for various MoE departments. As such, a more suitable MoE counterpart should be identified to
take responsibility for utilizing EMIS data for evidence-based policy formulation. Given the currently
weak capacity of the Policy and Planning Department in this regard, this is likely to entail identifying
suitable counterparts across other MoE departments or supporting the set-up and capacity building
of a new, dedicated MoE unit or sub-department for evidence-based policy formulation.
5. Ensure the project is coordinated and aligned with other donor support to MoE relating to
planning, M&E, research and evidence-based policy making. OpenEMIS capacity building and
support for evidence-based policy making should be carefully coordinated with the development and
monitoring of indicators for the upcoming Education Sector Plan, as well as other relevant donor
initiatives such as the DFID Educational Research Programme and the broader ERfKE framework .
Coordination with donors on approaches to supporting the integration of Syrian students in
Jordanian schools - and likely data needs - will also help to anticipate external demands on the
EMIS.
7. ANNEX 1: LIST OF SECONDARY DATA
The table below lists the documents and secondary data that were reviewed as part of the evaluation:
Document Source
EU project documents
Narrative reports
ENGLISH
Introduction to the evaluation and the context of the interview.
1. Could you please tell me in your own words what you understand about EMIS? What is it? How
does it work? Why is it important?
2. Do you feel that EMIS has been well-explained to you and that you understand the system and its
uses?
3. Were you consulted on the design of the current EMIS? Was there an opportunity for you to have
input into what the system should look like and do?
4. How do you personally interact with EMIS? What are your responsibilities in relation to the system?
5. Did you receive training on using EMIS? Was this training helpful? Is there any way you think the
training could be improved?
6. If you have a problem with the system or there is something you don't understand, how do you
resolve it? Is there someone you can go to for help?
7. Do you feel that you have sufficient knowledge and understanding to use EMIS effectively? What
are the things you understand well about the system, and things you don't understand so well?
Who would you ask if there is something you don't understand? Have you sought help on any
aspect of using EMIS before? What was it and who did you go to for help? Were you able to resolve
the issue?
8. How much time do you spend using or updating EMIS? Per day? Per week? Does this interfere
with your other work? Would you say that EMIS has any negative impacts on your job? If so, what
are they?
9. Do you find the system user-friendly? Do you experience any problems using the system? Is there
any way you think it could be improved?
10. How does your field directorate/school use EMIS? Can you give me some examples of how you
have used it for decision-making? Or how the system has been useful to you in doing your work?
11. In your job, what are the most important types of data that you need to make decisions? Do you
use EMIS to get this data? Are there other systems or means for obtaining this data?
12. What kind of data does EMIS provide on students? How do you update and use this data?
13. What kind of data does EMIS provide on teachers and staff? How do you update and use this data?
14. What kind of data does EMIS provide on the school and its facilities? How do you update and use
this data?
15. Do you feel that EMIS is useful? Do you think it improves and strengthens the education system in
Jordan? If you met a colleague from another country where they do not have an EMIS, would you
recommend it to them? Why/why not?
16. Are there any particular things that you feel make it difficult to successfully implement and run an
EMIS such as this in Jordan? What are they? How could the system be better suited to the local
context in Jordan?
ARABIC
11. ANNEX 5: OECD DAC CRITERIA INDICATIVE
QUESTIONS
Criteria Indicative questions
Relevance
• When it was designed, to what extent were the overall objectives of
the project and those of each component consistent with the needs
of the country and the beneficiaries' requirements?
• To what extent was the project model contextually appropriate and
responsive to the needs of the beneficiaries? How has this model
gained resonance with or generated support from local
stakeholders (government, private sector, communities)?
• As the context continuously changed, were the Programme and each
component/activity still relevant? Were the changes made in
response sufficient?
• Was it appropriate to have a country project of six components and
multiple activities? Were the components and activities integrated
or not, and if not, should they have been?
• Which activities of the project should be flagged for further
exploration or analysis?
• To what extent has the project fostered coordination with other
funders/initiatives and integration with other providers of similar
activities?
Efficiency
• Did any external/internal factors positively or negatively influence
the project's implementation? Were any problems or bottlenecks
encountered? If so, what were they?
• Were the outputs delivered in a timely manner? Where applicable,
were delays experienced by the project reasonable, given the local
context?
• What measures were taken to ensure that resources were
efficiently used?
• Was the project overall, and each component/activity, cost efficient
(funding, personnel, installations and resources)?
Effectiveness
• How well did UNESCO plan, manage, and coordinate the project
overall?
• To what extent were the objectives achieved? What major factors
influenced the achievement or non-achievement of the objectives?
How well were the objectives designed initially and what can be
improved for the design of future projects?
• Did the context in Jordan affect delivery of the project? Did
UNESCO plan well for potential changes?
• Were there examples of innovation and best practice?
• How effective and useful was the project's M&E strategy? How
adequate were data collection tools to obtain useful and
valid/reliable data? To what extent were data supporting project
related decision-making and allowing the program to make timely
adjustments?
• To what extent were beneficiaries (direct and indirect) satisfied with
the services delivered?
• What were the positive and negative, primary and secondary short-
term effects produced by the project, directly or indirectly, intended
or unintended?
Impact
• To what extent is the project positioned to generate meaningful
change in the future, considering that the impacts of such
interventions are often longer term in nature?
• What are possible short- and medium-term positive/negative
impacts of the project?
• How can positive aspects be maximized or replicated/scaled up in
the future?
• To what extent are differences in impact explained by variations in
implementation?
Sustainability
• Will the benefits from the project continue into the future and what
is the probability of continued longer-term benefits?
• What aspects of the project are likely to be most sustainable into
the future?