SDN Monitoring SOP Draft 30042024
Department Programme/M&E
Version 1.0
SOP ID No: SDN/PRG/05/01/2024
Monitoring and Evaluation SOP
Table of Contents
1. SOP Context
1.1. Background
1.2. Purpose
2. Monitoring Procedures
STEP 1: Prepare or revise performance monitoring narratives and logical frameworks
STEP 2: Prepare CSP Monitoring, Review and Evaluation (MRE) Plan and draft M&E Budget
STEP 3: Develop a monitoring toolkit
STEP 4: Collect primary data and collate secondary data
4.1. Process monitoring
4.2. Output monitoring
4.3. Outcome monitoring
STEP 5: Capture, compile and validate data
STEP 6: Analyse data and prepare information products
STEP 7: Make use of monitoring findings: Take action and document lessons
STEP 8: Conduct evaluations or reviews
8.1. Planning
8.2. Inception
8.3. Data collection and management
8.4. Reporting and dissemination
3. Roles and Responsibilities
Annexes
Annex 1: Sudan CO process monitoring master tool
Annex 2: Monitoring findings' action plan tracking system
Annex 3: List of acronyms
1. SOP Context
1.1. Background
WFP Sudan’s operation is one of the largest and most complex operations globally, with
over 8 million unique beneficiaries receiving assistance in 2023 amid unprecedented
insecurity and access challenges following the conflict that erupted in April 2023.
To this end, the Sudan Country Office has developed a robust monitoring system
aligned with the Corporate Results Framework and based on the Sudan CSP logical
framework. The M&E system is tailored to the nature of the emergency and the current
humanitarian response, and complies with the corporate Minimum Monitoring
Requirements. The monitoring system is designed to measure process, output,
outcome, and cross-cutting indicators.
1.2. Purpose
The Sudan CO monitoring SOPs describe the common procedures, timing and
responsibilities for the design and implementation of the Sudan CO monitoring system.
They are intended to ensure that anyone assigned responsibility for managing
the M&E function at the Country Office and Field Offices follows the same steps and
timelines, and is guided by the same underlying principles and processes in monitoring
data collection, analysis, and reporting. They provide monitoring staff with information
needed to perform their tasks appropriately by describing step-by-step how the
monitoring system is implemented in Sudan following the corporate WFP CSP
monitoring cycle (see Figure 1).
The primary users of these monitoring SOPs include Country Office management, the
programme unit, M&E staff, the Hub and Field Offices, and the reports and donor
relations unit. The proposed roles and responsibilities of relevant staff are included
under each functional area and step of the monitoring cycle shown in Figure 1. The
monitoring SOPs are mandatory for the conduct of Sudan CSP monitoring but remain
adaptable and subject to revision and re-approval to align with changes in CO strategy,
operations, and context.
The SOPs are fully in line with the corporate Standard Operating Procedures for CSP
Monitoring and the Corporate Monitoring Guidance.
2. Monitoring Procedures
The eight steps of the Monitoring SOPs are described in this section.
STEP 1: Prepare or revise performance monitoring narratives and logical frameworks
The development of logframes is an important first step in the CSP monitoring cycle,
establishing the intended results and accompanying indicators to be tracked over the
life of the CSP. The development or review of logical frameworks for the CSP and
funding proposals is the responsibility of the CO M&E Unit. Technical units should be
consulted for inputs during the development/review of the logical framework. CSP
logframe development is done during CSP development or Budget Revisions (BRs).
Revision of the logical framework during any BR should be informed by CSP strategic
changes/adjustments. If there are challenges in reporting any of the committed
indicators, or other challenges warranting changes to the logframe, the M&E Unit
should ensure that a review of the CSP logframe is part of the BR objectives, to allow
any proposed changes.
The Head of Programme is responsible for approving the logframe at CO level; this
approval is done outside the COMET system. The CSP logical framework should be validated in COMET
only after clearance by the Head of M&E and approval by the Head of Programme. After
approval of the logframe at CO level, the logframe is submitted to RBN for validation,
and subsequent submission to HQ for approval.
In addition, all logframes developed for different funding proposals should be aligned to
the CSP logframe in terms of types of indicators being committed and targets as
applicable. The alignment of logical frameworks for donor reporting to the CSP logical
framework limits/prevents changes to the monitoring systems and additional
monitoring needs, thus ensuring efficiency. However, in exceptional cases, new
indicators may be incorporated based on donors’ requirements.
STEP 2: Prepare CSP Monitoring, Review and Evaluation (MRE) Plan and draft M&E Budget
To ensure that indicators are systematically monitored and reported on, and that
reviews and evaluations are planned and budgeted for, a VAM Monitoring & Evaluation
Planning and Budgeting Tool is required. The VAM Monitoring & Evaluation Planning
and Budgeting Tool captures all indicators included in the CSP logframe, timing,
frequency, and costing of activities throughout a CSP, and related staffing needs,
equipment, and services. The tool is linked to the corporate budget structure and feeds
into the Country Portfolio Budget (CPB). The tool is updated annually, in January, and
ensures that adequate resources are planned for monitoring needs. The update
includes reporting on actual activities undertaken in the past year and actual
expenditures, and adjusting the planning for the upcoming years of the Country
Strategic Plan if needed.
The logframe can include indicators other than those in the indicator compendium,
responding to specific donor requirements. In addition to outcome and output
indicators, monitoring of process indicators and cross-cutting indicators (for gender
equality and women’s empowerment, protection and accountability to affected
populations, nutrition integration, and environmental sustainability) should also be
included. The tool describes how each indicator will be monitored and is presented as a
matrix. It also includes basic information on reviews and evaluation plans. The process
for preparing the VAM Monitoring & Evaluation Planning and Budgeting Tool should be
participatory, and the final tool endorsed by the Head of Programme. The tool should
be prepared by the CO M&E Unit in consultation with activity managers, programme,
and BPU staff. Preparation should be done immediately after CSP approval and the
tool submitted within the first quarter. This timeline eases reporting, especially the
reporting of the previous year’s expenditures after the annual financial closure. The
approved plan and implementation report, once cleared by the Head of Programme,
are shared with the Regional Office through the Monitoring and Review Advisor, for
review and to facilitate clearance by HQ.
The MRE Plan lists the sources of data and processes through which the data will be
collected as well as the data collection methods. The main data collection methods in
the Sudan CO include process monitoring, output monitoring and outcome monitoring.
STEP 3: Develop a monitoring toolkit
The tools (checklists, questionnaires, and databases) required to conduct each type of
monitoring need to be developed. This involves adapting corporate methodological
guidelines and data collection templates provided in Survey Designer [1], or creating new
ones as required. All monitoring tools are expected to be developed by the CO M&E unit
in consultation with the technical units and Hub programme staff, based on the
approved CSP logframe indicators in addition to various programmatic needs. Reviews
and evaluations are done in consultation with country office management and
relevant units. Surveys and checklists are first developed in English and then translated
into Arabic by WFP, after which they are inserted in MoDa [2]. Final forms designed in
MoDa are then cross-checked and tested to ensure that the flow, visibility, and validity
conditions are aligned and compatible. Details of the tools by monitoring exercise are
provided in the CSP Monitoring, Review and Evaluation (MRE) plan and M&E toolkit.
The monitoring tools can be updated regularly, as required, in accordance with
corporate guidance and relevant CO units’ needs.

[1] Survey Designer is an application that allows the CO M&E team to quickly and easily build standardized assessment and monitoring surveys. Standardized data collection tools ensure that collected data meets corporate quality standards and enable data integration.
Development of the methods and required tools should begin as soon as the MRE Plan
is complete and should be finalized before CSP activities begin. Once the tools are
finalized and tested, the Analysis Plan should be prepared.
STEP 4: Collect primary data and collate secondary data
Data collection procedures are described in line with the different types of monitoring.
[2] MoDa is the main platform for data collection for WFP. The application, available on both web and mobile, allows M&E teams to collect data in a safe and secure manner.
4.1. Process monitoring
Process monitoring data is collected each month based on the approved Monthly
Monitoring Plan (MMP) by WFP monitors (FMAs) and Third-Party Monitors (TPMs).
Monthly Monitoring Plans (MMPs) are developed at hub office level according to the
implementation plan at the beginning of each month, outlining the monitoring activity
and the sites selected. MMPs are developed using the Risk-based Monitoring
Framework (RBMF) which provides standardized templates for planning and tracking
monitoring visits and monitoring coverage. This approach helps to better allocate
monitoring resources and to ensure that the Minimum Monitoring Requirements are
achieved across the year. The information from the RBMF templates helps the CO to
develop detailed monthly monitoring plans based on the full list of all activity sites and
prioritize sites that were not monitored in the previous month(s), sites that have a high
number of beneficiaries, sites that have a high risk of issues based on previous reports,
and sites that may become inaccessible soon due to security or weather conditions.
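The prioritization criteria above can be illustrated with a simple scoring sketch. This is a hypothetical example: the field names, weights, and thresholds are assumptions chosen for clarity and are not part of the RBMF templates.

```python
from dataclasses import dataclass

@dataclass
class Site:
    name: str
    beneficiaries: int
    months_since_visit: int      # sites not monitored in previous month(s) rank higher
    risk_flag: bool              # issues reported during previous visits
    access_closing_soon: bool    # security or weather may soon block access

def priority_score(site: Site) -> float:
    """Combine the four MMP criteria into one score (weights are illustrative)."""
    score = 0.0
    score += min(site.months_since_visit, 6) * 2.0   # longer monitoring gaps weigh more
    score += site.beneficiaries / 1000.0             # larger caseloads weigh more
    score += 5.0 if site.risk_flag else 0.0
    score += 4.0 if site.access_closing_soon else 0.0
    return score

sites = [
    Site("Site A", 8000, 0, False, False),
    Site("Site B", 2500, 3, True, False),
    Site("Site C", 1200, 1, False, True),
]
# Rank candidate sites for the Monthly Monitoring Plan, highest priority first
for s in sorted(sites, key=priority_score, reverse=True):
    print(s.name, round(priority_score(s), 1))
```

In practice the RBMF templates encode this logic in planning spreadsheets rather than code; the sketch only shows how the four stated criteria can be combined into a single ranking.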
The Hub Office monitoring plans are shared with the CO M&E team, which then
consolidates them, provides guidance, oversees the monitoring plans, and tracks
CO monitoring coverage. The consolidated MMPs are then shared with the hub offices
and TPM service provider for implementation.
WFP Sudan CO has commissioned a TPM firm to complement its capacity and collect
monitoring data, with the aim of increasing monitoring coverage in areas that WFP staff
cannot access for security reasons. This ensures that WFP meets the corporate
minimum monitoring requirements (MMR) and benefits from the efficiency of private
sector firms in this field. WFP Sudan outsources around 70-80 percent of its process
monitoring activities to the TPM service provider, particularly in access-constrained
areas where WFP staff are unable to conduct monitoring activities [3].
[3] For more information, kindly refer to the Third-Party Monitoring SOP.
The MMPs are revised, as required, at the middle of the month to incorporate any new
information received from programme units regarding the distributions.
The process monitoring planning flow is summarized in Figure 2, which also indicates
the responsible parties and expected timelines:
1. Develop the Hub Office MMP using the RBMF tools and share it with the CO M&E team (Responsible: Hub M&E focal point; Timeline: 20th of every month).
2. The CO M&E team consolidates the plans and provides feedback to the Hub.
3. Share the final MMP with the Hub Offices and TPM for implementation (Responsible: CO M&E team; Timeline: 28th of every month).
Once the approved MMP is shared by the CO M&E team, Hub offices and TPM service
provider plan for monitoring field visits. Both WFP FMAs and Third-Party Monitors are
then deployed to the field for data collection according to the approved plan. Data
collection is done using tablets with the ODK Collect application. Data is collected offline
and uploaded to the MoDa server whenever monitors have internet access.
The Sudan CO has adopted a process monitoring master tool that allows data to be
collected for each of the types of monitoring described earlier. Specific controls have
been programmed to direct the monitors to the specific sections of the tool to use,
based on the type of monitoring being conducted. The MoDa master process
monitoring tool is available in Annex 1.
Upon finalization of data collection at the selected locations, monitors conduct exit
debriefs with the Cooperating Partners responsible for that area, while at Hub Office
level, WFP monitors debrief the Head of Programme.
In addition, during process monitoring visits, WFP field monitors or Third-Party Monitors
may discover issues of concern that require immediate action and need to be reported
without delay. This may include discovering fraudulent activities or personnel
misconduct, as well as circumstances which may negatively impact programme
implementation and the well-being of beneficiaries. Issues that may hinder the
provision of assistance in a dignified, safe, and transparent manner shall be escalated
immediately by the monitors, as described in the Sudan CO Monitoring Issues Escalation
SOP.
4.2. Output monitoring
As per the United Nations Development Group definition, outputs are changes in skills
or abilities and capacities of individuals or institutions, or the availability of new
products and services that result from the completion of activities within a development
intervention within the control of the organization. They are achieved with the
resources provided and within the period specified. Output monitoring serves to
ensure that planned progress towards output results is sustained. Data collected
through output monitoring enables managers to take the corrective actions needed to
reach planned or intended output results, which serve the larger objectives and goals
of projects/programmes. Output monitoring in the Sudan CO is managed by the
Information Management unit.
In the planning phase, output indicators and the related planning figures are set in the
Needs Based Plan/Project Plan. The Needs Based Plan/Project Plan provides data on the
identified annual and overall beneficiary targets and needs for the entire duration of
the Project/Country Strategic Plan, including total beneficiaries, food and/or CBT rations,
and assistance calendar. Output indicators and planning figures should also be included
in partner agreements and thereafter in the monitoring visit plans.
Output data in WFP Sudan are collected through standardized monthly Cooperating
Partners’ (CP) Distribution Reports (DR), which are completed and submitted by CPs (or
by WFP staff when implementing direct distributions). The report provides data on
beneficiaries reached (disaggregated by sex and age) and the amount of food
commodities/ cash-based transfers (CBT) distributed. Other output data are collected
through periodic activity progress reports (depending on the context), as well as CPs’
final project reports (also known as “completion reports”), which the CO designs and
standardizes in agreement with partners and includes in Field Level Agreements.
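The compilation of CP distribution reports into activity-level totals can be sketched as follows. The record structure and field names here are assumptions for illustration, not the actual DR template.

```python
from collections import defaultdict

# Hypothetical distribution-report records; the real DR template differs.
reports = [
    {"cp": "CP-1", "site": "Kassala", "male": 420, "female": 480, "mt_food": 12.5, "cbt_usd": 0},
    {"cp": "CP-1", "site": "Gedaref", "male": 310, "female": 290, "mt_food": 8.0,  "cbt_usd": 0},
    {"cp": "CP-2", "site": "Kassala", "male": 150, "female": 200, "mt_food": 0,    "cbt_usd": 21000},
]

# Aggregate beneficiaries reached and transfers distributed across all reports
totals = defaultdict(float)
for r in reports:
    totals["beneficiaries"] += r["male"] + r["female"]
    totals["mt_food"] += r["mt_food"]
    totals["cbt_usd"] += r["cbt_usd"]

print(dict(totals))  # {'beneficiaries': 1850.0, 'mt_food': 20.5, 'cbt_usd': 21000.0}
```

In the actual workflow this aggregation happens in COMET after partner reports are validated; the sketch only shows the arithmetic of rolling monthly records up to totals disaggregated by the fields the DR captures.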
4.3. Outcome monitoring
The CO M&E team is responsible for planning and coordinating outcome monitoring
data collection. The CO M&E team should prepare a detailed “project plan” prior to any
outcome monitoring survey. The plan should include the following elements:
The main objectives described in the plan can also be used to communicate about the
data collection activity with all stakeholders and can be translated for sharing, for
example, with relevant authorities such as HAC or COR to obtain the required permits.
Outcome monitoring is conducted at Strategic Objective levels (SOs) twice a year for
SO1 and once a year for SO2 and SO3 in accordance with the CSP outcome indicators
data frequency mentioned in the MRE plan.
The sample size for outcome monitoring will be representative of the target population,
aiming at statistical reliability with a 95 percent confidence level (minimum 90 percent),
±5 percent precision, and the previously measured prevalence of the main indicator to
be assessed.
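For illustration, the sample size implied by these parameters can be computed with Cochran's formula for proportions. This is a minimal sketch: finite-population correction and design effects for cluster sampling are omitted.

```python
import math

def sample_size(prevalence: float, confidence_z: float = 1.96, precision: float = 0.05) -> int:
    """Cochran's formula for a proportion-based indicator.

    confidence_z: z-score of the confidence level (1.96 for 95%, 1.645 for 90%)
    precision: half-width of the confidence interval (0.05 = +/-5 percent)
    """
    n = (confidence_z ** 2) * prevalence * (1 - prevalence) / (precision ** 2)
    return math.ceil(n)

# A prevalence of 50% is the most conservative assumption (largest sample)
print(sample_size(0.5))         # 385 households at 95% confidence, +/-5% precision
print(sample_size(0.5, 1.645))  # 271 households at the 90% minimum confidence level
```

When a previously measured prevalence is available, it replaces the conservative 0.5, which reduces the required sample; for cluster samples the result is typically multiplied by a design effect.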
WFP Sudan conducts sample household surveys to track the CSP outcome indicators on
food security and livelihoods, nutrition indicators, and crosscutting indicators that
include gender, protection, accountability to affected populations, and the environment.
Outcome monitoring for institutional-based activities such as School Feeding, Home
Grown School Feeding and Nutrition treatment, is done through desk reviews and
Nutrition Database. The common methods used by WFP Sudan to collect outcome data
are therefore household surveys, using both quantitative and qualitative data collection
methods. The Outcome Monitoring exercise under SO1 is conducted jointly with the
VAM team through the Food Security Monitoring System (FSMS) exercise. Data
collection is primarily carried out by staff from both WFP and its government partners.
However, if the FSMS cannot be carried out or is delayed, the CO M&E Unit may request
the TPM service provider to assist with the outcome monitoring data collection
exercise. In the event of security challenges or access constraints, remote monitoring
could be envisaged to collect outcome data. This is subject to the availability of the
beneficiary phone number database and to network coverage.
Data collection tools are reviewed before each outcome monitoring activity to align with
corporate guidelines and incorporate new or adjusted needs. Before data collection,
enumerators are trained on the revised tools (questionnaires), sampling techniques and
basic research principles of data collection.
Desk review-based outcome results tracking, specifically for School Feeding and
Nutrition activities, is directly managed by the responsible technical units. For
Nutrition activities, data on nutrition performance indicators are reported and tracked
through the National Nutrition Information System (NIS), while School Feeding outcome
indicators are reported and tracked by the School Feeding Unit. In addition, user
satisfaction surveys are conducted annually by the responsible technical units (UNHAS,
Log Cluster) to report on user satisfaction-related outcome indicators. The CO M&E Unit
is responsible for tracking implementation to ensure that results are reported in a
timely manner and uploaded in COMET, and that operational adjustments are made.
Table 3 summarizes the key roles and responsibilities for planning, preparing, and
implementing outcome monitoring.
STEP 5: Capture, compile and validate data
The completeness and accuracy of data are essential for decision-making
purposes. Therefore, it is critical to ensure that all collected data during monitoring are
uploaded to the MoDa server before moving to the analysis step. This requires following
up with data collectors to ensure submission of all expected data sets as well as with
partners for output data from regular reporting as per agreements to be uploaded into
COMET.
For outcome and output levels, data should be processed and validated before final
values are entered into COMET. Where representative sampling is used, validation
includes comparison of datasets with the expected number of cases based on the
sampling, and determining whether the dataset is adequate for analysis. For output
data, validation and reconciliation must be done with relevant documents and other
data sources. Data required for corporate reporting (Annual Country Reports, ACRs)
should be reconciled for consistency on a quarterly basis.
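The validation step of comparing received datasets against the expected number of cases from the sampling can be sketched as follows; the site names, planned samples, and 90-percent threshold are hypothetical illustrations, not SOP values.

```python
# Hypothetical completeness check before analysis: compare records received on
# the MoDa server against the planned sample per site. Site names, planned
# figures, and the threshold are assumptions for illustration only.
expected = {"Kassala": 120, "Port Sudan": 95, "Gedaref": 110}   # planned sample per site
received = {"Kassala": 118, "Port Sudan": 95, "Gedaref": 87}    # records on the server

def completeness_report(expected: dict, received: dict, threshold: float = 0.9) -> dict:
    """Flag sites whose submission rate falls below the acceptable threshold."""
    report = {}
    for site, planned in expected.items():
        rate = received.get(site, 0) / planned
        report[site] = {"rate": round(rate, 2), "adequate": rate >= threshold}
    return report

for site, result in completeness_report(expected, received).items():
    print(site, result)
```

Sites flagged as inadequate would trigger follow-up with data collectors before the dataset is passed to the analysis step, rather than being silently analysed with an unrepresentative sample.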
In addition, the CO M&E Unit will implement the corporate guidance to address the
need for systematic and consistent practices for ensuring monitoring systems produce
high quality data that measures the outputs, outcomes, cross-cutting priorities, and
processes of WFP’s programmes at the CO level. The guidance outlines two approaches
for ensuring high quality data:
- Preventive measures: The first, setting up systems, processes, and tools mainly
prior to data collection, ensures that quality is an inherent characteristic of the
data produced. This is an ex-ante method for guaranteeing data quality through
the design of robust monitoring systems, anticipative action, and management
of staff resources.
- Detective controls: The second method occurs after data collection and consists
of ex-post checks on the data generated and reported, to identify possible
shortcomings in the five dimensions of data quality [4].

[4] Completeness, timeliness, reliability, validity, and integrity. For more information, kindly refer to the corporate guidance on Data Quality.
STEP 6: Analyse data and prepare information products
The CO M&E unit is responsible for producing the periodic reports and information
products summarized in Table 2.
Besides the periodic reports listed above, the Sudan CO also develops dashboards in
accordance with information needs. To facilitate communication and coordination, a
directory of the Sudan CO M&E dashboards [6] is available and should be updated
regularly.
[5] Before being shared externally, the reports should be vetted in content for an external audience and approved by the Head of M&E and the Head of Programme.
[6] The Sudan CO M&E dashboards directory is available here.
Regarding the Annual Country Report (ACR), while the overall coordination is under the
Performance and Reports Unit, the CO M&E unit holds the following responsibilities:
• Coordinate with technical units in setting annual targets for all outcome
indicators and ensure that COMET is updated in a timely manner.
• Ensure timely availability/generation of output and outcome results in
alignment with the approved CSP logframe.
• Ensure timely, complete, and quality data entry of output and outcome results
in COMET.
• Prepare a waiver for all indicators not being reported and seek management
approval for further sharing with the Regional Bureau.
• Support technical units with the preparation of narratives for outcome-level
results, by ensuring that results, including short narratives, are shared with
technical units in preparation for the ACR.
STEP 7: Make use of monitoring findings: Take action and document lessons
The CO M&E team will maximize the internal dissemination of monitoring evidence to
ensure that the data collection has been relevant and that the necessary information
has been derived from it. Approval from management is required before sharing
monitoring information products with external audiences.
In addition, regular meetings between M&E, programme, other technical staff, and
management are required to review monitoring findings and implement actions to
improve programme implementation and design (see Table 3). The output of these
meetings should be action plans stating the agreed actions based on monitoring
findings, those responsible for implementation, and the timelines. A tracking system,
presented in Annex 2, is established to ensure that agreed action plans are
implemented.
STEP 8: Conduct evaluations or reviews
This section describes the procedures for conducting reviews and evaluations. These
procedures may be applicable to other specialized assessments.
[7] Please refer to the Sudan CO Monitoring Issues Escalation SOP for more information.
[8] In addition to the regular meetings, ad hoc meetings to review monitoring findings could be organised as required.
[9] The Sudan CO programme team organizes a bi-weekly programme meeting. This meeting should also be used to discuss monitoring findings, among other topics.
According to the WFP Evaluation Policy (2022), a review is the periodic or ad hoc
assessment of the performance of a programmatic intervention, or a specific aspect of
it, informing operational decision-making and supporting learning and accountability.
Reviews do not have to conform to specified external reporting or publication
requirements or to the international standards applicable to evaluation, but they must
comply with the standards of the United Nations System-wide Action Plan on Gender
Equality and the Empowerment of Women. The Evaluation Policy also defines an
evaluation as an assessment, conducted as systematically and impartially as possible, of
an activity, project, programme, strategy, policy, topic, theme, sector, operational area,
or institutional performance. It analyses the level of achievement of both expected and
unexpected results by examining the results chain, processes, contextual factors, and
causality using appropriate criteria such as relevance, effectiveness, efficiency, impact,
and sustainability. Evaluations adhere to the United Nations definition of evaluation.
There are three categories of evaluations in the WFP context: impact evaluation (OEV
led), centralized evaluation (OEV led) and decentralized evaluation (managed by
CO/RBN).
This SOP, however, focuses on decentralized evaluations, specifically those managed at
country office level. The Decentralized Evaluation Quality Assurance System (DEQAS)
provides detailed guidance on the management of decentralized evaluations. Detailed
information on the different types of evaluation can be accessed here.
There are various types of decentralized evaluation, which are determined by the
nature of the subject of the evaluation, as follows:
8.1. Planning
Planning of reviews and evaluations should be done either as part of CSP design or as
part of annual or periodic reviews of the CSP. CSP reviews respond to CO ad hoc needs
or demands (i.e. for learning), or to donor demands or funding proposal requirements.
The planning should respond to the learning and accountability needs of the CSP.
Planning at CSP design is encouraged to ensure that funding needs are also budgeted
for. All planned reviews and evaluations and their implementation timelines should be
included in the annual VAM Monitoring & Evaluation Planning and Budgeting Tool.
Table 4 provides the key planning steps for reviews and evaluations.
Table 4: Key planning and preparation steps for reviews and evaluations

Reviews:
1. Ensure that the surveys have been included in the MRE plans and budgeted for.
2. Decide whether the review will be done by an externally or internally recruited consultant.
3. Prepare a TOR with clear purpose, scope, and budget in consultation with the relevant technical unit.
4. Seek endorsement from the technical unit, clearance from the Head of MEAL, and approval by the Head of RAM.
5. Organize an internal review team or recruit the external consultants/TDY.

Evaluations:
1. Ensure that the surveys have been included in the MRE plans and budgeted for.
2. Decide whether the survey will be done by an externally or internally recruited consultant.
3. Identify the evaluation manager and establish the evaluation committee (Head of M&E).
4. Prepare a TOR with clear purpose, scope, and budget in consultation with the relevant technical unit.
5. Establish the evaluation committee and evaluation reference group.
8.2 Inception
8.3. Data collection and management
The approach to data collection and management differs depending on whether the
review or evaluation is managed internally by the Sudan CO or externally. In the case of
a review/evaluation managed by WFP CO, data collection tools are developed by WFP,
data collection is done by WFP monitors and analysed by the M&E unit at the country
office. When the review/evaluation is managed externally, development of tools, data
collection planning and analysis are done by the external team(s). WFP provides
coordination and administrative support. If the review/evaluation team does not have
the capacity to recruit data collection teams, WFP provides support through its
monitors or coordinates recruitment of data collection teams through hub offices. For
evaluations, WFP M&E support is restricted to coordination and facilitation only. All
processes will be guided by the approved inception report. Table 5 summarizes the key
steps for data collection and analysis.
Reviews (Internal):
1. Prepare a survey tool and pre-test by WFP.
2. Develop a deployment plan in consultation with FOs.
3. Data collection, including close data quality checks during the survey.
4. Share the preliminary findings for feedback.
5. Incorporate feedback in the final review findings.

Reviews (External):
1. Review and provide inputs on the survey tool to the review team.
2. Liaise with FOs and CPs to support the review team’s field mission.
3. Field work by the review team.
4. Provide feedback on preliminary findings shared by the review team.

Evaluations:
1. Review and provide inputs on the survey tool to the review team.
2. Liaise with FOs and CPs to support the review team’s field mission.
3. Data collection to be conducted by the evaluation team.
4. Provide requested secondary data.
5. Coordinate provision of feedback on preliminary findings from relevant CO technical units.
6. Share feedback on the draft evaluation findings with the evaluation team.
Review and evaluation reporting use the recommended corporate templates.10 The reporting process has three stages: preparation of the first draft report; preparation of the second draft report, following the incorporation of comments from the targeted audience; and the final report, after presentation of the results based on the second draft. The three stages ensure quality reporting and that the content of the report stays within, and fully covers, the scope of the evaluation. For evaluations specifically, the reports undergo a quality check by an externally outsourced quality support service, using a standard checklist.
10 Available here
Throughout the reporting process, the report passes through several approval stages. While the evaluation report approval process strictly follows the corporate guidelines (DEQAS), the approval process for reports from other assessments and reviews is decided at the CO. Reviews are generally approved by CO management, while evaluation reports are approved by the chair of the evaluation committee (CD/DCD).
The dissemination and follow-up phase has two main purposes, both of which aim at maximizing the use of the evaluation findings. The key deliverables are: 1) development of a management response to the evaluation recommendations; and 2) publication of the report and dissemination of the findings.
Following the approval of review/evaluation reports, the M&E unit coordinates the process of reviewing the recommendations and assigning them to the relevant technical units. The technical units are tasked with identifying corrective actions in response to the recommendations, with implementation timelines, using the corporate management response template. The M&E unit then follows up on the implementation of these actions against the agreed timelines.
In addition to the internal roles and responsibilities described in Table 8, WFP Sudan partners (namely CPs and TPM) also play an important role in the monitoring, review, and evaluation function within the Sudan Country Office. Within the M&E workstreams, CPs are expected to:

- Share the required information on food and cash distributions, such as the exact locations and dates, to facilitate monitoring visits by WFP Field Monitors and TPM monitors.
- Maintain close communication with WFP FMAs and TPMs to conduct monitoring visits.
- Certify TPM attendance at distribution sites during the spot-checks conducted by WFP.
- Prepare and submit the monthly reports to WFP on time, as indicated in the Field Level Agreements (FLAs).
- Support outcome monitoring data collection as required.
- Participate in review/evaluation exercises as required.
These roles and responsibilities should be communicated to all CPs during the onboarding stage. The TPM roles and responsibilities are described in detail in the Sudan CO SOPs for Third-Party Monitoring (TPM).
With regard to segregation of duties, it is important to separate duties between the M&E team and the teams involved in programme implementation. Such separation helps avoid conflicts of interest and ensures that the staff who design programmes and Field Level Agreements (FLAs) are not the same as those who monitor programme implementation and outcomes. CO management, together with hub office management, also needs to ensure that M&E teams are fully dedicated to their monitoring activities and are not diverted to other programme-related tasks.
Annexes