WFP 0000011896
[Figure: CSP Monitoring Cycle — 1. Logframes; 2. MRE Plan; 3. Toolkit; 4. Data collection; 5. Data validation; 6. Data analysis; 7. Take action; 8. Evaluations and Reviews]
WFP/M&E SOPs
Page 2 of 26
Contents
1. Introduction
1.1 Why Monitoring Standard Operating Procedures?
1.2 What are the Standard Operating Procedures?
1.3 Who are the SOPs intended for?
1.4 When are the SOPs effective?
Step 1: Prepare or revise performance monitoring narratives and logical frameworks
Step 2: Prepare CSP Monitoring, Review and Evaluation (MRE) Plan and draft M&E Budget
Step 3: Develop a monitoring toolkit (methodologies, tools, analysis plan, implementation plan)
Step 4: Collect primary data and collate secondary data
Step 5: Capture, compile and validate data
Step 6: Analyse data and prepare information products
Step 7: Make use of monitoring findings: Take action and document lessons
Step 8: Conduct evaluations or reviews
3. Annexes
Annex 1: Sample Analysis Plan
Annex 2: Sample Action Sheet/Tracking system
Annex 3: SOP Checklist
Annex 4: Generic Template to Apply the SOPs at Country Level
Page 3 of 26
WFP/M&E SOPs
1. Introduction
1.1 Why Monitoring Standard Operating Procedures?
In the absence of standard processes and procedures, staff can be expected to conduct core
business functions in an unharmonized, ad hoc manner. In such cases, organizations often choose
to develop Standard Operating Procedures (SOPs) to achieve uniformity of practice in relation to
specific business functions. Before the introduction of the current SOPs in 2014, WFP country
offices relied on differing processes, methodologies and frequencies to conduct monitoring, even in
very similar operational contexts. The absence of harmonized standards and procedures,
minimum requirements and expectations meant that WFP monitoring was overly dependent on
the individual capacities, competencies and initiatives of staff members charged with M&E
responsibilities at a given time. When such staff members departed, the processes and systems
were likely to deteriorate or be replaced with new initiatives by their successors. This had three
undesirable consequences: uneven quality of M&E systems, cost inefficiencies, and historical data
inconsistencies and gaps that limited the use of monitoring data. The Monitoring SOPs were
developed to address these issues. They are intended to ensure that anyone charged with the
responsibility of managing the M&E function at a country office follows the same basic steps, and
is guided by the same underlying principles and processes. Ultimately, the SOPs contribute to
improvements in WFP monitoring and are helping WFP better demonstrate its results.
1.2 What are the Standard Operating Procedures?
These SOPs describe common procedures for monitoring WFP Country Strategic Plans (CSPs), and propose timing and responsibilities for each of the eight steps included in the WFP CSP
monitoring cycle (see Figure 1). Within each step of the monitoring cycle, the SOPs define standard
practices and expectations for monitoring the processes, outputs and outcomes associated with
WFP CSPs. The SOPs are organized in relation to this same cycle beginning with upstream
preparation of CSPs’ logical frameworks (logframes) and performance-monitoring narratives,
through to ensuring use is made of monitoring findings, including as input to CSPs’ reviews and
evaluations. While the SOPs are considered mandatory for the conduct of CSP-level monitoring at
all country offices, the proposed timelines and responsibilities listed under each SOP are indicative
only, as they will need to be reviewed and adjusted according to the country context.
Figure 1: CSP Monitoring Cycle
Further guidance: links to guidance describing how to carry out the procedure
Step 1: Prepare or revise performance monitoring narratives and logical frameworks
Rationale
The preparation of the CSP document is guided by the WFP Strategic Plan and includes a narrative section on
performance monitoring that should concisely explain how activities will be monitored. The
intended results of the CSPs are summarized in the logframe, which captures the following.
• WFP strategic goals, strategic objectives and strategic results to which the CSP intends to
contribute. These are selected from the WFP Strategic Plan and link directly to Sustainable
Development Goals 2 (SDG 2: End hunger) and 17 (SDG 17: Partnership for implementation of
the SDGs).
• Related strategic outcomes, including their focus area, crosscutting results and outputs, as well as
related indicators. Strategic outcomes and outputs are free text results aligned to national
priorities, linked to corporate categories listed in the WFP CRF to allow for consolidation of
results and corporate reporting. Contributions to other SDGs are also recorded at output level.
Identifying the risks and assumptions that may affect the achievement of results is also relevant at
this stage, to ensure that mitigation actions are put in place. Baselines must be established and
targets set for each selected strategic outcome indicator within three months before/after the start
of the activity, and it is important to consult relevant partners, including government partners,
when setting targets.1
The development of logframes is an important first step in the CSP monitoring cycle, establishing
intended results and accompanying indicators to be tracked during the course of the CSP life.
Logframe development in WFP is guided by the CRF, which provides a compendium of outcomes
and output categories and accompanying corporate indicators. Additional indicators can be
developed by country offices and used once vetted by the relevant technical units at
Headquarters. The CRF is to be implemented in accordance with its Business Rules, the first part of
which directly relates to the development of CSP logframes.
CRF Business Rules
Logframe design and indicator selection
(i) The design of a CSP logframe should be based on the strategic outcome identified in the narrative
section of the project document.
(ii) Each free-text CSP strategic outcome in the narrative section of the CSP must be aligned to only
one strategic outcome category of the CRF.
(iii) Each free-text CSP output in the narrative section of the CSP can be aligned to one or more
output categories of the CRF.
(iv) Each free-text CSP activity in the narrative section of the CSP must be aligned to only one activity
category of the CRF.
(v) Some strategic outcomes under Strategic Results 1, 3 and 4 might be pursued using a nutrition
sensitive approach. This should be flagged in the logframe and relevant nutrition sensitive indicators
should be selected. Strategic Result 2 and corresponding outcome categories address nutrition-specific
programmes.
(vi) For each strategic outcome category selected in the logframe, all relevant outcome indicators must
be selected from the CRF.
(vii) For each output category selected in the logframe, all relevant output indicators must be selected
from the CRF.
1For example, technical partners in nutrition, who will be implementing some of the nutrition interventions, need to be consulted where appropriate,
unless the targets selected are based on universally acceptable thresholds like the SPHERE standards.
(viii) The WFP country offices retain the flexibility to complement the CRF with country-specific output
and outcome indicators as required; country-specific indicators will be added to COMET upon
clearance.
(ix) Cross-cutting results and indicators are mandatory for Strategic Goal 1 as applicable and must be
included in all CSP logframes.
(x) In line with international norms, the monitoring of SDG indicators and of selected national,
subnational and thematic indicators are the responsibility of national authorities with the assistance
of international organizations.
(xi) For every Strategic Outcome, assumptions must be formulated. These should reflect the risks likely
to influence the achievement of intended results and, if relevant, the mitigation actions that country
offices adopt to offset these risks. Assumptions can be internal or external to the programme and
should be stated in positive language.
The Country Office Tool for Managing (programme operations) Effectively (COMET) is used to draft
and validate CSP logframes. In COMET, location and/or beneficiary groups are essential fields for
planning and recording achievement values for outcome and output indicators.2 For instance, the
strategic outcome category “Maintained/enhanced individual and household access to adequate
food” could be planned and reported separately for refugees, IDPs and host populations. It is
important therefore to consider the beneficiary groups and locations when preparing the logframe
to be able to enter it in COMET. Regional bureaux and Headquarters staff will view, comment on
and validate the CSP logframes before they are submitted to the Project Review Committee (PRC)
for review and approval.
Responsibilities
- Draft CSP monitoring narrative: M&E staff at country-office level
- Draft CSP logframe: M&E staff in consultation with programme staff and activity managers
- Validate logframe: M&E staff
- Approve logframe: Head of programme/Deputy Country Director
- Enter logframe in COMET: M&E staff/COMET users
Timing
Logframes should be completed and submitted for approval at the same time as the CSP narrative
and other related documents ahead of the PRC meeting.
Further guidance
- Corporate Results Framework (CRF)
- CRF Business Rules
- COMET Design Module Manual
- Downloadable format of Logframe template
- Monitoring Job Aids: How to build your logframe?
2The COMET glossary provides a definition of “location” and “beneficiary/location groups” at:
http://docustore.wfp.org/stellent/groups/public/documents/manual_guide_proced/wfp257930.pdf
Step 2: Prepare CSP Monitoring, Review and Evaluation (MRE) Plan and draft M&E Budget
SOP 3: A Monitoring, Review and Evaluation Plan is developed.
SOP 4: A draft M&E budget is prepared reflecting costs of the MRE Plan.
NB: The draft M&E budget will be finalized at a later stage, once detailed monitoring implementation plans
are prepared.
Rationale
To ensure that indicators are systematically monitored and reported on, and that reviews and
evaluations are planned and budgeted for, a Monitoring, Review and Evaluation (MRE) Plan is
required. The MRE Plan captures all indicators included in the CSP logframe and any additional
outcome, output and process indicators the country office may have selected to meet information
needs beyond what is articulated in the approved logframe. These might include indicators
responding to specific donor requirements or others that country office managers or external
stakeholders deem important. In addition to outcome and output indicators, monitoring of process
indicators and cross-cutting indicators (for gender, protection and accountability to affected
populations, and environment) should also be included.
The MRE Plan describes how each indicator will be monitored, and is presented as a matrix, as
shown in Table 1 below. It also includes basic information on reviews and evaluation plans. The
MRE Plan can be adapted to suit varying country needs and context. The process for preparing the
MRE Plan should be participatory, and the final plan endorsed by stakeholders (WFP staff at field
offices and country office, government and NGO staff). Where partners are responsible for
monitoring, this should be articulated in respective field-level agreements, with tools (e.g.
questionnaires) attached as annexes. In determining the periodicity of monitoring each indicator,
the MRE Plan should adhere to the CRF business rules highlighted under SOP 1.
The methodology to calculate outcome and output indicators is explained in the CRF Indicator
Compendium, where you can find the indicator definition, data sources, guidance on how to set
baselines and targets, indicator calculation and interpretation, and examples. The methodology to
calculate these indicators has been developed by the relevant technical units at Headquarters.
Table 1: Sample MRE Plan matrix (columns: 1 Indicator Name; 2 Data Source; 3 Data collection method; 4 Baseline establishment; 5 Frequency of follow-up; 6 Responsible for data collection; 7 Reports; 8 When/how/who)

Strategic Goal 1: Support countries to achieve zero hunger
Strategic Objective 1: End hunger by protecting access to food
Strategic Result 1: Everyone has access to food
Strategic Outcome 1: Vulnerable households and communities at risk of food insecurity in the Far North, North, Adamaoua and Eastern Regions have safe year-round access to adequate food
Outcome Category 1.1: Maintained/enhanced individual and household access to adequate food

Outcome indicator 1.1.1: Food Consumption Score, households disaggregated by sex of household head
- Data source: PDM, mVAM, FSA (EFSA, CFSVA, FSNMS)
- Baseline establishment: April 2017
- Frequency of follow-up: bimonthly (PDM); quarterly (mVAM, FSNMS); annually (EFSA, CFSVA)
- Responsible for data collection: M&E/Field Monitoring/Programme Assistant/VAM
- Reports: PDM report/Food Security report
- When/how/who: SPR, Management, SitReps, Executive briefs, programme, COMET/ACR

Output 1.1: Women, men, girls and boys, as well as communities at risk, including refugees and IDPs in crisis-affected areas, receive assistance to meet their basic food and nutrition requirements
Output Category A.1: Unconditional resources transferred

Output indicator A.1: Number of women, men, boys and girls receiving food/cash-based transfers/commodity vouchers
- Data source: distribution registers, cooperating partner reports
- Baseline establishment: NA
- Frequency of follow-up: monthly
- Responsible for data collection: M&E/Field Monitoring/Programme Assistant/CP
- Reports: monthly M&E report
- When/how/who: COMET/Management meeting/Mid-year performance review/ACR/SitReps/Executive Briefs
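The Food Consumption Score shown above is, in brief, a weighted sum of the number of days (within a 7-day recall period) each food group was consumed by the household. The sketch below uses the commonly published corporate weights and the 21/35 thresholds; it is illustrative only, and the CRF Indicator Compendium remains the authoritative definition.

```python
# Illustrative Food Consumption Score (FCS) computation.
# Weights/thresholds follow the commonly published WFP methodology;
# consult the CRF Indicator Compendium for the authoritative version.

FCS_WEIGHTS = {
    "staples": 2.0, "pulses": 3.0, "vegetables": 1.0, "fruit": 1.0,
    "meat_fish": 4.0, "dairy": 4.0, "sugar": 0.5, "oil": 0.5,
}

def food_consumption_score(days_eaten: dict) -> float:
    """days_eaten maps each food group to the number of days (0-7)
    it was consumed during the 7-day recall period."""
    return sum(FCS_WEIGHTS[g] * min(days, 7) for g, days in days_eaten.items())

def fcs_group(score: float) -> str:
    # 21/35 cut-offs; offices with high sugar/oil consumption often use 28/42.
    if score <= 21:
        return "poor"
    if score <= 35:
        return "borderline"
    return "acceptable"

household = {"staples": 7, "pulses": 2, "vegetables": 4, "fruit": 1,
             "meat_fish": 1, "dairy": 0, "sugar": 5, "oil": 6}
score = food_consumption_score(household)
print(score, fcs_group(score))  # prints: 34.5 borderline
```

Follow-up values computed this way can then be compared against the baseline (here, April 2017) to report trends against targets.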
To help ensure that adequate resources are allocated for monitoring activities, the MRE Plan needs
to be costed. The structure of an M&E budget should be based on: (a) the organizational structure
of the country office (e.g. area/sub/field offices structures); (b) whether M&E costs are borne at
central level or decentralized; (c) whether there is a dedicated M&E team or if it is integrated within
other functions; (d) whether outsourcing is involved; (e) considerations as regards monitoring
processes, periodicity and indicator clustering (e.g. Food consumption score, Coping strategy index
and Food expenditure share are to be collected through one household survey); (f) requirements to
participate in the UN Development Assistance Framework (UNDAF) and other joint M&E processes
(e.g. membership of task forces and contribution to UNDAF reviews). After consideration of these
variables, a budget should be drafted covering the following ten key cost categories: (1) staff
salary cost;3 (2) staff danger/hazard pay and hardship allowances; (3) staff overtime; (4) staff IT
per capita cost; (5) international consultant MSLS costs; (6) contracted services; (7) travel cost; (8)
TC/IT equipment; (9) vehicle leasing and running cost; and (10) vehicle acquisition cost.
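As an illustration of the costing exercise, the cost categories listed above can be captured in a simple worksheet or script and totalled. All figures below are hypothetical, for illustration only.

```python
# Hypothetical draft M&E budget; category labels mirror the ten key
# cost categories listed above, figures are illustrative only.
draft_me_budget_usd = {
    "staff salary": 120_000,
    "danger/hazard pay and hardship allowances": 8_000,
    "staff overtime": 3_000,
    "staff IT per-capita cost": 4_500,
    "international consultant MSLS": 15_000,
    "contracted services": 40_000,
    "travel": 22_000,
    "TC/IT equipment": 10_000,
    "vehicle leasing and running": 18_000,
    "vehicle acquisition": 0,  # none planned in this example
}
total = sum(draft_me_budget_usd.values())
print(f"Draft M&E budget total: USD {total:,}")  # prints: Draft M&E budget total: USD 240,500
```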
Responsibility
- Draft MRE Plan: M&E staff in consultation with activity managers and other programme staff
- Validate MRE Plan: M&E staff
- Approve MRE Plan: Head of programme/ Deputy Country Director
- Draft M&E Budget: M&E staff in consultation with finance/programme staff
Timing
If monitoring activities and related costs are to be taken into consideration when preparing CSP
budgets, it is important that the MRE Plan is drafted when the CSP is being prepared. Based on
comments received on the CSP document and logframe during the PRC, the final MRE Plan should
be completed within a month after the approval of the CSP, then shared with the respective regional
bureau.
3Should include staff costs for full time M&E staff and proportionate costs for those dedicating part of their time to monitoring.
Further guidance
- CRF Indicator Compendium
- Downloadable format of MRE Plan matrix template
- Guidance on Country Portfolio Budget
4 Post Distribution Monitoring-PDM, Distribution Monitoring-DM, Beneficiary Contact Monitoring-BCM, Food Security Outcome Monitoring -FSOM, etc.
5 Data collected by Government or other organizations for other purposes e.g. school enrolment data, demographic and health survey data, nutrition
survey data from NGOs, etc.
6 For outcome indicators, sampling needs to be representative; for process monitoring, typically not.
Step 3: Develop a monitoring toolkit (methodologies, tools, analysis plan, implementation plan)
At this stage, the draft M&E budget should be finalized with more specific budget lines, under appropriate cost categories. Finance staff should
be consulted during this stage to ensure that cost items are placed in the correct budget categories.
The final budget should also be informed by an analysis of the M&E capacities of WFP and its
partners, in order to build in adequate resources for capacity development, as well as by the
identification of resources needed when relying on external capacity.
Timing
Development of the methods and required tools should begin as soon as the MRE Plan is complete
and should be finalized before CSP activities begin. Once the tools are finalized and tested, the
Analysis Plan should be prepared. If data is to be stored in a database,7 this must be created prior
to start of data collection. If electronic devices are to be used, these must be procured, programmed,
tested and piloted prior to the data collection exercise.
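The "database before data collection" requirement above can be sketched minimally with the standard-library sqlite3 module (footnote 7 notes that offices also use Excel, Access or SQL databases). Table and field names here are illustrative assumptions, not a corporate format.

```python
# Minimal monitoring database created before data collection begins.
# Schema is illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")  # use a file path in practice
conn.execute("""
    CREATE TABLE hh_interviews (
        hh_id        TEXT NOT NULL,
        site         TEXT NOT NULL,
        collected_on TEXT NOT NULL,      -- ISO date of interview
        fcs          REAL,               -- Food Consumption Score
        PRIMARY KEY (hh_id, collected_on)
    )
""")
conn.execute(
    "INSERT INTO hh_interviews VALUES (?, ?, ?, ?)",
    ("001", "Maroua", "2017-04-12", 34.5),
)
n = conn.execute("SELECT COUNT(*) FROM hh_interviews").fetchone()[0]
print(n)  # prints: 1
```

Creating (and piloting) the schema up front means enumerators and data-entry staff work against a fixed structure from the first day of collection.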
Responsibilities
- Prepare implementation plan: M&E staff, in consultation with programme staff
- Design monitoring tools: M&E staff in consultation with technical staff
- Prepare Analysis Plan: M&E staff in consultation with technical staff
- Finalize M&E budget: M&E staff, in consultation with finance/programme staff
- Approve M&E budget: Head of programme/Deputy Country Director
- Develop/adopt databases: M&E officer with guidance from IT staff8
- Prepare list of activity sites: Area/sub/field office focal points
Further guidance
- CRF Indicator Compendium
- Data Collection Toolkit for Monitoring
- Guidance on Sampling for household data collection
- Guidance on Remote technology for WFP programme monitoring
- Information Technology Directive (ODI2012/001)
7Most offices use Microsoft Excel sheets, Microsoft Access databases or SQL databases.
8In line with appropriate IT polices and guidelines.
Step 4: Collect primary data and collate secondary data
For output data, this stage involves partner staff filling in and submitting reporting formats to WFP as
per agreements. Once gathered, baseline values are entered or updated in COMET at this same stage.
Timing
Baseline values for outcome indicators must be established during the CSP design phase or within
three months of the commencement of activities. Thereafter, output, outcome and process
monitoring should adhere to the Minimum Monitoring Requirements as described in M&E
guidance. The interval for monitoring outcome indicators should be short enough to allow
information to be available at critical decision points, and long enough to allow outcomes to be
realized. To allow for comparison of trends in comparable circumstances, it is important that follow-
up monitoring be carried out around the same period/season that the baseline values were
gathered.9
Responsibilities
- Prepare data collection kits: M&E staff/focal points
- Prepare monthly monitoring plan: Area/sub/field office M&E focal points
- Approve monthly monitoring plan: Head of area/sub/field office
- Train data collection staff: Technical staff from M&E, Nutrition, VAM
- Coordinate data collection: M&E staff/focal person
- Collect data: WFP field monitors/enumerators/partner staff
- Receive secondary data: M&E staff/focal points or technical staff
- Update monitoring coverage: M&E focal points at area/sub/field office
- Enter baseline values in COMET: COMET users
Further guidance
- CRF Indicator Compendium
- COMET Design Module Manual
Step 5: Capture, compile and validate data
Rationale
If data is to be used for decision-making purposes, its completeness and accuracy are essential.
Where paper questionnaires are used, quantitative data needs to be entered in a database.
Similarly, if electronic devices are used, data has to be uploaded. This requires following up with
data collectors to ensure submission and entry of all expected data sets as well as with partners
for output data from regular reporting as per agreements. Data compilation involves collection of
all data sets if entry is done separately in delinked databases as well as consolidation of any
qualitative data for narrative purposes. Data validation/cleaning involves checking completeness,
quality and comparability of data to ensure it is adequate for analysis. In cases where data
9E.g if household baseline interviews for Food Consumption Score were carried out during the lean season, follow up monitoring should be carried out
the same season to ensure comparability with the baseline.
collection is outsourced, the contractual document should clarify the exact format in which the
data will be received, whether any preliminary analysis will be required and in what format the
preliminary products will be presented.
For outcome and output levels, data should be processed and validated before final values are
entered into COMET. Where representative sampling is used, validation includes comparison of
datasets with the expected number of cases based on the sampling, and determining whether the
dataset is adequate for analysis. For output data, validation and reconciliation must be done with
relevant documents and other data sources. The team responsible for analysis and preparation of
information products should receive the data sets for analysis as per the implementation plan (if
it is not the same team that is compiling/cleaning data). Data required for corporate reporting –
Standard Project Reports (SPRs) and Annual Country Reports (ACRs) should be reconciled for
consistency on a quarterly basis.
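The validation step described above can be sketched as a simple completeness check, assuming monitoring records arrive as one row per household interview. Field names, the 95% completeness threshold and the record layout are all illustrative assumptions.

```python
# Minimal sketch of dataset validation before values are entered in COMET.
# Record layout and thresholds are assumptions for illustration.

def validate_dataset(records, expected_n, required_fields, min_completeness=0.95):
    """Check sample completeness against the expected number of cases
    and per-record field completeness. Returns (ok, issues)."""
    issues = []
    if len(records) < min_completeness * expected_n:
        issues.append(f"only {len(records)}/{expected_n} expected cases received")
    for i, rec in enumerate(records):
        missing = [f for f in required_fields if rec.get(f) in (None, "")]
        if missing:
            issues.append(f"record {i}: missing {missing}")
    return (not issues, issues)

records = [
    {"hh_id": "001", "site": "Maroua", "fcs": 34.5},
    {"hh_id": "002", "site": "Garoua", "fcs": None},  # incomplete: flag for follow-up
]
ok, issues = validate_dataset(records, expected_n=2,
                              required_fields=["hh_id", "site", "fcs"])
print(ok, issues)
```

A dataset that fails such checks would go back to the data collectors (or the contracted entity) before analysis begins.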
Timing
Data should ideally be received and entered in relevant databases/systems within fourteen days
after collection. Data sets from outsourced processes must be received within the timelines stated
in relevant contractual documents.
Responsibilities
- Enter data in databases: Field Monitors/enumerators/contracted entry staff
- Supervise data entry: M&E staff/field focal persons10/contracted entity
- Compile & validate data: M&E staff/focal persons
- Enter data values into COMET: COMET users
- Receive validated data sets: Analysis staff (M&E staff/VAM/Nutrition, etc.)
Further guidance
- Guidance on Sampling for household data collection
- COMET Design Module Manual
Step 6: Analyse data and prepare information products
Rationale
Data sets gathered under SOP 8 include both qualitative information and quantitative data. The
former often informs immediate actions that cannot wait until analysis of quantitative data is
finalized. Once quantitative data is compiled and validated, it has to be analysed to discern trends,
achievements and challenges guided by the Analysis Plan. Such analysis should include
consideration of the reasons for under-achievement of indicators and actions required to improve
performance. Similarly, success factors contributing to good performance should be identified for
dissemination to other staff. A key output of this analysis is the development of information
products, which might include, for example, monitoring reports, public information materials or
input for donor briefs. These information products should be circulated internally for decision-
making and learning purposes and externally for information sharing and accountability.
Timing
Data should normally be analysed and draft information products prepared and circulated for
comments within three weeks after receiving the datasets. Timelines should be synchronized with
scheduled meetings where findings are to be presented.11
For purposes of corporate reporting, December reports should be ready by early January for
analysis during the annual performance review and for the preparation of ACRs.
Information products communicating monitoring findings (e.g. executive summaries for donor
briefs, technical reports, or Action Sheets for programme managers) should normally be finalized
within seven days after receiving comments. They should then be circulated as per the Monitoring
Implementation Plan.
Responsibilities
- Submit datasets for analysis: M&E staff/focal person/contracted entity
- Review Analysis Plan: M&E staff/focal person
- Analyse data per Analysis Plan: Technical staff from M&E, Nutrition, VAM, etc.
- Enter data values in COMET: COMET users
- Prepare information products: Technical staff from M&E, Nutrition, VAM
- Review and comment on products: Programme staff
- Finalize information products: M&E staff, technical staff from Nutrition, VAM, etc.
Further guidance
- COMET Design Module Manual
Step 7: Make use of monitoring findings: Take action and document lessons
SOP 12: An escalation system is established to ensure required actions are taken at the
appropriate place in a timely manner.
SOP 13: Regular meetings are conducted between M&E, programme and technical staff to
review monitoring findings and implement actions.
Rationale
The main purposes of monitoring are to inform decision-making, spur actions to improve
implementation and inform the (re-)design of CSPs. It is therefore important that procedures be put
in place to ensure that monitoring findings are utilized and actions are taken based on these
findings. This requires establishment of an escalation system so that those corrective actions that
can be taken immediately in the field are acted upon without delay, and those that need to be
escalated to sub-office or country-office level are done so in an efficient manner. Second, regular
meetings between M&E, programme and other technical staff are required. Such meetings should
certainly take place in the context of the Annual Performance Plan (APP) mid- and end-year
performance reviews, and identified actions would subsequently be reflected in partnership
11Timing must be guided by who will use the products and when, e.g. if a report will be presented during a senior management meeting or a donor
briefing, timing should be dictated by the dates of these meetings.
agreements and activities. Action Sheets or Unit Plans of the APP – stating agreed actions, who is
responsible for each action and indicating timelines – should be prepared and discussed during
these meetings. A tracking system is also required to ensure that agreed actions are implemented.
This could again be done using the APP Unit Plans, as this would link actions to individual staff and
to the mid- and end-year reviews of programme and process results for the office. In addition to
informing programmatic decision-making, monitoring is an integral part of the office's overall
performance management process, reflected in annual performance planning and reviews, which
gives offices a clear line of sight across results and an overall view of programme and process
results. Annual Country Reports should subsequently be derived from the end-year CSP
performance review, reporting primarily on indicators from the CRF.
Timing
The escalation system defining which actions can be addressed at the various organizational levels
should be established at the time of CSP preparation. Meetings to review monitoring findings with
programme staff (at both sub-office and country-office levels) and other concerned units are
required at regular intervals, and whenever monitoring data and analysis is being made available.
Action Sheets or Unit Plans of the APP template should be updated and circulated prior to the next
scheduled meeting and following each round of monitoring.
Responsibilities
- Take immediate actions: Monitoring staff/Head of field offices/Partner staff
- Prepare APP Unit Plan or Action Sheets: M&E staff/focal persons
- Convene review meetings: Head of field offices /Head of programme
- Ensure actions are taken: Head of field office/Programme and activity managers/Head of
programme
- Update APP Unit Plan or Action Sheets: M&E staff/focal persons
Further guidance
- APP template (available soon).
12Centralized evaluations are commissioned by OEV which adhere to principles of independence and impartiality and follow the WFP Evaluation Quality
Assurance standards (EQAS) and the United Nations Evaluation Group (UNEG) Standards. Decentralized evaluations are commissioned by RBx or COs
and should follow the same principles and standards. Reviews are commissioned by RBx and COs and do not necessarily follow the same principles and
standards.
Step 8: Conduct evaluations or reviews
Rationale
Reviews and a country portfolio evaluation at the end of the CSP implementation are corporate requirements. Evaluations
and reviews rely on effective monitoring systems that provide credible baseline data and
output/outcome monitoring data. To achieve this, the Implementation Plan prepared under SOP 5
should indicate consolidated data required for CSP review or evaluation purposes. Doing this
ensures that monitoring processes will provide data sets on specific outcomes, in formats that are
useful for CSP level reviews and evaluations, i.e. they will allow for trend analysis. An important
evaluative product is a management response matrix, which summarizes evaluation
recommendations and actions that will be taken to implement them. This is critical to close the
learning loop and to make use of evaluation findings to adjust programmes and monitoring
processes and to (re)-design CSPs and prepare logframes.
Timing
Evaluations should be timed in such a way that recommendations inform decision-making, such as
for preparation of CSP documents. Datasets required for evaluations should be provided during the
inception period.13 Evaluation and review recommendations should be considered as soon as the
reports are disseminated and elements of the monitoring processes that need adjusting are
identified.
Responsibilities
- Consolidate monitoring data for evaluative purposes: M&E staff/Focal point
- Oversee the conduct of centralized evaluations: Office of Evaluation (OEV)
- Oversee the conduct of decentralized evaluations: Head of programme
- Oversee the conduct of reviews: Head of programme
- Ensure use is made of evaluation recommendations: Head of programme
- Update implementation status of recommendations: M&E staff/focal persons
Further guidance
- Guidance on Reviews
- Decentralized Evaluation Quality Assurance System (Office of Evaluation guidance materials)
- Evaluation topic page on the new WFPgo
13 Normally several weeks before the evaluation team visits the CO.
3. Annexes
Annex 1: Sample Analysis Plan
Food security and outcome monitoring (FSOM) - Analysis plan based on monitoring tool

Theme 1: Coverage
Objective: Summarize the monitoring coverage and compare it with planned coverage by livelihood zone, by beneficiary/non-beneficiary and by intervention/programme.
Indicators:
1.1: Number of beneficiary households monitored (as a % of plan)
1.2: Number of non-beneficiary households monitored (as a % of plan)
1.3: Number of HHs monitored per livelihood zone (as a % of total HHs)
1.4: Number of HHs monitored per intervention type/programme

Theme 2: Demographics
Objective: Summarize key demographics of both beneficiaries and non-beneficiaries to better explain the profile of the households, which can be used to understand food security and outcomes in different areas.
Indicators:
2.1: Average household size for beneficiary HHs
2.2: Average household size for non-beneficiary HHs
2.3: % of female-headed beneficiary HHs
2.4: % of female-headed non-beneficiary HHs
2.5: Average age (in years) of the head of beneficiary households
2.6: Average age (in years) of the head of non-beneficiary households

Theme 3: Targeting and registration
Objective: Assess the completeness of beneficiary registration (as reported by HHs in the absence of ration cards) and whether households are receiving other support - cash or food - from other partners.
Indicators:
3.1: % of household members reported to be registered (i.e. counted to receive support)
3.2: % of beneficiary households receiving other food support
3.3: % of non-beneficiary households receiving other food support
3.4: % of beneficiary households receiving other cash support
3.5: % of non-beneficiary households receiving other cash support

Theme 4: Coping strategies
Objective: Assess whether HHs experienced food shortages during the recall period, and how they coped with that shortage.
Indicators:
4.1: % of beneficiary households experiencing a shortage of food/money during the recall period
4.2: % of non-beneficiary households experiencing a shortage of food/money during the recall period
4.3: Mean coping strategy score (CSS) for beneficiary households
4.4: Mean coping strategy score (CSS) for non-beneficiary households
4.5: % of beneficiary households applying coping strategies 1, 2, 3, 4 and 5
4.6: % of non-beneficiary households applying coping strategies 1, 2, 3, 4 and 5

Theme 5: Frequency of meals
Objective: This variable could help identify households with extremely poor food consumption.
Indicators:
5.1: Average number of meals per day for children 6 months to under 5 years, children 5 to 18 years, and adults in beneficiary households
5.2: Average number of meals per day for children 6 months to under 5 years, children 5 to 18 years, and adults in non-beneficiary households
5.3: % of beneficiary households where children below 5 years, children 5 to 18 years and adults consume 1, 2, 3 and 4 or more meals
5.4: % of non-beneficiary households where children below 5 years, children 5 to 18 years and adults consume 1, 2, 3 and 4 or more meals

Theme 6: Food consumption score
Indicators:
6.1: Average food consumption score for beneficiary households
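Parts of the FSOM analysis plan above can be sketched in code. The following is a minimal illustration only, assuming a hypothetical household-level dataset; all column names (beneficiary, female_headed, css) and the planned sample sizes are made up for the example, and a real FSOM dataset will differ:

```python
# Minimal sketch of selected Annex 1 indicators on a hypothetical dataset.
import pandas as pd

monitored = pd.DataFrame({
    "hh_id": [1, 2, 3, 4, 5, 6],
    "beneficiary": [True, True, True, False, False, False],
    "female_headed": [True, False, True, False, True, False],
    "css": [12, 18, 9, 21, 15, 24],  # coping strategy score per household
})
planned = {"beneficiary": 4, "non_beneficiary": 4}  # planned sample sizes

# Indicator 1.1: beneficiary households monitored as a % of plan
ben = monitored[monitored["beneficiary"]]
coverage_ben = 100 * len(ben) / planned["beneficiary"]

# Indicators 4.3/4.4: mean coping strategy score by beneficiary status
mean_css = monitored.groupby("beneficiary")["css"].mean()

# Indicator 2.3: % of female-headed beneficiary households
pct_female_ben = 100 * ben["female_headed"].mean()

print(coverage_ben)    # 75.0
print(mean_css[True])  # 13.0
print(pct_female_ben)
```

The same pattern (filter or group by beneficiary status, then aggregate) extends to the other indicators in the plan.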
Annex 2: Sample Action Sheet/Tracking system

Action 1
- Monitoring findings: The monitoring report shows that in 50% of sites monitored there was a high level of food spillage leading to losses.
- Recommendation/Action required: Provide additional tarpaulins to the sites, from the WFP or CP warehouse.
- Action level: Field level
- Responsible person/Unit: WFP head of field office
- By when action should be taken: Before the next distribution
- Current status: Done
- Update on actions taken: All FDPs now have additional materials to prevent spillage.

Action 2
- Monitoring findings: The outcome monitoring report shows that the MAM recovery rate in the North was very low, coupled with a low food consumption score, due to sharing of supplementary food rations with the rest of the household members.
- Recommendation/Action required: Since there is GFD in that region, every household that has malnourished women/children should be included in GFD registers and receive a household ration.
- Action level: Country office
- Responsible person/Unit: Programme manager and partner NGO project coordinator
- By when action should be taken: Next beneficiary registration cycle for GFD
- Current status: On-going
- Update on actions taken: Guidelines on identifying households with women or children on SMP during GFD targeting have been sent to the field.

Action 3
- Monitoring findings: Outcome monitoring, which includes market prices and household expenditure, shows that the markets in the Eastern regions are well integrated, prices are stable, and most food commodities are available.
- Recommendation/Action required: Review the CSP design to provide unconditional cash transfers during the lean season instead of GFD.
- Action level: Country office
- Responsible person/Unit: Head of programme
- By when action should be taken: Next budget revision of the ongoing CSP and re-design of the successor programme
- Current status: Pending
- Update on actions taken: (none)
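A tracking sheet like the Annex 2 sample lends itself to a simple data structure. The sketch below is purely illustrative and not a WFP tool; each row becomes a record (field names are assumptions), and a small helper lists the actions that are not yet closed:

```python
# Hypothetical sketch of the Annex 2 tracking sheet as a list of records.
actions = [
    {"id": 1, "level": "Field", "status": "Done",
     "action": "Provide additional tarpaulins to the sites"},
    {"id": 2, "level": "Country office", "status": "On-going",
     "action": "Include households with malnourished members in GFD registers"},
    {"id": 3, "level": "Country office", "status": "Pending",
     "action": "Review CSP design to provide unconditional cash transfers"},
]

def open_actions(records, level=None):
    """Return actions whose status is not 'Done', optionally filtered by level."""
    return [r for r in records
            if r["status"] != "Done" and (level is None or r["level"] == level)]

print([r["id"] for r in open_actions(actions)])                    # [2, 3]
print([r["id"] for r in open_actions(actions, "Country office")])  # [2, 3]
```

Keeping the sheet in a structured form like this makes it easy to generate the "current status" view for the regular review meetings described in Step 7.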
Annex 3: SOP Checklist
This Checklist is intended to help assess the readiness of a country office to take up the SOPs or, in country offices where they are already in place, to assess
how well they are being applied. The Checklist can also be used by regional bureaux to identify capacity/technical support needs at country level. Compliance
with a particular SOP can be graded as High, Medium, Low or Very Low by adding up the scores for each question. Compliance with a SOP should be rated HIGH
if all questions are scored 3 and should be graded VERY LOW if all questions are scored zero. A zero rating would imply that the procedure is non-existent.
The maximum score is 129, which would imply that all procedures are in place, with documentary evidence.
CHECK LIST FOR ASSESSING IMPLEMENTATION OF THE M&E SOPS AT COUNTRY LEVEL
1.1 Does every CSP document have a logframe that is well aligned to the CRF, and does it use the correct template?
1.3 Does the CSP have a performance monitoring narrative explaining how monitoring will be carried out?
1.4 Does the performance monitoring section clearly outline the process monitoring plan as part of DM and PDM?
1.5 Does the performance monitoring narrative clearly outline how baseline data is/will be collected?
1.6 Are the risks and assumptions logical and do they correspond to each results hierarchy properly?
3.2 Does the MRE Plan include ALL CSP logframe indicators (outcomes, outputs, cross-cutting)?
3.3 Does the plan have baseline values or footnote indicating when they will be established?
3.4 For each indicator, is the data source/process indicated in the plan?
3.5 Is it clear in the plan who will collect data for each indicator?
3.6 Is the frequency for data collection stated for each indicator?
3.7 Does the plan indicate the report in which data for each indicator will be reported?
3.8 Does the plan indicate when and through which channels the reports will be disseminated?
4 SOP 4: A draft M&E budget is prepared reflecting costs of the MRE Plan
4.2 Does the draft budget include all required staff and activity costs?
5 SOP 5: A Monitoring toolkit is developed covering the processes listed in the MRE Plan
5.2 Does it explain the methodology (sampling, sample size) for each indicator?
5.3 Does it include a calendar showing when each monitoring process will be carried out?
5.5 Does the M&E toolkit include monitoring tools for each process, with corresponding guidelines?
6.1 Does the M&E toolkit clarify the type of database system/tools that will be used for storage/analysis?
7.1 Is there a list of sites to be monitored set out within an activity plan?
8 SOP 8: Primary data is collected and secondary data synthesised as per the Monitoring Plan
8.2 Is monitoring (including secondary data) being carried out as per the periodicity in the implementation plan?
9.1 Are there baseline values in COMET for all indicators included in logframes?
10.1 Is all data that is collected during monitoring entered into a database?
10.2 Is there a process of validating the data against expected data as per sampling?
11 SOP 11: Data is analysed as per the Analysis Plan and information products prepared.
11.2 Is the preparation of information products within the timelines stated in the implementation plan?
11.3 Is there a process in place for drafts of information products to be reviewed by relevant staff before completion?
11.4 Are final information products circulated to the audience as per plans?
12 SOP 12: An escalation system is established to ensure required actions are taken at the appropriate place in a timely manner
12.1 Are there Action Sheets or Unit Plans that summarise monitoring recommendations (see format in Annex 2)?
12.2 Is there a tracking tool to compare planned versus actual data, including monitoring days, field visits, data sets?
12.3 Are the Action Sheets or Unit Plans up to date, with latest status for each recommendation action?
13 SOP 13: Regular meetings are conducted between M&E, programme and technical staff to review monitoring findings and implement actions
13.1 Are there scheduled meetings during which monitoring findings are discussed and actions agreed on?
14 SOP 14: Monitoring data is consolidated for evaluation and review purposes and made available to evaluators
14.1 Is past monitoring data available in a format that facilitates trend analysis?
14.2 Was monitoring data made available for all indicators during evaluations/ reviews (where applicable)?
14.3 Is there a process in place to track implementation of evaluation and/or review recommendations?
15 SOP 15: Findings of evaluations and reviews are used to inform the design and re-design of CSPs and monitoring processes.
15.1 Have Budget Revisions or CSP re-designs undertaken after an evaluation taken account of its recommendations?
Total Score
High 100 to 129
Medium 71 to 99
Low 41 to 70
Very Low 0 to 40
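The scoring logic above can be expressed compactly. The sketch below assumes one score (0 to 3) per checklist question; the Medium, Low and Very Low bands are taken from the table, while the 100-to-129 High band follows from the stated maximum of 129 and the rule that all questions scored 3 rates HIGH:

```python
# Map a list of per-question scores (0-3 each) to a compliance grade,
# using the Annex 3 bands (High band inferred from the stated maximum of 129).
def compliance_grade(scores):
    total = sum(scores)
    if total >= 100:
        return "High"
    if total >= 71:
        return "Medium"
    if total >= 41:
        return "Low"
    return "Very Low"

print(compliance_grade([3] * 43))  # 43 questions all scored 3 -> 129 -> "High"
print(compliance_grade([1] * 43))  # total 43 -> "Low"
```

Note that the maximum of 129 implies 43 scored questions at 3 points each, which a country office can use to sanity-check its completed checklist.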
Annex 4: Generic Template to Apply the SOPs at Country Level
(The template also includes Responsibility, Timeline and Remarks columns, left blank for the country office to complete.)

Step 1: Prepare or revise performance monitoring narratives and logical frameworks
SOP 1: CSP logframes and monitoring narratives are prepared
- Draft monitoring narrative
- Draft logframe
- Validate logframe
SOP 2: Completed logframes are entered in COMET
- Approve logframe
- Enter logframe in COMET

Step 2: Prepare Monitoring, Review and Evaluation (MRE) Plan and draft M&E Budget
SOP 3: A MRE Plan is developed
- Draft MRE Plan
- Validate MRE Plan
- Approve MRE Plan
SOP 4: A draft M&E budget is prepared reflecting costs of the MRE Plan
- Draft M&E Budget

Step 3: Develop a monitoring toolkit (methodologies, tools, analysis plan, implementation plan)
SOP 5: A Monitoring toolkit is developed covering the processes listed in the MRE Plan
- Prepare implementation plan
- Design monitoring tools
- Prepare Analysis Plan
- Finalise M&E budget
- Approve M&E budget
SOP 6: Databases to store and process monitoring data are developed
- Develop/adopt databases
SOP 7: A list of sites to be monitored is prepared within a monthly activity plan
- Prepare list of activity sites
- Update M&E system with activity site lists

Step 4: Collect primary data and collate secondary data
SOP 8: Primary data is collected, and secondary data synthesized as per the MRE Plan
- Prepare data collection kits
- Prepare monthly monitoring plan
- Approve monthly monitoring plan
- Train data collection staff
- Coordinate data collection
- Collect data
- Receive secondary data
- Update monitoring coverage
SOP 9: Baselines are entered or updated in COMET
- Enter baseline values in COMET

Step 5: Capture, compile and validate data
SOP 10: Monitoring data is entered in databases, compiled and validated
- Enter data in databases
- Supervise data entry
- Compile and validate data
- Enter data values in COMET
- Receive validated data set

Step 6: Analyse data and prepare information products
SOP 11: Data is analysed as per the Analysis Plan and information products prepared
- Submit datasets for analysis
- Review Analysis Plan
- Analyse data per Analysis Plan
- Enter data values in COMET
- Prepare information products
- Review and comment on products
- Finalise information products

Step 7: Make use of monitoring findings: Take action and document lessons
SOP 12: An escalation system is established to ensure required actions are taken at the appropriate place in a timely manner
- Take immediate actions
- Convene review meetings
SOP 13: Regular meetings are conducted between M&E, programme and technical staff to review monitoring findings and implement actions
- Prepare APP Unit Plan or Action Sheets
- Ensure actions are taken
- Update APP Unit Plan or Action Sheets

Step 8: Conduct evaluations or reviews
SOP 14: Monitoring data is consolidated for evaluation and review purposes and made available to evaluators
- Consolidate monitoring data for evaluative purposes
- Oversee the conduct of centralised evaluations
- Oversee the conduct of decentralised evaluations
- Oversee the conduct of reviews
SOP 15: Findings of evaluations and reviews are used to inform the design and re-design of CSPs and monitoring processes
- Ensure use is made of evaluation recommendations
- Update implementation status of recommendations
SOP acronyms
DM Distribution monitoring
SO Sub-office
For more information contact:
World Food Programme
Performance Management and Monitoring Division, Monitoring Branch (RMPM)
Rome, Italy

Version 1: October 2017
(Supersedes June 2014 version)