
VFM SYSTEMS MATRIX

This tool is part of the VFM Guide created by People in Need. The VFM Systems Matrix can be used at the proposal stage as a set of items to be considered during identification and planning, or during project implementation as a self-assessment tool or a tool used by a project evaluator. It requires a review of the project documents and internal procedures, as well as discussions with programme and support staff. The Matrix can be used jointly with the step-by-step guide so that a holistic VFM analysis can be conducted.

This qualitative tool presents project management teams and evaluators with a method for assessing Value for Money using four categories: Economy, Efficiency, Effectiveness and Equity. The analysis of each "E" is divided into sub-categories covering the different stages of a project:
− Identification and Planning
− Implementation and Monitoring
− Evaluation and Learning
Each sub-category includes several standards to be assessed. The standards and their descriptions should be adjusted to fit the implementing organisation's systems.

A score is given to each standard and calculated for each of the 4 E's:
1 = No / Not implemented
2 = Partly / Partly implemented
3 = Yes / Fully implemented
n/a = not applicable (or delete the irrelevant standard completely)
Allocating the score should always be explained in the Rating justification column.
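Where the scoring is automated in a small script rather than the worksheet itself, the per-"E" calculation could be sketched as follows. Note the averaging formula and the use of `None` for n/a are assumptions for illustration; the Matrix only says a score "is calculated" for each E without fixing the formula:

```python
def e_score(ratings):
    """Average the 1-3 ratings given to the standards of one "E" category.

    Standards rated n/a are passed as None and excluded from the average,
    mirroring the guidance to skip or delete irrelevant standards.
    """
    scored = [r for r in ratings if r is not None]
    if not scored:
        raise ValueError("no applicable standards were rated")
    return round(sum(scored) / len(scored), 2)

# Three standards rated 1, 2 and 3, plus one marked n/a:
print(e_score([1, 2, None, 3]))  # 2.0
```

With ratings of 1, 2 and 3 this yields 2.00, matching the overall Economy rating shown in the example worksheet below.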

A narrative summary of the findings can be prepared for each "E" (see the bottom of each "E" worksheet).
Assessment identification

Implementing organisation:
Project title:
Country:
Donor:
Assessment done by: (Name, Position)
Staff interviewed during assessment: (Name, Position)
Date of assessment:
Stage of project: project planning (proposal) / implementation stage / end of project
No. of VFM assessments in the project:
ECONOMY:
Are we or our agents buying inputs of the appropriate quality at the right price? (Inputs are things such as staff, consultants, raw materials and capital that are used to produce outputs)

Assessed Standard

IDENTIFICATION AND PLANNING
- Budget planning: Resource allocation is linked to previous performance data in a similar context.
- M&E: The costs of data collection/analysis/validation are appropriate (and in proportion) to the intervention and the ultimate benefits of the data.

IMPLEMENTATION AND MONITORING
- Budget monitoring: Programme cost-drivers (as identified using the Economy Step-by-Step Guide) are assessed regularly (at least annually) and any changes or significant differences from benchmarks/budget are explained.
- Budget monitoring: The project team, together with senior management, reviews budget planning and spending on at least a monthly basis.
- Staff: Salary grids and per diem rates are reviewed annually against other international and national organisations to ensure they are competitive and do not distort the sector.
- Procurement: Procurement guidelines for selecting suppliers are followed.
- Procurement: Regular market reviews are conducted to maintain accurate knowledge of supply prices.
- Procurement: Soft intelligence on suppliers' performance and reputation is kept.
- Inventory: A system and staff are in place to manage inventory.
- Transport: Mechanisms are in place to monitor and control fuel expenditure.
- Flight tickets: Tickets are bought through a travel agent that can provide lower prices; all tickets are economy class.
- Office rent: Where possible, the office is shared with other projects to reduce overall overhead costs.

Overall summary: Economy



Description of standard (where provided)
- M&E: M&E costs are usually between 1-3% of the overall project budget, but project specifics need to be considered (donor rules/allowances, an innovative component in the project requiring higher costs, specific M&E methods to be used, etc.).
- Transport: Vehicle logbooks that record distances driven and fuel used (including receipts) are checked on a monthly basis against the expected fuel usage of the vehicle.
- Office rent: 1 = offices can be shared but are not; 2 = offices are shared in some implementation areas and not in others, even if it would be possible; 3 = offices are shared OR sharing was considered but not found possible.

Sources for rating (example ratings shown where given)
- Budget planning: Project budget justification; review of spending on similar costs in a similar context. (Rating: 1)
- M&E: Comparison of the resources dedicated to M&E activities with the project's demand for data to be collected and analysed. (Rating: 2)
- Budget monitoring (cost-drivers): Discussion with logistics/procurement/programme staff; check when the latest review of market prices was conducted. (Rating: 3)
- Budget monitoring (monthly review): Monthly budget spending plan documents.
- Staff: Discussion with the HR Manager or other responsible staff.
- Procurement (guidelines): Randomly select a procurement and check the documentation against the procurement guidelines.
- Procurement (market reviews): Discussion with logistics/procurement staff; check when the latest review of market prices was conducted.
- Procurement (soft intelligence): Logbook of previous experience, blacklist; check when the latest update of the logbook/blacklist was conducted.
- Inventory: Discussion with logistics/procurement staff; review of the relevant job descriptions.
- Transport: Discussion with logistics/procurement staff; review of the existing mechanisms.
- Flight tickets: Financial records of flight tickets bought.
- Office rent: Visit to the project office; discussion with the project team.

Overall rating: Economy = 2.00

Rating justification
EFFICIENCY:
How well do we or our partners convert inputs into outputs? (We or our partners exercise strong control over the quality and quantity of outputs)

Assessed Standard

IDENTIFICATION AND PLANNING
1. Is the ratio of programmatic/direct to organisational/indirect costs at least 60/40, with 7% for HQ costs?
2. Have different alternatives for delivering the project, and their respective benefits and costs, been considered?

IMPLEMENTATION AND MONITORING
3. Do we have a meaningful M&E Plan/Results Framework?
4. Do we have clear and realistic milestones and targets, building on a baseline?
5. Are we systematically monitoring the quality of implemented activities?
6. Are we systematically monitoring progress against the set outputs and targets?
7. Are the monitoring data collected according to the M&E Plan/Results Framework?
8. Are we monitoring output cost ratios?
9. Are we analysing the costs of results (outputs)?
10. Where results and costs vary from expectations, are we able to explain why?

EVALUATION AND LEARNING
11. Are we systematically capturing and reflecting the learning from monitoring in implementation?

Overall summary: Efficiency
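The standard-1 ratio check above is simple arithmetic and could be sketched as follows. This is illustrative, not part of the Matrix; the function name, argument layout and the decision to count HQ spend inside the indirect total are assumptions to adjust to your organisation's cost definitions:

```python
def meets_cost_ratio(direct, indirect, hq, min_direct_share=0.60, max_hq_share=0.07):
    """Check whether programmatic/direct vs organisational/indirect costs
    meet the 60/40 split, with at most 7% of total costs going to HQ.

    All three arguments are spend amounts in the same currency; HQ spend
    is assumed to be part of the indirect total (an assumption).
    """
    total = direct + indirect
    return direct / total >= min_direct_share and hq / total <= max_hq_share

# 70 direct and 30 indirect (of which 5 is HQ) out of 100 total:
print(meets_cost_ratio(direct=70, indirect=30, hq=5))  # True
```

A 50/50 split, or HQ costs above 7% of the total, would fail the check.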



Description (where provided)
- Defined at the beginning of the intervention, including SMART project indicators, strategic/global indicators and efficiency indicators; providing a clear description of the indicators, a clear link between data and indicators, and appropriate data collection methods used on an appropriate sample.
- Following the quality standards for delivering activities in the given sector.
- Results comparison against the baseline and milestones.
- Data collection comparison against the M&E Plan/Results Framework (timeliness and regularity, using the planned data collection methods and following the sampling strategy).
- Looking at budget spending in connection with project products, outputs, etc.
- Efficiency indicators, e.g. cost per unit result.
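A cost-per-unit-result indicator of the kind mentioned above reduces to a single division; the function and the example figures below are illustrative, not taken from the Matrix:

```python
def cost_per_unit(total_cost, units_delivered):
    """Efficiency indicator: cost per unit result (e.g. cost per person trained)."""
    if units_delivered <= 0:
        raise ValueError("units_delivered must be positive")
    return total_cost / units_delivered

# Hypothetical figures: 12,000 spent to train 40 people:
print(cost_per_unit(12_000, 40))  # 300.0
```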



Sources for rating (example ratings shown where given)
1. Project budget; review of actual costs. (Rating: 1)
2. Project proposal; context assessment; discussion with the PM/HoP/CD. (Rating: 2)
3. M&E Plan/Results Framework; data collection/research methods and instruments; sampling strategy.
4. Logical Framework; M&E Plan/Results Framework; Indicator Tracking Table.
5. Quality standard checklists, activity monitoring sheets, observation protocols, etc.
6. Monitoring reports; Indicator Tracking Table; project team interviews.
7. M&E Plan/Results Framework; developed monitoring tools; Indicator Tracking Table; project staff interviews.
8. Financial system records review; project manager and finance meetings; quarterly reports.
9. M&E Plan/Results Framework; Indicator Tracking Table; rationale for any changes or significant differences from benchmark/budget.
10. Interviews with the Project Manager.
11. Monitoring data, reports and recommendations; implementation plans; project reports.

Rating justification
EFFECTIVENESS:
How well are the outputs from an intervention achieving (or likely to achieve) the desired outcome? (Note that in contrast to outputs, we or our partners do not exercise direct control over outcomes)

Assessed Standard, Description, Sources for rating (example ratings shown where given)

IDENTIFICATION AND PLANNING
1. Do we have an understanding of what would happen without the intervention? Sources: initial assessment, discussion with the Project Manager. (Rating: 1)
2. Does our intervention fit with our (or our partners') comparative advantage / areas of expertise? Description: experience with the approaches in the proposed project. Sources: track record annex to the proposal, reference to PIN toolkits or PIN Quality Standards. (Rating: 2)
3. Is it clear who will benefit from the intervention and what value they put on the results? Sources: beneficiary selection criteria, needs assessment, stakeholder analysis. (Rating: 3)
4. Do we have a clear theory of change based on evidence? Sources: Theory of Change; best practice, learning and evidence documents.
5. Do we have clear and realistic objectives? Sources: problem/objective analysis, Logical Framework, project proposal, project workplan.
6. Have we considered how this intervention is going to be sustainable in the long term? Description: consideration and description of factors and approaches that create sustainability, e.g. a behavioural change approach, a feasibility study, a strategy for integration with local systems. Sources: project proposal, exit / phasing-out strategy.
7. Are we generating important learning through this intervention? Description: contribution to a gap in sector knowledge; a mechanism described for how the learning will be shared. Sources: M&E Plan/Results Framework, evaluation, research documents.
8. Are we managing risks in this intervention? Description: existence of a risk analysis and a risk management/mitigation strategy. Sources: project proposal.

IMPLEMENTATION AND MONITORING
9. Is there evidence that we are achieving what we set out to do? Description: outcome-level changes measured and recorded; comparison against the baseline and milestones. Sources: Indicator Tracking Table, beneficiary feedback, data management system review.
10. Are the collected data and analysis accessible to non-M&E staff? Description: programme staff demonstrate knowledge of where data and analysis can be accessed. Sources: discussion with non-M&E staff (project staff).
11. Are appropriate amendments to objectives made to take account of changing circumstances or findings? Description: if applicable, objectives and/or targets are modified based on the changing context and findings. Sources: evaluation reports; follow-up on evaluation/monitoring findings template; Theory of Change / logframe; communication with the donor; project reports.

EVALUATION AND LEARNING
12. Is there evidence that our theory of change worked in practice and that we reached our target objective indicators? Sources: final evaluation report; theory of change.
13. Are there clearly described lessons and learning available from implementation? Sources: evaluation report; evaluation fact sheet/presentation; discussion with project staff (what they know about learning from implementation/evaluation).
14. Are lessons and learning from the intervention being disseminated (or planned to be)? Description: results data made public in a form appropriate to the audience. Sources: M&E Plan, advocacy/communication plan, implementation workplan, information dissemination materials, website.
15. Is there a realistic exit strategy to ensure that the impact from the programme is long-term and sustainable? Sources: exit strategy.

Overall summary: Effectiveness


EQUITY
Refers to whether the project is addressing social or economic disparity.

Assessed Standard

IDENTIFICATION AND PLANNING
1. Accountability: Were all relevant stakeholders (the most marginalised groups, target groups, partners) involved in identifying the core problems and the project outcome to address them?
2. Accountability: Were the perspectives of all relevant stakeholders considered in the project design?
3. Did the project consider the needs of people living with disabilities?

IMPLEMENTATION AND MONITORING
4. Accountability: Is there a functional Complaint/Feedback Response Mechanism (CRM)?
5. Accountability: Is there a system of participatory monitoring/community outreach/accountability events in which the project target group can provide feedback and make decisions so that implementation is adapted accordingly?
6. Do non-project beneficiaries understand and agree with why others were selected?
7. Is data disaggregated according to sex, disability and other social differences?

EVALUATION AND LEARNING
8. Did the project contribute to increased equity in the communities (e.g. through pro-poor targeting, addressing the needs of people with disabilities, or otherwise)?
9. Was the project gender-sensitive?

Overall summary: Equity



Description and sources for rating
1. Description: participatory planning and decision-making. Sources: needs assessment, stakeholder analysis.
2. Description: participatory planning and decision-making. Sources: needs assessment, stakeholder analysis, project proposal.
3. Sources: assessment, project proposal, evaluation.
4. Description: project beneficiaries and target groups, the wider community and other stakeholders can give feedback and raise complaints confidentially; complaints are responded to in an appropriate manner. Sources: CRM, discussions with senior management, discussion with the CRM focal person, review of the CRM mechanism.
5. Sources: discussion with project staff, M&E staff and target groups.
6. Sources: discussion with non-target groups.
7. Sources: data collection methodologies and tools; databases.
8. Sources: analysis of data disaggregated by wealth; project evaluation.
9. Sources: Gender Marker.

Rating justification
