Results Based Management: Theory and Application

JPO Training Programme
October-November 2011
Marielza Oliveira
Operations Support Group/Executive Office
Key Message 1
We get what we focus on: to get high-level results (national impact), this is
what we must focus on.
Key Message 2
RBM is about having a clear articulation of
what we want to change and how to make
it happen.
The results we seek - DEFINITIONS
Impact: Change in the lives of people
Actual or intended changes in human development, as measured by people’s well-being.
Example: Reduced infant and maternal mortality by 2013.

Outcome: Institutional & Behavioural Change
Actual or intended changes in development conditions that an intervention(s) seeks to support. The contribution of several partners is usually required to achieve an outcome.
Example: Improved provision of public sanitary services to rural communities by 2013.
Atlas: award

Outputs: Operational Change / capacity development
Atlas: project ID
Tangible product or service of an intervention that is directly attributable to the initiative. Outputs relate to the completion (rather than the conduct) of activities and are the type of results over which managers have most influence.
Example: National Public Works Agency has the management systems, equipment, and skills to provide sanitation services to rural communities.
The results we seek – Sanitation example
Impact Level - signs that people’s lives have improved:
1. # of cases of water-borne diseases;
2. under-5 infant mortality rate in x region.
Outcome Level - signs of change in institutional capacity/provision of services
1. % of public satisfied with delivery of services
2. Amount of resources allocated to solid waste management
3. % of public solid waste collected by private sector contractors
4. # of new households being served
5. % on-time pick-up of solid waste
Output level – what needs to be produced for outcomes to be achieved?
New policy drafted to facilitate private sector participation in delivery of public
sanitation services; public education campaign implemented in rural communities;
vehicles and other critical equipment in place and training provided to staff.
THE RESULTS CHAIN
[Diagram: the chain runs from inputs and activities to outputs, outcomes and impacts; outputs, outcomes and impacts together are the RESULTS. Performance indicators, efficiency, effectiveness, planning, implementation and management influence are marked along the chain.]
From Inputs to Impacts: Theory of Change
• Certain resources/inputs are needed to operate your
program.
• IF you can gain access to them, THEN you can use them
to accomplish your planned activities.
• IF you accomplish your planned activities, THEN you will
deliver the amount of product or service that you
intended.
• IF you produce these outputs, THEN certain changes in
systems, behaviours, etc., take place.
• IF these benefits to participants are achieved, THEN
your stakeholders benefit from changes in their life
conditions.
Results at country level
National Development Plans → Impact: Reduced infant and maternal mortality.
UNDAF (UN Development Assistance Framework), CPD (Country Programme Document) and CPAP (Country Programme Action Plan) → Outcome: Improved access of rural population to basic health and sanitation services by 2013.
Project document and Annual Work Plan (AWP) → Output: Public Works Agency has the systems, equipment, and skills to provide sanitation services to rural communities.
Writing good Outcomes
Outcomes are actual or intended changes in development
conditions that interventions are seeking to support.
Some tips:
1. Avoid action verbs: “strengthening”, “enhancing”, etc.
2. Avoid statements of intention: “To assist the government…”
3. Use completed verbs: “…reduced”, “improved”, “have greater access to”, etc.
4. Must signal that something has changed for someone.
5. The something that has changed must be important to the country/region/community, not just to UNDP.
6. Avoid UN-speak, e.g. “gender mainstreamed”.
Outcome examples
1. Legal and regulatory framework reformed to provide
people with better access to information and
communication technologies.
2. The poor in x region have better access to capital and
other financial services.
3. Reduction in the level of domestic violence against
women by 2016
4. Increased regional and sub-regional trade
5. Higher and more sustainable employment and income
for urban slum dwellers.
Typical pitfalls
• Wordy (…and no change language)
To promote equitable economic development and democratic
governance in accordance with international norms by
strengthening national capacities at all levels and empowering
citizens and increasing their participation in decision-making
processes
• Too ambitious
Strengthened rule of law, equal access to justice and the
promotion of rights
• Containing multiple results
The state improves its delivery of services and its protection
of rights—with the involvement of civil society and in
compliance with its international commitments
• Beyond comprehension
The Poverty/environment nexus is enhanced.
Typical pitfalls
• Wishy-washy (i.e. “Support provided to improve…”)
Support to institutional capacity building for improved
governance
• So general, they could mean anything
To promote sustainable development and increase capacity at
municipal level
• Overlapping with National goals/ MDGs (impacts)
Substantially reduce the level of poverty and income inequality
in accordance with the MDGs and PRSP
• Mixing means and ends
Strengthen the protection of natural resources through the
creation of an enabling environment that promotes sound
resources management
Writing good Outputs
Tangible products and services, or improvements in skills and abilities. Outputs relate to the completion (rather than the conduct) of activities.
Some tips:
1. Must be clear what is being delivered
2. Must be achievable within the project period
3. Managers have a high degree of control
– If the result is mostly beyond the control of the programme or project, it cannot be an output
– Failure to deliver the outputs usually means failure of the project
Output examples
a. National electoral body has adequate personnel,
equipment and skills to administer free and fair national
and local level elections by 2012.
b. Study of environment-poverty linkages completed
c. Police forces and judiciary trained in understanding
gender violence
d. National, participatory forum convened to discuss draft
national anti-poverty strategy
e. National human development report produced
f. Revised electoral dispute resolution mechanism
established
g. Business processes reengineered
h. Compliance mechanisms established
Exercise: Review the UNDAF/CPD
Select 2 CPD outcomes.
Review the quality of the outcomes
Is it clear where the country should be at the end of the cycle?
Are CPD outcomes/resources the same as in UNDAF?
Is it clear who is supposed to benefit, and how?
How can they be improved?
Review the related outputs (use ERBM platform)
Are they worded as products, services or improvement in abilities?
Are they the product of completed activities?
Are they the right things to deliver, to make the desired change happen?
(Are they the right “set”? Are they under our control?)
How can they be improved?
INDICATORS
Are we making progress?
• Performance measurement is essential:
– During:
• To improve the programme
– After:
• To assess whether the programme is
successful in achieving results
• To learn
• To help communicate with stakeholders
• Reliable signals that inform about real changes
Measuring performance
Performance: actual delivery measured against the “promise to deliver”.
“Readings” on the same indicator at different points in time:
– Baseline: before
– Target: expected after
– Measurement at the end of each period: real after
Good indicators
Technical aspects
• Valid – measures what it is
intended to measure
• Reliable – consistent and
comparable over time
• Sensitive – able to detect
extent and direction of
change during the cycle
Practical aspects
• Specific – Identifies the nature
of the expected change, the
target groups, the target area,
etc
• Measurable – reliable and clear
measure of results
• Attainable – realistic given
resources likely to be available
• Relevant – helps us to correct
course during, and to learn after
• Time bound – data collection is
feasible and not overly
burdensome within timeframe
Performance measurement
Using indicators requires a systematic approach,
including a clear definition of how measurements will
be done:
• Information required: What is the nature of the information required for each indicator?
• Sources of information: From where can the information be obtained?
• Methods of collection: How will the data be collected?
• Frequency and responsibilities: Align data collection to reporting cycles. Who is accountable for measuring and analysing?
Performance measurement matrix
Columns: Result | Indicator | Data description | Data sources | Collection methods | Frequency | Responsibility
Quantitative indicators
Measures of quantity including statistical statements
Number: # people below extreme poverty line
Percentage: % government budget allocated to social
protection
Ratio: # doctors per 1,000 people
Terminology needs to be clear to ensure validity
and reliability of what is being counted
Qualitative indicators
Judgments and perceptions derived mostly from
objective analysis, e.g.:
– extent of commitment of Government to respect human
rights
– quality of sanitation services received by rural dwellers
– level of client satisfaction with public services
Terminology needs to be clear to ensure validity
and reliability of what is being measured.
Proxy indicators
• Not a direct measure of the stated result, but an indirect
reflection of the situation
• Used when more direct measures are not available due
to lack of information or complexity of situation
• Based on a sound assumption about behaviour of certain
phenomena relating to the stated result….
…. and therefore are context specific
• Can be quantitative or qualitative
– Stock market index as a gauge of the economy
– Electricity consumption as an indication of income
– Number of women in management positions as an
assessment of social justice
OUTCOME INDICATOR
Focus on what is critical to see happen:
– “Number of audit
recommendations implemented”
Versus
– “Number of critical audit recommendations implemented”
or
– “# of persons provided with information on HIV/AIDS…”
Versus
- “# of persons who say they have changed behaviour based
on information received on HIV/AIDS”
EXERCISE: INDICATOR CONSTRUCTION
Using the same 2 CPD outcomes as before
Review the quality of the outcome indicators
Are they good signals of the change we are trying to achieve?
Do they focus on what is critical? Are they disaggregated?
Are they SMART? Is it possible to collect data during the cycle?
Are the baselines and targets using the same unit of measurement?
Has monitoring taken place using them?
How can they be improved?
Review the related output indicators
Do they indicate completion?
Do they focus on what is critical?
Are they clear and measurable?
How can they be improved?
KEY MESSAGE 3
You must structure your
organization so that people
work effectively together to
achieve the results that
matter.
Results Chain through the life of the program
• Purpose of implementing performance
information system: to improve management
practices
• Reasons for implementing MfDR (managing for development results):
– enhancing institutional learning
– strengthening staff accountability
– improving decision-making
Results Chain through the life of the program
CLARIFY PROGRAMME THEORY
• Describe problem to solve
• Identify desired results (long- and short-term)
• List strategies to seek results
• Examine risks and assumptions
IMPLEMENT AND MONITOR YOUR PROGRAMME
• Describe impact you expect to have in the community, in 10 years’ time
• Identify desired results
• Identify outputs you expect to produce/deliver to achieve results
• Describe process for checking if outputs are being delivered and progress is being made
DETERMINE AND COMMUNICATE RESULTS
• Identify key audience for each component of the programme
• Identify the questions they may have about your programme
• Describe the information you need to collect and analyse to indicate programme achievements and lessons
• Structure the information to the audience’s needs
Challenge: To KNOW whether results are being achieved or not
[2×2 matrix: monitoring quality (low/high) against progress toward results (low/high)]
• Low monitoring quality, low progress: vicious circle – low performance, and we do not know what’s going on; opinion-based decisions.
• High monitoring quality, low progress: monitoring information will show the important elements – the big picture, challenges/issues, opportunities.
• Low monitoring quality, high progress: the small picture is good, but M&E information does not bring out big-picture results; we do not really know what’s going on and are unable to see the overall change.
• High monitoring quality, high progress: virtuous circle – achieving results transparently and knowing what’s going on, big picture and small picture; evidence-based, quality decisions.
EVALUATION FINDINGS
“The programme addresses clear and critical national
priorities. However, the programme has spread too thin
into many activities within 11 outcomes that have not been
adequately specified nor do they present indicators for
measuring progress. The thematic areas are neither linked
nor do they reinforce each other. Furthermore, the thrust is
on outputs oriented to the short-run or middle-run results
without a long run view. With this approach, the Agency
neglected the opportunity to assume a position to engage
in advocacy and policy dialogue on critical issues for the
transformation of society and the process of
democratization in xxx country.”
Communicating results - Basic Principles
1. Accuracy and verifiability
2. Coherence and clarity
3. Relevance and value
4. Remembering the audience
ROAR 10-Point Checklist
1. Does the report capture the main development changes in the country?
2. Does the report highlight UNDP’s main contributions to these changes?
3. Does the report have a strong focus on high level change rather than
outputs?
4. Could the report be read and understood by someone from outside the
country?
5. Could the report be read and understood by someone from outside UNDP?
(excluding the management section)
6. Does the report present a compelling picture of UNDP’s development work?
7. Does the report include any significant advocacy, communications or
partnership building work conducted successfully during the year?
8. Does the report read as a single coherent effort rather than a series of
separate pieces stitched together?
9. Can my office verify the statements made in the report?
10. Has the report been discussed at different points in its formulation
(particularly start and end)?
Some examples of pitfalls (1)
Focusing on programming processes instead of the progress of the
programme.
EXAMPLE: “A number of quality annual work plans based on broad stakeholder consultation and buy-in were finalised, approved (through LPAC meetings) and signed in 2008. This followed an extensive consultation for the development of the new programming system based on the United Nations Development Framework (UNDAF), Country Programme Document (CPD) and the Country Programme Action Plan (CPAP). More important is the fact the process of developing the AWPs was led and owned by the national institutions. And over 20 of UNDP's implementing partners were micro-assessed in line with the harmonised cash transfers methodology (HACT).”
Some examples of pitfalls (2)
Listing outputs and activities, often cut and pasted from elsewhere.
EXAMPLE: “2008 was the first year of the CPAP 2008 - 2012 implementation.
Activities were therefore implemented and results achieved in each
of the key areas below:
1. democratic governance and human rights
a. draft of the law on prevention and fight against corruption and related
practices technically validated
b. draft of the presidential decree reorganizing the National Anti corruption
Commission technically validated
c. advocacy in favour of the implementation of Election [X]
d. study of the electoral code
e. teachers manual of Human Rights elaborated
f. national plan of action for human rights promotion elaborated
g. general population sensitized on human rights issues
h. 40 focal points trained on Human Rights Based Approach”
Some examples of pitfalls (3)
Focusing on inputs and delivery:
• EXAMPLE: “Programme and financial targets have all been
met, although several large initiatives have been unduly delayed due to
the financial crisis, government budgetary reductions and MDGF delays.
Overall this concerns some US$8,5M of projects which should have
started in 2008. In spite of this all delivery and efficiency
indicators have improved compared to 2007, with other
programmes picking up the slack. Financial sustainability of the CO has
also improved and gov-t co-financing has surpassed US$4M with an
additional US$11M in parallel financing attracted.”
Some examples of pitfalls (4)
Focusing on low level results:
EXAMPLE: “60 rural women in select pilot areas have enhanced their
opportunities in income generation through business development training,
networking and distribution of grants in that way testing approaches and
creating enabling environment for the introduction of micro-financing
opportunities. The capacity needs assessment and follow up strategy has been
developed to continue promotion of self-employment and business opportunities
for rural women.”
Exercise: Report
Using one of the 2 CPD outcomes used before…
Reporting/Communicating on results (using ERBM platform,
UNDP Annual Report of the Administrator, and website)
Have the relevant outputs been monitored?
Do we know how close we are to achieving outcome?
What have we reported to HQ on these outcomes?
What has UNDP reported to the EB?
What, if anything, have we communicated to stakeholders?
Communicating for Results Toolkit
http://comtoolkit.undp.org/
Intranet Home → Toolkits → Communications Toolkit (3rd header on the right-hand side)
External access available to interested partners
Communicating for Results Toolkit
Each of the Five Sections: Core Concepts – Tools – Best Practices
YouTube
UNDP FlipCam Video Project
UNDP Success Stories Website
We have the tool; the challenge now is to populate it.
UNDG
Strategic Communication Resources
SUMMARY
1. Understand your office’s strengths, weaknesses
and comparative advantages.
2. Prioritise. Look for opportunities to make impact.
3. You get what you focus on, so focus on what you really want.
4. Be clear on what you want to achieve, and know how to measure whether you are achieving it.
5. Think “big picture”, “strategy”, “results” – not just activities, outputs, outcomes.
6. Communicate results in simple, meaningful ways.
Plenary: Back to the Office
What can I do differently when I go back?
In Planning?
In Monitoring?
In Reporting/Communicating on results?