
Proposal and implementation plan for a
Government-wide
Monitoring and Evaluation System
SEPTEMBER 2005
Prepared by the Governance and Administration
Chief Directorate
Presidency
Fifth draft for comment
FOREWORD
A Proposal and Implementation Plan for Government-Wide Monitoring and Evaluation: A publication for programme managers is the first comprehensive monitoring and evaluation guideline initiative after Cabinet of the Republic of South Africa approved, in 2004, a process to plan a monitoring and evaluation (M&E) system for use across government.
The purpose of this publication is to contribute to improved governance and enhance the effectiveness of public sector organisations and institutions in South Africa. It was written to support improvements in the collection and collation, analysis and dissemination, and application of information on the progress and impact of programmes, in order to ensure transparency and accountability, promote service delivery improvement and compliance with statutory and other requirements, and foster a learning culture in the public sector.
Specifically, the proposal aims to: focus on the essential elements of results-oriented monitoring and evaluation that respond to the requirements of government's programmes, policies and projects for decision-making, accountability and learning; strengthen the role of the monitoring function within the three spheres of government; present a more integrated approach to monitoring and evaluation functions in government; introduce simplified, streamlined and harmonised procedures in line with government priorities; provide a results-oriented framework for monitoring and evaluation combined with practical guidance for the development of selected instruments; offer guidance on the assessment of results within a clearly defined framework or context of government's programme of action and its priorities; and give greater attention to monitoring than in the past, to stress that both monitoring and evaluation are important management functions aimed at ensuring the quality of interventions and supporting decision-making, accountability, learning and capacity development.
This publication is therefore intended for various levels of management within government: senior management and technical, programme and project managers. It is divided into two parts: part one presents the conceptual and operational framework for monitoring and evaluation and the implementation plan; part two offers indicators for monitoring and evaluating programmes and projects at a macro level.
The process of developing this proposal and implementation plan, as well as the development indicators, coordinated by the Presidency, has been a collaborative effort by many people, and we would like to acknowledge their contributions.
Constructive insights from colleagues in Policy Coordination and Advisory Services (PCAS) were especially helpful throughout the process. Our thanks also go to the Office of the DG: DPSA, the M&E Task Team, and GCIS for rigorous editorial and publication design, as well as Stats SA for refining and reviewing the indicators in detail and providing useful comments and suggestions.
I trust that this publication will be a useful tool to those of you responsible for the management of monitoring actions and the effective conduct of evaluation exercises.
Rev F Chikane
Director-General
The Presidency
June 2005
Table of Contents
1. Introduction to the Government-Wide Monitoring and Evaluation System
   • Definition of monitoring and evaluation
   • Background and rationale
   • Lessons from international experience
   • The quality of existing government monitoring and evaluation systems
   • Other national information systems
2. System users and their needs
   • Meeting the needs of diverse users
   • The Presidency and Premiers' Offices
   • Centre of Government
   • Decision makers in public service organizations and local authorities
   • Oversight bodies
   • The Public
3. System aims, objectives and intended results
   • System aims and objectives
   • Results to be achieved
   • Programme logic and underlying assumptions
   • Indicators of system success
4. Performance indicators and sources of information
   • Overview of the system's approach to indicators
   • A National Indicator Initiative
   • Departmental indicators
   • Transversal indicators
   • Government Programme of Action
   • Links to other sources
5. System reports and their utilization
   • Issues to be addressed in system reports
   • Composite indicators: a Government performance dashboard
   • Qualitative and impact studies
   • Responses to information provided by the system
6. Roles and responsibilities
   • System management and maintenance
   • Institutional responsibilities
   • Transversal responsibilities
   • Capacity building
Appendix A: Implementation Strategy and Plan
   • Strategic approach
   • Projects required
   • Implementation schedule
Appendix B:
   • Transversal systems
1. Introduction to the Government-Wide Monitoring and Evaluation System
• Definition of monitoring and evaluation
• Background and rationale
• Lessons from international experience
• The quality of existing government monitoring and evaluation systems
• Other national information systems
1.1 Definition of monitoring and evaluation

Monitoring is a continuing managerial function that aims to provide managers, decision makers and main stakeholders with regular feedback and early indications of progress or lack thereof in the achievement of intended results and the attainment of goals and objectives.

Monitoring involves reporting on actual performance against what was planned or expected according to pre-determined standards. Monitoring generally involves collecting and analysing data on implementation processes, strategies and results, and recommending corrective measures.

Evaluation is a time-bound exercise that systematically and objectively assesses the relevance, performance, challenges and successes of programmes and projects. Evaluation can also address outcomes or other development issues. Evaluation usually seeks to answer specific questions to guide decision-makers or programme managers and should advise whether underlying theories and assumptions were valid, what worked, what did not and why. Evaluation commonly aims to determine relevance, efficiency, effectiveness, impact and sustainability.
1.2 Background and rationale

This document proposes the development of a Government-wide monitoring and evaluation system (GWM&ES) that will deliver useful information and analysis and improve monitoring and evaluation (M&E) practices and capacity, contributing to better public management in South Africa.

Effective management is a continuous cycle that starts with planning and leads into implementation and evaluation (the PIE model). Implementation plans should be monitored and evaluated, producing knowledge and insights that are fed back into planning processes.

Public management in South Africa has improved significantly since democratisation. Each public service institution has been made responsible for its own accountability and effective management, leading to fundamental improvements in governance. The Public Finance Management Act and the implementation of the Medium Term Strategic and Expenditure Frameworks have made it necessary to define and align activities and spending around clearly defined objectives. These reforms have led to major improvements in planning and implementation, and encouraged a focus on service delivery quality and impact.

With decentralisation of accountability, line managers have become more responsible for non-core functions, such as human resource development and equity.

The key strategic challenge is to increase public service effectiveness, so that government achieves its desired outcomes and strategic objectives. This makes monitoring and evaluation critically important.

This proposal describes how a government-wide system should operate and what it should produce. The proposal specifically addresses:
• Who the users of the system should be and which of their needs it should meet
• The system's aims, objectives and intended results
• Sources of information and procedures for data collection
• How system reports should be presented and used
• Roles and responsibilities of various stakeholders and
• How the system should be implemented and rolled out.

It is likely that as implementation of the system progresses and users' needs become clearer, the system will come to look somewhat different from that described in this document. This is desirable and intended.
1.3 Lessons from international experience

As part of this project a rapid review of international experiences was undertaken. The review looked at a range of international experiences, from which it emerged that the development of a GWM&E system is an ambitious task best tackled incrementally over several years.

The clearest lessons in this regard can be found in the case of the United States, which passed the Government Performance and Results Act (GPRA) in 1993. GPRA addresses a broad range of concerns about government accountability and performance. Its goals were to focus on the actual results of government activity and services, support congressional oversight and decision-making, and improve the managerial and internal workings of agencies within the federal government.

While GPRA followed on the heels of a number of efforts throughout the past fifty years to improve the workings of the federal government, it is unique in its requirement that agency results be integrated into the budgetary decision-making process. GPRA can also be distinguished from prior reform attempts because it took place in a climate of increased political emphasis on downsizing and reinventing federal government, devolution of federal activities to states, and the privatisation of many federal government activities. Finally, unlike other reforms that were primarily Executive Branch initiatives, GPRA is statutory; its performance measurement requirements are law.

All agencies of the federal government, defined as cabinet departments and other concerns of the government, including independent agencies and government corporations, are bound by GPRA, with certain limited exclusions. Although passed in 1993, actual GPRA requirements began in 1997, and the first full cycle was in March 2000. GPRA requires agencies to prepare three key documents: strategic plans, performance plans and performance reports.

Other lessons from international practice include the need to adopt a realistic and practical approach. Australia, for example, has experienced difficulties in its efforts to implement a “Whole of Government” system and has made limited progress in implementing a system intended to support the development of “joined up government” operating as a seamless network.

Research also points to the need to build a government-wide culture of monitoring and evaluation and to strengthen the centre of government. A radical public service reformer, New Zealand has more recently acknowledged that it overstated decentralization and created a short-term outlook that overemphasised efficiency and undervalued longer-term outcomes. It has subsequently started processes to increase the capacity of central government to play an oversight role.

The international review shows very clearly that the concept of monitoring and evaluation is widely used globally and its importance and value increasingly accepted. Three factors affecting M&E can be identified:
• Government must be seen to take the initiative by creating appropriate policies and showing a willingness and capacity to control and guide implementation
• Infrastructure and financial and human capacities must be available and be deployed as required
• Public involvement improves the quality and impact of M&E and makes findings more widely accepted and useful.
1.4 Existing monitoring and evaluation systems

In order to ensure that this proposal took account of current practices and capacities, a rapid review of existing government M&E systems was undertaken.¹ The review involved circulating a questionnaire to all national departments, Premiers' Offices and provincial treasuries, 20 of which responded, providing useful insight into the current quality and extent of M&E practices.

The review findings indicate very clearly that M&E systems are generally very underdeveloped and inadequate, although the basic building blocks are usually present as a result of government's strategic planning and budgeting systems. The results also showed that monitoring and evaluation is widely seen as an important area of management that is generally acknowledged as strategically important and useful. There is a widespread preparedness to improve and enhance systems and practices, essential for long-term capacity and capability development. This willingness to improve is a major advantage that must be effectively used.
Since most government departments have not progressed very far in the development of their M&E systems, the GWM&E system enjoys the advantage of “latecoming”, learning from others' experiences and applying international best practices. It can also be developed without having to accommodate or cater extensively for other existing systems and processes. However, the GWM&E system needs to be closely structured around the existing planning framework and should be clearly integrated with it and complement it wherever possible.

Even though not always centrally located or ideally configured, most departments have some level of monitoring and evaluation capacity in place. This means that once the Government-wide system articulates its reporting requirements to departments, they will have human and other resources available, even if it will take time to get institutional and operational arrangements functioning optimally.

While information technology systems for M&E are often not installed or are not entirely satisfactory, they are in many instances in the process of being developed or improved. There is thus an exciting window of opportunity to contribute to these system development processes by defining very clearly what functionality is required by the government-wide system.
While M&E strategies are generally poorly stated, this is partly a consequence of a historical lack of guidance on the issue. Clearly defined terms and standards must be an integral part of the system so that departments are able to assess their own M&E products and outputs and make improvements as necessary. Public service organisations are now well placed to make use of practical guidelines and other forms of support to enhance and improve their M&E strategies and to ensure that they meet the required standards and achieve the intended results.

¹ A detailed and comprehensive audit of M&E systems is being undertaken by the Public Service Commission.
In summary, the government-wide system needs to be:
• Prescriptive and clear about what information should be submitted to it by departments and other public entities but
• Accommodating and flexible with regard to how information should be collected and who should be responsible for doing so.

A capacity building and best practice promotion component of the system is required so that those public service organisations that find it difficult to meet the prescribed standards are supported and assisted to do so. This could include a forum or forums to promote M&E practices and methodologies.

It may also be a useful strategy to provide some kind of assessment and accreditation service so that it is clear when the necessary standards are met and whether improvements are required.

Overall, it is important that the government-wide system makes its purpose absolutely clear to all participants and stakeholders. “Improving service by learning” needs to be the overarching theme of the system and the underlying processes for its implementation.
1.5 Other national information systems

Statistics South Africa is responsible for the implementation of a National Statistical System (NSS) that will serve as the conceptual and technical spine for assessing government's progress.

The NSS will ensure that South Africa can report on major policy objectives such as the Millennium Development Goals and will include the forging of agreements with government entities around the provision of accurate and reliable statistical information. It will also include the adoption of statistical standards and definitions. The system is still in its preliminary phases and will be implemented over a three-year period.
2. System users and their needs
• Meeting the needs of diverse users
• The Presidency and Premiers' Offices
• Centre of Government
• Decision makers in public organizations and local authorities
• Oversight bodies
• The Public
2.1 Meeting the needs of diverse users

The system will have a number of users, each with their own specific needs. The various users and their needs are suggested below.
2.2 The Presidency and Premiers' Offices

As the principals of national and provincial departments, the Presidency and Premiers' offices need reliable and accurate information on institutional progress and performance to guide them in developing strategies and programmes as well as in the allocation of resources, and to prompt interventions as required.

Systematic consultations will be undertaken with these users in order to ensure that their needs are properly understood and met by the system.

The GWM&E system should provide accurate and reliable information that allows these users to assess the impact achieved by departments and organisations and to encourage and promote policy revisions where necessary.
2.3 Centre of Government

The centre of government includes the Presidency, National Treasury and the Departments of Public Service and Administration (DPSA), Provincial and Local Government (DPLG) and the Public Service Commission. These bodies have an essential role to play in ensuring that human, financial and other resources are well used to achieve the greatest impact.

These departments need easy and ready access to non-financial progress reports as well as qualitative and quantitative information on the financial and non-financial performance of every institution falling within the scope of their mandate.

Each central department has a particular area of concern and needs information suitable for its own particular type of analysis to be accessible to it.

The GWM&E system needs to provide National Treasury with data that allows it to assess whether value for money is being provided, and DPSA with the data it needs to assess whether human resources are being well used, managed and developed. It also needs to provide DPLG with information that allows it to assess how well provinces and local authorities are performing in meeting their mandates. The Presidency needs to be provided with information on the performance of agencies in implementing the Programme of Action and the impact of long-term efforts to improve economic performance and alleviate poverty.
2.4 Decision makers in public organizations and local authorities

Decision makers in all government agencies, departments and local authorities need access to regular and reliable information that contributes to their own management processes by revealing which of their practices and strategies work well and which need to be changed or improved.

On an ongoing basis, every public body needs to assess and report on its own progress in terms of the performance indicators it has defined for each of its programmes.

This management information should be detailed and accurate and should be generated by regular, integrated management processes rather than through separate procedures.

Each institution also needs to periodically assess the extent to which it is achieving its strategic objectives and to evaluate the impact it is having in its own special areas of responsibility.

The GWM&E system needs to present regularly updated information on progress in implementing programmes (in terms of inputs, outputs and outcomes) and periodic information on impact and results. It should also provide useful guidelines on information management.
2.5 Oversight bodies

Our Constitution has created a number of oversight bodies, each with its own specific areas of concern. These are:
• The Public Protector (responsible for investigating improper conduct in public administration)
• The Human Rights Commission (responsible for promoting human rights)
• The Commission for the Promotion and Protection of the Rights of Cultural, Religious and Linguistic Communities (responsible for defending the rights of particular groups)
• The Commission for Gender Equality (responsible for promoting gender equality)
• The Auditor General (responsible for auditing and reporting on the accounts of public bodies)
• The Electoral Commission (responsible for free and fair elections) and
• The Public Service Commission (responsible for monitoring and evaluating the public service and promoting a high standard of professional ethics).

Each of these bodies needs access to information that allows it to review government compliance with various legislative and other requirements and to evaluate performance in terms of its own particular areas of concern.

The GWM&E system can assist these oversight bodies with supportive information on governance and administration matters for every public organisation.
2.6 The Public

Good governance requires that the public be encouraged to participate in governance and policy-making processes.

This can take a wide range of different forms, including commenting on policy proposals, participating in improvement initiatives and providing assessments through public opinion surveys.

In order to allow such participation, the GWM&E system needs to provide access to clear, accurate and well-presented updates on progress in government programmes and their impact, as well as indicating where more detailed information can be accessed.
3. System aims, objectives and intended results
• System aims and objectives
• Results to be achieved
• Programme logic and underlying assumptions
• Indicators of system success

3.1 System aims and objectives

The aim of the Government-Wide Monitoring and Evaluation system is to contribute to improved governance and enhance the effectiveness of public sector organisations and institutions.

The system objectives are the collection and collation, analysis and dissemination and the application of information on the progress and impact of programmes and initiatives in order to:
• Ensure transparency and accountability
• Promote service delivery improvement
• Ensure compliance with statutory and other requirements and
• Promote the emergence of a learning culture in the public sector.

3.2 Results to be achieved

The system will achieve the following results:
RESULT ONE: Accurate and reliable information on progress in the implementation of government and other public sector programmes is collected and updated on an ongoing basis.

RESULT TWO: Information on the outcomes and impact achieved by government and other public bodies is periodically collected and presented.

RESULT THREE: The quality of monitoring and evaluation practices in government and public bodies is continuously improved.
3.3 Programme logic and underlying assumptions

The system is based on the argument that by promoting certain practices and by collecting and providing information to system users, certain positive consequences will result.

This intended sequence of events is called a logic model and can be depicted as follows:
STANDARD SETTING AND CAPACITY BUILDING PHASE: M&E practices (norms and standards) are prescribed and capacity to comply is built.

INFORMATION COLLECTION PHASE: Information on implementation processes (outputs) and impact (outcomes) is gathered and reported upon.

REPORTING PHASE: Compliance to regulatory frameworks is measured. Learning by doing leads to best practice promotion and collaborative problem solving.

FOLLOW UP PHASE: Interventions are designed and implemented. Evidence-based decision making supports policy adjustments.

RESULTS ACHIEVED: Transparency and accountability is improved. Service delivery is improved.

OBJECTIVES ATTAINED: Improved governance. Enhanced public service effectiveness.
3.4 Indicators of system success

Performance area: M&E practices (norms and standards) are prescribed and adhered to.
Performance indicators:
• Comprehensiveness and rigour / quality of standards and their dissemination
• Extent of compliance to national M&E requirements by government entities

Performance area: Information on implementation processes (outputs) and impact (outcomes) is gathered and reported upon.
Performance indicators:
• Frequency and quality of reports produced by government entities and transversal systems

Performance area: Compliance to regulatory frameworks is measured.
Performance indicators:
• Number and quality of compliance reports
• Proportion of government on which compliance reporting is completed
• Implementation of recommendations from compliance reports

Performance area: Learning by doing leads to best practice promotion and collaborative problem solving.
Performance indicators:
• Number of practice improvements resulting from learning from the system

Performance area: Interventions are designed and implemented.
Performance indicators:
• Number and quality of support interventions and their results

Performance area: Evidence-based decision making supports policy adjustments.
Performance indicators:
• Number of policy revisions resulting from system reports

Performance areas: Transparency and accountability is improved; service delivery is improved; improved governance; enhanced public service effectiveness.
Performance indicators:
• Result- and objective-level indicators to be developed through the National Indicator Initiative
4. Performance indicators and sources of information
• Overview of the system's approach to indicators
• A National Indicator Initiative
• Departmental indicators
• Transversal indicators
• Government Programme of Action
• Links to other sources

4.1 Overview
The GWM&E system will be a secondary system that does not undertake primary research or data collection itself.

Rather, it will draw in information from a range of other sources and present this in user-friendly and accessible formats tailored to the needs of different users.

The system will make use of a web-based internet portal onto which institutions will be expected to post their information and reports. This information will be consolidated and formatted according to the various user specifications.
4.2 A National Indicator Initiative

The Presidency and Statistics South Africa are finalising a compendium of national indicators as part of the M&E system. The generic indicators (see Annexure I) have been identified, and further work on disaggregation is continuing.

A forum of departmental monitoring and evaluation officers will be established to facilitate debate and to promote a culture of measurement in government.

This Forum will coordinate the development and adoption of standardised programme-level indicators based on strategic plans and supporting programmes.
4.3 Departmental information

Each department or public sector organisation will be required to provide the following information by posting it on the GWM&E system:
• Clear strategic plans broken down into programmes, each with input, process, output and outcome indicators with quarterly targets. (These are essentially already in place as they are required by the Medium Term Expenditure Framework and will just have to be quality assured and improved in some instances.)
• Quarterly reports on the achievement of their targets, stated in terms of the performance indicators included in their strategic plans.
• Bi-monthly collation of information on the progress made in relation to the government's programme of action and reporting to Cabinet through the cluster system.
• Reports on impact studies for each of their programmes undertaken at least every five years, in line with the MTSF.
4.4 Transversal information

The following transversal systems will be implemented and their findings and recommendations posted on the GWM&E system:
• Value for money will be assessed by a system managed by National Treasury
• Human resource utilisation will be assessed by DPSA
• An early warning system will also be managed by DPSA, drawing on data from Persal and Vulindlela
• Public administration will be assessed by the Public Service Commission
• Constitutional rights will be assessed by the Department of Justice
• Service delivery quality will be assessed by DPLG's system for monitoring the performance of provinces and local governments.
4.5 Information on the Government Programme of Action

The existing web-based system for providing information on progress in implementing Government's programme of action will form part of the GWM&E system.
4.6 Links to other sources

The system will provide web links to other appropriate sources of information, including private and civil society sources.
4.7 Verifying information

The Presidency, together with the other coordinating departments, will verify information provided by government agencies to the GWM&E system. Existing arrangements for submitting and processing information will be retained.

The Auditor General may be required to participate in the verification of information provided by agencies. The precise role of the Auditor General's Office in this regard will be clarified when Cabinet approves proposed amendments to the Auditor General's mandate.
5. System reports and their utilization
• Issues to be addressed in system reports
• Composite indicators: A Government performance dashboard
• Qualitative and impact studies
• Responses to information provided by the system

5.1 Issues to be addressed in system reports

The system, located in The Presidency, will generate reports that address overall government performance, institutional performance and progress in implementing programmes. It will also receive impact evaluation reports and make these available to system users.

Further details on each of these categories are provided below:
Information on overall government performance:
• Provided by various government and non-government sources, primarily Stats SA: progress in terms of Key National Indicators (based on developmental indicators for South Africa 2005-2014)
• Provided by assigned lead departments: performance in implementation of the current government programme of action

Information on individual institutional performance:
• Provided by the Auditor General: quality of accounts
• Provided by the PSC: quality of public administration
• Provided by DPSA: human resource utilisation
• Provided by DPSA and National Treasury: financial and human resource utilisation
• Provided by the Department of Justice: compliance with constitutional rights
• Provided by National Treasury: value for money assessments

Information on progress in implementing programmes:
• Provided by the departments and organisations concerned: progress to plans measured by MTEF and non-financial programme performance indicators

Information on impact:
• Provided by the departments and organisations concerned: findings from periodic impact evaluations
5.2 The Key Government Indicator dashboard
As part of Phase Two of the system's implementation, a dashboard-style presentation of key data will be developed according to the needs of each user group. The dashboard will collate information from the various systems providing information to the GWM&E system and present it in formats that are useful to each specific user.

5.3 Qualitative and impact studies
Monitoring reports will need to be supplemented by periodic evaluations that assess the impact of government programmes and propose changes to policy and implementation strategies.
These evaluations will need to be specifically tailored to the needs of the programmes being evaluated, but will need to meet certain minimum standards. These standards would include issues such as the frequency with which evaluations should be undertaken, who should be involved, and what kinds of research, consultation and public participation should be undertaken.
Setting these minimum evaluation standards will be addressed through the M&E Norms and Standards Project, to be undertaken as part of the implementation of the GWM&E system.
5.4 Responses to system reports
When submitting their quarterly monitoring reports, each government agency will be required to commit itself to certain actions in response to the information it is providing. Responsibility for reviewing whether these commitments are kept still needs to be allocated; this will be handled both in the Presidency, through the Policy Unit, and through the cluster system.
Ideally, agencies' responses should be linked to their internal Knowledge Management Strategies, through which learning is institutionalised and good practices are promoted. Few government agencies have such strategies in place, however, and they will need to be encouraged to develop them. This will also be addressed in the proposed Norms and Standards Project.
The Department of Public Service and Administration has developed a Framework for Government Interventions. This Framework provides a process to be followed and defines responsibilities for implementing interventions to address institutional problems. The Framework will be implemented when system reports indicate that departments require assistance in addressing problems.
Besides triggering implementation of the Framework for Government Interventions, the system will provide its users with data for use in evidence-based decision-making in their own areas of responsibility. This will lead to improved decision-making, better long-term use of resources and an increased focus on institutions requiring attention and support.
6. Roles and responsibilities
• System management and maintenance
• Institutional responsibilities
• Transversal responsibilities
• Capacity building

6.1 System management and maintenance
The system will be managed and maintained by the Policy Coordination and Advisory Service in the Presidency.
This will entail:
1. Maintaining regular communication with all affected stakeholders and ensuring that each of them is fully aware of what is required of them.
2. Providing the information technology infrastructure for the submission of information and the creation of system reports.
3. Reviewing and assessing the frequency and quality of information provided by each government agency and by the transversal systems.
4. Alerting the relevant authorities (including political principals) when system information indicates that there are problems or matters requiring attention, for example by triggering implementation of the Framework for Government Interventions.
5. Developing and improving the system over time.
6.2 Departmental systems
Each department or public sector body will have to ensure that it is able to provide the necessary information as required, and should determine procedures and processes appropriate to its own operations in order to do so.
Guidelines for reporting will be developed and disseminated through the M&E Norms and Standards Project mentioned above.
6.3 Transversal systems
DPSA is to develop a system for assessing human resource and skills utilisation. DPLG will implement a system for assessing service delivery. The Department of Justice will implement a system for assessing the protection of constitutional rights. The Public Service Commission will continue to implement its system for assessing adherence to public administration principles. Each of these systems will need to have early warning mechanisms.
6.4 Capacity building
SAMDI will design and implement a strategy for building the capacity of all government agencies to undertake monitoring and evaluation.
Appendix A
Implementation Strategy and Plan for GWM&ES (2005-2007)
• Strategic approach
• Projects required
• Implementation schedule
• Financial considerations

1. Strategic approach
The GWM&ES is essentially a composite system that requires a number of supporting or contributory systems to be in place and fully operational before the overarching system can function.
These contributory systems include the following:
1.1 A National Indicator Initiative, coordinated by the Policy Unit in the Presidency as part of the National Statistical System, will be implemented with Statistics SA.
1.2 A national compendium of indicators has been developed, building on the Ten Year Review indicators developed by the Presidency, and will be further improved by the proposed M&E forum.
2. All public service entities need to undertake their own credible M&E processes that meet clearly defined standards and deliver information on their progress and performance. Statistics SA and the Presidency will provide the base document for standards and guidelines.
3. Transversal systems, including systems for assessing human resource and skills utilisation (DPSA), value for money (Treasury), service delivery standards at provincial and local levels (DPLG), protection of constitutional rights (Justice) and adherence to public administration principles (PSC). (Others may need to be added.)
These underlying systems will take time to design and implement fully (up to two years, until 2007) and it is thus proposed that a phased approach be adopted.
Three phases are proposed. Although each phase does not need to be complete before the next can start, significant progress will have to have been made. The suggested phases are as follows:
• Phase One will involve the creation and improvement of the transversal, departmental and statistical systems mentioned above, and the implementation of efforts to improve M&E capacity, by June 2006.
• Phase Two will involve detailed consultations with users to ensure their needs are properly understood, the design of report formats, and the development of an information technology architecture and platform that will receive data and format it into reports that meet users' needs, by December 2006.
• Phase Three will involve testing and piloting the system, evaluating and adjusting it, and then rolling it out to the rest of the public service, local authorities and state-owned enterprises, by July 2007.
2. Projects required
The following projects will have to be undertaken to deliver the required results for each phase:

Phase One: Setting the basis for implementing the reporting practices
1.1 The development of clear M&E requirements and standards to be met by all government institutions, linked to the requirements of the National Indicator Initiative. Activities will involve drafting an initial proposal, hosting a consultative conference, amending the proposal and releasing it as a Regulation under the Public Service Act, in terms of which all government bodies will have to comply with the standardised requirements. There will be a parallel process, along with or immediately after 1.1 above, to develop capacity in The Presidency as well as Premiers' and Mayors' Offices to assist in implementing the rest of the projects and to carry out their central M&E responsibilities.
1.2 The development of the transversal systems listed in Phase One above. Activities will include working closely with the responsible departments and supporting them in their system development processes.
1.3 Capacity building and accreditation. Activities will include drawing on the PSC's assessment project to determine what each department needs to do to bring its M&E systems up to standard, and assisting it to do so. Once the necessary remedial steps have been taken, a second assessment will have to be done in order to accredit departments as meeting the required standards.
Phase Two: Reporting formats and user needs identification
2.1 The development of reporting formats that meet users' needs. Activities will include initial consultations and the formulation of draft and final report formats.
2.2 The development of an IT architecture that provides a platform to receive and collate data into the necessary reporting formats. Alongside the normal activities required for IT developments, users and their IT functions will have to be consulted to ensure overall interoperability and integration. The current work on the Executive Information Management System (EIMS) and the POA will be integrated and form the basis for the IT architecture in this proposal.
Phase Three: Use of the information management system as a tool, building on the current POA and EIMS
3.1 An initial pilot project with a select few departments in order to test the system's functionality. Part of this project will be developing a detailed activities plan that will serve as a template for the later roll-out.
3.2 An independent evaluation of the system will be commissioned and adjustments made to the system according to the evaluators' findings and recommendations. Key activities will be developing clear terms of reference, selecting a reputable service provider and providing whatever assistance is required.
3.3 The gradual extension of the system to include all government bodies, starting with national departments, then provincial departments, local authorities and all SOEs. Activities for this project will be determined in project 3.1.
3. Implementation schedule
The milestones and completion dates for each project are as follows:

Project 1.1: Reporting norms and standards
• Complete initial draft: September 2005
• Hold conference and release final draft: December 2005
• Issue regulations: April 2006

Project 1.2: Integrating and improving transversal systems
• Conclude agreements with departments: December 2005
• Collect baseline data on developed indicators: January 2006
• Test systems: February 2006
• Amend and finalise systems: June 2006
• Roll out government-wide: July 2007

Project 1.3: Capacity building and accreditation
• Complete PSC project: December 2005
• Define capacity building interventions needed: January 2006
• Complete capacity building processes: June 2006
• Re-assess and accredit departments: December 2006

Project 2.1: Reporting formats
• Draft initial formats: January 2006
• Consult and amend: April 2006
• Finalise and disseminate formats: July 2006

Project 2.2: IT architecture building on and enhancing EIMS and POA platforms
• Define user specifications: January 2006
• Develop and test an initial proposal: March 2006
• Finalise architecture and platform: June 2006
• Implement: July 2006

Project 3.1: Pilot project for implementation
• Select pilot departments: July 2006
• Implement all systems: August 2006
• Collate data and prepare reports: September 2006
• Review results and make recommendations

Project 3.2: Roll out to the rest of government
• Gradual extension to all bodies of government: November 2007

Project 3.3: Evaluation appraisal
• Commission research: December 2006
• Receive and consider findings; implement recommendations: from April 2007 onwards
Appendix B
• Transversal systems

System: Value for money
Focus: Results achieved assessed against resources used
Data sources: Existing financial systems and performance information as envisaged by the performance information working committee
Responsibility: National Treasury

System: Human resource utilisation
Focus: Effectiveness of human resource utilisation
Data sources: To be determined as part of DPSA processes to improve HR practices
Responsibility: DPSA

System: Early warning system
Focus: Identifying where interventions are required as early as possible
Data sources: Data from departmental reports, Persal, Vulindlela and other sources
Responsibility: The Presidency, National Treasury and DPSA

System: Public administration performance
Focus: Compliance with constitutional principles
Data sources: Mix of existing information required by various frameworks and primary research
Responsibility: PSC

System: Delivery of constitutional rights
Focus: Compliance with constitutional rights
Data sources: Department of Justice M&E working committee to determine
Responsibility: Department of Justice

System: Service delivery quality
Focus: Ten key services assessed through longitudinal studies
Data sources: Original research at sentinel sites
Responsibility: Not yet determined