Learn about how the South African government uses monitoring and evaluation to assess its performance.
Dr Ian Goldman, from the Department of Performance Monitoring and Evaluation in The Presidency, speaks at the Tshikululu Social Investments Serious Social Investing 2013 workshop.
From policy to impact - Serious Social Investing 2013
1. The Presidency
Department of Performance Monitoring and Evaluation
Why is M&E important?
What is government doing?
Presentation to Tshikululu CSI Conference
13 March 2013
Dr Ian Goldman
Head of Evaluation and Research
2. Summary
Approach – M&E as a system, not an ad-hoc donor approach
Why is M&E important?
Evidence for policy- and decision-making
Helping to get a results culture – transforming the public service
The problem
How is government approaching it?
Challenges
3. What Is Evidence-Based Policy?
• Helping policy makers to make better decisions and achieve better outcomes
• Providing better services (public and private)
By using:
• Existing evidence more effectively
• New research/evaluation to fill the gaps in the evidence base
And:
• Integrating sound evidence with decision makers’ knowledge, skills, experience, expertise and judgement
Source: Oxford Evidentia
4. Types of evidence informing policy questions (Source: Oxford Evidentia)
• How is the policy to work? Logic models, theories of change
• What is already known about the problem/policy? Research synthesis, harnessing existing evidence
• What is the nature, size and dynamics of the problem? Descriptive evidence: statistics, surveys, qualitative research, case studies, interviews, focus groups, ethnography
• What has been shown to work elsewhere? Experimental and quasi-experimental evidence, econometric evidence
• How do we make the policy work? Implementation evidence, operations research, evidence of proven effectiveness
• What are the costs and benefits of the policy? Economic evidence: cost-benefit, cost-effectiveness and utility analysis
• What are the ethical implications of the policy? Social ethics, public consultation, ethical theories
5. Views of senior managers
One view: scientific and objective, enabling reliable prediction based on facts that speak for themselves, collected by objective and independent specialists, derived through replicable methods and constituting objectively verifiable proof.
The alternative: probabilistic, emergent and contested, an iterative search for explanations and understanding of how to achieve politically derived values, in which the choice of facts and sources is influenced by existing ideas, ideology, mind-set, values and interests, and is subject to specific and changing contextual factors.
A third group straddled these views, indicating that the choice should be dictated by the type of policy to be developed and the type of research methodology appropriate to that type of policy decision.
6. Use of M&E as a change strategy: the WPTPS (White Paper on the Transformation of the Public Service)
A mission statement for service delivery, together with service guarantees;
The services to be provided, to which groups, and at which service charges, in line with RDP priorities, the principle of affordability, and the principle of redirecting resources to areas and groups previously under-resourced;
Service standards, defined outputs and targets, and performance indicators, benchmarked against comparable international standards;
Monitoring and evaluation mechanisms and structures, designed to measure progress and introduce corrective action, where appropriate;
Plans for staffing, human resource development and organisational capacity building, tailored to service delivery needs;
The redirection of human and other resources from administrative tasks to service provision, particularly for disadvantaged groups and areas;
Financial plans that link budgets directly to service needs and personnel plans;
Potential partnerships with the private sector, NGOs and community organisations to provide more effective forms of service delivery;
7. Measuring results
Having a clear direction
Having targets (you know what you want to achieve)
(Having a theory of change: a logical link between what you do and what you achieve)
Linking resources to plans, and monitoring progress against plans
Challenge of the target approach – does a good headmaster do what she does because of targets?
Be careful of taking private sector models too far.
8. So
How can we strengthen and formalise the use of evidence?
How can we formalise the need for effective theories of change?
Through strengthening planning
Through the evaluation process
How can we use M&E as part of an organisational change strategy?
9. But we have a problem…
10. 1.3 Performance Area: Monitoring and Evaluation
1.3.1 Indicator name: Use of monitoring and evaluation outputs
Indicator definition: extent to which the department uses monitoring and evaluation information.
Secondary data: AGSA findings on predetermined objectives – reported information not reliable.
Question: Which set of statements best reflects the department’s use of M&E outputs?
Level 1: Department does not have an M&E policy/framework or does not have capacity to generate information. Evidence: not required.
Level 2: Monitoring reports are available but are not used regularly by top management and programme managers to track progress and inform improvement. Evidence: quarterly monitoring reports; minutes of top management meetings or programme meetings to assess use of reports.
Level 3: Monitoring reports are regularly used by top management and programme managers to track progress and inform improvement. Evidence: quarterly monitoring reports; minutes of top management meetings or programme meetings to assess use of reports.
Level 4: All of Level 3 plus: evaluations of major programmes are conducted periodically and the results are used to inform changes to programme plans, business processes, the APP and the strategic plan. Evidence: all of Level 3 plus evaluation reports; changes to programmes and plans.
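For readers who want to work with this rubric programmatically, a minimal Python sketch follows. It is illustrative only and not part of the official MPAT tooling; the names MNE_USE_RUBRIC and assess_level are hypothetical, and the helper simply treats the evidence requirements as cumulative ("all of Level 3 plus") and caps a department's claimed level at the highest level whose evidence has actually been supplied.

# Minimal, illustrative sketch (not official MPAT tooling): the rubric above
# expressed as a lookup structure, with a helper that caps a claimed level at
# the highest level whose required evidence is actually supplied.

MNE_USE_RUBRIC = {
    1: {"statement": "No M&E policy/framework, or no capacity to generate information",
        "evidence": []},
    2: {"statement": "Monitoring reports available but not used regularly by top "
                     "management and programme managers",
        "evidence": ["quarterly monitoring reports",
                     "minutes of top management or programme meetings"]},
    3: {"statement": "Monitoring reports regularly used to track progress and inform improvement",
        "evidence": ["quarterly monitoring reports",
                     "minutes of top management or programme meetings"]},
    4: {"statement": "Level 3 plus periodic evaluations of major programmes informing "
                     "changes to plans and business processes",
        "evidence": ["evaluation reports",
                     "changes to programmes and plans"]},
}

def assess_level(claimed_level, evidence_supplied):
    """Return the highest performance level (up to the claimed level) whose
    cumulative evidence requirements are all present in evidence_supplied."""
    achieved = 1
    required = set()
    for level in range(2, claimed_level + 1):
        required |= set(MNE_USE_RUBRIC[level]["evidence"])
        if required <= set(evidence_supplied):
            achieved = level
    return achieved

# Example: a department claims Level 4 but only supplies monitoring evidence,
# so under this simplified rule the moderated result is Level 3.
print(assess_level(4, ["quarterly monitoring reports",
                       "minutes of top management or programme meetings"]))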
11. Score in M&E (based on self-assessments by 103 national and provincial departments)
12. Problem
Evidence and analysis not used sufficiently in decision-making, planning or budgeting, particularly of programmes
44% of national and provincial departments not regularly using monitoring reports to improve performance
Monitoring undertaken as compliance, not as part of a culture of continuous improvement
Evaluation applied sporadically and not informing planning, policy-making and budgeting sufficiently – missing the opportunity to improve Government’s effectiveness, efficiency, impact and sustainability
Parliament relatively weak compared to the executive, so oversight limited (on this trip to the US and Canada, to improve understanding of the committee that oversees DPME)
14. Roles and Responsibilities for Planning and M&E in SA
Auditor General (constitutional power):
• Independent monitoring of compliance reporting
• Auditing of performance information
• Reporting to Parliament
Public Service Commission (constitutional power):
• Independent monitoring and evaluation of the public service
• Focus on adherence to the public service principles in the Constitution
• Reporting to Parliament
National Treasury (legal power):
• Regulate departmental 5-year and annual plans
• Receive quarterly performance information
• Expenditure reviews
Cooperative Governance Dept (DCOG) (legal power):
• Regulate local government planning
• Monitor performance of local government
• Intervention powers over local government
Public Service Dept (DPSA) (legal power):
• Monitor the national and provincial public service
• Regulate service delivery improvement
Presidency (executive power):
• National Planning Commission (NPC): produce long-term plan (20 years)
• Department of Performance Monitoring and Evaluation (DPME): produce government-wide M&E frameworks; facilitate production of whole-of-government 5-year plans for priorities; monitor and evaluate plans for priorities as well as performance of individual departments and municipalities
15. Focus of DPME to date
M&E of national priorities:
• Plans for the 12 priority outcomes (delivery agreements)
• Monitoring (i.e. tracking) progress against the plans
• Evaluating to see how to improve programmes, policies and plans (2012/13: 8 evaluations, then 15, then 20)
Management performance M&E:
• Assessing the quality of management practices in individual departments (MPAT) at national and provincial level
• Moderated self-assessment and continuous improvement
M&E of front-line service delivery:
• Monitoring the experience of citizens when obtaining services (jointly with provinces)
• Presidential Hotline: analysing responses and follow-up
Government-Wide M&E System:
• M&E platforms across government, nationally and provincially
• Data quality issues
• Structures of M&E units and capacity development
• Emerging focus on (implementation) programmes
• National Evaluation System (initially focused on the National Evaluation Plan)
16. Why evaluate?
Improving policy or programme performance (evaluation for continuous improvement): this aims to provide feedback to programme managers.
Evaluation for improving accountability: where is public spending going? Is this spending making a difference?
Improving decision-making: should the intervention be continued? Should how it is implemented be changed? Should an increased budget be allocated?
Evaluation for generating knowledge (for learning): increasing knowledge about what works and what does not with regard to a public policy, programme, function or organisation.
17. Different types of evaluations related to questions around the outcome model
• Diagnostic evaluation: what is the underlying situation and what are the root causes of the problem?
• Design evaluation: does the theory of change seem strong?
• Implementation evaluation: what is happening and why?
• Impact evaluation: has the intervention had an impact at outcome and impact level, and why?
• Economic evaluation: what are the cost-benefits?
18. Following up the evaluations
Evaluation report
1-page policy summary, 3-page executive summary, 25-page report
Management response
Each department responds formally, and the response is also put on the website
Improvement plan
Developed with the departments involved after the report is approved
Monitored
Communication
Development of customised communication materials for different audiences
Evaluation report, management response and improvement plan put on the department and DPME websites
19. Challenges emerging
Overall the system is working but some challenges are emerging. These include:
Poor communication channels from some DGs, so programme managers are often not aware of the possibility of evaluation
Some senior managers wary and not seeing evaluation as an opportunity to improve their performance; the right people are not attending briefing sessions, so senior managers don’t understand the system and haven’t bought in
Making sure the evaluations proposed are the strategic ones
Departments sometimes not budgeting for evaluations and expecting DPME to provide all the money
Departments not planning ahead – very important for impact evaluations in particular, where there is a need to plan 3+ years ahead
Some avoidance strategies, e.g. parallel evaluations or not providing information to evaluators.
20. So we are developing a corpus of evaluations
7 underway
16 being scoped
93 evaluations from 2006 onwards that will go on the website in May
15 for 2014/15…
We are on the journey
22. 8 evaluations in the National Evaluation Plan 2012-13
1. Impact Evaluation of the National School Nutrition Programme (NSNP). (DBE)
2. Impact Evaluation of Grade R. (DBE)
3. Implementation Evaluation of the Integrated Nutrition Programme. (Health)
4. Implementation Evaluation of the Land Reform Recapitalisation and Development Programme. (Department of Rural Development and Land Reform)
5. Implementation Evaluation of the Comprehensive Rural Development Programme. (Department of Rural Development and Land Reform)
6. Implementation/design evaluation of the Business Process Services Incentives Scheme. (Department of Trade and Industry)
7. Implementation Evaluation of the Integrated Residential Development Programme (IRDP). (Department of Human Settlements)
8. Implementation Evaluation of the Urban Settlements Development Grant (USDG). (Department of Human Settlements)
23. Evaluations recommended for 2013/14
1. Evaluation of Export Marketing Investment Assistance incentive programme (DTI).
2. Evaluation of Support Programme for Industrial Innovation (DTI).
3. Impact evaluation of Technology and Human Resources for Industry programme (DTI).
4. Evaluation of Military Veterans Economic Empowerment Programme (Military Veterans).
5. Impact evaluation of Tax Compliance Cost of Small Businesses (SARS).
6. Impact evaluation of the Comprehensive Agriculture Support Programme (DAFF).
7. Evaluation of the Socio-Economic Impact of Restitution programme (DRDLR).
8. Evaluation of the Quality of the Senior Certificate (DBE).
24. 2013/14 continued
9. Setting the Baseline for Impact Evaluation of the Informal Settlements targeted for upgrading (DHS).
10. Evaluating interventions by the Department of Human Settlements to facilitate access to the city (DHS).
11. Provision of state subsidised housing and asset poverty for households and local municipalities (DHS).
12. Impact evaluation of the Community Works Programme (DCOG).
13. Evaluation of the National Advanced Manufacturing Technology Strategy (DST).
14. Impact Evaluation of the Outcomes Approach (DPME).
15. Impact/implementation evaluation of national coordination structures including the cluster system (Presidency).
A range of departments and institutions are responsible for planning and M&E in SA. The Constitution mandates the Auditor General and the Public Service Commission to carry out independent monitoring of certain aspects of government and report on this to Parliament. Three national departments have strong legal powers to regulate certain types of planning and M&E. Since 2009, the Presidency has also taken on certain planning and M&E roles, but to date we have relied on positional power rather than legal powers to effect these.
These complex and fragmented institutional responsibilities are the result of incremental changes over time, arising from successive public sector reform initiatives since 1994. This process has resulted in a number of gaps and overlaps in planning and M&E, differences in approaches to planning and some conflicting instructions to departments. Departments have to run parallel systems to service the various reporting requirements, resulting in a reporting over-burden, reporting fatigue, and a tendency to focus on compliance with reporting requirements rather than use of information by management. The reasons why this situation has not yet been addressed include a lack of coordination between the various centre-of-government departments, a lack of clarity regarding the role of the Presidency in addressing such issues, and general change-fatigue and wariness about further change when we are still consolidating post-1994 changes.
We are currently considering introducing legislation to address some of these gaps and overlaps. We would also aim to institutionalise the role of the Presidency with such legislation, to reduce the risk of relying on a single strong political champion for M&E, who may not survive political changes.
To date my Department has been focused on three levels of performance monitoring and evaluation. At the highest level, our focus since 2010 has been on facilitating the development of plans for priorities such as basic education, health, reducing crime and creating employment. The main aims of this initiative have been to increase the strategic focus of government, to introduce results-based planning on a sectoral basis, to increase coordination between departments and across spheres of government, and to use M&E of progress against the plans to foster a culture of evidence-based continuous improvement.
The 10 and 15 year reviews of the post-apartheid government’s performance, which were carried out by the Presidency, came to the conclusion that a key challenge in SA is implementation of policies, which in turn is related to management weaknesses. At the next level we have therefore introduced a management performance assessment mechanism, informed by the Canadian Management Accountability Framework. The methodology is based on self-assessment against standards in key management areas, coupled with verification against secondary data. It is being implemented in partnership with the provinces, and the results of the assessments will be presented to Cabinet and the Provincial Executive Councils on an ongoing basis, together with monitoring reports on the implementation of improvement plans.
A key political imperative in South Africa is to improve the quality of services provided directly to citizens. We are therefore also carrying out M&E at the level of frontline service delivery by visiting service delivery sites together with the Offices of the Premier in the provinces. The focus is on whether service delivery standards are in place and are being adhered to. The results and monitoring reports on the implementation of improvement plans are similarly presented to Cabinet and the Provincial Executive Councils. We have also introduced a Presidential Hotline through which the public can lodge service delivery complaints. In future we would like to introduce more citizen-based monitoring, in partnership with civil society organisations.
Note: the Minister is clear that he wants all 4.
If presenting the following slides on each type of evaluation, then this can be brief here. Mention evaluation synthesis too.