Presentation by Howard White, Chief Executive Officer, Campbell Collaboration, at the event on Governing better through evidence-informed policy making, 26-27 June 2017. For further information please see http://www.oecd.org/gov/evidence-informed-policy-making.htm
1. Howard White
CEO, Campbell Collaboration
International cooperation on evidence-informed policy making
for better policy and practice
@campbellreviews @HowardNWhite #Evidence4gov
4. Wave one: New public management and
the results agenda
• Origins in the US, UK, Australia and New Zealand
• Adopted by the Clinton and Blair governments
• Shifted focus from monitoring inputs (how much
money we spend)…
• to outcomes (families lifted out of poverty, women
empowered, children protected from abuse etc.)
5. UK Modernising Government
(aka ‘the results agenda’)
The Government wants to ensure the
effectiveness of the services the public
receive. That is what makes a difference
to the quality of people’s lives. The way
to do this is through Public Service
Agreements (PSAs).
UK Cabinet Office
6. DFID PSA Performance Targets, 1999-2002
Source: Public Services for the Future: Modernisation, Reform, Accountability Comprehensive
Spending Review: Public Service Agreements 1999-2002 Cm 4181
7. All this may sound rather familiar...
Government
Performance and
Results
Act, 1993
8. Application by USAID
• USAID: six strategic development goals
• E.g. “broad-based economic growth and
agricultural development encouraged”
• For each goal defined outcome indicators at both
country and global levels
• E.g. “average annual growth rates in real per capita
income above 1 per cent”
9. • FY 2000 performance report states that “nearly 70
per cent of USAID-assisted countries were growing
at positive rates in the second half of the 1990s,
compared with 45 per cent in the early part of the
decade”
But: ‘one cannot reasonably attribute overall country progress to USAID programs’
GAO: ‘so broad and progress affected by many factors other than USAID programmes, [that] the indicators cannot realistically serve as measures of the agency’s specific efforts’
10. And so…
USAID abandoned the use of strategic indicators as performance measures (retaining them as ‘Development Performance Benchmarks’)
This does not mean we should NOT do monitoring… but we must know what it can and cannot do
11. Monitoring: factual data on what happened. Especially useful at the lower reaches of the causal chain
12. So how do we measure what difference a
programme makes, i.e. impact?
By using rigorous impact evaluations
with a valid comparison group to control
for selection bias, preferably a
randomized controlled trial
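The logic of slide 12 can be sketched in a few lines of Python (a hypothetical simulation, not part of the talk; all names and numbers are invented for illustration): when better-off people self-select into a programme, a naive treated-vs-untreated comparison overstates the effect, while random assignment recovers it.

```python
import random

random.seed(0)

TRUE_EFFECT = 2.0  # assume the programme raises the outcome by 2 units

# Baseline "ability" varies across people; under voluntary enrolment,
# higher-baseline people tend to opt in (selection bias).
people = [random.gauss(10, 2) for _ in range(10_000)]

def outcome(baseline, treated):
    return baseline + (TRUE_EFFECT if treated else 0.0)

# Self-selection: the better-off half opt into the programme.
cutoff = sorted(people)[len(people) // 2]
self_treated = [outcome(b, True) for b in people if b >= cutoff]
self_control = [outcome(b, False) for b in people if b < cutoff]

# Randomized controlled trial: a coin flip assigns treatment.
rct_treated, rct_control = [], []
for b in people:
    if random.random() < 0.5:
        rct_treated.append(outcome(b, True))
    else:
        rct_control.append(outcome(b, False))

def mean(xs):
    return sum(xs) / len(xs)

naive_estimate = mean(self_treated) - mean(self_control)  # inflated by selection
rct_estimate = mean(rct_treated) - mean(rct_control)      # close to TRUE_EFFECT
```

Here the naive difference in means conflates the programme effect with pre-existing differences between those who join and those who do not; randomization breaks that link, which is why the comparison group is "valid".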
13. So how are we to measure impact?
Wave Two: The
Randomization
Revolution
14. Number of social work RCTs published by year
Source: Calculated from Bruce Thyer ‘A Bibliography of Randomized Controlled Experiments in Social Work (1949–2013): Solvitur Ambulando’ Research
on Social Work Practice 2015, Vol. 25(7) 753-793
19. Evidence-driven project cycle
Consult evidence base to inform design → formative testing in local context → pilot programme: efficacy studies → go to scale with promising components: effectiveness studies → keep testing as roll out to new populations / contexts / design features → synthesize evidence across all studies
20. • Systematic reviews: systematic summaries of the available high-quality evidence.
• Non-systematic reviews (or low-quality systematic reviews) carry a greater risk of bias
• Cochrane: founded 1993
• Campbell: founded 2000
21. Different models around the world
• Nordic model: government-funded research centres (e.g. SFI, SBU and FHI); government funded; systematic reviews, some adherence to Cochrane and Campbell standards; integrated into decision making (demand driven)
• UK model: What Works Centres; mixed funding (e.g. Big Lottery); variety of evidence synthesis; each WWC has to find its ‘pathway to policy influence’
• US model: (1) WWCHs, (2) Moneyball for Government; some government funding plus foundation funding; often single-study based (note conflict of interest); (1) portals, (2) advocacy model
• Latin American model: central government evaluation agencies; government funded; oversee the M&E framework for government-funded programmes; promotes rigorous evidence and evidence-based decisions
22. But these are all separate initiatives, all producing separate reviews of the same global evidence… health is something of an exception, with some coordination through the Cochrane Library
23. Institutionalisation of the use of
evidence: health
The World Health Organization (WHO) follows a guideline
development process, described in detail in the WHO
Handbook for Guideline Development (2nd edition),
overseen by the Guidelines Review Committee (GRC)
established by the Director-General in 2007. The WHO
Guidelines Review Committee ensures that WHO
guidelines are of a high methodological quality, developed
using a transparent and explicit process, and are
informed by high quality systematic reviews of
the evidence using state-of-the-art systematic search
strategies, synthesis, quality assessments and methods.
24. UK Health: NIHR-NICE
National Institute for Health Research (NIHR):
• Provides infrastructure support to 21 Cochrane Groups
• NIHR Cochrane Programme Grant Scheme funds reviews of
relevance to NHS
• NIHR Cochrane Incentive Awards to accelerate reviews
National Institute for Health and Care Excellence (NICE) uses
systematic reviews for:
• Guideline production
• Eligibility for NHS resources
25. But we do not see the same coordination in other sectors
E.g. multiple agencies commissioning the same review on child welfare issues
E.g. the new EC Humanitarian Assistance Knowledge Centre vs Evidence Aid
27. The current uncoordinated model
Question setting → synthesis of global evidence → knowledge brokering → use of evidence in policy and practice
Opportunities for coordination:
• Evidence standards (ESI and GRADE)
• Sharing findings
• Demand coordination: sharing workplans, coordinating workplans, a common workplan
28. A coordinated approach
Joint question setting → synthesis of global evidence (a single set of findings) → local knowledge brokering for local policy and practice stakeholders → use of evidence in policy and practice
Campbell is supporting this approach with a pooled fund for child-related reviews: but few takers
29. Why is coordination not happening?
• Timelines
• National mandate
• Control / trust
ALL of these reasons are short-sighted, resulting in a
missed opportunity to truly build a global evidence
base in social policy
We need to coordinate evidence demand to build a
single global repository in the Campbell Library