ESRC Evaluation Strategy
Mr Luke Moody, Deputy Head of Evaluation, Corporate Strategy and
Analysis
Content
▶ ESRC Evaluation Committee
▶ ESRC Evaluation Strategy
▶ Project Evaluation
▶ Large Investment and Policy Evaluation
▶ International Benchmarking
▶ Impact Evaluation
ESRC Evaluation Committee
▶ Oversees evaluation of the quality and impact of
  all ESRC investments
▶ Responsible for advising Council on the
  successful achievement of its corporate strategy
▶ Operates independently of the Policy
  Committees, reports directly to Council
Evaluation Strategy – cuts across all
Committee and Network remits
▶ Impact through World Class Research: Evaluations of the academic quality
  and impact of ESRC research and UK Social Sciences more generally – to
  advise Council on the impact and quality of its research portfolio, and to
  advise the Research Committee on further investments
▶ Impact through Skilled People: Evaluations of ESRC funding schemes and
  training investments – to advise Council and the Training and Skills
  Committee on the impact and further development of research training
  initiatives
▶ Impact through Infrastructure: Evaluations of ESRC funding schemes and
  resource investments – to advise Council and the Methods and
  Infrastructure Committee on the impact and further development of research
  methods and infrastructure initiatives
▶ Impact through International Leadership: Evaluation of internationally
  focused ESRC funding schemes and investments – to advise Council and
  the Policy Committees on the impact and further development of initiatives
  that promote international collaboration
▶ Impact through Partnerships: Evaluations of the impact of ESRC
  partnerships and investments on policy and practice – to advise Council and
  the Policy Committees on the practical impact of its partnership building, and
  on ways to increase this contribution
Why evaluate?
▶ The main purposes of evaluation are to:
▶ provide an assessment of accountability, i.e. whether
  public funds were spent as agreed
▶ assess whether a project has been conducted
  effectively, whether it has met its objectives and to
  make an assessment of the quality and impact of the
  research
▶ provide award holders with some feedback about the
  management, quality and rigour of the research, and
  to provide comments on uses or potential uses of
  the research
▶ learn lessons to inform ongoing and future activities
Project Evaluation
▶ All ESRC grant holders must provide:

▶ End of Award Report (3 months after grant end).
▶ Impact Report (12 months after grant end).

▶ Reports are subject to external peer review and grading.
▶ Cumulative results reported annually to Council.
Investment and Policy Evaluation

▶ Annual programme of large investment, scheme
  and policy evaluations.
▶ Consultation with Policy Committees.
▶ Undertaken by external evaluators.
▶ Reports presented to Council and relevant
  Committees, alongside a synthesis of key findings
  and recommendations.
▶ Mechanism in development for tracking
  recommendations.
International Benchmarking
▶ Assess comparative performance of UK Social
  Science disciplines.
▶ In partnership with Learned Societies.
▶ Review series aims to:
  – highlight the standing and contribution of UK
    disciplines
  – identify ways of enhancing performance and
    capacity
  – promote future research agendas
Impact Evaluation – Aims
▶ To identify and analyse evidence of research
  impact on policy and practice.
▶ To understand how impact is generated, and
  help the ESRC to improve its performance in this
  area.
▶ To develop impact evaluation methods.
Impact Evaluation – Critical Issues
▶ What are the problems?
  – Timing – how long to wait?
  – Attribution – what role has research played in change?
  – Additionality – what is ESRC’s contribution?
▶ Different types of Impact:
  – Instrumental
  – Conceptual
  – Capacity Building
▶ Looking for demonstrable impact on policy and
  practice
Impact Evaluation – Case Studies
▶ Experimental Policy and Practice Case Studies –
  2005 onwards
▶ Aims:
  – trial methods – ‘what works’
  – produce evidence – ‘demonstrating impact’
  – improve understanding – ‘how does impact occur’
▶ Testing:
  – wide range of methods
  – wide range of contractors
▶ Key Requirements:
  – conceptual framework
  – understanding the ‘how’ as well as the ‘what’
Drivers of Impact
▶ Established relationships and networks with user
  communities
▶ Involving users at all stages of the research
▶ Well-planned user-engagement and knowledge
  exchange strategies
▶ Portfolios of research activity that build reputations
  with research users
▶ Good infrastructure and management support
▶ Where appropriate, the involvement of
  intermediaries and knowledge brokers as translators,
  amplifiers, network providers
Economic Impact – Pilot
▶ Pilot Study: Some success in valuing outputs,
  but quantifying the wider impact of research was
  more challenging
  – Two main barriers:
   ▪ Disentangling multiple contributors to policy development
   ▪ General lack of evidence on economic impact of government policies
  – Access to robust evaluation data on the impact of
    individual policies is needed in order to quantify
    ESRC’s contribution to those policies
Economic Impact – Tracking Back

▶ Tracking Back Studies – working backwards
  from policy or practice initiatives that have been
  subject to evaluation
▶ Assessing ESRC research contributions to the
  initiative
▶ Estimating economic value of ESRC’s impact,
  using national evaluation data as a benchmark
▶ EMA and Pathways to Work
Developing Impact Evaluation

▶ Conceptual Impact
  – Child Poverty Policy
▶ Impact of Infrastructure
  – Economic impact of infrastructure investment: the
    Economic and Social Data Service (ESDS)
▶ Impact of Skilled People
  – People-flow impacts in the Welsh Government
Further Information
▶ ‘Taking Stock’ and ‘Branching Out’:
  http://www.esrc.ac.uk/impacts-and-findings/impact-assessment/developing-impact-evaluation.aspx
▶ Policy and Practice Case Studies:
  http://www.esrc.ac.uk/impacts-and-findings/impact-assessment/policy-practice-impacts.aspx
▶ Economic Impact Evaluation:
  http://www.esrc.ac.uk/impacts-and-
ESRC’s Contacts
▶ Speaker:
  – Mr Luke Moody, Deputy Head of Evaluation, Corporate
    Strategy and Analysis
        luke.moody@esrc.ac.uk


▶ ESRC website
  – www.esrc.ac.uk

Editor’s notes

  1. The Evaluation Committee is chaired by a Council member. Other members are appointed by Council for two years and are drawn from a range of academic and non-academic backgrounds. Members: Prof Ann Buchanan, University of Oxford (Chair); Prof Paul Anand, Economics, Open University; Prof Tara Fenwick, Stirling Institute of Education, University of Stirling; Prof Brian Francis, Mathematics and Statistics, Lancaster University; Steven Marwick, Evaluation Support Scotland; Jeremy Mayhew, Public User Member; Prof Paul Milbourne, Cardiff University; Dr Paul Nightingale, University of Sussex; Jeremy Peat, BBC National Trustee for Scotland; Prof Ken Starkey, University of Nottingham; Prof Sandra Walklate, University of Liverpool; Prof Paul Whiteley, University of Essex; Penny Young, National Centre for Social Research; Vicki Crossley, ESRC (Secretary).
  2. Five distinct evaluation areas that cover the full remit of ESRC activity
  3. Six reviews have now been undertaken: Social Anthropology, Politics and International Studies, Economics, Sociology, Psychology and Human Geography. The Human Geography review is due to be published early in 2013. A forthcoming review of AIM will look more broadly at the Management and Business Studies discipline.
  4. These outcomes have been fed back into the management of large ESRC research investments.
  5. Pilot study in 2008; used evidence from two Research Centres: Centre for Economic Performance (CEP) and Centre on Skills, Knowledge and Organisational Performance (SKOPE).
  6. Focusing on conceptual impacts – how social science changes ideas and the general debate, thinking and culture around a specific issue. A further development focused on assessing the economic impact of infrastructure resources (ESDS); this examined the value and use of the datasets held by what is now called the UK Data Service. Also looking at how social science contributes to the training of skilled individuals, and assessing the impact that they make on society. Currently working with the Government Economic Service and Government Social Research; a Government Statistical Service assessment will also be published.