Rethinking the Relationship between Evaluation and Performance Measurement/Monitoring – and RBM
Robert Lahey
Presentation to the Canadian Evaluation Society
Annual Conference
Toronto: June 10, 2013
Talking Points
Complementarity? – the theory vs the practice
Some observations – Canada; international experience
Some considerations for RBM
Two Tools to Measure ‘Performance’
E – Evaluation (Evaluators)
M – Performance Measurement/Monitoring (Program Managers)
Continuum for measuring 'performance' (the results chain: inputs → activities → outputs → outcomes) – sketched below
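A minimal illustrative sketch (Python; not part of the original presentation) of this continuum. The stage names, example indicators, and the M/E assignment below are hypothetical assumptions, chosen to reflect the deck's framing: M populates indicators along the full chain, while E is brought in toward the outcome end, where attribution questions arise.

```python
from dataclasses import dataclass, field

# Hypothetical model of a results chain. Stage names and example
# indicators are illustrative only, not drawn from the presentation.
@dataclass
class Stage:
    name: str
    indicators: list[str] = field(default_factory=list)  # populated by M
    needs_evaluation: bool = False  # attribution questions answered by E

results_chain = [
    Stage("inputs", ["budget spent"]),
    Stage("activities", ["workshops delivered"]),
    Stage("outputs", ["participants trained"]),
    Stage("immediate outcomes", ["skills applied on the job"], True),
    Stage("ultimate outcomes", ["employment gains"], True),
]

# M tracks indicators along the whole chain; E is layered on where
# causal attribution matters (the outcome end of the continuum).
for stage in results_chain:
    tool = "M + E" if stage.needs_evaluation else "M"
    print(f"{stage.name:<20} -> measured by {tool}")
```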
The Theory
• M supports E
• E supports M
• Various notions of 'complementarity':
* Informational
* Sequential
* Organizational
* Methodical
* Hierarchical
• Reference: New Directions for Evaluation, No. 137, Spring 2013
The Practice
• Complementarity – is it actually being taken advantage of?
• Can and do organizations (and governments) use the M&E information in a coherent system?
• Observations: from Canada; internationally
• Experience to date – the good, the bad & the ugly
The Good
• E supporting M – derivation of performance frameworks, relevant indicators
• Moving the focus up from activities to include 'results'
• A more systematic, structured & results-oriented approach to understanding program theory & articulating expected results
• 'Methodical complementarity'
The Bad
• M not supporting E to the level expected (by central authorities & senior officials)
• To a large extent, 'results' are still not being measured by M – for a variety of reasons:
* lack of data to populate indicators
* methodological issues re measuring outcomes
* managers not equipped to carry out M (resource, skill & time constraints)
The Ugly
• Cases where E is being ignored as an important tool to measure & understand performance
• Unrealistic expectations re the ability of M to deliver a cost-effective approach to measuring outcomes
• Dumbing down of performance reporting
* Observations vs understanding
Some Conclusions
• Some level of complementarity (opportunities)
• But there are limits to this – much relates to practical implementation issues
• The extent to which M can support E is probably overstated
• Importance of informing/educating senior officials – in terms meaningful to them
Some Considerations for the Governance Model that M&E Supports
• Both M and E – key tools to generate performance information to support RBM
• 'Results' information – various uses & users:
* Learning/Knowledge – internal needs; external needs
* Accountability – internal needs; external needs
Potential Uses/Users for M&E Information
M                           E
Learning – Internal Use     Learning – Internal Use
Learning – External Use     Learning – External Use
Accountability – Internal   Accountability – Internal
Accountability – External   Accountability – External
(Both tools can, in principle, serve all four use/user combinations.)
The Practice – M, E and RBM
• Is there coordination of M and E to support RBM?
• Some differences:
* Different players in their production
* Different timelines
* (Potentially) serving different purposes
* Operational disconnect between the two?
Focus of M and E – largely on 'Accountability' for External Audiences
M                           E
Learning – Internal Use     Learning – Internal Use
Learning – External Use     Learning – External Use
Accountability – Internal   Accountability – Internal
Accountability – External   Accountability – External
(In practice, the emphasis for both M and E tends to fall on the last row: accountability reporting to external audiences.)
Rethinking the Relationship between M, E and RBM – Measurement Considerations
• How should E support M? M support E?
• Appropriate role for Evaluators? Program Managers?
• Is something missing within organizations to deliver on the measurement needs of RBM?
• Are organizations/governments willing to resource to the level needed?
• Move from silos to 'knowledge strategy'
Rethinking the Relationship between M, E and RBM – Governance Model
• What should be the appropriate balance for both M and E re:
* Uses: a focus on 'accountability' vs 'knowledge'?
* Users: Internal vs External?
• More clarity likely needed around 'uses' within organizations
• Capacity building of 'users'
Contact Coordinates
Robert Lahey
REL Solutions Inc.
Ottawa, Canada
Tel.: (613) 728-4272
E-mail: RELahey@rogers.com