Importance of Monitoring and Evaluation
Lecture Overview 
 Monitoring and Evaluation 
 How to Build an M&E System
What is M, what is E, why and how to monitor 
MONITORING & EVALUATION
What is Monitoring 
 Ongoing process that generates information to inform decisions 
about the program while it is being implemented. 
 Routine collection and analysis of information to track progress 
against set plans and check compliance with established standards 
 Helps identify trends and patterns, adapt strategies, and inform 
decisions 
 Key words: 
• Continuous – ongoing, frequent in nature 
• Collecting and analyzing information – to measure progress 
towards goals 
• Comparing results – assessing the performance of a 
program/project
Why is Monitoring Important? 
 Evidence of how much has been or has NOT been achieved 
• Quantitative: numbers, percentages 
• Qualitative: narrative or observation 
 Examination of trends 
 Highlight problems 
 Early warning signs 
 Corrective actions 
 Evaluate effectiveness of management action 
 Determine achievement of results
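Routine monitoring of this kind boils down to comparing actual values against planned targets and flagging shortfalls early. A minimal sketch, with hypothetical indicator names and numbers chosen only for illustration:

```python
# Minimal sketch of routine monitoring: compare actual indicator values
# against planned targets and flag early-warning signs.
# All indicator names and numbers below are hypothetical examples.

def progress_ratio(actual, target):
    """Fraction of the target achieved so far."""
    return actual / target if target else 0.0

def flag_early_warnings(indicators, threshold=0.8):
    """Return indicators whose progress falls below the threshold."""
    return [name for name, (actual, target) in indicators.items()
            if progress_ratio(actual, target) < threshold]

indicators = {
    "children reached": (450, 500),      # 90% of target
    "volunteers trained": (30, 60),      # 50% of target -- lagging
    "materials delivered": (950, 1000),  # 95% of target
}

print(flag_early_warnings(indicators))  # -> ['volunteers trained']
```

A flagged indicator is the "early warning sign" that prompts corrective action while the program is still running.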
What is Evaluation 
 Evaluation is an assessment of an intervention to determine its 
relevance, efficiency, effectiveness, impact, and sustainability. The 
intention is to provide information that is credible and useful, 
enabling incorporation of lessons learned into decision making 
processes. 
 Key Words: 
• Assessment – of the value of an event or action 
• Relevance 
• Efficiency 
• Effectiveness 
• Impact 
• Sustainability 
• Lessons learned
What is Evaluation? 
Evaluation comprises: 
• Program Evaluation 
• Impact Evaluation
What is M and what is E? 
Monitoring 
• Measures progress towards goals, but doesn't tell us the extent to which results were achieved, or the impact 
• Continuous, frequent 
• Has to take place during the intervention 
Evaluation 
• Measures whether progress towards the goal is caused by the intervention – causality 
• Infrequent, time-bound 
• Can evaluate an ongoing or completed intervention
Monitoring and Evaluation 
M&E combines: 
• Monitoring 
• Evaluation (program evaluation and impact evaluation)
Components of Program Evaluation 
 Needs assessment: What are the characteristics of the target population? What are the risks and opportunities? What programs are most suitable? 
 Program theory assessment: What is the logical chain connecting our program to the desired results? 
 Monitoring and process evaluation: Is the program being rolled out as planned? Is there high uptake among clients? What do they think of it? 
 Impact evaluation: What was the impact and the magnitude of the program? 
 Cost effectiveness: Given the magnitude of impact and cost, how efficient is the program? 
Are your questions connected to decision-making?
Program Evaluation
Who is this Evaluation For? 
 Academics 
 Donors 
• Their Constituents 
 Politicians / policymakers 
 Technocrats 
 Implementers 
 Proponents, Skeptics 
 Beneficiaries
How can Impact Evaluation Help Us? 
 Answers the following questions: 
• What works best, why, and when? 
• How can we scale up what works? 
 There is surprisingly little hard evidence on what works 
 With better evidence, we can do more with a given budget 
 If people knew money was going to programmes that worked, it could 
help increase the pot for anti-poverty programmes
Programs and their Evaluations: Where do we Start? 
Intervention 
 Start with a problem 
 Verify that the problem actually 
exists 
 Generate a theory of why the 
problem exists 
 Design the program 
 Think about whether the 
solution is cost effective 
Program Evaluation 
 Start with a question 
 Verify the question hasn’t been 
answered 
 State a hypothesis 
 Design the evaluation 
 Determine whether the value 
of the answer is worth the cost 
of the evaluation
Life Cycle of a Program (illustrated with a remedial reading program) 
1. Theory of Change / Needs Assessment 
2. Designing the program to implement – e.g. distributing reading materials and training volunteers 
3. Background preparation, logistics, roll-out of the program – reading materials delivered; volunteers trained; target children are reached; classes are run, volunteers show up; attendance in classes 
4. Baseline evaluation 
5. Monitoring implementation – process evaluation; progress towards targets 
6. Planning for continuous improvement – entire district is covered; refresher training of teachers; tracking the target children, convincing parents to send their child; incentives for volunteers to run classes daily and efficiently (motivation); efforts made for children to attend regularly; improved coverage 
7. Endline evaluation 
8. Reporting findings – impact and process evaluation findings 
9. Using the findings to improve the program model and delivery – change or improvement
Program Theory – a Snapshot 
Inputs → Activities → Outputs → Outcomes → Impacts 
• Implementation covers inputs, activities, and outputs 
• Results cover outcomes and impacts
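The results chain above can be written down as a small data structure. A minimal sketch; the stage names come from the slide, and the implementation/results grouping follows the diagram:

```python
# Sketch of the program-theory results chain as a simple data structure.
# Implementation monitoring covers the first three stages; results
# monitoring covers the last two, as in the slide's diagram.

RESULTS_CHAIN = ["inputs", "activities", "outputs", "outcomes", "impacts"]

def monitoring_level(stage):
    """Classify a stage as implementation or results monitoring."""
    if stage in ("inputs", "activities", "outputs"):
        return "implementation"
    if stage in ("outcomes", "impacts"):
        return "results"
    raise ValueError(f"unknown stage: {stage}")

print({stage: monitoring_level(stage) for stage in RESULTS_CHAIN})
```

Keeping the chain explicit like this makes it easy to check that every indicator in an M&E plan maps to exactly one stage.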
With a focus on measuring both implementation and results: 
HOW TO BUILD AN M&E SYSTEM
Methods of Monitoring 
 First-hand information 
 Citizen reporting 
 Surveys 
 Formal reports by project/programme staff 
• Project status report 
• Project schedule chart 
• Project financial status report 
 Informal reports 
 Graphic presentations
Monitoring: Questions 
 Is the intervention implemented as designed? Does the program 
perform? 
Inputs → Outputs → Outcomes, checked against implementation plans and targets 
 Are intervention money, staff, and other inputs available and put to 
use as planned? Are inputs used effectively? 
 Are the services being delivered as planned? 
 Is the intervention reaching the right population and target 
numbers? 
 Is the target population satisfied with services? Are they utilizing the 
services? 
 What is the intensity of the treatment?
Implementing Monitoring 
 Develop a monitoring plan 
• How should implementation be carried out? What is going to be 
changed? 
• Are the staff's incentives aligned with the project? Can they be 
incentivized to follow the implementation protocol? 
• How will you train staff? How will they interact with beneficiaries 
or other stakeholders? 
• What supplies or tools can you give your staff to make following 
the implementation design easier? 
• What can you do to monitor? (Field visits, tracking forms, 
administrative data, etc.) 
• Intensity of monitoring (frequency, resources required,…)?
Ten Steps to a Results-Based Monitoring and Evaluation System 
1. Conducting a readiness and needs assessment 
2. Agreeing on outcomes to monitor and evaluate 
3. Selecting key indicators to monitor outcomes 
4. Gathering baseline data on indicators 
5. Planning for improvement – selecting realistic targets 
6. Monitoring for results 
7. Using evaluation information 
8. Reporting findings 
9. Using findings 
10. Sustaining the M&E system within the organization
Step 1: Conducting a needs and readiness assessment 
 What are the current systems that exist? 
 What is the need for the monitoring and evaluation? 
 Who will benefit from this system? 
 At what levels will the data be used? 
 Do we have the organizational willingness and capacity to establish the M&E 
system? 
 Who has the skills to design and build the M&E system? Who will 
manage it? 
 What are the barriers to implementing an M&E system on the ground 
(e.g. a resource crunch)? 
 How will you overcome these barriers? 
 Will there be pilot programs that can be evaluated within the M&E 
system? 
- DO WE GO AHEAD?
Step 2: Agreeing on outcomes to monitor and evaluate 
 What are we trying to achieve? What is the vision that our M&E system 
will help us achieve? 
 Are there national or sectoral goals (commitment to achieving the 
MDGs)? 
 Is there political or donor-driven interest in particular goals? 
 In other words, what are our outcomes? Improving coverage, learning 
outcomes… broader than focusing merely on inputs and activities
Step 3: Selecting key indicators to monitor outcomes 
 Identify WHAT needs to be measured so that we know we have 
achieved our results 
 Avoid broad-based results; assess indicators for feasibility, time, 
cost, and relevance 
 Indicator development is a core activity in building an M&E system 
and drives all subsequent data collection, analysis, and reporting 
 Arriving at indicators will take some time 
 Identify plans for data collection, analysis, and reporting 
PILOT! PILOT! PILOT!
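One way to keep indicator development concrete is to give every indicator a record that ties WHAT is measured to its data collection plan. A hedged sketch; the field names and example values are illustrative assumptions, not a standard M&E schema:

```python
# Sketch of an indicator record tying measurement to a data collection
# plan. Field names and the example values are illustrative assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Indicator:
    name: str          # WHAT gets measured
    unit: str          # e.g. "%" or "count"
    source: str        # data collection plan
    frequency: str     # how often it is measured
    baseline: Optional[float] = None   # filled in at step 4
    target: Optional[float] = None     # set at step 5

reading = Indicator(
    name="share of grade-3 children who can read a paragraph",
    unit="%",
    source="school-based reading assessment",
    frequency="annual",
    baseline=40.0,
    target=70.0,
)
print(reading.target - reading.baseline)  # gap the program must close
```

Piloting then amounts to trying to fill in every field for a handful of cases before committing to full-scale collection.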
Step 4: Gathering baseline data on indicators 
 Where are we today? 
 What is the performance of indicators today? 
 Sources of baseline information: Primary or Secondary data 
 Data types: Qualitative or Quantitative 
 Data collection instruments
Step 5: Planning for improvement – selecting realistic targets 
 Targets – quantifiable levels of the indicators 
 Sequential, feasible and measurable targets 
 If we reach our sequential set of targets, then we will reach our 
outcomes! 
 Time-bound – universal enrolment by 2015 (outcome: better economic 
opportunities), every child immunized by 2013 (outcome: reduction in 
infant mortality), etc. 
 Available funding and resources must be taken into account 
Target 1 → Target 2 → Target 3 → Outcomes
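The idea of sequential, time-bound targets can be sketched in a few lines: each target is a quantifiable level of the indicator with a deadline, and the program is on track when every target already due has been met. The dates and values below are hypothetical illustrations:

```python
# Hedged sketch: targets as quantifiable, time-bound indicator levels.
# All dates and levels are hypothetical illustrations.
from datetime import date

targets = [  # sequential targets for one indicator, e.g. % children enrolled
    {"due": date(2013, 12, 31), "level": 60},
    {"due": date(2014, 12, 31), "level": 80},
    {"due": date(2015, 12, 31), "level": 100},  # universal enrolment
]

def on_track(achieved_level, today, targets):
    """True if every target already due has been met."""
    return all(achieved_level >= t["level"]
               for t in targets if t["due"] <= today)

print(on_track(82, date(2015, 1, 15), targets))  # -> True (60 and 80 met)
```

Reaching the full sequence of targets is, by construction, reaching the outcome.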
Step 6: Monitoring for implementation and results 
Inputs → Activities → Outputs → Outcomes → Impacts 
 Implementation monitoring (inputs, activities, outputs): provision of 
materials; training of volunteers; usage of material; number of 
volunteers teaching 
 Results monitoring (outcomes, impacts): change in the percentage of 
children who cannot read; change in teacher attendance
Step 7: Using evaluation information 
Monitoring does not provide information on attribution and causality. Information 
from evaluation can be used to: 
 Determine whether the right things are being done 
 Select among competing strategies by comparing results – are there 
better ways of doing things? 
 Build consensus on scale-up 
 Investigate why something did not work – scope for in-depth 
analysis 
 Weigh costs relative to benefits and help allocate limited 
resources
Steps 8–10: Reporting findings, using results, and sustaining the M&E system 
 Reporting findings: decide what findings are reported to whom, in what format, 
and at what intervals. A good M&E system provides early warning of 
problems or inconsistencies, as well as a vehicle for demonstrating the 
value of an intervention – so do not hide poor results. 
 Using results: recognize both internal and external uses of your results. 
 Sustaining the M&E system: some ways of doing this are generating 
demand, assigning responsibilities, increasing capacity, and gathering 
trustworthy data.
THANK YOU

Proudly South Africa powerpoint Thorisha.pptx
 
Earth Day Presentation wow hello nice great
Earth Day Presentation wow hello nice greatEarth Day Presentation wow hello nice great
Earth Day Presentation wow hello nice great
 

Importance of M&E

  • 1. Importance of Monitoring and Evaluation
  • 2. Lecture Overview  Monitoring and Evaluation  How to Build an M&E System
  • 3. What is M, what is E, why and how to monitor MONITORING & EVALUATION
  • 4. What is Monitoring  Ongoing process that generates information to inform decisions about the program while it is being implemented.  Routine collection and analysis of information to track progress against set plans and check compliance with established standards  Helps identify trends & patterns, adapt strategies, and inform decisions  Key words: • Continuous – ongoing, frequent in nature • Collecting and analyzing information – to measure progress towards goals • Comparing results – assessing the performance of a program/project
  • 5. Why is Monitoring Important?  Evidence of how much has been or has NOT been achieved • Quantitative: numbers, percentage • Qualitative: narrative or observation  Examination of trends  Highlight problems  Early warning signs  Corrective actions  Evaluate effectiveness of management action  Determine achievement of results
  • 6. What is Evaluation  Evaluation is an assessment of an intervention to determine its relevance, efficiency, effectiveness, impact, and sustainability. The intention is to provide information that is credible and useful, enabling incorporation of lessons learned into decision-making processes.  Key Words: • Assessment – of the value of an event or action • Relevance • Efficiency • Effectiveness • Impact • Sustainability • Lessons learned
  • 7. What is Evaluation? Evaluation Program Evaluation Impact Evaluation
  • 8. What is M and what is E? Monitoring Measures progress towards goals, but doesn't tell us the extent to which results were achieved, or the impact Continuous, frequent Has to take place during the intervention Evaluation Measures whether progress towards the goal is caused by the intervention – causality Infrequent, time bound Can evaluate an ongoing or completed intervention
  • 9. Monitoring and Evaluation Evaluation Program Evaluation Impact Evaluation Monitoring
  • 10. Components of Program Evaluation  What are the characteristics of the target population? What are the risks and opportunities? What programs are most suitable?  What is the logical chain connecting our program to the desired results?  Is the program being rolled out as planned? Is there high uptake among clients? What do they think of it?  What was the impact and the magnitude of the program?  Given the magnitude of impact and cost, how efficient is the program? Needs assessment Program theory assessment Monitoring and process evaluation Impact evaluation Cost effectiveness Are your questions connected to decision-making?
  • 11. Evaluation Programme Evaluation Impact Evaluation Program Evaluation
  • 12. Who is this Evaluation For?  Academics  Donors • Their Constituents  Politicians / policymakers  Technocrats  Implementers  Proponents, Skeptics  Beneficiaries
  • 13. How can Impact Evaluation Help Us?  Answers the following questions • What works best, why, and when? • How can we scale up what works?  Surprisingly little hard evidence on what works  We can do more with a given budget with better evidence  If people knew money was going to programmes that worked, it could help increase the pot for anti-poverty programmes
  • 14. Programs and their Evaluations: Where do we Start? Intervention  Start with a problem  Verify that the problem actually exists  Generate a theory of why the problem exists  Design the program  Think about whether the solution is cost effective Program Evaluation  Start with a question  Verify the question hasn’t been answered  State a hypothesis  Design the evaluation  Determine whether the value of the answer is worth the cost of the evaluation
  • 15. Endline Evaluation Life Cycle of a Program Baseline Evaluation Change or improvement Distributing reading materials and training volunteers • Reading materials delivered • Volunteers trained • Target children are reached • Classes are run, volunteers show up • Attendance in classes • Entire district is covered • Refresher training of teachers • Tracking the target children, convincing parents to send their child • Incentives to volunteer to run classes daily and efficiently (motivation) • Efforts made for children to attend regularly • Improve coverage Theory of Change/ Needs Assessment Designing the program to implement Background preparation, logistics, roll out of program Monitoring implementation • Process evaluation • Progress towards target Planning for continuous improvement Reporting findings - impact, process evaluation findings Using the findings to improve program model and delivery
  • 16. Program Theory – a Snap Shot Impacts Outcomes Outputs Activities Inputs Results Implementation
  • 17. With a Focus on measuring both implementation and results? HOW TO BUILD AN M&E SYSTEM
  • 18. Methods of Monitoring  First-hand information  Citizens reporting  Surveys  Formal reports by project/programme staff • Project status report • Project schedule chart • Project financial status report  Informal reports  Graphic presentations
  • 19. Monitoring: Questions  Is the intervention implemented as designed? Does the program perform? Inputs Outputs Outcomes Implementation Plans and targets  Are intervention money, staff, and other inputs available and put to use as planned? Are inputs used effectively?  Are the services being delivered as planned?  Is the intervention reaching the right population and target numbers?  Is the target population satisfied with services? Are they utilizing the services?  What is the intensity of the treatment?
  • 20. Implementing Monitoring  Develop a monitoring plan • How should implementation be carried out? What is going to be changed? • Are the staff’s incentives aligned with project? Can they be incentivized to follow the implementation protocol? • How will you train staff? How will they interact with beneficiaries or other stakeholders? • What supplies or tools can you give your staff to make following the implementation design easier? • What can you do to monitor? (Field visits, tracking forms, administrative data, etc.) • Intensity of monitoring (frequency, resources required,…)?
  • 21. Ten Steps to a Results-based Monitoring and Evaluation System 1. Conducting a readiness and needs assessment 2. Agreeing on outcomes to monitor and evaluate 3. Selecting key indicators to monitor outcomes 4. Gathering baseline data on indicators 5. Planning for improvement – selecting realistic targets 6. Monitoring for results 7. Using evaluation information 8. Reporting findings 9. Using findings 10. Sustaining the M&E system within the organization
  • 22. Conducting a needs and readiness assessment 1 2 3 4 5 6 7 8 9 10  What are the current systems that exist?  What is the need for the monitoring and evaluation?  Who will benefit from this system?  At what levels will the data be used?  Do we have the organizational willingness and capacity to establish the M&E system?  Who has the skills to design and build the M&E system? Who will manage it?  What are the barriers to implementing the M&E system on the ground (resource crunch)?  How will you overcome these barriers?  Will there be pilot programs that can be evaluated within the M&E system? - DO WE GO AHEAD?
  • 23. Agreeing on outcomes (to monitor and evaluate) 1 2 3 4 5 6 7 8 9 10  What are we trying to achieve? What is the vision that our M&E system will help us achieve?  Are there national or sectoral goals (commitment to achieving the MDGs)?  Political/donor driven interest in goals?  In other words, what are our Outcomes: Improving coverage, learning outcomes… broader than focusing on merely inputs and activities
  • 24. Selecting key indicators to monitor outcomes 1 2 3 4 5 6 7 8 9 10  Identify WHAT needs to get measured so that we know we have achieved our results  Avoid broad-based results; assess based on feasibility, time, cost, and relevance  Indicator development is a core activity in building an M&E system and drives all subsequent data collection, analysis, and reporting  Arriving at indicators will take some time  Identify plans for data collection, analysis, and reporting PILOT! PILOT! PILOT!
  • 25. Gathering baseline data on indicators 1 2 3 4 5 6 7 8 9 10  Where are we today?  What is the performance of indicators today?  Sources of baseline information: Primary or Secondary data  Data types: Qualitative or Quantitative  Data collection instruments
  • 26. Planning for improvement – selecting realistic targets 1 2 3 4 5 6 7 8 9 10  Targets – quantifiable levels of the indicators  Sequential, feasible and measurable targets  If we reach our sequential set of targets, then we will reach our outcomes!  Time bound – universal enrolment by 2015 (outcome – better economic opportunities), every child immunized by 2013 (outcome – reduction in infant mortality), etc.  Funding and resources available to be taken into account Target 1 Target 2 Target 3 Outcomes
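The idea of sequential, time-bound targets can be sketched in code. This is a minimal illustration only, not part of the handbook: the `Target` class, the enrolment figures, and the `on_track` rule (compare the latest measured indicator level against every target whose deadline has passed) are all hypothetical simplifications.

```python
from dataclasses import dataclass

@dataclass
class Target:
    name: str
    deadline: int          # year by which the target should be met
    required_level: float  # required indicator level, e.g. enrolment rate

# Hypothetical sequence of targets leading to the outcome
# "universal enrolment by 2015".
targets = [
    Target("Half of children enrolled", 2011, 0.50),
    Target("Three quarters enrolled", 2013, 0.75),
    Target("Universal enrolment", 2015, 1.00),
]

def on_track(measured_level: float, year: int) -> bool:
    """The program is on track if the latest measured indicator level
    meets every target whose deadline has already passed."""
    due = [t for t in targets if t.deadline <= year]
    return all(measured_level >= t.required_level for t in due)

print(on_track(0.60, 2012))  # only the 2011 target is due: True
print(on_track(0.60, 2014))  # the 2013 target (0.75) was missed: False
```

A real system would keep the full history of indicator measurements rather than a single latest value; the point here is only that targets are quantified, sequenced, and checked against dates.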
  • 27. Monitoring for implementation and results 1 2 3 4 5 6 7 8 9 10 Impacts Outcomes Outputs Activities Inputs Results Implementation Results monitoring Implementation monitoring Change in the percentage of children who cannot read; change in teacher attendance Provision of materials; training of volunteers; usage of materials; number of volunteers teaching
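The split between implementation monitoring and results monitoring can likewise be sketched as two sets of indicators. The record below is hypothetical, loosely modeled on the reading-program example on the slide; the indicator names and numbers are invented for illustration.

```python
# Implementation monitoring: inputs, activities, and outputs
# (counts of things delivered and done).
implementation_indicators = {
    "reading_materials_delivered": 1200,  # output
    "volunteers_trained": 85,             # activity completed
    "classes_running": 78,                # output
}

# Results monitoring: outcome indicators tracked against a baseline.
results_indicators = {
    "share_children_cannot_read": {"baseline": 0.42, "current": 0.35},
    "teacher_attendance":         {"baseline": 0.61, "current": 0.70},
}

def change_from_baseline(indicator: dict) -> float:
    """Movement of an outcome indicator since the baseline survey."""
    return indicator["current"] - indicator["baseline"]

for name, indicator in results_indicators.items():
    print(name, round(change_from_baseline(indicator), 2))
```

The design point, per the slide, is that implementation indicators tell you whether the program is being delivered, while results indicators tell you whether the outcomes it targets are moving.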
  • 28. Evaluation(?), Using Evaluation Information 1 2 3 4 5 6 7 8 9 10 Monitoring does not provide information on attribution and causality. Information gained through evaluation can be useful to  Help determine whether the right things are being done  Help select among competing strategies by comparing results – are there better ways of doing things?  Help build consensus on scale-up  Investigate why something did not work – scope for in-depth analysis  Evaluate the costs relative to benefits and help allocate limited resources
  • 29. Reporting Findings, Using Results, Sustaining the M&E System 1 2 3 4 5 6 7 8 9 10  Reporting Findings: what findings are reported to whom, in what format, and at what intervals. A good M&E system should provide an early warning system to detect problems or inconsistencies, as well as serve as a vehicle for demonstrating the value of an intervention – so do not hide poor results.  Using Results: recognize both internal and external uses of your results.  Sustaining the M&E System: some ways of doing this are generating demand, assigning responsibilities, increasing capacity, and gathering trustworthy data.

Editor's Notes

  1. Evaluation is an assessment of a program that looks at its RELEVANCE (upyukta): is the program appropriate – will it actually be useful in the country context and for the people? (the water is already pure, but I hand out chlorine tablets) EFFICIENCY (saksham): how economically are resources being used? (the cost is more than the benefit) EFFECTIVENESS: how much effect did it have – were the objectives achieved or not? IMPACT: long-term effects, intended or unintended SUSTAINABILITY (nirantarta): will the program's benefits continue?
  2. First let’s narrow down our definition of Evaluation Evaluation is a very big term and could mean many things… In general, we’ll be talking about program evaluation So that means, not the type of evaluation that’s more administrative in nature… Performance evaluations, audits, etc… Unless those are part of a new policy or program that we wish to evaluate… Programs are still a general term Could include Policies, or more generally, “interventions” What distinguishes impact evaluation? What makes “randomized evaluation” distinct? Where does monitoring fit in? The quiz you took at the beginning… that’s part of an evaluation. You’ll take one at the end as well. And you’ll also give us some course feedback something we’ll look at after this whole course is done and use it to make design and implementation changes to the course But it’s not something we consider part of the course design itself. It’s not really meant to serve as a pedagogical device. The clickers. That’s more part of monitoring. It’s specifically designed as part of the pedagogy. It gives us instant feedback based on which we make mid-course adjustments, corrections, etc. Part of the pedagogy is that we have a specific decision tree that the implementers (in this case, lecturers) use based on the results of the survey.
  3. Progress (Hindi: vikaas); Extent (Hindi: had) Building an evaluation system allows for: • a more in-depth study of results-based outcomes and impacts • bringing in other data sources than just extant indicators • addressing factors that are too difficult or expensive to continuously monitor • tackling the issue of why and how the trends being tracked with monitoring data are moving in the directions they are (perhaps most important).
  6. Who is your audience? And what questions are they asking? Academics: we have quite a few academics in the audience. Beneficiaries: This may be slightly different from “who are your stakeholders”? This effects the type of evaluation you do, but also the questions you seek to answer.
  7. This question is larger than a question of aid Aid accounts for less than 10% of development spending. Governments have their own budgets, their own programmes.
  8. Before thinking about evaluations, we should think about what it is that we’re evaluating… Here I’ll generically call it, an “intervention” You’d be surprised how many policies are implemented that address non-existent problems. One of our evaluations in India is of a policy called continuous and comprehensive evaluation… Now let’s ask a very specific question…
  9. Ignore blue boxes
  10. If this is our program theory (anumaan – assumption; parikalpna – hypothesis), then it has two parts. The first is where the program is being implemented and the second is where the results have started to show.
  11. How do we build an M&E system that measures both how the program is implemented and the results it produces?
  12. How the World Bank says we can do it – a handbook for development practitioners
  13. Give examples
  14. Outcome (Hindi: parinaam) MDG example – the eight international development goals that all UN member states aimed to achieve by the year 2015, e.g. eradicate extreme hunger and poverty; achieve universal primary education
  15. Indicator (Hindi: suchak, nideshak); Relevant (Hindi: uchit)
  16. Instruments (Hindi: jariya)
  17. Measure (Hindi: naap); Feasible (Hindi: sambhav) Another example – I want higher literacy levels: 1) 50% of children in school; 2) 100% of children in school; 3) teacher attendance increases; 4) children's scores improve
  18. Implementation monitoring – data collected on inputs, activities and immediate outputs; information on administrative, implementation, management issues Results monitoring – sets indicators for outcomes and collects data on outcomes; systematic reporting on progress towards outcomes
  19. Evaluation – establishing causality (Hindi: kaaranta sthapit karna); could be expensive but very useful in certain cases