This document provides an agenda and overview for a workshop on evaluation. The workshop covers introductions, the current context around social impact evaluation, and why evaluation matters for accountability, learning, and capacity building. It discusses evaluation concepts and frameworks and the evolution of evaluation thinking toward learning and improvement. The workshop teaches how to develop an evaluation plan and includes an activity for participants to apply the concepts, aiming to help participants understand how to design and implement effective evaluations.
1. ALLIES LEARNING EXCHANGE
LEARNING FROM WHAT WORKS
THE ART AND SCIENCE OF EVALUATION
MAY 7, 2010 HALIFAX
2. AGENDA
• Introductions
• Current context
• Why evaluate?
• Evaluation 101
• Evolution in evaluation thinking
• What works
• Developing an Evaluation Plan
• Evaluation in action – participant activity
3. INTRODUCTIONS
• Participant introductions
– Name, affiliation
– What you hope to get out of this session
4. CURRENT CONTEXT
"There are a number of interesting social value calculations in use throughout the sector. We also learned about their limitations in terms of data quality and comparability. We now appreciate that while each method generates actionable information for its own users, no one approach has yet emerged as the single best method. In fact, to some extent, it was the discipline and rigor of application that is the most important common ingredient among the methods. Each of the practitioners acknowledged the importance of their calculation model forcing them to make their assumptions explicit and transparent. It is only once the assumptions are laid bare, that a true debate about the merits of a program, strategy or grant relative to costs can fully be vetted and debated, even if not fully known with precision."
Gates Foundation cover letter on the report Measuring and/or Estimating Social Value Creation, December 2008
6. WHY EVALUATE AT ALL?
• Accountability
• Assessing impact – what difference did you make?
• Learning: what works, what doesn't
• Building capacity
• "The difference between what we do and what we are capable of doing would suffice to solve most of the world's problems." – Mahatma Gandhi
• Sharing/transferring knowledge
7. EVALUATION 101
Distinction:
1. Monitoring = Accountability
• Were funds used as agreed?
• Did the grantee do what they said they would?
• Mid-grant adjustments
2. Evaluation = Impact
• What changed as a result of the funding?
• What was learned about the issue/intervention?
• So what?
• Transferring/disseminating knowledge
8. EVALUATION 101
• Inter-related levels in evaluation:
• The organization: vision, mission, mandate, capacity
• The program: impact of funding
• The issue: what's different, what needs to change
9. EVOLUTIONARY THINKING IN EVALUATION
Assumption → Reality
• Purpose is to prove → Focus is to improve
• It's about the grantee → Involves many stakeholders
• Happens at the end → Starts as soon as the program is conceived
• Measures everything → Select indicators that help critical decisions
• Looking for attribution → Satisfied with contribution and learning
• Done by experts, pre-determined assumptions → Participatory, evolving process
• Rigorous methodology → Rigorous thinking
• One size fits all → Specific to organization/program age/stage
• Focus is accountability, measurement → Also looks for impact and learning
• Internally focused → Use learning and knowledge transfer to influence and inform decision-making, policy
10. EVALUATION AS A LEARNING TOOL
• Starts with organizational strategy, mission, goals
• Integrated into operational and planning cycle
• Anchor and amplify it in existing activities
• Allocate time and resources
• Encourage "evaluative thinking"
11. EVALUATION AS A LEARNING TOOL
• Mutual accountability
• Creates space for conversation
• Increases transparency, trust
• Meets broader public agenda by sharing what we learn
• Greater social impact: what works/does not work
• Increases efficiency and effectiveness
12. DEVELOPING AN EVALUATION PLAN
• Develop Logic Model/Theory of Change
• Evaluation work plan
• Identify information required: quantitative outputs and qualitative impacts
– Increase awareness/knowledge
– Change attitudes
– Change behaviours
– Increase skill levels
– Improve individual status
– Improve community status
• How/by whom will these be measured?
• Approach – participatory, developmental, formative, summative?
• Resources required (human, financial, technical)
13. DEVELOPING AN EVALUATION PLAN
• Limit your evaluation plan to the actual population/community served and the scope of activity
• Avoid things outside your control (systemic barriers or regulations, for example) unless you intend to address these as part of the program and be accountable for changing them
• Under-promise and over-deliver
• Complex grants need external evaluation help; the cost can range from 5% to 15% of total project costs
• Share the results with staff and board to learn what worked and what could be improved in future programming
• Use the results to report to your funders
14. ALLIES GENERIC COMMUNITY LOGIC MODEL
Key components:
• Research and baseline data about local immigration and employment.
• Understanding of community strengths, assets and processes related to immigrant employment (current programs, resources, leaders, etc.).
• Opportunity for engagement of all local sectors to address systems and barriers related to improving appropriate employment for skilled immigrants.
Major activities:
• Assess and adapt the core model to reflect local conditions.
• Adapt or develop guides and other resources for use by local employers.
• Establish an action group comprised of all relevant sectors to develop and implement solutions.
• Create appropriate opportunities to increase local awareness and support for the issue.
Evidence the process is working:
• Adaptation of the model to meet local conditions is accepted by stakeholders.
• New ways of doing things emerge through reflection and learning.
• Programs are developed and implemented.
• Guides and resources are well used by employers and others.
• Media coverage/public events reinforce the message.
Contributing factors to success:
• All sectors are represented and engaged in the process.
• There is significant participation of employers.
• Evidence of financial support for longer-term sustainability.
• There are current and potential leaders in the community who are available and interested in this issue.
Project outcomes:
• Increase in numbers of skilled immigrants employed in their field.
• Increased number of employers committed to improving hiring practices.
• Increased capacity of employers to learn, plan and implement strategies.
• Increased community capacity to learn, plan and implement strategies.
• Local project has "brand" recognition.
• Increased public awareness and support for appropriate hiring of skilled immigrants.
Ultimate impact:
• Improvement in local hiring and systems related to immigrant employment.
• Increased economic and social capital in the local community.
16. WHAT WORKS
• Clarity in purpose and audiences
• Clarity on theory of change, assumptions
• Supporting evaluation framework
• Selecting a few indicators that help assess progress in each area
• Focus on contribution, rather than attribution
• Balance quantitative and qualitative
• No stories without numbers and no numbers without stories
• Share learning, celebrate successes
• Tell the story as it unfolds, periodically tie themes together
17. RESOURCES
• W.K. Kellogg Foundation Logic Model Worksheet
www.wkkf.org/knowledge-center/resources/2004/05/Logic-Model-Worksheet.aspx
• Developmental evaluation: The J.W. McConnell Family Foundation
www.mcconnellfoundation.ca/en/resources
• Community Builder's Approach to Theory of Change
http://tamarackcommunity.ca/g3s4_7.html
• Measuring and/or estimating social value creation
www.gatesfoundation.org/learning/Pages/december-2008-measuring-estimating-social-value-creation-report-summary.aspx