Program Evaluation: 
Using evaluation data to set direction, 
expand impact, and maintain 
accountability 
October 21, 2014 
Presented by: 
Isaac D. Castillo 
Director of Data and Evaluation 
DC Promise Neighborhood Initiative 
Twitter: @isaac_outcomes
Today’s Agenda 
Why should you care about program 
evaluation? 
What is program evaluation? 
How can your organization successfully 
conduct program evaluation work? 
Isaac Castillo - @isaac_outcomes – DC Promise Neighborhood Initiative – October 21, 2014
Why Should You Care About 
Program Evaluation? 
LAYC domestic violence story 
LeapOfReason.org 
First Do No Harm…Then Do More Good 
New domestic violence program 
component designed to teach three 
things: 
o Partner violence is not an OK expression of love 
o Partner violence is not OK in Latino culture 
o There are safe ways to get out of violent 
relationships 
Why Should You Care About 
Program Evaluation? 
No human being is perfect. 
Staff will make mistakes 
Organizations will make mistakes 
Services will be delivered poorly 
Despite the best of intentions, some people 
will be harmed. 
How do you know you are not harming 
people with your services? 
What is Program Evaluation? 
Process to determine if your program / 
intervention / approach is effective. 
Need to define what ‘success’ is for your 
program. 
Program evaluation does NOT need to be 
done by specialists or outsiders – but those 
people do add credibility and rigor (in most 
cases) 
The Basics of Program 
Evaluation – 
An Example 
The concept of dieting – if you understand 
dieting, you understand the basics of program 
evaluation. 
What is the goal of dieting (how do you 
define dieting ‘success’)? 
How do you know if your diet ‘works’? 
Data and Dieting 
Person weighs 200 pounds (90 kilograms) 
• Does that data point alone tell us anything? 
• Context Matters – what if the person is 4 feet tall and 10 years old? 
• Timing Matters – is this at the beginning, end, or middle of the diet? 
Could Be About More Than 
Weight 
• Other things that could be measured: 
• Body Mass Index (BMI) 
• Physical fitness 
• Blood measures (cholesterol levels) 
• Own perceptions of health / feeling 
• Appearance / muscle tone 
Five Evaluation Concepts to 
be Covered 
Types and timing of evaluation 
Who or what will you evaluate, and how will 
they be selected? 
Quantitative, Qualitative, and Mixed Methods 
approaches 
How detailed or rigorous does it need to be? 
Who does the work – internal or external? 
Types of Evaluation 
What do I mean by ‘type’? 
Really it is about timing - When do you 
collect data? 
What will you compare your data to? 
Much of this discussion relies on: 
o Costs 
o Availability of potential comparison data 
o What you are trying to learn 
Traditional (Time Series) 
Most common type of program evaluation. 
Looks at whether things have changed over time: 
what was the situation before the program, and what 
was the situation after the program? 
Before Program → Program Delivered → After Program 
Must measure same things, in same ways, at both 
points in time. 
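The before/after logic above can be sketched in a few lines. This is a minimal illustration, not a full evaluation; all scores are hypothetical example values on a single outcome measure:

```python
# Minimal sketch of a time-series (pre/post) comparison.
# Scores are hypothetical example values, collected from the same
# participants, on the same measure, at both points in time.

pre_scores = [55, 60, 48, 70, 62]   # before the program
post_scores = [65, 72, 55, 75, 70]  # after the program

# Change per participant (post minus pre), then the average change
changes = [post - pre for pre, post in zip(pre_scores, post_scores)]
mean_change = sum(changes) / len(changes)

print(f"Mean change: {mean_change:+.1f}")  # prints "Mean change: +8.4"
```

A real evaluation would also ask whether a change this size could have happened by chance; this sketch only shows the basic before/after arithmetic.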
Comparison Group 
A time series study that compares to another group 
(that does not receive programming). 
Program group: Before Program → Program Delivered → After Program 
Comparison group: Before Program → No (or minimal) programming → After Program 
More rigorous, but more challenging. 
Who / What Will You 
Evaluate? 
Need to define the population that will be 
evaluated. 
Need to define ‘success measures’ 
(outcomes) – what are you trying to achieve? 
Once these questions are answered, then 
need to consider which participants will be 
part of the evaluation (and maybe who gets 
programming). 
In Time Series, This is Simple 
Usually you just serve and evaluate those who 
enroll in the program: 
Self-selection → Before Program → Program Delivered → After Program 
First come, first served is frequently used when 
there are too many potential participants. 
Comparison Groups Are More 
Complicated 
• Can select by randomizing participants into 
groups: 
Random Selection → Before Program → Program Delivered → After Program 
Random Selection → Before Program → No (or minimal) programming → After Program 
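Random assignment like the flow above can be sketched as a simple shuffle-and-split. The participant IDs, the fixed seed, and the 50/50 split are hypothetical choices for illustration:

```python
import random

# Minimal sketch of randomizing participants into program and
# comparison groups. IDs are hypothetical; a fixed seed keeps the
# split reproducible across runs.
participants = ["p01", "p02", "p03", "p04", "p05", "p06", "p07", "p08"]

rng = random.Random(42)
shuffled = participants[:]      # copy so the original roster is untouched
rng.shuffle(shuffled)

midpoint = len(shuffled) // 2
program_group = shuffled[:midpoint]      # receives programming
comparison_group = shuffled[midpoint:]   # no (or minimal) programming

print("Program group:", sorted(program_group))
print("Comparison group:", sorted(comparison_group))
```

Because assignment is random, differences between the two groups at the "after" measurement can more credibly be attributed to the program itself.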
Compare across high/low 
dosage 
• Can use self-selection: 
Self-selection → High Attendance: Before Program → Program Delivered → After Program 
Self-selection → Low Attendance: Before Program → No (or minimal) programming → After Program 
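A dosage comparison like the one above can be sketched by splitting participants on an attendance cutoff. The records and the 0.75 attendance threshold are hypothetical example values:

```python
# Minimal sketch of comparing outcomes across high and low attendance
# (dosage). Records and the 0.75 attendance cutoff are hypothetical.
records = [
    {"id": "p01", "attendance": 0.90, "post_score": 78},
    {"id": "p02", "attendance": 0.40, "post_score": 61},
    {"id": "p03", "attendance": 0.85, "post_score": 74},
    {"id": "p04", "attendance": 0.30, "post_score": 58},
]

def mean_score(group):
    return sum(r["post_score"] for r in group) / len(group)

high = [r for r in records if r["attendance"] >= 0.75]
low = [r for r in records if r["attendance"] < 0.75]

print(f"High attendance mean: {mean_score(high):.1f}")  # 76.0
print(f"Low attendance mean: {mean_score(low):.1f}")    # 59.5
```

Because attendance is self-selected, a gap between the groups suggests, but does not prove, that more programming caused better outcomes.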
Quantitative, Qualitative, and 
Mixed Methods Evaluation 
Quantitative = more numerical information. 
Qualitative = less numerical information. 
Neither is better than the other; they are just 
different types of information. Both can be high 
quality – both can be poor. 
Most modern program evaluation is Mixed 
Methods – both quantitative and qualitative to 
varying degrees. 
What does Mixed Methods 
Evaluation Look Like? 
Quantitative 
•Survey responses (that can be 
quantified) 
•Numerical data (test scores, 
report cards, medical data, 
etc.) 
•External data (collected by 
others) 
Qualitative 
•Interviews with participants 
•Focus groups with 
participants 
•Interviews with staff 
•Interviews or focus groups 
with key stakeholders 
•Process / fidelity study 
•Open ended survey responses 
Where Does This Get Tricky? 
Need to focus on outcomes – changes in 
knowledge, attitudes, behavior or conditions. 
Satisfaction surveys do not equal evaluation. 
Just because someone liked the program does not 
mean it led to successful outcomes. 
Some things can be either qualitative or 
quantitative depending on who you ask. 
How Detailed or Rigorous Does 
the Evaluation Need to Be? 
What do you want to do with the results? 
Prove to yourself the program works? 
Use the results to market/fundraise? 
Publish the results through your own materials? 
Publish the results in peer-reviewed journals? 
How ‘certain’ do you want to be about the results? 
Are you fine with some doubt? 
Will you be comfortable answering concerns and criticisms? 
Are you willing to live with negative results? 
Using Evaluation Data to 
Inform Change 
Changes in program type / delivery 
(this parenting program isn’t working, time for a different program) 
Changes in dosage 
(classes are offered once a month, increase classes to once a week) 
Changes in measurement tools or approaches 
(this survey question is flawed, let’s find a better one) 
Changes in staff training 
(staff do not seem to know how to deliver this program – time for 
training) 
Changes in organizational culture 
(no one is taking this approach seriously – time for larger 
conversation) 
Communicating Evaluation 
Results 
Be honest about your data and the limitations of your 
method. 
What you omit is as telling as what you communicate. 
When communicating negative / ‘bad’ results, follow 
this formula: 
Finding / Result + Theory + New Solution 
Use different formats to communicate the information. 
Celebrate the successes, and identify areas for 
improvement. 
No need to share everything in detail – but have it ready 
if someone requests it. 
Balancing Cost and Rigor 
Program evaluation does have a resource 
cost – but so does everything else. 
Simple internal evaluation / performance 
management can be done at low cost. 
However, the larger picture requires more 
rigorous (and more expensive) evaluation. 
Start small and focus on 2-3 outcomes – then 
expand over time. 
Internal vs. External 
Evaluation 
Who does design of evaluation? 
Who does selection / creation of data 
collection tools? 
Who does actual data collection? 
Who does the analysis of data? 
Who creates the reports / charts / 
publications? 
Resources - Books 
The Nonprofit Outcomes Toolbox 
By Robert M. Penna 
Handbook of Practical Program Evaluation 
By Joseph S. Wholey, Harry P. Hatry, and Kathryn E. 
Newcomer - editors 
Performance Measurement: Getting Results 
By Harry P. Hatry 
Resources - Articles 
First Do No Harm…Then Do More Good 
By Isaac Castillo 
http://tinyurl.com/isaacLOR 
Good Stories Aren’t Enough 
By Martha A. Miles 
http://tinyurl.com/milesgoodstories 
Yes We Can! Performance Management in 
Nonprofit Human Services 
By David E.K. Hunter 
http://tinyurl.com/hunteryeswecan 
Resources - Links 
PerformWell 
www.performwell.org 
Performance Management and Evaluation: 
Two Sides of the Same Coin 
By Isaac Castillo and Ann Emery 
https://www.youtube.com/watch?v=nC7AG8XxrI4 
Leap of Reason 
http://www.leapofreason.org 
Isaac’s Contact Information 
Isaac D. Castillo 
Director of Data and Evaluation 
DC Promise Neighborhood Initiative 
On Twitter: @Isaac_outcomes 
Email: Isaac.Castillo@dcpni.org 
October 21, 2014 

Weitere ähnliche Inhalte

Was ist angesagt?

Evergreen and Growth: Sustainable Content Strategy for Social Media Managers
Evergreen and Growth: Sustainable Content Strategy for Social Media ManagersEvergreen and Growth: Sustainable Content Strategy for Social Media Managers
Evergreen and Growth: Sustainable Content Strategy for Social Media ManagersMa'ayan Plaut
 
Number Stories: Win Friends and Influence HiPPOs with an Effective Measuremen...
Number Stories: Win Friends and Influence HiPPOs with an Effective Measuremen...Number Stories: Win Friends and Influence HiPPOs with an Effective Measuremen...
Number Stories: Win Friends and Influence HiPPOs with an Effective Measuremen...Michael Powers
 
Vjusticeforever social plan
Vjusticeforever social planVjusticeforever social plan
Vjusticeforever social planKieran Bailey
 
Trends on Pinterest
Trends on PinterestTrends on Pinterest
Trends on PinterestJune Andrews
 
Growth, Engagement & Search Metrics: Snake Oil or North Stars
Growth, Engagement & Search Metrics: Snake Oil or North StarsGrowth, Engagement & Search Metrics: Snake Oil or North Stars
Growth, Engagement & Search Metrics: Snake Oil or North StarsJune Andrews
 
Predictive Analytics & Business Insights
Predictive Analytics & Business InsightsPredictive Analytics & Business Insights
Predictive Analytics & Business InsightsJune Andrews
 
VMCS14 REanalyze: What is your EVP Data Saying?
VMCS14 REanalyze: What is your EVP Data Saying?VMCS14 REanalyze: What is your EVP Data Saying?
VMCS14 REanalyze: What is your EVP Data Saying?VolunteerMatch
 

Was ist angesagt? (9)

Evergreen and Growth: Sustainable Content Strategy for Social Media Managers
Evergreen and Growth: Sustainable Content Strategy for Social Media ManagersEvergreen and Growth: Sustainable Content Strategy for Social Media Managers
Evergreen and Growth: Sustainable Content Strategy for Social Media Managers
 
Number Stories: Win Friends and Influence HiPPOs with an Effective Measuremen...
Number Stories: Win Friends and Influence HiPPOs with an Effective Measuremen...Number Stories: Win Friends and Influence HiPPOs with an Effective Measuremen...
Number Stories: Win Friends and Influence HiPPOs with an Effective Measuremen...
 
Vjusticeforever social plan
Vjusticeforever social planVjusticeforever social plan
Vjusticeforever social plan
 
Trends on Pinterest
Trends on PinterestTrends on Pinterest
Trends on Pinterest
 
Math in data
Math in dataMath in data
Math in data
 
Growth, Engagement & Search Metrics: Snake Oil or North Stars
Growth, Engagement & Search Metrics: Snake Oil or North StarsGrowth, Engagement & Search Metrics: Snake Oil or North Stars
Growth, Engagement & Search Metrics: Snake Oil or North Stars
 
Predictive Analytics & Business Insights
Predictive Analytics & Business InsightsPredictive Analytics & Business Insights
Predictive Analytics & Business Insights
 
VMCS14 REanalyze: What is your EVP Data Saying?
VMCS14 REanalyze: What is your EVP Data Saying?VMCS14 REanalyze: What is your EVP Data Saying?
VMCS14 REanalyze: What is your EVP Data Saying?
 
Team icon
Team iconTeam icon
Team icon
 

Ähnlich wie Program Evaluation Basics - Center for Nonprofit Success slides

Week 7: Missions and Measures
Week 7: Missions and MeasuresWeek 7: Missions and Measures
Week 7: Missions and MeasuresGabrielle Lyon
 
Communicating the ROI of UX from The Enterprise to The Streets (JD Buckley at...
Communicating the ROI of UX from The Enterprise to The Streets (JD Buckley at...Communicating the ROI of UX from The Enterprise to The Streets (JD Buckley at...
Communicating the ROI of UX from The Enterprise to The Streets (JD Buckley at...Rosenfeld Media
 
Evaluation and Assessment for Busy Professionals
Evaluation and Assessment for Busy ProfessionalsEvaluation and Assessment for Busy Professionals
Evaluation and Assessment for Busy ProfessionalsSara Rothschild
 
Everything You Need to Know About Strategy Deployment (Lean Methods)
Everything You Need to Know About Strategy Deployment (Lean Methods)Everything You Need to Know About Strategy Deployment (Lean Methods)
Everything You Need to Know About Strategy Deployment (Lean Methods)KaiNexus
 
How to Build a Culture of Analytics
How to Build a Culture of AnalyticsHow to Build a Culture of Analytics
How to Build a Culture of AnalyticsBadgeville, Inc.
 
Turning Data into Infographics: An Interactive Workshop for Problem Solvers
Turning Data into Infographics: An Interactive Workshop for Problem SolversTurning Data into Infographics: An Interactive Workshop for Problem Solvers
Turning Data into Infographics: An Interactive Workshop for Problem SolversUNCResearchHub
 
Storytelling with Data (Global Engagement Summit at Northwestern University 2...
Storytelling with Data (Global Engagement Summit at Northwestern University 2...Storytelling with Data (Global Engagement Summit at Northwestern University 2...
Storytelling with Data (Global Engagement Summit at Northwestern University 2...Sara Hooker
 
Marketing Matters: A Realistic Approach
Marketing Matters: A Realistic ApproachMarketing Matters: A Realistic Approach
Marketing Matters: A Realistic ApproachKristy Black
 
#15NTC NTEN Strategy, IT\Mission Alignment and Outcomes (SIMO) Presentation (...
#15NTC NTEN Strategy, IT\Mission Alignment and Outcomes (SIMO) Presentation (...#15NTC NTEN Strategy, IT\Mission Alignment and Outcomes (SIMO) Presentation (...
#15NTC NTEN Strategy, IT\Mission Alignment and Outcomes (SIMO) Presentation (...Steve Heye
 
What's Next: The Value of Data
What's Next: The Value of DataWhat's Next: The Value of Data
What's Next: The Value of DataOgilvy Consulting
 
Research and Community Building with a Roadmap
Research and Community Building with a RoadmapResearch and Community Building with a Roadmap
Research and Community Building with a RoadmapQuestionPro
 
Capturing and communicating impact
Capturing and communicating impactCapturing and communicating impact
Capturing and communicating impactEva Witesman
 
Determining & Demonstrating Value with the Logic Model
Determining & Demonstrating Value with the Logic ModelDetermining & Demonstrating Value with the Logic Model
Determining & Demonstrating Value with the Logic ModelPaul Burry
 
SMPS Alaska Chapter Presentation - 8-27-13
SMPS Alaska Chapter Presentation - 8-27-13SMPS Alaska Chapter Presentation - 8-27-13
SMPS Alaska Chapter Presentation - 8-27-13Kathy Day
 
Pallid Sturgeon Research Project
Pallid Sturgeon Research ProjectPallid Sturgeon Research Project
Pallid Sturgeon Research ProjectBrenda Zerr
 
Plan and (HYPOTHETICALLY) evaluate a public health intervention ut.docx
Plan and (HYPOTHETICALLY) evaluate a public health intervention ut.docxPlan and (HYPOTHETICALLY) evaluate a public health intervention ut.docx
Plan and (HYPOTHETICALLY) evaluate a public health intervention ut.docxajoy21
 
Quality Assurance_Final
Quality Assurance_FinalQuality Assurance_Final
Quality Assurance_Finalkristin kipp
 
Data Analytics: Better Decision, Better Business
Data Analytics: Better Decision, Better BusinessData Analytics: Better Decision, Better Business
Data Analytics: Better Decision, Better BusinessMcKonly & Asbury, LLP
 

Ähnlich wie Program Evaluation Basics - Center for Nonprofit Success slides (20)

Week 7: Missions and Measures
Week 7: Missions and MeasuresWeek 7: Missions and Measures
Week 7: Missions and Measures
 
Communicating the ROI of UX from The Enterprise to The Streets (JD Buckley at...
Communicating the ROI of UX from The Enterprise to The Streets (JD Buckley at...Communicating the ROI of UX from The Enterprise to The Streets (JD Buckley at...
Communicating the ROI of UX from The Enterprise to The Streets (JD Buckley at...
 
Evaluation and Assessment for Busy Professionals
Evaluation and Assessment for Busy ProfessionalsEvaluation and Assessment for Busy Professionals
Evaluation and Assessment for Busy Professionals
 
Everything You Need to Know About Strategy Deployment (Lean Methods)
Everything You Need to Know About Strategy Deployment (Lean Methods)Everything You Need to Know About Strategy Deployment (Lean Methods)
Everything You Need to Know About Strategy Deployment (Lean Methods)
 
How to Build a Culture of Analytics
How to Build a Culture of AnalyticsHow to Build a Culture of Analytics
How to Build a Culture of Analytics
 
Turning Data into Infographics: An Interactive Workshop for Problem Solvers
Turning Data into Infographics: An Interactive Workshop for Problem SolversTurning Data into Infographics: An Interactive Workshop for Problem Solvers
Turning Data into Infographics: An Interactive Workshop for Problem Solvers
 
1325 keynote singh
1325 keynote singh1325 keynote singh
1325 keynote singh
 
Storytelling with Data (Global Engagement Summit at Northwestern University 2...
Storytelling with Data (Global Engagement Summit at Northwestern University 2...Storytelling with Data (Global Engagement Summit at Northwestern University 2...
Storytelling with Data (Global Engagement Summit at Northwestern University 2...
 
Marketing Matters: A Realistic Approach
Marketing Matters: A Realistic ApproachMarketing Matters: A Realistic Approach
Marketing Matters: A Realistic Approach
 
#15NTC NTEN Strategy, IT\Mission Alignment and Outcomes (SIMO) Presentation (...
#15NTC NTEN Strategy, IT\Mission Alignment and Outcomes (SIMO) Presentation (...#15NTC NTEN Strategy, IT\Mission Alignment and Outcomes (SIMO) Presentation (...
#15NTC NTEN Strategy, IT\Mission Alignment and Outcomes (SIMO) Presentation (...
 
What's Next: The Value of Data
What's Next: The Value of DataWhat's Next: The Value of Data
What's Next: The Value of Data
 
Research and Community Building with a Roadmap
Research and Community Building with a RoadmapResearch and Community Building with a Roadmap
Research and Community Building with a Roadmap
 
How to Land the Grant
How to Land the GrantHow to Land the Grant
How to Land the Grant
 
Capturing and communicating impact
Capturing and communicating impactCapturing and communicating impact
Capturing and communicating impact
 
Determining & Demonstrating Value with the Logic Model
Determining & Demonstrating Value with the Logic ModelDetermining & Demonstrating Value with the Logic Model
Determining & Demonstrating Value with the Logic Model
 
SMPS Alaska Chapter Presentation - 8-27-13
SMPS Alaska Chapter Presentation - 8-27-13SMPS Alaska Chapter Presentation - 8-27-13
SMPS Alaska Chapter Presentation - 8-27-13
 
Pallid Sturgeon Research Project
Pallid Sturgeon Research ProjectPallid Sturgeon Research Project
Pallid Sturgeon Research Project
 
Plan and (HYPOTHETICALLY) evaluate a public health intervention ut.docx
Plan and (HYPOTHETICALLY) evaluate a public health intervention ut.docxPlan and (HYPOTHETICALLY) evaluate a public health intervention ut.docx
Plan and (HYPOTHETICALLY) evaluate a public health intervention ut.docx
 
Quality Assurance_Final
Quality Assurance_FinalQuality Assurance_Final
Quality Assurance_Final
 
Data Analytics: Better Decision, Better Business
Data Analytics: Better Decision, Better BusinessData Analytics: Better Decision, Better Business
Data Analytics: Better Decision, Better Business
 

Mehr von Isaac Castillo

Castillo collective impact convening rf w data aggregation
Castillo collective impact convening rf w data aggregationCastillo collective impact convening rf w data aggregation
Castillo collective impact convening rf w data aggregationIsaac Castillo
 
The Future of Logic Models: Logic Models in 3D
The Future of Logic Models: Logic Models in 3DThe Future of Logic Models: Logic Models in 3D
The Future of Logic Models: Logic Models in 3DIsaac Castillo
 
The Logic Model Repair Shop: An Introduction to 3D Logic Models.
The Logic Model Repair Shop: An Introduction to 3D Logic Models.The Logic Model Repair Shop: An Introduction to 3D Logic Models.
The Logic Model Repair Shop: An Introduction to 3D Logic Models.Isaac Castillo
 
Collective impact in 3D - Collective Impact Convening Short Talk
Collective impact in 3D - Collective Impact Convening Short TalkCollective impact in 3D - Collective Impact Convening Short Talk
Collective impact in 3D - Collective Impact Convening Short TalkIsaac Castillo
 
Evaluation Blooers and How to Make the Most of Your Mistakes
Evaluation Blooers and How to Make the Most of Your MistakesEvaluation Blooers and How to Make the Most of Your Mistakes
Evaluation Blooers and How to Make the Most of Your MistakesIsaac Castillo
 
Castillo EERS 2015 ignite presentation
Castillo EERS 2015 ignite presentationCastillo EERS 2015 ignite presentation
Castillo EERS 2015 ignite presentationIsaac Castillo
 
Castillo Measure4Change - Presenting Data to Community Residents
Castillo Measure4Change - Presenting Data to Community ResidentsCastillo Measure4Change - Presenting Data to Community Residents
Castillo Measure4Change - Presenting Data to Community ResidentsIsaac Castillo
 
Engaging Community Residents with Data
Engaging Community Residents with DataEngaging Community Residents with Data
Engaging Community Residents with DataIsaac Castillo
 
When technology hits the sidewalk empowering community residents through 21s...
When technology hits the sidewalk  empowering community residents through 21s...When technology hits the sidewalk  empowering community residents through 21s...
When technology hits the sidewalk empowering community residents through 21s...Isaac Castillo
 
Helping Families and Community Residents Use Data
Helping Families and Community Residents Use DataHelping Families and Community Residents Use Data
Helping Families and Community Residents Use DataIsaac Castillo
 
Engaging Families & Community Residents in Data-Driven Work
Engaging Families & Community Residents in Data-Driven WorkEngaging Families & Community Residents in Data-Driven Work
Engaging Families & Community Residents in Data-Driven WorkIsaac Castillo
 
A Neighborhood Survey in the Nation’s Capital: Balancing Rigor, Resources, a...
A Neighborhood Survey in the Nation’s Capital:  Balancing Rigor, Resources, a...A Neighborhood Survey in the Nation’s Capital:  Balancing Rigor, Resources, a...
A Neighborhood Survey in the Nation’s Capital: Balancing Rigor, Resources, a...Isaac Castillo
 

Mehr von Isaac Castillo (12)

Castillo collective impact convening rf w data aggregation
Castillo collective impact convening rf w data aggregationCastillo collective impact convening rf w data aggregation
Castillo collective impact convening rf w data aggregation
 
The Future of Logic Models: Logic Models in 3D
The Future of Logic Models: Logic Models in 3DThe Future of Logic Models: Logic Models in 3D
The Future of Logic Models: Logic Models in 3D
 
The Logic Model Repair Shop: An Introduction to 3D Logic Models.
The Logic Model Repair Shop: An Introduction to 3D Logic Models.The Logic Model Repair Shop: An Introduction to 3D Logic Models.
The Logic Model Repair Shop: An Introduction to 3D Logic Models.
 
Collective impact in 3D - Collective Impact Convening Short Talk
Collective impact in 3D - Collective Impact Convening Short TalkCollective impact in 3D - Collective Impact Convening Short Talk
Collective impact in 3D - Collective Impact Convening Short Talk
 
Evaluation Blooers and How to Make the Most of Your Mistakes
Evaluation Blooers and How to Make the Most of Your MistakesEvaluation Blooers and How to Make the Most of Your Mistakes
Evaluation Blooers and How to Make the Most of Your Mistakes
 
Castillo EERS 2015 ignite presentation
Castillo EERS 2015 ignite presentationCastillo EERS 2015 ignite presentation
Castillo EERS 2015 ignite presentation
 
Castillo Measure4Change - Presenting Data to Community Residents
Castillo Measure4Change - Presenting Data to Community ResidentsCastillo Measure4Change - Presenting Data to Community Residents
Castillo Measure4Change - Presenting Data to Community Residents
 
Engaging Community Residents with Data
Engaging Community Residents with DataEngaging Community Residents with Data
Engaging Community Residents with Data
 
When technology hits the sidewalk empowering community residents through 21s...
When technology hits the sidewalk  empowering community residents through 21s...When technology hits the sidewalk  empowering community residents through 21s...
When technology hits the sidewalk empowering community residents through 21s...
 
Helping Families and Community Residents Use Data
Helping Families and Community Residents Use DataHelping Families and Community Residents Use Data
Helping Families and Community Residents Use Data
 
Engaging Families & Community Residents in Data-Driven Work
Engaging Families & Community Residents in Data-Driven WorkEngaging Families & Community Residents in Data-Driven Work
Engaging Families & Community Residents in Data-Driven Work
 
A Neighborhood Survey in the Nation’s Capital: Balancing Rigor, Resources, a...
A Neighborhood Survey in the Nation’s Capital:  Balancing Rigor, Resources, a...A Neighborhood Survey in the Nation’s Capital:  Balancing Rigor, Resources, a...
A Neighborhood Survey in the Nation’s Capital: Balancing Rigor, Resources, a...
 

Program Evaluation Basics - Center for Nonprofit Success slides

  • 1. Program Evaluation: Using evaluation data to set direction, expand impact, and maintain accountability October 21, 2014 Presented by: Isaac D. Castillo Director of Data and Evaluation DC Promise Neighborhood Initiative Twitter: @isaac_outcomes
  • 2. Today’s Agenda Why should you care about program evaluation? What is program evaluation? How can your organization successfully conduct program evaluation work? Isaac Castillo - @isaac_outcomes – DC Promise Neighborhood Initiative – October 21, 2014
  • 3. Why Should You Care About Program Evaluation? LAYC domestic violence story LeapOfReason.org First Do No Harm…Then Do More Good New domestic violence program component designed to teach three things: o Partner violence is not an OK expression of love o Partner violence is not OK in Latino culture o There are safe ways to get out of violent relationships Isaac Castillo - @isaac_outcomes – DC Promise Neighborhood Initiative – October 21, 2014
  • 4. Why Should You Care About Program Evaluation? No human being is perfect. Staff will make mistakes Organizations will make mistakes Services will be delivered poorly Despite the best of intentions, some people will be harmed. How do you know you are not harming people with your services? Isaac Castillo - @isaac_outcomes – DC Promise Neighborhood Initiative – October 21, 2014
  • 5. What is Program Evaluation? Process to determine if your program / intervention / approach is effective. Need to define what ‘success’ is for your program. Program evaluation does NOT need to be done by specialists or outsiders – but those people do add credibility and rigor (in most cases) Isaac Castillo - @isaac_outcomes – DC Promise Neighborhood Initiative – October 21, 2014
  • 6. The Basics of Program Evaluation – An Example The concept of dieting – if you understand dieting, you understand the basics of program evaluation. What is the goal of dieting (how do you define dieting ‘success’)? How do you know if your diet ‘works’? Isaac Castillo - @isaac_outcomes – DC Promise Neighborhood Initiative – October 21, 2014
  • 7. Data and Dieting Person weighs 200 Person weighs 200 Pounds (90 Kilograms) (90 Kilograms) • Does that data point alone tell us anything? Pounds • Context Matters – what if person is 4 feet tall and 10 years old? • Timing Matters – is this at beginning, end, or middle of diet? Isaac Castillo - @isaac_outcomes – DC Promise Neighborhood Initiative – October 21, 2014
  • 8. Could Be About More Than Weight • Other things that could be measured: • Body Mass Index (BMI) • Physical fitness • Blood measures (cholesterol levels) • Own perceptions of health / feeling • Appearance / muscle tone Isaac Castillo - @isaac_outcomes – DC Promise Neighborhood Initiative – October 21, 2014
  • 9. Five Evaluation Concepts to be Covered Types and timing of evaluation Who or what will you evaluate, and how will they be selected? Quantitative, Qualitative, and Mixed Methods approaches How detailed or rigorous does it need to be? Who does the work – internal or external? Isaac Castillo - @isaac_outcomes – DC Promise Neighborhood Initiative – October 21, 2014
  • 10. Types of Evaluation What do I mean by ‘type’? Really it is about timing - When do you collect data? What will you compare your data to? Much of this discussion relies on: o Costs o Availability of potential comparison data o What you are trying to learn Isaac Castillo - @isaac_outcomes – DC Promise Neighborhood Initiative – October 21, 2014
  • 11. Traditional (Time Series) Most common type of program evaluation. Looking to see if things have changed over time. What was situation before program, then what was situation after program. BBeefoforree P Prrooggrraamm PPrrooggrraamm D Deelilviveerreedd AAftfeterr P Prrooggrraamm Must measure same things, in same ways, at both points in time. Isaac Castillo - @isaac_outcomes – DC Promise Neighborhood Initiative – October 21, 2014
  • 12. Comparison Group A time series study that compares to another group (that does not receive programming). Program group: Before Program → Program Delivered → After Program. Comparison group: Before Program → No (or minimal) programming → After Program. More rigorous, but more challenging.
  • 13. Who / What Will You Evaluate? Need to define the population that will be evaluated. Need to define ‘success measures’ (outcomes) – what are you trying to achieve? Once these questions are answered, then need to consider which participants will be part of the evaluation (and maybe who gets programming).
  • 14. In Time Series, This is Simple Usually just serve and evaluate those that enroll in the program (self-selection): Before Program → Program Delivered → After Program. First come, first served is frequently used if there are too many potential participants.
  • 15. Comparison Groups Are More Complicated • Can select by randomizing participants into groups: Random selection into either the program group (Before Program → Program Delivered → After Program) or the comparison group (Before Program → No (or minimal) programming → After Program).
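Random assignment into program and comparison groups can be sketched very simply. The participant IDs below are hypothetical, and a fixed seed is used only so the split is reproducible:

```python
import random

# Illustrative sketch: randomly assigning enrollees to program vs. comparison group.
# Participant IDs are hypothetical placeholders.
enrollees = ["P01", "P02", "P03", "P04", "P05", "P06", "P07", "P08"]

random.seed(42)                 # fixed seed so the split is reproducible
random.shuffle(enrollees)       # random order removes self-selection bias
midpoint = len(enrollees) // 2
program_group = enrollees[:midpoint]
comparison_group = enrollees[midpoint:]
print(len(program_group), len(comparison_group))  # → 4 4
```

The point of randomizing is that the two groups should be similar on average before the program starts, so post-program differences can more credibly be attributed to the program itself.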
  • 16. Compare Across High/Low Dosage • Can use self-selection: participants sort themselves by attendance. High-attendance group: Before Program → Program Delivered → After Program. Low-attendance group: Before Program → No (or minimal) programming → After Program.
  • 17. Quantitative, Qualitative, and Mixed Methods Evaluation Quantitative = more numerical information. Qualitative = less numerical information. One is not better than the other; they are just different types of information. Both can be high quality – both can be poor. Most modern program evaluation is Mixed Methods – both quantitative and qualitative to varying degrees.
  • 18. What does Mixed Methods Evaluation Look Like? Quantitative • Survey responses (that can be quantified) • Numerical data (test scores, report cards, medical data, etc.) • External data (collected by others) Qualitative • Interviews with participants • Focus groups with participants • Interviews with staff • Interviews or focus groups with key stakeholders • Process / fidelity study • Open ended survey responses
  • 19. Where Does This Get Tricky? Need to focus on outcomes – changes in knowledge, attitudes, behavior, or conditions. Satisfaction surveys do not equal evaluation. Just because someone liked the program does not mean the program led to successful outcomes. Some things can be either qualitative or quantitative depending on who you ask.
  • 20. How Detailed or Rigorous Does the Evaluation Need to Be? What do you want to do with the results? Prove to yourself the program works? Use the results to market/fundraise? Publish the results through your own materials? Publish the results in peer-reviewed journals? How ‘certain’ do you want to be about the results? Are you fine with some doubt? Will you be comfortable answering concerns and criticisms? Are you willing to live with negative results?
  • 21. Using Evaluation Data to Inform Change Changes in program type / delivery (this parenting program isn’t working, time for a different program) Changes in dosage (classes are offered once a month, increase classes to once a week) Changes in measurement tools or approaches (this survey question is flawed, let’s find a better one) Changes in staff training (staff do not seem to know how to deliver this program – time for training) Changes in organizational culture (no one is taking this approach seriously – time for larger conversation)
  • 22. Communicating Evaluation Results Be honest about your data and the limitations of your method. What you omit is as telling as what you communicate. When communicating negative / ‘bad’ results, follow this formula: Finding / Result + Theory + New Solution Use different formats to communicate the information. Celebrate the successes, and identify areas for improvement. No need to share everything in detail – but have it ready if someone requests it.
  • 23. Balancing Cost and Rigor Program evaluation does have a resource cost – but so does everything else. Simple internal evaluation / performance management can be done at low cost. However, the larger picture requires more rigorous (and more expensive) evaluation. Start small and focus on 2-3 outcomes – then expand over time.
  • 24. Internal vs. External Evaluation Who designs the evaluation? Who selects or creates the data collection tools? Who does the actual data collection? Who analyzes the data? Who creates the reports / charts / publications?
  • 25. Resources - Books The Nonprofit Outcomes Toolbox By Robert M. Penna Handbook of Practical Program Evaluation By Joseph S. Wholey, Harry P. Hatry, and Kathryn E. Newcomer - editors Performance Measurement: Getting Results By Harry P. Hatry
  • 26. Resources - Articles First Do No Harm…Then Do More Good By Isaac Castillo http://tinyurl.com/isaacLOR Good Stories Aren’t Enough By Martha A. Miles http://tinyurl.com/milesgoodstories Yes We Can! Performance Management in Nonprofit Human Services By David E.K. Hunter http://tinyurl.com/hunteryeswecan
  • 27. Resources - Links PerformWell www.performwell.org Performance Management and Evaluation: Two Sides of the Same Coin By Isaac Castillo and Ann Emery https://www.youtube.com/watch?v=nC7AG8XxrI4 Leap of Reason http://www.leapofreason.org
  • 28. Isaac’s Contact Information Isaac D. Castillo Director of Data and Evaluation DC Promise Neighborhood Initiative On Twitter: @Isaac_outcomes Email: Isaac.Castillo@dcpni.org October 21, 2014