If I Knew Then What I Know Now: Building a Successful Evaluation. Roblyn Brigham, Brigham Nahas Research Associates; Andy Hoge, New Jersey SEEDS; Janet Smith, The Steppingstone Foundation; Danielle Stein Eisenberg, KIPP Foundation. April 8, 2010
Overview and Introduction
Internal and External Evaluation
Evaluation Planning: Factors to Consider
Organizational Characteristics and Evaluation Design
Designing Evaluation: Non-negotiables
Data Collection: Capacity and Commitment
Data Analysis: Capacity and Action
Presenting Results: Know Your Audience
Test Results Over Time
Test Results Over Time
Additional Slides
The KSS core team articulated specific goals, objectives, and metrics for the event (which mapped back to the overall vision). Strand leads did the same.

Strand: Boards
- Goal: Board members should be inspired by KIPP's mission and energized to contribute as Board members.
  Objective: Board members will feel inspired to continue their work with KIPP.
  Metric: 95% of board members will indicate that they feel somewhat or very inspired to continue their work with KIPP.
  Evaluation tool: Strand Survey
- Goal: Board members should feel part of a network-wide Board community and national reform movement, rather than just a supporter of a local KIPP effort.
  Objective: Board members will feel like part of a national network.
  Metric: 90% of board members will indicate that they feel somewhat or very connected to a national community.
  Evaluation tool: Strand Survey
- Goal: Board members should learn practical skills and/or obtain tools that will enhance their Board's effectiveness.
  Objective: Board members should leave KSS with at least one tool or practical skill they can immediately put to use.
  Metric: Can name 1 tool or skill they used.
  Evaluation tool: Strand Survey and Follow-Up Survey
- Goal: Board members should learn about KIPP initiatives that are meaningful to their Board service (e.g., KIPP Share).
  Objective: Board members will leave KSS knowing about national initiatives.
  Metric: Can name 2 KIPP initiatives that are relevant to their region or school.
  Evaluation tool: Strand Survey
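Each strand metric above is a simple threshold check against survey responses. As a minimal sketch of how such a check works (the response scale and all variable names here are hypothetical illustrations, not taken from the deck), a metric like "95% of board members will indicate that they feel somewhat or very inspired" reduces to:

```python
# Hypothetical survey responses on a 4-point inspiration scale.
responses = ["very", "somewhat", "very", "not very", "somewhat",
             "very", "somewhat", "very", "very", "somewhat"]

# Count respondents meeting the "somewhat or very" bar.
inspired = sum(r in ("somewhat", "very") for r in responses)
share = inspired / len(responses)

# Compare against the 95% target from the metrics table.
print(f"{share:.0%} inspired; target met: {share >= 0.95}")
```

Under these made-up responses the result would be 90%, short of the 95% target, which is exactly the kind of gap the Strand Survey is meant to surface.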
Executive summary: KSS 2009 successfully delivered against our vision, and per-participant costs were at their lowest level ever.

Strand themes: Collective Power (introduction/reconnection); Network (share, reflect, and learn); Personal Learning; Kick off the school year with high energy and renew our collective commitment.

Please see the appendix for the data supporting the bullet points below.
The Lie Factor (The Visual Display of Quantitative Information, 2nd Ed., E.R. Tufte, 2001). Los Angeles Times, Aug. 5, 1979, p. 3.

Editor's notes

1. Evaluation helps organizations succeed in gaining funding, delivering services, and improving internal processes. Yet conducting rigorous evaluation is a challenge when resources are limited or when you are in charge of evaluation without having had evaluation training. In this workshop, four evaluators of college-access organizations with different levels of experience identify key real-world stumbling blocks, such as: 1) confusing evaluation for external use with evaluation for internal use; 2) finding that too much data paralyzes organizational decisions; 3) prioritizing data collection over data analysis; 4) finding your audience's eyes glazing over when you talk about evaluation; and 5) over- or underestimating the resources you will need for your project. This session focuses on evaluation tips and tools, lessons learned, and, most importantly, mistakes to avoid. It is designed for those charged with leading evaluation for their organizations (even if they have little evaluation experience); it is not for those completely new to evaluation. Panelists: Roblyn Brigham, Brigham Nahas Research Associates; Andy Hoge, New Jersey SEEDS; Janet Smith, The Steppingstone Foundation; Danielle Stein Eisenberg, KIPP Foundation.
2. Overview slide for Janet and Danielle's section: describe our organizations briefly (mission, size), and the two things most important to how each of us does evaluation. KIPP: a national, multi-site, autonomous model, meaning the Foundation does not run or operate the KIPP schools; rather, it provides support, training, and economies of scale. Currently 82 schools in 20 states and DC, serving 21,000 students. We do internal and external evaluations; today I will focus on our internal program evaluation work.
3. Evaluation is not a one-size-fits-all approach. Through experience we have learned that the following factors matter greatly:

- Size and structure of the organization: Who on your staff does program evaluation? No one, everyone, or one person? Does the evaluator wear many hats?
- Culture of the organization: Is evaluation already part of your organization's culture, or will this work be entirely new? What systems and processes need to be put in place to create a data-driven culture? What time and resources are dedicated to all phases?
- Age of the organization: affects what can and should be evaluated.
- Nature of the program offering(s): direct service, information only, preparing now for future outcomes, multiple sites? Implementing a model vs. responding to a specific program context.

Two important things about how each organization has decided to do evaluation:

DSE: 1. We recently built a culture around making program evaluation a critical component of a program's lifecycle, and engaged program managers in managing their own evaluations. 2. We work hard to first define program goals, participant outcomes, and even process goals, and then connect them to the right evaluation tools and processes to ensure the information is useful and actionable (show slide with tool to define evaluation goals, outcomes, and tools).

JS: 1. Learning organization: make decisions based on research and data (though it is easy to leave evaluation thinking until the end); moving beyond "satisfaction surveys" that ask you to "rate your level of satisfaction." 2. Interconnected teams: shared database; teams own their evaluations and link evaluations; mixed methods. Showcase: why does this matter to you?
4. Evaluable questions: examples of using proxies; be realistic in making claims. Lesson learned: if you are not going to act on evaluation findings, do not collect the data yet (painful decision-making, but necessary).
5. Data collection activities:

- Articulating "evaluable" questions: What does success look like? (Theory of change.)
- Deciding what data to collect: data collection is an iterative process, covering what data needs to be collected, ways of collecting it, getting people on board, and identifying who is involved in collection.
- Commitment to using results: be clear upfront about how you will USE the results of data collection and analysis, with a process for making sure the data is utilized for decision-making and program improvement. How do data link back to the mission and theory of change? What can you take on now? Decide this as part of a larger evaluation plan over time.
- What to evaluate (what data to collect)? Guidelines: take incremental steps based on the age of the organization. New organizations beginning evaluation: implementation and staff training. After the first year: focus on knowledge and behavior outcomes of those you serve. After a few years: focus on measuring program impact (with an external evaluator?) at the theory-of-change level.

Things we have learned from experience:

DSE: 1. When we were a younger organization, there were no systems in place for doing real program evaluation. Program managers were each responsible for doing it on their own, but with no support, no guidelines, and no expertise in this area; lots of survey monkeying. We learned there needed to be some centralization of efforts to ensure quality evaluation was happening, that we were actually learning from our experiences, that we were retaining information through staff transitions, and that the data we collected was actually utilized. 2. Now that we are a bit older and have some processes in place, the big question becomes "how much is too much": survey and interview request fatigue, what data is really necessary, and what needs to be collected every year versus what does not. 3. Who is involved also matters: having the analyst involved in survey development is critical to knowing what data will actually be usable.

JS: Steppingstone: consider how to send a "message in a bottle": survey punch card, team planning tools, comments in Excel.
6. Data analysis: Who is involved in analyzing the data? A team is best, with non-evaluation staff included in the later stages to help interpret. Ask yourself: What is surprising, and why? What is worrisome, and why? What is missing?

Share the results of analysis: organizational parts are linked via a data-driven cycle; as an activity, each team presents its data. Prioritize the action to be taken in response to the analysis: formative? summative?

Things we have learned from experience:

DSE: 1. KIPP: articulate the link to your vision and mission. The "delivering against vision" slide demonstrates that we set the vision first, used a variety of data collection methods to track progress toward the vision (or goals), and then presented information in ways appropriate for the audience. 2. Take action based on results: the KSLP team uses nightly surveys to make immediate adjustments to courses the following day. Other teams make more subtle changes, but the bottom line is that data should be used.

JS: 1. Steppingstone: at the Quarterly Showcase, teams share their own data analysis, and other teams point out how those data affect them and where data-sharing will be key (attention to cross-team needs is essential: JS's role, establishing a data calendar). 2. Example: the Transition Study, exploratory research including the points of view of a broad range of stakeholders. We analyzed themes using a cross-team analysis group, categorized themes per team, and left each team to do the final interpretation and take action; this informed support services activities and the socio-emotional curriculum.
7. Knowing your audience is something we will all respond to. Who wants headlines? Who will want to test your interpretation of the data? The analyst is not always the best person to judge what is best for the audience.

Examples of what we have learned through experience:

JS: 1. Visuals count: double-check automated processes, e.g., the Excel slide showing a chart with the wrong Y axis for percentile alongside the correct Y axis, or the USA Today-style graphic that is overly busy or data-heavy.

DSE: 1. KIPP: slide decks on various benchmarks vary, differentiated according to audience. 2. What I learned: at the beginning I wrote lots of "reports" and "white papers"; the truth is that PowerPoint often works best for making succinct, easy-to-digest points, and it is easily shared.
8. KSS goals:
- Ensure participants feel the collective power and impact of our broader Team and Family and the overall movement by providing a powerful introduction and reconnection to KIPP.
- Provide an opportunity for big KIPPsters to network and make connections with others by bringing communities together to share, reflect, and learn.
- Provide an opportunity for personal learning and growth.
- Kick off the 2009-2010 school year with high energy, building momentum toward the belief of what is possible for our kids and renewing our collective commitment to realizing these possibilities in our KIPP communities across the country.
9. The percentage of doctors decreased by 15% (one-dimensional data), but the size of the doctor image (its area) decreased by over 75%. This is the problem with using area to represent one-dimensional data: scaling the image in two dimensions exaggerates the change.
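Tufte quantifies this distortion as the Lie Factor: the size of the effect shown in the graphic divided by the size of the effect in the data, where each "size of effect" is a proportional change. A quick sketch using the approximate figures from this slide (the function name is an illustration; only the ratio itself is Tufte's):

```python
def lie_factor(shown_change: float, data_change: float) -> float:
    """Tufte's Lie Factor: proportional change depicted in the graphic
    divided by the proportional change in the underlying data.
    A value near 1 means the graphic represents the data honestly."""
    return shown_change / data_change

# From the slide: the data fell about 15%, but the drawn area
# shrank by over 75%, so the graphic overstates the change ~5x.
print(round(lie_factor(0.75, 0.15), 2))
```

A Lie Factor of about 5 means the "shrinking doctor" graphic exaggerates the decline roughly fivefold, which is why Tufte singles it out.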