EVALUATING STUDENT SUCCESS INITIATIVES
LACCD Student Success & 3CSN Summit
Making Sure Things Work Before We Scale Them Up
Center for Applied Research at CPCC, 2013

PURPOSE
• We want to take some time to discuss common misconceptions and issues experienced by colleges around the subject of evaluation.
• We want to understand the differences between evaluation and research.
• We want to know how to develop and implement a good evaluation for an intervention or program.

PROGRAM EVALUATION
What is evaluation?
• Evaluation is a profession composed of persons with varying interests, potentially encompassing but not limited to the evaluation of programs, products, personnel, policy, performance, proposals, technology, research, theory, and even of evaluation itself.
Go to: http://www.eval.org
At the bottom of the homepage there is a link to a free training package and facilitator's guide for teaching the Guiding Principles for Evaluators.

MORE ON EVALUATION
As defined by the American Evaluation Association, evaluation involves assessing the strengths and weaknesses of programs, policies, personnel, products, and organizations to improve their effectiveness.
• Evaluation is the systematic collection and analysis of data needed to make decisions, a process in which most well-run programs engage from the outset. Here are just some of the evaluation activities that are already likely to be incorporated into many programs, or that can be added easily:
  – Pinpointing the services needed (for example, finding out what knowledge, skills, attitudes, or behaviors a program should address)

CONTINUED
• Establishing program objectives and deciding the particular evidence (such as the specific knowledge, attitudes, or behavior) that will demonstrate that the objectives have been met. A key to successful evaluation is a set of clear, measurable, and realistic program objectives. If objectives are unrealistically optimistic or are not measurable, the program may not be able to demonstrate that it has been successful even if it has done a good job.
• Developing or selecting from among alternative program approaches (for example, trying different curricula or policies and determining which ones best achieve the goals)

CONTINUED
• Tracking program objectives (for example, setting up a system that shows who gets services, how much service is delivered, how participants rate the services they receive, and which approaches are most readily adopted by staff)
• Trying out and assessing new program designs (for example, determining the extent to which a particular approach is being implemented faithfully by school or agency personnel)

PROGRAM EVALUATION
Purpose
• To improve products, personnel, programs, organizations, and governments and to serve consumers and the public interest; to contribute to informed decision making and more enlightened change; to precipitate needed change; to empower all stakeholders by collecting data from them and engaging them in the evaluation process; and to experience the excitement of new insights.
• Evaluators aspire to construct and provide the best possible information that might bear on the value of whatever is being evaluated.

Definition of Evaluation
A study designed and conducted to assist some audience to assess an object's merit and worth. (Stufflebeam, 1999)
The identification of defensible criteria to determine an evaluation object's value (worth or merit), quality, utility, effectiveness, or significance in relation to those criteria. (Fitzpatrick, Sanders & Worthen, 2004)

Definition of Evaluation
Goal 1: Determine the merit or worth of an evaluand. (Scriven, 1991)
Goal 2: Provide answers to significant evaluative questions that are posed.
Evaluation is a value judgment based on defensible criteria.

Evaluation Questions
Evaluation questions provide the direction and foundation for the evaluation; without them the evaluation will lack focus. The evaluation's focus will determine the questions asked:
• Needs assessment questions
• Process evaluation questions
• Outcomes evaluation questions

TYPES OF EVALUATION
Process evaluation determines whether the processes are happening according to plan.
• The processes of a program are the "nitty-gritty" details or the "dosage" students, patients, or clients receive: the activities.
• It is the who is going to do what, and when.
• It answers the question "Is this program being delivered as it was intended?"

TYPES OF EVALUATION
Outcome evaluation (the most critical piece for accreditation)
• Determines how participants do on short-range, mid-range, or long-range outcomes.
• Usually involves setting program goals and outcome objectives.
• Answers the questions "Is this program working?" and "Are participants accomplishing what we intended for them to accomplish?"

TYPES OF EVALUATION
Impact evaluation
• Asks how the results affected the student group, college, community, or family (the larger group over time).
• Answers the question "Is this program having the impact it was intended to have?" (so you must start with intentions).

TWO MAJOR TYPES OF EVALUATION

IR DEPARTMENTS
The good news is... you are all data people.
The bad news is... you are all data people.
• IR staff sometimes have difficulty realizing that evaluation is not research, and that it demands more than data from the student system.

EVALUATION VS. RESEARCH
• Use: evaluation is intended for use; use is the rationale. Research produces knowledge and lets the natural process determine use.
• Questions: in evaluation, the decision-maker, not the evaluator, comes up with the questions to study. In research, the researcher determines the questions.
• Judgment: evaluation compares what is with what should be (does it meet established criteria?). Research studies what is.
• Setting: evaluation happens in an action setting; priority goes to the program, not the evaluation. In research, priority goes to the research, not to what is being studied.
• Roles: in evaluation there is friction between the evaluator's role and the program deliverer's role because of the evaluator's judgmental function. Research has no such friction between researcher and funder.

ISSUES WITH EVALUATION IN COMMUNITY COLLEGES

INTERVENTIONS HAVE QUESTIONABLE SUCCESS
• Evaluators don't take into consideration all factors, including methodology and quality of implementation.
• The college needs to have a realistic, courageous conversation about standards of evidence, statistical significance, and expectations.
• Colleges spend most of their time planning the intervention, not planning how to evaluate it.
• They never define what success should look like or set a reasonable target.

INTERVENTIONS ARE OFTEN TOO COMPLICATED
• Multiple layers of independent variables.
• The college lacks the staff, software, or ability to carry the evaluation out.
• Groups keep getting smaller and smaller (for samples or comparison groups).
• No one really knows what worked.
• Expansion happens too quickly.

INTERVENTIONS HAVE QUESTIONABLE ABILITY TO BE ADAPTED ON A LARGE SCALE
• Not enough consideration of the costs of scaling.
• No one wants to cancel plans involving un-scalable interventions (someone's pet project).
• Colleges need to develop a culture where it is OK to take risks and learn from mistakes.

THE COLLEGE SKEPTIC
• The one who wants everything to be statistically significant.
• The faculty group that wants to talk about confidence intervals or power.
• Fear that things won't work.
• "We tried that before."
• They confuse evaluation with research.

LIMITED ABILITY TO EVALUATE
• The whole concept is new to many.
• Funders force us to begin the process.
• There may be no one at the institution to lead them through it (health faculty are the best place to start).
• Colleges don't know what resources are out there.

ANALYSIS PARALYSIS
• Let's slice and dice the data more and more and more.
• Too much data to analyze.
• No one knows what it tells them.
• How do we make a decision about priorities and strategies from 200 pages of data tables?

THE SUMMER HIATUS
• Faculty leave in June and never give the initiative a thought until August 20th.
• No interventions are in place when fall term begins.
• No evaluation tools are in place.
• Baseline data cannot be collected.
• From August 20-31, faculty are mostly concerned with preparing for fall classes (as they should be).

NO WORKABLE EVALUATION TIMELINES
• Creating a timeline.
• Identifying all the details.
• Getting a team to actually follow it.
• Deciding who is responsible for each piece.
• Where do completed surveys/assessments go? Who scores them? Who analyzes them? Who makes decisions based on them?

What does a logic model look like?
• Graphic display of boxes and arrows, vertical or horizontal
  – Relationships, linkages
• Any shape possible
  – Circular, dynamic
  – Cultural adaptations, storyboards
• Level of detail
  – Simple
  – Complex
• Multiple models
Source / adapted from UW-Extension: http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html

Where are you going? How will you get there? What will tell you that you've arrived?
A logic model is your program ROAD MAP.

Example: An everyday logic model (family vacation)
• Inputs: family members, budget, car, camping equipment
• Outputs (activities): drive to state park; set up camp; cook, play, talk, laugh, hike
• Outcomes: family members learn about each other; family bonds; family has a good time
Source: E. Taylor-Powell, University of Wisconsin-Extension, Cooperative Extension

Example: Financial management program
Situation: Individuals with limited knowledge and skills in basic financial management are unable to meet their financial goals and manage money to meet their needs.
• INPUTS (what we invest): Extension invests time and resources.
• OUTPUTS (what we do): We conduct a variety of educational activities targeted to individuals who participate.
• OUTCOMES (what results): Participants gain knowledge, change practices, and have improved financial well-being.
Source: E. Taylor-Powell, University of Wisconsin-Extension, Cooperative Extension

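To make the inputs -> outputs -> outcomes structure concrete, here is a minimal sketch of a logic model as a Python data structure, populated from the financial management example above. The class and field names are illustrative assumptions, not a standard schema.

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """Minimal logic model: a situation plus inputs -> outputs -> outcomes."""
    situation: str
    inputs: list[str] = field(default_factory=list)    # what we invest
    outputs: list[str] = field(default_factory=list)   # what we do
    outcomes: list[str] = field(default_factory=list)  # what results

    def summary(self) -> str:
        parts = [("INPUTS", self.inputs), ("OUTPUTS", self.outputs),
                 ("OUTCOMES", self.outcomes)]
        return "\n".join(f"{label}: {'; '.join(items)}" for label, items in parts)

# Populated from the financial management program example above.
model = LogicModel(
    situation="Individuals with limited financial-management skills "
              "cannot meet their financial goals.",
    inputs=["Extension staff time", "program resources"],
    outputs=["educational activities targeted to individuals who participate"],
    outcomes=["participants gain knowledge", "participants change practices",
              "improved financial well-being"],
)
print(model.summary())
```
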
Example: One component of a comprehensive parent education and support initiative
Situation: During a county needs assessment, a majority of parents reported that they were having difficulty parenting and felt stressed as a result.
• INPUTS: staff, money, partners, research
• OUTPUTS: develop parent education curriculum; deliver series of interactive sessions; facilitate support groups; targeted parents attend
• OUTCOMES: parents increase knowledge of child development; parents better understand their own parenting style; parents gain skills in effective parenting practices; parents identify appropriate actions to take; parents use effective parenting practices; improved child-parent relations; strong families

Example: Smoke-free worksites
Situation: Secondhand smoke is responsible for lung cancer, respiratory symptoms, and cardiovascular disease, and worsens asthma. Public policy change that creates smoke-free environments is the best known way to reduce and prevent smoking.
• INPUTS: coalition, time, dollars, partners (including youth)
• OUTPUTS (activities): assess worksite tobacco policies and practices; develop community support for smoke-free worksites; organize and implement a strategy for targeted worksites
• OUTPUTS (participation): worksite owners and managers; unions; workers and union members; the public
• OUTCOMES: increased awareness of the importance of smoke-free worksites; increased knowledge of smoke-free worksite benefits and options; increased commitment, support, and demand for smoke-free worksites; demonstrations of public support; smoke-free worksite policies drafted; policies passed; adherence to smoke-free policies; smoke-free worksites
Source: E. Taylor-Powell, University of Wisconsin-Extension, Cooperative Extension

Need Assessment Questions? Process Evaluation Questions? Outcomes Evaluation Questions?
• INPUT: What resources are needed for starting this intervention strategy? How many staff members are needed?
• PROCESS: Is the intervention strategy being implemented as intended? Are participants being reached as intended?
• OUTCOMES: To what extent are desired changes occurring? For whom? Is the intervention strategy making a difference? What seems to work? What does not?
Source: R. Rincones-Gomez, 2009

CHAIN OF OUTCOMES
• Short: Seniors increase knowledge of food contamination risks. Medium: Seniors practice safe cooling of food and follow food preparation guidelines. Long-term: Lowered incidence of foodborne illness.
• Short: Participants increase knowledge and skills in financial management. Medium: Participants establish financial goals and use a spending plan. Long-term: Reduced debt and increased savings.
• Short: Community increases understanding of childcare needs. Medium: Residents and employers discuss options and implement a plan. Long-term: Child care needs are met.
• Short: Empty inner-city parking lot converted to a community garden. Medium: Youth and adults learn gardening skills, nutrition, food preparation, and management. Long-term: Money saved, nutrition improved, residents enjoy a greater sense of community.
Source: E. Taylor-Powell, University of Wisconsin-Extension, Cooperative Extension

WHAT ARE THE SUMMATIVE AND FORMATIVE OUTCOME INDICATORS?
• Supplemental Instruction
• Learning Communities
• Required Orientation
• Academic Success Course
• Minority Male Mentoring
• Developmental Math Redesign
• Peer Tutoring
• Accelerated English

AT YOUR TABLES...
Select an ATD student success initiative at your college that you plan to evaluate before you make the decision to scale it up. (If you can't think of one, use the online learning example in your handouts.)
Use this program for each activity.

1. BRING TOGETHER THE PROGRAM DEVELOPERS
Ask them to answer these questions:
1. Why did you develop this program with these program characteristics?
2. What do you think students (or participants) will get out of this program (what changes)?
3. How do you tie specific program content to specific expected changes or improvements in participants?

2. ORIENT AN EVALUATION TEAM
• Who should be on it?
• What skills do you need at the table (and which staff members have them)?
• What should be their charge?

3. GATHER INFORMATION ON POTENTIAL OUTCOMES
What are potential sources for outcomes?

4. WRITE OUTCOME STATEMENTS
• Sometimes these are already written (from grants).
• Make them clear.
• Don't draw a number out of a hat.
• Test them out.
• Create a logic model.

5. CREATE OUTCOME INDICATORS
Outcome indicator: often called a key performance indicator, this is the data point or set of statistics that best verifies the accomplishment of a specific outcome. An outcome indicator for college readiness might be an SAT score of 1100 or above. It is typically the accomplishment of a specific skill, or an assessment result at a certain level, that indicates an outcome is met.
• What data can you access?
• What assessments need to be selected?

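As a concrete illustration, the sketch below checks the slide's hypothetical college-readiness indicator (an SAT score of 1100 or above) against a few student records. The record layout and scores are assumptions made up for the example; real data would come from the student information system.

```python
# Hypothetical student records; real data would come from the student system.
students = [
    {"id": "A01", "sat": 1180},
    {"id": "A02", "sat": 1050},
    {"id": "A03", "sat": 1230},
]

SAT_READY = 1100  # indicator threshold from the slide's example

# The indicator is met when a student's score clears the threshold.
for s in students:
    s["college_ready"] = s["sat"] >= SAT_READY

met = sum(s["college_ready"] for s in students)
print(f"{met} of {len(students)} students met the college-readiness indicator")
```
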
6. CREATE OUTCOME TARGETS
Outcome target: the benchmark set as a performance level for a given outcome. An example would be that 80% of students will score 75% or above on a reading assessment; the outcome target is the "80% of students."
• How would you create these targets or benchmarks?
• Do you need a comparison group?
• What is an acceptable level of improvement or change?

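Putting the two together: the indicator is the passing threshold and the target is the share of students who must clear it. Here is a minimal sketch using the slide's example target (80% of students scoring 75% or above on a reading assessment); the scores are invented.

```python
# Invented reading-assessment scores (percent correct).
scores = [82, 91, 74, 77, 88, 69, 95, 80, 73, 85]

PASSING_SCORE = 75    # outcome indicator: "75% or above"
TARGET_SHARE = 0.80   # outcome target: "80% of students"

share_meeting = sum(s >= PASSING_SCORE for s in scores) / len(scores)
print(f"{share_meeting:.0%} of students scored {PASSING_SCORE}% or above")
print("Target met" if share_meeting >= TARGET_SHARE else "Target not met")
```
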
7. CREATE ALL TOOLS
You will probably need:
• Demographic sheets
• An attendance or participation log
• Formative evaluation tools
Will they be online or pencil-and-paper tools (consider the benefits of each)?
When do they need to be ready? Who needs copies?
Create an evaluation timeline (see the sketch below).

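One lightweight way to keep the timeline, owners, and due dates visible to the whole team is a simple sortable task list. The tasks, owners, and dates below are placeholders, not recommendations.

```python
from datetime import date

# Placeholder evaluation tasks: (due date, task, owner).
timeline = [
    (date(2013, 8, 15), "Finalize demographic sheet and participation log", "IR office"),
    (date(2013, 8, 26), "Administer pre-test in pilot sections", "Faculty leads"),
    (date(2013, 12, 6), "Administer post-test", "Faculty leads"),
    (date(2013, 12, 20), "Score assessments and enter results in spreadsheet", "Evaluation team"),
]

# Print the timeline in due-date order so nothing slips through the summer hiatus.
for due, task, owner in sorted(timeline):
    print(f"{due:%Y-%m-%d}  {task}  [{owner}]")
```
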
8. PILOT TEST THE PROCESS
• Make sure it works.
• Give the assessments to a small group of students or faculty/staff to make sure they are clear.
• Work out all the details:
  – Who distributes it?
  – Who collects it?
  – Who scores it?
  – Who puts it in the spreadsheet?
  – Who keeps up with the post-test dates, etc.?

9. IMPLEMENT THE EVALUATION
Follow your plan.

10. ANALYZE RESULTS
• Sometimes just numbers and percentages are enough.
• Sometimes statistical tests are needed.
• If students don't meet the summative evaluation benchmarks, analyze the formative evaluation.

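When a statistical test is needed, a common case is comparing participants' success rate with a comparison group's. Here is a minimal sketch using SciPy's chi-square test of independence on a 2x2 table; the counts are invented for illustration.

```python
from scipy.stats import chi2_contingency

# Invented counts: [succeeded, did not succeed] for each group.
participants = [78, 42]   # e.g., completed the gateway course
comparison = [65, 55]

chi2, p_value, dof, expected = chi2_contingency([participants, comparison])

rate_p = participants[0] / sum(participants)
rate_c = comparison[0] / sum(comparison)
print(f"Participant success rate: {rate_p:.0%}; comparison group: {rate_c:.0%}")
print(f"Chi-square = {chi2:.2f}, p = {p_value:.3f}")
# A small p-value suggests the difference is unlikely to be chance alone;
# evaluation decisions should also weigh practical significance.
```
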
11. IMPROVE YOUR PROCESS AND PROGRAM
• It takes several years to build good data.
• Discuss how the evaluation can be improved.
• Discuss how the program can be improved.

CLOSING
• Establish your plan.
• Follow your plan.
• Assign responsibility for it.
• Expect big things.
• Use results to improve what you do (close the loop).

SUPPORT AND CONTACT INFO
Terri Manning, Ed.D.
terri.manning@cpcc.edu
(704) 330-6592

Editor's notes

1. Actually, we use logic models every day. Let's look at this: we want to take a family vacation, and what we really hope is that we'll have a good time and enjoy being together. We have had experience and know (our own personal research tells us) that camping is something we all enjoy doing together. So, in order to take a camping trip, we need... if this, then that. Logic models involve a mental process: a logic model shows the series of connections and logical linkages that is expected to result in achievement of our goal.
2. Let's apply this to a typical Extension example.
3. An opportunity for questions and discussion.