Presentation by Terri Manning, Associate Vice President for Institutional Research/Director of the Center for Applied Research, Central Piedmont Community College; LACCD AtD Liaison, at the 2nd Annual LACCD AtD Retreat
1. EVALUATING STUDENT SUCCESS INITIATIVES
LACCD Student Success & 3CSN Summit
Making Sure Things Work Before We Scale Them Up
2. PURPOSE
We want to take some time to discuss common misconceptions and issues experienced by colleges around the subject of evaluation.
We want to understand the differences between evaluation and research.
We want to know how to develop and implement a good evaluation for an intervention or program.
3. PROGRAM EVALUATION
What is evaluation?
Evaluation is a profession composed of persons with varying interests, potentially encompassing but not limited to the evaluation of programs, products, personnel, policy, performance, proposals, technology, research, theory, and even of evaluation itself.
Go to: http://www.eval.org
At the bottom of the homepage there is a link to a free training package and facilitator's guide for teaching the Guiding Principles for Evaluators.
4. MORE ON EVALUATION
As defined by the American Evaluation Association, evaluation involves assessing the strengths and weaknesses of programs, policies, personnel, products, and organizations to improve their effectiveness.
Evaluation is the systematic collection and analysis of data needed to make decisions, a process in which most well-run programs engage from the outset. Here are just some of the evaluation activities that are already likely to be incorporated into many programs or that can be added easily:
Pinpointing the services needed, for example, finding out what knowledge, skills, attitudes, or behaviors a program should address
5. CONTINUED
Establishing program objectives and deciding the particular evidence (such as the specific knowledge, attitudes, or behavior) that will demonstrate that the objectives have been met. A key to successful evaluation is a set of clear, measurable, and realistic program objectives. If objectives are unrealistically optimistic or are not measurable, the program may not be able to demonstrate that it has been successful even if it has done a good job.
Developing or selecting from among alternative program approaches, for example, trying different curricula or policies and determining which ones best achieve the goals
6. CONTINUED
Tracking program objectives, for example, setting up a system that shows who gets services, how much service is delivered, how participants rate the services they receive, and which approaches are most readily adopted by staff
Trying out and assessing new program designs, determining the extent to which a particular approach is being implemented faithfully by school or agency personnel
7. PROGRAM EVALUATION
Purpose
To establish better products, personnel, programs, organizations, and governments; to serve consumers and the public interest; to contribute to informed decision making and more enlightened change; to precipitate needed change; to empower all stakeholders by collecting data from them and engaging them in the evaluation process; and to experience the excitement of new insights.
Evaluators aspire to construct and provide the best possible information that might bear on the value of whatever is being evaluated.
8. Definition of Evaluation
Study designed and conducted to assist some audience to assess an object's merit and worth. (Stufflebeam, 1999)
Identification of defensible criteria to determine an evaluation object's value (worth or merit), quality, utility, effectiveness, or significance in relation to those criteria. (Fitzpatrick, Sanders & Worthen, 2004)
9. Definition of Evaluation
Goal 1: Determine the merit or worth of an evaluand. (Scriven, 1991)
Goal 2: Provide answers to significant evaluative questions that are posed.
Evaluation is a value judgment based on defensible criteria.
10. Evaluation Questions
Evaluation questions provide the direction and foundation for the evaluation; without them the evaluation will lack focus.
The evaluation's focus will determine the questions asked:
Needs assessment questions? Process evaluation questions? Outcomes evaluation questions?
11. TYPES OF EVALUATION
Process evaluation – determines whether the processes are happening according to the plan
The processes of a program are the "nitty-gritty" details or the "dosage" that students, patients, or clients receive – the activities
It is the who is going to do what, and when
It answers the question "Is this program being delivered as it was intended?"
12. TYPES OF EVALUATION
Outcome evaluation (the most critical piece for accreditation)
Determines how participants do on short-range, mid-range, or long-range outcomes
Usually involves setting program goals and outcome objectives
Answers the question "Is this program working?" and/or "Are participants accomplishing what we intended for them to accomplish?"
13. TYPES OF EVALUATION
Impact evaluation
How did the results impact the student group, college, community, or family (the larger group over time)?
Answers the question "Is this program having the impact it was intended to have?" (so you must start with intentions)
14. TWO MAJOR TYPES OF EVALUATION
15. IR DEPARTMENTS
The good news is… you are all data people.
The bad news is… you are all data people.
Data people sometimes have difficulty realizing that this is not research and that it demands more than data from your student system.
16. Evaluation vs. Research
Use: Evaluation is intended for use – use is the rationale. Research produces knowledge and lets the natural process determine its use.
Questions: In evaluation, the decision-maker, not the evaluator, comes up with the questions to study. In research, the researcher determines the questions.
Judgment: Evaluation compares what is with what should be – does it meet established criteria? Research studies what is.
Setting: Evaluation takes place in an action setting; priority goes to the program, not the evaluation. In research, priority goes to the research, not to what is being studied.
Roles: In evaluation there is friction between the evaluator's role and the program giver's role because of the judgmental qualities of evaluation. In research there is no such friction between researcher and funder.
18. INTERVENTIONS HAVE QUESTIONABLE SUCCESS
Evaluators don't take into consideration all factors, including methodology and quality of implementation.
The college needs to have a realistic/courageous conversation about standards of evidence, statistical significance, and expectations.
Colleges spend most of their time planning the intervention, not planning how to evaluate it.
They never define what success should look like or set a reasonable target.
19. INTERVENTIONS ARE OFTEN TOO COMPLICATED
Multiple layers of independent variables.
The college lacks the staff, software, or ability to carry the evaluation out.
Groups keep getting smaller and smaller (for samples or comparison groups).
We don't really know what worked.
Expansion happens too quickly.
20. INTERVENTIONS HAVE QUESTIONABLE ABILITY TO BE ADAPTED ON A LARGE SCALE
Not enough consideration of the costs of scaling.
Colleges don't want to cancel plans involving un-scalable interventions (someone's pet project).
Develop a culture where it is OK to take risks and learn from mistakes.
21. THE COLLEGE SKEPTIC
The one who wants everything to be
statistically significant
The faculty group who wants to talk about
confidence intervals or power
Fear that things won’t work
“We tried that before”
They confuse evaluation with research.
22. LIMITED ABILITY TO EVALUATE
The whole concept is new to many.
Funders force us to begin the process.
There may be no one at the institution to lead them through it (health faculty are the best place to start).
Colleges don't know what resources are out there.
23. ANALYSIS PARALYSIS
Let's slice and dice the data more and more and more.
Too much data to analyze.
Colleges don't know what it tells them.
How do we make a decision about priorities and strategies from 200 pages of data tables?
24. THE SUMMER HIATUS
Faculty leave in June and never give the initiative a thought until August 20th.
No interventions are in place when the fall term begins.
No evaluation tools are in place.
Baseline data cannot be collected.
From August 20-31 they are mostly concerned with preparing for fall classes (as they should be).
25. NO WORKABLE EVALUATION TIMELINES
Creating a timeline.
Identifying all the details.
Getting a team to actually follow it.
Who is responsible for each piece?
Where do completed surveys/assessments go – who scores them – who analyzes them – who makes decisions based on them?
(One way to lay out such a timeline with named owners is sketched below.)
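One way to make the timeline concrete, hypothetically, is to record each evaluation task with a due date and an owner so the team can see at a glance who is responsible for each piece. The sketch below is a minimal illustration only; all tasks, dates, and role names are made-up placeholders, not a prescribed format.

```python
from datetime import date

# Illustrative evaluation timeline: each task has a due date and a named owner.
# All tasks, dates, and owner roles are hypothetical placeholders.
timeline = [
    {"task": "Finalize pre-assessment instrument", "due": date(2013, 8, 15), "owner": "Evaluation team lead"},
    {"task": "Collect baseline surveys",            "due": date(2013, 9, 6),  "owner": "Course instructors"},
    {"task": "Score surveys and enter data",        "due": date(2013, 9, 20), "owner": "IR analyst"},
    {"task": "Administer post-assessment",          "due": date(2013, 12, 6), "owner": "Course instructors"},
    {"task": "Analyze results and report",          "due": date(2014, 1, 15), "owner": "IR analyst"},
]

# Print the timeline in due-date order so responsibilities are visible at a glance.
for item in sorted(timeline, key=lambda t: t["due"]):
    print(f'{item["due"]:%Y-%m-%d}  {item["owner"]:<22} {item["task"]}')
```

Even a plain spreadsheet with the same three columns would serve; the point is that every step has a date and a name attached before the term begins.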
26. What does a logic model look like?
Graphic display of boxes and arrows; vertical or horizontal
Relationships, linkages
Any shape possible: circular, dynamic, cultural adaptations, storyboards
Level of detail: simple, complex, multiple models
Source: Adapted from UW-Extension: http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html
27. Where are you going? How will you get there? What will tell you that you've arrived?
A logic model is your program ROAD MAP.
28. Example: Everyday logic model – Family Vacation
Inputs: family members, budget, car, camping equipment
Activities: drive to state park; set up camp; cook, play, talk, laugh, hike
Outcomes: family members learn about each other; family bonds; family has a good time
Source: E. Taylor-Powell, University of Wisconsin-Extension-Cooperative Extension
29. Example: Financial management program
Situation: Individuals with limited knowledge and skills in basic financial management are unable to meet their financial goals and manage money to meet their needs.
INPUTS (what we invest): Extension invests time and resources.
OUTPUTS (what we do): We conduct a variety of educational activities targeted to individuals who participate.
OUTCOMES (what results): Participants gain knowledge, change practices, and have improved financial well-being.
Source: E. Taylor-Powell, University of Wisconsin-Extension-Cooperative Extension
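Before a logic model is drawn as boxes and arrows, it can be jotted down as a simple inputs-to-outputs-to-outcomes structure. The minimal sketch below mirrors the financial management example above; the dictionary layout and field names are only one illustrative way to capture it, not a prescribed format.

```python
# Minimal sketch of a logic model as plain data: inputs -> outputs -> outcomes.
# The content mirrors the financial management example above; the field names are illustrative.
logic_model = {
    "situation": "Individuals with limited financial management skills cannot meet their financial goals.",
    "inputs":   ["Extension time", "Extension resources"],                          # what we invest
    "outputs":  ["Educational activities targeted to participating individuals"],   # what we do
    "outcomes": ["Participants gain knowledge",
                 "Participants change practices",
                 "Participants improve financial well-being"],                      # what results
}

# Read the model left to right: if we invest the inputs and deliver the outputs,
# then we expect the outcomes to follow.
for stage in ("inputs", "outputs", "outcomes"):
    print(stage.upper())
    for item in logic_model[stage]:
        print("  -", item)
```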
30. Example: One component of a comprehensive parent education and support initiative
Situation: During a county needs assessment, the majority of parents reported that they were having difficulty parenting and felt stressed as a result.
INPUTS: staff, money, partners, research
OUTPUTS: develop parent education curriculum; deliver a series of interactive sessions; facilitate support groups; targeted parents attend
OUTCOMES (short-term): parents increase knowledge of child development; parents better understand their own parenting style; parents gain skills in effective parenting practices
OUTCOMES (medium-term): parents identify appropriate actions to take; parents use effective parenting practices
OUTCOMES (long-term): improved child-parent relations; strong families
Assumptions: | External factors:
31. Example: Smoke-free worksites
Situation: Secondhand smoke is responsible for lung cancer, respiratory symptoms, and cardiovascular disease, and worsens asthma. Public policy change that creates smoke-free environments is the best known way to reduce and prevent smoking.
Inputs: coalition, time, dollars, partners (including youth)
Outputs: assess worksite tobacco policies and practices; develop community support for smoke-free (SF) worksites; organize and implement an SF strategy for targeted worksites; reach worksite owners and managers, unions, workers and union members, and the public
Outcomes: increased awareness of the importance of SF worksites; increased knowledge of SF worksite benefits and options; increased commitment, support, and demand for SF worksites; demonstrations of public support for SF worksites; SF worksite policies drafted; SF worksite policies passed; adherence to smoke-free policies
Source: E. Taylor-Powell, University of Wisconsin-Extension-Cooperative Extension
32. NEEDS ASSESSMENT, PROCESS EVALUATION, AND OUTCOMES EVALUATION QUESTIONS
INPUT – needs assessment questions: What resources are needed for starting this intervention strategy? How many staff members are needed?
PROCESS – process evaluation questions: Is the intervention strategy being implemented as intended? Are participants being reached as intended?
OUTCOMES – outcomes evaluation questions: To what extent are desired changes occurring? For whom? Is the intervention strategy making a difference? What seems to work? What does not?
Source: R. Rincones-Gomez, 2009
33. CHAIN OF OUTCOMES
Short-term: Seniors increase knowledge of food contamination risks → Medium-term: Practice safe cooling of food; follow food preparation guidelines → Long-term: Lowered incidence of food-borne illness
Short-term: Participants increase knowledge and skills in financial management → Medium-term: Establish financial goals, use a spending plan → Long-term: Reduced debt and increased savings
Short-term: Community increases understanding of child care needs → Medium-term: Residents and employers discuss options and implement a plan → Long-term: Child care needs are met
Short-term: Empty inner-city parking lot converted to community garden → Medium-term: Youth and adults learn gardening skills, nutrition, food preparation and management → Long-term: Money saved, nutrition improved, residents enjoy a greater sense of community
Source: E. Taylor-Powell, University of Wisconsin-Extension-Cooperative Extension
34. WHAT ARE THE SUMMATIVE AND FORMATIVE OUTCOME INDICATORS?
Supplemental Instruction
Learning Communities
Required Orientation
Academic Success Course
Minority Male Mentoring
Developmental Math Redesign
Peer Tutoring
Accelerated English
35. AT YOUR TABLES ……….
Select an ATD student success initiative at your college that you plan to evaluate before you make the decision to scale it up. (If you can't think of one, use the online learning example in your handouts.)
Use this program for each activity.
36. 1. BRING TOGETHER THE PROGRAM DEVELOPERS
Ask them to answer these questions:
1. Why did you develop this program with these program characteristics?
2. What do you think students (or participants) will get out of this program (what changes)?
3. How do you tie specific program content to specific expected changes or improvements in participants?
37. 2. ORIENT AN EVALUATION TEAM
Who should be on it?
What skills do you need at the table (and which staff members have them)?
What should be their charge?
38. 3. GATHER INFORMATION ON POTENTIAL OUTCOMES
What are potential sources for outcomes?
39. 4. WRITE OUTCOME STATEMENTS
Sometimes these are already written (from grants).
Make them clear.
Don't draw a number out of a hat.
Test them out.
Create a logic model.
40. 5. CREATE OUTCOME INDICATORS
Outcome indicator – usually referred to as a key performance indicator, this is the data point or set of statistics that best verifies the accomplishment of a specific outcome. An outcome indicator for college readiness might be an SAT score of 1100 or above. It is typically the accomplishment of a specific skill or assessment at a certain level that indicates an outcome is met. (A minimal sketch of checking such an indicator follows below.)
What data can you access?
What assessments need to be selected?
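To make the definition concrete: an indicator is checked student by student. The sketch below uses the SAT-1100 example from the slide; the scores themselves are made-up illustration data, and a real evaluation would pull them from the student information system.

```python
# Outcome indicator sketch: a college-readiness indicator of SAT >= 1100 (the slide's example).
# The scores below are made-up illustration data.
READINESS_THRESHOLD = 1100

sat_scores = [980, 1150, 1210, 1060, 1120, 990, 1300]

# The indicator is evaluated per student: did this student meet the threshold?
meets_indicator = [score >= READINESS_THRESHOLD for score in sat_scores]

print(f"{sum(meets_indicator)} of {len(sat_scores)} students met the college-readiness indicator")
```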
41. 6. CREATE OUTCOME TARGETS
Outcome target – the benchmark set as a performance indicator for a given outcome. An example would be that 80% of students would score 75% or above on a reading assessment; the outcome target would be "80% of students." (A minimal sketch of checking such a target follows below.)
How would you create these targets or benchmarks?
Do you need a comparison group?
What is an acceptable level of improvement or change?
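Once the indicator is defined, the target is checked at the group level: did a large enough share of students meet it? A minimal sketch using the 80% / 75% example from the slide; the reading scores are illustrative only.

```python
# Outcome target sketch: the slide's example target is "80% of students score 75% or above"
# on a reading assessment. The scores below are made-up illustration data.
TARGET_SHARE = 0.80      # 80% of students...
PASSING_SCORE = 75       # ...score 75% or above

reading_scores = [82, 68, 91, 77, 74, 88, 95, 79, 83, 71]

share_meeting_indicator = sum(s >= PASSING_SCORE for s in reading_scores) / len(reading_scores)

print(f"{share_meeting_indicator:.0%} of students scored {PASSING_SCORE}% or above")
print("Outcome target met" if share_meeting_indicator >= TARGET_SHARE else "Outcome target not met")
```

Whether the target should be an absolute benchmark like this or a comparison against a similar group of non-participants is exactly the judgment call the questions above are asking the team to make.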
42. 7. CREATE ALL TOOLS
You will probably need:
Demographic sheets
Attendance or participation logs
Formative evaluation tools
Will they be online or pencil-and-paper tools (and what are the benefits of each)?
When do they need to be ready?
Who needs copies?
Create an evaluation timeline.
43. 8. PILOT TEST THE PROCESS
Make sure it works.
Give a small group of students or faculty/staff the assessments to make sure they are clear.
Work out all the details:
Who distributes it?
Who collects it?
Who scores it?
Who puts it in the spreadsheet?
Who keeps up with the post-test dates, etc.?
44. 9. IMPLEMENT THE EVALUATION
Follow your plan
45. 10. ANALYZE RESULTS
Sometimes just numbers and percentages are enough.
Sometimes statistical tests are needed (a minimal sketch follows below).
If students don't meet the summative evaluation benchmarks, analyze the formative evaluation.
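When a comparison group is available, the "sometimes statistical tests are needed" bullet can be as simple as comparing two success rates. Below is a minimal sketch of a standard two-proportion z-test using only the Python standard library; the pass counts are illustrative, and many offices would use a statistics package instead of hand-coding the formula.

```python
import math

# Illustrative counts (not real data): students who passed the gatekeeper course in each group.
intervention_passed, intervention_n = 78, 120
comparison_passed, comparison_n = 60, 115

p1 = intervention_passed / intervention_n
p2 = comparison_passed / comparison_n
print(f"Intervention pass rate: {p1:.1%}   Comparison pass rate: {p2:.1%}")

# Two-proportion z-test with a pooled proportion (the standard textbook formula).
p_pool = (intervention_passed + comparison_passed) / (intervention_n + comparison_n)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / intervention_n + 1 / comparison_n))
z = (p1 - p2) / se

# Two-sided p-value from the normal distribution, computed via the error function.
p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
print(f"z = {z:.2f}, p = {p_value:.3f}")
```

For an evaluation (as opposed to research), the raw difference in pass rates and whether it clears the target set in advance usually matter more than the p-value itself.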
46. 11. IMPROVE YOUR PROCESS AND PROGRAM
It takes several years to accumulate good data.
Discuss how the evaluation can be improved.
Discuss how the program can be improved.
47. CLOSING
Establish your plan.
Follow your plan.
Assign responsibility for it.
Expect big things.
Use results to improve what you do (close the loop).
Actually, we use logic models every day. Let's look at this… We want to take a family vacation, and what we really hope is that we'll have a good time and enjoy being together. We have had experience and know (our own personal research tells us) that camping is something we all enjoy doing together. So, in order to take a camping trip, we need… If this…, then that…. Logic models involve a mental process. A logic model shows the series of connections and logical linkages that is expected to result in the achievement of our goal.