1. Leveraging Analytics
to Improve Student Success
Karen Vignare,
University of Maryland University College
@kvignare
Ellen Wagner, PAR Framework
@edwsonoma
2. Session Description
• This session shows how analytics can be used to identify
opportunities for improving student success.
• By the end of the session, participants will make connections
between predictions about risk and the interventions most likely
to work under varying conditions and with different populations.
4. “But education researchers have
always worked with data.”
• We do qualitative research with data
• We do quantitative research with data
• We do evaluations with data
• We develop surveys and instruments and experiments to
collect more data
• We pull data from LMSs, SISs, ERPs, CRMs …
• We write reports, summaries, make presentations, develop
articles and books and webcasts….
7. Analytics in Higher Education
Learning Analytics
Best way to teach and learn
Learner Analytics
Best way to support students
Organizational Analytics
Best ways to operate a college
Academic
Analytics
8. Create new insights and opportunities for
data in our practices
• Enrollment management
• Student services
• Program and learning experience design
• Content creation
• Retention, completion
• Gainful employment
• Institutional Culture
9. How Are We Doing So Far?
• Data is the number one challenge in the adoption and use of analytics.
Organizations continue to struggle with data accuracy, consistency,
and access.
• The primary focus of analytics is on reducing costs, improving
the bottom line, and managing risk.
• Intuition, based on experience, is still the driving factor in
data-driven decision-making; analytics are used as part of the process.
• Many organizations lack the proper analytical talent. Organizations
that struggle with making good use of analytics often don’t know
how to apply the results.
• Culture plays a critical role in the effective use of data analytics.
10. GROUP DISCUSSION
• Is your institution using (or planning to use) academic analytics
specifically to improve student success?
• What kinds of questions are you trying to answer?
• What kinds of data are you planning to use?
• What kinds of barriers are you encountering?
11. Getting to the right answer takes work
• Analysis and model building are an iterative process.
• Around 70–80% of the effort is spent on data exploration
and understanding.
SAS Analysis/Modeling Process
12. Link Predictions to Action
• Predictive analytics refers to a wide variety of methodologies.
There is no single “best” way of doing predictive analytics.
You need to know what you are looking for.
• Knowing who is at risk is not enough. Predictions
have value when they are tied to what you can do about it.
• Linking behavioral predictions of risk with interventions at the
best points of fit offers a powerful strategy for increasing rates
of student retention, academic progress and completion.
14. What PAR does
PAR uses descriptive, inferential, and predictive analyses to create
benchmarks and institutional predictive models, and to inventory,
map, and measure student success interventions that have a direct
positive impact on behaviors correlated with success.
15. Linking Predictions to Action
• Identify obstacles and remove barriers from student success
pathways.
• Provide actionable information so students and advisors can
build informed opportunity pathways.
• Know where to invest in student success, leveraging
collaborative insights that determine the return on investment in
interventions and support.
16. Benchmarks & Insight, Predictive Analytics, Intervention Inventory and ROI Tools, Diagnostics
PAR analytic toolset
17. Benchmarks & Insight, Predictive Analytics, Intervention Inventory and ROI Tools, Web Tools
Student Success Matrix (SSMx)
18. PAR by the Numbers
• 2.2 million students and 24.5 million courses in the PAR data warehouse, in
a single federated data set, using common data definitions.
• 48 institutions, 351 unique campuses.
• 77 discrete variables are available for each student record in the data set.
An additional two dozen constructed variables are used to explore specific
dimensions and promising patterns of risk and retention.
• 343 discrete interventions filtered on predictor behaviors, point in student
life cycle, student attributes, institutional priorities and ROI factors in the
growing SSMx dataset.
19. Structured, Readily Available Data
• Common data definitions
= reusable predictive
models and meaningful
comparisons.
• Openly published via a CC license at
https://public.datacookbook.com/public/institutions/par
21. PAR Puts it All Together
• Determine students’ probability of failure (predictions)
• Determine which students respond to interventions (uplift modeling)
• Determine which interventions are most effective (explanatory modeling)
• Allocate resources accordingly (cost-benefit analysis)
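The four steps above can be sketched end to end. This is a minimal, hypothetical illustration with invented student records: a crude per-segment uplift estimate (retention with vs. without an intervention) is used to rank where to invest. Segment names, fields, and numbers are all assumptions; a production model would be fitted on real student-level data.

```python
# Minimal sketch of the four-step pipeline: per-segment uplift estimates
# used to prioritize intervention spending. All data here is invented.
from collections import defaultdict

# (segment, received_intervention, retained) -- one tuple per student
records = [
    ("first_term", True, True), ("first_term", True, True),
    ("first_term", True, False), ("first_term", False, False),
    ("first_term", False, False), ("first_term", False, True),
    ("returning", True, True), ("returning", True, True),
    ("returning", False, True), ("returning", False, True),
]

def retention_rate(rows):
    """Share of students in `rows` who were retained."""
    return sum(retained for _, _, retained in rows) / len(rows) if rows else 0.0

# Group students by (segment, treatment) to compare like with like.
groups = defaultdict(list)
for seg, treated, retained in records:
    groups[(seg, treated)].append((seg, treated, retained))

# Uplift = retention among treated minus retention among untreated,
# estimated separately per segment (a crude "two-model" approach).
uplift = {}
for seg in {s for s, _ in groups}:
    uplift[seg] = retention_rate(groups[(seg, True)]) - retention_rate(groups[(seg, False)])

# Allocate resources to segments where the intervention moves retention most.
priority = sorted(uplift, key=uplift.get, reverse=True)
```

In this toy data the intervention only moves first-term students, so they would be prioritized, illustrating how uplift modeling differs from simply targeting the highest-risk group.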
22. Findings from aggregated dataset
Positive predictors:
• High school GPA (when available)
• Dual enrollment (HS/college)
• Any prior credit
• CC GPA
• Credit ratio
• Successful course completion
• Positive completion of Dev Ed courses
Negative predictors:
• Withdrawals
• Low number of credits attempted
Varies, but can be significant:
• Pell Grant recipient
• Taken Dev Ed
• Age
• Fully online student
• Race
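As a hedged illustration of how predictor lists like these become a risk model, the sketch below combines a few flags into an additive score passed through a logistic function. The weights, field names, and student record are invented for illustration; in practice the weights would come from a model fitted to institutional data.

```python
# Hypothetical additive risk score built from predictor families like those
# on this slide. Weights and names are invented for illustration only.
import math

WEIGHTS = {
    "withdrawals":        0.8,   # prior withdrawals raise risk
    "low_credit_load":    0.5,   # few credits attempted raises risk
    "dev_ed_taken":       0.4,
    "prior_credit":      -0.6,   # any prior credit lowers risk
    "dual_enrollment":   -0.5,
    "high_credit_ratio": -0.7,   # credits earned / credits attempted
}

def risk_probability(flags):
    """Logistic transform of the weighted flag sum into a pseudo-probability."""
    score = sum(WEIGHTS[name] for name, on in flags.items() if on)
    return 1.0 / (1.0 + math.exp(-score))

student = {"withdrawals": True, "low_credit_load": True, "dev_ed_taken": False,
           "prior_credit": True, "dual_enrollment": False, "high_credit_ratio": False}
risk = risk_probability(student)  # score = 0.8 + 0.5 - 0.6 = 0.7
```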
27. Specific Examples of
Data Driven Improvements
• UMUC / U of Hawaii – replication of community college success
prediction studies
• U of Hawaii – “Obstacle courses”
• University of North Dakota – predictive models tied to student
watchlist data
• Intervention measurement at Sinclair CC and Lone Star CC
• National online learning impact study on student retention (in
press, based on results from >500,000 students taking
onground, blended and online courses)
28. Intervention Measurement –
Student Success Courses Results
• 12-month credit ratio: only 1 of the 8 Student Success
Courses analyzed showed a statistically significant positive
effect for students taking the course vs. those who did
not.
• Retention: 7 of the 8 courses showed a significantly
positive effect; retention was higher by 14% to 4x.
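Significance claims like these typically rest on comparing retention proportions between course takers and non-takers. One standard check is a two-proportion z-test; the sketch below uses invented counts, not PAR results.

```python
# Two-proportion z-test sketch for a retention comparison. Counts invented.
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """z statistic for H0: the two groups retain at the same rate."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Toy counts: 420/600 course takers retained vs. 500/800 non-takers.
z = two_proportion_z(420, 600, 500, 800)

# Two-sided p-value via the standard normal CDF.
p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
significant = p_value < 0.05
```

With these toy counts the 7.5-point retention gap is significant at the 0.05 level; smaller samples with the same gap would not be, which is why per-course results can differ.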
30. Public university offering online degree
programs to a diverse population of
working adults
Largest open access public online
university in U.S.
Premier provider of higher education to
U.S. military since 1949
Part of the University System of Maryland
About UMUC
31. 20th century: historical, longitudinal warehouse, siloed, external reporting
21st century: predictive, real-time dashboards, integrated institutional insights, continuous improvements
Evolution of Data for Retention
32. Institutional Research
Institutional Effectiveness
Business Intelligence
Civitas Learning, Inc.
PAR Framework, Inc.
Retention Resources at UMUC
33. Pre-enrollment
Demographics
Enrollment
LMS Engagement
Student Performance
Transfer
Military
Factors Included in Predictive
Model for Retention at UMUC
34. Campus
Class Load
Military Status
Academic Performance
Payment Method
Key Factors for Retention at UMUC
35. One-year retention (year-over-year, measured with a
cohort)
Re-enrollment (term-to-term metric that
includes all students)
Successful course completion (percentage of
students receiving a successful grade)
Graduation (1-, 2-, 3-, 4-, 5-, and 10-year rates that track
the graduation status of the starting cohort over
time)
Metrics at UMUC
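Two of these metrics are straightforward to compute once enrollment records sit in one place. The sketch below uses invented records, and the definition of a "successful" grade (A through D) is an assumption for illustration.

```python
# Sketch of two UMUC-style metrics over invented enrollment records.
# The "successful grade" threshold (A-D) is an assumption for illustration.

# (student_id, term, grade)
enrollments = [
    (1, "F2013", "A"), (1, "F2014", "B"),
    (2, "F2013", "F"),
    (3, "F2013", "C"), (3, "F2014", "A"),
]

# One-year retention: share of the Fall 2013 cohort enrolled again in Fall 2014.
cohort = {s for s, term, _ in enrollments if term == "F2013"}
returned = {s for s, term, _ in enrollments if term == "F2014"}
one_year_retention = len(cohort & returned) / len(cohort)

# Successful course completion: share of enrollments with a passing grade.
passing = sum(1 for _, _, grade in enrollments if grade in "ABCD")
completion_rate = passing / len(enrollments)
```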
36. Curriculum Redesign (2010)
8-week Standard Sessions (2010)
Community College Transfer (2010)
Registration Policy (2013)
Onboarding (2014)
Just-in-Time Messages (2014)
Retention Initiatives
37. Retention Rates and Headcounts
             Fall 2011   Fall 2012   Fall 2013   Fall 2014
Stateside        71.2        72.0        71.6        73.2
Overseas         60.5        59.5        61.5        66.0
Headcount      47,416      46,213      41,197      41,356
38. Student Retention Enterprise Framework
Retention Root Cause Identification & Analysis:
• Retention opportunity – diagnosed from internal data and external research
• Problem analyzed – root cause analysis performed and search of existing
body-of-knowledge solutions
• Hypothesis generated – work within the governance structure
• Test & learn cycle – levers pulled here; measure success & ROI; quarterly
• Operationalize or re-create – operationalize successful tests; “Lessons
Learned” fed back to the body of knowledge
39. Discussion
How will you begin, or improve, your
analytics journey at YOUR institution?
40. Elements of a Data Model
Use modeling to:
• Test the likely impact on retention when new initiatives or planned
interventions are undertaken
• Create models that build out retention impact by segments, e.g.,
demographics, academic programs, persistence, etc.
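Segment-level retention breakouts like those suggested above reduce to grouped rates. A minimal sketch with invented segment labels and data:

```python
# Minimal retention-by-segment breakout with invented segment labels and data.
from collections import defaultdict

# (segment, retained) -- one pair per student in the starting cohort
students = [("military", True), ("military", True), ("military", False),
            ("civilian", True), ("civilian", False), ("civilian", False)]

totals, kept = defaultdict(int), defaultdict(int)
for segment, retained in students:
    totals[segment] += 1
    kept[segment] += retained  # bool counts as 0/1

retention_by_segment = {seg: kept[seg] / totals[seg] for seg in totals}
# Compare these rates per segment before and after a planned intervention.
```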
PRESENTATION NOTES
• Analytics is not one size fits all.
• Three major areas of analytics in HE, according to Russ:
  – Learning: the act and process of learning; curricular; best way to teach and learn.
  – Learner: demographics, behaviors; best way to support students (individually is the goal).
  – Organizational: capacity, budget, scheduling; best way to operate a college.
• Placeholders for demo sections: we can give examples from each category.
• These are commonly used to report retention (as opposed to measuring success).
• This is the strategic framework that we need to implement.
• In “Operationalize or re-create,” we didn’t abandon anything, because it always rolls back into the body of knowledge. Governance is part of this!