2. 2013-14 Task Group Goals
• The 2013-14 goals were to:
– develop professional learning opportunities for SUNY faculty and staff
– “identify and share known best practices and exemplary uses of Learning Analytics for assessment, and early intervention strategies.”
3. Task Group Activities
Fall 2013
• Presented first-year Task Group findings and best practices about Learning Analytics at several SUNY conferences.
– “Using Big Data to Enhance the Student Experience” panel presentation at “Building a Smarter University: Big Data, Innovation and Ingenuity,” October 2013.
– “Enhancing Excellence in Assessment: Institutional Effectiveness and Learning Analytics,” presented to the SUNY Council on Assessment (SCoA) at The College at Brockport, Stony Brook University, and the University at Albany.
Spring 2014
• Presented best practices about Learning Analytics through SUNY webinars:
– “Learning Analytics: Best Practices for Student Assessment”
– “Learning Analytics: Predicting Student Success in a Course”
4. Spring 2014 Pilot
• SUNY Oswego has piloted the Starfish retention system over one academic year
– Nearly 1,000 students targeted in programs with known persistence issues (freshmen and transfer students)
– 750 courses, 360 instructors, and 100 advisors involved
• Impact is being assessed by comparison with the previous performance of a similar cohort
– Scope of impact being measured (effect vs. effort to “scale up” and track all students)
6. Learning Analytics - Working Definition
• Software that collects and analyzes multiple data sets related to the process of learning to PREDICT and IMPACT student success.
Learning Analytics Task Group of FACT2
Learning Analytics Webinar: Tools & Best Practices for Student Assessment, 4/14/2014
[Diagram: data potentially collected in blended and online learning environments, and other emergent resources connected to the teaching and learning experience, includes online course assignments, social media activities, and student data.]
7. Predictive Analytics: Building Models
• Placement for success and completion
– Which students should be steered toward which courses? Which programs?
• Can advising leverage student data?
– If so, what are the best predictors of performance?
[Diagram: Collect Data → Data Analysis → Actionable Results]
8. Development of Large-Scale Approaches
• Predictive Analytics Reporting (PAR) Framework and “Data Cookbook”
9. Predictive Analytics: Building Models
• Can we identify the characteristics of a successful outcome? Of an unsuccessful outcome?
• Data sources: can the grade in a course be predicted by other data?
– Major
– High school GPA
– English placement exam score
– Math placement exam score
– HS Regents scores…
– SAT Verbal, SAT Math, SAT Writing
• Example finding: “every student with a HS average of 83 or less, did not successfully complete the course…” (a modeling sketch follows below)
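To make the idea concrete, here is a minimal sketch of such a model in Python, assuming scikit-learn and pandas; the CSV file and every column name are hypothetical placeholders, not the Task Group's actual data or tooling.

```python
# Minimal sketch: predicting course completion from prior-record data.
# The CSV file and all column names are hypothetical placeholders.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# One row per student: prior-record predictors plus the outcome to model.
records = pd.read_csv("course_outcomes.csv")  # hypothetical export
features = ["hs_gpa", "math_placement", "english_placement",
            "sat_verbal", "sat_math", "sat_writing"]
X = records[features]
y = records["completed_course"]  # 1 = successful completion, 0 = not

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Held-out evaluation: how well do prior records predict completion?
print(classification_report(y_test, model.predict(X_test)))

# Coefficients hint at which predictors carry the most weight.
for name, coef in zip(features, model.coef_[0]):
    print(f"{name}: {coef:+.3f}")
```

Logistic regression is used here only because its coefficients are easy to inspect when asking “what are the best predictors of performance?”; any classifier could stand in.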
10. Learning Outcomes
[Diagram: repeating cycle of Teaching & Learning activity → Student assignments, ending in Grading → Evaluate: outcomes met for course?]
• Typically, student assessment data is collected at the end of a course, and the data is used to report on outcomes.
• “Learning analytics is not in itself the goal but could provide a basis for decision making for effective action.”
11. Prior assessment techniques…
• Focused on course outcomes, but provided no real-time data for interventions.
12. 2014 State of the Art for Learning Analytics
• What happened? → Descriptive Analytics
• Why did it happen? → Diagnostic Analytics
• What will happen? → Predictive Analytics
• How can we improve learning? → Prescriptive Analytics
13. Learning Analytics Tools Need To:
• align with learning principles & pedagogy
• support robust data analysis
• address ethical considerations
• fit institutional capacity
Is there a thoughtful educational plan for interventions and student feedback?
14. RISKS & CONSIDERATIONS
• Ethics of data collection: have permissions been obtained?
• What is the educational plan for interventions and student feedback?
16. LATG Recommendations
• Develop a mechanism to encourage and support the adoption of Learning Analytics
– for course placements, assessment, and degree completion
– through professional education programs and enabling data access
• Identify large-scale Learning Analytics systems for predictive analytics and intervention strategies
– Expand and continue pilots already in progress for system adoption, such as Starfish.
– Identify, pilot, and adopt learning analytics assessment tools for course use.
17. Recommendations
Specifically:
• Establish an ongoing working group to develop Learning Analytics practices, tools, and support services.
• Develop best practices for campuses interested in adoption
– resource allocation, effort involved, faculty development, organizational change management
• Develop educational programs to encourage campus adoption
18. Recommendation: Data Practices
• Establish common data definitions (e.g., a shared definition of “at-risk” students; see the sketch below)
– Leverage national data-definition standards, such as the “Data Cookbook” developed by the Predictive Analytics Reporting (PAR) Framework.
• Facilitate access to data, and develop supporting policies for data access, privacy, and ethical considerations.
• Develop common indicators and measurements to assess impact across multiple campuses.
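One way to keep such a definition common across campuses is to encode it once and version it, rather than letting each campus interpret “at risk” informally. The sketch below is illustrative only: the fields and thresholds are assumptions, not an adopted SUNY or Data Cookbook standard.

```python
# Sketch of a shared, versioned "at risk" definition.
# Fields and thresholds are illustrative, not an adopted standard.
from dataclasses import dataclass

@dataclass
class StudentTermRecord:
    term_gpa: float
    credits_attempted: int
    credits_earned: int
    logins_last_30_days: int

AT_RISK_DEFINITION_VERSION = "2014.1"

def is_at_risk(s: StudentTermRecord) -> bool:
    """Common indicator: any campus applying this function to the same
    record gets the same answer, making cross-campus comparison valid."""
    completion_ratio = (s.credits_earned / s.credits_attempted
                        if s.credits_attempted else 1.0)
    return (s.term_gpa < 2.0
            or completion_ratio < 0.67
            or s.logins_last_30_days == 0)
```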
Data may potentially be collected from blended and online learning environments, online assignments, online portals, enrollment data, and other emergent resources connected to the teaching and learning experience.
Identify the factors that directly and positively influence student success and momentum towards completing a high-value postsecondary credential.
The PAR Framework is predicated on identifying factors that affect success and loss among undergraduate students, with a focus on at-risk, first-time, new, and nontraditional students. While attention had initially been paid only to online students, the sample now includes records of students from on-the-ground, blended, and online programs. Viewing normalized data through a multi-institutional lens and using complete sets of undergraduate data based on a common set of measures with common data definitions provides insights that are not available when looking at records from a single institution.
PAR Framework participation depends on having partner institutions provide a full set of de-identified undergraduate student data that allows for comparative investigation of student success trends over time, at the individual student, course, and degree level. PAR members provide incremental data updates at the end of each term/course enrollment period to measure changes over time, evaluate the impact of student success interventions, and enable the PAR predictive models to be adjusted and tuned for current data.
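As an illustration of what a de-identified yet linkable contribution could involve, the sketch below replaces student IDs with a keyed hash so records from successive terms still link to the same pseudonym; the field names and keying scheme are assumptions, not PAR’s actual specification.

```python
# Sketch: replace student IDs with a keyed hash before contributing data.
# Field names and the keying scheme are illustrative assumptions.
import hashlib
import hmac

INSTITUTION_SECRET = b"keep-this-out-of-the-shared-dataset"  # placeholder

def pseudonymize(student_id: str) -> str:
    """Stable pseudonym: the same student hashes to the same token each
    term (enabling longitudinal linkage), but the token cannot be
    reversed without the institution's secret key."""
    return hmac.new(INSTITUTION_SECRET, student_id.encode(),
                    hashlib.sha256).hexdigest()

record = {"student_id": "800123456", "course": "MAT101", "grade": "B+"}
record["student_id"] = pseudonymize(record["student_id"])
print(record)  # safe to include in the shared, multi-institution dataset
```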
Typically, student assessment data is collected at the end of a course and used to report on outcomes. Learning analytics can now enable interventions and feedback at the time students need them: it can support personalizing the student experience and customizing course materials and, if developed to be learning-centric, can better support a cycle of predicting success, assessment, and just-in-time feedback.
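A minimal sketch of that predict-assess-intervene cycle follows; all names, scores, and thresholds are hypothetical, and the model score stands in for the output of a trained predictor like the one sketched earlier.

```python
# Sketch of a just-in-time feedback loop: predict, assess, intervene.
# All thresholds, names, and data values are hypothetical.

def predicted_success(student) -> float:
    """Placeholder for a trained model's probability of course success."""
    return student["model_score"]

def weekly_check(students, week):
    for s in students:
        # Predict: flag students the model says are trending toward failure.
        at_risk = predicted_success(s) < 0.5
        # Assess: combine the prediction with current in-course evidence.
        missing_work = s["assignments_missed"] >= 2
        if at_risk or missing_work:
            # Intervene now, while the student can still act on feedback.
            print(f"Week {week}: nudge {s['name']} "
                  f"(score={s['model_score']:.2f}, "
                  f"missed={s['assignments_missed']})")

weekly_check(
    [{"name": "A. Student", "model_score": 0.42, "assignments_missed": 1},
     {"name": "B. Student", "model_score": 0.81, "assignments_missed": 3}],
    week=4)
```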
Ultimately, analysing data to improve learning requires tools that align with pedagogical principles alongside sound data analysis skills, ethical considerations, and institutional capacities that can ensure value-driven and valuable interventions.
Data collection and the tools that measure it are not neutral; instead, they reflect what we consider to be ‘good’ education, pedagogy, and assessment practices, which are all fundamental issues at the heart of learning analytics development.