This document summarizes a talk about what we are learning from implementing learning analytics (LA) in higher education. It discusses the drivers for interest in LA, perspectives from industry and research, benchmarks of current LA adoption, and emerging models. While industry rhetoric portrays LA as providing easy answers, the reality is more complex. Most universities are still in early stages of basic reporting rather than advanced applications. For LA to meet its potential and have long term impact, a process-focused model is needed that builds organizational capacity, is adaptive, and takes a broad view of LA beyond just retention.
What are we learning from learning analytics: Rhetoric to reality, ascilite 2014
1. What are we learning from
learning analytics?
Shane Dawson
Shane.dawson@unisa.edu.au
Twitter: @shaned07
2. Introduction
• Student from Shanghai-based East China Normal
University
• "Last month, you spent less on meals. Are you in
financial difficulty? If so, please contact me via
phone, text message or e-mail."
http://www.bjreview.com.cn/nation/txt/2014-06/23/content_625466.htm
3. Introduction
• The university automatically tracks students' meal-card spending.
• If spending falls below a threshold, a designated faculty member sends the student a
short message to check whether they are in financial difficulty (a minimal sketch of this
rule follows below).
http://www.bjreview.com.cn/nation/txt/2014-06/23/content_625466.htm
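As a rough illustration only (the article does not describe the university's actual system), the trigger amounts to a simple threshold rule over periodic spending data. The threshold value, data structure and notification step below are invented.

```python
# Hypothetical sketch of a meal-card spending trigger; values and names are invented.

SPENDING_THRESHOLD = 200.0  # monthly meal-card spend below which a check-in is sent

def students_to_contact(monthly_spend):
    """Return IDs of students whose spending fell below the threshold."""
    return [sid for sid, spend in monthly_spend.items() if spend < SPENDING_THRESHOLD]

# Example: one student falls below the threshold and is flagged for a check-in message
spend = {"s001": 520.0, "s002": 150.0}
for sid in students_to_contact(spend):
    print(f"Ask the designated faculty member to check in with student {sid}")
```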
4. Introduction
• Highlights the rapidly growing list of applications of student data
• Academic
• Social
• Pastoral
5. Introduction
This talk:
• What are we learning from the implementation
of LA into HE?
• What are the conversations, expectations and
reactions to this nascent field?
• What are the emerging models for institutional
implementation?
8. Drivers
• 1926 - Pressey built an instructional machine to provide multiple-choice questions
• “…with the addition of a simple attachment the apparatus will present the subject with a
piece of candy or other reward upon his making any given score for which the
experimenter may have set the device…”
Shute, V. J., & Psotka, J. (1994). Intelligent Tutoring Systems: Past, Present, and Future (No. AL/HR-TP-1994-0005). Armstrong Laboratory, Brooks AFB, TX: Human Resources Directorate.
9. Data
• Scale, access and application
• Ease of access to learner data – LMS, SIS, mobile
• Growth in adoption of technical devices
• Huge investment in analytics – industry & government
10. Learning Analytics
• Learning Analytics
• “game changer” for education
…is the collection, collation, analysis and reporting
of data about learners and their contexts, for the
purposes of understanding and optimizing
learning
12. Industry rhetoric
“Get answers to your most important questions like:
• How can I easily find students who are at-risk?
13. Industry rhetoric
“Get answers to your most important questions like:
• How can I easily find students who are at-risk?
• Yes, possible – much research in this area (a minimal sketch of the typical approach follows below)
• However, this ignores the complexity
• Context is critical
• Not all courses are alike – student diversity and approaches differ
Overstated
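A minimal sketch of the kind of at-risk model common in this research: a logistic regression over LMS activity counts. The features, training data and 0.5 risk cut-off below are invented for illustration; as the slide stresses, a real model is course- and context-specific and would need local validation.

```python
# Hypothetical at-risk sketch; features, data and the 0.5 cut-off are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Per-student features: [logins per week, forum posts, assignments submitted]
X_train = np.array([[12, 5, 3], [2, 0, 1], [8, 3, 3], [1, 1, 0], [10, 4, 2], [3, 0, 1]])
y_train = np.array([0, 1, 0, 1, 0, 1])  # 1 = did not complete the course ("at risk")

model = LogisticRegression().fit(X_train, y_train)

# Score two new (invented) students and flag those above the cut-off
X_new = np.array([[2, 1, 1], [9, 4, 3]])
for student, p in zip(["s101", "s102"], model.predict_proba(X_new)[:, 1]):
    flag = " -> flag for follow-up" if p > 0.5 else ""
    print(f"{student}: estimated risk {p:.2f}{flag}")
```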
14. Industry rhetoric
“Get answers to your most important questions like:
• Who are the most innovative instructors?”
15. Industry rhetoric
“Get answers to your most important questions like:
• Who are the most innovative instructors?”
• How and why? What defines "innovative" in this space, given the myriad of tools and
learning approaches available?
Why?
16. Industry rhetoric
“…In five years the classroom will learn you! And
personalize course work accordingly”
http://www.research.ibm.com/cognitive-computing/machine-learning-applications/decision-support-education.
shtml#fbid=MRUeQg4jzVG
17. Industry rhetoric
“…In five years the classroom will learn you! And
personalize course work accordingly”
• Currently available in:
• Cognitive Tutor, Knewton, Knowillage
• Ryan Baker – on/off task behaviour; gaming and
choice of major
Plausible
18. Industry rhetoric
“Enhance student outcomes with the ability to monitor,
evaluate, and predict learner performance to drive
retention and improve outcomes.”
• Much work in this area to predict performance; however, intervention strategies are
less well understood.
• Greater recognition of SRL (self-regulated learning)
Available but not utilised
http://www.brightspace.com/solutions/higher-education/advanced-analytics/
19. Industry rhetoric
“…predictive analytics capabilities help educators target
learning strategies and pre-emptively mentor at-risk
learners.”
http://www.brightspace.com/solutions/higher-education/advanced-analytics/
https://www.flickr.com/photos/tadeeej/3228729514/
20. Industry rhetoric
Do we need predictive analytics here?
https://www.flickr.com/photos/tadeeej/3228729514/
21. Industry rhetoric
• Unlikely – practice is difficult to change. However, the first step is to aid identification.
• Tanes et al. (2011) – Course Signals feedback
• Instructors – feedback was motivational
• Student success related to instructional feedback
Tanes et al. (2011). Using Signals for appropriate feedback: Perceptions and practice. Computers & Education, 57(4), 2414–2422.
23. Research rhetoric
What is missing: a focus on learning process
• SRL proficiency (Gasevic; Winne)
• Discourse analysis and text mining (Rose)
• Learning design and Instructional conditions
(Lockyer; Gasevic)
• Learning dispositions (Deakin Crick, Buckingham
Shum)
• Literacies or fluencies (Siemens)
• Creativity (Pei Ling Tan)
24. Research rhetoric
Great research, BUT:
• Tends to ignore the complexity of university-wide practice
• Predominantly small scale, and technology- and institution-specific
• Lacks guidance to aid further adoption
• Frequently requires high-level skills and capacities
25. Research rhetoric
Hence:
• Very few university-wide examples of LA adoption
• But obviously an area of increasing need and importance
This leads to questions about how to implement, how to get started, and what data to use.
26. Learning Analytics
National project to benchmark LA status, policy
and practices for Australian Universities
27. Benchmarking
Interviews with 39 Universities and 30 “experts”:
• Identification of current practice, methods and
approaches
• Identification of key drivers for institutions, stage
of development, process for implementation,
project leads
28. Benchmarking
Research perspective:
• Focus on understanding learning processes
• Broad range of data sets – larger size and range of data (relational data)
• Limited interest in the scalability of findings across the institution (at least not a
stated intention)
29. Benchmarking
Research perspective:
“My hope [for LA] is that we can develop a better
theory about how people learn and forge
recommendations that might nudge learners
toward more productive, more efficient, more
satisfying ways of learning”
30. Benchmarking
University leaders perspective:
• Primarily focused on retention
• “It’s [LA] a tool for improving retention”
• Limited mention of LA as a means to improve
learning
• Main driver is budget (cost savings)
• Perception that it is only related to the LMS and SIS
• Limited number of data sets considered
31. Benchmarking
University leaders perspective:
• Success is seen as staff access to information
• Limited understanding of the application of
interventions that are data informed
• Data visualisations – dashboard development is
the endpoint and goal
• Few institutions with stated LA policy and strategy
32. Benchmarking
• Widening gap between university administrators and researchers
• Administrator and industry perspectives are very similar
33. Reality
The reality is sobering:
• Need to develop a greater understanding of the role of technology and the role of data
in an institution
• Access to data does not mean a change in practice
• Interventions and early alerts must be constantly evaluated, revised and contextualised
34. Reality
2005 – Goldstein & Katz:
• Stage 1: Extraction and reporting of transaction-level data
• Stage 2: Analysis and monitoring of operational performance
• Stage 3: “What-if” decision support (such as scenario building)
• Stage 4: Predictive modeling & simulation
• Stage 5: Automatic triggers and alerts (interventions)
36. • Yanosky (2009) – 305 institutions, 58% at
stage 1, 20% at stage 2
• Bichsel (2012)
• Interest in analytics is high, but many
institutions had yet to make progress
beyond basic reporting.
37. Reality
In 2014, organisational adoption of LA is low:
• Australia is predominantly at a stage of basic reporting
• Very few institutions have an enterprise approach
• While the research is well progressed, implementation remains a challenge.
38. Reality
• Essentially, two models are emerging:
1. Solutions focused
• IT driven or
• L&T driven or
• Industry
2. Process focused
• Individual “faculty” or
• Networked and integrated
40. Reality
[Quadrant diagram: adaptability of the system to meet organisational needs (low–high) vs ease of adoption (low–high), locating the solutions-focused and process-focused models]
42. Reality
[Quadrant diagram: adaptability of the system to meet organisational needs (low–high) vs long-term impact (low–high), locating the solutions-focused and process-focused models]
43. Reality
Solutions focused – short-term gains
Advantages:
• Cost
• Speed of delivery
• Ease of dissemination
• Scalable, risk mitigation
Disadvantages:
• Locked in
• Short time for acceptance
• Lacks capacity building
• Access to data is often limited
44. Reality
Process focused – longer-term gains
Advantages:
• Capacity building
• Adaptive to changing requirements
• Acceptance of process
• Shared ownership
• Evidence based
Disadvantages:
• Time required
• Sustained leadership and principles of access
• Complexity
• Raises organisational threat
45. Reality
Common model – Solutions focused:
• IT led and implemented
• Closed system focused on scalability, performance, and a list of features
• Dashboards/reports are important
• Dissemination and access gains
[Success is seen as staff access to information]
• Where is the why?
46. Conclusion
LA sophistication model
Siemens, G., Dawson, S., & Lynch, G. (2013). Improving
the Productivity of the Higher Education Sector: Policy
and Strategy for Systems-Level Deployment of Learning
Analytics. Society for Learning Analytics Research for the
Australian Government Office for Learning and Teaching.
49. Reality
Is there an alternative?
• What are the organisational needs, and how do we gain both impact and adoption?
• How do we merge both models to gain both short- and long-term impact?
50. An alternative
Developing models:
• Cross organisation
• IT, L&T, Faculty, Research, Administrators
• Development of exemplars, research informed
• Process is future-looking and agile
• Increased time required for acceptance and
discussion
• Problem focused – understand the problem
51. An alternative
Developing models:
• Building organisational capacity
• Time for organisational acceptance
• Identify sites of interest and growth
• Research ideas promoted and faculty invited
into new spaces
• Need to act on data and findings
52. Complex adaptive system:
• Education is complex
• Learning is complex
• Organisations are complex
• CAS are systems of large numbers of agents that interact and adapt or learn
• Non-linear and resilient
53. Complex Leadership Theory:
• CAS – requires new forms of leadership
(Complex leadership theory - Uhl-Bien et al)
• Interactive, engaged, multi-level and
contextual
• Takes advantage of the dynamic capabilities of the system
• Leadership vs leaders
Uhl-Bien, M., Marion, R., & McKelvey, B. (2007). Complexity leadership theory: Shifting leadership from the industrial age to the knowledge era. The Leadership Quarterly, 18(4), 298–318.
55. Complexity Leadership:
Administrative Leadership
Adaptive Leadership
Administrative stifles
adaptive. (Bureaucratic
and top down)
However, it is driven and solution focused
56. Complexity Leadership:
Administrative Leadership
Adaptive Leadership
Adaptive (lack of
integration)
However, capacity building and innovation focused
58. Enabling:
• Leadership – focused on process and enabling staff
• Developing awareness and building capacity
• Diverse teams represented
• IT/ L&T – systems
• Data analysts
• Data wranglers
• Teaching staff
• Researchers
E.g.
• The Open University (UK)
• University of Michigan
• University of Texas
60. Conclusion
• Change in education is complex and multi-faceted
• Requires new models for implementation and
leadership
• Enabling leadership
• Models that are agile and research informed
• Requires an inter-disciplinary approach
• Embrace friction – it generates discussion and innovation
61. Conclusion
For the reality of LA to meet the rhetoric (to reach
potential):
• LA is not a technology
• LA is not a dashboard
• LA is not one individual
• LA is team based
• LA is dynamic and requires longer term
investment and process