SIRIKT Keynote: Learning Analytics: The good, the bad, or perhaps ugly?
The presentation will introduce learning analytics, setting it in the context of big data and the increasing role of technology in learning, and emphasising the role of analytics in supporting learning. Some examples will be given, highlighting the points where we have the best evidence that learning analytics is helpful. The presentation will end with some suggestions – some practical, some conceptual – for how researchers and practitioners could move forward.
Dr Bart Carlo Rienties is Reader in Learning Analytics at the Institute of Educational Technology (IET) at the Open University UK. He is Programme Director of Learning Analytics within IET and chairs the Student Experience Project Intervention and Evaluation group, which focuses on evidence-based research on interventions in 15 modules to enhance the student experience. As an educational psychologist, he conducts multi-disciplinary research on work-based and collaborative learning environments, focusing on the role of social interaction in learning, which has been published in leading academic journals and books. His primary research interests are Learning Analytics, Computer-Supported Collaborative Learning, and the role of motivation in learning. Bart is also interested in broader internationalisation aspects of higher education. He has successfully led a range of institutional, national and European projects and has received several awards for his educational innovation projects.
3. (Social) Learning Analytics
“LA is the measurement, collection, analysis and reporting of data about learners
and their contexts, for purposes of understanding and optimising learning and the
environments in which it occurs” (LAK 2011)
Social LA “focuses on how learners build knowledge together in their cultural
and social settings” (Ferguson & Buckingham Shum, 2012)
5. How can we filter the "good" from the "bad", or even ugly, analytics?
1. What evidence is there that analytics actually helps learners to reach their potential?
2. How does the Open University UK use analytics to provide support for students and teachers?
3. How can we make learning more personalised, adaptive and meaningful, and what are the implications for Moodle?
7.
1) How does the OU use LA? OU Analyse
2) Linking learning design (150+ modules) with learning analytics
3) How do students choose collaboration tools?
4) Learning analytics with 120+ variables
8. Q2 Learning Analytics at OU: OU Analyse
• 15+ modules, 20K+ students
• 4 different analytics approaches
• Based upon Moodle/SAS data warehouse
• Developed in house by the Knowledge Media Institute (Prof Zdrahal)
10. Important VLE activities
Activity codes: Forum (F), Subpage (S), Resource (R), OU_content (O), No activity (N)
Possible activity states each week are: F, FS, N, O, OF, OFS, OR, ORF, ORFS, ORS, OS, R, RF, RFS, RS, S
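The weekly activity states listed above can be derived from raw VLE logs by recording which activity types a student touched in a given week. A minimal sketch of that encoding (the event format and function name are hypothetical, not the OU's actual pipeline):

```python
# Sketch: collapse one student-week of VLE events into an activity-state
# label as used on the slides (F = Forum, S = Subpage, R = Resource,
# O = OU_content, N = No activity). Event strings are hypothetical.
CODES = {"ou_content": "O", "resource": "R", "forum": "F", "subpage": "S"}
ORDER = "ORFS"  # canonical letter order used on the slides (e.g. ORFS, RF, OS)

def weekly_state(events):
    """events: iterable of activity-type strings logged for one week."""
    letters = {CODES[e] for e in events if e in CODES}
    if not letters:
        return "N"  # no activity this week
    return "".join(c for c in ORDER if c in letters)

print(weekly_state(["forum", "resource", "ou_content"]))  # ORF
print(weekly_state([]))                                   # N
```

A sequence of such weekly labels gives the per-student trail through the activity space shown on the following slides.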
11. [Figure: activity space – weekly VLE activity states (F, FS, N, O, OF, OFS, OR, ORF, ORFS, ORS, OS, R, RF, RFS, RS, S) plotted over time from VLE opening and module start through to TMA-1, with Pass / Fail / No submit outcomes]
12. [Figure: VLE trail of a successful student through the activity space, from VLE opening and module start to TMA-1]
13. [Figure: VLE trail of a student who did not submit, from VLE opening and module start to TMA-1]
17. Four predictive models
1. Case-based reasoning (reasoning from precedents, k-Nearest Neighbours)
   A. Based on demographic data
   B. Based on VLE activities
2. Classification and Regression Trees (CART)
3. Bayes networks (naïve and full)
4. Final verdict decided by voting
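The "final verdict by voting" pattern can be sketched with off-the-shelf stand-ins: a k-NN classifier for case-based reasoning, a decision tree for CART, and a naive Bayes classifier, combined by majority vote. This is an illustrative sketch on synthetic data, not the in-house OU Analyse implementation:

```python
# Illustrative voting ensemble: k-NN (reasoning from precedents),
# CART-style decision tree, and naive Bayes vote on a binary outcome
# (e.g. "will the student submit the next assignment?"). Data is synthetic.
import numpy as np
from sklearn.ensemble import VotingClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.random((200, 4))              # e.g. weekly VLE activity counts (synthetic)
y = (X.sum(axis=1) > 2).astype(int)   # synthetic "submits" label

ensemble = VotingClassifier(
    estimators=[
        ("cbr_knn", KNeighborsClassifier(n_neighbors=5)),  # case-based reasoning
        ("cart", DecisionTreeClassifier(max_depth=4)),     # CART
        ("bayes", GaussianNB()),                           # (naive) Bayes
    ],
    voting="hard",  # majority vote decides the final verdict
)
ensemble.fit(X, y)
pred = ensemble.predict(X[:5])
print(pred.shape)  # (5,)
```

Hard voting keeps each model's verdict interpretable on its own, which matters when the prediction triggers a human intervention.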
18. Try the demo version yourself!
URL: http://analyse.kmi.open.ac.uk
Select Dashboard in the horizontal bar at the top of the screen.
Username: demo, Password: demo
This fully anonymised version does not use data from any existing OU module. Consequently, the STUDENT'S ACTIVITY RECOMMENDER (see the Student view), which refers to the module material, could not be included.
22. Q2/Q3 Learning analytics at meso level
• 157+ modules, 60K+ students
• Learning design linked to:
  a. Student experience
  b. Learning behaviour
  c. Learning performance
26. Method – data sets
• Combination of different data sets:
  • learning design data (157 modules)
  • student feedback data (51 modules)
  • VLE data (42 modules)
  • academic performance (51 modules)
• Data sets merged and cleaned
• 29,537 students undertook these modules
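The merge step above can be sketched with pandas, joining the data sets on a module identifier. All column names and values here are hypothetical placeholders, not the OU's actual schema:

```python
# Sketch: merge learning-design, student-feedback and VLE data sets on a
# module code. Left joins keep modules for which some data is missing
# (e.g. VLE data exists for only 42 of the 157 modules).
import pandas as pd

design = pd.DataFrame({"module": ["A101", "B202"], "assimilative_pct": [45, 30]})
feedback = pd.DataFrame({"module": ["A101", "B202"], "satisfaction": [81.3, 74.2]})
vle = pd.DataFrame({"module": ["A101"], "avg_weekly_clicks": [120]})

merged = (design
          .merge(feedback, on="module", how="left")
          .merge(vle, on="module", how="left"))  # B202 gets NaN for clicks
print(merged.shape)  # (2, 4)
```

In practice the cleaning step (deduplication, consistent module codes across sources) is where most of the effort goes before any analysis is possible.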
27. Method – LD process
• Mapping of modules by the OU's learning design specialists to create learning design data
• Importance of consistency in the mapping process; validated in the team and by the Faculty
• Use of seven activity categories, derived from a five-year study across eight HE institutions
31. Seven activity categories
• Assimilative – attending to information. Examples: Read, Watch, Listen, Think about, Access, Observe, Review, Study
• Finding and handling information – searching for and processing information. Examples: List, Analyse, Collate, Plot, Find, Discover, Access, Use, Gather, Order, Classify, Select, Assess, Manipulate
• Communication – discussing module-related content with at least one other person (student or tutor). Examples: Communicate, Debate, Discuss, Argue, Share, Report, Collaborate, Present, Describe, Question
• Productive – actively constructing an artefact. Examples: Create, Build, Make, Design, Construct, Contribute, Complete, Produce, Write, Draw, Refine, Compose, Synthesise, Remix
• Experiential – applying learning in a real-world setting. Examples: Practice, Apply, Mimic, Experience, Explore, Investigate, Perform, Engage
• Interactive/Adaptive – applying learning in a simulated setting. Examples: Explore, Experiment, Trial, Improve, Model, Simulate
• Assessment – all forms of assessment, whether continuous, end of module, or formative (assessment for learning). Examples: Write, Present, Report, Demonstrate, Critique
42. Student experience items: mean (M), standard deviation (SD), and correlations with activity categories 1–7 (1 Assimilative, 2 Finding info, 3 Communication, 4 Productive, 5 Experiential, 6 Interactive, 7 Assessment total)
9  Overall I am satisfied with the quality of the course (M = 81.29, SD = 14.51): .253, -.259, -.315*, -.11, .018, .135, -.034, .002
10 Overall I am satisfied with my study experience (M = 80.52, SD = 13.20): .303*, -.336*, -.333*, -.082, -.208, .137, .039, -.069
11 The module provided good value for money (M = 66.86, SD = 16.28): .312*, -.345*, -.420**, -.163, -.035, .197, .025, -.05
12 I was satisfied with the support provided by my tutor on this module (M = 83.42, SD = 13.10): .230, -.231, -.263, -.049, -.051, .189, -.065, -.1
13 Overall I am satisfied with the teaching materials on this module (M = 78.52, SD = 15.51): .291*, -.257, -.323*, -.091, -.134, .16, -.021, -.063
14 Overall I was able to keep up with the workload on this module (M = 78.75, SD = 11.75): .182, -.259, -.337*, -.006, -.274, .012, .166, -.479**
15 The learning outcomes of this module were clearly stated (M = 89.09, SD = 7.01): .287*, -.350*, -.292*, -.211, -.156, .206, .104, -.037
16 I would recommend this module to other students (M = 74.30, SD = 16.15): .204, -.285*, -.310*, -.086, -.065, .163, .052, -.036
17 The module met my expectations (M = 74.26, SD = 14.44): .267, -.311*, -.381**, -.049, -.148, .152, .032, -.041
18 I enjoyed studying this module (M = 75.40, SD = 15.49): .212, -.233, -.239, -.068, -.1, .207, -.017, .016
19 Average learning experience (M = 77.53, SD = 13.34): .277*, -.308*, -.346*, -.106, -.103, .177, .017, -.036
20 Average support and workload (M = 81.09, SD = 9.22): .277*, -.327*, -.399**, -.038, -.211, .139, .061, -.377**
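Module-level correlations like those in the table can be computed with a standard Pearson coefficient between a learning-design measure and a satisfaction item. A self-contained sketch on synthetic data (the variable names and values are illustrative, not the study's data):

```python
# Sketch: Pearson correlation between the share of time a module devotes
# to one activity category and a module-level satisfaction score.
import numpy as np

def pearson_r(x, y):
    """Pearson product-moment correlation of two equal-length sequences."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))

comm_time = [10, 25, 40, 5, 30]      # % of module time on communication (synthetic)
satisfaction = [85, 70, 60, 90, 68]  # module satisfaction scores (synthetic)

r = pearson_r(comm_time, satisfaction)
print(round(r, 2))
```

Note that with 40–51 modules as the unit of analysis, only fairly large correlations reach significance, which is why many cells in the table carry no asterisk.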
48. Dynamic interaction of synchronous and asynchronous learning
Giesbers, B., Rienties, B., Tempelaar, D. T., & Gijselaers, W. H. (2014). A dynamic analysis of the interplay between asynchronous and synchronous communication in online learning: The impact of motivation. Journal of Computer Assisted Learning, 30(1), 30-50. Impact factor: 1.632.
49. Intrinsic motivation: ↑ initial asynchronous contributions, ↑ in asynchronous and synchronous contributions
50. Introduction math/stats
• Business
• 1st-year students
• Blended
• 0-12 weeks after start of study
• Adaptive learning / Problem-Based Learning
• N = 990
52. [Figure: course timeline, Week 0 to Week 8 – diagnostic entry tests (Week 0); weekly MyMathlab and MyStatlab mastery scores, practice time and number of attempts (Weeks 1-6); Quizzes 1-3; BlackBoard LMS behaviour throughout; learning dispositions (demographic data, QMTotal, learning styles, motivation, engagement, learning emotions); final exam with maths exam and stats exam (Week 7-8)]
Tempelaar, D., Rienties, B., & Giesbers, B. (2015). In search for the most informative data for feedback generation: Learning Analytics in a data-rich context. Computers in Human Behavior. Impact factor: 2.067.
58. Using track data we can follow:
• who is struggling?
• where?
• when?
• why?
60. Who is struggling in week 3?
What can be done about this?
• (Personalised) feedback
• (Personalised) examples
• Peer support
• Emotional/learning support
61. Is data from Virtual Learning Environment systems (e.g., Blackboard, Moodle) useful for learning (analytics)? What else should we focus on to improve our understanding of social interaction?
• "Raw" VLE data does not seem very useful
• (Entry) quizzes / formative learning outcomes in combination with learning dispositions provide good early-warning systems
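The second bullet can be sketched as a simple rule: flag a student when a weak formative result coincides with an unfavourable disposition measure. The thresholds and names below are purely illustrative, not values used at the OU:

```python
# Sketch of an early-warning flag combining an entry-quiz score with a
# self-reported learning disposition (e.g. a normalised anxiety measure).
# Thresholds are hypothetical, chosen only to illustrate the combination.
def at_risk(entry_quiz_pct, disposition_score, quiz_cut=50, disp_cut=0.7):
    """Flag a student when a weak entry quiz coincides with an
    unfavourable learning disposition."""
    return entry_quiz_pct < quiz_cut and disposition_score > disp_cut

print(at_risk(42, 0.8))  # True: weak quiz result plus unfavourable disposition
print(at_risk(75, 0.8))  # False: quiz result is fine
```

In a real system such a rule would be one input among several, feeding the kind of personalised feedback and support interventions discussed on the previous slides.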
62. Implications for EURO CALL
1. What evidence is there that analytics actually helps learners to reach their potential?
• http://evidence.laceproject.eu/
2. How does the Open University UK use analytics to provide support for students and teachers?
• OU Analyse
• Information Office Model
• Predictive Z-score
• Analytics4Action
63. Implications for EURO CALL
3. How can we make learning more personalised, adaptive and meaningful, and what are the implications for Moodle?
• Need to incorporate learning design
• Individual differences? Learning dispositions?
• Emotions?
• Ethics?
64. Learning Analytics: The good,
the bad, or perhaps ugly?
@DrBartRienties
Reader in Learning Analytics
Speaker notes
5,131 students responded – a 28% response rate overall, ranging between 18% and 76%
Learning Design Team has mapped 100+ modules
For each module, the learning design team together with module chairs create activity charts of what kind of activities students are expected to do in a week.
For each module, detailed information is available about the design philosophy, support materials, etc.
Explain seven categories
This came as a surprise as LD is implemented as a unique, creative process.
Cluster analysis of 40 modules (>19k students) indicates that module teams design four different types of modules: constructivist, assessment driven, balanced, or socio-constructivist. The LAK paper by Rienties and colleagues indicates that VLE engagement is higher in modules with socio-constructivist or balanced-variety learning designs, and lower for constructivist designs. In terms of learning outcomes, students rate constructivist modules higher, and socio-constructivist modules lower. However, in terms of student retention (% of students passed), constructivist modules have lower retention, while socio-constructivist modules have higher retention. Thus, learning design strongly influences behaviour, experience and performance (and we believe we are the first to have mapped this with such a large cohort).
We have been customising data for various audiences such as VCE. This has been a year of change in this area, but we are timetabling key events looking forward so that this is all becoming more routine...