Across the globe, many institutions and organisations have high hopes that learning analytics can play a major role in helping them remain fit-for-purpose, flexible, and innovative. A broad goal of learning analytics is to apply the outcomes of analysing data gathered by monitoring and measuring the learning process. Learning analytics applications in education are expected to provide institutions with opportunities to support learner progression and, more importantly, to provide personalised, rich learning on a large scale. Substantial progress in learning analytics research has been made in the last few years.
Researchers in learning analytics use a range of advanced computational techniques (e.g., Bayesian modelling, cluster analysis, natural language processing, machine learning) for predicting which learners are likely to fail or succeed, and how to provide appropriate support in a flexible and adaptive manner.
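As a toy illustration of the prediction task these techniques address (this is not any actual model used at the Open University; the student data, function name, and threshold are invented for the sketch):

```python
# Toy sketch: flag students whose total VLE clicks fall well below the
# cohort median as "at risk". Real systems use far richer models
# (Bayesian modelling, machine learning) and many more features.
from statistics import median

def flag_at_risk(clicks_by_student, threshold=0.5):
    """clicks_by_student: dict of student id -> total VLE clicks.
    Flags students below threshold * cohort median."""
    cutoff = threshold * median(clicks_by_student.values())
    return {s for s, c in clicks_by_student.items() if c < cutoff}

cohort = {"s1": 120, "s2": 95, "s3": 10, "s4": 110, "s5": 40}
print(sorted(flag_at_risk(cohort)))  # ['s3', 's5']
```

The point of the sketch is only the shape of the task: behavioural traces in, a small set of students to support out.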
In this keynote, I will argue that unless educational researchers at EARLI embrace some of the key principles, methods, and approaches of learning analytics, they may be left behind. In particular, a main merit of learning analytics is linking large datasets of actual learning processes and outcomes with learning dispositions and learner characteristics. Using evidence-based approaches, rapid insights are developed into how learning designs and learning processes can be optimised to maximise the potential of each learner. For example, our recent research with 151 modules and 133K students at the Open University UK indicates that learning design has a strong impact on student behaviour, satisfaction, and performance. Learning analytics can also drive learning in more “traditional”, face-to-face contexts. For example, by measuring emotions, epistemological expressions, and cross-cultural dialogue, social interactions can be effectively supported by innovative dashboards and adaptive approaches. I aim to unpack the advantages and limitations of learning analytics and how EARLI researchers can embrace such data-driven research approaches.
More info at www.bartrienties.nl
The Power of Learning Analytics: Is There Still a Need for Educational Research?
1. The power of learning analytics: is there
still a need for educational research?
Tampere, Finland, 30 August 2017
EARLI 2017
@DrBartRienties
Professor of Learning Analytics
2. A special thanks to Avinash Boroowa, Shi-Min Chua, Simon Cross, Doug Clow, Chris Edwards, Rebecca Ferguson, Mark Gaved, Christothea Herodotou, Martin
Hlosta, Wayne Holmes, Garron Hillaire, Simon Knight, Nai Li, Vicky Marsh, Kevin Mayles, Jenna Mittelmeier, Vicky Murphy, Quan Nguyen, Tom Olney, Lynda
Prescott, John Richardson, Jekaterina Rogaten, Matt Schencks, Mike Sharples, Dirk Tempelaar, Belinda Tynan, Lisette Toetenel, Thomas Ullmann, Denise
Whitelock, Zdenek Zdrahal, and others…
4. “Learning analytics” mentioned 142 times in the EARLI Book of Abstracts
http://bcomposes.wordpress.com/
Most frequent words (count – word): 4548 students; 3113 education; 2751 teachers; 2575 study; 2294 research; 1698 teacher; 1522 Germany; 1437 knowledge; 1390 school; 1263 teaching; 1255 developm; 1241 group; 1239 Learning; 1124 skills; 1097 different; 1088 results; 1083 Education; 1022 methods; 1012 studies; 998 analysis
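A frequency list like the one above can be produced with a few lines of code. A minimal sketch (the sample text stands in for the EARLI Book of Abstracts, which is not bundled with these slides):

```python
# Count word occurrences in a text and return the most frequent ones.
from collections import Counter
import re

def word_frequencies(text, top_n=5):
    words = re.findall(r"[A-Za-z]+", text)
    return Counter(words).most_common(top_n)

sample = "students learn, teachers teach, students study, teachers research"
print(word_frequencies(sample, top_n=2))  # [('students', 2), ('teachers', 2)]
```

Note the sketch is case-sensitive, which mirrors the list above, where "Learning"/"Learning analytics" and "learning", or "Education" and "education", are counted separately.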
5. (Social) Learning Analytics
“LA is the measurement, collection, analysis and reporting of data about learners
and their contexts, for purposes of understanding and optimising learning and the
environments in which it occurs” (LAK 2011)
Social LA “focuses on how learners build knowledge together in their cultural
and social settings” (Ferguson & Buckingham Shum, 2012)
6. Dyckhoff, A. L., Zielke, D., Bültmann, M., Chatti, M. A., & Schroeder, U. (2012). Design and Implementation of a Learning Analytics Toolkit for Teachers. Journal of Educational Technology & Society, 15(3), 58-76.
7. Prof Paul Kirschner (OU NL)
“Learning analytics: Utopia or dystopia”, LAK 2016 conference
9. 1. Increased availability of learning data
2. Increased availability of learner data
3. Increased ubiquitous presence of technology
4. Formal and informal learning increasingly blurred
5. Increased interest of non-educationalists in understanding
learning (Educational Data Mining, for-profit companies)
6. Personalisation and flexibility as standard
10. The power of learning analytics: is there still a need
for educational research?
1. How can learning analytics empower
teachers?
2. How can learning analytics empower
students?
3. How to join us…
12. Learning Design is described as “a methodology for enabling
teachers/designers to make more informed decisions in how they go about
designing learning activities and interventions, which is pedagogically
informed and makes effective use of appropriate resources and
technologies” (Conole, 2012).
13. Type of activity and examples of activity, per category:
• Assimilative: Attending to information. Examples: Read, Watch, Listen, Think about, Access, Observe, Review, Study.
• Finding and handling information: Searching for and processing information. Examples: List, Analyse, Collate, Plot, Find, Discover, Access, Use, Gather, Order, Classify, Select, Assess, Manipulate.
• Communication: Discussing module-related content with at least one other person (student or tutor). Examples: Communicate, Debate, Discuss, Argue, Share, Report, Collaborate, Present, Describe, Question.
• Productive: Actively constructing an artefact. Examples: Create, Build, Make, Design, Construct, Contribute, Complete, Produce, Write, Draw, Refine, Compose, Synthesise, Remix.
• Experiential: Applying learning in a real-world setting. Examples: Practice, Apply, Mimic, Experience, Explore, Investigate, Perform, Engage.
• Interactive/Adaptive: Applying learning in a simulated setting. Examples: Explore, Experiment, Trial, Improve, Model, Simulate.
• Assessment: All forms of assessment, whether continuous, end of module, or formative (assessment for learning). Examples: Write, Present, Report, Demonstrate, Critique.
Conole, G. (2012). Designing for Learning in an Open World. Dordrecht: Springer.
Rienties, B., Toetenel, L., (2016). The impact of learning design on student behaviour, satisfaction and performance: a cross-institutional comparison across 151
modules. Computers in Human Behavior, 60 (2016), 333-341
Open University Learning Design Initiative (OULDI)
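The taxonomy above is essentially a lookup structure, which can be captured in a few lines. A sketch (category keys and verb lists abridged for illustration; they are not the full OULDI vocabulary):

```python
# Abridged OULDI activity taxonomy as a lookup table.
OULDI = {
    "assimilative": ["read", "watch", "listen"],
    "finding_information": ["find", "gather", "classify"],
    "communication": ["discuss", "debate", "share"],
    "productive": ["create", "build", "write"],
    "experiential": ["practice", "apply", "explore"],
    "interactive_adaptive": ["experiment", "model", "simulate"],
    "assessment": ["write", "present", "critique"],
}

def categories_for(verb):
    """Return every activity category whose verb list contains `verb`."""
    return [c for c, verbs in OULDI.items() if verb in verbs]

print(categories_for("write"))  # ['productive', 'assessment']
```

Ambiguity is part of the taxonomy itself: as the example shows, the same verb ("write", "explore") can belong to more than one category depending on context.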
14.
15. Merging big data sets
• learning design data (>300 modules mapped)
• student feedback data (>140)
• VLE data
• >140 modules aggregated individual data weekly
• >37 modules individual fine-grained data daily
• Academic Performance (>140)
• Predictive analytics data (>40)
• Data sets merged and cleaned
• 111,256 students undertook these modules
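A minimal sketch of the merge step described above: joining learning design, VLE, and performance records on a shared module code. The field names and values here are illustrative, not the actual OU data model:

```python
# Join several per-module tables (dicts keyed by module code) into one.
design  = {"M101": {"design": "socio-constructivist"},
           "M102": {"design": "assessment-driven"}}
vle     = {"M101": {"weekly_clicks": 340}, "M102": {"weekly_clicks": 120}}
results = {"M101": {"pass_rate": 0.78},   "M102": {"pass_rate": 0.64}}

def merge_on_module(*tables):
    merged = {}
    for table in tables:
        for code, fields in table.items():
            # Accumulate fields from every table under the module code.
            merged.setdefault(code, {}).update(fields)
    return merged

data = merge_on_module(design, vle, results)
print(data["M101"])
# {'design': 'socio-constructivist', 'weekly_clicks': 340, 'pass_rate': 0.78}
```

In practice this kind of join is done with a dataframe library and needs the cleaning step mentioned above (deduplication, harmonising module codes across sources) before the keys line up.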
16. Toetenel, L., Rienties, B. (2016). Analysing 157 Learning Designs using Learning Analytic approaches as a means to evaluate the impact of pedagogical
decision-making. British Journal of Educational Technology, 47(5), 981–992.
17. Nguyen, Q., Rienties, B., & Toetenel, L. (2017). Unravelling the dynamics of instructional practice: a longitudinal study on learning design and VLE activities.
Paper presented at the Proceedings of the Seventh International Learning Analytics & Knowledge Conference, Vancouver, British Columbia, Canada, pp. 168-
20. Nguyen, Q., Rienties, B., Toetenel, L., Ferguson, R., Whitelock, D. (2017). Examining the designs of computer-based assessment and its impact on student
engagement, satisfaction, and pass rates. Computers in Human Behavior. DOI: 10.1016/j.chb.2017.03.028.
24. [Diagram: across 150+ modules, weekly learning design profiles (constructivist, assessment, productive, communication, socio-constructivist) from Week 1 to Week 30 feed into VLE engagement, student satisfaction, and student retention.]
Rienties, B., Toetenel, L. (2016). The impact of learning design on student behaviour, satisfaction and performance: a cross-institutional comparison across 151 modules. Computers in Human Behavior, 60, 333-341.
Nguyen, Q., Rienties, B., Toetenel, L., Ferguson, R., Whitelock, D. (2017). Examining the designs of computer-based assessment and its impact on student engagement, satisfaction, and pass rates. Computers in Human Behavior. DOI: 10.1016/j.chb.2017.03.028.
25. So what happens when you give
learning design visualisations to
teachers?
Toetenel, L., Rienties, B. (2016). Learning Design – creative design to visualise learning activities. Open Learning: The Journal of Open and Distance Learning.
29.–32. Hlosta, M., Herrmannova, D., Zdrahal, Z., & Wolff, A. (2015). OU Analyse: analysing at-risk students at The Open University. Learning Analytics Review, 1-16.
33. So what happens when you give
learning analytics data about
students to teachers?
1. How did 240 teachers within the 10
modules make use of PLA data (OUA
predictions) and visualisations to help
students at risk?
2. To what extent was there a positive
impact on students' performance and
retention when using OUA
predictions?
3. Which factors explain teachers' uses
of OUA?
34. Usage of OUA dashboard by participating
teachers
Herodotou, C., Rienties, B., Boroowa, A., Zdrahal, Z., Hlosta, M., & Naydenova, G. (2017). Implementing predictive learning analytics on a large scale: the
teacher's perspective. Paper presented at the Proceedings of the Seventh International Learning Analytics & Knowledge Conference, Vancouver, British Columbia, Canada.
35. 1. How can learning analytics empower teachers?
Learning analytics can enhance and facilitate
teaching practice, especially within distance
learning contexts.
Teachers vary in their degree and quality of
engagement with learning analytics/design.
There is a lack of consensus about intervention strategies.
36. The power of learning analytics: is there still a need
for educational research?
1. How can learning analytics empower
teachers?
2. How can learning analytics empower
students?
3. How to join us…
37. Student Study 1 Testing theories…
1. How do students make sense of a complex problem by
searching the internet?
2. To what extent does the Internet-Specific Epistemic
Questionnaire (ISEQ) relate to actual behaviour and
cognition?
• Justification factor (internet-based knowledge claims can be accepted
without critical evaluation)
• General internet epistemology factor (ranging from a view that the internet
can give true, specific facts to a perspective that the internet is not a good source of true facts)
38. • Lab study whereby 269
students worked in dyads on
complex red yeast rice case
• We monitored which websites
they visited (and which they
did not)
• We analysed chat data and
each dyad's final answer
(advice to government)
Knight, S., Rienties, B., Littleton, K., Mitsui, M., Tempelaar, D. T., Shah, C. (2017). The relationship of (perceived) epistemic cognition to interaction with
resources on the internet. Computers in Human Behavior, 73, August 2017, 507–518
41. Student Study 2 How to provide actionable feedback?
Tempelaar, D. T., Rienties, B. Mittelmeier, J. Nguyen, Q. (2017) Student profiling in a dispositional learning analytics application using formative assessment.
Computers in Human Behavior.
42. Tempelaar, D. T., Rienties, B., & Giesbers, B. (2015). In search for the most informative data for feedback generation: Learning Analytics in a data-rich context.
Computers in Human Behavior, 47, 157-167
43. Tempelaar, D. T., Rienties, B., & Giesbers, B. (2015). In search for the most informative data for feedback generation: Learning Analytics in a data-rich context.
Computers in Human Behavior, 47, 157-167
44. Tempelaar, D. T., Rienties, B., Nguyen, Q. (2017). Towards actionable learning analytics using dispositions. IEEE Transactions on Learning Technologies, 1
(Jan-March 2017), 6-16.
45. Tempelaar, D. T., Rienties, B., Nguyen, Q. (2017). Towards actionable learning analytics using dispositions. IEEE Transactions on Learning Technologies, 1
(Jan-March 2017), 6-16.
47. 1. Increased availability of learning data
2. Increased availability of learner data
3. Increased ubiquitous presence of technology
4. Formal and informal learning increasingly blurred
5. Increased interest of non-educationalists in understanding
learning (Educational Data Mining, for-profit companies)
6. Personalisation and flexibility as standard
7. We can properly test and improve Educational Theories
50. Conclusions and moving forwards
1. Learning design and teachers strongly
influence student engagement, satisfaction
and performance
2. Visualising learning design and learning
analytics to teachers leads to more
interactive/communicative designs and
improved student retention
51. Conclusions and moving forwards
1. Learning analytics approaches can
help EARLI researchers to test and
validate big and small theoretical
questions
2. Giving students access to learning
analytics data and insight is the next frontier
52. The power of learning analytics: is there
still a need for educational research?
Tampere, Finland, 30 August 2017
EARLI 2017
@DrBartRienties
Professor of Learning Analytics
Editor's notes
Explain seven categories
For each module, the learning design team together with module chairs create activity charts of what kind of activities students are expected to do in a week.
5131 students responded – 28% overall, ranging between 18-76%
Cluster analysis of 40 modules (>19k students) indicates that module teams design four different types of modules: constructivist, assessment driven, balanced, or socio-constructivist. The LAK paper by Rienties and colleagues indicates that VLE engagement is higher in modules with socio-constructivist or balanced-variety learning designs, and lower for constructivist designs. In terms of learning outcomes, students rate constructivist modules higher, and socio-constructivist modules lower. However, in terms of student retention (% of students passed), constructivist modules have lower retention, while socio-constructivist modules have higher retention. Thus, learning design strongly influences behaviour, experience and performance (and we believe we are the first to have mapped this with such a large cohort).
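A drastically simplified stand-in for the cluster analysis described in the note above: here each module is labelled by the activity category taking the largest share of its workload (the actual study used proper cluster analysis over 40 modules; the module names and percentages below are invented):

```python
# Label each module's design by its dominant activity category.
def dominant_design(profile):
    """profile: dict of activity category -> % of workload."""
    return max(profile, key=profile.get)

modules = {
    "A": {"assimilative": 60, "communication": 10, "assessment": 30},
    "B": {"assimilative": 20, "communication": 50, "assessment": 30},
}
print({m: dominant_design(p) for m, p in modules.items()})
# {'A': 'assimilative', 'B': 'communication'}
```

Real cluster analysis groups modules by their whole workload profile rather than a single dominant category, which is how mixed types such as "balanced" emerge.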