This presentation was used by Tinne De Laet, KU Leuven, for a keynote presentation during the event http://www.educationandlearning.nl/agenda/2017-10-13-cel-innovation-room-10-learning-and-academic-analytics organised by Leiden University, Erasmus University Rotterdam, and Delft University of Technology.
The presentation presents the results of two case studies from the Erasmus+ projects STELA and ABLE, and provides 9 recommendations regarding learning analytics.
4. Who am I? Why am I here?
woman
engineer
associate professor
Head Tutorial Services
Engineering Science
KU Leuven
4
5. STELA Project
• Successful Transition from secondary to higher Education using Learning Analytics
• project partners:
• main goal: enhance a successful transition from secondary to higher
education by means of learning analytics
• The STELA project…
involves designing and building student and staff-facing analytics dashboards,
aims to develop dashboards that go beyond identifying at-risk students,
allowing actionable feedback for all students on a large scale.
STELA Project: 562167-EPP-1-2015-1-BE-EPPKA3-PI-FORWARD
www.stela-project.eu
@STELA_project
5
6. ABLE Project
• Achieving Benefits from Learning Analytics
• project partners:
• main goal: research strategies and practices for using learning
analytics to support students during their first year at university
• The ABLE project…
involves developing the technological aspects of learning analytics,
focuses on how learning analytics can be used to support students.
ABLE Project: 2015-1-BE-EPPKA3-PI-FORWARD
www.ableproject.eu
@ABLE_project_eu
6
7. STELA ♥ ABLE
7
actionable feedback
student-centered
program level
inclusive
first-year experience
institution-wide
Learning Analytics
actual implementation
8. Problem statement
transition from secondary to higher education
challenging from academic and social perspective
students have to adapt study and learning strategies, but how?
social-comparison theory: people evaluate abilities through comparison with others, when objective measures are lacking → freshman students lack a comparative framework
self-efficacy: “expectation to be successful for a specific task” ≈ situation-specific self-confidence → academic self-confidence influences student performance
feedback: considered important for improving student achievement → actionable feedback!
8
10. Learning Analytics?
“Learning analytics is
about collecting
traces that learners
leave behind and
using those traces to
improve learning.”
- Erik Duval
Learning Analytics and Educational Data Mining, Erik Duval’s Weblog, 30 January 2012, https://erikduval.wordpress.com/2012/01/30/learning-analytics-and-educational-data-mining/
10
11. Learning Dashboards?
11
Dashboard Confusion, Stephen Few, Intelligent Enterprise, March 20, 2004
“A dashboard is a visual display
of the most important
information needed to achieve
one or more objectives;
consolidated and arranged on a
single screen so the information
can be monitored at a glance.”
- Stephen Few
12. Problem statement
What we asked students…
★ confidence to be successful in the first year
★ study-related behaviour that is important to this end
Learning dashboards with actionable feedback supporting first-year students’ success
How confident are they?
What do they believe to be important?
Which feedback would they like to receive?
Confidence in and beliefs about first-year engineering student success, Tinne De Laet et al., Proceedings of the SEFI 2017 conference, Azores Islands Portugal
12
13. Problem statement
Are freshmen confident in their study success?
Does the difference in confidence comply with actual first-year study success?
(figure: students’ confidence ratings set against actual drop-out rates of 60%, 24%, and 42%)
Confidence in and beliefs about first-year engineering student success, Tinne De Laet et al., Proceedings of the SEFI 2017 conference, Azores Islands, Portugal
13
15. Problem statement
15
The transition from secondary to higher education is challenging.
Students want “actionable” feedback.
Learning Analytics?
Learning Dashboards?
17. Study advisor – student conversations
17
Student: “Should I consider another program?”
“Can I still finish the bachelor in 3 years?”
“How should I compose my program for next year?”
Study advisor: “What is the personal situation?”
“How can I help?”
“What is the best next step?”
18. Iterative design process
18
visualization experts
practitioners / end-users
learning analytics researchers
first-year study success researchers
Charleer S., Vande Moere A., Klerkx J., Verbert K., De Laet T. (2017). Learning Analytics Dashboards to Support Adviser-Student Dialogue.
In IEEE Transactions on Learning Technology (http://ieeexplore.ieee.org/document/7959628/).
20. LISSA dashboard
20
10 STEM study programs
in 3 faculties @KU Leuven
three examination periods
observations, interviews with SA,
questionnaire with students
Charleer S., Vande Moere A., Klerkx J., Verbert K., De Laet T. (2017). Learning Analytics Dashboards to Support Adviser-Student Dialogue.
In IEEE Transactions on Learning Technology (http://ieeexplore.ieee.org/document/7959628/).
21. Evaluation – observations
21
Claes, S., & Moere, A. V. (2015, June). The role of tangible interaction in exploring information on public visualization displays. In Proceedings of the 4th International
Symposium on Pervasive Displays (pp. 201-207). ACM.
15 observations
insights
(-) factual
(+) interpretative
(!) reflective
22. Evaluation – interviews
“When students see the numbers, they are
surprised, but now they believe me.
Before, I used my gut feeling, now I feel
more certain of what I say as well”.
“It’s like a main thread
guiding the
conversation.”
“I can talk about what to do with the results,
instead of each time looking for the data and
puzzling it together.”
“Students don’t know where to look during the
conversation, and avoid eye contact.
The dashboard provides them a point of focus”.
“A student changed her
study method in June and
could now see it paid off.”
LISSA supports a personal dialogue.
the level of usage depends on the experience
and style of the study advisors
fact-based evidence at the side
narrative thread
key moments and student path help to
reconstruct personal track
“I can focus on the
student’s personal
path, rather than on
the facts.”
“Now, I can blame
the dashboard and
focus on
collaboratively looking
for the next step to
take.”
22
23. Evaluation – student questionnaires
23
101 students
third examination period
Millecamp M., Charleer S., Verbert K., De Laet T. (2017). A qualitative evaluation of a learning dashboard to support advisor-student dialogues.
Submitted for LAK 2018
(response scale: strongly disagree – strongly agree)
24. Future of LISSA dashboard
24
26 programs >4500 students
@KU Leuven
pilot @Leiden University
…
26. [1] Start with data that is
already available.
• Lots of data may eventually become
available in the future …
• Start with what is available today:
• every institution has registration and
performance data,
• many of today’s learning activities already
leave digital traces behind.
26
(*)
(*) Zarraonandia, T., Aedo, I., Díaz, P., & Montero, A. (2013). An augmented lecture feedback system to support learner and teacher communication.
British Journal of Educational Technology, 44(4), 616-628.
27. [1] Start with data that is
already available.
27
data already available?
administrative (examples): student records, course grades
systems (examples): LMS access logs, advisor meetings
surveys (examples): quality assurance, LASSI
28. [2] Think beyond the obvious.
28
• Look for data in unusual places.
• Many institutions are collecting survey
data for educational research.
• Consider new combinations of data.
example: physical attendance vs LMS activity
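As a sketch of such a combination, the snippet below merges per-student attendance counts with LMS session counts; all field names and values are invented for illustration, since the real sources would be the institution's own systems.

```python
# Hypothetical sketch: combining physical attendance records with LMS
# activity logs per student. Field names are assumptions for illustration.
from collections import defaultdict

def combine(attendance_rows, lms_rows):
    """Merge per-student attendance counts with LMS session counts."""
    stats = defaultdict(lambda: {"attended": 0, "lms_sessions": 0})
    for row in attendance_rows:          # e.g. {"student": "s1", "present": 1}
        stats[row["student"]]["attended"] += row["present"]
    for row in lms_rows:                 # e.g. {"student": "s1", "logins": 3}
        stats[row["student"]]["lms_sessions"] += row["logins"]
    return dict(stats)

combined = combine(
    [{"student": "s1", "present": 1}, {"student": "s1", "present": 0}],
    [{"student": "s1", "logins": 5}],
)
# combined["s1"] now pairs both signals, so a student who is inactive in
# one channel but active in the other only stands out in the combined view.
```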
29. [3] Feedback must be actionable.
29
• many interesting correlations
gender, socio-economic status, …
• for LA, focus on actionable insights:
how can the data client act based on the data?
30. [3] Feedback must be actionable.
30
awareness
(self-)reflection
sensemaking
impact
data
questions
answers
behavior change
new meaning
Verbert K., Duval E., Klerkx J., Govaerts S., Santos J.L. (2013). Learning analytics dashboard applications. American Behavioral Scientist. Published online February 2013.
31. [3] Feedback must be actionable.
31
• learning Tracker @TU Delft
• embedded in MOOC or SPOC
• LA Process model applied:
“My online usage pattern is different from the average successful student in this course.”
“How can I adapt my behavior?”
“I don’t think I need to spend more time on the platform, but I do need to focus more on the quizzes.”
“I now learned that quizzes are a valuable learning instrument.”
Davis, D., Chen, G., Jivet, I., Hauff, C. and Houben, G.J., 2016. Encouraging
Metacognition & Self-Regulation in MOOCs through Increased Learner
Feedback. In LAL@ LAK (pp. 17-22).
32. [4] Keep Learning Analytics in
mind when designing learning
activities.
32
Learning
Analytics
Learning Design
INFORM
ENABLE
• If LA indeed contributes to improved
learning design…
• … don’t make it an afterthought
33. [4] Keep Learning Analytics in mind
when designing learning activities.
33
example data from a traditional course with “VLE as a file system”
(figure: test scores vs weekly activity in # of days, across the weeks of the year)
34. [4] Keep Learning Analytics in mind
when designing learning activities.
34
example data from a course with flipped classroom & blended learning
(figure: test scores vs activity in # of modules used)
Not a single student using less than 10 modules passed the course.
Most of the successful students used 15 modules or more.
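The kind of observation above can be checked with a few lines of analysis once module usage and pass/fail status are in one table. The sketch below does this on invented records; the student data is illustrative only, not the actual course data.

```python
# Illustrative check of module usage vs passing, on invented data.

def pass_rate(records, min_modules, max_modules=None):
    """Pass rate among students whose module usage falls in [min, max)."""
    group = [r for r in records
             if r["modules"] >= min_modules
             and (max_modules is None or r["modules"] < max_modules)]
    if not group:
        return None
    return sum(r["passed"] for r in group) / len(group)

# Hypothetical student records for illustration.
students = [
    {"modules": 4,  "passed": False},
    {"modules": 9,  "passed": False},
    {"modules": 12, "passed": True},
    {"modules": 16, "passed": True},
    {"modules": 18, "passed": True},
]

low = pass_rate(students, 0, 10)   # students using fewer than 10 modules
high = pass_rate(students, 15)     # students using 15 modules or more
```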
35. [5] Include all available
expertise.
35
• Leverage in-house expertise across
domains.
• educational scientist and practitioners
• computer scientists and IT dept
• teachers and students
• Don’t impose, include.
• Start doing this at the beginning of
the project.
36. [5] Include all available expertise.
tutorial services
PRACTICE & POLICY
research group
THEORY
36
IT Dept. (ICTS)
study advice service
Other faculties
Other faculties
participating faculties
FUNDAMENTALS
educational research
educational sciences
Tinne Greet Carolien
Tom Martijn Francesco Sven
Katrien
vice rectors
37. [6] Create a checklist to evaluate
tools and resources.
37
• increasingly ‘hot’ domain
• growing number of commercial solutions
• difficult to evaluate without framework
• ethics, privacy, data?
• one-size-fits-all?
• different context, same solution?
38. [6] Create a checklist to evaluate
tools and resources.
38
Greller, W. and Drachsler, H., 2012. Translating learning into numbers: A generic framework for learning analytics. Journal of Educational Technology & Society, 15(3), p.42.
39. [6] Create a checklist to evaluate
tools and resources.
39
Scheffel, M., Drachsler, H., Toisoul, C., Ternier, S. and Specht, M., 2017, September. The proof of the pudding: examining validity and reliability of the evaluation framework for learning analytics. In European Conference on Technology Enhanced Learning (pp. 194-208). Springer, Cham.
40. [7] Design for scalability.
40
• content scalability
• focus on program level, not single modules
• allow for adaptation
• process scalability
• design processes
• provide tools
• technical scalability
• OK to explore new approaches (CS)…
• … but also involve IT dept. and see what is
already there!
• Prefer, but don’t overfocus on open source.
41. [8] Before impact, acceptance is
required.
41
• Include stakeholders, early on.
• Demonstrate usefulness.
• Always manage ethics & privacy.
• good scenario:
students and practitioners as ambassadors
42. [8] Before impact, acceptance is
required.
42
dashboard for study adviser –
student interaction
43. [9] Collaborate and experiment to
convince management.
43
• European collaboration projects…
• … not always easy,
• … but strong catalyst.
• Foster scalability across institutions.
• Shared context facilitates
collaboration (e.g. GDPR).
45. ~ 30 LASSI questions
(shortened version)
“Learning Skills”
Example: When preparing for an
exam, I create questions that I
think might be included.
Example: I find it difficult to
maintain my concentration
while doing my coursework.
Example: I find it hard to stick
to a study schedule.
raw scores (5 scales selected out of 10):
CONCENTRATION, MOTIVATION, FAILURE ANXIETY, TEST STRATEGY, TIME MANAGEMENT
norm scores (in the Flemish HE context), examples:
STRONG, AVERAGE, LOW, VERY STRONG, VERY WEAK
Weinstein, C. E., Schulte, A. C., & Hoy, A. W. (1987). LASSI: Learning and study strategies inventory. H & H Publishing Company.
45
Metacognitive abilities
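One way such a raw-to-norm conversion could work is via the percentile rank of a student's raw score within the norm group. The cut-offs below are invented for illustration; the real norms come from the Flemish higher-education reference population mentioned on the slide.

```python
# Hypothetical sketch of raw-to-norm conversion for one LASSI scale.
# Percentile cut-offs are assumptions, not the published Flemish norms.
import bisect

CUTOFFS = [10, 30, 70, 90]  # percentile boundaries between the categories
LABELS = ["VERY WEAK", "LOW", "AVERAGE", "STRONG", "VERY STRONG"]

def norm_category(raw_score, norm_scores):
    """Map a raw scale score to a norm label via its percentile rank."""
    rank = sum(s <= raw_score for s in norm_scores)
    percentile = 100 * rank / len(norm_scores)
    return LABELS[bisect.bisect_left(CUTOFFS, percentile)]
```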
46. Learning Skills Dashboard
46
1406 students in 12 STEM programs
in 4 faculties @KU Leuven
1137 (80%) used the dashboard
all actions (or lack thereof)
monitored
47. Feedback model
1. What is this about?
2. How are you doing?
3. How this relates to others.
4. Why this is relevant.
5. What you can do about it.
47
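The five-step model can be sketched as a small template pipeline that turns a student's scale result into one feedback line per step. All texts and the student record below are invented for illustration.

```python
# Minimal sketch of the five-step feedback model applied to one LASSI
# scale. Texts and the example student record are hypothetical.

FEEDBACK_STEPS = [
    ("What is this about?", "Your score on the {scale} scale of the LASSI questionnaire."),
    ("How are you doing?", "Your raw score is {raw}."),
    ("How this relates to others.", "Compared to your peers this is {norm}."),
    ("Why this is relevant.", "{scale} is linked to first-year study success."),
    ("What you can do about it.", "{tip}"),
]

def render_feedback(student):
    """Produce one feedback line per step of the model."""
    return [f"{q} {a.format(**student)}" for q, a in FEEDBACK_STEPS]

lines = render_feedback({
    "scale": "Concentration",
    "raw": 17,
    "norm": "AVERAGE",
    "tip": "Try studying in short, distraction-free blocks.",
})
```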
52. Students’ feedback?
Broos, T., Peeters, L., Verbert, K., Van Soom, C., Langie, G., & De Laet, T. (2017, July). Dashboard for Actionable Feedback on Learning Skills: Scalability and Usefulness.
In International Conference on Learning and Collaboration Technologies (pp. 229-241). Springer, Cham.
52
53. Higher learning skills scores?
Broos, T., Peeters, L., Verbert, K., Van Soom, C., Langie, G., & De Laet, T. (2017, July). Dashboard for Actionable Feedback on Learning Skills: Scalability and Usefulness.
In International Conference on Learning and Collaboration Technologies (pp. 229-241). Springer, Cham.
53
Students with higher learning skills scores were more likely to use the dashboard.
54. Lower learning skills scores?
Broos, T., Peeters, L., Verbert, K., Van Soom, C., Langie, G., & De Laet, T. (2017, July). Dashboard for Actionable Feedback on Learning Skills: Scalability and Usefulness.
In International Conference on Learning and Collaboration Technologies (pp. 229-241). Springer, Cham.
54
Students with lower learning skills scores who used the dashboard did so more intensively.
55. Future of LASSI dashboard
55
26 programs >4500 students
@KU Leuven
pilot @TU Delft
…
56. Summary
56
two case studies 9 recommendations
[1] Start with data that is already available.
[2] Think beyond the obvious.
[3] Feedback must be actionable.
[4] Keep Learning Analytics in mind when designing
learning activities.
[5] Include all available expertise.
[6] Create a checklist to evaluate tools and resources.
[7] Design for scalability.
[8] Before impact, acceptance is required.
[9] Collaborate and experiment to convince management.
humble approach
small data
involvement of stakeholders, especially practitioners
actionable feedback
scalability
traditional university settings
Is this learning analytics?
58. Project team @
58
Sven Charleer
AugmentHCI, Computer Science department
PhD researcher ABLE
Katrien Verbert
AugmentHCI, Computer Science department
Copromotor of STELA & ABLE
Carolien Van Soom
Leuven Engineering and Science Education Center
Head of Tutorial Services of Science
Copromotor of STELA & ABLE
Greet Langie
Leuven Engineering and Science Education Center
Vice-dean (education), Faculty of Engineering Technology
Copromotor of STELA & ABLE
Tinne De Laet
Leuven Engineering and Science Education Center
Head of Tutorial Services of Engineering Science
Coordinator of STELA
KU Leuven coordinator of ABLE
Francisco Gutiérrez
AugmentHCI, Computer Science department
PhD researcher ABLE
Tom Broos
Leuven Engineering and Science Education Center
AugmentHCI, Computer Science department
PhD researcher STELA
Martijn Millecamp
AugmentHCI, Computer Science department
PhD researcher ABLE
Special thanks to the involved stakeholders for their inspiration, collaboration, and support!
Jasper, Bart, Riet, Hilde, An, …
♥