3. Learning analytics in Australia
(non-comprehensive list of LA research clusters)
+ quite a few more researchers in various universities/organisations
Some thoughts:
HE and schools are very interested in LA
Learning Analytics SIG (ASCILITE)
ALASI – Australasian Learning Analytics Summer Institute (December)
There is at least one PhD program in LA
Institutional and industry funding
LA as a Field of Research under review by the ARC
5. Learning analytics offers great promise in providing evidence for teachers and learners to make informed decisions and transform the educational experience
6. Learning Analytics (definition back in 2011): the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs.
1st International Conference on Learning Analytics and Knowledge, Banff, Alberta, February 27–March 1, 2011
7. “The potential of learning analytics is arguably far more
significant than as an enabler of data-intensive educational
research, exciting as this is. The new possibility is that
educators and learners — the stakeholders who constitute
the learning system studied for so long by researchers — are
for the first time able to see their own processes and
progress rendered in ways that until now were the preserve
of researchers outside the system.”
Knight, S. and Buckingham Shum, S. (2017). Theory and Learning Analytics. The Handbook of Learning Analytics.
What makes Learning Analytics unique?
8. The Learning
Analytics loop
Clow, D. (2012). The learning analytics cycle: closing the loop effectively.
In Proc. LAK’12.
DEFINITIONS
Learning Analytics:
“…the process of developing actionable
insights…” (Cooper, 2012, p. 3).
Actionable insight:
“data that allows a corrective procedure,
or feedback loop, to be established for a
set of actions” (Jørnø & Gynther, 2018).
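Read together, these definitions cast LA as a feedback loop: measure, derive an actionable insight, act, then measure again. A minimal illustrative sketch of that loop in Python (the login metric, the threshold, and the reminder intervention are invented for this example, not part of Clow's or Cooper's work):

```python
# Illustrative sketch of a learning-analytics feedback loop:
# measure -> derive an actionable insight -> intervene -> measure again.
# The metric (count of logged events) and the intervention are invented examples.

def measure(activity_log):
    """Collect and summarise learner data (here: a simple count of events)."""
    return len(activity_log)

def actionable_insight(event_count, threshold=3):
    """An insight is 'actionable' if it points to a corrective step."""
    return "send_reminder" if event_count < threshold else "no_action"

def close_the_loop(activity_log):
    """One pass through the cycle; in practice this repeats each week."""
    metric = measure(activity_log)
    return actionable_insight(metric)

print(close_the_loop(["login", "quiz"]))  # few events -> send_reminder
print(close_the_loop(["login"] * 5))      # active learner -> no_action
```

The point of the sketch is only that the "loop" closes: the output of analysis feeds an action whose effect is measured on the next pass.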
11. Limitations of most current LA systems are also coming under critical scrutiny:
students face difficulties in interpreting and acting upon data to improve learning (Jivet et al., 2018; Matcha et al., 2019), and the same applies to teachers (Mangaroska & Giannakos, 2018)
poor design decisions in LA – or design decisions not reported (Gibson & Martinez-Maldonado, 2017)
lack of involvement of teachers and learners in the design of LA tools (Buckingham Shum, 2019; Holstein et al., 2017, 2018; Wise & Jung, 2019)
13. This confirms Biesta’s concern about evidence-based teaching practice:
“… the role that evidence can play should be subordinated to the values that constitute the educational practice.”
14. Gašević, D., Kovanović, V., & Joksimović, S. (2017). Piecing the learning analytics puzzle: a consolidated model of a field of
research and practice. Learning: Research and Practice, 3(1), 63-78.
The importance of Design
19. Martinez-Maldonado, R., Pardo, A., Mirriahi, N., Yacef, K., Kay, J. and Clayphan, A. (2016). LATUX: An iterative workflow for designing, validating and deploying learning analytics visualisations. Journal of Learning Analytics, 2(3)
Emerging interest in human-centred learning analytics
21. Human-centredness has been identified in other fields as a characteristic of systems that have been carefully designed by:
identifying the critical stakeholders,
their relationships, and
the contexts in which those systems will function.
Buckingham Shum, S., Ferguson, R. and Martinez-Maldonado, R. (2019). Human-Centred Learning Analytics.
Journal of Learning Analytics, JLA, 6(2)
22. Human-centred (not centric)
All the human factors,
social factors and
technology factors
interact together under the
human activity umbrella.
23. “Human-centred design is concerned less with assuring
that artifacts work as intended (by their producers, designers, or
other cultural authorities) than with enabling many individual or
cultural conceptions to unfold into uninterrupted interfaces
with technology.” Klaus Krippendorff
Krippendorff, K. (2004). Intrinsic motivation and human-centred design. Theoretical Issues in Ergonomics Science, 5(1), 43-72.
28. Simplified design process
Sanders, E. B. N., & Stappers, P. J. (2008). Co-creation and the new landscapes of design. Co-design, 4(1), 5-18.
29. Sanders, E. B. N., & Stappers, P. J. (2008). Co-creation and the new landscapes of design. Co-design, 4(1), 5-18.
User-centred design Co-creation (co-design)
User
Researcher
Designer
33. Sanders, E. B. N., & Stappers, P. J. (2008). Co-creation and the new landscapes of design. Co-design, 4(1), 5-18.
What a co-design approach for Learning Analytics
should ideally look like
36. Is there a working definition of
human-centred learning analytics?
37. “LA may be 10 years old but HCLA is a Toddler”
38. 5 critical current take-home messages in HCLA
1- Learning is complex
The more we embrace this complexity the
more we will realise the challenges of
building effective LA interfaces.
For example: Teachers have many more needs than
data needs
39. 5 critical current take-home messages in HCLA
2- Many current LA tools and dashboards
require a level of data literacy
From a HCD perspective, it is more sensible to change the tool to suit teachers’ and learners’ needs than to train them to suit the LA tools.
40. 5 critical current take-home messages in HCLA
3- Non-data experts are unlikely to be aware of the implications of LA design choices.
This suggests the need for new methods particularly tailored to engaging teachers and students in the design of data-intensive and pedagogically meaningful LA innovations.
Teachers and students should be considered non-data experts
HE STEM students find it hard to interpret charts and data visualisations…
Maltese, A. V., Harsh, J. A., & Svetina, D. (2015). Data visualization literacy: Investigating data interpretation along the novice-expert continuum. Journal of College Science Teaching, 45(1).
42. 5 critical current take-home messages in HCLA
4- Involving stakeholders may initially be perceived as difficult, time-consuming and expensive.
Yet, this can make the key difference between an unsuccessful prototype and a system that is effective and successfully adopted
43. Simplified OrLA framework
Prieto, L. P., Rodríguez-Triana, M. J., Martínez-Maldonado, R., Dimitriadis, Y., & Gašević, D. (2019). Orchestrating
learning analytics (OrLA): Supporting inter-stakeholder communication about adoption of learning analytics at the
classroom level. Australasian Journal of Educational Technology.
44. 5 critical current take-home messages in HCLA
5- Ethics is a burning topic in LA.
HCD has the potential to shift LA from something imposed on teachers and learners to something done with them.
45. “I would say it more straightforwardly:
automated adaptation is antinomical to an emancipatory perspective.
Learners are not to be seen as passive beneficiaries of a superior
control entity. With respect to software adaptations, if Learning
Analytics has to play a role, it should be limited to one of awareness
and recommendation.”
An emancipatory perspective for Learning Analytics?
Tchounikine, Pierre. (2019). Learners’ agency and CSCL technologies: towards an emancipatory perspective. International
Journal of Computer-Supported Collaborative Learning, 14(2).
46. Papers in this special section
were mostly focused on teachers
51. This paper provides guidance on developing new learning analytics applications that more holistically address values pertinent to stakeholders, educators, and society, such as:
Fairness
Autonomy
Social well-being
Freedom from bias
Self-image
Ease of information seeking
52. Learner-data journey mapping
Prieto-Alvarez, C. G., Martinez-Maldonado, R., & Buckingham Shum, S. (2018, December). Mapping learner-data journeys: Evolution of a visual co-design tool. In Proc. OzCHI’18.
53. LA-Deck: LA design cards
Prieto-Alvarez, C. G., Martinez-Maldonado, R., & Buckingham Shum, S. (2020, March). LA-DECK: A card-based learning analytics co-design tool. In Proc. LAK’20.
56. HCLA is a burning topic!
https://youtu.be/VF1IkC6DJl8
57. Much more involvement of students!!
Two recommendations for considering Students as partners in LA:
1- Training and professional development for staff.
2- Think of our institutions as communities, and of the role of data in such communities
61. “What aspects of the classroom, or of the learning activity happening in the classroom, should be made more visible?”
The LATEP process
Prestigiacomo et al. (2020). Learning-centred translucence: An approach to understand how teachers talk about classroom data. Proc. LAK’20.
62. The LATEP process
1- Generative phase: enable educational stakeholders to externalise their ideas without constraints, using ‘tools for dreaming’
What information is currently available (visible)?
How should the information be made available?
What kind of metrics would be interesting to see?
Sanders, E. B. N. (2000). Generative tools for co-designing. In S. A. R. Scrivener, L. J. Ball & A. Woodcock (Eds.), Collaborative Design. Springer, London, UK, 3-12.
63. 1- Generative phase.
2- Norming and prioritising phase. A converging phase is needed which
includes
norming (assessing and categorising); and
prioritising (ranking) the most critical ideas.
Osborn, A. F. (1953). Applied Imagination: Principles and Procedures of Creative Problem Solving. New York: Charles Scribner’s Sons.
E. De Bono. (2017). Six thinking hats. London, UK: Penguin.
The LATEP process
64. 1- Generative phase.
2- Norming and prioritising phase.
3- Translucence elicitation phase.
Which information is needed to create well-informed understanding (awareness)?
What information would be useful to have before/during/after the learning activity?
Which information is needed for every role to be held accountable for actions?
Who should the data be made available to among the different roles (e.g. novice teachers, students)?
Martinez-Maldonado, R. R., Elliott, D., Axisa, C., Power, P., Echeverria, E. and Buckingham Shum, S. (2020).
Designing translucent learning analytics with teachers: an elicitation process. Interactive Learning Environments
The LATEP process
Hassan Khosravi
Linda Corrin
Sara Howard
Grace Lynch
The ASCILITE Learning Analytics Special Interest Group (LA-SIG) Hazel Jones, Casandra Colvin and Linda Corrin.
In the past two decades there has been a growing interest in developing evidence-based practices to help teachers decide on appropriate teaching strategies, monitor student progress effectively, and evaluate teaching effectiveness
Not so different to EDM, AIED and ITS…
What makes Learning Analytics different to other data-intensive educational R&D communities?
Human-Centred Design – Definition
UCD – PD – VSD (user-centred design – participatory design – value-sensitive design)
For example: Teachers have many more needs than data needs
LINK WITH ADOPTION
Learning is a complex process that cannot be observed directly. It has cognitive, metacognitive, affective, and social aspects that are sensitive to context. The more we embrace this complexity, the more we become aware of the difficulties inherent in the creation of effective interfaces that communicate educational insights.
• Many current LA tools and dashboards require a level of digital literacy that has not yet been acquired by the majority of stakeholders. One way of dealing with this is to call for more training in this area. From a human-centred design perspective, though, it is more sensible to change the tools to suit their users, rather than changing the users to suit the tools. This perspective could shift the focus away from providing users with data to interpret, and toward providing them with answers to the questions they are asking. Drawing on the prescient insights of Engelbart (1963), there will be a co-evolution of human and machine capabilities, methodologies, and language.
• Non-experts are unlikely to be aware of the implications of different design choices, the potential of different analytic techniques, and constraints on implementation. New methods and combinations of methods, like those proposed by Holstein et al. (2019), are needed to involve them meaningfully within the design process.
• Involving stakeholders may be perceived as difficult, time-consuming, and expensive. Nevertheless, involving them throughout the design process can make the difference between an unsuccessful prototype and a system that is taken up successfully.
• In terms of the ethics of LA and growing concerns about the misuse of data, human-centred design has the potential to shift LA from something done to learners toward something done with learners. This ethical perspective is one that could be more widely taken up and highlighted.
The LAFP engages faculty in the scholarship of student success. The engagement process begins with an annual call for proposals and a campus event to explain the goals and set the stage for the program. At that information session potential LA Fellows have the opportunity to view existing data and consider their own questions about student success. Following the event, faculty submit a proposal outlining their project’s goals and intended outcomes. Fellows with accepted proposals attend a kick-off event prior to meeting with the Bloomington Assessment and Research (BAR) staff to discuss their projects and to develop a research strategy. As LA Fellows, they uncover answers to their questions, work with colleagues in other disciplines who are part of the program, and share results at an annual event and at our annual LA Summit. Often their initial answers provoke more questions for deeper analysis (Figure 1).
Figure 1. Iterative cycle of faculty engagement.
CLASS facilitates the program, selecting the proposals that will receive funding, holding monthly meetings, sponsoring an annual campus-wide showcase, and hosting an annual celebration attended by the Provost and Vice Provost of our campus. BAR provides support to the faculty, offering useable student records data, various data dashboards, and statistical analysis to those who request it. All aspects of the work are discussed with the BAR staff, including the availability of data, how data will be analyzed, and the skill sets of the researcher. For some, this initial conversation is the beginning of a close partnership with BAR, while other Fellows opt to work independently, only returning to BAR with specific questions or data needs (Rehrey et al., 2018). The Office of the Vice-Provost for Undergraduate Education (OVPUE) is also part of the community, providing funding and important high-level visibility to the faculty projects (Figure 2).
Figure 2. Structure of the LA Fellows Program.
Data is provided to each faculty member who meets the research requirements of their project. The typical data made available include the following: a longitudinal view of student progression from admission to graduation, a historical record
[Figure 2 labels: LA Fellows, CLASS, OVPUE, BAR]
Norming can be achieved by finding ideas that are related, defining categories and identifying outliers.
Prioritising can be achieved through techniques such as voting, ranking or SWOT analysis.
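The prioritising step described above lends itself to lightweight tooling. As a purely hypothetical sketch (the idea names and votes are invented, not drawn from LATEP materials), prioritising by voting reduces to a ranked tally:

```python
# Hypothetical sketch of the prioritising step: participants vote on the
# ideas produced in the generative phase, and ideas are ranked by vote count.
from collections import Counter

# Invented example votes cast by workshop participants.
votes = ["show attention over time", "flag quiet students",
         "show attention over time", "map group talk",
         "show attention over time", "flag quiet students"]

def prioritise(votes, top_n=2):
    """Rank ideas by vote count and keep the most critical ones."""
    return [idea for idea, _ in Counter(votes).most_common(top_n)]

print(prioritise(votes))  # ['show attention over time', 'flag quiet students']
```

The same tally could feed a norming step first (grouping related ideas into categories) before votes are counted per category rather than per idea.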