1. Towards Institutional Adoption of
Learning Analytics
Yi-Shan Tsai
The University of Edinburgh
yi-shan.tsai@ed.ac.uk
@yi_shan_tsai
CRLI seminar series
University of Sydney
3 March 2018
2. Learning Analytics at the University of Edinburgh
http://www.ed.ac.uk/information-services/learning-technology/learning-analytics
3. Early MOOC Analytics
• 6 courses:
– Artificial Intelligence Planning
– Astrobiology
– Critical Thinking in Global Challenges
– E-Learning and Digital Cultures
– Equine Nutrition
– Introduction to Philosophy
• August 2013 – April 2014
Detail on course design: MOOCs @ Edinburgh 2013: Report #1
(http://hdl.handle.net/1842/6683)
13. Learning Analytics Report Card (LARC)
http://larc-project.com
• Involve students in critical conversations around the use of their data for computational analysis in education.
18. Learning Analytics Policy and Governance
• Task Group (reporting to Senate Learning and Teaching, and Knowledge Strategy Committees)
• Governance group:
– Convenor: a senior academic member of staff with expertise in Learning Analytics
– The Assistant Principal with strategic responsibility for Learning Analytics
– A student representative
– The University’s Data Protection Officer
– Representatives from relevant service units (University Secretary’s Group and Information Services Group)
– A member of academic staff with expertise in research ethics.
19. Statement of Principles
1. LA will not be used to inform significant action at an individual level without human intervention.
2. We will use LA to benefit all students in reaching their full academic potential.
3. We will be transparent about data collection, sharing, consent and responsibilities.
4. We will actively work to recognise and minimise any potential negative impacts from LA.
5. We will abide by ethical principles and align with organisational strategy, policy and values.
6. LA will be supported by focused staff and student development activities.
7. LA will not be used to monitor staff performance.
https://www.ed.ac.uk/files/atoms/files/learninganalyticsprinciples.pdf
21. Lessons Learned
• Built capacity and understanding
• No one size fits all
• Retention focus is of limited value at Edinburgh
• Market does not provide
• Data protection, security, FOI all take more time
• Data validation takes time
• Learning analytics does not fit neatly into the organisation
• Our data are not always easy to work with
22. Next steps
• GDPR – detailed policy and governance group
• Capacity building
• Course design / feedback at scale
25. Objectives
• The state of the art
• Direct engagement with key stakeholders
• A comprehensive policy framework
http://sheilaproject.eu/
26. Slide credit: Dragan Gašević (2017) Let’s get there! Towards policy for adoption of learning analytics. LSAC, Amsterdam, The Netherlands.
http://sheilaproject.eu/
27. The state of the art
Challenges, adoption and strategy
http://sheilaproject.eu/
28. Adoption challenges
1. Leadership for strategic implementation & monitoring
2. Equal engagement with stakeholders
3. Pedagogy-based approaches to removing learning barriers
4. Training to cultivate data literacy among primary stakeholders
5. Evidence of impact
6. Context-based policies to address privacy & ethics issues and other challenges
Tsai, Y. S., & Gašević, D. (2017). Learning analytics in higher education – challenges and policies: a review of eight learning analytics policies. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference (pp. 233–242).
http://sheilaproject.eu/
29. LA adoption in Europe
• Institutional interviews: 16 countries, 51 HEIs, 64 interviews, 78 participants
The adoption of learning analytics (interviews):
– Implemented: 21 (9 institution-wide, 7 partial/pilots, 5 data exploration/cleaning)
– In preparation: 12
– No plans: 18
http://sheilaproject.eu/
30. LA adoption in Europe
• Institutional survey: 22 countries
The adoption of LA (survey):
– Implemented: 15 (2 institution-wide, 13 small scale)
– In preparation: 15
– No plans: 16
http://sheilaproject.eu/
31. LA strategy
• Many institutions have no defined LA strategy
• Where supported centrally, LA typically sits under wider digitisation strategies or teaching & learning strategies
• Plans for monitoring & evaluation remain immature
http://sheilaproject.eu/
33. Essential features of an LA policy…
CHALLENGES
Experts’ perspectives
http://sheilaproject.eu/
34. Interests – senior managers
• To improve student learning performance (16%)
• To improve student satisfaction (13%)
• To improve teaching excellence (13%)
• To improve student retention (11%)
• To explore what learning analytics can do for our institution/staff/students (10%)
Internal drivers of LA: learner driver, teaching driver, institutional driver
http://sheilaproject.eu/
35. Concerns – senior managers
• No one-size-fits-all solutions
• Pressure to adopt LA
• How can the institution as a whole benefit from LA?
• The strictness of existing data protection regulations makes adoption more difficult.
http://sheilaproject.eu/
36. Interests – teaching staff
• Pedagogical interests:
– Know how students engage with learning content
– Improve the design and provision of learning materials, curriculum, and support to students.
37. Concerns– teaching staff
Student-centred concerns:
• Profiling students & unequal support
• Privacy & autonomy
• Demotivation & anxiety
• Behaviour alteration
38. Concerns– teaching staff
Teacher-centred concerns:
• Time pressure
• Performance judgement
• Teaching professionalism is disrespected
• Managing expectations
39. Concerns– teaching staff
LA-centred concerns:
• Differences among individual students/teachers/courses/subjects/disciplines/faculties
• Interpretations of learning (data collection, analysis & analytics interpretation)
• Damaging teacher-student relationships
• LA capabilities
40. Interests – students
Personalised support:
• Inform teaching support and curriculum design.
• Support a widening access policy.
• Support students at all achievement levels to improve learning.
• Assist with transitions from pre-tertiary education to higher education, and from higher education to employment.
http://sheilaproject.eu/
41. Concerns– students
• Surveillance
• Stereotypes and biases
• Limitations in quantifying learning
• Worries about human contact and teaching professionalism being replaced by machines
http://sheilaproject.eu/
42. Concerns– students
Legitimate or illegitimate?
• Purpose
• Anonymity
• Access
Privacy paradox → transparency & effective communication
http://sheilaproject.eu/
52. In what way might learning analytics be useful to you or your students?
53. Would you have any concerns about using learning analytics in your daily teaching practice?
54. Should we give students access to their analytics if it could potentially demotivate them?
55. What are the pros and cons of predictive modeling?
Speaker notes
Use one word to describe your experience or impression of learning analytics:
https://www.mentimeter.com/app
What data do we have?
Can we identify patterns of student behaviours?
Tutors very central on one; participants more central on the other
Usability of data is low: data is very ‘raw’ and requires a lot of processing.
Effort and skills required can be significant: define questions, make pragmatic decisions, foster an open/sharing culture.
Platforms are still maturing: be prepared for change and re-work; comparisons of data from different platforms could be hard.
Experience can be re-used: experience and approaches may be useful when considering work with on-campus platforms.
We took an experimental approach.
Online MSc programmes have smaller cohorts of students. LA doesn’t render new information. Retention isn’t an issue at UoE.
Skills – Interactions with analytics as part of the University learning experience can help our students build 'digital savviness' and prompt more critical reflection on how data about them is being used more generally, what consent might actually mean and how algorithms work across datasets to define and profile individuals.
Partner organisations:
The University of Edinburgh, UK
Universidad Carlos III de Madrid, Spain
Open University of the Netherlands, Netherlands
Tallinn University, Estonia
Erasmus Student Network aisbl (ESN), international
European Association for Quality Assurance in Higher Education, international
Brussels Educational Services, international
Challenge 5 has also been identified in Ferguson, R., & Clow, D. (2017). Where is the evidence? A call to action for learning analytics.
16 countries – UK (21), Spain (11), Estonia (3), Ireland (2), Italy (2), Portugal (2), Austria (1), Croatia (1), Czech Republic (1), Finland (1), France (1), Latvia (1), Netherlands (1), Norway (1), Romania (1), and Switzerland (1)
21 out of 51 institutions were already implementing centrally-supported learning analytics projects.
25 institutions have established formal working groups, but not all institutions have planned to provide analytics data to students.
In many cases where LA was supported centrally, it was initiated under wider digitalisation strategies or teaching and learning strategies. However, a great number of institutions had not defined clear strategies for learning analytics and were still at the ‘experimental’ or ‘exploratory’ stage.
The importance scale suggests priorities: (1) privacy & ethics (safeguard); (2) management and goals; (3) data management & analysis
The ratings of these statements show a clear drop at the ‘ease of implementation’ level compared to the ‘importance’ level. One implication is that the six features could prove to be challenges to deal with in order to scale up the adoption of LA.
It is also interesting that actions related to privacy & transparency are considered the easiest to implement.
The interviews identified three common internal drivers for the adoption of learning analytics:
Learner driver: to encourage students to take responsibility for their own studies by providing data-based information or guidance.
Teaching driver: to identify learning problems, improve teaching delivery, and allow timely, evidence-based support.
Institutional driver: to inform strategic plans, manage resources, and improve institutional performance, such as retention rates and student satisfaction.
An equivalent question (multiple choice) in the survey provided 11 options for motivations specific to learning and teaching. The results identified the five top drivers.
How can the institution as a whole benefit from LA?
No one-size-fits-all solutions: needs vary by institution, but existing solutions focus on addressing retention problems; LA should not be used as a deficit model; there are also differences among subjects and faculties.
Other concerns:
Uncertainty about the benefits of LA: fear of failing expectations
Pressure to adopt LA
The strictness of existing data protection regulations makes adoption more difficult.
Teaching staff and student feedback comes from UoE only.
Time pressure: no time for training (LA tools need to be intuitive); information overload (no time to process information); no time for support (LA needs to save teachers time on mundane routines so as to free up time for more personalised support to students).
Performance judgement: LA used by HR; course evaluation discourages innovations
Teaching professionalism is disrespected: trust issue – teachers feel that the institution does not trust them to make professional decisions
Managing expectations: managers’ expectations of what LA can do; students’ expectations of what LA can do; teachers’ responsibility to meet the expectations of managers and students.
Differences among… : No one size fits all
Interpretations of learning: individual differences, lack of qualitative data, off-line learning, causal relationships between data and learning (engagement)
Damaging teacher-student relationships: when misinterpreting learning or not having proper conversations with students (inviting them to interpret the analytics results about them)
LA capabilities: precision of prediction (e.g., identify optimal pathway for learning), supporting all students (disengaged students do not respond: it is important to recognise that while LA is meant to support all students, realistically some students are not reachable).
Inform teaching support and curriculum design so that no one is falling behind or having to learn the same materials repetitively.
Support a widening access policy – at a class level.
Support students at all achievement levels to improve learning by providing them a better overview of their own learning progress.
Other concerns:
Limitations in quantifying learning
Worries about human contacts and teaching professionalism being replaced by machines
GDPR Article 6, “Lawfulness of processing”, provides alternatives to consent (Article 7) by allowing institutions to process personal data when processing is necessary for ‘legitimate interests’ or to carry out tasks in the ‘public interest’.
Three purposes:
to comply with legal requirements, such as visas;
to improve educational services, such as learning support, teaching delivery, career development, educational resources management, and the support of student well-being;
to improve the overall performance of the university, such as league rankings, equality, and the recruitment of future students.
Anonymity: okay with personal tutors. Not okay with tutors who may be involved in marking student performance.
Access: extreme distrust in 3rd parties for the fear of becoming marketing targets.
Although the participants had strong views about protecting their privacy and expectations about how their data should be used, they did not feel that they had sufficient understanding of existing data practice to critically question its legitimacy – the privacy paradox. This phenomenon suggests that institutions need to scale up transparency and effective communication with students.
Managers make the decision to invest in LA, so there is uncertainty about whether the return on investment will be worthwhile.
Teaching staff are the ones expected to use LA to support teaching and learning, so there are worries about the possible implications of that expectation.
Students want personalised support, but they are the primary data subjects, so privacy is their top concern.
We adopted the Rapid Outcome Mapping Approach (ROMA) to develop this policy framework. The ROMA model was originally designed to support policy and strategy processes in the field of international development. It begins with defining an overarching policy objective, followed by six steps designed to provide policy makers with context-based information. It allows decision makers to identify key factors that enable or impede the implementation of learning analytics. Moreover, the reflective process allows policy goals to be refined and adapted as the context changes over time.