This talk summarised the SHEILA project and its preliminary findings. It was presented at the EUNIS (European University Information Systems) workshop on 7 November 2017.
3. Supporting Higher Education to Integrate Learning Analytics
• The state of the art
• Direct engagement with key stakeholders
• A comprehensive policy framework
http://sheilaproject.eu/
4. Slide credit: Dragan Gašević (2017) Let’s get there! Towards policy for adoption of learning analytics. LSAC, Amsterdam, The Netherlands.
5. The state of the art
Challenges, adoption and strategy
6. Adoption challenges
1. Leadership for strategic implementation & monitoring
2. Equal engagement with stakeholders
3. Pedagogy-based approaches to removing learning barriers
4. Training to cultivate data literacy among primary stakeholders
5. Evidence of impact
6. Context-based policies to address privacy & ethics issues and other challenges
Tsai, Y.-S., & Gašević, D. (2017). Learning analytics in higher education – challenges and policies: A review of eight learning analytics policies. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference (pp. 233–242).
7. Essential features of an LA policy…
[Chart: challenges – experts’ perspectives]
8. LA adoption in Europe
• Institutional interviews: 16 countries, 51 HEIs, 64 interviews, 78 participants
[Chart: The adoption of learning analytics (interviews) – implemented: 21 (institution-wide: 9; partial/pilots: 7; data exploration/cleaning: 5); in preparation: 12; no plans: 18]
9. LA adoption in Europe
• Institutional survey: 22 countries
[Chart: The adoption of LA (survey) – implemented: 15 (institution-wide: 2; small scale: 13); in preparation: 15; no plans: 16; the legend also included an ‘N/A’ category]
12. Interests – senior managers
• To improve student learning performance (16%)
• To improve student satisfaction (13%)
• To improve teaching excellence (13%)
• To improve student retention (11%)
• To explore what learning analytics can do for our institution/staff/students (10%)
[Diagram: LA at the centre of three internal drivers – learner driver, teaching driver, institutional driver]
13. Interests – teaching staff
• An overview of student attendance, submission of assignments, access to coursework and resources, and performance.
• Inform course design.
• Manage a big class.
• Know ‘why’ students struggle.
14. Interests – students
Personalised approach
• Inform teaching support and curriculum design.
• Support a widening access policy.
• Support students at all achievement levels to improve learning.
• Assist with transitions from pre-tertiary education to higher education, and from higher education to employment.
15. Concerns – senior managers
• No one-size-fits-all solutions
• Pressure to adopt LA
• How can the institution as a whole benefit from LA?
• The strictness of existing data protection regulations makes adoption more difficult.
16. Concerns – teaching staff
• Workload
• Judging staff performance
• Not all learning is digital
• No one-size-fits-all solution
• Correlation does not imply causation
• Surveillance of students
17. Concerns – students
• Data collection is unnecessarily personal
• Data producing stereotypes and biases
• Limitations in quantifying learning
• Worries about human contact and teaching professionalism being replaced by machines
18. Concerns – students
Legitimate or illegitimate?
• Purpose
• Anonymity
• Access
[Diagram: the privacy paradox, addressed through transparency and effective communication]
20. ROMA (Rapid Outcome Mapping Approach)
Macfadyen, L., Dawson, S., Pardo, A., & Gašević, D. (2014). The learning analytics imperative and the sociotechnical challenge: Policy for complex systems. Research & Practice in Assessment, 9 (Winter 2014), 17–28.
22. • Become an associate partner of the SHEILA project?
• Visit: http://sheilaproject.eu/
Yi-Shan Tsai
yi-shan.tsai@ed.ac.uk
@yi_shan_tsai
Presenter notes
Partner organisations:
The University of Edinburgh, UK
Universidad Carlos III de Madrid, Spain
Open University of the Netherlands, Netherlands
Tallinn University, Estonia
Erasmus Student Network aisbl (ESN), international
European Association for Quality Assurance in Higher Education, international
Brussels Educational Services, international
SHEILA aims to support the adoption of LA in higher education. To do so, we have three clear objectives: (1) understanding the state of the art in Europe; (2) direct engagement with key stakeholders; (3) policy development.
Challenge 5 has also been identified in Ferguson, R., & Clow, D. (2017). Where is the evidence? A call to action for learning analytics.
The importance scale suggests priorities: (1) privacy & ethics (safeguard); (2) management and goals; (3) data management & analysis
The ratings of these statements show a clear drop in the ‘ease of implementation’ dimension compared with their ‘importance’ ratings. One implication is that the six features could prove to be challenges to address in order to scale up the adoption of LA.
It is also interesting to see that privacy- and transparency-related actions are considered the easiest to implement.
16 countries – UK (21), Spain (11), Estonia (3), Ireland (2), Italy (2), Portugal (2), Austria (1), Croatia (1), Czech Republic (1), Finland (1), France (1), Latvia (1), Netherlands (1), Norway (1), Romania (1), and Switzerland (1)
21 out of 51 institutions were already implementing centrally-supported learning analytics projects.
25 institutions have established formal working groups, but not all institutions have planned to provide analytics data to students.
Where LA was supported centrally, it was usually initiated under wider digitalisation strategies or teaching and learning strategies. However, a great number of institutions had not defined clear strategies for learning analytics and were still at the ‘experimental’ or ‘exploratory’ stage.
The interviews identified three common aspects of internal drivers for the adoption of learning analytics:
Learner-driver: to encourage students to take responsibility for their own studies by providing data-based information or guidance.
Teaching-driver: to identify learning problems, improve teaching delivery, and allow timely, evidence-based support.
Institution-driver: to inform strategic plans, manage resources, and improve institutional performance, such as retention rates and student satisfaction.
An equivalent question (multiple choice) in the survey provided 11 options for motivations specific to learning and teaching. The results identified the five top drivers.
Student engagement data: when, how long, etc.
Inform course design: reflect on places where students fail.
Know ‘why’ students struggle: it’s not good enough to just know that students fail certain questions.
Inform teaching support and curriculum design so that no one is falling behind or having to learn the same materials repetitively.
Support a widening access policy – at a class level.
Support students at all achievement levels to improve learning by providing them with a better overview of their own learning progress.
No one-size-fits-all solutions:
– needs vary by institution, but existing solutions focus on addressing retention problems;
– differences among subjects and faculties.
Uncertainty about the benefits of LA: fear of failing to meet expectations.
Correlation does not imply causation: e.g., engaging in discussion forums does not necessarily prevent students from failing, even though the data may suggest a correlation between forum engagement and learning success.
GDPR Article 6, ‘Lawfulness of processing’, counters Article 7 by allowing institutions to process personal data when processing is necessary for the purposes of ‘legitimate interests’, or for carrying out tasks in the ‘public interest’.
Three purposes:
to comply with legal requirements, such as visas;
to improve educational services, such as learning support, teaching delivery, career development, educational resources management, and the support of student well-being;
to improve the overall performance of the university, such as league rankings, equality, and the recruitment of future students.
Anonymity: students were comfortable with personal tutors seeing identifiable data, but not with tutors who may be involved in marking their work.
Access: extreme distrust of third parties, for fear of becoming marketing targets.
Although the participants had strong views about protecting their privacy and expectations about how their data should be used, they did not feel that they had sufficient understanding of existing data practices to critically question their legitimacy – the privacy paradox. This phenomenon suggests that institutions need to scale up their transparency and effective communication with students.
The ROMA model was originally designed to support policy and strategy processes in the field of international development. The model begins with defining an overarching policy objective, followed by six steps designed to provide policy makers with context-based information. It allows decision makers to identify key factors that enable or impede the implementation of learning analytics. Moreover, the reflective process allows policy goals to be refined and adapted as contexts change over time.