Speakers:
David Lewis, senior analytics consultant, Jisc
Martin Lynch, learning systems manager, University of South Wales
An opportunity to find out how an institution has been implementing learning analytics to support the student journey, and to discuss the issues and possibilities that the use of learning analytics may create.
Exploring learning analytics
1. Learning Analytics Implementation at USW - 2017/18 activity
Playing the hand you’re dealt
Learning Analytics Implementation at USW 2017/18
Martin Lynch
Learning Systems Manager, CELT
2. Learning Analytics Implementation at USW
• What is happening regarding Learning Analytics implementation at USW
• The cards in our deck
• Timeline of events
• What are we doing in the project – 4 use-cases
• Ethics and GDPR
• Challenges and successes
3. Learning Analytics Implementation at USW
The cards in the deck
• JISC
• Tribal
• Blackboard
• GDPR
• PVC
• Strategic alignment
• Professional Services – Library, Registry, IT Services Development
• Governance
• Learning Technologist
• Developer
• Project manager
• PACs
• Students
[Stakeholder diagram: external and internal stakeholders, with students at the centre]
4. Learning Analytics Implementation at USW
Project timeline May 2015 - May 2018
• June 2015: PVC Student Retention and Success Dr Ben Calvert promotes internal conversation regarding Student Retention and Success – USW now looking for an analytics solution
• USW put forward for the last place on the Jisc discovery phase – consultation stage begins with Bb
• Bb consulting run Discovery phase at USW in order to produce a “Readiness Assessment”
• Memorandum of understanding signed between USW and Jisc
• Readiness assessment report published
• University agrees to set up Learning Analytics Project – 2-year duration, 2 posts identified, with 0.6 FTE project management
• Project board initiated at USW
• Recruitment process set up for learning technologist and developer
• Extraction of historical data from Blackboard begins
• Assessment data coming from Bb
• Progression data from Student Record System now in UDD
• Appointment of learning technologist
• Student enrolment form amended in light of LA project
• Module marks for current session held in Student Record System showing in Data Explorer
• Decision taken that Data Explorer does not meet minimum requirements for use as a pilot tool – data and usability issues
• Senior faculty staff and data users meet with Tribal to consider predictive modelling requirements – select 3 key models
• Staff questionnaire sent out to baseline expectations of Data Explorer functionality; 50 staff agree to pilot use of Data Explorer from January
• Appointment of IT Developer
• RUHere attendance data surfacing in DX
• Tribal develop 3 new models for USW – data still has gaps
• USW invited to join the ‘group of five’
• Workshop conducted on student-facing dashboards for 2019
• All marks and grades from Bb and Quercus appearing in DX
• Library ‘recipe’ simplified by Jisc – test data extract can begin
• Historical grade data from Student Record System and Blackboard (3 years) now in warehouse – Tribal can review their models
PROJECT SET-UP = 18 MONTHS
PROCESS AND PRODUCT DEVELOPMENT = CONTINUOUS
DATA = 14 MONTHS
6. Learning Analytics Implementation at USW
• Personal Academic Coaches (PACs) – 300+ at USW
• Every student allocated a PAC on enrolment
• Senior teaching staff from course team – understand delivery
• Follows student through their studies – 3×20 min meetings every year, focus on academic progress
• Need to document the event and have data to inform the conversation
• Data Explorer replaces accessing multiple data sources but is more than the sum of the parts
25. Learning Analytics Implementation at USW
• Predictions are made with a very large data set (3 years)
• Machine learning algorithm, so refining factors (regression analysis) in each context (module or course) to establish significance
• Applying pattern recognition from the training set to current students to find close matches
• Predictive models are questions you want to ask of your data
• Tribal is our current data science partner and the models are almost there…
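The matching step described above – comparing current students against a historical training set to find close matches – can be sketched with a toy nearest-neighbour example. This is not Tribal's actual modelling (their features and algorithms are not described here); every feature, name, and number below is invented for illustration.

```python
import math

# Toy training set of historical students:
# (VLE logins per week, attendance %, library loans) -> withdrew?
# All values are invented for illustration only.
TRAINING = [
    ((25.0, 95.0, 4), False),
    ((20.0, 88.0, 2), False),
    ((18.0, 90.0, 5), False),
    ((3.0, 40.0, 0), True),
    ((5.0, 55.0, 1), True),
    ((2.0, 30.0, 0), True),
]

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def risk_score(student, k=3):
    """Fraction of the k closest historical students who withdrew."""
    nearest = sorted(TRAINING, key=lambda row: distance(row[0], student))[:k]
    return sum(1 for _, withdrew in nearest if withdrew) / k

print(risk_score((4.0, 45.0, 0)))   # resembles past withdrawals -> high risk
print(risk_score((22.0, 92.0, 3)))  # resembles past completers  -> low risk
```

In a real deployment the features would be standardised and the "significance" step on the slide corresponds to weighting or discarding features per module or course before matching.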
27. Learning Analytics Implementation at USW
• First models are retention-based and deal with risk
• Intention is to use intelligence to better target support
But…
• Are these models any good?
• What services to build - how should we use this data?
• What value is this to the institution?
29. Learning Analytics Implementation at USW
• Unexplored in the sector so breaking new ground in the UK
• Small-scale studies – ref. Dr Liz Bennett, University of Huddersfield; Ed de Quincey, Keele University
• Understanding this as a type of feedback literacy – ref. Paul Sutton (2012) notions of feedback literacy
Principles are:
• Access should be mediated in the first instance
• Motivation is personal so dashboards should be customisable to fit user needs
• Dispositional dimension is critical
33. Learning Analytics Implementation at USW
Ethics and GDPR
• USW implementation of Learning Analytics GDPR compliant
• Enrolment process amended
• Consent on protected characteristics
• Clearly described data processing activity
• Student-facing guide
• Student involvement in project
• Guided by ICO guidance and JISC
• Using ‘legitimate business interest’ to process data
34. Learning Analytics Implementation at USW
Successes
• Good resourcing has driven success
• Good appointments and effective project management
• Effective governance – interested (but not too involved) executive sponsor and a good board that turn up and ask questions
• Staff see the value already, but it is important to wait until it’s ready before showing it
Challenges
• Supplier pilot processes – delayed deployment for a year
• Some developments and data wrangling too slow (predictor, library recipe)
• Developers are hard to find – took us 10 months
• Unclear internal business processes – who looks after wobbling students?
• Inconsistent use of data for decision making
Project successes and challenges
35. Learning Analytics Implementation at USW
• DX – full roll-out and use for Personal Academic Coaches (PACs)
• Use of DX by other roles – faculty management, course and module teams to assess engagement and to contribute to course design, side-cases (e.g. to evidence non-payment of fees)
• Study Goal – pilot use as a simple attendance monitoring tool
• Use of predictive data to drive early-warning alerts (Tribal)
• Assess value to institution of predictive business intelligence (Tribal)
• Compare and contrast Tribal and JISC predictive offers
• Student-facing dashboards – prototype mash-ups – refactor to something more useful
• Attract research and development funding to incubate this
• Full project evaluation and analysis of impact > options appraisal > recommendations
Project Vision and plans for 2018/19
Jisc – main supplier for the project, providing technical infrastructure (Learning Data Hub), data schema, visualisation tools, network opportunities
Tribal – data science provider – offering data science analysis and predictive models
Blackboard – our VLE provider – provides a plug-in which harvests student activity data and passes it into the Jisc Learning Data Hub; currently the main proxy for engagement – no assessment data!
Library – cooperation of Learning Services supplies us with data pertaining to library resource usage – books and e-resources
Registry – providing profile data for student users being fed into the LDH, providing assessment data for formal assessments
IT Services – providing development services – have managed to query Bb databases to extract ALL assessments – regardless of tool, both formative and summative, including due dates and submission dates found in the VLE
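The tool-agnostic extraction described above can be sketched against a simplified stand-in schema. This is not the real Blackboard Learn schema – every table and column name below is invented – but it shows the shape of the query: one join that covers every assessment regardless of which tool created it, keeping rows even where no submission exists yet.

```python
import sqlite3

# Illustrative only: a simplified stand-in schema, NOT the real Blackboard
# Learn database. All table and column names here are invented.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE assessment (
    id INTEGER PRIMARY KEY,
    course_id TEXT,
    tool TEXT,            -- 'assignment', 'quiz', 'turnitin', ...
    title TEXT,
    due_date TEXT
);
CREATE TABLE submission (
    assessment_id INTEGER REFERENCES assessment(id),
    student_id TEXT,
    submitted_at TEXT
);
INSERT INTO assessment VALUES
    (1, 'COMP101', 'assignment', 'Essay 1', '2018-02-01'),
    (2, 'COMP101', 'quiz', 'Week 3 quiz', '2018-02-08');
INSERT INTO submission VALUES
    (1, 's001', '2018-01-30'),
    (2, 's001', '2018-02-08');
""")

# One query covers every assessment regardless of creating tool;
# LEFT JOIN keeps assessments a student has not yet submitted.
rows = conn.execute("""
    SELECT a.course_id, a.tool, a.title, a.due_date,
           s.student_id, s.submitted_at
    FROM assessment a
    LEFT JOIN submission s ON s.assessment_id = a.id
    ORDER BY a.due_date
""").fetchall()
for row in rows:
    print(row)
```

Comparing `due_date` with `submitted_at` per row is what lets the warehouse derive late- and non-submission signals across formative and summative work alike.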
Governance
Students – the ultimate data owners and providers of the data; their participation is needed
Learning Technologist – allocated to the project for 2 years – data wrangling and staff development
Project Manager
Pro-Vice Chancellor Teaching, Learning and Student Experience – executive sponsor and champion
Strategic alignment – Learning Analytics finds itself woven into the university’s strategic plan for student success “The Student Experience Plan”
GDPR – this has shaped the public declaration of data processing, the consent framework and the local interpretation of the legal definitions of processing and intervention
First models are retention-based and deal with risk – there is an obvious human and monetary cost to poor retention, but with TEF in our mirrors we should be looking at data science to evidence added value and learner stretch
Intention is to use intelligence to better target support – “Barry’s Bell”
But…
Are these models any good? Need models to be ‘finished’ and need time to assess the results
What services to build - how should we use this data? – is the institution ready to consume this in a way that would improve business?
What value is this to the institution? – these are expensive services, so what is the business case?