2. About this workshop
An opportunity to consider analytics, and in particular the achievable
ambition to improve and enhance the practice and management
of education:
• Unpacking Analytics (15 mins – presentation)
• Raising Questions and Concerns (10 mins – plenary discussion)
• Case Studies & Survey of Analytics in Practice (10 mins – presentation)
• Discussing Achievable Ambition (20 mins – round-table discussion)
3. Cetis Analytics Series
http://publications.cetis.ac.uk/c/analytics
• Case Study, Acting on Assessment Analytics
• Case Study, Engaging with Analytics
• Infrastructure and Tools for Analytics
• The impact of analytics in Higher Education on academic practice
• A Brief History of Analytics
• Institutional Readiness for Analytics
• A Framework of Characteristics for Analytics
• Analytics for Understanding Research
• What is Analytics? Definition and Essential Characteristics
• Legal, Risk and Ethical Aspects of Analytics in Higher Education
4. 1. Unpacking analytics
“Analytics is the process of developing actionable
insights through problem definition and the
application of statistical models and analysis against
existing and/or simulated future data.”
5. 1. Unpacking analytics
Where is the interest?
• Business Intelligence
• Learning, Teaching and Assessment
However, this is not entirely new…
• Exam boards
• HESA returns
• Extensive use of data in the school sector
6. 1. Unpacking analytics
Audiences and purposes across the educational system:
• individual learners
• predictors of students requiring extra support
• functional groups within an institution
• institutional administrators
• enhanced regulation and accountability
• methods and tools to help teachers
7. 1. Unpacking analytics
Some institutional considerations:
• Why does your institution collect data?
• What data is collected?
• Where is data collected and stored?
• Who has access to the data?
• When is it available?
Provision of data, interpretation and visualisation, taking action…
8. Ethical and Legal Issues
Stakeholder motivations:
• In assuring educational benefits
• As businesses
• To satisfy expectations
Guiding principles:
• Clarity
• Comfort and care
• Choice and consent
• Consequence and complaint
9. 2. Raising questions and concerns (10 mins)
What do we imagine an educationally
meaningful analytics initiative would look like?
10. 3. Case Study 1: Engaging with Analytics
A richer understanding of the student journey, to scope a
support system for staff and students. The team began
with a list of questions:
• What is actually happening to students, how can we find out?
• What are the touch points between students and the institution?
• What are the institutional "digital footprints" of our students?
• What really matters to our students?
(Sheila MacNeill [CETIS] & Jean Mutton [University of Derby])
11. 3. Case Study 1: Engaging with Analytics
Engagement analytics have allowed the team to look
"beyond the classroom" and help identify patterns of behaviour,
both academic and non-academic, that might lead to student
withdrawal. This has led to insights into how the withdrawal process
could be redeveloped to offer better support to “at risk” students.
12. 3. Case Study 2: Acting on Assessment Analytics
…e-submission and e-marking tools allow the collection of and
access to far more detailed levels of assessment data…
…using collective data from previous cohorts it is possible to visualise
common errors and their impact on final marks…
…once the assignment is completed and marked, a follow up
workshop provides a collective view of group performance…
…this opportunity for students to see common mistakes and contextualise
their own performance within a cohort is proving to be very motivating…
(Sheila MacNeill [CETIS] and Dr Cath Ellis [University of Huddersfield])
14. 4. Achievable Ambition (20 mins)
Small group discussion followed by plenary feedback
will develop and share improved thinking on practical ways
forward for enhancing student experience and outcomes at an
institutional or sectoral level.
15. Licence
This presentation <title>
by <presenter name> <presenter email>
of Cetis www.cetis.ac.uk is licensed under the
Creative Commons Attribution 3.0 Unported Licence
http://creativecommons.org/licenses/by/3.0/
Speaker notes
Big Data is in the spotlight following celebrated applications in retail and in detecting criminal activity, and doubtless there are hidden insights for teaching and learning in Big Data. This session will be an antidote to worries about Big Data and Big IT and will focus on achievable ambition to improve and enhance the practice and management of education. It will consider a less technically and culturally challenging development path than replication of corporate Big Data initiatives; the title of the session is a loose reference to the collection of essays by E.F. Schumacher entitled “Small is Beautiful: A Study of Economics as if People Mattered”.
We will step back from Big Data and consider the field of analytics generally, defining it as: “Analytics is the process of developing actionable insights through problem definition and the application of statistical models and analysis against existing and/or simulated future data.”
The session will contain four components:
1. Unpacking Analytics (15 mins – presentation). This will be a broad but shallow exposition. It will give the audience a non-technical view of the breadth of questions analytics can address and will outline issues arising from a consideration of ethics, law and professional practice. This will place Big Data and predictive methods in a wider context and will draw out the key themes from recent publications by the presenters.
2. Raising Questions and Concerns (10 mins – plenary discussion). A facilitated discussion will be opened with the question: “What do we imagine an educationally meaningful analytics initiative would look like?”
3. Introducing Case Studies and a Survey of Analytics in Practice (10 mins – presentation). A short presentation will outline recent work undertaken by CETIS: brief case studies of analytics innovation in educational organisations that demonstrate achievable ambition; and the results of a new survey, and the conclusions drawn from it, on the extent to which UK higher and further education organisations are using analytics to improve and enhance the practice and management of education. The case studies will show how different organisations have interpreted “appropriate technology” and have achieved sustainable results without attempting to emulate the multinationals’ use of Big Data and cutting-edge technologies. The survey and conclusions will help to show how we should proceed from where we are today.
4. Discussing Achievable Ambition (20 mins – round-table discussion). Small group discussion followed by plenary feedback will develop and share improved thinking on practical ways forward for enhancing student experience and outcomes at an institutional or sectoral level. Printed copies, and URLs to online versions, of the case studies and the survey will be available as stimulus material.
Throughout the session, audience contributions will be captured and live-blogged. The presenters will augment this record with references to relevant online resources and incorporate the materials presented. This will be published during or directly following the conference.
This is the Cetis effort at unpacking analytics.
For our work, we have adopted this definition of analytics because we find it useful to guide our thinking.
Big Data: large amounts of unstructured data in different formats. Many organisations are experimenting with data sets to generate insight and gain business advantage. Gartner analyst Doug Laney introduced the 3Vs concept in a 2001 MetaGroup research publication on 3D data management: controlling data volume (amount of data), variety (number of types of data) and velocity (speed of data processing). Educational data mining is a related strand.
Later we will look briefly at two case studies, one from the domain of business intelligence and the other from learning, teaching and assessment.
Data is used in many different ways, although it is probably the case that the potential is nowhere near being realised:
• Exam boards: simple presentation of data with little statistical analysis.
• Higher Education Statistics Agency (HESA) returns: high-level analysis.
• Schools sector: established use of data at a pupil level looking at attainment, added value, etc. (initially PANDA reports, now FFT and RAISEonline).
Audiences and purposes across the educational system:
1. for individual learners, to reflect on their achievements and patterns of behaviour in relation to others;
2. as predictors of students requiring extra support and attention, to help teachers and support staff plan supporting interventions with individuals and groups;
3. for functional groups, such as course teams seeking to improve current courses or develop new curriculum offerings;
4. for institutional administrators taking decisions on matters such as marketing and recruitment or efficiency and effectiveness measures;
5. enhanced regulation of the teaching and learning environment, which has potentially negative impact on teaching practice;
6. methods and tools intended to help lecturers carry out their tasks more effectively, which have the potential to be a useful tool in teaching practice.
“Over the years, an accommodation has developed between regulatory authorities, management and teaching professionals: educational managers indicate the goals which teachers and learners should work towards, provide a framework for them to act within, and ensure that the results of their activity meet some minimum standards. The rest is left up to the professional skills of teachers and the ethical integrity of both teachers and learners.
This accommodation has been eroded by the efforts of successive governments to increase their control over the education received by both school and higher education students. Learning Analytics radically reduces the effort involved in gathering information on the way in which lecturers deliver the curriculum, and also automates the work of analysing this information. An alliance of these two trends has the potential to constrain teaching practice, and therefore it is necessary to take a systemic view when assessing the impact of analytics on teaching practice.
It is concluded that Learning Analytics should not be seen as a short cut to providing teaching professionals with universal advice on ‘what works’, and that its use to increase the accountability of teachers to management may have unintended negative consequences. Rather, the most promising area for enhancing teaching practice is the creation of applications which help teachers identify which of the many interventions open to them are most worthy of their attention, as part of an on-going collaborative inquiry into effective practice.”
(Professor Dai Griffiths (IEC), CETIS Analytics Series, 2012)
Three steps:
1. Provision of data – from different data sources that may be of variable quality, poorly integrated and not designed for accessibility, and that may require the development of a data warehouse or triple store approach. A good illustration of the importance of this stage is the Apple Maps debacle, where bad, incomplete, conflicting, poor quality or incorrectly formatted data caused significant problems.
2. Interpretation and visualisation – working with practitioners to develop an understanding of how data held on systems can be used to inform the enterprise's activities, presenting information in an accessible and informative way, and identifying additional data requirements.
3. Actioning insights – processes by which practitioners and learners can turn insights into actions within their context.
1. Data Protection
2. Confidentiality and Consent
3. Freedom of Information
4. Intellectual Property Rights
5. Licensing for Reuse
1.5 Guiding Principles
As Voltaire’s Candide might have reflected, we are faced with the imperative to seek out the ‘best of all possible worlds’:
• In assuring educational benefits, not least supporting student progression, maximising employment prospects and enabling personalised learning, it is incumbent on institutions to adopt key principles from research ethics.
• As businesses, post-compulsory educational institutions are facing the same business drivers and globalised competitive pressures as any organisation in the consumer world.
• To satisfy expectations of the ‘born digital’ / ‘born social’ generations, there is a likely requirement to take on ethical considerations which may run contrary to the sensibilities of previous generations, especially in respect of the trade-off between privacy and service.
Notwithstanding these tensions, we conclude that there are common principles that provide for good practice:
• Clarity: open definition of purpose, scope and boundaries, even if that is broad and in some respects open-ended.
• Comfort and care: consideration for both the interests and the feelings of the data subject, and vigilance regarding exceptional cases.
• Choice and consent: informed individual opportunity to opt out or opt in.
• Consequence and complaint: recognition that there may be unforeseen consequences, and therefore providing mechanisms for redress.
Cetis Analytics Series: Case Study, Engaging with Analytics http://publications.cetis.ac.uk/2013/706
Data is drawn from existing systems, with small changes in practice where required. The project works with the university stats team to develop activities beyond management reporting.
Cetis Analytics Series: Case Study, Acting on Assessment Analytics http://publications.cetis.ac.uk/2013/750
Once identified and obtained, data is analysed within a spreadsheet.
Cetis Analytics Series: Institutional Readiness for Analytics offers another brief case study, from the Open University's Data Wrangler project.