Presentation by Dr Jason Zagami to the Queensland Society for Information Technology in Education (QSITE) conference on 1 October 2013 on the Sunshine Coast, Queensland.
Zagami, J. (2013, October). Educational Technologies. Presentation presented at the Queensland Society for Information Technology in Education Conference, Sunshine Coast, Australia. Retrieved from http://www.slideshare.net/j.zagami/educational-technologies-26715895
17. Technology/Pedagogy/Learning Analytics
• Boundaries of school based learning
• Ethics of crossing these boundaries
• Implications for lifelong learning
• Individual vs aggregated analysis
Pedagogical Diversity
Tuesday, 1 October 13
25. Virtual Laboratories
• Students can make mistakes and repeat experiments
• Record and replay experiments to analyse for areas of
improvement and acknowledge excellence
• Often developed and linked with current research labs
• Laboratory/workshop work not limited to a school's
physical labs/workshops
Pedagogical Diversity
44. LMSs vs PLNs
• Teacher/Institution Centred vs Learner Centred
• Blackboard/Moodle vs Social Media
• SCORM -> Tin Can API (xAPI)
• Platform Reliance
Technological Innovation
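The SCORM-to-Tin-Can shift noted above replaces package-bound course tracking with free-form "actor–verb–object" activity statements that can come from any platform, not just an LMS. A minimal sketch of building one such statement in Python (the learner, activity name, and URLs are illustrative, not from the deck; the verb URI is the standard ADL "completed" verb):

```python
import json

def make_xapi_statement(actor_email, verb_id, verb_display,
                        activity_id, activity_name):
    """Build a minimal Tin Can (xAPI) actor-verb-object statement."""
    return {
        "actor": {"objectType": "Agent", "mbox": f"mailto:{actor_email}"},
        "verb": {"id": verb_id, "display": {"en-US": verb_display}},
        "object": {
            "objectType": "Activity",
            "id": activity_id,
            "definition": {"name": {"en-US": activity_name}},
        },
    }

# Hypothetical example: a learner completing a virtual-lab activity
stmt = make_xapi_statement(
    "student@example.edu",
    "http://adlnet.gov/expapi/verbs/completed",
    "completed",
    "http://example.edu/activities/virtual-lab-1",
    "Virtual Lab 1",
)
print(json.dumps(stmt, indent=2))
```

Because statements like this are plain JSON sent to a Learning Record Store rather than to one LMS, they illustrate the reduced platform reliance the slide points to.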
46. MOOCs
• cMOOC vs xMOOC
• Global market and commercialisation
• MOOCs vs Communities of Practice
• Publishers/Blackboard
Pedagogical Diversity
78. Learning Analytics
• Prediction (in which data is used to predict future
performance);
• Clustering (identify similarities in students/tasks);
• Relationship mining (between students, teachers,
concepts, etc.);
• Modelling (creating a model and using this for further
prediction or analysis); and
• Distillation of data for human judgment (the most
common use by teachers using statistics and
visualisations).
Learning Analytics
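The last method on the slide, distillation of data for human judgment, can be sketched in a few lines: raw LMS event logs are aggregated into per-student summaries that a teacher can scan or visualise. The log contents and event names below are hypothetical:

```python
from collections import defaultdict

def distil_activity(events):
    """Summarise raw LMS event logs into per-student counts
    for human review (a simple 'distillation' step).

    events: list of (student, event_type) tuples.
    Returns {student: {event_type: count}}.
    """
    summary = defaultdict(lambda: defaultdict(int))
    for student, event_type in events:
        summary[student][event_type] += 1
    return {s: dict(counts) for s, counts in summary.items()}

# Hypothetical log extract
log = [
    ("alice", "login"), ("alice", "forum_post"), ("alice", "login"),
    ("bob", "login"),
]
print(distil_activity(log))
# {'alice': {'login': 2, 'forum_post': 1}, 'bob': {'login': 1}}
```

The same aggregated table is also the natural input to the clustering and prediction methods listed earlier on the slide.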
79. Learning Analytics
• Extracting and analysing data from learning
management systems;
• Building an analytics matrix that incorporates data
from multiple sources (social media, LMS, student
information systems, etc);
• Profile or model development of individual learners
(across the analytics matrix);
• Predictive analytics: determining at-risk learners;
• Automated intervention and adaptive analytics: i.e.
the learner model should be updated rapidly to
reflect near real-time learner success and activity
so that decisions are not made on outdated models;
Learning Analytics
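The "predictive analytics: determining at-risk learners" step above can be sketched with a simple threshold rule over learner profiles. A real system would train a predictive model on historical outcomes; the thresholds, feature names, and profiles here are stand-ins chosen only to show the data flow:

```python
def flag_at_risk(profiles, min_logins=5, min_submissions=3):
    """Flag learners whose recent activity falls below simple
    thresholds. A stand-in for a trained predictive model:
    the thresholds are illustrative assumptions, not research values.

    profiles: {student: {"logins": int, "submissions": int}}
    Returns a sorted list of flagged student names.
    """
    return sorted(
        s for s, p in profiles.items()
        if p["logins"] < min_logins or p["submissions"] < min_submissions
    )

# Hypothetical learner profiles drawn from an analytics matrix
profiles = {
    "alice": {"logins": 12, "submissions": 4},
    "bob":   {"logins": 2,  "submissions": 1},
    "carol": {"logins": 8,  "submissions": 2},
}
print(flag_at_risk(profiles))  # ['bob', 'carol']
```

Feeding near real-time activity into such a model, rather than a static snapshot, is the point of the "adaptive analytics" bullet: decisions should rest on a current learner model, not an outdated one.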
80. Learning Analytics
• Development of "intelligent curriculum" where
learning content is semantically defined;
• Personalisation and adaptation of learning based
on intelligent curriculum where content, activities,
and social connections can be presented to each
learner based on their profile or existing
knowledge; and
• Advanced assessment: comparing learner profiles
with architecture of knowledge in a domain for
grading or assessment.
Learning Analytics
81. Learning Analytics
• Data can uncover problems that might otherwise
remain invisible
• Data can convince people of the need for change
• Data can confirm or discredit assumptions about
students and school practices
• Data can get to the root cause of problems, pinpoint
areas where change is most needed, and guide
resource allocation
Learning Analytics
82. Learning Analytics
• Data can prevent over-reliance on standardised tests
• Data can help evaluate program effectiveness and
keep the focus on student learning
• Data can prevent one-size-fits-all solutions
• Data can help address accountability questions
• Data can build a culture of inquiry and improvement
Learning Analytics