eMadrid Seminar 2015-05-22 (UAM) Su White - Incorporating MOOCs into face-to-face teaching
1. Bending MOOCs into
face to face teaching
Su White
University of Southampton, UK
21/05/15 @suukii http://blog.soton.ac.uk/mobs/home/
2. Our MOOCs
Oceans Shipwrecks
Developing your
research project
Portus Web Science Digital marketing
Contract
Management Waterloo
4. Teaching (& blending)
Orchestrated
• Preparation for
– Instruction
– Discussion
– Structured reflection
– Independent learning
• Vehicle for
– Enhancing motivation
– Sustaining motivation
Environmental
• Practice or rehearse
– Skills
– Argumentation
• Revise/prepare
• For tests and examinations
• After a break (vacation)
• Contextual reminder
5. Blending (and teaching/learning)
purposeful
• Flipped classroom
• Structured exercise
– Revise for test
– Consult for extended writing
– Make the basis of class
discussion
– Use as an exemplar of
independent research
environmental
• Let the learners decide
– Access as a resource
– Available if wanted
• Independent
or semi-independent
learning
6. Our MOOCs
Oceans Shipwrecks
Developing your
research project
Portus Web Science Digital marketing
Contract
Management Waterloo
7. Web Science
• Flipped/blended
– Masters
– Undergraduate
– Generic
• Personal blending
– Review of approaches
• Revisions in the pipeline
9. Digital marketing
• Courses
– Masters
– Undergraduate
• After the event using
archives of discussion
• Extension of existing
face to face methods
• Next time
– In parallel
10. Developing your research project
• Began as outreach/load
balancing
– Augmenting existing
support
– Used for recruitment
• Individual Integrated
– Generic – all levels
• Independent
– appropriated
11. Contract management
• Courses
• Generic business
– Illustrative
• Potential - clients
• Professional
development
14. Mini MOOC
• Next off the stocks
• Developed by the library
• Using/showcasing archive
collection
• Courses
– Nothing planned… but
– Introduction to archives for
history students at all
levels
• Public awareness
– Same principle
15. Learning and change
• Constructive alignment
– Learning in the most
appropriate way
– Matching activities with
valued skills, knowledge
and understanding
– Trust the subject expert
• Pace and scaffolding
• Keeping the learner on
task
• Modelling (learning)
behaviours
We may change our use
depending on the stage of
learning
But also trust the learners to decide
16. Trust
• Trust humans to see
insights
• Academics are smart
• They like to be efficient
• Change needs trust to
happen
– Coffee room
conversations – I trust
my friends
– Reputation can work too
• Like teachers
– Student understanding
evolves over time
– Understandings can
develop
– There can be
understanding memes
– There is value in
modelling
17. Final thoughts
– Let the learners decide
what is valuable
– Academics can’t unlearn
– All students are on
different paths
• Thank you
• Any Questions??
18. FOLLOW WHAT WE DO
19. Curation activity
Mendeley Group:
● Over 300 academic sources
related to MOOCs
● All tagged and classified
● Open Group
● You can join
● You can follow
Scoop.it page
● Daily curation
● Grey Literature
● MOOC news and journalistic
articles
22. Other groups
FLAN
• FutureLearn Academic
Network
• Regular meetings
• Some PhD students – across
the UK
Collaborations
• Southampton
• UEA
• Reading
– Sharing data
– Research roadmap
– Bidding for funding
e.g. Leverhulme, ESRC
23. Not used – for ref
28. MOOCs
• Researching
– Behaviours, beliefs and
understandings of MOOCs
• Building
– a collaborative network of
labs and researchers
• Creating
– tools to automatically and
efficiently log and annotate
MOOC related artefacts
• Assembling
– a definitive historic archive
for current and future
researchers
29. Building a toolkit:
tools, methods and methodologies
[Diagram: data sources range from curated (e.g. papers) to wild (e.g. Twitter); processing moves from manual towards automatic; datasets handled with tools such as Excel, NVivo, WorldWare, Mendeley, and a custom C-Map]
• Logging
– the ‘What? When? Where?
Why? and How?’ of MOOC
activity
• Charting
– the growth and evolution
of MOOCs
• Developing
– expertise in MOOC data
collection and analysis;
• Tracking
– platforms and technologies
Moving from manual towards automatic
30. Questions
Stakeholders:
• What are the
motivations and
rewards for academics
running MOOCs?
• How are academics
integrating MOOCs
with their face to face
teaching?
• What are the models
for harnessing existing
MOOCs to bring
additional expertise
into the classroom?
31. Questions
Learners:
• How can groupwork
insights and learning
analytics be combined to
enhance the learners’
experience of MOOCs?
• What role can MOOCs
play in enhancing
employability of young
people?
• What role do MOOCs play
in enhancing digital
literacies?
32. Researching
Learner experiences
• learner engagement
and motivations
• how MOOCs are
impacting recruitment
on F2F courses
Hosting Experience
• Moderating discussions
Stakeholders
• Educator attitudes
• Institutional motivation
Potential
• Blended MOOCs
• Personalisation in
MOOCs
• Affordances
39. Learner Activity Patterns (comments per user)
[Chart: number of comments (x) by number of users (y), with annotations asking which are the active and highly active social learners, and how many of these completed the course]
40. Discussion generation analysis
[Chart: steps (x) by number of comments they generate (y); the most popular task was the Zeemap reflection step, "place yourself in the world map"]
43. Text mining of Portus MOOC comments
● Undertaking primary research about development
and communication of archaeological knowledge
(see next slide)
● Using concordance (AntConc), topic maps and other
approaches to mine comments
● e.g. undertaking specific research such as examining
the multisensory nature of creative writing on the
course through co-occurrence of words (in this case
“smell”)
Author: Graeme Earl
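The co-occurrence idea above can be sketched in plain Python. This is only a minimal illustration of the concordance-style approach, not the project's actual AntConc/topic-map pipeline; the sample comments and the window size of 5 tokens are made-up assumptions.

```python
import re
from collections import Counter

def cooccurring_words(comments, target="smell", window=5, top=10):
    """Count words appearing within `window` tokens of `target` across
    a list of comment strings -- a rough collocate measure, similar in
    spirit to what a concordancer like AntConc reports."""
    counts = Counter()
    for comment in comments:
        tokens = re.findall(r"[a-z']+", comment.lower())
        for i, tok in enumerate(tokens):
            if tok == target:
                lo, hi = max(0, i - window), i + window + 1
                counts.update(tokens[lo:i] + tokens[i + 1:hi])
    return counts.most_common(top)

# Invented example comments, standing in for real Portus MOOC data.
comments = [
    "The smell of the sea must have mixed with the smoke of the kilns.",
    "I imagined the smell of fish sauce stored in the warehouses.",
]
print(cooccurring_words(comments, target="smell"))
```

In practice stop words ("the", "of") would be filtered before ranking; they dominate raw counts, as the toy output shows.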
Blending MOOCs into face to face teaching
Universities are investing increasing amounts of time in the development of online courses in the form of MOOCs. At the same time, there appears to be an enormous appetite amongst learners for this more flexible and informal approach to learning.
Seeking to maximise returns on their investments in MOOCs, individual academics are looking at how they can integrate their "free for all" online learning resources with more formal face-to-face educational offerings. Furthermore, teachers are seeing potential gains from incorporating MOOCs created outside their own institution into their teaching programmes.
This presentation will briefly present some observations from the experience of blending MOOCs into formal teaching. It will discuss the strengths and possible weaknesses of such an approach and propose a framework for designing and implementing this type of blended learning.
We have a Mendeley group in which the MOOC Observatory group records all the academic articles that we come across.
We also have a Scoop.it site on which we curate MOOC-related content in magazines, blogs, newspapers, and other online publications.
Just in case the audience wants the links to the Mendeley group and the Scoop.it page
This is the blog, in which the Southampton MOOC Observatory group will be posting regularly its activity. Also with a QR code.
The University of Southampton has run 7 MOOCs in topics such as Oceanography, Web Science, Roman Archaeology, Maritime Archaeology, Digital Marketing, and Language Learning. Many of them have already run more than once. All of them are hosted in the Futurelearn MOOC Platform
Through Futurelearn, each MOOC run automatically generates a set of datasets:
A full comments log (similar to a twitter feed with anonymised user ID, comment ID, parent ID if it is a reply, timestamp, number of likes, and comment).
Enrolment data: A dataset that records user enrolments and un-enrolments by date (user IDs anonymised)
Peer Review assignments: A dataset that records learner assignments for being peer reviewed
Peer Review Reviews: A dataset that records the reviews that learners make to their colleagues’ assignments
Question Response: A dataset that records how well learners have done in the assessment quizzes (to be reminded that learner IDs are anonymised, but are unique and can be cross-analysed across different datasets)
Step Activity: A dataset that records when a learner has first visited a “step” (a piece of content) and when they have completed it.
Total figures of enrolments, comments, and types of learners (social, active, lurkers…), both weekly and over the whole MOOC term.
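Because the anonymised learner IDs are unique and consistent across these export files, the datasets can be cross-analysed, for example relating how much a learner comments to how well they do on quizzes. A minimal sketch, with hypothetical rows and column names (the real FutureLearn export schema may differ):

```python
from collections import Counter

# Hypothetical rows from the comments log and the question-response
# dataset; the real exports are CSVs, but each row carries the same
# anonymised learner ID, which is what makes the join possible.
comments = [{"learner_id": "a1"}, {"learner_id": "a1"}, {"learner_id": "b2"}]
quiz = [
    {"learner_id": "a1", "correct": True},
    {"learner_id": "a1", "correct": False},
    {"learner_id": "b2", "correct": True},
]

# Join on the anonymised learner ID: comment count vs quiz accuracy.
n_comments = Counter(row["learner_id"] for row in comments)
totals, right = Counter(), Counter()
for row in quiz:
    totals[row["learner_id"]] += 1
    right[row["learner_id"]] += row["correct"]
joined = {lid: (n_comments[lid], right[lid] / totals[lid]) for lid in totals}
print(joined)  # {'a1': (2, 0.5), 'b2': (1, 1.0)}
```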
FutureLearn also provides a voluntary survey completed by learners, in which they share data about, among other things:
Age
Education level
Aims, motivations, and expectations.
Learners complete an entry survey and an exit survey.
• logging the growth and evolution of MOOCs;
• assembling a definitive historic archive for current and future researchers;
• developing the tools to automatically and efficiently log and annotate MOOC related artefacts;
• identifying and establishing a collaborative network of labs and researchers;
Logging
the ‘What? When? Where? Why? and How?’ of MOOC activity
Charting
the growth and evolution of MOOCs
Developing
expertise in MOOC data collection and analysis;
Tracking
platforms and technologies
RESEARCHING
• on learner engagement and motivations: 2 surveys conducted, one with potential MOOC learners, the other with actual MOOC learners
• on educator attitudes
• on marketing: how MOOCs are impacting recruitment on F2F courses
• on customised automated recommender systems in MOOCs
• on perspectives from other stakeholders within HEIs
• on the role of social media in MOOCs
• on MOOCs' impact on the openness agenda
CURATING
• Mendeley group (around 250 tagged academic sources)
• 2 Scoop.it blogs
LOGGING PROGRESS
• MOOC Observatory Blog
• Building a MOOC timeline
HOSTING
• Seminars
• Research discussions
• Summer School (future plans)
• Doctoral consortia (future plans)
This network represents the percentage of learners who made comments in both MOOCs of each pair of UoS MOOCs, relative to the maximum possible shared authors (the lower of the two courses' author counts). The thicker and darker the edge between two vertices, the higher the proportion of learners who commented in both courses.
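The edge weights described above can be computed directly from per-course commenter sets. A sketch with invented course names and user IDs, assuming each course's commenters are available as a set of anonymised IDs:

```python
from itertools import combinations

def shared_commenter_weights(course_authors):
    """For each pair of courses, the number of commenters active in
    both, divided by the smaller course's commenter count (the maximum
    possible overlap) -- the edge weight used in the network."""
    weights = {}
    for (a, aset), (b, bset) in combinations(course_authors.items(), 2):
        denom = min(len(aset), len(bset))
        weights[(a, b)] = len(aset & bset) / denom if denom else 0.0
    return weights

# Invented commenter sets standing in for real anonymised IDs.
courses = {
    "Oceans": {"u1", "u2", "u3"},
    "Portus": {"u2", "u3", "u4", "u5"},
    "Web Science": {"u5"},
}
print(shared_commenter_weights(courses))
```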
To deliver MOOCs well, we need to know our learners' habits. This example shows the comments per day made by learners in one of our MOOCs. We have identified a pattern in which activity increases at the extremes of the week and dips mid-week. This can help us devise more effective and efficient strategies for communicating with our learners and facilitating their learning.
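Detecting that weekly rhythm only needs the timestamps from the comments log. A sketch, assuming ISO-format timestamp strings (the exact timestamp format in the export may differ):

```python
from collections import Counter
from datetime import datetime

def comments_by_weekday(timestamps):
    """Bucket comment timestamps by day of week, to expose the weekly
    rhythm: more activity at the ends of the week, a mid-week dip."""
    days = ["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"]
    counts = Counter(
        days[datetime.fromisoformat(ts).weekday()] for ts in timestamps
    )
    return {day: counts.get(day, 0) for day in days}

# Invented timestamps standing in for a real comments log.
stamps = ["2015-05-18T09:00:00", "2015-05-18T14:30:00", "2015-05-20T11:00:00"]
print(comments_by_weekday(stamps))
```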
We can also identify which learners (and how many) are most active, and compare it with their levels of success in other course activities. In this chart, we see how many learners (y axis) make how many comments (x axis)
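That comments-per-user distribution is a two-step aggregation over the comments log: count comments per user, then count users per comment count. A sketch with invented author IDs:

```python
from collections import Counter

def activity_distribution(comment_authors):
    """From a flat list of comment author IDs, compute how many users
    (y) made how many comments (x) -- the long-tail distribution."""
    per_user = Counter(comment_authors)  # user -> number of comments
    return dict(sorted(Counter(per_user.values()).items()))

authors = ["a", "a", "a", "b", "b", "c", "d", "e"]
print(activity_distribution(authors))  # {1: 3, 2: 1, 3: 1}
```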
For individual datasets, we can determine how much discussion a piece of content generates. In this example, the “zeemap” step is what generated most comments (see next slide)
In our Understanding Language MOOC, there was a step that received more than 10000 comments. Learners would post where they come from and that was reflected in this map. We can identify popular steps with our analytics tools.
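Ranking steps by the discussion they generate is a simple grouped count over the comments log. A sketch, assuming each comment row records the step it was posted on (hypothetical field name):

```python
from collections import Counter

def comments_per_step(comment_rows):
    """Rank steps by how much discussion they generate, from a
    comments log where each row records the step commented on."""
    return Counter(row["step"] for row in comment_rows).most_common()

# Invented rows; a popular step like the Zeemap one would dominate.
log = [{"step": "1.4"}, {"step": "1.4"}, {"step": "1.4"}, {"step": "2.1"}]
print(comments_per_step(log))  # [('1.4', 3), ('2.1', 1)]
```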
Having our data together in the MOOC Observatory (within the Web Observatory architecture) allows us to join datasets and run aggregate analyses, as well as make comparisons between different courses, universities, and platforms. For example, we can analyse how similarly or differently learners behave on Twitter compared to their behaviour on the FutureLearn platform.
The MOBS also aims to provide training to both its members and external parties in the form of seminars, summer schools, webinars, workshops, and consortia. For example, I (Manuel) am going to run a workshop at the FutureLearn forum in London next week about facilitation and mentoring, and will run the same workshop at the JTEL summer school in Ischia this July. All this under the umbrella of the MOOC Observatory.