2. What to expect: Changing learning environment; Challenges in capturing data; Changing research and analysis methods; Analytics on a Massive Open Online Course; Conclusions
3. The changing learning environment http://bit.ly/gmNndn
4. The Web in 2010 http://www.flowtown.com/blog/have-we-reached-a-world-of-infinite-information?display=wide
13. PLENK Analysis: Interactions on the Moodle. Postings by participants across six weeks of PLENK; facilitator posts in PLENK
14. PLENK Analysis: Interactions on the Moodle (SNAPP and NetDraw). The complex network a facilitator's post generated; relationships between topics in a discussion in week 1
17. Analytics: student blogging experience. 'You will be noticed only if you tweet.' MOOC participant blog analytics comparison, CritLit versus PLENK: Moodle introductions 42 vs. 197; blog visitors 170 vs. 566; blog visits 550 vs. 1478; residences 29 vs. 69. http://helistudies.edublogs.org/2010/11/18/plenk-in-the-whole-world-almost/
18. PLENK Analysis: use of Twitter Twitter connections between PLENK participants
19. PLENK Analysis: use of Twitter #tags related to Twitter posts in the PLENK Daily - six weeks duration
Editor's notes
Hi, my name is Rita Kop, and this is Helene Fournier. We are researchers at the National Research Council of Canada and, amongst other projects, we are working on the PLE project, on the research and development of a Personal Learning Environment. Hanan Sitlia would have liked to be here today, but unfortunately she couldn't.
We have outlined here the major points of this presentation. We would like to start with influences on the learning environment and on learning itself, and with the challenges these changes might pose to research and analysis methods. Helene will then continue by telling you more about the research on Massive Open Online Courses that we have been involved in, and with some conclusions drawn from the research.
The proliferation of Information and Communications Technology in recent years has changed the educational landscape. The Web has added to the complexity of our lives: it adds huge amounts of information and resources, but at the same time it creates a plethora of new opportunities for learning. Technological development has made learning environments outside the institutional structure a reality, and it has led faculty members to experiment with open educational resources and cloud computing, acknowledging that informal and self-directed learning now form part of our everyday existence.
The learning environment has moved more and more onto the Web. With the emergence of social media, the Web itself is changing and user-generated content is growing. As you can see here, the number of websites produced is growing steadily, and 70% of the 'digital universe' in 2010 was created by users: individuals at home, at work, and on the go. Of course, if you carry out research on networked and Web-based learning and learning environments, these changes also affect the nature of the research and of its analysis.
The research we will be talking about here was on a Massive Open Online Course, PLENK2010, on the subject of Personal Learning Environments, Networks and Knowledge. It did not consist of a body of content and was not conducted in a single place or environment; it was distributed across the Web. This type of learning event was first used by Stephen Downes and George Siemens and is called a 'connectivist' course, linked to their developing learning theory, connectivism. I don't have time to discuss connectivism here; that is a different presentation. Downes highlights that for these courses to run well, they should be based on four major types of activity, and these were the activities we researched:
1. The aggregation of resources. One of the aggregation strategies was 'The Daily' newsletter.
2. A remixing stage: after reading, watching or listening, people reflect on what has been collected and make connections between different resources.
3. After this stage a form of repurposing was expected to take place, in which participants would create something of their own. In PLENK2010 the facilitators suggested tools that participants could use to create their own content. The job of the participants was to use the tools and practice with them. Facilitators demonstrated, gave examples, used the tools themselves, and talked about them in depth. It was envisaged that with practice participants would become accomplished creators and critics of ideas and knowledge.
4. The fourth stage was a sharing stage, in which participants were encouraged to share their work with other people in the course, and with the world at large.
The setting: the MOOC researched was organized by two educational institutions and the National Research Council of Canada. The subject of the course was Personal Learning Environments, Networks and Knowledge (PLENK). It was a free course which lasted 10 weeks and on which 1641 participants were registered. PLENK2010 did not consist of a body of content and was not conducted in a single place or environment; it was distributed across the Web. Participants were able to work completely in private, not showing anything to anybody if they wished to do so. Facilitators emphasized that sharing would always be the participant's choice. A course identifier tag (#PLENK2010) was used to recognize anything that was created in relation to the course, also outside the course environment, on sites such as blogs and social networking sites and through micro-blogging tools such as Twitter. That is how content related to the course was recognized, aggregated, and displayed in 'The Daily' newsletter for the course. In addition, a Moodle Learning Management System with a wiki was used to hold discussions and display course resources, including information on schedules and speakers for the twice-weekly Elluminate sessions. Learner support was provided by four facilitators in the form of videos, slideshows and discussion posts, in addition to blog posts, feedback on blogs and Moodle discussion posts. Elluminate was also used once a week by facilitators for a synchronous discussion and chat session on that week's subject.
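The tag-based aggregation described here can be sketched in a few lines. This is a minimal illustration only: the post records, field names and `aggregate` helper are hypothetical, not the actual gRSShopper implementation.

```python
# Sketch of tag-based aggregation: keep only items carrying the
# course tag. Posts and field names are invented for illustration.
COURSE_TAG = "#PLENK2010"

posts = [
    {"author": "alice", "text": "New post on PLEs #PLENK2010"},
    {"author": "bob", "text": "Holiday photos"},
    {"author": "carol", "text": "Week 1 reflections #plenk2010"},
]

def aggregate(posts, tag):
    """Return only the course-related (tagged) posts."""
    return [p for p in posts if tag.lower() in p["text"].lower()]

daily_items = aggregate(posts, COURSE_TAG)
print([p["author"] for p in daily_items])
```

Matching case-insensitively means a post tagged #plenk2010 is still picked up for the newsletter.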
Participants would not only use the Moodle Learning Management System, but also produce blogs, videos, animations, slideshows, wordles and pearltree networks, and used social networks such as Twitter, Second Life, Facebook and Friendfeed. De Laat (2006) highlighted the complexity of researching networked learning and emphasized as key problems the issue of human agency and the multitude of issues involved, such as the dynamics of the network, power relations on the network, and the amount of content generated, which might be difficult to capture in a distributed learning environment. He suggested as research methods social network analysis in addition to more traditional forms of research data collection, such as interviews and surveys. This seems a viable choice of research methods. Analytics could be used as a form of Social Network Analysis and could clarify who the central nodes on the network were, in other words which people on the network performed the vital role of connecting the otherwise un-connected. It could also provide information on the importance of 'connectors' to other networks, which would be important in finding out who the innovators on the network were, i.e. the ones linking vital information streams. Moving our research onto an open online network has also made us rethink the ethics of the research. It has given a new meaning to the words 'informed consent', for instance. Miller and Bell (2002) argued that gaining 'informed' consent is problematic if it is not clear what the participant is consenting to and where 'participation begins and ends' (p.53), which has implications for the data used. Boyd (2010) even contends that data could pose a threat to people when misused, or used for different purposes than those for which it was supplied in the first place. Individuals should have the ability to protect their information, and in the new online environment, which is controlled by corporate entities, this is not always possible.
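The idea of using analytics to find central nodes can be illustrated with a simple degree count. The reply edges and participant names below are invented, and a real analysis would use richer measures (betweenness, brokerage) than raw degree:

```python
# Degree count over a hypothetical reply network: whoever accumulates
# the most connections is a candidate "central node" or connector.
from collections import Counter

edges = [  # (poster, replied_to) pairs, invented for illustration
    ("facilitator", "ana"), ("facilitator", "ben"), ("facilitator", "cam"),
    ("ana", "ben"), ("dee", "facilitator"),
]

degree = Counter()
for a, b in edges:
    degree[a] += 1  # each edge adds one connection to both endpoints
    degree[b] += 1

central, links = degree.most_common(1)[0]
print(central, links)
```

On this toy data the facilitator comes out as the most connected node, mirroring what the SNAPP visualizations showed for the Moodle forums.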
Researchers should at least anonymise data in order to help with privacy issues. People should also have the choice to opt in or opt out. If someone is not aware that the information/knowledge is being collected, or of how it will be used, he/she has no opportunity to consent or withhold consent for its collection and use. This invisible data gathering is common on the Web (van Wel and Royakkers, 2004, pp.129, 133). We have had extensive communication with the NRC ethics board on this. That is why we only used data related to the course tag #PLENK2010.
If people are encouraged to move away from the institution for their learning, it is important to find out how relevant to the learning experience the informal (online) networks are in which they find their information. A network in the context of this paper is an open online 'space' where people meet, as nodes on networks, while communicating with others and while using blogs, wikis, audio-visuals and other information streams and resources. We decided to try out some learning analytics on our data, and we used the definition of the conference organizers: 'Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs.' This means different types of analysis might be required for different purposes. It also follows that not only quantitative but also qualitative data analysis should be used, as data collection would relate not only to increasing the effectiveness of learning, for instance by showing trends in use, but also to possible changes in the learning process. The NRC research team decided to use a mixed-methods approach and a variety of research techniques and analysis tools to capture the diverse activities and the learning experiences of participants on PLENK2010. Learning analytics tools were used as a quantitative form of Social Network Analysis to clarify activity and relationships between nodes on the PLENK network. Three surveys were carried out at the end of the course, and after it had finished, to capture learning experiences during the course: a course-end survey (N=62), an 'active producers' survey (N=31) and a 'lurkers' survey (N=73). In addition, qualitative methods in the form of virtual ethnography were used. A researcher was an observer during the course, collecting qualitative data through observation of activities and engagement.
She also carried out a focus group in the final week of the course to gain a deeper understanding of particular issues related to the active participation of learners . Discursive data was also collected from the different sites of learning.
As vast amounts of discursive data were generated and collected, computational tools such as SNAPP, NetDraw and Nvivo were used to identify themes in the data and for analysis and interpretation of the qualitative research data. For the data analysis on the course, the Moodle data-mining functionality was used; it provided participant details, their level of use and access of resources, information on course activities, and the discussions taking place in the course forums. The gRSShopper aggregator statistics functionality provided details on course-related use of blogs and micro-blogging tools such as Twitter. Some analytics and visualization tools, such as the Social Networks Adapting Pedagogical Practice (SNAPP) tool, were also used to deliver real-time social network visualizations of Moodle discussion forum activity and Twitter activity, while the visualization tool NetDraw was used to create an ego network to provide an understanding of the role of a particular actor in a discussion. Because of the volume of data generated by the 1641 participants and facilitators, and the restrictions on time to produce this paper, quantitative analysis of blog posts, Twitter and Moodle participation was used, but the qualitative analysis of data was restricted to the Moodle environment and to blogs that were representative of all the blogs produced by participants. (SNAPP is accessible online at https://topaz.ad.uow.edu.au/SNAPP/Menu.html; NetDraw at http://www.analytictech.com/downloadnd.htm.)
We had not used visualization tools before, other than graphs and figures generated by, for instance, Excel spreadsheets. We wanted to play with the tools to find out if they would add anything to the other research methods that we had used in the past. The questions put up on the screen were the main motivators for us to try out some analysis tools. Questions have also been raised by some social scientists, who think it a mistake to leave the analytics to the commercial companies operating on the Web, as their interpretation of data might not be neutral but influenced by other interests (Lazer et al., 2010). I'll hand you over now to Helene, who will give you an insight into what we did on the MOOC and how we used analytics to better understand the learning taking place on PLENK2010.
Who were the participants? Participants on the PLENK course were mainly employed in education, research, and the design and development of learning opportunities and environments. They were teachers, researchers, managers, mentors, engineers, facilitators, trainers, and university professors. Chart 1 shows PLENK participants' ages and Figure 2 shows a Google Map, instigated by one of the PLENK participants, representing participants' residences.
Challenges of analyzing and visualizing participation on the course:
- The number of registered participants continued to rise throughout the course, from 846 to 1641 by the end.
- Highlight tools used and participation over the 10 weeks.
- A high number of blog posts were generated related to the course (949) and an even higher number of Twitter contributions (3459).
- The #PLENK2010 identifier facilitated the easy aggregation of blog posts, del.icio.us links and Twitter messages produced by participants, which highlighted a wide number of resources and links back to participants' blogs and discussion forums, thus connecting different areas of the course.
- Registration was high, but an examination of contributions across weeks (i.e., Moodle discussions, blogs, Twitter posts marked with the #PLENK2010 course tag, and participation in live Elluminate sessions) suggested that about 40-60 individuals on average contributed actively to the course on a regular basis by producing blog posts and discussion posts, while visible participation was much lower for the majority.
This gives us a graphical representation of an upward trend over time for blogs, course registrations, and Twitter activity, but it doesn't help in interpreting the interactions between them or any impacts on the quality of learning.
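The tally behind an estimate like '40-60 active contributors per week' can be sketched as a count of distinct authors per week. The contribution records below are invented for illustration:

```python
# Count distinct contributors per week from (week, author) records,
# the kind of aggregation behind the active-contributor estimate.
from collections import defaultdict

records = [  # invented sample contributions
    (1, "ana"), (1, "ben"), (1, "ana"),
    (2, "ana"), (2, "cam"), (2, "dee"), (2, "ben"),
    (3, "ana"),
]

active = defaultdict(set)
for week, author in records:
    active[week].add(author)  # sets ignore repeat posts by one person

weekly_active = {week: len(authors) for week, authors in sorted(active.items())}
print(weekly_active)
```

Using a set per week means a prolific poster counts once, which is what distinguishes 'active individuals' from raw post volume.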
In the Moodle forums for PLENK2010, general trends in posting behaviors indicate that there was a peak in activity in Week 2, with a slight upward trend in blog and Twitter posts as well. This was followed by a sharp decline in the number of posts in all three media (Moodle, blogs, and Twitter) in Week 3, a slight increase in Week 4, and a steady decline again in Weeks 5 and 6. Interestingly, the number of posts by course facilitators follows similar trends, peaking in Week 2 and then showing a steady decline in Weeks 5 and 6. The facilitator(s) played an important role in triggering discussion, questioning, providing feedback, and sustaining interaction amongst participants. Let's contrast this graphical representation of the number of forum posts, blogs and Twitter activity (click to next slide) with a more dynamic representation using SNAPP and NetDraw, to focus in on the shape and quality of the social networks and who the important contributors to the hub might be.
The graphic in the previous slide showed the number of times people used particular tools, but it does not show how these interactions took place. We have been experimenting with several analytics tools, such as the social network analysis tool SNAPP, used as a bookmarklet in the browser. Activating the SNAPP tool results in an online network visualization; the results of these interactions were exported to both VNA (edgelist) and GraphML formats and visualized in NetDraw to create social network visualizations that illustrate the role an actor plays in a particular discussion. SNAPP delivers real-time social network visualizations of forum discussion activity, in contrast to the vast majority of analytics derived from LMS reports, which usually provide system data on the number of sessions (log-ins) by users, the amount of dwell time (how long the log-in lasted) and the level of content downloads. The figure on the left shows the facilitator as important (the red dot), but also that there are other participants with a strong influence on the network through their connections with others. The figure on the right shows the relationship between the main topic of PLE/PLN in the Moodle discussion thread and the concepts connected to it, and their relationships to each other, for Week 1 only. The free network visualization program Pajek was used to provide colored highlights of important nodes, either people or themes. Variable-width nodes and arcs can also be calculated and analyzed in depth, in order to determine proximity-distance between people, concepts and perspectives on the network, but this was beyond the scope of the current research. Imagine the complexity of visualizing relationships between topics and discussions for the entire course, a period of 10 weeks! Participants in PLENK2010 did express a lack of support/tools for tracking discussions or topic threads, which sometimes extended over several weeks.
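The ego-network idea that NetDraw was used for can be sketched in plain Python: keep the focal actor, their direct contacts, and the edges within that neighbourhood. The edge set and actor names below are invented:

```python
# Ego network extraction: the focal actor, their direct contacts,
# and only the edges among that neighbourhood. Data is invented.
edges = {("fac", "ana"), ("fac", "ben"), ("ana", "ben"),
         ("cam", "dee"), ("ben", "cam")}

def ego_network(edges, ego):
    # Direct contacts of the ego
    neighbours = {n for e in edges if ego in e for n in e if n != ego}
    keep = neighbours | {ego}
    # Keep only edges whose both endpoints are in the neighbourhood
    return {e for e in edges if e[0] in keep and e[1] in keep}

ego = ego_network(edges, "fac")
print(sorted(ego))
```

Edges entirely outside the neighbourhood (here cam-dee, ben-cam) drop out, leaving just the sub-network that shows the focal actor's role in the discussion.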
Tools for visualizing the interaction of participants in the Moodle, in blogs, and on Twitter could help facilitators and learners fill in gaps in their wayfinding and sensemaking in the context of a MOOC. Projects are currently underway that explore information visualizations with multiple coordinated views, which enable users to rapidly explore complex data and discover relationships. However, it is usually difficult for users to find or create the coordinated visualizations they need. Snap-Together Visualization allows users to coordinate visualizations to create multiple-view interfaces that are customized to their needs. Users query their relational database and load results into the desired visualizations. Then they specify coordination between visualizations for selecting, navigating, or re-querying. Developers can make independent visualization tools 'snap-able' by including a few simple hooks. http://www.cs.umd.edu/hcil/snap/. So in this sense, users could feasibly visualize what's going on in the Moodle in relation to blogs, Twitter, or various other social networking sites or virtual worlds (identified by the course hashtag). SNAPP is promoted as 'allowing academic staff' to identify patterns of student behavior and facilitate appropriate interventions as required. However, some students/participants in PLENK2010 used analytics to get a better sense of the network or community of learners they were engaging with, their own contributions, and how participation on a network might have impacted their learning.
In order to better understand the nature of the Moodle forum data, we used Nvivo to arrange discussion data into themes. This visualization places LEARNING at the center, connected to sub-themes such as the Personal, the Workplace, Environment and Resources. A quantitative analysis (word count) of themes for the Week 1 discussion of PLEs/PLNs revealed the importance of the concepts of 'network', 'me' and 'learning' for participants, and of exploration and questions to a lesser extent. The importance of the 'network' in the discussion is thus highlighted, followed by the knowledge, ideas, thinking, information, tools, and experience that promote learning and the use of concept or mind mapping, with personal agency underscored in words such as I, me, my, own.
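A word-frequency tally of the kind produced with Nvivo can be sketched as follows. The snippet of discussion text and the stopword list are invented for illustration, not PLENK data:

```python
# Word-frequency tally over a snippet of (invented) discussion text;
# after removing stopwords, the dominant content words emerge.
from collections import Counter
import re

text = ("My network helps my learning; the network gives me ideas, "
        "tools and information for learning.")

stopwords = {"my", "the", "me", "and", "for", "helps", "gives"}
words = re.findall(r"[a-z]+", text.lower())
freq = Counter(w for w in words if w not in stopwords)
print(freq.most_common(3))
```

On real forum data, 'network', 'me' and 'learning' surfaced as top terms in exactly this way; here the toy text makes 'network' and 'learning' the most frequent words.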
Outside the course Moodle environment, one of the applications used extensively by learners was Twitter, a micro-blogging tool in which people can use a limited number of characters to communicate a message to others. It is also possible to retweet and redistribute a message, or to reply directly to someone who sent a message out, as expressed in the Tweets graph. We wanted to see how well connected participants were to the outside world and how they used Twitter: for chat, for messages, or to distribute links to resources. As you can see in the graph of tweets to external links, most tweets were links to resources, such as blog posts, videos or interesting papers to read.
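The split between chat, direct replies/retweets and resource links can be sketched as a rough classifier. The sample tweets and the classification rules below are illustrative assumptions, not the method actually used on the PLENK data:

```python
# Rough tweet classifier: resource link vs. reply/retweet vs. chat.
# Sample tweets and rules are illustrative assumptions only.
import re
from collections import Counter

def classify(tweet):
    if re.search(r"https?://", tweet):
        return "link"            # distributing a resource
    if tweet.startswith(("RT ", "@")):
        return "reply/retweet"   # direct communication
    return "chat"                # general conversation

tweets = [
    "Great paper on PLEs http://example.org/ple #PLENK2010",
    "@ana agreed, see you in Elluminate #PLENK2010",
    "Enjoying week 2 so far #PLENK2010",
]

tweet_types = Counter(classify(t) for t in tweets)
print(tweet_types)
```

Checking for a URL first reflects the finding above: on PLENK, link-sharing dominated the other uses of Twitter.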
Administrators and academic staff want to understand the scope and quality of interactions when measuring outcomes of learning in their courses, but more and more participants also want to use analytics to measure their own impact and contributions to the hub of activity, in courses and in various other areas of activity. One participant in PLENK2010 used Google Analytics to understand the traffic to her blog and the impact of visitors/replies, comparing and contrasting the numbers during the previous MOOC (CritLit) and the current MOOC (PLENK2010).
Beyond analytics applied to the LMS/Moodle for the course, if we look at the reach of activities like Twitter outside the bounds of the course, we see here a visualization of Twitter connections between PLENK participants over a six-week period, through the use of the course tag. But, like the SNAPP visualization tool, it shows the connectedness and who might be important, with lots of links to and from participants, but it doesn't tell us anything about the quality of the interactions or contributions. Some PLENK participants functioned as 'information hubs' while others did not communicate much; these are the outliers in this network.
This image illustrates the complexity of connections between people on PLENK on Twitter. A #tag can be used as an identifier of a particular subject, in this case a course, and as you can see, PLENK participants also sent messages and links to these other #tag communities. It doesn't tell us anything about the quality of the interactions, but it shows the connectedness between the course (tag) and other related topics, resources, and conversations, and their reach expanding outwards away from the course.
DISCUSSION AND CONCLUSION. Data is not always complete. At the moment, the data from MOOCs includes data from the LMS/Moodle, which gives only a very small picture of the learning and learning-related activities going on, plus data from blogs, Twitter and social networking sites. We have only scratched the surface in connecting all this data together as part of a comprehensive research methodology. From January 10 to February 20, 2011, there was an Open Online Course offered: Introduction to Learning and Knowledge Analytics http://learninganalytics.net/syllabus.html