1. Researching learners’ digital experiences
Rhona Sharpe: JISC Learner Experience Support & Synthesis projects; JISC SLiDA project; Chair, ELESIG; Oxford Centre for Staff and Learning Development, Oxford Brookes University
Greg Benfield: JISC Learner Experience Support & Synthesis projects; JISC SLiDA project; Oxford Centre for Staff and Learning Development, Oxford Brookes University
Helen Beetham: Consultant to JISC; JISC LLiDA project
2. Why research learners’ experiences of e-learning? Learners’ experiences are a key measure of our success. Technology is pervasive in learners’ lives. Learners’ perspectives are surprising, demanding, innovative. Learners have new expectations of education, thanks to technology. Learners need new skills and strategies for the digital age.
9. Sustained engagement: Students feel involved and part of the project (logo, contact). Using a flexible and responsive approach to data collection. High involvement from academic staff in each of the disciplines. Providing incentives and benefits: students get something out of the project, including the benefits of reflecting on their learning.
10. Participatory approach “Nothing about me, without me” Involving learners as consultants and partners Early and continued participation Meaningful and useful outcomes
11. What if you can engage learners over time, in holistic, participatory research?
12. Context Technology use was prompted by the course, the tutor, peers, work and/or specific learning requirements. Learners use technology to create their own environments which meet their needs and the demands of their context. We noted the agility of some learners at finding and using tools, skills and social networks to support their study in creative ways.
17. What if you can engage learners over time, in holistic, participatory research? Uncover hidden practices Creative appropriation Agile adopters Situated practices
The recognition that learners were using numerous personal and social technologies beyond those recommended by tutors was termed the ‘underworld’ by Linda Creanor and colleagues in their report of interviews with 52 adult learners (Creanor et al., 2006). Their choice of terminology exemplified the extent to which their research approach had revealed aspects of learners’ lives that had not previously been observed in the course evaluations typical of the time. Learners’ unexpected uses of technology included turning to the Internet as the first port of call for information and the pervasive use of social networking tools. Such findings have since been replicated in larger samples (the JISC Student Expectations and Great Expectations surveys; annual surveys at Edinburgh and Oxford; Melville, 2009). This has led to a desire to find out exactly how learners are making use of the technology available to them, even if not in the ways their teachers and tutors expect. Creanor et al. used interviewing with a very open set of questions, analysed carefully by a team of three using interpretative phenomenological analysis (IPA). Phase 2 used these tools. Nothing surprising here, and it might seem obvious to say, but it needs mentioning: learner experience research needs methods which capture and retain student voices. But it is the approach that underpins those methods that is more interesting ...
Well, first up, we found that context is incredibly important. Everyone has been saying that for a while, but ‘context’ used to be seen as the course (when research meant course evaluation) or, more recently, as the VLE. If you do holistic research, you find the different learning environments learners are operating in, and for some (the agile adopters) the environments they are creating. But even for the agile adopter, that is context dependent: they might be expert in one environment but not in the next. The only way to see this is through close examination of a person. You can’t get at it from a survey, or even from a single interview. THIS IS IMPORTANT when you are concerned, as we are, with learner development and teaching practice. This careful understanding is so important when faced daily with this kind of *&?! (Charles Wankel’s new book).
Also see STROLL’s guidelines for induction
So, in summary, the benefits of conducting holistic, participatory research are as follows (while remembering LexDis’s warning that not all learner experience research is participatory):
1. It uncovers hidden practices: we found situated learning practices, and a few examples of creative appropriation (not many).
Second, it improves impact. We saw student-produced outputs (e.g. LexDis) which help us to work out not just what the experience is, but how we should respond as well. The SLiDA case studies are examples of impact. And we are starting to produce conceptual accounts which arise from the actual learner experience, but these have some way to go, I think.
Sampling
The classic problem with sampling in online research is that you don’t know quite who you are talking to. Our research shows that this doesn’t always have to be a problem. Most of our projects used purposive sampling: we want the articulate, reflective, skilled learners, but there are issues about representativeness. For example, Dujardin talks about the need to have a ‘key informant’ willing to engage in reflective conversations with tutors, to help us all understand participation patterns in an online learning community. Email interviewing, for instance, has already been noted to have problems with providing data only from IT-literate individuals who have a preference for the written word (Hunt & McHale, 2007). Do note, though, that attrition tends to be better with participatory approaches: Thema, LexDis and LEaD all exceeded their expected numbers.

Elicitation
In qualitative research, it is time consuming and disheartening to spend time conducting and transcribing interviews and not get the data you needed. The methods described here help students to feel part of the research project, to understand what is wanted from them, and to have opportunities to clarify, question and validate their responses. The researchers’ personalised requests for information, and their conversational style and replies, were important in eliciting the data needed. Elicitation is also helped by having artefacts to prompt conversation, e.g. interview plus and card sorts, although Towle & Draffan (2009, Greenwich presentation) note that interview plus takes time for the researcher (they advise seeing the artefacts in advance) and that researchers need to understand the context in which the artefact was produced in order to be able to talk about it. Also popular is using students as researchers (Ainley, 2009).

Dealing with the data
The powerful nature of learners’ voices. Avoiding the temptation to over-emphasise selected verbatim quotes at the expense of analysis and synthesis of qualitative data.
Remembering that storytelling still needs to come from a research methodology, e.g. Thema’s good explanation of what case studies (after Yin, 2003) can and can’t do. Handling large amounts of multimedia data, e.g. STROLL’s mindmaps.

Representing learners’ voices
Producing easily digestible and readable findings from large amounts of data, e.g. personalised, vivid case studies of individual learners (e.g. Thema).

Ethical issues
Thema: how to deal with students who let the researcher know that they were having problems.

Informed consent
Thema: the difference between being anonymous and being impossible to identify. Our participant information sheets have to acknowledge this danger up front. Explaining to students just how widely their middle-of-the-night study bedroom video diary will be disseminated at national conferences.