Mobile Learning Evaluation
MLearnResearch 09 – 15/10/09
Giasemi Vavoula
University of Leicester
Overview
Evaluation (session) in context
Evaluation context
Part 1: What do we evaluate? (a framework)
Part 2: How do we evaluate it? (methods and tools)
Part 3: Practical & ethical considerations
Identifying assumptions
Evaluation (session) in Context
Evaluation
Research
Publishing
Ethics
Theorising
Evaluation Context
Part 1. What do we evaluate?
M3 Evaluation Framework (Vavoula & Sharples 2009)
Part 2. How do we evaluate it?
Methods and tools
Case study of evaluation methods and tools within M3
Framework
Part 3. Practical & Ethical considerations
Who evaluates and who is evaluated
Where
When
Part 1. What do we evaluate?
technology
experience
institutional practice
personal practice
Part 1. M3 Evaluation at three levels
Micro level: user’s experience of the technology
Usability
Utility of functions
Meso level: user’s learning/educational experience
Cognitive learning
Breakthroughs
Breakdowns
Macro level: impact on institutional & personal learning/teaching practice
Appropriation of new technology: unexpected and envisaged use
New practices – further requirements
Part 1. M3 Evaluation in three stages
User’s expectations (data collection)
User’s actual experience (data collection)
Expectations – reality gaps (data analysis)
Part 1. Evaluation at 3 levels, in 3 stages, throughout project lifecycle
[Diagram: lifecycle phases — analyse requirements → design → implement → deploy — with micro, meso and macro evaluation strands running throughout. Micro and meso evaluation require technology robust enough to support a full user trial; macro evaluation requires technology deployed long enough to assess impact.]
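A minimal sketch of how this 3-levels-by-3-stages grid could be held as data, with a simple coverage check. It is illustrative only: the class name, field names and example entries are assumptions, not part of the M3 framework itself.

```python
from dataclasses import dataclass

# The 3 levels and 3 stages of the M3 framework.
LEVELS = ("micro", "meso", "macro")
STAGES = ("expectations", "reality", "gap_analysis")

@dataclass
class EvalItem:
    level: str        # one of LEVELS
    stage: str        # one of STAGES
    method: str       # e.g. "heuristic evaluation", "full-scale user trial"
    data_source: str  # e.g. "design heuristics", "lesson observations"

plan = [
    EvalItem("micro", "expectations", "heuristic evaluation", "design heuristics"),
    EvalItem("micro", "reality", "heuristic evaluation", "expert reports"),
    EvalItem("meso", "expectations", "full-scale user trial", "teacher interviews"),
    EvalItem("meso", "reality", "full-scale user trial", "lesson observations"),
    EvalItem("macro", "expectations", "document analysis", "project proposal"),
    EvalItem("macro", "reality", "stakeholder interviews", "press coverage"),
]

# Coverage check: every level needs data at both collection stages
# before a gap analysis is possible.
for level in LEVELS:
    for stage in ("expectations", "reality"):
        covered = any(i.level == level and i.stage == stage for i in plan)
        print(f"{level}/{stage}: {'covered' if covered else 'MISSING'}")
```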
Part 2. How do we evaluate?
Typical process:
Collect data
Analyse data
Answer/refine research questions
Part 2. Case Study: Myartspace
[Diagram: Myartspace activity flow. Stages include: logon; phone training; example gallery; recap learning task etc.; handout phones; explore museum; collect, collect, collect; share / present.]
Part 2. Case study in greater scheme of things

|                                | traditional classroom | museum school visit    | general museum visit   | mobile        |
| Learning tools                 | Familiar, set         | Familiar, set          | Unpredictable          | Unpredictable |
| Learning method + activities   | Pre-determined        | Pre-determined         | Unknown – some idea    | Unknown       |
| Learning objectives + outcomes | Pre-set, external     | Pre-set, external      | Unknown                | Unknown       |
| Social setting                 | Fixed                 | Known                  | Unpredictable          | Unpredictable |
| Location + space layout        | Fixed                 | Known but not standard | Known but not standard | Unpredictable |
| …                              |                       |                        |                        |               |

(vagueness increases from traditional classroom (--) to mobile (++))
Part 2. Collect data @ all levels
Data sources
Stage 1 - expectations
• Design heuristics
• System documentation
• Experience documentation
• Promotion materials
• Minutes of project meetings
• Project proposal
• Press coverage
• Scoping study / literature review
• Stakeholder/user interviews & focus groups
• …
Stage 2 - reality
• Evaluation outcomes
• Requirements specification
• User observations
• Stakeholder/user interviews & focus groups
• User questionnaires
• User-created artifacts
• Stakeholder consultation workshops
• Heuristic evaluation
• …
Part 2. Collect data @ micro level
Method: Heuristic Evaluation
Collect data re expectations
Established design heuristics
Collect data re reality
Experts undertaking heuristic evaluation
Analyse gaps
Analysis of expert reports and production of (re)design
recommendations
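A hedged sketch of the gap-analysis step for this method: tallying the heuristics flagged across several expert reports, so the most frequently violated heuristics head the (re)design recommendations. The report format and heuristic names are assumptions for illustration.

```python
from collections import Counter

# One list of (heuristic, severity 0-4) findings per expert — format assumed.
expert_reports = [
    [("visibility of system status", 3), ("error prevention", 2)],
    [("visibility of system status", 4), ("consistency and standards", 1)],
    [("error prevention", 3)],
]

# Heuristics flagged by the most experts surface first.
violations = Counter(h for report in expert_reports for h, _severity in report)
for heuristic, count in violations.most_common():
    print(f"{heuristic}: flagged by {count} expert(s)")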
Part 2. Collect data @ micro level
Method: Technical Testing
Collect data re expectations
Data supplied by system requirements
Collect data re reality
Outcomes of system performance tests
Analyse gaps
Comparison of performance data against requirements
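A minimal sketch of that comparison: measured performance (reality) checked against required values (expectations). Metric names and all numbers are invented for illustration; real values would come from the system requirements specification.

```python
# Requirements (expectations) vs measured performance (reality).
# Metric names and all numbers are illustrative assumptions.
requirements = {"photo_upload_s": 10.0, "logon_s": 5.0, "battery_hours": 6.0}
measured     = {"photo_upload_s": 14.2, "logon_s": 3.1, "battery_hours": 6.5}

for metric, required in requirements.items():
    actual = measured[metric]
    # For times, lower is better; for battery life, higher is better.
    ok = actual >= required if metric == "battery_hours" else actual <= required
    print(f"{metric}: required {required}, measured {actual} -> {'PASS' if ok else 'GAP'}")
```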
Part 2. Collect data @ micro level
Method: Full-scale user trial
Collect data re expectations
Examine system documentation (Teacher’s Pack and Lesson Plans, online help) for
descriptions of functionality
Interview teacher prior to lesson to assess level of knowledge and expectations for
functionality
Observe training sessions at museum and school to document how functionality is described
to teachers/students.
Student questionnaires regarding expectations of system functionality in forthcoming lesson
Collect data re reality
Observe lesson to establish actual teacher and student experience of functionality
Interview teacher after the lesson to clarify experience of functionality
Questionnaire and focus groups with students after the lesson to capture experience of
functionality
Analyse gaps
Capture expectations-reality gaps in terms of user experience of functionality through
• reflective interpretation of documentation analysis in the light of observations
• interviews and focus groups with teachers/students
• critical incident analysis with students
Part 2. Collect data @ meso level
Method: Full-scale user trial
Collect data re expectations
Analyse description of educational experience based on Teacher’s Pack and Lesson Plans
Interview teachers and museum educators prior to lessons about what they have planned for
the students’ learning experience
Observe teachers and museum educators while presenting learning experience to students in
the classroom/museum
Student questionnaires regarding expectations of learning experience in forthcoming lesson
Collect data re reality
Observe educational experience in museum/classroom
• Note critical incidents that show new forms of learning or educational interaction
• Note breakdowns
Interviews/focus groups with teachers, museum educators, students on educational
experience in museum/classroom
Analyse gaps
Capture expectations-reality gaps in terms of educational experience through
• reflective interpretation of documentation analysis and observations
• interviews/focus groups with teachers, students, museum educators
• critical incident analysis with students
Part 2. Collect data @ macro level
Method: Full-scale user trial
Collect data re expectations
Analyse descriptions in service promotion materials, original proposal, minutes of early
project meetings
Interviews with stakeholders to elicit initial expectations for impact of service
Collect data re reality
Review of press coverage and interviews with stakeholders to document
impact/transformations effected by the service
Analyse gaps
Reflective analysis of expectations-reality gaps in terms of service impact
Part 2. Collect data @ all levels
Method: Various (requirements analysis)
Collect data re expectations
Scoping study of previous projects and related recommendations
Consultation workshop on ‘User Experience’ to establish requirements
Collect data re reality
Data supplied by evaluation analysis
Analyse gaps
Workshop to finalise educational and user requirements
Revisions of requirements in light of evaluation findings
Part 2. Example of data analysis

|                   | Group avg. | Class |
| Photographs       | 33         | 364   |
| Sounds            | 11         | 121   |
| Written comments  | 7          | 77    |
| Collected objects | 7          | 75    |
| TOTAL             | 58         | 637   |

“A student can effectively process 5-10 items during a single post-visit lesson”
“- It has a code
- I want to take my own picture”
“How will I know what this photo is about?”
“Expect to be able to record what pictures are of”
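A sketch of the arithmetic behind such a table: class totals and per-group averages computed from the collection logs. Group names, counts and the log format are illustrative assumptions, not the Myartspace data.

```python
# Per-group collection counts from one class visit. Group names, numbers
# and the log format are illustrative assumptions.
groups = {
    "group_a": {"photographs": 40, "sounds": 12, "comments": 6, "objects": 8},
    "group_b": {"photographs": 28, "sounds": 10, "comments": 9, "objects": 7},
    # ... one entry per group in the class
}

types = ("photographs", "sounds", "comments", "objects")
class_totals = {t: sum(g[t] for g in groups.values()) for t in types}
group_avgs = {t: class_totals[t] / len(groups) for t in types}

print("class totals:", class_totals, "| overall:", sum(class_totals.values()))
print("group averages:", {t: round(v, 1) for t, v in group_avgs.items()})
```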
Part 2. Example of data analysis
[Diagram: tracing findings across the micro, meso and macro levels — creating and collecting items is quick and easy; children enjoy the creativity and sense of ownership in creating their own content; but the system does not support annotating collected items, causing frustration / confusion; workaround: read the label into the phone after each photo; response: change the system to support photo annotation.]
Part 2. Example of data analysis
[Diagram: tracing findings across the micro, meso and macro levels — creating and collecting items is quick and easy, but decomposing the collected content takes longer; responses: make the website simpler and quicker to use; educate students to regulate collecting; enforce an upper limit on the number of collected items; meanwhile teachers change their practice to run more than one post-visit lesson.]
Part 3. Beyond the case study

|                                | traditional classroom | museum school visit    | general museum visit   | mobile        |
| Learning tools                 | Familiar, set         | Familiar, set          | Unpredictable          | Unpredictable |
| Learning method + activities   | Pre-determined        | Pre-determined         | Unknown – some idea    | Unknown       |
| Learning objectives + outcomes | Pre-set, external     | Pre-set, external      | Unknown                | Unknown       |
| Social setting                 | Fixed                 | Known                  | Unpredictable          | Unpredictable |
| Location + space layout        | Fixed                 | Known but not standard | Known but not standard | Unpredictable |
| …                              |                       |                        |                        |               |

(vagueness increases from traditional classroom (--) to mobile (++))
Part 3. More to consider
Practical and ethical considerations of:
Where, when and how do we collect data
Who evaluates and who is evaluated
Whatever happened to learning outcomes?
Part 3. Where / when / how
[Image slides: Roto et al., 2004]
Part 3. Where / when / how
Technology-based solutions
ASL MobileEye eyetracker
(Wessel et al. 2007; Mayr et al. 2009)
Constraints (other than the
obvious…):
Limited temporal and spatial accuracy
(short fixations may be missed; tricky
to calibrate fixation distance)
Laborious data analysis
Can’t infer cognitive processes…
Part 3. Where / when / how
‘Cooperative Inquiry’-based solutions
(Hsi 2008)
Learner accounts
(diaries, questionnaires, post-interviews, attitude surveys)
Constraints:
Accuracy of recall
Post-rationalisation
Concern of projected image
Fragmentation of learning
Part 3. Where / when / how
Triangulation ever so important: mixed methods
Validate, and also capture different perspectives on:
Video, audio, observation notes, learner-created artifacts, screenshots, interview transcripts…
Constraints:
Synchronisation
Converting into meaningful narratives
Smith et al., 2007
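A minimal sketch of the synchronisation step: timestamped records from several sources merged onto one timeline, so an episode can be read as a single narrative. The timestamps, sources and events are invented for illustration.

```python
import datetime as dt

# Timestamped records from three sources; all entries are illustrative.
video = [(dt.datetime(2009, 10, 15, 10, 2), "video", "group pauses at exhibit 4")]
notes = [(dt.datetime(2009, 10, 15, 10, 3), "notes", "students debate photo choice")]
log   = [(dt.datetime(2009, 10, 15, 10, 4), "log",   "photo captured, id 0231")]

# Merge everything onto one timeline, ordered by time.
timeline = sorted(video + notes + log, key=lambda record: record[0])
for ts, source, event in timeline:
    print(f"{ts:%H:%M} [{source}] {event}")
```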
Part 3. Where / when / how
Mobile technology translates (most often) to
personal technology
Are learners willing to be monitored? How much of their
privacy will they unveil? What if they’re under-age?
Is it OK to monitor everything? How much do we really
need to know?
Even if they agree, is it easy to safeguard personal
data? What are best dissemination practices?
Will users cooperate in practice? E.g. synchronise as
and when needed?
Part 3. Who evaluates / is evaluated
“Will users cooperate in practice?”
Users/participants as co-researchers
• Who defines the agenda?
• Ethics?
• Capacity?
• Commitment?
Part 3. Learning outcomes
Assessing learning processes and outcomes
Classroom: well-established assessment methods
(essays, open-book exam, unseen exam, multiple-choice test)
• Formative assessment: provide feedback on progress
• Summative assessment: judge achievement
– Measure of teaching success
– Measure of learning effectiveness
(Boud 1995)
– Reliability? Validity?
(Knight 2001)
Informal/Mobile: elusive, highly personal learning
outcomes…
• When, what, how learning occurs not pre-determined
• Sometimes not even post-determined…
E.g. Museum learning…
Studies that measure knowledge gains give inconclusive results,
reporting variable amount and nature of cognitive learning
Rennie & McClafferty 1995
Knowledge gains are hard to achieve during a short visit in an
unfamiliar context
Gail Donald 1991
The main conceptual gains are in consolidating/ reinforcing
previous knowledge, not acquiring new knowledge
Falk 2004
“Measurements of specific
impacts with the traditional tools
of experimental design are often
inappropriate for the confounding
variability of informal settings,
making the result of such
assessment often disappointing or
insignificant”
(Bitgood et al. 1994)
“Each visitor has
a unique experience”
(Rennie & McClafferty 1996)
“Whilst many studies use
performance on assessment as a
proxy for learning, this remains
problematic for several reasons.
Perhaps most importantly, it is
assumed that what has been
learnt can be performed; that
there is a correlation between
learning and assessment. This is
evidently not the case.”
(Oliver & Harvey 2002)
Part 3. Learning outcomes
Learner perceptions
Attitudes towards the technology
Enjoyment of experience
Watch for processes which indicate that
learning may be happening
showing responsibility for and initiating own
learning (e.g. by writing, drawing, or taking
photos by choice; deciding where and when
to move)
being actively involved in learning (e.g. by
absorbed, close examination of resources; or
persevering with a task)
making links and transferring ideas and skills
(e.g. by comparing evidence)
sharing learning with experts and peers (e.g.
by talking and gesturing; or asking each
other questions)
Griffin & Symington, 1998
Assess learner-created artifacts
online media they create, personal reflective
accounts such as blogs and e-portfolios, logs
of interactions with and through the
technology
Longitudinal studies
Validated attitude measurement scales
needed
Critical incident analysis may be helpful
– but outcomes need to be triangulated
What makes a good blog? Assessment
standards still to be agreed…
New research mindsets
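A sketch of what coding observations against the Griffin & Symington (1998) indicators could look like. The keyword rules are a crude illustrative stand-in for a human coder's judgement, and the note snippets are invented.

```python
# Indicator categories after Griffin & Symington (1998); the keyword
# rules and notes below are illustrative assumptions only.
indicators = {
    "initiating own learning": ["by choice", "decided to", "own photo"],
    "actively involved": ["absorbed", "close examination", "persevere"],
    "making links": ["compared", "like the one", "transfer"],
    "sharing learning": ["asked each other", "explained to", "gestured"],
}

notes = [
    "Two students compared the shield with the one in the previous gallery",
    "Group absorbed in close examination of the loom for several minutes",
    "S3 decided to take her own photo of the label",
]

for note in notes:
    hits = [cat for cat, kws in indicators.items()
            if any(k in note.lower() for k in kws)]
    print(f"{note!r} -> {hits or ['uncoded']}")
```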
Conclusion
Notice the assumptions:
Mobile learning happens in discrete, time-bound
episodes
(Mobile) learning is clearly distinguishable from other
forms of human activity
More?...
Related publications
1. Vavoula, G., Pachler, N., & Kukulska-Hulme, A. (Eds.) (2009). Researching Mobile Learning: Frameworks, methods and research designs. Peter Lang.
2. Vavoula, G., & Sharples, M. (2009). Meeting the Challenges in Evaluating Mobile Learning: A 3-level Evaluation Framework. International Journal of Mobile and Blended Learning, 1(2), 54-75.