E-Assessment for Learning: balancing
    challenges and opportunities


             Denise Whitelock
     The Open University, Walton Hall,
         Milton Keynes MK7 6AA
        d.m.whitelock@open.ac.uk
I hate marking but want the tasks and feedback to
assist student learning
The e-Assessment Challenge

• Constructivist
  Learning –
  Push


• Institutional
  reliability and
  accountability –
  Pull
MCQs: Variation on a theme




Example of LAPT Certainty-Based Marking, UK cabinet ministers demo
exercise showing feedback, University College London, Tony Gardner-Medwin

Drug Chart Errors and Omissions, Medicines Administration Assessment,
Chesterfield Royal Hospital
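
Certainty-based marking rewards students for knowing how sure they are. Below is a minimal sketch using the mark scheme usually quoted for LAPT (1/2/3 marks for correct answers at confidence levels 1/2/3, and 0/-2/-6 for wrong ones); treat the exact values as an assumption for illustration rather than a specification of the demo above.

```python
# Illustrative certainty-based marking (CBM) in the style of LAPT.
# The mark values are the commonly cited LAPT scheme; treat them as an
# assumption rather than a definitive specification.

CBM_MARKS = {
    # confidence level: (mark if correct, mark if wrong)
    1: (1, 0),
    2: (2, -2),
    3: (3, -6),
}

def cbm_mark(correct: bool, confidence: int) -> int:
    """Return the CBM mark for one answer at the stated confidence (1-3)."""
    reward, penalty = CBM_MARKS[confidence]
    return reward if correct else penalty

# A correct answer at low confidence earns 1 mark, while a confidently
# wrong answer costs 6, so guessing at high confidence does not pay.
print(cbm_mark(True, 1))   # 1
print(cbm_mark(False, 3))  # -6
```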
MCQs: High Stakes Assessment




Example of practice Thinking Skills Assessment (TSA) question,
Admissions Interview, Cambridge Assessment, Steve Lay

Example of practice Thinking Skills Assessment (TSA) feedback,
Admissions Interview, Cambridge Assessment, Steve Lay
Scaffolding and High Stakes assessment

•    Maths for Science


•    Tutorless course


•    Competency led


•    No point to cheat


•    Web home exam


•    Invigilation technologies
Self diagnosis

                 •   Basic IT skills, first
                     year med students
                     (Sieber, 2009)
                 •   Competency based
                     testing
                 •   Repeating tests for
                     revision
                 •   Enables remedial
                     intervention
Theoretical drivers for Assessment

•   Piaget (1930): the individual manipulating their surroundings --
    mental representations can be tested; induction of rules
    and their testing ... Nuffield science
•   Cognitive psychology (70s, 80s): How are these mental
    representations stored? Classification development
    and how this can go wrong ... misconceptions and mal-rules ... AI
•   Bruner (1982): assessment tasks will match
    competency levels depending on the level of
    help ... SCAFFOLDING
Scaffolding and Interactivity: OU Science
                Foundation Course

•    Interaction, Feedback loops
      •   Tell, Explore, Check
      •   Predict, Look and Explain
•    Entering the discourse of a subject via audio
     feedback
•    Scaffolded text feedback (Bruner & Wood)
•    SHOW ME button
Interactive Tasks

                    •   Games


                    •   Simulations


                    •   Making the abstract
                        concrete


                    •   Directing the sequence
                        of an animation
Interactivity and Cognitive Change Scores

[Bar chart comparing interactivity scores with cognitive change scores
for four topics: Galapagos, Organic Molecules, Meiosis & Mitosis, and
Seismic Waves]
Theoretical drivers: Social Constructivism
•   Vygotsky (1978): individuals shape cultural settings, which in
    turn shape minds; cognition is no longer purely individual


•   Activity theory (Engeström 1987): tool mediation


•   Situated Cognition (Lave and Wenger 1991): authentic
    assessment


•   Peer interaction and assessment


•   Learning conversations Laurillard (2002)
Characteristics                                  Descriptor

Authentic                   Involving real-world knowledge and skills
Personalised                Tailored to the knowledge, skills and interests of each
                            student
Negotiated                  Agreed between the learner and the teacher
Engaging                    Involving the personal interests of the students
Recognise existing skills   Willing to accredit the student’s existing work
Deep                        Assessing deep knowledge – not memorization
Problem oriented            Original tasks requiring genuine problem solving skills
Collaboratively produced Produced in partnership with fellow students
Peer and self assessed      Involving self reflection and peer review

Tool supported              Encouraging the use of ICT




       Elliott’s characteristics of Assessment 2.0 activities
Authentic assessments: e-portfolios




  Electronic NVQ portfolio cover contents page, OCR IT Practitioner, EAIHFE, Robert Wilsdon
Candidate Assessment Records section, OCR IT Practitioner, EAIHFE, Robert Wilsdon
Building e-portfolios on a chef’s course




Evidence of food preparation skill for e-portfolio, Modern
Apprenticeship in Hospitality and Catering, West Suffolk College,
Mike Mulvihill

Food preparation for e-portfolio, Modern Apprenticeship in
Hospitality and Catering, West Suffolk College, Mike Mulvihill
Sharing e-portfolios: The Netfolio concept

•   Social constructivism
•   Connecting e-portfolios (Barbera, 2009)
•   Share and build upon a joint body of evidence
•   Trialled with 31 PhD students at a virtual university
•   Control group used but Netfolio group obtained higher
    grades
•   Greater visibility of revision process and peer
    assessment in the Netfolio system
Peer Assessment and the WebPA Tool

                     •   Loughborough (Loddington et al,
                         2009)
                     •   Self assess and peer assess
                         with given criteria
                     •   Group mark awarded by tutor
                     •   Students rated:
                          •     More timely feedback
                          •     Reflection
                          •     Fair rewards for hard work
                     •   Staff rated:
                          •     Time savings
                          •     Administrative gains
                          •     Automatic calculation
                          •     Students have faith in the
                                administrative system
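
WebPA turns the tutor's single group mark into individual marks by weighting it with each member's peer-assessment scores. The sketch below illustrates that idea with a simple scaling rule (each student's share of the peer scores relative to an equal split); it is an illustration of the approach, not WebPA's exact published algorithm.

```python
# Simplified illustration of peer-moderated group marking in the spirit
# of WebPA: the tutor awards one group mark, and each member's mark is
# that group mark scaled by how their peer score compares with an equal
# share. A sketch of the idea, not WebPA's actual formula.

def individual_marks(group_mark: float, peer_scores: dict[str, float]) -> dict[str, float]:
    total = sum(peer_scores.values())
    equal_share = total / len(peer_scores)
    return {
        name: round(group_mark * (score / equal_share), 1)
        for name, score in peer_scores.items()
    }

# Example: the tutor awards 60; Ana is rated more highly by her peers
# than Ben, so her individual mark rises above the group mark.
print(individual_marks(60, {"Ana": 22, "Ben": 16, "Cal": 19}))
# {'Ana': 69.5, 'Ben': 50.5, 'Cal': 60.0}
```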
Mobile Technologies and Assessment

•   MCQs, PDAs: Valdivia &
    Nussbaum (2009)
•   Polls, instant surveys:
    Simpson & Oliver
    (2007)
•   Draper (2009) EVS
Gains from Formative Assessment

 Mean effect size on standardised tests between 0.4 and
  0.7 (Black & Wiliam, 1998)
 Particularly effective for students who have not done
  well at school
  http://kn.open.ac.uk/document.cfm?docid=10817
 Can keep students to timescale and motivate them
 How can we support our students to become more
  reflective learners and engage in formative assessment
  tasks?
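
The 0.4 to 0.7 figure is a standardised mean difference (an effect size such as Cohen's d). As a reminder of what that statistic measures, here is a minimal worked sketch; the scores are made up for illustration and are not data from Black & Wiliam.

```python
# Minimal sketch of a standardised effect size (Cohen's d): the
# difference in group means divided by the pooled standard deviation.
# The scores below are invented purely to illustrate the calculation.
from statistics import mean, stdev

def cohens_d(treated: list[float], control: list[float]) -> float:
    n1, n2 = len(treated), len(control)
    s1, s2 = stdev(treated), stdev(control)
    pooled_sd = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (mean(treated) - mean(control)) / pooled_sd

with_formative = [62, 70, 58, 75, 66, 71]
without_formative = [55, 63, 52, 68, 60, 64]
print(round(cohens_d(with_formative, without_formative), 2))  # about 1.09
```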
Formative assessment
in Level 3 Physics and Astronomy


SM358       The Quantum World          2007, 2009, 2010

SMT359 Electromagnetism                2006, 2008, 2010

S382       Astrophysics                2010

S383       The Relativistic Universe   2010

 Black = Summative


 Red = Formative
Formative assessment strategy

SM358 and SMT359: Quantum and Electromagnetism


   iCMAs: iCMA Maths, iCMA 51-56, and 3 Revision Medleys

   TMAs: TMA 01-04

   OCAS Threshold: 30% or more on 7 or more assignments,
   including at least 2 TMAs.

     Course Grade determined by exam score alone.
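
The threshold and grading rules above, and their S382/S383 variants on the next two slides, are easy to state as one small parameterised check. The sketch below uses hypothetical helper names and is not OU software.

```python
# Hypothetical sketch of the OCAS threshold rule described above, not OU
# code. The parameters cover the variants on the following slides:
# SM358/SMT359 need 30%+ on 7+ assignments including 2+ TMAs, with the
# grade set by the exam alone; S382/S383 weight in a Project or EC1.

def meets_ocas_threshold(scores: dict[str, float],
                         min_assignments: int = 7,
                         min_tmas: int = 2,
                         pass_mark: float = 30.0) -> bool:
    passed = [name for name, score in scores.items() if score >= pass_mark]
    return (len(passed) >= min_assignments and
            sum(name.startswith("TMA") for name in passed) >= min_tmas)

def course_grade(exam: float, project: float | None = None,
                 exam_weight: float = 1.0) -> float:
    """Exam-only by default; use exam_weight=2/3 for S382/S383."""
    if project is None:
        return exam
    return exam_weight * exam + (1 - exam_weight) * project

scores = {"iCMA51": 80, "iCMA52": 65, "iCMA53": 55, "iCMA54": 70,
          "iCMA55": 40, "TMA01": 62, "TMA02": 58}
print(meets_ocas_threshold(scores))                    # True: 7 passes incl. 2 TMAs
print(course_grade(68, project=74, exam_weight=2/3))   # 70.0, S382-style weighting
```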
Formative assessment strategy

S382: Astrophysics

    iCMAs: iCMA 51-55

    TMAs: TMA 01-05, plus a Project

    OCAS Threshold: 30% or more on 8 or more assignments.

   TMAs and iCMAs cover complementary material.

    Course Grade determined by exam (2/3) + Project (1/3)
Formative assessment strategy

S383 Relativistic Universe

    iCMAs: iCMA 51-56

    TMAs: TMA 01-06, plus EC1

    OCAS Threshold: 30% or more on 9 or more assignments.

   TMAs and iCMAs cover complementary material.

      Course Grade determined by exam (2/3) + EC1 (1/3).
Attempts at TMAs (Physics)

[Charts of TMA attempt numbers for SM358 (2009 and 2010) and
SMT359 (2008 and 2010); red = 2010, blue = 2009]
Attempts at TMAs (Astronomy)

[Charts of TMA attempt numbers for S381 2008, S382 2010, S357 2009
and S383 2010]
Average marks for TMAs (Physics)

[Charts of average TMA marks for SM358 and SMT359;
blue = summative, red = formative]
Average marks for TMAs (Astronomy)

[Charts of average TMA marks for S381/S382 and S357/S383;
blue = summative, red = formative]
iCMA marks (SM358)

[Chart of iCMA marks for SM358; red = 2010, blue = 2009]
Time of first attempts at iCMAs

[Cumulative total of first attempts over time, shown against the
recommended completion dates and the 2009 and 2010 exam dates;
red = 2010, blue = 2009]
The bottom line

            SM358    SM358    SMT359   SMT359
            2009     2010     2009     2010
Retention   70.7 %   70.5 %   62.7 %   60.8 %

Grade 1     22.0 %   22.7 %   12.9 %    7.3 %

Grade 2     15.2 %   13.5 %    9.8 %   10.5 %

Grade 3     12.3 %   17.9 %   15.1 %   17.1 %

Grade 4     11.5 %    6.3 %   12.3 %   13.3 %

Grade Q/5    9.7 %   10.1 %   12.6 %   12.6 %

Pass rate   61.0 %   60.4 %   50.1 %   48.2 %
The bottom line

            S381     S382     S357     S383
            2008     2010     2009     2010
Retention   59.2 %   57.5 %   60.1 %   56.4 %

Grade 1     10.1 %   12.1 %   12.2 %    9.9 %

Grade 2     24.7 %   18.9 %    6.1 %   13.7 %

Grade 3     13.5 %   15.2 %   17.0 %   20.6 %

Grade 4      4.7 %    6.8 %   17.4 %    6.1 %

Grade Q/5    6.2 %    4.5 %    7.4 %    6.1 %

Pass rate   53.0 %   53.0 %   52.7 %   50.3 %
Conclusions


     • Formative assessment makes much less
     difference than we might have imagined.

     • In the first year of running this strategy, the results are
     comparable with (or marginally worse than) those for
     summative assessment

     • BUT we are still learning how to optimise the
     strategy, and there is the potential for future
     improvements.
Physicists’ conclusions

                             Formative assessment makes
                              much less difference than we
                              might have imagined
                             In the first year of running
                              this strategy, the results are
                              comparable with (or
                              marginally worse than) those
                              for summative assessment
                             BUT we are still learning how
                              to optimise the strategy, and
                              there is the potential for
                              future improvements.
MU 123: Discovering Mathematics Tim Lowe



 30 point introductory
  module
 Launched February
  2010
 2 presentations a year
 1800 students per
  presentation



                      Academic: Tim Lowe
2 Types of Computer-Assisted Assessment
Quizzes

Practice Quizzes          iCMAs
 Formative                Summative
 Immediate feedback       Delayed feedback
 3 tries per question     Single attempt
 Worked solutions         6 questions per unit
 Can repeat               12% of grade
 12 questions per unit
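
The two quiz regimes differ only in a handful of settings, which can be summarised as configuration. The field names below are invented for illustration and are not the actual Moodle or OU iCMA settings.

```python
# Illustrative configuration contrasting the two quiz types described
# above. Field names are hypothetical, chosen for readability; they are
# not the real Moodle/iCMA setting names.

PRACTICE_QUIZ = {
    "purpose": "formative",
    "feedback": "immediate",
    "attempts_per_question": 3,
    "worked_solutions": True,
    "repeatable": True,
    "questions_per_unit": 12,
    "grade_weight_percent": 0,
}

ICMA = {
    "purpose": "summative",
    "feedback": "delayed",
    "attempts_per_question": 1,
    "repeatable": False,
    "questions_per_unit": 6,
    "grade_weight_percent": 12,
}
```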
Grades
Practice Quiz repeats
End of course survey 2010



 The computer-based
  individual learning
  activities for this course
  supported my learning


 92.2% agree, 0.5%
  disagree


 Increased retention
Collaborative formative assessment with
             Global Warming




          DMW, Institute of Educational Technology, September 1997
Global Warming
Next: ‘Yoked’ apps via BuddySpace




                                      Student A




                                       Student B
                                      (‘yoked’, but full
                                       screen sharing
                                       not required!)
Global Warming: Simlink Presentation
Research issues from CAA: Marking Free
text

•   e-Rater: Attali & Burstein (2006)
•   LSA: Landauer, Laham & Foltz (2003)
•   SEAR: Christie (1999)
•   Knowledge engineering: Pulman & Sukkarieh (2005)
•   IAT: Jordan and Mitchell (2009)
•   Genre Analysis: Moreale & Vargas-Vera (2005)
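
Most of these systems score a free-text answer by comparing it with model answers, often in a reduced semantic space. The sketch below illustrates the general LSA-style approach (TF-IDF, truncated SVD, cosine similarity); it is an illustration of the technique, not the implementation of e-Rater, SEAR, IAT or any other system listed above.

```python
# Minimal sketch of LSA-style free-text scoring: project model answers
# and a student answer into a reduced semantic space, then use cosine
# similarity as a crude quality signal.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

model_answers = [
    "Rising greenhouse gas concentrations trap outgoing infrared radiation.",
    "Carbon dioxide and methane absorb infrared radiation, warming the atmosphere.",
]
student_answer = "Gases like CO2 stop heat radiation escaping, so the air warms."

vectoriser = TfidfVectorizer(stop_words="english")
tfidf = vectoriser.fit_transform(model_answers + [student_answer])

# Reduce to a small number of latent dimensions (tiny here because the
# illustrative corpus is tiny).
lsa = TruncatedSVD(n_components=2, random_state=0)
vectors = lsa.fit_transform(tfidf)

# Similarity between the student answer and the closest model answer.
similarity = cosine_similarity(vectors[-1:], vectors[:-1]).max()
print(f"similarity to best model answer: {similarity:.2f}")
```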
Open Comment addresses the problem of
free text entry


•   Automated formative assessment tool
•   Free text entry for students
•   Automated feedback and guidance
•   Open questions, divergent assessment
•   No marks awarded
•   For use by Arts Faculty
Three Causal Question Types proved to be
suitable for Open Comment




1. Analysis of statistics, usually presented to the student
   as a table
2. Comprehension of a set text
3. Identifying similarities and differences for a given event
Open Comment components



•   A Java-based feedback system
•   A web service shell
•   A graphical interface for testing
•   A Moodle-based question type
•   A forms-based editing tool
Stages of the computer analysis of students’ free-text
entries in Open Comment: advice with respect to
content (stylised example with socio-emotional support)

•   STAGE 1a: DETECT ERRORS, e.g. incorrect dates or
    facts. (Incorrect inferences and causality are dealt with
    below.)
•   “Instead of concentrating on X, think about Y in order to
    answer this question.” Recognise effort (Dweck) and
    encourage the student to have another go.
•   “You have done well to start answering this question, but
    perhaps you misunderstood it. Instead of thinking about
    X, which did not…….., consider Y.”
Computer analysis continued

•   STAGE 2a: REVEAL FIRST OMISSION
•   “Consider the role of Z in your answer.” Praise what is
    correct and point out what is missing: “Good, but now
    consider the role X plays in your answer.”
•   STAGE 2b: REVEAL SECOND OMISSION
•   “Consider the role of P in your answer.” Praise what is
    correct and point out what is missing: “Yes, but also
    consider P. Would it have produced the same result if P
    is neglected?”



Final stages of analysis

•   STAGE 3: REQUEST CLARIFICATION OF KEY POINT 1
•   STAGE 4: REQUEST FURTHER ANALYSIS OF KEY POINT 1
    (Stages 3 and 4 are repeated with all the key points)
•   STAGE 5: REQUEST THE INFERENCE FROM THE ANALYSIS
    OF KEY POINT 1 IF IT IS MISSING
•   STAGE 6: REQUEST THE INFERENCE FROM THE ANALYSIS
    OF KEY POINT 1 IF IT IS NOT COMPLETE
•   STAGE 7: CHECK THE CAUSALITY
•   STAGE 8: REQUEST THAT ALL THE CAUSAL FACTORS ARE
    WEIGHTED
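
Taken together, Stages 1 to 8 describe a rule-driven pipeline: detect errors first, reveal omissions one at a time, then probe analysis, inference and causality. The sketch below is a hypothetical simplification showing only that control flow; it is not the actual Java-based Open Comment implementation described later.

```python
# Hypothetical simplification of the Open Comment staging above, showing
# only the control flow: flag an error, otherwise reveal one missing key
# point, otherwise push for deeper analysis. Not the real system.

def staged_feedback(answer: str, errors: dict[str, str],
                    key_points: dict[str, str]) -> str:
    text = answer.lower()

    # Stage 1: flag the first detected error and redirect the student.
    for wrong, hint in errors.items():
        if wrong in text:
            return (f"You have made a good start, but instead of "
                    f"concentrating on '{wrong}', {hint}")

    # Stage 2: reveal the first missing key point, praising what is there.
    for point, prompt in key_points.items():
        if point not in text:
            return f"Good. Now consider the role of {prompt} in your answer."

    # Stages 3 onwards: all key points present, so probe inference and causality.
    return "Well done. Now explain how these factors are causally linked."

errors = {"1953": "check the date of the event you are describing."}
key_points = {"famine": "the famine", "migration": "migration"}
print(staged_feedback("The famine of 1953 caused hardship.", errors, key_points))
```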
Where are we now?

•   Opening up with Open Source
•   Moving towards vision and not losing sight of it through
    tool adaptation
•   More work to do for Arts
•   Open Comment - pedagogical model open to test
•   Feedback
     •   Changing pedagogy
     •   Another handle on misconceptions
Open Comment drivers for reflection


•    Students are able to find facts similar to X

•    Know how X might be disputed

•    Are able to make predictions about X

•    Know how to use X in an argument

•    Know how far X can be pushed

•    Supported with tools and strategies
LISC: Aily Fowler

•   Kent University ab-initio Spanish module
     •   Large student numbers
     •   Skills-based course
     •   Provision of sufficient formative assessment
         meant unmanageable marking loads
     •   Impossible to provide immediate feedback
          • leading to fossilisation of errors




The LISC solution: developed by Ali Fowler
•   A CALL system designed to enable students
    to:
    •   Independently
        practise sentence
        translation
    •   Receive
        immediate (and
        robust) feedback
        on all errors
    •   Attend
        immediately to the
        feedback (before
        fossilisation can
        occur)
How is the final mark arrived at in the
    LISC System?
•    The two submissions are unequally weighted
      •    Best to give more weight to the first
           attempt
            •    since this ensures that students
                 give careful consideration to the
                 construction of their first answer
            •    but can improve their mark by
                 refining the answer
      •    The marks ratio can vary (depending
           on assessment/feedback type)
            •    the more information given in the
                 feedback, the lower the weight
                 the second mark should carry
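
A minimal sketch of the weighting rule described above, with the marks ratio exposed as a parameter (a hypothetical helper for illustration, not the LISC code itself):

```python
# Hypothetical sketch of the LISC two-attempt weighting: the first
# attempt carries more weight, and the ratio can be tuned to the amount
# of feedback given between attempts. Not the actual LISC system.

def final_mark(first_attempt: float, second_attempt: float,
               first_weight: float = 0.7) -> float:
    """Weighted combination of two submissions, first attempt weighted higher."""
    return first_weight * first_attempt + (1 - first_weight) * second_attempt

# Example: a student scores 60 first time and 90 after acting on feedback.
print(final_mark(60, 90))                    # 69.0 with a 70:30 ratio
print(final_mark(60, 90, first_weight=0.8))  # 66.0 when richer feedback is given
```

Raising or lowering first_weight reproduces the heuristics on the next slide: skew it too far either way and students change how much care they take over each attempt.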
Heuristics for the final mark

                        •   If the ratio is skewed too far
                            in favour of the first
                            attempt…
                             •     students are less
                                   inclined to try hard to
                                   correct non-perfect
                                   answers
                        •   If the ratio is skewed too far
                            in favour of the second
                            attempt…
                             •     students exhibit less
                                   care over the
                                   construction of their
                                   initial answer
Feedback: Advice for Action

 • Students must decode
  feedback and then act on it
  Boud (2000)


 • Students must have the
  opportunity to act on
  feedback Sadler (1989)


 • Gauging efficacy through
  student action
Characteristics                               Descriptor
Authentic                   Involving real-world knowledge and skills
Personalised                Tailored to the knowledge, skills and interests of each
                            student
Negotiated                  Agreed between the learner and the teacher
Engaging                    Involving the personal interests of the students
Recognise existing skills   Willing to accredit the student’s existing work
Deep                        Assessing deep knowledge – not memorization
Problem oriented            Original tasks requiring genuine problem solving skills
Collaboratively produced Produced in partnership with fellow students
Peer and self assessed      Involving self reflection and peer review
Tool supported              Encouraging the use of ICT


         Advice for Action
                 Elliott’s characteristics of Assessment 2.0 activities
Embedding Assessment into Pedagogical
Frameworks

•   Luca & Oliver (2002)


•   Boud & Falchikov
    (2007)


•   Bartlett-Bragg
    (2008)
Cycles in e-Assessment
The 4Ts Pyramid


                  Transfer
                  Learning

              Transformation
                  Tasks

              Training of staff

             Tool Development
e-Assessment: effectiveness vs.
innovation

[Scatter plot of effectiveness (vertical axis, 0-35) against innovation
(horizontal axis, 4-16) for the case-study institutions: Dundee, Derby,
OU, Ulster, East Antrim, Plymouth, Birkbeck, Glamorgan, Loughborough,
Southampton, West Suffolk, Heriot Watt, Cambridge Assessment, Belfast,
Coleg Sir Gar, Warwickshire College, UCL, Surrey, Calibrand and COLA]
e-Assessment: Maturity vs. Availability

[Scatter diagram illustrating the relationship between maturity
(vertical axis, 14-32) and availability (horizontal axis, 12-24) for
the same case-study institutions]
Training

 Question
  development
 Quality REAQ
 Plagiarism software
 Advice for Action
 Socio-emotive
  content
 Maintain empathy
  with the Learner
e-Assessment Futures
• Research issues
• Adaptive testing
• Automatic
  marking
• Learning Analytics
  Data mining
• Web 2.0, 3.0 ...
Rearranging

              •   Promoting
                  champions
              •   Multidisciplinary
                  teams
              •   Open source
              •   Pedagogical
                  frameworks
National Union of Students’ Principles of Effective
Assessment (Times Higher Education, 29 January 2009)

•   Should be for learning, not simply of learning
•   Should be reliable, valid, fair and consistent
•   Should consist of effective and constructive feedback
•   Should be innovative and have the capacity to inspire
    and motivate.
•   Should be conducted throughout the course, rather
    than being positioned as a final event
•   Should develop key skills such as peer and reflective
    assessment
References
 Whitelock, D. (2010) Activating Assessment for
  Learning: are we on the way with Web 2.0? In M.J.W.
  Lee & C. McLoughlin (Eds.) Web 2.0-Based E-
  Learning: Applying Social Informatics for Tertiary
  Teaching. IGI Global. pp. 319–342.
 Whitelock, D. (2008) ‘Accelerating the assessment
  agenda: thinking outside the black box’. In
  Scheuermann, F. & Pereira, A.G. (Eds.) Towards a
  Research Agenda on Computer-Based
  Assessment: Challenges and Needs for European
  Educational Measurement. European Commission,
  Joint Research Centre. pp. 15–21. ISSN 1018-5593.
Three Assessment Special Issues
 Brna, P. & Whitelock, D. (Eds.) (2010) Special Issue of
  International Journal of Continuing Engineering
  Education and Life-long Learning, Focussing on
  electronic Feedback: Feasible progress or just
  unfulfilled promises? Volume 2, No. 2
 Whitelock, D. (Ed.) (2009) Special Issue on e-
  Assessment: Developing new dialogues for the digital
  age. British Journal of Educational Technology. Volume
  40, No. 2
 Whitelock, D. and Watt, S. (Eds.) (2008). Reframing e-
  assessment: adopting new media and adapting old
  frameworks. Learning, Media and Technology, Vol. 33,
  No. 3

Weitere ähnliche Inhalte

Was ist angesagt?

Assessment Strategies of Online Pedagogy
Assessment Strategies of Online PedagogyAssessment Strategies of Online Pedagogy
Assessment Strategies of Online PedagogyVijayalakshmi Murugesan
 
2012 Computer-Assisted Assessment
2012 Computer-Assisted Assessment2012 Computer-Assisted Assessment
2012 Computer-Assisted AssessmentTim Hunt
 
Technology Integration in the Classroom - A case study in learning engagement...
Technology Integration in the Classroom - A case study in learning engagement...Technology Integration in the Classroom - A case study in learning engagement...
Technology Integration in the Classroom - A case study in learning engagement...William Welder
 
Strategies for keeping the eLearner engaged
Strategies for keeping the eLearner engagedStrategies for keeping the eLearner engaged
Strategies for keeping the eLearner engagedYum Studio
 
E learning-basic guidelines to develop multimedia learning
E learning-basic guidelines to develop multimedia learningE learning-basic guidelines to develop multimedia learning
E learning-basic guidelines to develop multimedia learningDimas Prasetyo
 
Computer Based Training Methods
Computer Based Training MethodsComputer Based Training Methods
Computer Based Training MethodsDejiaofOrlando
 
Online Coursework: Making It Count - Lore Carvajal
Online Coursework: Making It Count - Lore CarvajalOnline Coursework: Making It Count - Lore Carvajal
Online Coursework: Making It Count - Lore Carvajalmhhighered
 
Computer assisted instruction
Computer assisted instructionComputer assisted instruction
Computer assisted instructionVinci Viveka
 
Computer based online written test system "Tao Software"
Computer based online written test system "Tao Software"Computer based online written test system "Tao Software"
Computer based online written test system "Tao Software"Awais Chaudhary
 
E Assessment Presentation Ver2 June 2008
E Assessment Presentation Ver2 June 2008E Assessment Presentation Ver2 June 2008
E Assessment Presentation Ver2 June 2008Jo Richler
 
Introduction to e-submission July 2016
Introduction to e-submission July 2016Introduction to e-submission July 2016
Introduction to e-submission July 2016Bunyan Nick
 
Introduction to e-submission at the University of Liverpool
Introduction to e-submission at the University of LiverpoolIntroduction to e-submission at the University of Liverpool
Introduction to e-submission at the University of LiverpoolAlex Spiers
 
E-learning presentation
E-learning presentationE-learning presentation
E-learning presentationSeshi Malik
 

Was ist angesagt? (20)

E assessment
E assessmentE assessment
E assessment
 
Assessment Strategies of Online Pedagogy
Assessment Strategies of Online PedagogyAssessment Strategies of Online Pedagogy
Assessment Strategies of Online Pedagogy
 
Adaptive Learning
Adaptive   LearningAdaptive   Learning
Adaptive Learning
 
2012 Computer-Assisted Assessment
2012 Computer-Assisted Assessment2012 Computer-Assisted Assessment
2012 Computer-Assisted Assessment
 
Technology Integration in the Classroom - A case study in learning engagement...
Technology Integration in the Classroom - A case study in learning engagement...Technology Integration in the Classroom - A case study in learning engagement...
Technology Integration in the Classroom - A case study in learning engagement...
 
Strategies for keeping the eLearner engaged
Strategies for keeping the eLearner engagedStrategies for keeping the eLearner engaged
Strategies for keeping the eLearner engaged
 
The Live Online Class
The Live Online ClassThe Live Online Class
The Live Online Class
 
MOOCs and pedagogy: where are we heading?
MOOCs and pedagogy: where are we heading?MOOCs and pedagogy: where are we heading?
MOOCs and pedagogy: where are we heading?
 
Online Evaluation/Assessment by Mudasir Amin (Durpora shopian)
Online Evaluation/Assessment by Mudasir Amin (Durpora shopian)Online Evaluation/Assessment by Mudasir Amin (Durpora shopian)
Online Evaluation/Assessment by Mudasir Amin (Durpora shopian)
 
E learning-basic guidelines to develop multimedia learning
E learning-basic guidelines to develop multimedia learningE learning-basic guidelines to develop multimedia learning
E learning-basic guidelines to develop multimedia learning
 
EDTEC 544
EDTEC 544 EDTEC 544
EDTEC 544
 
Computer Based Training Methods
Computer Based Training MethodsComputer Based Training Methods
Computer Based Training Methods
 
Online Coursework: Making It Count - Lore Carvajal
Online Coursework: Making It Count - Lore CarvajalOnline Coursework: Making It Count - Lore Carvajal
Online Coursework: Making It Count - Lore Carvajal
 
Computer assisted instruction
Computer assisted instructionComputer assisted instruction
Computer assisted instruction
 
Computer based online written test system "Tao Software"
Computer based online written test system "Tao Software"Computer based online written test system "Tao Software"
Computer based online written test system "Tao Software"
 
E Assessment Presentation Ver2 June 2008
E Assessment Presentation Ver2 June 2008E Assessment Presentation Ver2 June 2008
E Assessment Presentation Ver2 June 2008
 
Introduction to e-submission July 2016
Introduction to e-submission July 2016Introduction to e-submission July 2016
Introduction to e-submission July 2016
 
Introduction to e-submission at the University of Liverpool
Introduction to e-submission at the University of LiverpoolIntroduction to e-submission at the University of Liverpool
Introduction to e-submission at the University of Liverpool
 
E-learning presentation
E-learning presentationE-learning presentation
E-learning presentation
 
E Learning Addon In Pedagogy
E Learning Addon In PedagogyE Learning Addon In Pedagogy
E Learning Addon In Pedagogy
 

Andere mochten auch

Electronic Management of Assessment
Electronic Management of AssessmentElectronic Management of Assessment
Electronic Management of AssessmentJisc
 
Module 5 module 3 draft electrical and electronic layout and details
Module 5   module 3 draft electrical and electronic layout and detailsModule 5   module 3 draft electrical and electronic layout and details
Module 5 module 3 draft electrical and electronic layout and detailsGilbert Bautista
 
Introduction to e-Assessment
Introduction to e-AssessmentIntroduction to e-Assessment
Introduction to e-AssessmentJisc Scotland
 
PM Humor - At Some Point You’re Going To Have To Break Down ...
PM Humor - At Some Point You’re Going To Have To  Break Down ...PM Humor - At Some Point You’re Going To Have To  Break Down ...
PM Humor - At Some Point You’re Going To Have To Break Down ...OSP International LLC
 
EMR Implementation Considerations Slides
EMR Implementation Considerations SlidesEMR Implementation Considerations Slides
EMR Implementation Considerations SlidesSaide OER Africa
 
Importance of early project requirements definition
Importance of early project requirements definitionImportance of early project requirements definition
Importance of early project requirements definitionMaveric Systems
 
Patterns For Effective Use Cases
Patterns For Effective Use CasesPatterns For Effective Use Cases
Patterns For Effective Use CasesMayflower GmbH
 
e-Learning policy by Mohamed Amin Embi
e-Learning policy by Mohamed Amin Embie-Learning policy by Mohamed Amin Embi
e-Learning policy by Mohamed Amin EmbiMohamed Amin Embi
 
e-Learning Management System : a Critical Study
e-Learning Management System : a Critical Studye-Learning Management System : a Critical Study
e-Learning Management System : a Critical StudyKaustav Saha
 
Online Assessment Presentation
Online Assessment PresentationOnline Assessment Presentation
Online Assessment Presentationmargaretntrent
 
African Health OER Network Overview - 2 pages
African Health OER Network Overview - 2 pagesAfrican Health OER Network Overview - 2 pages
African Health OER Network Overview - 2 pagesKathleen Ludewig Omollo
 
Desirable software features simulation & modeling
Desirable software features simulation & modelingDesirable software features simulation & modeling
Desirable software features simulation & modelingShashwat Shriparv
 
PowerStory - a better way to define requirements and test cases
PowerStory - a better way to define requirements and test casesPowerStory - a better way to define requirements and test cases
PowerStory - a better way to define requirements and test casesPowerStory
 
Needs Assessment
Needs AssessmentNeeds Assessment
Needs AssessmentLynda Milne
 
6 Steps to an Effective Needs Assessment
6 Steps to an Effective Needs Assessment6 Steps to an Effective Needs Assessment
6 Steps to an Effective Needs AssessmentErin Lett
 
Needs Assessment Powerpoint 2007
Needs Assessment Powerpoint 2007Needs Assessment Powerpoint 2007
Needs Assessment Powerpoint 2007Johan Koren
 
From Use case to User Story
From Use case to User StoryFrom Use case to User Story
From Use case to User StoryKunta Hutabarat
 

Andere mochten auch (20)

Electronic Management of Assessment
Electronic Management of AssessmentElectronic Management of Assessment
Electronic Management of Assessment
 
Module 5 module 3 draft electrical and electronic layout and details
Module 5   module 3 draft electrical and electronic layout and detailsModule 5   module 3 draft electrical and electronic layout and details
Module 5 module 3 draft electrical and electronic layout and details
 
Introduction to e-Assessment
Introduction to e-AssessmentIntroduction to e-Assessment
Introduction to e-Assessment
 
PM Humor - At Some Point You’re Going To Have To Break Down ...
PM Humor - At Some Point You’re Going To Have To  Break Down ...PM Humor - At Some Point You’re Going To Have To  Break Down ...
PM Humor - At Some Point You’re Going To Have To Break Down ...
 
EMR Implementation Considerations Slides
EMR Implementation Considerations SlidesEMR Implementation Considerations Slides
EMR Implementation Considerations Slides
 
Creating Effective Use-cases
Creating Effective Use-casesCreating Effective Use-cases
Creating Effective Use-cases
 
Importance of early project requirements definition
Importance of early project requirements definitionImportance of early project requirements definition
Importance of early project requirements definition
 
Patterns For Effective Use Cases
Patterns For Effective Use CasesPatterns For Effective Use Cases
Patterns For Effective Use Cases
 
e-Learning policy by Mohamed Amin Embi
e-Learning policy by Mohamed Amin Embie-Learning policy by Mohamed Amin Embi
e-Learning policy by Mohamed Amin Embi
 
e-Learning Management System : a Critical Study
e-Learning Management System : a Critical Studye-Learning Management System : a Critical Study
e-Learning Management System : a Critical Study
 
Online Assessment Presentation
Online Assessment PresentationOnline Assessment Presentation
Online Assessment Presentation
 
African Health OER Network Overview - 2 pages
African Health OER Network Overview - 2 pagesAfrican Health OER Network Overview - 2 pages
African Health OER Network Overview - 2 pages
 
Desirable software features simulation & modeling
Desirable software features simulation & modelingDesirable software features simulation & modeling
Desirable software features simulation & modeling
 
PowerStory - a better way to define requirements and test cases
PowerStory - a better way to define requirements and test casesPowerStory - a better way to define requirements and test cases
PowerStory - a better way to define requirements and test cases
 
Needs Assessment
Needs AssessmentNeeds Assessment
Needs Assessment
 
Usecase
UsecaseUsecase
Usecase
 
How to Conduct a Needs Assessment
How to Conduct a Needs AssessmentHow to Conduct a Needs Assessment
How to Conduct a Needs Assessment
 
6 Steps to an Effective Needs Assessment
6 Steps to an Effective Needs Assessment6 Steps to an Effective Needs Assessment
6 Steps to an Effective Needs Assessment
 
Needs Assessment Powerpoint 2007
Needs Assessment Powerpoint 2007Needs Assessment Powerpoint 2007
Needs Assessment Powerpoint 2007
 
From Use case to User Story
From Use case to User StoryFrom Use case to User Story
From Use case to User Story
 

Ähnlich wie Balancing assessment challenges and opportunities with e-learning tools

The Moodle quiz at the Open University
The Moodle quiz at the Open UniversityThe Moodle quiz at the Open University
The Moodle quiz at the Open UniversityTim Hunt
 
The Moodle Quiz at the Open University: how we use it & how that helps students
The Moodle Quiz at the Open University: how we use it & how that helps studentsThe Moodle Quiz at the Open University: how we use it & how that helps students
The Moodle Quiz at the Open University: how we use it & how that helps studentsTim Hunt
 
IDEE Workshop: Applying the 4C-ID Model to the Design of a Digital Educationa...
IDEE Workshop: Applying the 4C-ID Model to the Design of a Digital Educationa...IDEE Workshop: Applying the 4C-ID Model to the Design of a Digital Educationa...
IDEE Workshop: Applying the 4C-ID Model to the Design of a Digital Educationa...Guilhermina Miranda
 
TESTA Interactive Masterclass
TESTA Interactive MasterclassTESTA Interactive Masterclass
TESTA Interactive MasterclassTansy Jessop
 
Teaching learning based optimization technique
Teaching   learning based optimization techniqueTeaching   learning based optimization technique
Teaching learning based optimization techniqueSmriti Mehta
 
The why and what of testa
The why and what of testaThe why and what of testa
The why and what of testaTansy Jessop
 
Replicable Evaluation of Recommender Systems
Replicable Evaluation of Recommender SystemsReplicable Evaluation of Recommender Systems
Replicable Evaluation of Recommender SystemsAlejandro Bellogin
 
Math 205 syllabus Fall 2012
Math 205 syllabus Fall 2012Math 205 syllabus Fall 2012
Math 205 syllabus Fall 2012Jeneva Clark
 
Classroom 2020 presentation
Classroom 2020 presentationClassroom 2020 presentation
Classroom 2020 presentationYouTestMe
 
CAA 2012 Closing Address
CAA 2012 Closing Address CAA 2012 Closing Address
CAA 2012 Closing Address jisc-elearning
 
Teaching Six Sigma Using Six Sigma
Teaching Six Sigma Using Six SigmaTeaching Six Sigma Using Six Sigma
Teaching Six Sigma Using Six SigmaBrandon Theiss, PE
 
You testme classroom 2020 en
You testme classroom 2020 enYou testme classroom 2020 en
You testme classroom 2020 enDanilo Sretenovic
 
YouTestMe Classroom 2020 Presentation
YouTestMe Classroom 2020 PresentationYouTestMe Classroom 2020 Presentation
YouTestMe Classroom 2020 PresentationFilip Dimitrijević
 
Obc 2011
Obc 2011Obc 2011
Obc 2011obepsp
 
Assignments .30%
Assignments .30%Assignments .30%
Assignments .30%butest
 
Aqt instructor-notes-final
Aqt instructor-notes-finalAqt instructor-notes-final
Aqt instructor-notes-finalwaheedjan
 
Improving student learning through taking a programme approach
Improving student learning through taking a programme approachImproving student learning through taking a programme approach
Improving student learning through taking a programme approachTansy Jessop
 

Ähnlich wie Balancing assessment challenges and opportunities with e-learning tools (20)

The Moodle quiz at the Open University
The Moodle quiz at the Open UniversityThe Moodle quiz at the Open University
The Moodle quiz at the Open University
 
The Moodle Quiz at the Open University: how we use it & how that helps students
The Moodle Quiz at the Open University: how we use it & how that helps studentsThe Moodle Quiz at the Open University: how we use it & how that helps students
The Moodle Quiz at the Open University: how we use it & how that helps students
 
IDEE Workshop: Applying the 4C-ID Model to the Design of a Digital Educationa...
IDEE Workshop: Applying the 4C-ID Model to the Design of a Digital Educationa...IDEE Workshop: Applying the 4C-ID Model to the Design of a Digital Educationa...
IDEE Workshop: Applying the 4C-ID Model to the Design of a Digital Educationa...
 
TESTA Interactive Masterclass
TESTA Interactive MasterclassTESTA Interactive Masterclass
TESTA Interactive Masterclass
 
Teaching learning based optimization technique
Teaching   learning based optimization techniqueTeaching   learning based optimization technique
Teaching learning based optimization technique
 
Cambridge presentation for gr 6 parents 19 july 2012
Cambridge presentation for gr 6 parents   19 july 2012Cambridge presentation for gr 6 parents   19 july 2012
Cambridge presentation for gr 6 parents 19 july 2012
 
The why and what of testa
The why and what of testaThe why and what of testa
The why and what of testa
 
Replicable Evaluation of Recommender Systems
Replicable Evaluation of Recommender SystemsReplicable Evaluation of Recommender Systems
Replicable Evaluation of Recommender Systems
 
Math 205 syllabus Fall 2012
Math 205 syllabus Fall 2012Math 205 syllabus Fall 2012
Math 205 syllabus Fall 2012
 
Classroom 2020 presentation
Classroom 2020 presentationClassroom 2020 presentation
Classroom 2020 presentation
 
CAA 2012 Closing Address
CAA 2012 Closing Address CAA 2012 Closing Address
CAA 2012 Closing Address
 
Teaching Six Sigma Using Six Sigma
Teaching Six Sigma Using Six SigmaTeaching Six Sigma Using Six Sigma
Teaching Six Sigma Using Six Sigma
 
You testme classroom 2020 en
You testme classroom 2020 enYou testme classroom 2020 en
You testme classroom 2020 en
 
YouTestMe Classroom 2020 Presentation
YouTestMe Classroom 2020 PresentationYouTestMe Classroom 2020 Presentation
YouTestMe Classroom 2020 Presentation
 
Obc 2011
Obc 2011Obc 2011
Obc 2011
 
Assignments .30%
Assignments .30%Assignments .30%
Assignments .30%
 
Rutgers Green Belt
Rutgers Green BeltRutgers Green Belt
Rutgers Green Belt
 
Aqt instructor-notes-final
Aqt instructor-notes-finalAqt instructor-notes-final
Aqt instructor-notes-final
 
Designing assessment and assessment-criteria
Designing assessment and assessment-criteriaDesigning assessment and assessment-criteria
Designing assessment and assessment-criteria
 
Improving student learning through taking a programme approach
Improving student learning through taking a programme approachImproving student learning through taking a programme approach
Improving student learning through taking a programme approach
 

Mehr von Open Assessment Technologies

TAO DAYS - TAO as a platform for an Online Diagnotsic Assessment System
TAO DAYS - TAO as a platform for an Online Diagnotsic Assessment SystemTAO DAYS - TAO as a platform for an Online Diagnotsic Assessment System
TAO DAYS - TAO as a platform for an Online Diagnotsic Assessment SystemOpen Assessment Technologies
 
TAO DAYS - Integration of 3rd party components into TAO
TAO DAYS - Integration of 3rd party components into TAOTAO DAYS - Integration of 3rd party components into TAO
TAO DAYS - Integration of 3rd party components into TAOOpen Assessment Technologies
 
TAO DAYS - Challenges of Modern Computer Based Assessment
TAO DAYS - Challenges of Modern Computer Based AssessmentTAO DAYS - Challenges of Modern Computer Based Assessment
TAO DAYS - Challenges of Modern Computer Based AssessmentOpen Assessment Technologies
 
TAO DAYS - Development of an advanced item (IT Session)
TAO DAYS - Development of an advanced item (IT Session)TAO DAYS - Development of an advanced item (IT Session)
TAO DAYS - Development of an advanced item (IT Session)Open Assessment Technologies
 
TAO DAYS - Introduction to TAO 2.0 and its architecture (IT Session)
TAO DAYS - Introduction to TAO 2.0 and its architecture (IT Session)TAO DAYS - Introduction to TAO 2.0 and its architecture (IT Session)
TAO DAYS - Introduction to TAO 2.0 and its architecture (IT Session)Open Assessment Technologies
 

Mehr von Open Assessment Technologies (19)

TAO at ATP 2012 showcase
TAO at ATP 2012 showcaseTAO at ATP 2012 showcase
TAO at ATP 2012 showcase
 
TAO at ATP 2012 showcase
TAO at ATP 2012 showcaseTAO at ATP 2012 showcase
TAO at ATP 2012 showcase
 
TAO DAYS - TAO as a platform for an Online Diagnotsic Assessment System
TAO DAYS - TAO as a platform for an Online Diagnotsic Assessment SystemTAO DAYS - TAO as a platform for an Online Diagnotsic Assessment System
TAO DAYS - TAO as a platform for an Online Diagnotsic Assessment System
 
TAO DAYS - GeoGebra and TAO 2.0 by Yves Kreis
TAO DAYS - GeoGebra and TAO 2.0 by Yves KreisTAO DAYS - GeoGebra and TAO 2.0 by Yves Kreis
TAO DAYS - GeoGebra and TAO 2.0 by Yves Kreis
 
TAO DAYS - GeoGebra and TAO 2.0 by Raynald Jadoul
TAO DAYS - GeoGebra and TAO 2.0 by Raynald JadoulTAO DAYS - GeoGebra and TAO 2.0 by Raynald Jadoul
TAO DAYS - GeoGebra and TAO 2.0 by Raynald Jadoul
 
TAO DAYS - Integration of 3rd party components into TAO
TAO DAYS - Integration of 3rd party components into TAOTAO DAYS - Integration of 3rd party components into TAO
TAO DAYS - Integration of 3rd party components into TAO
 
TAO DAYS - Challenges of Modern Computer Based Assessment
TAO DAYS - Challenges of Modern Computer Based AssessmentTAO DAYS - Challenges of Modern Computer Based Assessment
TAO DAYS - Challenges of Modern Computer Based Assessment
 
TAO DAYS - ROADMAP
TAO DAYS - ROADMAPTAO DAYS - ROADMAP
TAO DAYS - ROADMAP
 
TAO DAYS - Free advanced items (User Session)
TAO DAYS - Free advanced items (User Session)TAO DAYS - Free advanced items (User Session)
TAO DAYS - Free advanced items (User Session)
 
TAO DAYS - Development of an advanced item (IT Session)
TAO DAYS - Development of an advanced item (IT Session)TAO DAYS - Development of an advanced item (IT Session)
TAO DAYS - Development of an advanced item (IT Session)
 
TAO DAYS - Result (User session)
TAO DAYS - Result (User session)TAO DAYS - Result (User session)
TAO DAYS - Result (User session)
 
TAO DAYS - HAWAI (User Session)
TAO DAYS - HAWAI (User Session)TAO DAYS - HAWAI (User Session)
TAO DAYS - HAWAI (User Session)
 
TAO DAYS - Process (User session)
TAO DAYS - Process (User session)TAO DAYS - Process (User session)
TAO DAYS - Process (User session)
 
TAO DAYS - Process (IT session)
TAO DAYS - Process (IT session)TAO DAYS - Process (IT session)
TAO DAYS - Process (IT session)
 
TAO DAYS - API (IT Session)
TAO DAYS - API (IT Session)TAO DAYS - API (IT Session)
TAO DAYS - API (IT Session)
 
TAO DAYS - Creation of QTI Items
TAO DAYS - Creation of QTI ItemsTAO DAYS - Creation of QTI Items
TAO DAYS - Creation of QTI Items
 
TAO DAYS - Introduction to TAO 2.0 and its architecture (IT Session)
TAO DAYS - Introduction to TAO 2.0 and its architecture (IT Session)TAO DAYS - Introduction to TAO 2.0 and its architecture (IT Session)
TAO DAYS - Introduction to TAO 2.0 and its architecture (IT Session)
 
TAO DAYS - Introduction to TAO 2.0 (User Session)
TAO DAYS - Introduction to TAO 2.0 (User Session)TAO DAYS - Introduction to TAO 2.0 (User Session)
TAO DAYS - Introduction to TAO 2.0 (User Session)
 
TAO-ECTEL
TAO-ECTELTAO-ECTEL
TAO-ECTEL
 

Balancing assessment challenges and opportunities with e-learning tools

  • 1. E-Assessment for Learning: balancing challenges and opportunities Denise Whitelock The Open University, Walton Hall, Milton Keynes MK7 6AA d.m.whitelock@open.ac.uk
  • 2.
  • 4. I hate marking but want the tasks and feedback to assist student learning
  • 5. The e-Assessment Challenge • Constructivist Learning – Push • Institutional reliability and accountability – Pull .
  • 6.
  • 7. MCQs: Variation on a theme Example of LAPT Certainty-Based Marking, UK cabinet ministers demo Drug Chart Errors and Omissions, exercise showing feedback, Medicines Administration Assessment, University College, Tony Gardner- Chesterfield Royal Hospital Medwin
  • 8. MCQs: High Stakes Assessment Example of practice Thinking Skills Assessment" (TSA) question, Example of practice Thinking Skills Admissions Interview, Assessment" (TSA) feedback, Cambridge Assessment, Steve Lay Admissions Interview, Cambridge Assessment, Steve Lay
  • 9. Scaffolding and High Stakes assessment • Math for Science • Tutor less course • Competency led • No point to cheat • Web home exam • Invigilation technologies
  • 10. Self diagnosis • Basic IT skills, first year med students (Sieber, 2009) • Competency based testing • Repeating tests for revision • Enables remedial intervention
  • 11. Theoretical drivers for Asesessment • Piaget (1930) Individual manipulating surroundings -- mental representations can be tested Induction of rules and their testing ...Nuffield science • Cognitive pyschology( 70s 80s) How are these mental representations stored? Classification development and how this can go wrong. .... Misconceptions and mal rules........ AI • Bruner (1982) assessment tasks will match competency levels depending on level of help...SCAFFOLDING
  • 12. Scaffolding and Interactivity; OU Science Foundation Course • Interaction, Feedback loops • Tell, Explore, Check • Predict, Look and Explain • Entering the discourse of a subject via audio feedback • Scaffolded text feedback (Bruner & Woods) • SHOW ME button
  • 13. Interactive Tasks • Games • Simulations • Making the abstract concrete • Directing the sequence of an animation
  • 14. Interactivity and Cognitive Change Scores 10 Interactivity Cog. Change 8 6 4 2 0 Galapagos Organic Meiosis & Seismic Molecules Mitosis Waves
  • 15. Theoretical drivers:Social Constructivism • Vygotsky (1978) individuals shape cultural settings which shapes minds no longer individual • Activity theory (Engstrom 1987) Tool mediation • Situated Cognition (Lave and Wenger 1991) Authentic assessment • Peer interaction and assessment • Learning conversations Laurillard (2002) DMW, Loughborough, July 2009
  • 16. Characteristics Descriptor Authentic Involving real-world knowledge and skills Personalised Tailored to the knowledge, skills and interests of each student Negotiated Agreed between the learner and the teacher Engaging Involving the personal interests of the students Recognise existing skills Willing to accredit the student’s existing work Deep Assessing deep knowledge – not memorization Problem oriented Original tasks requiring genuine problem solving skills Collaboratively produced Produced in partnership with fellow students Peer and self assessed Involving self reflection and peer review Tool supported Encouraging the use of ICT Elliott’s characteristics of Assessment 2.0 activities
  • 17. Authentic assessments :e-portfolios Electronic NVQ portfolio cover contents page, OCR IT Practitioner, EAIHFE, Robert Wilsdon
  • 18. Candidate Assessment Records section, OCR IT Practitioner, EAIHFE, Robert Wilsdon
  • 19. Building e-portfolios on a chef’s course Evidence of food preparation skill for e- portfolio, Modern Apprenticeship in Hospitality and Catering, West Suffolk food preparation for e-portfolio, Modern College, Mike Mulvihill Apprenticeship in Hospitality and Catering, West Suffolk College, Mike Mulvihill
  • 20. Sharing e-portfolios: The Netfolio concept • Social constructivism • Connecting e-portfolios (Barbera, 2009) • Share and build upon a joint body of evidence • Trialled with 31 PhD students at a virtual university • Control group used but Netfolio group obtained higher grades • Greater visibility of revision process and peer assessment in the Netfolio system
  • 21. Peer Assessment and the WebPA Tool • Loughborough (Loddington et al, 2009) • Self assess and peer assess with given criteria • Group mark awarded by tutor • Students rated: • More timely feedback • Reflection • Fair rewards for hard work • Staff rated: • Time savings • Administrative gains • Automatic calculation • Students have faith in the administrative system
  • 22. Mobile Technologies and Assessment • MCQs ,PDAs Valdiva & Nussbaum(2009) • Polls,instant surveys • Simpson & Oliver (2007) • Draper (2009) EVS
  • 23. Gains from Formative Assessment  Mean effect size on standardised tests between 0.4 to 0.7 (Black & Williams, 1998)  Particularly effective for students who have not done well at school http://kn.open.ac.uk/document.cfm?docid=10817  Can keep students to timescale and motivate them  How can we support our students to become more reflective learners and engage in formative assessment tasks?
  • 24. Formative assessment in Level 3 Physics and Astronomy SM358 The Quantum World 2007, 2009, 2010 SMT359 Electromagnetism 2006, 2008, 2010 S382 Astrophysics 2010 S383 The Relativistic Universe 2010 Black = Summative Red = Formative
  • 25. Formative assessment strategy SM358 and SMT359: Quantum and Electromagnetism iCMA Maths OCAS Threshold iCMA 51 TMA 01 iCMA 52 30% or more on 7 TMA 02 or more iCMA 53 assignments iCMA 54 TMA 03 including at least 2 TMAs. iCMA 55 TMA 04 iCMA 56 3 Revision Medleys Course Grade determined by exam score alone.
  • 26. Formative assessment strategy S382: Astrophysics iCMA 51 TMA 01 OCAS Threshold iCMA 52 TMA 02 TMA 03 30% or more on 8 iCMA 53 or more Project assignments. iCMA 54 TMA 04 iCMA 55 TMA 05 TMAs and iCMAs cover complementary material. Course Grade determined by exam (2/3)+ Project (1/3)
• 27. Formative assessment strategy, S383 (The Relativistic Universe). Assignments: iCMAs 51–56; TMAs 01–06; EC1. OCAS threshold: 30% or more on 9 or more assignments. TMAs and iCMAs cover complementary material. Course grade determined by exam (2/3) + EC1 (1/3).
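A minimal sketch of the kind of threshold rule described on the three slides above, applied to an invented student record; the function, parameter names and marks are illustrative only and are not taken from the OU's actual assessment systems.

```python
def meets_ocas_threshold(scores, min_assignments, min_tmas=0, pass_mark=30):
    """Check a formative (OCAS) threshold of the kind described above.

    scores: dict mapping assignment id (e.g. 'TMA01', 'iCMA52') to a percentage.
    The threshold is met when at least `min_assignments` assignments reach
    `pass_mark`, including at least `min_tmas` TMAs where that is required.
    The course grade itself is then determined by the exam (plus project/EC1).
    """
    passed = [name for name, mark in scores.items() if mark >= pass_mark]
    passed_tmas = [name for name in passed if name.startswith("TMA")]
    return len(passed) >= min_assignments and len(passed_tmas) >= min_tmas

# Hypothetical SM358-style record: 7+ assignments at 30%+, including 2+ TMAs
# (an S382-style rule would use min_assignments=8, S383 would use 9)
record = {"iCMA51": 80, "iCMA52": 45, "iCMA53": 35, "iCMA54": 20,
          "iCMA55": 60, "TMA01": 55, "TMA02": 41, "TMA03": 32}
print(meets_ocas_threshold(record, min_assignments=7, min_tmas=2))  # True
```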
• 28. Attempts at TMAs (Physics): charts for SM358 2009, SM358 2010, SMT359 2008 and SMT359 2010. Red = 2010, Blue = 2009.
• 29. Attempts at TMAs (Astronomy): charts for S381 2008, S382 2010, S357 2009 and S383 2010.
• 30. Average marks for TMAs (Physics): SMT359 and SM358. Blue = summative, Red = formative.
• 31. Average marks for TMAs (Astronomy): S357/S383 and S381/S382. Blue = summative, Red = formative.
  • 32. iCMA marks (SM358) Red = 2010 Blue = 2009
• 33. Time of first attempts at iCMAs (cumulative totals). Red = 2010, Blue = 2009; the charts mark the recommended completion dates and the 2010 and 2009 exam dates.
• 34. The bottom line
              SM358 2009   SM358 2010   SMT359 2009   SMT359 2010
Retention     70.7 %       70.5 %       62.7 %        60.8 %
Grade 1       22.0 %       22.7 %       12.9 %         7.3 %
Grade 2       15.2 %       13.5 %        9.8 %        10.5 %
Grade 3       12.3 %       17.9 %       15.1 %        17.1 %
Grade 4       11.5 %        6.3 %       12.3 %        13.3 %
Grade Q/5      9.7 %       10.1 %       12.6 %        12.6 %
Pass rate     61.0 %       60.4 %       50.1 %        48.2 %
• 35. The bottom line
              S381 2008   S382 2010   S357 2009   S383 2010
Retention     59.2 %      57.5 %      60.1 %      56.4 %
Grade 1       10.1 %      12.1 %      12.2 %       9.9 %
Grade 2       24.7 %      18.9 %       6.1 %      13.7 %
Grade 3       13.5 %      15.2 %      17.0 %      20.6 %
Grade 4        4.7 %       6.8 %      17.4 %       6.1 %
Grade Q/5      6.2 %       4.5 %       7.4 %       6.1 %
Pass rate     53.0 %      53.0 %      52.7 %      50.3 %
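Read this way, the two tables are internally consistent: the pass rate is the sum of Grades 1–4, and retention adds the Grade Q/5 row. A quick check of that reading, using the SMT359 columns from the first table (the dictionary layout is just a convenient encoding, not anything from the source systems):

```python
# SMT359 grade distributions (%) for 2009 (summative) and 2010 (formative),
# taken from the table above; check that pass rate = Grades 1-4 and
# retention = pass rate + Grade Q/5.
smt359 = {
    "2009": {"grades": [12.9, 9.8, 15.1, 12.3], "q5": 12.6,
             "pass_rate": 50.1, "retention": 62.7},
    "2010": {"grades": [7.3, 10.5, 17.1, 13.3], "q5": 12.6,
             "pass_rate": 48.2, "retention": 60.8},
}
for year, row in smt359.items():
    pass_rate = round(sum(row["grades"]), 1)
    retention = round(pass_rate + row["q5"], 1)
    print(year, pass_rate == row["pass_rate"], retention == row["retention"])
```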
  • 36. Conclusions • Formative assessment makes much less difference than we might have imagined. • In the first year of running this strategy, the results are comparable with (or marginally worse than) those for summative assessment • BUT we are still learning how to optimise the strategy, and there is the potential for future improvements.
• 37. Physicists’ conclusions • Formative assessment makes much less difference than we might have imagined • In the first year of running this strategy, the results are comparable with (or marginally worse than) those for summative assessment • BUT we are still learning how to optimise the strategy, and there is the potential for future improvements.
• 38. MU123: Discovering Mathematics • 30-point introductory module • Launched February 2010 • 2 presentations a year • 1800 students per presentation • Academic: Tim Lowe
• 39. 2 types of computer-assisted assessment quizzes. Practice Quizzes: formative; immediate feedback; 3 tries per question; worked solutions; can repeat; 12 questions per unit. iCMAs: summative; delayed feedback; single attempt; 6 questions per unit; 12% of grade.
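To make the contrast concrete, a small sketch that encodes the two quiz policies as data. The QuizPolicy class and its field names are invented for illustration; they are not the actual Moodle/OpenMark settings used on MU123, although the field values follow the slide.

```python
from dataclasses import dataclass

@dataclass
class QuizPolicy:
    purpose: str            # "formative" or "summative"
    feedback: str           # "immediate" or "delayed"
    tries_per_question: int
    questions_per_unit: int
    repeatable: bool
    weight_in_grade: float  # fraction of the overall module grade

# Illustrative encodings of the two MU123 quiz types described above
practice_quiz = QuizPolicy("formative", "immediate", 3, 12, True, 0.0)
icma = QuizPolicy("summative", "delayed", 1, 6, False, 0.12)

# e.g. the summative iCMAs contribute 12% of the module grade
print(f"iCMA contribution: {icma.weight_in_grade:.0%}")
```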
• 43. End of course survey 2010 • “The computer-based individual learning activities for this course supported my learning”: 92.2% agree, 0.5% disagree • Increased retention
• 44. Collaborative formative assessment with Global Warming. DMW, Institute of Educational Technology, September 1997
• 46. Next: ‘Yoked’ apps via BuddySpace. Student A and Student B screens (‘yoked’, but without full screen sharing required!)
  • 47. Global Warming: Simlink Presentation
• 48. Research issues from CAA: marking free text • e-Rater, Attali & Burstein (2006) • LSA, Landauer, Laham & Foltz (2003) • SEAR, Christie (1999) • Knowledge engineering, Pulman & Sukkarieh (2005) • IAT, Jordan and Mitchell (2009) • Genre analysis, Moreale & Vargas-Vera (2005)
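One of the approaches listed, latent semantic analysis, can be illustrated with a minimal sketch: reference answers and a student answer are projected into a reduced "semantic" space and compared by cosine similarity. This is only a toy illustration of the LSA idea using scikit-learn, with invented answer texts; it is not the e-Rater, SEAR, IAT or Open Comment implementation.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical model answers for a short free-text question, plus one student answer
reference_answers = [
    "rising carbon dioxide levels trap heat and warm the atmosphere",
    "greenhouse gases such as carbon dioxide absorb outgoing radiation",
    "burning fossil fuels increases greenhouse gas concentrations",
    "deforestation reduces the uptake of carbon dioxide from the air",
]
student_answer = "more carbon dioxide from fossil fuels traps heat in the atmosphere"

# Build a term-document matrix and reduce it to a low-dimensional semantic space
vectorizer = TfidfVectorizer()
tfidf = vectorizer.fit_transform(reference_answers + [student_answer])
svd = TruncatedSVD(n_components=2, random_state=0)
semantic = svd.fit_transform(tfidf)

# Score the student answer by its best match against the reference answers
scores = cosine_similarity(semantic[-1:], semantic[:-1])[0]
print(f"best semantic match: {scores.max():.2f}")
```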
  • 49. Open Comment addresses the problem of free text entry • Automated formative assessment tool • Free text entry for students • Automated feedback and guidance • Open questions, divergent assessment • No marks awarded • For use by Arts Faculty
  • 50. Three Causal Question Types proved to be suitable for Open Comment 1. Analysis of statistics, usually presented to the student as a table 2. Comprehension of a set text 3. Identifying similarities and differences for a given event
  • 51. Open Comment components • A Java-based feedback system • A web service shell • A graphical interface for testing • A Moodle-based question type • A forms-based editing tool
• 56. Stages of analysis by computer of students’ free text entry for Open Comment: advice with respect to content (socio-emotional support, stylised example) • STAGE 1a: DETECT ERRORS, e.g. incorrect dates and facts (incorrect inferences and causality are dealt with below). Example feedback: “Instead of concentrating on X, think about Y in order to answer this question.” Recognise effort (Dweck) and encourage another go: “You have done well to start answering this question but perhaps you misunderstood it. Instead of thinking about X which did not…….. Consider Y”
• 57. Computer analysis continued • STAGE 2a: REVEAL FIRST OMISSION: “Consider the role of Z in your answer.” Praise what is correct and point out what is missing: “Good, but now consider the role X plays in your answer.” • STAGE 2b: REVEAL SECOND OMISSION: “Consider the role of P in your answer.” Praise what is correct and point out what is missing: “Yes, but also consider P. Would it have produced the same result if P is neglected?”
• 58. Final stages of analysis • STAGE 3: REQUEST CLARIFICATION OF KEY POINT 1 • STAGE 4: REQUEST FURTHER ANALYSIS OF KEY POINT 1 (Stages 3 and 4 repeated with all the key points) • STAGE 5: REQUEST THE INFERENCE FROM THE ANALYSIS OF KEY POINT 1 IF IT IS MISSING • STAGE 6: REQUEST THE INFERENCE FROM THE ANALYSIS OF KEY POINT 1 IF IT IS NOT COMPLETE • STAGE 7: CHECK THE CAUSALITY • STAGE 8: REQUEST THAT ALL THE CAUSAL FACTORS ARE WEIGHTED
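A minimal sketch of how such a staged analysis can be organised: the stages are tried in order and the first one that fires returns its advice. Open Comment itself is a Java-based system and far richer than this; the Python sketch below only illustrates the ordered-stages idea, and the checks and messages are invented placeholders.

```python
def detect_error(answer):
    # Stage 1: flag an incorrect fact or date (placeholder check)
    if "1066" in answer:
        return "Instead of concentrating on 1066, think about the period the question asks about."
    return None

def first_omission(answer):
    # Stage 2a: point out the first missing key factor (placeholder check)
    if "trade" not in answer:
        return "Good so far - now consider the role trade played in your answer."
    return None

def check_causality(answer):
    # Stage 7: ask for an explicit causal link if none is stated
    if "because" not in answer and "led to" not in answer:
        return "You list relevant factors, but explain how they caused the outcome."
    return None

STAGES = [detect_error, first_omission, check_causality]

def feedback_for(answer):
    """Run the stages in order and return the first piece of advice that applies."""
    for stage in STAGES:
        advice = stage(answer)
        if advice:
            return advice
    return "Well done - now weight the causal factors against each other."

print(feedback_for("The population grew and towns expanded."))
```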
  • 59. Where are we now? • Opening up with Open Source • Moving towards vision and not losing sight of it through tool adaptation • More work to do for Arts • Open Comment - pedagogical model open to test • Feedback • Changing pedagogy • Another handle on misconceptions
  • 60. Open Comment drivers for reflection • Students are able to find facts similar to X • Know how X might be disputed • Are able to make predictions about X • Know how to use X in an argument • Know how far X can be pushed • Supported with tools and strategies
• 61. LISC: Aily Fowler • Kent University ab-initio Spanish module • Large student numbers • Skills-based course • Provision of sufficient formative assessment meant unmanageable marking loads • Impossible to provide immediate feedback • leading to fossilisation of errors
  • 62. The LISC solution: developed by Ali Fowler • A CALL system designed to enable students to: • Independently practise sentence translation • Receive immediate (and robust) feedback on all errors • Attend immediately to the feedback (before fossilisation can occur)
  • 63. How is the final mark arrived at in the LISC System? • The two submissions are unequally weighted • Best to give more weight to the first attempt • since this ensures that students give careful consideration to the construction of their first answer • but can improve their mark by refining the answer • The marks ratio can vary (depending on assessment/feedback type) • the more information given in the feedback, the lower the weight the second mark should carry
  • 64. Heuristics for the final mark • If the ratio is skewed too far in favour of the first attempt… • students are less inclined to try hard to correct non-perfect answers • If the ratio is skewed too far in favour of the second attempt… • students exhibit less care over the construction of their initial answer
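A minimal sketch of the two-attempt weighting described on the last two slides. The 60/40 and 80/20 splits below are purely illustrative, since the slides note that the actual ratio varies with the amount of information given in the feedback; the function name is invented.

```python
def lisc_style_mark(first_attempt, second_attempt, first_weight=0.6):
    """Combine two attempt marks (0-100) into a final mark.

    Weighting the first attempt more heavily encourages careful initial answers,
    while still rewarding corrections made in response to feedback. The richer
    the feedback shown between attempts, the lower the second weight should be.
    """
    second_weight = 1.0 - first_weight
    return first_weight * first_attempt + second_weight * second_attempt

# A student scores 70 first time, then 90 after acting on the feedback
print(lisc_style_mark(70, 90))         # 0.6*70 + 0.4*90 = 78.0
print(lisc_style_mark(70, 90, 0.8))    # heavier first-attempt weight: 74.0
```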
  • 65. Feedback: Advice for Action • Students must decode feedback and then act on it Boud (2000) • Students must have the opportunity to act on feedback Sadler (1989) • Gauging efficacy through student action
• 67. Advice for Action: Elliott’s characteristics of Assessment 2.0 activities
Authentic: Involving real-world knowledge and skills
Personalised: Tailored to the knowledge, skills and interests of each student
Negotiated: Agreed between the learner and the teacher
Engaging: Involving the personal interests of the students
Recognise existing skills: Willing to accredit the student’s existing work
Deep: Assessing deep knowledge, not memorization
Problem oriented: Original tasks requiring genuine problem solving skills
Collaboratively produced: Produced in partnership with fellow students
Peer and self assessed: Involving self reflection and peer review
Tool supported: Encouraging the use of ICT
  • 68. Embedding Assessment into Pedagogical Frameworks • Luca & Oliver (2002) • Boud & Falchikov (2007) • Bartlett-Bragg (2008)
• 70. The 4Ts Pyramid: Tool Development at the base, then Training of staff, Tasks and Transfer, leading to Learning Transformation.
• 71. Scatter plot: e-Assessment effectiveness (y axis) vs. innovation (x axis) for the case-study institutions and organisations: Dundee, Derby, OU, Ulster, East Antrim, Plymouth, Birkbeck, Glamorgan, Loughborough, Southampton, West Suffolk, Heriot Watt, Cambridge Assessment, Belfast, Coleg Sir Gar, Warwickshire College, UCL, Surrey, Calibrand and COLA.
• 72. Scatter diagram illustrating the relationship between maturity (y axis) and availability (x axis) for the same institutions and organisations.
• 73. Training • Question development • Quality: REAQ • Plagiarism software • Advice for Action • Socio-emotive content • Maintain empathy with the learner
• 74. e-Assessment Futures • Research issues • Adaptive testing • Automatic marking • Learning analytics and data mining • Web 2.0, 3.0 ...
  • 75. Rearranging • Promoting champions • Multidisciplinary teams • Open source • Pedagogical frameworks
  • 76. National Union of Students’ Principles of Effective Assessment Times Higher Education, 29th January 2009 • Should be for learning, not simply of learning • Should be reliable, valid, fair and consistent • Should consist of effective and constructive feedback • Should be innovative and have the capacity to inspire and motivate. • Should be conducted throughout the course, rather than being positioned as a final event • Should develop key skills such as peer and reflective assessment
• 78. References • Whitelock, D. (2010) Activating Assessment for Learning: are we on the way with Web 2.0? In M.J.W. Lee & C. McLoughlin (Eds.) Web 2.0-Based E-Learning: Applying Social Informatics for Tertiary Teaching. IGI Global. pp. 319–342. • Whitelock, D. (2008) ‘Accelerating the assessment agenda: thinking outside the black box’. In Scheuermann, F. & Pereira, A.G. (Eds.) Towards a Research Agenda on Computer-Based Assessment: Challenges and Needs for European Educational Measurement. European Commission, Joint Research Centre. pp. 15–21. ISSN 1018-5593
• 79. Three Assessment Special Issues • Brna, P. & Whitelock, D. (Eds.) (2010) Special Issue of International Journal of Continuing Engineering Education and Life-long Learning, focussing on electronic feedback: Feasible progress or just unfulfilled promises? Volume 2, No. 2 • Whitelock, D. (Ed.) (2009) Special Issue on e-Assessment: Developing new dialogues for the digital age. British Journal of Educational Technology, Volume 40, No. 2 • Whitelock, D. and Watt, S. (Eds.) (2008) Reframing e-assessment: adopting new media and adapting old frameworks. Learning, Media and Technology, Vol. 33, No. 3