The development of e-learning has progressed to a stage where it is becoming part of mainstream provision in higher education. Therefore the issue of assessing and sustaining the quality of e-learning must now come to the fore. Quality assessment in higher education is well-established in relation to learning and teaching generally, but what methods can be used to establish quality in the domain of e-learning?
The E-xcellence methodology for assessing quality in e-learning (EADTU 2009) is gaining recognition from European and international learning organisations. It was designed to be applied to the design and delivery of e-learning in both distance learning and blended learning contexts. It supports a range of uses, from accreditation by external agencies to process improvement through internal review.
The methodology presents principles of good practice in six domains of e-learning: strategic management; curriculum design; course design; course delivery; student support; and staff support. A total of 33 benchmark statements cover these domains, and are supported by a handbook for practitioners and guidance for assessors. The handbook includes principles for quality e-learning and exemplars of good practice. Amongst the tools is an online ‘QuickScan’ self-evaluation questionnaire based on the E-xcellence benchmarks which is highly valued as a focus for collaborative review of e-learning programmes.
The e-learning landscape has changed since the E-xcellence methodology was first developed. In particular, the use of Open Educational Resources (OECD 2007) and the application of social networking tools (Mason & Rennie 2008) were not explicitly considered in the original benchmarks. Accordingly, the E-xcellence NEXT project was instigated to produce and evaluate a revision of the benchmark criteria, associated handbook and exemplars. This paper describes the project process and initial recommendations.
A consultation exercise was carried out among E-xcellence participants. Feedback from this was brought to participatory workshops at a European Seminar on QA in e-learning in June 2011. Following this exercise, the benchmark statements were revised and are now available in beta version.
The project resources (QuickScan and manual) are being used for a series of self-evaluation and assessment seminars held at European higher education institutions. Feedback from these assessment seminars will be used to finalise materials for publication late in 2012. At that point the E-xcellence NEXT project will offer to the higher education community a set of self-evaluation and quality assessment tools which are fully updated to encompass social networking, Open Educational Resources and other recent developments in e-learning.
Next Steps for Excellence in the Quality of e-Learning
1. Next Steps for Excellence in the Quality of e-Learning
Jon Rosewell, Karen Kear, Keith Williams
Dept of Communications and Systems,
Faculty of Maths Computing and Technology,
The Open University, UK
ALT-C, 11-13th Sept 2012
2. E-xcellence project 2005–present
Funded by EU Lifelong Learning programme
Managed by EADTU
• E-xcellence 2005-06
– Development and trialling of criteria, handbooks and methodology
• E-xcellence plus 2008-09
– Dissemination to institutions and to QA agencies in 9 European countries
• E-xcellence NEXT 2011-12
– Continuing dissemination and updating of criteria and resources
3. [Image slide: E-xcellence resources, a manual of benchmarks and indicators structured into six areas: Strategic Management, Curriculum Design, Course Design, Course Delivery, Student Support, Staff Support]
4. [Image slide: the online Quickscan self-evaluation tool based on the benchmark statements]
5. E-xcellence: modes of use
• Informal self-evaluation
– Use Quickscan
• Local seminar
– Local use of Quickscan with justification for rating
– Meeting: institution, project team, national QA agency
– Improvement roadmap
• Full assessment
– As above but part of formal accreditation
– Evidence provided for benchmarks
6. E-xcellence NEXT: updating
• General updating of manual
– Clarifying language / terminology
• Deal with emerging trends
– Convergence between distance and F2F blended modes
– Social networking in HE
– Use of Open Educational Resources
• Process:
– Quickscan comments from partners
– Participatory workshops
– Feedback from local seminars
7. E-xcellence NEXT: social networking
• How might social networking contribute to high quality in e-learning?
• What risks to quality might arise?
• Which of the existing E-xcellence quality benchmarks might apply in this context?
• Are any new benchmarks needed to cover this scenario?
8. Social networking
• Where:
– Online communication: forums, blogs, wikis, …
– Sites: Facebook, Twitter, LinkedIn, …
• Why:
– Learning: social learning, collaborative work
– Building communities: motivation, progress, social
• Issues:
– Public (Facebook etc) or walled-garden (VLE)?
– Boundaries and invasion of student space?
9. Revised benchmarks – social networking
Curriculum design
10. Curricula are designed to enable participation in academic communities via online social networking tools. These online communities provide opportunities for collaborative learning, contact with external professionals and involvement in research and professional activities.
10. Indicators – social networking
• There are institutional policies relating to the provision of online community spaces for student-student and student-teacher interactions.
• Curriculum designers specify clearly the educational role that student-student interaction plays in their programmes.
• Criteria for the assessment of student online collaboration exist and are applied consistently across programmes and courses.
At excellence level:
• Teaching staff are supported by formal and informal staff development activity in the use of online tools for community building.
• The institution works closely with professional bodies in the development of online professional communities.
• Innovative assessment approaches, such as online collaborative work, peer assessment and self-assessment, form a part of the institution’s practice in this area.
11. E-xcellence NEXT: OERs
• How might OERs contribute to high quality in e-learning?
• What risks to quality might arise?
• Which of the existing E-xcellence quality benchmarks might apply in this context?
• Are any new benchmarks needed to cover this scenario?
12. Quality points
[Diagram: quality points across the OER lifecycle, from creation via the repository to use, including checking, peer review and user recommendation, and signals of brand, provenance and reputation]
13. Quality Dimensions
• Content: accuracy; currency; relevance
• Pedagogic effectiveness: learning objectives; prerequisites; learning design; learning styles; assessment
• Ease of use: clarity; visual attractiveness and engagement; clear navigation; functional
• Reusability & openness: format & interoperability; localisation; discoverability (metadata); digital preservation; accessibility
14. Revised benchmarks – OERs
Course design
14. OER material is selected with regard to learning outcome, tailored if necessary for fit to the learning context, and integrated with other learning materials. OER materials are subject to the same review processes as other course materials.
15. Indicators – OERs
• The institution has a policy for use of independent learning materials from a number of quality assured sources, including OER.
• Course materials obtained from OER are judged fit for purpose by students and external assessors.
• There is a principled approach to judging the quality of material obtained from an OER repository.
• There is a process for tracking intellectual property rights associated with e-learning components.
At excellence level:
• E-learning components are contributed to repositories as OER.
16. Updating E-xcellence resources
• Work done (published end Sept):
– Revised benchmarks and Quickscan
– Manual
• Edited for language, relevance to blended learning
• Including social networking and OER
• Work still to be completed:
– Update assessors’ notes
17. Local seminars – purpose
• To discuss with HE institutions the quality of e-learning on the basis of the benchmarks
• To explore with QA agencies how to incorporate e-learning into their frameworks
• To exchange ideas during an on-site visit
• To improve process:
– Exchange experience on the E-xcellence framework and the Quickscan
– Collect feedback on tools
18. Local seminars – format
• Preparation
– Participants: managers, staff members, course designers, tutors, students
– Decide programme to be assessed
– Select some or all benchmarks
– Team meets to complete QuickScan self-evaluation
• Seminar
– First day: local team meet with assessors
– Second day: local team, assessors and national QA agency
• Report
– From assessors
– Roadmap for improvement from institution
19. Local seminars 2011-12
• Russia: MESI University, Moscow
• Lithuania: Kaunas University of Technology
• Poland: Akademia Górniczo-Hutnicza, Krakow
• Cyprus: Open University of Cyprus, Nicosia
• Latvia: Riga Technical University
• Portugal: Universidade Aberta, Lisbon
• Greece: Hellenic Open University, Patras
20. Local seminar feedback (1)
• Framework
– Quickscan is valuable to structure discussion
– Completeness of the framework is appreciated
• Team working
– People exchange perspectives with other departments
• External perspective
– Exchange of experience between the evaluators and staff was valuable
– New ideas surfaced for course design
21. Local seminar feedback (2)
• Reflection
– A valued ‘moment of reflection’ on quality
– People become aware of choices and implementations
– Gives insight into strengths and weaknesses
• Analysis
– Opportunity to formulate e-learning policy
– Provides foundations for decision making
22. Comments and feedback?
Web: http://www.eadtu.nl/e-xcellencelabel/
Email: J.P.Rosewell@open.ac.uk
K.L.Kear@open.ac.uk
K.Williams@open.ac.uk
Thank you for your attention
23. What do we mean by ‘social networking’?
• ‘Social networking’ can be interpreted broadly to cover a range of online communication processes
– e.g. via forums, blogs, wikis
• It can also be interpreted more narrowly to focus on social network sites that provide accessible tools
– e.g. Facebook, Twitter, LinkedIn
24. Why use social networking?
• Social networking has two primary purposes in education:
– facilitating learning
• social learning theories
• focused pedagogic function such as group work, peer assessment
– building communities
• motivation and progress
• informal and social
25. Social networking tools
• Forums: discussion and debate
• Wikis: co-creation of resources
• Blogs: reflection, sharing and feedback
• Social network sites: sense of community
• Public (Facebook etc) or walled-garden (VLE)?
– Boundaries and invasion of student space?
26. Social network sites
Benefits:
• many students already use them regularly
• seen as more social, informal and flexible
Challenges:
• privacy issues
• lack of control
• blurring of boundaries between social and academic life
27. OER use-cases
• Life-long learner finds material for independent study
• Individual teacher uses assets in own material
• Course uses podcasts from iTunes U
• Course uses a 10-hour unit
• Entire 100-hour module reused, with new assessment
• Course and assignments in OER; tutorial / marking / accreditation offered for fee
• Consortium develops material for own use and ‘frees’ it
28. Trends toward Open Educational Practice?
• use → create
• teacher centred → learner centred
• transmission (sage on stage) → constructivism (guide on side)
• focus on outcome → focus on process
• standardised → personalised learning
• individual → social / peer learning
Capability maturity model: Use OERs → Adapt OER material → Create OER material
See, for example, OPAL OEP Guide
Speaker notes
What is the problem? There are established HE QA procedures in Europe, but these were designed for conventional universities and don’t necessarily fit e-learning. Solution: provide resources and processes for QA of e-learning which can be adapted for local/national purposes.
Resources: a manual including benchmarks and indicators, structured into six areas: Strategic Management, Curriculum Design, Course Design, Course Delivery, Student Support, Staff Support.
Online Quickscan: 35 benchmark statements giving a quick self-assessment of e-learning performance. Rate a programme/course on the most relevant aspects; this identifies hot and cold spots of the e-learning programme/course. The online version provides feedback: to identify elements to be improved, to guide the internal discussion, and to learn whether a full quality assessment procedure would be useful.
Is it possible to evaluate the quality of components in isolation, or only in the context of their use? Quality processes include checking, peer review, feedback, rating/voting/recommendation, and branding/provenance/reputation.
Note categories of users
Issue: OER use is very varied in scale – from single assets to whole courses – so QA procedures could be very different in different contexts