Paper given at the 19th Annual INSPIRE (International conference for Process Improvement, Research and Education) in Southampton, April 2014.
Complex and chaotic methods are increasingly being adopted in the development of technology to enhance learning and teaching in higher education, in pursuit of innovation in teaching practice. However, because this type of development does not follow a linear, process-driven order, it is notoriously difficult to evaluate its success as a holistic educational initiative. This paper proposes that five factors impede effective educational technology evaluation and contribute to insubstantial evidence of positive outcomes: premature timing; inappropriate software evaluation techniques and models; lack of a shared understanding of the terminology, or semantics, of educational technology; the growing complexity of agile and open development; and the corporatisation of higher education.
This paper suggests that it is no longer helpful for policy makers to judge whether educational technology project outcomes were successful or unsuccessful; instead, they should use agile evaluation strategies to understand the impact of the product, process and outcomes in a changing context. It is no longer useful to ask, ‘did the software work?’ The key is for software developers and policy makers to ask ‘what type of software works, in which conditions and for whom?’ To answer this, the software development community needs to adopt evaluation strategies from the social science community. Realist evaluation, for example, supplies context-driven, evidence-based techniques, exploring outcomes that tend towards the social rather than the technical. It centres on the ‘mechanisms’, ‘contexts’ and ‘outcomes’ associated with an intervention, and is a form of theory-driven evaluation in which the theory is drawn from the reasoning of its stakeholders and rooted in practitioner wisdom.
The Need for Evidence Innovation in Educational Technology Evaluation
1. http://cede.lboro.ac.uk
Melanie King
Head, The Centre for Engineering & Design Education
The Need for Evidence Innovation in
Educational Technology Evaluation
INSPIRE: International conference for Process Improvement, Research and Education
April 2014, Southampton Solent University
2. The Centre for Engineering and Design Education
• The Centre for Engineering and Design Education
• The problem: The need for evidence innovation
• The 5 contributing factors that lead to narrow evaluation
• Recommendations: Realist Evaluation
Outline
3.
The Centre for Engineering and Design Education (CEDE)
CEDE’s team of specialists works closely with the Engineering and Design Schools at Loughborough University to encourage effective practice and innovation in teaching and learning.
www.kit-catalogue.com
www.webpaproject.com
www.co-tutor.co.uk
4.
The problem: the need for evidence innovation
src: http://sociologicalimagination.org
“We need to be at the forefront of the edtech revolution.”
David Willetts MP
2011 marked a rapid and critical decline in funding, meaning tighter budgets and smaller teams.
How can we provide supporting evidence and make a case that in-house educational technology development is indeed a catalyst for innovation in T&L, and therefore justify further investment?
5.
5 Factors that impact on effective edtech evaluation
• Premature timing
• Inappropriate models
• Corporatisation of HE
• Complexity of agile
• Lack of shared terminology
6.
1. Premature Timing
Summative evaluations carried out immediately after an edtech development will never fully capture its potential influence and impact.
[Diagram: a learning technology development project timeline, from start to end; evaluation of the process, product and outcome takes place at the project’s end, before any longer-term impact can be observed.]
7.
[Diagram: where does CETL evaluation sit relative to long-term impact? src: http://en.wikipedia.org]
8.
2. Inappropriate existing software techniques and models
Existing maturity models do not help us to fully understand the organisational factors that affect the potential for success of in-house edtech development.
[Diagram: organisation maturity and users’ acceptance feed into a prediction of the potential to nurture and support edtech; maturity models shown: CMM, eMM, SESRMM.]
Existing acceptance models do not help us to understand staff and students’ beliefs, attitudes and intentions with regard to adopting new edtech. The UTAUT contributes to a study of technology adoption that Bagozzi (2007) described as “reaching a stage of chaos.”
9.
3. Political context and the corporatisation of Higher Education
Higher Education is in such a rapid state of change that contextual evaluations become problematic, with political drivers calling for quantifiable evidence of cost savings and efficiency.
10.
4. Complexity, agile development & participatory design
Homegrown edtech development is a complex and chaotic cycle of process and product
improvement.
src: http://ciosp3.us/is-agile-development-good-for-health-it/
[Diagram: an agile development cycle connecting students and developer. Caption: “Chaos is the science of surprises.”]
11.
5. Terminology – the semantics of edtech
The use of inconsistent terminology within the sector is a barrier to effective evaluation.
src: http://blogs.cetis.ac.uk
src: http://infteam.jiscinvolve.org
“Future research would benefit from a greater degree of consensus over the use of common and explicitly defined terminology.” (McLeod & MacDonell, 2011)
E-learning? Learning technology? E-administration? Educational technology? Edtech? Technology enhanced learning?
12.
RECOMMENDATIONS
From “Did the software work?”
to “What type of software works, in which conditions and for whom?” (Pawson, 2013)
13.
RECOMMENDATIONS
src: https://www.wageningenur.nl
14.
RECOMMENDATIONS
“Based on specific theories, realist evaluation provides an alternative lens to empiricist evaluation techniques for the study and understanding of programmes and policies. This technique assumes that knowledge is a social and historical product, thus the social and political context as well as theoretical mechanisms, need consideration in analysis of programme or policy effectiveness.” (Pawson, 2013)
15.
RECOMMENDATIONS
Mechanism: what independent mechanisms, linked to edtech initiatives, may lead to particular outcomes in a given context?
Context: what complex conditions are needed for an edtech initiative to trigger the mechanisms which produce particular outcome patterns?
Outcome pattern: what are the practical effects produced by causal mechanisms being triggered in a given context?
16.
Melanie King
Head, The Centre for Engineering & Design Education
m.r.n.king@lboro.ac.uk
Prof Ray Dawson
Department of Computer Science, Loughborough University
Dr Firat Batmaz
Department of Computer Science, Loughborough University
Prof Steve Rothberg
School of Mechanical and Manufacturing Engineering, Loughborough University