EVALUATION OF THE APPLICATION OF E-
LEARNING METHODOLOGIES TO THE
EDUCATION OF ENGINEERING
RITA RODRIGUES CLEMENTE FALCÃO DE BERREDO
Dissertação submetida para satisfação dos requisitos do grau de
DOUTOR EM MEDIA DIGITAIS
ESPECIALIDADE DE CRIAÇÃO DE AUDIOVISUAL E DE CONTEÚDOS INTERACTIVOS
Orientador: Professor Doutor Alfredo Augusto Vieira Soeiro
AGOSTO DE 2012
Versão revista em Abril 2013
Dissertação desenvolvida no âmbito do Programa Doutoral em Media Digitais da Faculdade de
Engenharia da Universidade do Porto em colaboração com a Universidade do Texas em Austin
Revised version, April 2013
Dissertation submitted to the Doctoral Programme in Digital Media at the School of Engineering of the University of Porto, in collaboration with the University of Texas at Austin
ACKNOWLEDGMENTS
I want to express my sincerest gratitude to Professor Alfredo Soeiro for encouraging and supervising
my research. His constant support, guidance and the several productive discussions were essential for
reaching this final stage.
My sincerest appreciation is extended to Professor Gráinne Conole and Professor Joan Hughes for the useful conversations and for their participation in my committee.
I am also grateful to Professor Raul Vidal, Professor Rui Maranhão and Professor José Fernando
Oliveira for their willingness to participate in this research study. I am especially thankful to Professor
Gabriel David for his willingness to participate in the study and his continuous disposition to help in
the development of the technical component of the project.
I would also like to express my appreciation to Dra. Lígia Ribeiro and to all the members of the
Unit for New Technologies in Education of Universidade do Porto for the constant support in my
pursuit to extend my studies. My gratitude also goes to the Fundação para a Ciência e Tecnologia for
their financial aid and the opportunity to research the area of e-learning and assessment.
My appreciation also goes to my closest friends and family for their support and understanding that
helped me to overcome the difficulties created by research work. A very special thanks goes to Nuno,
for being there, for his support, encouragement and patience, and especially for his commitment to my
researcher’s life. Finally, my sincerest gratitude goes to my Mother, who was always there, supporting me in every possible way during this long period and, in particular, for being the best Grandmother that Francisco could ever have wished for. To Francisco, my son, I dedicate this work.
Your education is the only thing that you will always carry with you,
no matter what happens. Nobody can take it away.
Luis Falcão de Berredo
To you, my Father, only one word: saudade…
TABLE OF CONTENTS
ACKNOWLEDGMENTS .....................................................................................................................V
TABLE OF CONTENTS ...................................................................................................................VII
ABSTRACT......................................................................................................................................... XI
RESUMO .......................................................................................................................................... XIII
RESUME.............................................................................................................................................XV
LIST OF TERMS AND ACRONYMS IN USE.............................................................................XVII
LIST OF TABLES.............................................................................................................................XIX
LIST OF FIGURES...........................................................................................................................XXI
CHAPTER 1 - INTRODUCTION....................................................................................................1
1.1 Background ......................................................................................................................................... 1
1.2 Field of Study and Intended Stakeholders........................................................................................... 2
1.3 Background and motivation of the researcher .................................................................................... 3
1.4 Statement of the problem and research questions.............................................................................. 4
1.5 Impact expectations of the study ........................................................................................................ 5
1.6 Structure of the study ......................................................................................................................... 6
1.7 Publications related with the research study ...................................................................................... 6
CHAPTER 2 - RELATED WORK...................................................................................................9
2.1 Introduction ........................................................................................................................................ 9
2.2 Assessment ........................................................................................................................................11
2.3 Overview on e-assessment.................................................................................................................30
2.4 Learning Outcomes ............................................................................................................................32
2.5 The alignment question......................................................................................................................39
2.6 Learning Outcomes in engineering education ....................................................................................41
2.7 Conclusions from the literature review..............................................................................................45
CHAPTER 3 - RESEARCH METHOD........................................................................................47
3.1 Research design .................................................................................................................................47
3.2 Development of the conceptual model..............................................................................................48
3.3 Multiple case-studies approach .........................................................................................................52
3.4 Data collection...................................................................................................................................52
3.5 Data analysis......................................................................................................................................54
3.6 Validity, reliability and limitations .....................................................................................................55
3.7 Ethical considerations ........................................................................................................................56
CHAPTER 4 - THE ALOA CONCEPTUAL MODEL.................................................................57
4.1 The rBloom matrix .............................................................................................................................58
4.2 Assessment methods .........................................................................................................................67
4.3 E-assessment tasks.............................................................................................................................86
4.4 The description of LO in EE.................................................................................................................92
4.5 Defining relations.............................................................................................................................107
CHAPTER 5 - IMPLEMENTATION........................................................................................ 113
5.1 Implementation scenarios................................................................................................................113
5.2 Practical tools ..................................................................................................................................118
5.3 Case studies .....................................................................................................................................123
CHAPTER 6 - INTERPRETATION OF RESULTS................................................................ 135
6.1 Application of the ALOA model........................................................................................................135
6.2 Adequacy of the chosen LOs in EE (RQ1) ..........................................................................................136
6.3 Adequacy of the selection of assessment methods (RQ2)................................................................137
6.4 Adequacy of the ALOA model to describe assessment.....................................................................137
6.5 The alignment question (RQ3 and RQ4) ...........................................................................................138
CHAPTER 7 - CONCLUSIONS ..................................................................................................141
7.1 Conclusions regarding the research problem....................................................................................141
7.2 Theoretical implications of the research project ..............................................................................143
7.3 Practical implications of the research project...................................................................................144
7.4 Implications for future research.......................................................................................................146
7.5 Final remarks....................................................................................................................................148
REFERENCES..................................................................................................................................149
ANNEXES...............................................................................................................................................I
Annex I - DATABASE DIAGRAMS ................................................................................................III
Annex II - TEMPLATES FOR THE DIFFERENT TYPES OF CASE STUDIES .........................V
Annex III - CASE STUDY 1.............................................................................................................VII
Annex IV - CASE STUDY 2.............................................................................................................. IX
Annex V - CASE STUDY 3 .............................................................................................................. XI
Annex VI - CASE STUDY 4........................................................................................................... XIII
ABSTRACT
This study researched the question of alignment between the intended learning outcomes and
assessment in the field of engineering education. Based on the initial problem, four research questions
were developed that focused on: defining and describing learning outcomes in engineering; defining
and describing assessment methods and e-assessment tasks; defining a model for achieving alignment
between learning outcomes and assessment; finally, applying the model to link specific learning
outcomes with specific assessment methods.
During this study a conceptual model, ALOA (Aligning Learning Outcomes with Assessment), was developed with the goal of providing an answer to the research problem. The model was derived
from literature research and included different components directly linked with the research questions.
The first one is a selection and description of learning outcomes in the field of engineering that was
based on existing qualification frameworks of the sector: ABET and EUR-ACE. The second
component was a selection and description of assessment methods. This selection was derived from the literature and compiled into a structured list of six general assessment categories, each subdivided into specific assessment methods. Both the learning outcomes and the assessment methods were
described using an adapted version of the Taxonomy Table developed by Anderson et al. in their work
A Taxonomy for Learning, Teaching and Assessing. The Taxonomy Table describes the assessment
items and the learning outcomes using a two-dimensional classification system based on knowledge
and cognitive processes. During the development of the ALOA model each selected learning outcome
and specific assessment method were analysed and classified in terms of type of knowledge and
cognitive processes addressed. This detailed description was used for answering the main question of
the research problem. By using the same classification system for both types of items, it was possible
to develop an alignment proposal.
The ALOA conceptual model was developed with the intention of being used by different stakeholders
in Higher Education Institutions. The study defined four different scenarios of implementation of the
ALOA model: to verify current alignment in existing courses; to develop an assessment strategy based
on statements of learning outcomes; to verify vertical alignment of courses with higher level learning
outcomes; to verify horizontal alignment of learning outcomes defined at the same level but in
different contexts, as in mobility situations. The ALOA model was translated into practical tools and tested for applicability using a multiple case-study approach. The results of the case studies are mainly concerned with improving the model by detecting problems and suggesting changes.
The main theoretical findings of the study relate to the clarification of core concepts in terms of assessment methods and assessment tasks. There was also an attempt to structure general information concerning assessment and e-assessment. Regarding the concept of alignment, this study contributes an innovative perspective that includes the clarification of the meaning of alignment and the definition of four criteria for achieving alignment: match, emphasis, coverage and precision.
RESUMO
Este estudo investigou a questão de alinhamento entre os resultados de aprendizagem e a avaliação, na
área da educação da engenharia. A partir do problema inicial foram definidas quatro questões a
investigar: como definir e descrever os resultados de aprendizagem em engenharia; como selecionar e
descrever métodos de avaliação e métodos de e-assessment; como definir um modelo que permita
atingir o alinhamento entre os resultados de aprendizagem e os métodos de avaliação; como aplicar o
modelo de forma a obter a ligação entre certos tipos de resultados de aprendizagem e métodos de
avaliação específicos.
Durante este estudo foi desenvolvido um modelo conceptual que tem como objectivo dar resposta ao
problema inicial, o modelo ALOA (Aligning Learning Outcomes with Assessment). O modelo foi
obtido a partir de investigação de literatura e inclui componentes que abordam cada uma das questões
acima mencionadas. A primeira componente é uma seleção e descrição dos resultados de
aprendizagem na área da engenharia que foi obtida a partir de quadros de qualificações existentes no
sector: ABET e EUR-ACE. A segunda componente é uma seleção e descrição de métodos de
avaliação que foi obtida a partir da análise de obras de referência na área. A seleção dos métodos de
avaliação foi organizada numa lista com seis categorias gerais em que cada uma inclui vários métodos
específicos. Tanto os resultados de aprendizagem como os métodos de avaliação foram descritos
recorrendo a uma versão adaptada da Tabela Taxonómica desenvolvida por Anderson et al. na obra A
Taxonomy for Learning, Teaching and Assessing. A Tabela Taxonómica define um sistema de
classificação com duas dimensões: tipo de conhecimento e processos cognitivos. Durante o
desenvolvimento do modelo ALOA cada resultado de aprendizagem e cada método de avaliação foi
analisado e classificado em termos de cada uma destas dimensões. Esta descrição detalhada dos itens
foi utilizada para responder à questão principal deste projeto de investigação que era o alinhamento
entre os resultados de aprendizagem e a avaliação. Ao utilizar o mesmo sistema de classificação
para os dois tipos de itens foi possível desenvolver uma proposta de alinhamento.
O modelo conceptual ALOA foi desenvolvido com o objectivo de ser usado por diferentes tipos de utilizadores em instituições de ensino superior. O estudo definiu quatro cenários de implementação
para o modelo: verificação do alinhamento em unidades curriculares existentes; definição de uma nova
estratégia de avaliação tendo como base os resultados de aprendizagem pretendidos; verificação do
alinhamento vertical de uma unidade curricular por comparação com resultados de aprendizagem
definidos a um nível superior; verificação do alinhamento horizontal de duas unidades curriculares
definidas ao mesmo nível mas em contextos diferentes como ocorre em situações de mobilidade. O
modelo ALOA foi traduzido em ferramentas práticas e a sua aplicabilidade foi testada recorrendo a
múltiplos casos de estudo.
As principais conclusões teóricas deste estudo estão relacionadas com a clarificação dos conceitos de métodos de avaliação e práticas de avaliação. Houve também uma tentativa da qual resultou uma
proposta de estruturação de informação relativa à avaliação e e-assessment. Em relação ao conceito de
alinhamento, este estudo contribui com uma perspectiva inovadora que inclui a clarificação do
significado do termo alinhamento e a definição de quatro critérios para que se possa atingir o
alinhamento: coincidência, ênfase, cobertura e precisão.
RESUME
Cette étude a examiné la question de l'alignement entre les acquis d'apprentissage et l'évaluation dans le domaine de la formation des ingénieurs. À partir du problème initial, quatre questions de recherche ont été définies: comment définir et décrire les résultats d'apprentissage en ingénierie; comment choisir et décrire les méthodes d'évaluation et les méthodes d'évaluation électronique; comment définir un modèle qui permette d'atteindre l'alignement entre les résultats d'apprentissage et les méthodes d'évaluation; comment appliquer le modèle de façon à obtenir la liaison entre certains types de résultats d'apprentissage et des méthodes d'évaluation spécifiques.
Au cours de cette étude, on a développé un modèle conceptuel qui vise à répondre au problème de départ, le modèle ALOA (Aligning Learning Outcomes with Assessment). Le modèle a été dérivé de la littérature scientifique et comprend des composantes qui répondent à chacune des questions mentionnées ci-dessus. La première composante est une sélection et une description des résultats d'apprentissage en ingénierie, obtenue à partir des cadres de qualifications du secteur: ABET et EUR-ACE. La deuxième composante est une sélection et une description des méthodes d'évaluation, obtenue à partir de l'analyse des ouvrages de référence dans le domaine. Le choix des méthodes d'évaluation a été organisé en une liste de six grandes catégories, chacune comprenant plusieurs méthodes spécifiques. Les résultats d'apprentissage et les méthodes d'évaluation ont été décrits à l'aide d'une version adaptée de la taxonomie développée par Anderson et al. dans leur ouvrage A Taxonomy for Learning, Teaching and Assessing. Cette taxonomie définit un système de classification à deux dimensions: type de connaissance et processus cognitif. Au cours du développement du modèle ALOA, chaque résultat d'apprentissage et chaque méthode d'évaluation ont été analysés et classés selon chacune de ces dimensions. Cette description détaillée des éléments a été utilisée pour répondre à la question principale de ce projet de recherche, à savoir l'alignement entre les résultats d'apprentissage et l'évaluation. En utilisant le même système de classification pour les deux types d'éléments, il a été possible de développer une proposition d'alignement.
Le modèle conceptuel ALOA a été élaboré pour être utilisé par différents utilisateurs dans les établissements d'enseignement supérieur. L'étude a défini quatre scénarios de déploiement pour le modèle: vérification de l'alignement dans les cours existants; définition d'une nouvelle stratégie d'évaluation basée sur les résultats d'apprentissage escomptés; vérification de l'alignement vertical d'une unité d'enseignement par rapport aux résultats d'apprentissage définis à un niveau supérieur; vérification de l'alignement horizontal de deux cours définis au même niveau, mais dans des contextes différents, comme en situation de mobilité. Le modèle ALOA a été traduit en outils pratiques et son applicabilité a été testée à l'aide de plusieurs études de cas.
Les principales conclusions théoriques de cette étude sont liées à la clarification des concepts de méthodes d'évaluation et de pratiques d'évaluation. Il y a eu aussi une tentative qui a mené à une proposition de structuration de l'information sur l'évaluation et l'e-évaluation. En ce qui concerne le concept d'alignement, cette étude fournit une nouvelle perspective qui inclut la clarification de la signification du terme et la définition de quatre critères permettant de parvenir à un alignement: la coïncidence, l'accent, la couverture et la précision.
LIST OF TERMS AND ACRONYMS IN USE
ABET Organization that accredits college and university programs in the fields of
applied sciences and engineering. Formerly known as Accreditation Board for
Engineering and Technology
ALOA Model for Aligning Learning Outcomes with e-Assessment
AT Assessment task
CDIO Conceiving — Designing — Implementing — Operating real-world systems
and products
CE Continuing education
CEI Continuing education institutions
EE Engineering education
EUR-ACE European quality label for engineering degree programmes
HE Higher education
HEI Higher education institutions
ICT Information and communication technologies
iLO Intended learning outcome
LO Learning outcome
LT Learning technologies
MCQ Multiple choice question
PL Prior learning
rBloom Adapted version of the revised Bloom’s taxonomy
RPL Recognition of prior learning
RQ Research question
SAQ Short answer question
TLA Teaching and Learning activities
TT Taxonomy Table
LIST OF TABLES
TABLE 1 - EXAMPLES OF SEARCH EXPRESSIONS USED ............................................................................................10
TABLE 2 - TYPES OF SOURCES USED AND EXAMPLES..............................................................................................11
TABLE 3 - TRENDS IN ASSESSMENT. FROM BROWN ET AL. [7, P.13] .......................................................................13
TABLE 4 – PRINCIPLES AND PRACTICES ON ASSESSMENT (EXTRACTED AND ADAPTED FROM WOODS [50]) ...........16
TABLE 5 - REASONS FOR ASSESSING STUDENTS......................................................................................................16
TABLE 6 - SUMMARY OF ASSESSMENT METHODS ...................................................................................................22
TABLE 7 - FAMILIES OF ESSAY QUESTIONS .............................................................................................................24
TABLE 8 - GENERAL CRITERIA FOR ASSESSING ESSAYS (ADAPTED FROM BROWN ET AL. [7]).................................25
TABLE 9 - PROBLEM SOLVING STAGES AND COGNITIVE PROCESSES (ADAPTED FROM WOODS [55, P.448]) ............26
TABLE 10 - SUMMARY OF PROBLEM-SOLVING AS VIEWED BY PLANTS ET AL. [56].................................................26
TABLE 11 - SUMMARY OF PRACTICAL ENQUIRY AS VIEWED BY HERRON ...............................................................27
TABLE 12 - ASSESSING EXPERIMENTAL PROJECT (ADAPTED FROM BROWN ET AL. [7, P.128-9]) ............................28
TABLE 13 - BRIEF ANALYSIS OF CRISP’S E-ASSESSMENT ITEMS..............................................................................32
TABLE 14 - DIFFERENCES BETWEEN SPECIFICITY LEVELS OF OBJECTIVES (FROM ANDERSON ET AL. [40, P.17]) ....35
TABLE 15 - SUMMARY OF BLOOM'S TAXONOMY [41] ............................................................................................36
TABLE 16 - SUMMARY OF THE KNOWLEDGE DIMENSION OF RBLOOM ....................................................................37
TABLE 17 - SUMMARY OF THE COGNITIVE DIMENSION ...........................................................................................38
TABLE 18 - AN EXAMPLE OF A TAXONOMY TABLE USED TO VERIFY ALIGNMENT ..................................................40
TABLE 19 - COMPARISON BETWEEN QUALIFICATIONS IN EE..................................................................................44
TABLE 20 - SUMMARY OF DATA COLLECTION ........................................................................................................53
TABLE 21 - SUMMARY OF THE ANALYSIS OF DATA.................................................................................................54
TABLE 22 - THE TAXONOMY TABLE BY ANDERSON ET AL.....................................................................................59
TABLE 23 - FACTUAL KNOWLEDGE (EXTRACTED FROM ANDERSON ET AL. [40, P. 29])..........................................60
TABLE 24 - CONCEPTUAL KNOWLEDGE (EXTRACTED FROM ANDERSON ET AL. [40, P. 29])....................................61
TABLE 25 - PROCEDURAL KNOWLEDGE (EXTRACTED FROM ANDERSON ET AL. [40, P. 29])....................................61
TABLE 26 – METACOGNITIVE KNOWLEDGE (EXTRACTED FROM ANDERSON ET AL. [40, P. 29]) .............................62
TABLE 27 - REMEMBER CATEGORY (ADAPTED FROM ANDERSON ET AL. [40, P. 67/8])...........................................63
TABLE 28 - UNDERSTAND CATEGORY (ADAPTED FROM ANDERSON ET AL. [40, P. 67/8]).......................................63
TABLE 29 – APPLY CATEGORY (ADAPTED FROM ANDERSON ET AL. [40, P. 67/8])..................................................64
TABLE 30 - ANALYSE CATEGORY (ADAPTED FROM ANDERSON ET AL. [40, P. 67/8])..............................................65
TABLE 31 - EVALUATE CATEGORY (ADAPTED FROM ANDERSON ET AL. [40, P. 67/8])............................................65
TABLE 32 – CREATE CATEGORY (ADAPTED FROM ANDERSON ET AL. [40, P. 67/68])..............................................66
TABLE 33 – RBLOOM MATRIX, AN ADAPTED VERSION OF THE TAXONOMY TABLE BY ANDERSON ET AL ...............67
TABLE 34 - MAPPING OF ASSESSMENT BASED ON MCQ TO BLOOM'S REVISED TAXONOMY ...................................69
TABLE 35 - MAPPING OF ASSESSMENT BASED ON ESSAYS TO BLOOM'S REVISED TAXONOMY................................73
TABLE 36 - MAPPING OF SIMPLE CLOSED-ENDED PROBLEM SOLVING TO BLOOM'S REVISED TAXONOMY (DIAGNOSIS
+ ROUTINE ACTIVITIES) ..................................................................................................................................76
TABLE 37 - MAPPING OF COMPLEX CLOSED-ENDED PROBLEM SOLVING TO BLOOM'S REVISED TAXONOMY ...........76
TABLE 38 - MAPPING OF OPEN ENDED PROBLEM SOLVING TO BLOOM'S REVISED TAXONOMY................................77
TABLE 39 - MAPPING OF PRACTICAL WORK TO BLOOM'S REVISED TAXONOMY ......................................................79
TABLE 40 - MAPPING OF ASSESSMENT USING SHORT-ANSWER QUESTIONS TO BLOOM'S REVISED TAXONOMY .......83
TABLE 41 - MAPPING OF REFLECTIVE PRACTICE ASSESSMENT TO BLOOM'S REVISED TAXONOMY..........................85
TABLE 42 - COMPLETE MAPPING OF ABET PROGRAMME OUTCOMES TO BLOOM'S REVISED TAXONOMY.............100
TABLE 43 - COMPLETE MAPPING OF EUR-ACE PROGRAMME OUTCOMES TO BLOOM'S REVISED TAXONOMY. .....107
TABLE 44 – MAPPING OF THE ABILITY TO USE NEWTON’S FIRST LAW IN ROUTINE PROBLEM SOLVING.................109
TABLE 45 - STAKEHOLDERS OF DIFFERENT SCENARIOS FOR ALIGNMENT..............................................................118
TABLE 46 – NUMBERING OF THE CELLS TO TRANSFORM TWO DIMENSIONS INTO ONE...........................................120
TABLE 47 - LINK BETWEEN ILOS AND THE LOS OF ABET ...................................................................................126
TABLE 48 - LINK BETWEEN ILOS AND THE LOS OF EUR-ACE ............................................................................126
TABLE 49 - SUMMARY OF METHODS OF ASSESSMENT USED IN THE CASE STUDY ..................................................127
TABLE 50 - OVERLAPPING OF RBLOOM FOR SIMPLE PROBLEM SOLVING WITH ITEM CS.02.A03.14A ...................127
TABLE 51 - OVERLAPPING OF RBLOOM FOR OPEN PROBLEM SOLVING WITH ITEM CS.02.A03.14B.......................127
TABLE 52 - OVERLAPPING OF RBLOOM FOR PROJECT WITH ITEM CS.02.A02.......................................................128
TABLE 53 - ALIGNMENT OF LO1 WITH REAL ASSESSMENT ...................................................................................129
TABLE 54 - ALIGNMENT OF LO2 WITH REAL ASSESSMENT ...................................................................................129
TABLE 55 - ALIGNMENT OF LO3 WITH REAL ASSESSMENT ...................................................................................129
TABLE 56 - ALIGNMENT OF LO4 WITH REAL ASSESSMENT ...................................................................................130
TABLE 57 - ALIGNMENT OF LO1 WITH STANDARD ASSESSMENT ..........................................................................130
TABLE 58 - ALIGNMENT OF LO2 WITH STANDARD ASSESSMENT ..........................................................................130
TABLE 59 - ALIGNMENT OF LO3 WITH STANDARD ASSESSMENT ..........................................................................130
TABLE 60 - ALIGNMENT OF LO4 WITH STANDARD ASSESSMENT ..........................................................................131
TABLE 61 - MATCHING SCORES OBTAINED FOR EACH ILO WHEN COMPARED WITH STANDARD ASSESSMENT
METHODS......................................................................................................................................................131
TABLE 62 - OVERLAPPING OF STANDARD ALOA SIMPLE PROBLEM SOLVING MATRIX WITH REAL ASSESSMENT
FROM CASE-STUDY .......................................................................................................................................132
TABLE 63 - OVERLAPPING OF STANDARD ALOA OPEN PROBLEM SOLVING MATRIX WITH REAL ASSESSMENT FROM
CASE-STUDY .................................................................................................................................................133
TABLE 64 - OVERLAPPING OF STANDARD ALOA PROJECT MATRIX WITH REAL ASSESSMENT FROM CASE-STUDY 133
TABLE 65 - RELATION BETWEEN THE PROBLEM AND THE ALOA MODEL .............................................................142
LIST OF FIGURES
FIGURE 1 - FIELD OF STUDY......................................................................................................................................2
FIGURE 2 - ASSESSMENT CYCLE AS DESCRIBED BY BROWN ET AL..........................................................................15
FIGURE 3 - PURPOSES OF ASSESSMENT ...................................................................................................................17
FIGURE 4 - SOURCES OF ASSESSMENT ....................................................................................................................18
FIGURE 5 – CURRENT CHALLENGES IN ASSESSMENT ..............................................................................................18
FIGURE 6 - DIFFERENT DIMENSIONS OF RELIABILITY OF ASSESSMENT....................................................................19
FIGURE 7 - TYPES OF VALIDITY RELATED WITH CURRENT RESEARCH.....................................................................20
FIGURE 8 - THE LEARNING CYCLE AS VIEWED BY KOLB.........................................................................................29
FIGURE 9 - GROUPS OF ACTIVITIES RELATED WITH E-ASSESSMENT IMPLEMENTATION [26]....................................30
FIGURE 10 - THE SAME LO SHOULD BE PRESENT IN ALL THE ACTIVITIES ...............................................................41
FIGURE 11 - RELATION BETWEEN THE CONCEPTUAL MODEL AND THE RESEARCH QUESTIONS................................49
FIGURE 12 – INITIAL REPRESENTATION OF THE PROBLEM.......................................................................................49
FIGURE 13 - SECOND VERSION OF THE MODEL........................................................................................................50
FIGURE 14 - THIRD VERSION OF THE MODEL...........................................................................................................51
FIGURE 15 - FOURTH AND FINAL VERSION OF THE MODEL......................................................................................52
FIGURE 16 - THE ALOA CONCEPTUAL MODEL.......................................................................................................57
FIGURE 17 - ALIGNMENT POSSIBILITIES FOR ONE UNIT OR COURSE ......................................................................107
CHAPTER 1 - INTRODUCTION
1.1 BACKGROUND
Since the second half of the 20th century, the World has been experiencing rapid transformation in the field of Education, led by the changing Knowledge Society [1]. As Peter Drucker explained in 1996, in this new society access to work is gained only through formal education and no longer acquired through apprenticeship. Almost two decades have passed and this is already happening in some parts of the World. Education and schooling have become a major concern for society and a priority in national and transnational policies. The strategic document Europe 2020 [2] defines as a priority the development of an economy based on knowledge and innovation. All educational institutions play an important role in achieving this goal.
Higher Education (HE), Continuing Education (CE) and Vocational Training have been the most affected by this transformation, adapting to the labour market's demand for new skills while responding to the needs of an increasing number of students [3]. The global economy has created both the opportunity and the need for mobility of students and workers, demanding efficient recognition of qualifications and increasing competitiveness in this field. The labour market demands more qualified and up-to-date workers, a trend mirrored in European educational policies [4-6]. All this generates pressure towards a quality-based approach for all education providers, as Drucker predicted in his article about the knowledge society [1].
One visible effect of this transformation is the shift from a content-based approach in Education to an approach centred on the student and what he/she has learned and achieved. This transformation has led to different trends in HE, as identified by Brown et al. [7]. One of these trends is the change from structuring education around what should be taught, by defining educational objectives, to structuring it around what students should learn: the Learning Outcomes (LOs). This approach underpins the development and implementation of most European Education policies at international and national levels [8-10]. In Europe, HEIs and CEIs are redefining programmes in terms of Learning Outcomes, harmonizing them with national, international and sector-level frameworks of qualifications that are also based on Learning Outcomes [11-14]. Several projects and initiatives are
working towards the definition of LOs, specific and transversal, that can be used as a common reference [15, 16]. Learning Outcomes are also becoming fundamental for structuring the standards and guidelines of quality assessment of HE and CE institutions in Europe and worldwide [17-20]. The field of engineering education has also been affected by these trends: learning outcomes are being used in qualification frameworks specific to engineering, such as ABET, EUR-ACE and CDIO. In this context, the assessment of Learning Outcomes becomes a crucial process for the Educational System. It should be a major concern of educational institutions to ensure that the assessment of student learning is guided by what students should be learning, i.e. assessment should be aligned with the intended learning outcomes (iLOs).
Another major revolution in our society has been the introduction of Information and Communication Technologies. The application of ICT to Education, e-learning, has been increasing; it creates new opportunities for teaching, learning and assessment and has huge potential as an answer to some of the current challenges of Education. The change to digital media has an impact on the availability, reusability, accessibility and cost of learning resources, complemented by the communication and networking potential of the Internet, which takes Education to a global level [21-23]. The application of ICT to education, and in particular to assessment, is a subject of great discussion. Some of the issues raised by the use of e-learning in assessment concern the validity and reliability of the process. This study focuses on this particular issue.
1.2 FIELD OF STUDY AND INTENDED STAKEHOLDERS
The current study looks into the relationship between learning and assessment. The focus of the study is placed at the intersection of three areas, as illustrated by Figure 1: learning outcomes, assessment of student learning and e-learning. It focuses on Higher Education and more specifically on Engineering Education. The question of alignment of e-assessment and education is represented by the area where the three circles overlap.
Figure 1 - Field of study
The wide use of e-learning is already impacting Education, promoting change and innovation in different aspects, including pedagogy, technology, organization, accessibility and flexibility, among others [24]. It is a complex and multidisciplinary area and, given its impact, it is important that e-learning research be informed by evidence [23]. Current literature reviews in this area indicate that e-learning approaches to assessment lack a pedagogical framework and that most research describes
implementation studies at course level [25]. The present research intends to contribute to establishing
a conceptual model for the implementation of e-assessment in Engineering Education.
Assessment is a crucial process of Education and is seen by current trends as part of the learning process, not as a separate event [7]. Assessment of student learning encourages the involvement of students and provides feedback to students and teachers [26]. It has an important role in validation and certification and is deeply related to quality issues, as will be explained below. The current study focuses on the specific link between what should be learned and what should be assessed. The purpose of this study is to research the applicability of e-assessment strategies to the field of engineering education. It intends to help in the definition of adequate e-assessment strategies based on the stated intended learning.
This research study mainly applies to engineering courses that use or intend to use e-assessment
strategies. The implementation stage addressed existing courses of Faculdade de Engenharia da
Universidade do Porto (FEUP). It focused on the stated learning outcomes of the courses and on the
existing assessment and e-assessment strategies as important components of the alignment question.
As the study focuses at course level, it is strongly related to the activities of faculty, who are usually the ones who define the iLOs and the assessment tasks. However, the results of the study have an impact on other levels, including programme and qualification frameworks. It is expected that other stakeholders, including curriculum designers, mobility staff, and quality and accreditation staff, will use the results of this study. Students are not considered direct users of the study and tools. Still, the project is based on a learner-centred perspective of education and they will benefit from a more efficient and effective educational process.
1.3 BACKGROUND AND MOTIVATION OF THE RESEARCHER
Since 1998 I have been working in the field of ICT applied to education. I started working as an editor
of multimedia projects in an educational publishing company. Later in 2001, I continued the work in
this field as responsible for the e-learning unit of Universidade do Porto. This position gave me the
opportunity to contribute to the implementation of an institutional strategy for e-learning. At the same
time, it allowed me to build a transversal perspective of the implementation of learning technologies at
the institutional course level, including different learning management systems (LMS) and other
learning technologies (LTs). This role included giving individual support to teachers in terms of
pedagogical and technological issues. It was possible to learn from the different strategies used by the
teachers in terms of teaching and assessment and to perceive the perspective of the students.
Additionally, it included active participation in European research projects in the field of e-learning, which was important for learning and updating knowledge and skills in this area.
During this period I was a teacher in traditional courses at the university. As I developed blended
learning strategies to work with the students I could better understand the perspective of the teachers
and learners when dealing with e-learning. I was involved in the development, teaching and tutoring of
distance e-learning courses, which contributed to improving my knowledge in this area. Finally, I was a student in a Master's programme of the School of Engineering that included an e-learning course with e-assessment tasks. This gave me a perception of the student's view of e-learning.
This summary of activities shows that the motivation for the field of study comes from both the academic and the professional areas. When dealing with e-learning strategies from each of these perspectives, assessment has always been a component that raises questions and problems. The flexibility of the e-learning environment creates opportunities for experimentation, and many interesting
and creative examples of e-assessment tasks can be found at Universidade do Porto. A recent trend at
the institution showed that teachers are using e-learning based assessment for summative purposes.
Most of the e-assessment implementations were online exams but the teachers successfully developed
other types of tasks [27-29]. All of these represent a very interesting field of research.
The application to the field of engineering comes basically from the same background. From the
researcher's perspective, there are two subject areas that are quite innovative in the use of Learning Technologies: medicine and engineering. Also, both areas carry strong civil responsibility and the need for regulation by professional bodies. Both aspects have an impact on the mobility of students and workers. My affiliation to the School of Engineering, as a student and a former invited teacher, also contributed to my decision on the scope of application of the research.
Finally, I worked closely on the creation and development of the European project VIRQUAL, which was related to the field of qualification frameworks, learning outcomes and assessment in the context of international mobility [30-33]. This work developed my interest in the area of transnational Education policies, Qualification Frameworks and Quality Assurance, a field that has gained increased importance in Higher Education.
The combination of these different areas of interest contributed to the definition of the problem and to
the research approach that came after the initial research.
1.4 STATEMENT OF THE PROBLEM AND RESEARCH QUESTIONS
The research problem is placed at the intersection of three fields of study: Assessment, Learning Outcomes and e-Learning. The scope of application is engineering education. Engineering education has specificities and requisites [34] that may constitute challenges to the implementation of assessment based on Learning Technologies (LT). This study intends to contribute to developing e-learning strategies for assessment in EE, helping to deal with current challenges in education.
In general terms, the approach chosen was to develop a model that matches specific e-assessment
methods to specific LOs in the field of engineering. This means that it might be possible for teachers
to define the intended Learning Outcomes (iLOs) of an online course and, from this definition, to obtain an indication of the e-assessment methods they might consider using. Formally, this problem is
defined as:
To what extent may e-assessment methods be used to measure the
achievement of Learning Outcomes in engineering education?
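The model developed in response to this problem (Chapter 4) classifies both iLOs and assessment methods on the same two-dimensional rBloom matrix of knowledge types and cognitive processes and then compares the two classifications for overlap. The short Python sketch below illustrates only this general idea: the function names, the set-based representation of a classification and the simple coverage ratio used as a matching score are assumptions made for illustration, not the rules defined by the ALOA model.

# Illustrative sketch of the matching idea (assumed names and scoring rule).
# Both learning outcomes and assessment methods are classified as sets of
# cells of the rBloom matrix: (knowledge type, cognitive process).

KNOWLEDGE = {"Factual", "Conceptual", "Procedural", "Metacognitive"}
PROCESSES = {"Remember", "Understand", "Apply", "Analyse", "Evaluate", "Create"}

def classify(*cells):
    """Validate and return a classification as a set of (knowledge, process) cells."""
    assert all(k in KNOWLEDGE and p in PROCESSES for k, p in cells)
    return set(cells)

def matching_score(lo_cells, method_cells):
    """Assumed rule: fraction of the LO's cells covered by the assessment method."""
    return len(lo_cells & method_cells) / len(lo_cells) if lo_cells else 0.0

# Hypothetical example: an iLO about applying a procedure to routine problems,
# compared with a simple closed-ended problem-solving assessment method.
ilo = classify(("Conceptual", "Understand"), ("Procedural", "Apply"))
simple_problem_solving = classify(("Factual", "Remember"),
                                  ("Conceptual", "Understand"),
                                  ("Procedural", "Apply"))
print(matching_score(ilo, simple_problem_solving))  # 1.0: the method covers the iLO

In the dissertation itself this comparison is expressed through overlapping rBloom matrices and matching scores (see Chapter 5 and Tables 50 to 64) rather than through code.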
Given this problem, it was necessary to recognize that there is a wide variety of engineering schools, engineering programmes and engineering courses, and that different qualification frameworks use LOs in the engineering sector. So, the first challenge was how to select the LOs that were going to be used for the purpose of the research. The selection process had to take into consideration the subject area, the level and even the nature of the LOs. The same problem existed in relation to the assessment methods. Assessment tasks are usually defined at course level, even though some examples can be found at higher levels. Again, there is a considerable variety of assessment and e-assessment tasks that could be used in this study. Another aspect of assessment is that in most cases it is deeply embedded in the structure of the course or unit and highly contextualized.
The first two research questions translate this need for definition of the central concepts of the
problem:
RQ1) Which type of Learning Outcomes in the field of Engineering are
relevant and should be considered?
RQ2) What are the e-assessment methods that should be considered?
After defining the concepts, it is necessary to deal with the core of the problem, which is the relationship between assessment and LOs. This link between the iLOs and the assessment tasks is part of what is considered the alignment of a course [35, 36]. The first concern with alignment is whether every iLO is assessable. It is a recognized issue in HE that curricula need to include transversal skills and that these are difficult to assess. In engineering, curricula are under pressure to include these types of LOs alongside the technical LOs specific to the sector. This is a known challenge in EE [34] that may also represent a challenge to effective assessment. The last two research questions of this study are related to the alignment question:
RQ3) What type of intended Learning Outcomes can be measured by e-
assessment methods?
RQ4) Is it possible to propose specific e-assessment strategies for each type
of LO in EE?
1.5 IMPACT EXPECTATIONS OF THE STUDY
Assessment is an important issue in HE that has a high impact on learning [3, 7, 37, 38]. It raises questions about the efficiency, effectiveness and adequacy of different methods and strategies. E-assessment, due to its technological component and its association with distance learning, brings additional controversies, some of them intrinsic to e-learning. As described by Conole and Martin, e-learning is a complex area that may have an impact on a diversity of fields, including the teaching and learning process (T/L), organizational structures and political and socio-cultural issues, among others [22].
It is expected that the findings of this research project will:
• Help teachers to decide which assessment tasks are more suitable for the LOs stated for a
specific course or module. The ALOA model developed includes practical tools that will
support the decision process.
• Help teachers and other stakeholders to verify the current alignment of existing courses. This is one of the scenarios of application of the ALOA model.
• Contribute to developing a pedagogical framework for the implementation of aligned e-assessment strategies, based on the definition of Learning Outcomes. The ALOA model was
developed as a conceptual and theoretical model that intends to promote the alignment
between iLOs and assessment.
• Facilitate accreditation processes and navigation between different qualification frameworks,
by providing a common tool for the description of LOs. This was defined in the current study
as vertical alignment and is one of the implementation scenarios of the ALOA model.
• Promote mobility and recognition of prior learning, formal or informal, by allowing the comparison of the LOs of previous experiences with the intended ones.
1.6 STRUCTURE OF THE STUDY
This dissertation is divided into seven chapters.
This chapter defines the context of the current research study and introduces the main thematic areas
that will be explored during the development of the work. After providing a rationale for the work, the
chapter will present the background of the researcher and the motivations for conducting this study.
Next, the research problem is described, as well as the research questions that guided the project.
Finally, the chapter addresses the expected impact of the research and includes a description of the
structure of this dissertation, with a brief explanation of the contents of each chapter.
Chapter 2 provides a theoretical background for the problem and a critical analysis of the published
work in the field. From the analysis of theory and published work it was possible to define the
approach that was followed to address the problem. The ALOA conceptual model was derived from
the literature research, so this chapter grounds most of the decisions that were taken during the development of the work.
Chapter 3 describes the different components of the research method, including the theoretical
development of the conceptual model and the multiple case studies approach used in the
implementation stage. The chapter also explains how the research project dealt with validity,
reliability and ethical issues.
In Chapter 4 the conceptual model is described in detail. It includes the description of the main
concepts of the model: LOs in engineering and assessment methods. The chapter also includes a
definition of the relationships between the concepts that are included in the ALOA model.
Chapter 5 describes the implementation of the model. It starts by identifying and describing some
implementation scenarios and the practical tools that were developed. Finally, the chapter describes
the implementation of the ALOA model to the case studies that were used to test the applicability of
the model.
In Chapter 6 the results of the applicability test are interpreted from the perspective of each case and
from a transversal point of view. In this chapter, the results of the implementation stage are analysed
from the perspective of the research questions.
Chapter 7 is the final chapter of the dissertation. It starts by presenting the global conclusions of the
work from the perspective of the research problem. Then, the results are analysed in terms of the
contributions to theory and practice. Finally, it explores future work that could be developed based on
the current project.
1.7 PUBLICATIONS RELATED WITH THE RESEARCH STUDY
• “E-ASSESSMENT OF LEARNING OUTCOMES IN ENGINEERING EDUCATION”,
Falcão, R., Proceedings of WEEF 2012, Buenos Aires, Argentina, October 2012 (accepted)
• “A CONCEPTUAL MODEL FOR E-ASSESSING STUDENT LEARNING IN
ENGINEERING EDUCATION”, Rita Falcão, Proceedings of ICL2012, Villach,
Switzerland, September 2012 (accepted)
• “A CONCEPTUAL MODEL FOR E-ASSESSING STUDENT LEARNING IN
ENGINEERING EDUCATION”, Falcão, R., SITE 2012, Austin, Texas, USA, March 2012
• “ASSESSING LEARNING OUTCOMES IN ENGINEERING EDUCATION: AN E-
LEARNING BASED APPROACH”, WEE 2011, Lisboa, Portugal, September 2011
• “Assessment of Learning Outcomes through e-Learning”, WCCEE2010, Singapore, October
2010
• "MEASURING IMPACT OF E-LEARNING ON QUALITY AND LEARNING
OUTCOMES – A PROPOSAL FOR A PHD PROJECT", Falcão, R., Soeiro, A., EDEN
2007, June 2007
CHAPTER 2 - RELATED WORK
2.1 INTRODUCTION
The field of study of this dissertation is defined by the intersection of three areas: assessment, learning
outcomes and e-learning. Given this field, the research focused on the area of Higher Education and in particular Engineering Education. Given this general field, what was the literature review expected to provide?
What guided the literature research was the analysis of the problem and of the research questions.
To what extent may e-assessment methods be used to measure the
achievement of intended Learning Outcomes in engineering education?
The problem is composed of two dimensions and intends to explore a specific relation between them. The dimensions “Learning Outcomes in Engineering Education” and “e-assessment methods” are the
focus of the first two research questions.
RQ1) Which Learning Outcomes in the field of Engineering are relevant and
should be considered?
RQ2) Which are the on-line assessment methods that should be considered?
These two questions are intended to clarify the dimensions that will be analysed, so that a valid, or at
least an informed, selection process is possible. Literature research was fundamental to analyse this
aspect of the problem. The first dimension to be analysed was LOs in EE. Starting with the general
approach, it was necessary to clarify certain aspects related to LOs, including terminology, definitions
and the theory behind them. Concerning EE, the review examined the role of LOs in EE, how they
were implemented and whether there was a trend towards a common set of LOs in engineering
courses.
Concerning the second dimension, e-assessment, it was possible to find a wide variety of published
work with a corresponding variety of assessment methods. However, most of the published work was
based on individual cases of application. It was difficult to find fundamentals of e-assessment or a
comprehensive study of e-assessment methods. Early in the literature research on this subject, it was
considered necessary to go back to the basics of assessment. Donnan [39] faced a similar situation in
his work: educational developers considered that e-assessment, or online assessment, was not distinct
from assessment in general. The analysis therefore started from the perspective of general assessment,
to clarify several issues, and finally specific e-assessment methods were researched.
The third component of the problem is a specific relation between the dimensions, the alignment
between LOs in EE and e-assessment methods. That is the core of the problem and the focus of
research questions 3 and 4.
RQ3) What type of intended Learning Outcomes can be measured by e-
assessment methods?
RQ4) Is it possible to propose specific e-assessment strategies for each type
of LO in EE?
In this stage, literature research was centred on finding suggestions on using specific assessment
methods to measure or evaluate specific types of Learning Outcomes, in general and in the field of
Engineering. An important finding in terms of related work was the book “A Taxonomy for Learning,
Teaching, and Assessing” [40], which became a central piece of this research project.
2.1.1 LITERATURE RESEARCH METHOD
The initial stage of the literature research was exploratory, with a view to obtaining a framework for
the research problem. The subsequent, more systematic stage was guided by the analysis of the
problem and by previous knowledge of the subject. Table 1 summarizes the main search expressions used.
Table 1 - Examples of search expressions used (search terms per research question)
RQ1: (Learning Outcomes or Learning Objective) and (Engineering; Higher Education; qualification frameworks; ABET; EUR-ACE)
RQ2: (assessment or e-assessment or evaluation) and (methods; tools; online; CBT (computer based testing); CBA (computer based assessment); CAA (computer assisted assessment); web based)
RQ3: Learning Outcomes + assessment; Learning Outcomes + assessment methods; Learning Outcomes + evaluation; Bloom + assessment; Bloom + Learning Outcomes + assessment
RQ4: (assessment or e-assessment or portfolios or multiple choice questions (MCQ) or essays or short answer questions (SAQ)) and (engineering; learning outcomes + engineering; learning outcomes + ABET; learning outcomes + EUR-ACE; learning outcomes + Bloom; ABET + Bloom; EUR-ACE + Bloom)
The research was conducted using resources available at the library of the School of Engineering,
including books, journals and databases of journals and conference proceedings. Additionally, online
searches were performed using web tools, including Google, Google Scholar and specific websites.
Table 2 presents the main sources used during this research project.
Table 2 - Types of sources used and examples
Books: Taxonomy of educational objectives [41]; A taxonomy for learning, teaching, and assessing: a revision of Bloom's taxonomy of educational objectives [40]; Teaching for quality learning at university: what the student does [42]; Assessing student learning in higher education [7]; Assessment for learning in higher education [43]; Designing better engineering education through assessment [44]; Contemporary perspectives in e-learning research: themes, methods, and impact on practice [24]; The Sage handbook of e-learning research [21]; The e-assessment handbook [26]
Journals: Assessment and Evaluation in Higher Education; European Journal of Engineering Education; Journal of Engineering Education; British journal of educational technology; Alt-J, research in Learning Technology; Studies in Higher Education
Conference proceedings: CAA; Frontiers in Education; Teaching and Learning in Higher Education; EDUCOM; Computers in Education
Policy papers: European Commission; Bologna Process; Copenhagen Process; National Institutions
Websites: ABET; EUR-ACE; AHELO; Tuning; JISC; E-learning Papers
2.2 ASSESSMENT
Assessment is an important process in education and is intimately related to how students learn, as
pointed out by different authors. Brown, Bull and Pendlebury [7, p. 7] state that assessment is at the centre
of the learning experience and should be a concern for those involved in the learning process,
including learners, educators and institutions. In their view, assessment defines what students will
consider important, how they will spend their time and even how they will see themselves in the
future. They consider that to change learning we need to change the way we assess. This perspective is
shared by Biggs [42] when he describes the backwash effect of assessment. Biggs argues that
students will look strategically at assessment to determine what and how they will learn. Again, the
main idea is that assessment is the central driver for the learning process.
Another author, Peter Knight [43, p.11], presents a similar but more dramatic view when he asks how we
can make students work without assessment. In the introduction to the book “Assessment for Learning
in Higher Education” Knight considers that assessment is a moral activity that states the values of the
teachers and institutions. What is assessed and the methods used give clear indication to the students
of what is valued in a course or in a programme. In the view of the author, even though the goals of a
full programme are stated in documents like the mission statement and the programme goals, it is in
fact in the assessment tasks that the essence of the programme and of the learning experience lies.
In the same book, David Boud [3] considers that assessment is the aspect of HE with the most bad
practice, and that the effects of this bad practice have a strong impact on students' learning and success.
As the author notes, students may avoid bad teaching methods but they cannot avoid being assessed if
they want to graduate. Boud, based on the work of other authors, considers that as a consequence of
bad assessment practices, students may be hurt in their self-esteem and may even reject some subjects.
Race [45] also explores the negative effects of assessment, in particular exams, in terms of feelings.
He describes the worst nightmares students have before or after an exam and how these bad feelings
affect learning.
Reinforcing this idea, Brown et al. [46] write that even though assessment is an important element of
the learning process, new or existing faculty rarely have training to improve their assessment skills.
An interesting perspective from the works of these authors is that, in most cases, assessing and marking
are activities done privately, and most of the time teachers only receive feedback from students, not
from their peers (other teachers), to validate their decisions. From the perspective of the students,
the problem is similar. Most of the time students are assessed only by their teachers and do not receive
feedback from other students.
Race [45] goes even further and considers that one of the principles of assessment is the need for
transparency, not only to students and staff but also to employers. In fact, recent calls for the
accountability of institutions have placed assessment at the centre of the educational process.
Assessment provides important evidence for quality and accreditation, which means that assessment
methods, criteria and results will not be as private as before.
Even though some statements by these authors may be considered a bit strong, they make interesting
points concerning the importance of assessment, which may be summarized as follows:
• Assessment is one of the main drivers for learning
• It has a strong impact on what and how students learn
• It affects students' self-esteem and confidence
• It embeds the values of the teacher and the institution
• It is a compulsory activity to obtain graduation/certification
• It provides evidence for quality and accreditation
Current trends, as Knight, Boud and other authors note [3, 7, 43, 47], treat assessment as an activity
that is part of the teaching and learning (T/L) process and that cannot be separated from it. As Erwin
puts it [47], deciding what to teach and assess is one issue, not two. Assessment should be seen as a
learning activity, centred on the learner.
Brown et al. present an overview of current trends in assessment, focused on practical issues. A
summary of these trends is presented in Table 3. These changes are somewhat related to the
overarching trend of student-centred learning. They focus on promoting the formative function of
assessment, increasing feedback to students and contributing to improved learning. It can be said that
assessment is becoming more personalized and that students are becoming more involved in the
learning process through the use of explicit criteria and learning outcomes, allowing self- and
peer-assessment and facilitating the recognition of prior learning.
Of the trends referred to in Table 3, three are of particular importance to the present work: the change
from objectives to outcomes, from content to competences and from implicit to explicit criteria. As
noted by Brown et al., using learning outcomes is useful to clarify the relationship between course
design and assessment. LOs also play an important role in the recognition of prior learning, since they
separate learning from teaching activities, opening the possibility of assessing what was learned
independently of how it was learned.
Table 3 - Trends in assessment (from → towards). From Brown et al. [7, p.13]
Written examinations → Coursework
Tutor-led assessment → Student-led assessment
Implicit criteria → Explicit criteria
Competition → Collaboration
Product assessment → Process assessment
Objectives → Outcomes
Content → Competencies
Course assessment → Modular assessment
Advanced levels → Assessed prior learning
The change from content to competences is related to the change to outcomes. As described by
Brown et al., competences are clusters of skills that students are able to use in different situations, and
they provide a framework for defining LOs and transferable skills. Finally, the use of explicit criteria in
assessment plays an important role in the pedagogical process. When the assessment criteria or
marking schemes are explicit, they may provide important information for the learning process. If
shared with students, these instruments will give them an indication of what is expected from them. By
defining explicit criteria for an assessment task, one is also defining the links between the task and the
iLOs. Explicit criteria may also have an important role in the reliability of the assessment task, since
they may reduce differences between assessors. As noted by Brown et al. [7], there is considerable
controversy around the efficiency of the use of implicit and explicit instruments. However, the focus
of this work is not on the instruments but on the methods of assessment.
Boud [3] offers a useful description of the evolution of assessment that provides some background
understanding on this matter. In the conventional conception, assessment follows learning and aims at
finding out how much was learned; it is a quantitative perspective. There is no questioning of the link
between the assessment task and learning. This conception is followed by educational measurement,
which follows the same principles but intends to be more rational, more efficient and more reliable; it
includes ideas and concepts from psychometrics. Nowadays we are still influenced by this conception,
as can be seen in the wide use of multiple-choice question exams, a typical instrument of educational
measurement. The latest perspective identified by Boud is competency-based and authentic
assessment. It resulted from concerns about the validity of assessment, focusing on the link between
what was assessed and what students were expected to have learned. Authentic assessment involves the
direct assessment of complex performance and includes methods such as portfolios, open-ended
problems and hands-on lab work. It contrasts with indirect assessment methods like multiple-choice
questions that measure, among other things, indirect indicators of performance [48].
This conception of assessment questions the validity of educational measurement approaches and
promotes performance-based assessment and the importance of learning outcomes. As Boud notes,
what is important is to assess whether students are achieving the iLOs, independently of how they
reached them. Good assessment, as described by Boud, is assessment that is linked to the iLOs and
that promotes learning. Erwin [47] also argues that the first step in educational design is to define the
learning outcomes clearly. This should be done before deciding teaching and assessment strategies,
both at programme and at course level. Another author, Race [45, p. 67], provides ten principles of
assessment and, again, the first ones are related to clearly defining the purpose of assessment and
integrating assessment in the course activities rather than treating it as a separate event. Race stresses
the importance of assessment providing feedback to students, in agreement with the perspective of Boud.
Race and other authors [46] consider that a key question about assessment is knowing what we are
assessing. However, the analysis proposed by these authors is not based on the iLOs but on the
assessment tasks. For each task it should be clear what is being assessed: the content, the process, the
structure, the product, the style, the presentation, etc. The authors recommend that assessment criteria
should be clearly stated and then provided and explained to students. They even go further saying that
these criteria should be negotiated with the students, to share the ownership of assessment and help
them understand the whole assessment process.
Another interesting perspective is provided by Brown et al. [46, p.82] when they suggest that the
outcome of an assessment task should not be merely a grade but a description of what students know
and can do. Again, the link between assessment and LOs is highlighted as being valuable in the
educational process for the future development of students.
Given the impact of assessment on student learning, Boud considers it important to reflect on how
assessment affects learning and on what students learn from assessment. He considers that assessment
gives a message to students about what they should be learning, but the message is not clear and most
likely will not be interpreted the same way by teachers and students. For this reason, assessment will
most likely have unintended consequences for student learning. Students will respond strategically to
assessment based on past experiences, choosing an approach that will lead them to success.
Linn [48, p. 16] shares the same opinion, saying that assessment may have intended and unintended
effects both on learning and on teaching. As an example, both teachers and students may spend more
time teaching or studying concepts that will be explicitly included in the assessment tasks and neglect
those that will not be included. This concept is called consequential validity and is addressed by Linn
and by other authors like Messick [49]. It is related to the backwash effect of assessment on
learning. As Boud explains, the backwash effect is positive if it encourages the intended learning
outcomes and negative when it encourages ways of learning that are not desired, like memorizing
instead of understanding. Linn [48, p. 19] suggests that to understand the real cognitive complexity of
an assessment task one must analyse the task, the familiarity of the student with the task and, ideally,
the process students follow to solve it. An apparently complex task may be addressing lower-level
thinking skills if the student is only recalling previous knowledge about the task.
To summarize, the following ideas are important when defining assessment:
• Assessment should be seen as part of the teaching and learning (T/L) process
• The first step towards good assessment is to define the iLOs of the course or module
• It is important to define clearly the assessment tasks and what is being considered for
assessment
• Assessors have to realize that assessment tasks will have intended and unintended outcomes
• The real cognitive complexity of an assessment task depends not only on the task but on many
other factors, some related to the learner, others to the T/L process.
2.2.1 KEY CONCEPTS ABOUT ASSESSMENT
The term assessment, as analysed by Brown et al. [7] may have different interpretations. The origin of
the term is Medieval Latin meaning “sitting beside”1
to determine tax value. The general meaning of
assessment is to estimate worth, to judge the value. In traditional views of education, as described
above, judging and determining value is the core function of assessment. However, current trends in
education give assessment a new and important role, contributing directly to learning. For the purpose
of this work, we will use the definition of assessment as proposed by Brown et al. [7, p.11] since it is
broad enough to include most assessment tasks:
Any procedure used to estimate student learning for whatever purpose.
Brown et al. [7, p.8] describe what can be considered an assessment cycle. It consists of three essential
steps: taking a sample of what students do, making inferences and estimating the worth of what was
done.
Figure 2 - Assessment cycle as described by Brown et al
The first step, sampling, may include a traditional assessment method like an essay or exam, but may
also include solving problems, carrying out a project, performing a procedure, etc. Samples are
analysed by the assessors, who draw conclusions about what was achieved by the students compared
with what was intended, as described in the LO statements. Finally, the assessor makes an estimate of
the value of what was achieved by attributing marks or grades. This research is mostly concerned with
the first two steps of the cycle. The third step is concerned with marking and grading and falls outside
the scope of the current work.
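Although the cycle is conceptual, a minimal sketch in code may help to make the three steps concrete. The following Python fragment is purely illustrative: the class names, the 0-20 marking rule and the example data are assumptions made for this sketch and are not part of the model described by Brown et al.

from dataclasses import dataclass
from typing import List

@dataclass
class Sample:
    # Step 1 - a piece of evidence of what the student did (essay, problem, project, ...)
    student: str
    evidence: str

@dataclass
class Inference:
    # Step 2 - the assessor's conclusion about which intended LOs the evidence demonstrates
    sample: Sample
    achieved_outcomes: List[str]

def estimate_value(inference: Inference, total_outcomes: int) -> float:
    # Step 3 - turn the inference into a mark (out of scope for this research;
    # shown only to close the cycle, here on an assumed 0-20 scale)
    return 20 * len(inference.achieved_outcomes) / total_outcomes

sample = Sample(student="student-001", evidence="lab report on beam deflection")
inference = Inference(sample, achieved_outcomes=["apply statics", "communicate results"])
print(estimate_value(inference, total_outcomes=4))  # prints 10.0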
Woods [50] defines assessment as a judgement of the degree to which the goals have been achieved,
based on measurable criteria and on pertinent evidence. This definition places a strong emphasis on
the judgemental purpose of assessment. Based on this definition, Woods and other referenced authors
present five assessment principles and six good-practice recommendations, which are summarized in
Table 4.
Concerning the purpose of assessment, Brown et al. [46, p.77] analyse existing motivations for
assessment and identify different reasons to assess. Brown et al. [7, p.11] presented a similar list of
reasons. A synthesis of both lists is presented in Table 5.
1 http://dictionary.reference.com/browse/assess
Table 4 – Principles and practices on assessment (extracted and adapted from Woods [50])
Assessment principles:
• Assessment is a judgement based on performance - not personalities.
• Assessment is a judgement based on evidence - not feelings.
• Assessment should be done for a purpose with clearly-defined performance conditions. The student should know when he/she is being assessed.
• Assessment is a judgement done in the context of published goals, measurable criteria and pertinent, agreed-upon forms of evidence.
• Assessment should be based on multidimensional evidence.
Principles in practice:
• What is being assessed? Have the goals been expressed unambiguously in observable terms? Who creates the goals? Are the goals explicit and published?
• Are there criteria that relate to the goals? Can each criterion be measured? Who creates the criteria? Are the criteria explicit and published?
• What evidence is consistent with the criteria? Do both the assessor and the student know that this form of evidence is acceptable?
• Are the goals and the collection of the evidence possible to achieve in the time and with the resources available?
• What is the purpose of the assessment? Under what conditions is the student’s performance assessed? Who assesses? What type of feedback is given by the assessor?
• Have both the student and the assessor received training in assessment?
Table 5 - Reasons for assessing students
Developmental: To provide feedback to learners and improve learning; To motivate learners and help them focus; To diagnose students' knowledge and skills; To consolidate student learning; To diagnose the learning status of the student
Judgemental: To rank, classify or grade student achievement; To estimate students' value and to allow them to progress in their studies; To select for future courses or employment; To provide license to practice
Quality of teaching: To give feedback on teaching efficiency and improve teaching; To provide data for quality and accreditation processes
As stated by Brown et al., the results of assessment are used mainly for developmental and
judgemental purposes. The developmental purpose is related to improving student learning. The
judgemental purpose is usually concerned with providing a license to proceed to the next level. In
fact, in the model presented by these authors, the purpose of an assessment event is placed on a
continuum between developmental and judgemental and is never purely at one extreme of the
continuum. Or at least it should not be, in the opinion of these authors.
This perspective is supported by other authors like Boud [3], who distinguishes the purposes of
assessment as formative (developmental) and summative (judgemental). The former aims to improve
learning and provides feedback to the student; the latter aims at making decisions and judgements and
is related to grading and marking. Boud considers that it is not advisable to separate the two types of
assessment, since students will most likely be more concerned with the summative tasks that determine
their grades. It is recommended that both aspects of assessment be approached together to be
effective, since all assessment leads to learning.
Brown et al. [46] also consider the formative component of assessment to be of great importance. Students
should receive feedback about their performance and accomplishments at a time that allows them to
be informed and to improve. Again, they consider that there is not a clear separation between these
types of assessment. Even though formative assessment should not contribute to the grade of the
course, this is not what happens, due to time and workload constraints. Coursework assignments will
provide formative feedback to students but will also, in most cases, contribute to the final grade.
Summarizing, in terms of function, assessment tasks can be classified as summative (judgemental)
or formative (developmental), as described above. Additionally, assessment may have a diagnostic
function when it intends to assess prior knowledge. Assessment results may also be included as indicators in
the quality and accreditation processes of the institution.
Figure 3 - Purposes of assessment
Another issue concerning assessment is the source of assessment, the person who performs the role of
assessor. Traditionally, the assessors were the tutors, but current trends in assessment advocate the
involvement of other sources, including the assessed student, peers or employers2. The use of students
as assessors, whether in self- or peer-assessment, has many advantages [7, 46].
From the learning perspective, by assessing their own work or the work of others, students may increase
their evaluating skills and their meta-cognitive knowledge, which are important for lifelong learning and for
professional life. Using self- and peer-assessment also facilitates dealing with time constraints when
facing large numbers of students, for instance. Another benefit of using students as a source of
assessment is to take advantage of their privileged perspective in situations where the tutor will not be
able to make an informed judgement. This is the case, for instance, in group work.
Peer- and self-assessment may include providing feedback to the students or marking students' work.
In the perspective of Brown et al., self- and peer-assessment are more suitable for formative than for
summative assessment. These authors consider that, to use different sources of assessment, it is
necessary to provide training and clear information about the task and the assessment
instrument (marking scheme, grading criteria, etc.).
One of the reasons is that these sources of assessment are not always well accepted by students. As
assessors, students do not like to assess others and take responsibility for their grades. When being
assessed, students prefer receiving feedback or being judged by someone with expertise in the subject
area. Both self- and peer-assessment are present when using reflective practice assessment methods like
portfolios.
2 Brown et al. provide a list of sources of assessment.
Figure 4 - Sources of assessment
Boud [3] identifies two factors that are increasing the pressure on assessment and that are present in
the current challenges of higher education. One is the trend towards larger numbers of students in HE,
which increases the assessment load on teachers and institutions. The other factor identified by Boud is
the pressure to incorporate competence frameworks into existing HE structures and the need to deal
with the accountability of institutions.
Assessment of the transversal learning outcomes or skills included in existing frameworks seems to be a
difficult task. Erwin [47] identifies several of these that should be assessed by faculty, given the
importance they assume in society:
• Ability to commit oneself through reasoned beliefs and actions
• Ability to work cooperatively
• Ability to work independently
• Ability to accept criticism
• Ability to manage personal stress
• Self-discipline
Brown et al. suggest that cuts in resources and modularization, along with larger numbers of students,
increase the pressure on education and assessment. Based on the work of other authors, they suggest
several strategies for dealing with this pressure that are not relevant to the current work.
Figure 5 – Current challenges in assessment: large numbers, transferable skills, available resources, personalized authentic assessment
A final issue concerning assessment is the question of validity and reliability. The discussion about
the validity and reliability of assessment is important because it helps to clarify key issues of the
assessment process. Brown et al. [7, p.234] use an interesting metaphor to explain these two concepts.
If assessment were a watch, it would be reliable if it were precise, if it measured time consistently; but it
would only be valid if it showed the right time. The opposite is also true: a watch may be telling the right
time at a specific moment but be inconsistent, running slower than it should. Also, the observer of
the watch must know how to tell the time using that type of watch. So, an assessment task must be
reliable (or consistent), must be valid (or accurate) and the assessor must know how to use the
instruments (marking schemes, grading criteria) to achieve this.
Reliability may be described as being related to fairness and consistency between different assessors
and within one assessor [7, 26, 46]. Reliability is concerned with different assessors awarding the
same marks or grades to the same assessment, or the same assessor awarding the same marks or grades
at different moments. Reliability is strongly related to fairness and replicability. Marking schemes,
grading criteria and anonymous marking are all assessment instruments that intend to improve reliability.
It is suggested by Brown et al. that even when using explicit criteria and other instruments it is not
easy to achieve the intended reliability. As an example, a detailed marking scheme may be
difficult to implement and, if so, reliability is compromised. The same authors suggest that some
assessment methods are more reliable than others. Reliable methods include MCQs and other methods
with a well-defined solution. In the particular case of MCQs, it is common practice to measure the
internal consistency of the test by analysing the results. Student performance also affects reliability.
The same student may respond to the same task differently at different moments, for many reasons.
Figure 6 summarizes the main issues related to reliability.
Figure 6 - Different dimensions of reliability of assessment
Validity in assessment is concerned with measuring the right thing, matching what is intended to be
measured with what is actually measured. For the purpose of this work, the concept of validity of
assessment is more relevant than reliability. Validity is concerned with the sampling phase of the
assessment cycle, while reliability is closer to inferring and estimating value (see Figure 2).
Authors identify different forms of validity [7, 48, 49], but for the purpose of this work only some will
be explored. Brown et al. [7] describe face validity as being the first impression of the assessment task.
An assessment task should explain in a clear manner the purpose of the assessment and what is
expected from the student. Another form of validity is the consequential validity of assessment,
referred to above. This is related to a broader view of assessment: how it impacts the teaching and
learning processes and the intended and unintended outcomes of an assessment task. The same
authors explain the concept of intrinsic validity of assessment. This means that an assessment task is
measuring the iLOs of the course or module. The authors point out that to achieve this type of validity
the iLOs must be clearly described, in measurable terms. The authors identify one risk associated with
this type of validity: very detailed descriptions of iLOs and of the assessment tasks may be
unmanageable for the assessors, affecting the reliability of the process. This type of validity is the one
with greatest relevance to this research project.
Linn et al. [48] propose eight validity criteria for complex, performance-based assessment. The
intention was to propose a framework to help decide on the adequacy of new forms of assessment. The
criteria are: consequences, fairness, transfer and generalizability, cognitive complexity, content
quality, content coverage, meaningfulness, and cost and efficiency. Consequential validity was already
addressed and is related to the impact, intended and unintended, of assessment on learning and
teaching. Cognitive complexity introduces an interesting perspective for the current work. To truly
understand whether an assessment is promoting the use of specific cognitive processes, it is not enough
to have a detailed analysis of the specifications of the task. It is also necessary to consider the familiarity
of the student with the task and to analyse how the student solves the problem. This perspective is shared
by both Bloom et al. and Anderson et al. [40, 41], who identify the student's background as a problem
when trying to use the taxonomy to classify assessment tasks.
Figure 7 - Types of validity related with current research
Concerning different assessment methods, Brown et al. [7] suggest that the ones related to
remembering knowledge or with well-defined solutions are the ones with higher reliability. But, in their
view, the same is not necessarily true for validity. Given the concepts of reliability and validity explained
above, it is easily understood why Brown et al. [7] state that validity and reliability have conflicting
needs of effectiveness and efficiency. In Erwin's comments on this issue [47], it is suggested that the
validity and reliability of assessment are of particular concern not when the focus of assessment is on the
individual student but when issues of accreditation and the quality of an institution start to come up.
With the evaluation processes of institutions, the number of people interested in the results of
assessment increases and the institution needs to be accountable for the results. Brown et al. show a
different opinion when they state that validity and reliability are crucial features of a fair and effective
assessment system. These opinions are not necessarily conflicting, as we may consider that reliability
is important when analysing results from a broader perspective, when comparing students or groups of
students. This is important not only from the institutional perspective but also to ensure fairness among
different students. Validity is more focused on the learner and the learning process. Still, some aspects of
validity may be approached from a broader perspective, as in accreditation processes.
2.2.2 OVERVIEW OF METHODS OF ASSESSMENT
Brown et al. [7] start by splitting assessment into two main categories: examinations and coursework.
Traditionally, examinations would be written or oral and were typically blind, meaning the student
would only know the questions when the examination started. Nowadays, the line dividing
examinations from coursework is not so clear. Examinations are not necessarily blind: in some cases
students may know the questions in advance or may even take the examination home to solve; in other
cases students are allowed to take their notes into the exam. Traditionally, coursework was made up of
essays, problems and reports of practicals, and served mainly, but not exclusively, formative purposes.
Changes in HE are affecting this category, and coursework now includes a variety of tasks that are
being used for both summative and formative assessment.
Erwin [47, p.53], based on the work of other authors, identified two broad classes of assessment
formats: constructed response and selective response. In the former, students produce something, like a
case study or a report, or must perform a task. In the latter, students are presented with several
possible answers and must recognize/select the correct one.
Brown et al. [46] propose the following list of nineteen assessment methods:
• Activities putting into perspective a topic or issue
• Case studies and simulations
• Critical reviews of articles, viewpoints or opinions
• Critiques
• Dissertations and theses
• Essay plans
• Essays, formal and non-traditional
• Fieldwork, casework and other forms of applied research
• Laboratory reports and notebooks
• Literature searches
• In-tray exercises
• Oral presentations
• Poster exhibitions
• Practical skills and competences
• Projects (individual or group)
• Reviews for specific audiences
• Seen written exams
• Unseen written exams
• Strategic plans
In later work, Brown et al. [7] present a shorter list of assessment methods in a more structured approach,
which was closer to what was intended for the purpose of the current work:
• Essays
• Problems
• Reports on practicals
• Multiple choice questions (MCQ)
• Short answer questions (SAQ)
• Cases and open problems
• Mini-practicals
• Projects, group projects and dissertations
• Orals
• Presentations
• Poster sessions
• Reflective practice assignments
• Single essay exams
After comparing and analysing both proposals, it was possible to group them into similar activities.
The first finding was that the first list does not include the types of questions or exercises that are a
common component of exams (MCQs, problems, SAQs), being mostly centred on coursework.
But for the purpose of the current work it was necessary to include the summative perspective.
Another issue that resulted from the analysis of both lists was the inclusion of orals or poster
exhibitions as methods of assessment. It was considered that these were formats of delivery of other
methods, which could be a report from practical work, a synthesis of an essay or a verbal answer to an
SAQ.
From this analysis of assessment as presented by different authors, it was decided, for the purpose of the
current work, to differentiate between assessment methods and assessment tasks. The main difference is
that the methods are the essence of the assessment tasks and are independent of the context of
implementation, the grading criteria or the media chosen for delivery. The assessment tasks are the
assessment methods in practice, as adopted by the teachers in their courses. Using this differentiating
principle, it was possible to summarize the findings from the literature and compile a list of six general
categories of assessment methods that could be used for answering RQ2. Table 6 presents a summary
of both lists, distributed across the identified general categories; a minimal illustrative sketch of the
method/task distinction is given after the table.
Table 6 - Summary of assessment methods: general categories and the corresponding items from the two lists by Brown et al. ([7] and [46])
• MCQ. From [7]: Multiple choice questions (MCQ); Orals
• SAQ. From [7]: Short answer questions (SAQ); Orals
• Essays (scripts). From [7]: Essays; Single essay exams; Dissertations; Presentations; Poster sessions. From [46]: Dissertations and theses; Essay plans; Essays, formal and non-traditional; Activities putting into perspective a topic or issue; Critical reviews of articles, viewpoints or opinions; Critiques; Literature searches
• Practical work. From [7]: Projects; Group projects; Mini-practicals; Presentations; Poster sessions; Reports on practicals. From [46]: Projects (individual or group); Fieldwork, casework and other forms of applied research; Laboratory reports and notebooks; Practical skills and competences; Oral presentations; Poster exhibitions; In-tray exercises
• Problems. From [7]: Problems; Cases and open problems. From [46]: Case studies and simulations
• Reflective practice. From [7]: Reflective practice assignments. From [46]: In-tray exercises
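To make the distinction between assessment methods and assessment tasks concrete, the following is a minimal sketch assuming a simple object model; the class and field names are hypothetical and are only meant to illustrate the differentiating principle stated above.

from dataclasses import dataclass, field
from typing import List

@dataclass
class AssessmentMethod:
    # The essence of the activity, independent of the context of implementation
    category: str   # one of the six general categories, e.g. "MCQ" or "Practical work"
    name: str       # e.g. "Multiple choice questions"

@dataclass
class AssessmentTask:
    # The method in practice, as adopted by a teacher in a course
    method: AssessmentMethod
    course: str
    delivery_medium: str                      # e.g. "online quiz", "paper exam", "oral"
    grading_criteria: List[str] = field(default_factory=list)

mcq = AssessmentMethod(category="MCQ", name="Multiple choice questions")
task = AssessmentTask(method=mcq, course="Statics I", delivery_medium="online quiz",
                      grading_criteria=["1 mark per correct answer", "no negative marking"])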
Brown et al. [7] suggest that different methods may have different applications. As an example, MCQs
are more suitable for sampling comprehensive knowledge, while essays are better for assessing
understanding, synthesis and evaluation skills. Still, they consider that almost every method could be
used for any purpose, although with some sacrifice of validity. Another interesting statement by these
authors, which is of great importance for the current work, is that the effectiveness or validity of
assessment depends not only on the method but on the specificities of the assessment task. It is
necessary to match the purpose of assessment with the iLOs and the assessment tasks, including the
methods and the instruments. To achieve this match, Brown et al. propose that the course designer or
tutor answer a list of questions related to iLOs, assessment methods and grading schemes/criteria that
will lead to the detailed definition of the assessment task.
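The questions proposed by Brown et al. are not reproduced here, but the kind of matching they lead to can be sketched as a simple alignment check. The mapping below, of iLOs to the methods considered suitable for them, is an invented example and is not the ALOA model or Brown et al.'s list of questions.

# Hypothetical illustration: check whether the tasks planned for a course cover its iLOs.
suitable_methods = {
    "recall fundamental definitions": {"MCQ", "SAQ"},
    "analyse a structural problem": {"Problems", "Practical work"},
    "communicate technical results": {"Essays (scripts)", "Practical work"},
}

planned_tasks = [
    {"method": "MCQ", "description": "weekly online quiz"},
    {"method": "Problems", "description": "mid-term problem set"},
]

def uncovered_outcomes(ilos_to_methods, tasks):
    # Return the iLOs for which no planned task uses a suitable method.
    used = {t["method"] for t in tasks}
    return [ilo for ilo, methods in ilos_to_methods.items() if not (methods & used)]

print(uncovered_outcomes(suitable_methods, planned_tasks))
# ['communicate technical results'] -> this iLO still lacks an aligned assessment task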
Multiple Choice Questions (MCQ) or objective test questions
As defined by Brown et al. [7], an MCQ consists of a question followed by several alternative answers
from which the student has to choose the correct one. This type of question is frequently used in
objective tests. Bull and McKenna [51] define objective tests as the ones where students are required
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
Evaluation of e-Learning Methodologies for Engineering Education
TABLE OF CONTENTS ..... VII
ABSTRACT ..... XI
RESUMO ..... XIII
RESUME ..... XV
LIST OF TERMS AND ACRONYMS IN USE ..... XVII
LIST OF TABLES ..... XIX
LIST OF FIGURES ..... XXI
CHAPTER 1 - INTRODUCTION ..... 1
1.1 Background ..... 1
1.2 Field of Study and Intended Stakeholders ..... 2
1.3 Background and motivation of the researcher ..... 3
1.4 Statement of the problem and research questions ..... 4
1.5 Impact expectations of the study ..... 5
1.6 Structure of the study ..... 6
1.7 Publications related with the research study ..... 6
CHAPTER 2 - RELATED WORK ..... 9
2.1 Introduction ..... 9
2.2 Assessment ..... 11
2.3 Overview on e-assessment ..... 30
2.4 Learning Outcomes ..... 32
2.5 The alignment question ..... 39
2.6 Learning Outcomes in engineering education ..... 41
2.7 Conclusions from the literature review ..... 45
CHAPTER 3 - RESEARCH METHOD ..... 47
3.1 Research design ..... 47
3.2 Development of the conceptual model ..... 48
3.3 Multiple case-studies approach ..... 52
3.4 Data collection ..... 52
3.5 Data analysis ..... 54
3.6 Validity, reliability and limitations ..... 55
3.7 Ethical considerations ..... 56
CHAPTER 4 - THE ALOA CONCEPTUAL MODEL ..... 57
4.1 The rBloom matrix ..... 58
4.2 Assessment methods ..... 67
4.3 E-assessment tasks ..... 86
4.4 The description of LO in EE ..... 92
4.5 Defining relations ..... 107
CHAPTER 5 - IMPLEMENTATION ..... 113
5.1 Implementation scenarios ..... 113
5.2 Practical tools ..... 118
5.3 Case studies ..... 123
CHAPTER 6 - INTERPRETATION OF RESULTS ..... 135
6.1 Application of the ALOA model ..... 135
6.2 Adequacy of the chosen LOs in EE (RQ1) ..... 136
6.3 Adequacy of the selection of assessment methods (RQ2) ..... 137
6.4 Adequacy of the ALOA model to describe assessment ..... 137
6.5 The alignment question (RQ3 and RQ4) ..... 138
CHAPTER 7 - CONCLUSIONS ..... 141
7.1 Conclusions regarding the research problem ..... 141
7.2 Theoretical implications of the research project ..... 143
7.3 Practical implications of the research project ..... 144
7.4 Implications for future research ..... 146
7.5 Final remarks ..... 148
REFERENCES ..... 149
ANNEXES ..... I
Annex I - DATABASE DIAGRAMS ..... III
Annex II - TEMPLATES FOR THE DIFFERENT TYPES OF CASE STUDIES ..... V
Annex III - CASE STUDY 1 ..... VII
Annex IV - CASE STUDY 2 ..... IX
Annex V - CASE STUDY 3 ..... XI
Annex VI - CASE STUDY 4 ..... XIII
ABSTRACT

This study researched the question of alignment between intended learning outcomes and assessment in the field of engineering education. Based on the initial problem, four research questions were developed, focused on: defining and describing learning outcomes in engineering; defining and describing assessment methods and e-assessment tasks; defining a model for achieving alignment between learning outcomes and assessment; and, finally, applying the model to link specific learning outcomes with specific assessment methods.

During this study a conceptual model, ALOA (Aligning Learning Outcomes with Assessment), was developed with the goal of providing an answer to the research problem. The model was derived from the literature and includes different components directly linked to the research questions. The first is a selection and description of learning outcomes in the field of engineering, based on existing qualification frameworks of the sector: ABET and EUR-ACE. The second component is a selection and description of assessment methods. This selection was derived from the literature and compiled into a structured list of six general assessment categories, each divided into specific assessment methods.

Both the learning outcomes and the assessment methods were described using an adapted version of the Taxonomy Table developed by Anderson et al. in their work A Taxonomy for Learning, Teaching and Assessing. The Taxonomy Table describes assessment items and learning outcomes using a two-dimensional classification system based on knowledge and cognitive processes. During the development of the ALOA model, each selected learning outcome and each specific assessment method was analysed and classified in terms of the type of knowledge and the cognitive processes addressed. This detailed description was used to answer the main question of the research problem: by using the same classification system for both types of items, it was possible to develop an alignment proposal.

The ALOA conceptual model was developed with the intention of being used by different stakeholders in Higher Education Institutions. The study defined four scenarios of implementation of the ALOA model: to verify current alignment in existing courses; to develop an assessment strategy based on statements of learning outcomes; to verify vertical alignment of courses with higher-level learning outcomes; and to verify horizontal alignment of learning outcomes defined at the same level but in different contexts, as in mobility situations. The ALOA model was translated into practical tools and tested for applicability using a multiple case study approach. The results of the case studies are mainly concerned with the improvement of the model, by detecting problems and suggesting changes.

The main theoretical findings of the study relate to the clarification of core concepts in terms of assessment methods and assessment tasks. There was also an attempt to structure general information concerning assessment and e-assessment. In terms of the concept of alignment, this study contributes an innovative perspective that includes the clarification of the meaning of alignment and the definition of four criteria for achieving it: match, emphasis, coverage and precision.
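The alignment mechanism summarised above has a natural computational reading: each learning outcome and each assessment method is reduced to a matrix over the knowledge and cognitive-process dimensions, and criteria such as match and precision can then be read off the overlap of the two matrices. The Python sketch below is only an illustration of that reading, assuming binary 4x6 rBloom matrices and two simple overlap ratios; the actual cell classifications and the scoring of the match, emphasis, coverage and precision criteria are those defined in Chapters 4 and 5, and the function names and example matrices here are hypothetical.

# Minimal sketch, not the dissertation's actual scoring: binary rBloom matrices
# (4 knowledge types x 6 cognitive processes) and two simple overlap ratios.

KNOWLEDGE = ["Factual", "Conceptual", "Procedural", "Metacognitive"]
PROCESSES = ["Remember", "Understand", "Apply", "Analyse", "Evaluate", "Create"]

def empty_matrix():
    """Create an all-zero rBloom matrix (rows: knowledge, columns: processes)."""
    return [[0] * len(PROCESSES) for _ in KNOWLEDGE]

def mark(matrix, knowledge, process):
    """Mark one cell, e.g. mark(m, "Procedural", "Apply")."""
    matrix[KNOWLEDGE.index(knowledge)][PROCESSES.index(process)] = 1

def cells(matrix):
    """Return the set of (row, column) positions that are marked."""
    return {(i, j) for i, row in enumerate(matrix) for j, v in enumerate(row) if v}

def match_score(outcome, assessment):
    """Share of the outcome's cells that the assessment method also covers."""
    lo, am = cells(outcome), cells(assessment)
    return len(lo & am) / len(lo) if lo else 0.0

def precision_score(outcome, assessment):
    """Share of the assessment's cells that actually belong to the outcome."""
    lo, am = cells(outcome), cells(assessment)
    return len(lo & am) / len(am) if am else 0.0

# Hypothetical example: a routine problem-solving outcome vs. an MCQ profile.
lo = empty_matrix()
mark(lo, "Conceptual", "Understand")
mark(lo, "Procedural", "Apply")

mcq = empty_matrix()
mark(mcq, "Factual", "Remember")
mark(mcq, "Conceptual", "Understand")

print(match_score(lo, mcq))      # 0.5 -> only part of the outcome is assessed
print(precision_score(lo, mcq))  # 0.5 -> part of the assessment is off-target

A fuller treatment would weight cells instead of marking them as present or absent and would add the emphasis and coverage criteria, but the underlying representation, a small matrix per learning outcome and per assessment method, is the one the ALOA matrices make explicit.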
RESUMO

Este estudo investigou a questão do alinhamento entre os resultados de aprendizagem e a avaliação, na área da educação em engenharia. A partir do problema inicial foram definidas quatro questões a investigar: como definir e descrever os resultados de aprendizagem em engenharia; como selecionar e descrever métodos de avaliação e métodos de e-assessment; como definir um modelo que permita atingir o alinhamento entre os resultados de aprendizagem e os métodos de avaliação; como aplicar o modelo de forma a obter a ligação entre certos tipos de resultados de aprendizagem e métodos de avaliação específicos.

Durante este estudo foi desenvolvido um modelo conceptual que tem como objectivo dar resposta ao problema inicial, o modelo ALOA (Aligning Learning Outcomes with Assessment). O modelo foi obtido a partir da investigação da literatura e inclui componentes que abordam cada uma das questões acima mencionadas. A primeira componente é uma seleção e descrição dos resultados de aprendizagem na área da engenharia, obtida a partir de quadros de qualificações existentes no sector: ABET e EUR-ACE. A segunda componente é uma seleção e descrição de métodos de avaliação, obtida a partir da análise de obras de referência na área. A seleção dos métodos de avaliação foi organizada numa lista com seis categorias gerais, em que cada uma inclui vários métodos específicos.

Tanto os resultados de aprendizagem como os métodos de avaliação foram descritos recorrendo a uma versão adaptada da Tabela Taxonómica desenvolvida por Anderson et al. na obra A Taxonomy for Learning, Teaching and Assessing. A Tabela Taxonómica define um sistema de classificação com duas dimensões: tipo de conhecimento e processos cognitivos. Durante o desenvolvimento do modelo ALOA, cada resultado de aprendizagem e cada método de avaliação foi analisado e classificado em termos de cada uma destas dimensões. Esta descrição detalhada dos itens foi utilizada para responder à questão principal deste projeto de investigação, que era o alinhamento entre os resultados de aprendizagem e a avaliação. Ao utilizar o mesmo sistema de classificação para os dois tipos de itens, foi possível desenvolver uma proposta de alinhamento.

O modelo conceptual ALOA foi desenvolvido com o objectivo de ser usado por diferentes tipos de utilizadores em instituições de ensino superior. O estudo definiu quatro cenários de implementação para o modelo: verificação do alinhamento em unidades curriculares existentes; definição de uma nova estratégia de avaliação tendo como base os resultados de aprendizagem pretendidos; verificação do alinhamento vertical de uma unidade curricular por comparação com resultados de aprendizagem definidos a um nível superior; verificação do alinhamento horizontal de duas unidades curriculares definidas ao mesmo nível mas em contextos diferentes, como ocorre em situações de mobilidade. O modelo ALOA foi traduzido em ferramentas práticas e a sua aplicabilidade foi testada recorrendo a múltiplos casos de estudo.

As principais conclusões teóricas deste estudo estão relacionadas com a clarificação dos conceitos de métodos de avaliação e de práticas de avaliação. Houve também uma tentativa, da qual resultou uma proposta, de estruturação de informação relativa à avaliação e ao e-assessment. Em relação ao conceito de alinhamento, este estudo contribui com uma perspectiva inovadora que inclui a clarificação do significado do termo alinhamento e a definição de quatro critérios para que se possa atingir o alinhamento: coincidência, ênfase, cobertura e precisão.
RESUME

Cette étude a examiné la question de l'alignement entre les acquis d'apprentissage et l'évaluation dans le domaine de la formation des ingénieurs. À partir du problème initial, quatre questions de recherche ont été définies : comment définir et décrire les résultats d'apprentissage en ingénierie ; comment choisir et décrire les méthodes d'évaluation et les méthodes d'évaluation électronique ; comment définir un modèle qui permette d'atteindre l'alignement entre les résultats d'apprentissage et les méthodes d'évaluation ; comment appliquer le modèle de façon à obtenir la liaison entre certains types de résultats d'apprentissage et des méthodes d'évaluation spécifiques.

Au cours de cette étude, on a développé un modèle conceptuel qui vise à répondre au problème de départ, le modèle ALOA (Aligning Learning Outcomes with Assessment). Le modèle a été dérivé de la littérature scientifique et comprend des composantes qui répondent à chacune des questions mentionnées ci-dessus. La première composante est une sélection et une description des résultats d'apprentissage en ingénierie, obtenue à partir des cadres de qualifications du secteur : ABET et EUR-ACE. La deuxième composante est une sélection et une description des méthodes d'évaluation, obtenues à partir de l'analyse des ouvrages de référence dans le domaine. Le choix des méthodes d'évaluation a été organisé dans une liste de six grandes catégories, chacune regroupant plusieurs méthodes spécifiques.

Les résultats d'apprentissage et les méthodes d'évaluation ont été décrits à l'aide d'une version adaptée de la taxonomie développée par Anderson et al. dans l'ouvrage A Taxonomy for Learning, Teaching and Assessing. Cette taxonomie définit un système de classification à deux dimensions : type de connaissance et processus cognitif. Au cours du développement du modèle ALOA, chaque résultat d'apprentissage et chaque méthode d'évaluation ont été classés selon chacune de ces dimensions. Cette description détaillée des éléments a été utilisée pour répondre à la question principale de ce projet de recherche, à savoir l'alignement entre les résultats d'apprentissage et l'évaluation. En utilisant le même système de classification pour les deux types d'éléments, il a été possible de développer une proposition d'alignement.

Le modèle conceptuel ALOA a été élaboré pour être utilisé par différents utilisateurs dans les établissements d'enseignement supérieur. L'étude a défini quatre scénarios de déploiement pour le modèle : vérification de l'alignement dans les cours existants ; définition d'une nouvelle stratégie d'évaluation basée sur les résultats d'apprentissage escomptés ; vérification de l'alignement vertical d'une unité d'enseignement par rapport aux résultats d'apprentissage définis à un niveau supérieur ; vérification de l'alignement horizontal de deux cours définis au même niveau, mais dans des contextes différents, comme en situation de mobilité. Le modèle ALOA a été traduit en outils pratiques et son applicabilité a été testée à l'aide de plusieurs études de cas.

Les principales conclusions théoriques de cette étude sont liées à la clarification des concepts de méthodes et de pratiques d'évaluation. Une proposition de structuration de l'information sur l'évaluation et l'e-évaluation a également été élaborée. En ce qui concerne le concept d'alignement, cette étude fournit une nouvelle perspective qui inclut la clarification de la signification du terme et la définition de quatre critères permettant de parvenir à l'alignement : la coïncidence, l'accent, la couverture et la précision.
LIST OF TERMS AND ACRONYMS IN USE

ABET      Organization that accredits college and university programs in the fields of applied sciences and engineering. Formerly known as Accreditation Board for Engineering and Technology
ALOA      Model for Aligning Learning Outcomes with e-Assessment
AT        Assessment task
CDIO      Conceiving - Designing - Implementing - Operating real-world systems and products
CE        Continuing education
CEI       Continuing education institutions
EE        Engineering education
EUR-ACE   European quality label for engineering degree programmes
HE        Higher education
HEI       Higher education institutions
ICT       Information and communication technologies
iLO       Intended learning outcome
LO        Learning outcome
LT        Learning technologies
MCQ       Multiple choice question
PL        Prior learning
rBloom    Adapted version of the revised Bloom's taxonomy
RPL       Recognition of prior learning
RQ        Research question
SAQ       Short answer question
TLA       Teaching and Learning activities
TT        Taxonomy Table
LIST OF TABLES

TABLE 1 - EXAMPLES OF SEARCH EXPRESSIONS USED ..... 10
TABLE 2 - TYPES OF SOURCES USED AND EXAMPLES ..... 11
TABLE 3 - TRENDS IN ASSESSMENT. FROM BROWN ET AL. [7, P.13] ..... 13
TABLE 4 - PRINCIPLES AND PRACTICES ON ASSESSMENT (EXTRACTED AND ADAPTED FROM WOODS [50]) ..... 16
TABLE 5 - REASONS FOR ASSESSING STUDENTS ..... 16
TABLE 6 - SUMMARY OF ASSESSMENT METHODS ..... 22
TABLE 7 - FAMILIES OF ESSAY QUESTIONS ..... 24
TABLE 8 - GENERAL CRITERIA FOR ASSESSING ESSAYS (ADAPTED FROM BROWN ET AL. [7]) ..... 25
TABLE 9 - PROBLEM SOLVING STAGES AND COGNITIVE PROCESSES (ADAPTED FROM WOODS [55, P.448]) ..... 26
TABLE 10 - SUMMARY OF PROBLEM-SOLVING AS VIEWED BY PLANTS ET AL. [56] ..... 26
TABLE 11 - SUMMARY OF PRACTICAL ENQUIRY AS VIEWED BY HERRON ..... 27
TABLE 12 - ASSESSING EXPERIMENTAL PROJECT (ADAPTED FROM BROWN ET AL. [7, P.128-9]) ..... 28
TABLE 13 - BRIEF ANALYSIS OF CRISP'S E-ASSESSMENT ITEMS ..... 32
TABLE 14 - DIFFERENCES BETWEEN SPECIFICITY LEVELS OF OBJECTIVES (FROM ANDERSON ET AL. [40, P.17]) ..... 35
TABLE 15 - SUMMARY OF BLOOM'S TAXONOMY [41] ..... 36
TABLE 16 - SUMMARY OF THE KNOWLEDGE DIMENSION OF RBLOOM ..... 37
TABLE 17 - SUMMARY OF THE COGNITIVE DIMENSION ..... 38
TABLE 18 - AN EXAMPLE OF A TAXONOMY TABLE USED TO VERIFY ALIGNMENT ..... 40
TABLE 19 - COMPARISON BETWEEN QUALIFICATIONS IN EE ..... 44
TABLE 20 - SUMMARY OF DATA COLLECTION ..... 53
TABLE 21 - SUMMARY OF THE ANALYSIS OF DATA ..... 54
TABLE 22 - THE TAXONOMY TABLE BY ANDERSON ET AL. ..... 59
TABLE 23 - FACTUAL KNOWLEDGE (EXTRACTED FROM ANDERSON ET AL. [40, P. 29]) ..... 60
TABLE 24 - CONCEPTUAL KNOWLEDGE (EXTRACTED FROM ANDERSON ET AL. [40, P. 29]) ..... 61
TABLE 25 - PROCEDURAL KNOWLEDGE (EXTRACTED FROM ANDERSON ET AL. [40, P. 29]) ..... 61
TABLE 26 - METACOGNITIVE KNOWLEDGE (EXTRACTED FROM ANDERSON ET AL. [40, P. 29]) ..... 62
TABLE 27 - REMEMBER CATEGORY (ADAPTED FROM ANDERSON ET AL. [40, P. 67/8]) ..... 63
TABLE 28 - UNDERSTAND CATEGORY (ADAPTED FROM ANDERSON ET AL. [40, P. 67/8]) ..... 63
TABLE 29 - APPLY CATEGORY (ADAPTED FROM ANDERSON ET AL. [40, P. 67/8]) ..... 64
TABLE 30 - ANALYSE CATEGORY (ADAPTED FROM ANDERSON ET AL. [40, P. 67/8]) ..... 65
TABLE 31 - EVALUATE CATEGORY (ADAPTED FROM ANDERSON ET AL. [40, P. 67/8]) ..... 65
TABLE 32 - CREATE CATEGORY (ADAPTED FROM ANDERSON ET AL. [40, P. 67/68]) ..... 66
TABLE 33 - RBLOOM MATRIX, AN ADAPTED VERSION OF THE TAXONOMY TABLE BY ANDERSON ET AL. ..... 67
TABLE 34 - MAPPING OF ASSESSMENT BASED ON MCQ TO BLOOM'S REVISED TAXONOMY ..... 69
TABLE 35 - MAPPING OF ASSESSMENT BASED ON ESSAYS TO BLOOM'S REVISED TAXONOMY ..... 73
TABLE 36 - MAPPING OF SIMPLE CLOSED-ENDED PROBLEM SOLVING TO BLOOM'S REVISED TAXONOMY (DIAGNOSIS + ROUTINE ACTIVITIES) ..... 76
TABLE 37 - MAPPING OF COMPLEX CLOSED-ENDED PROBLEM SOLVING TO BLOOM'S REVISED TAXONOMY ..... 76
TABLE 38 - MAPPING OF OPEN ENDED PROBLEM SOLVING TO BLOOM'S REVISED TAXONOMY ..... 77
TABLE 39 - MAPPING OF PRACTICAL WORK TO BLOOM'S REVISED TAXONOMY ..... 79
TABLE 40 - MAPPING OF ASSESSMENT USING SHORT-ANSWER QUESTIONS TO BLOOM'S REVISED TAXONOMY ..... 83
TABLE 41 - MAPPING OF REFLECTIVE PRACTICE ASSESSMENT TO BLOOM'S REVISED TAXONOMY ..... 85
TABLE 42 - COMPLETE MAPPING OF ABET PROGRAMME OUTCOMES TO BLOOM'S REVISED TAXONOMY ..... 100
TABLE 43 - COMPLETE MAPPING OF EUR-ACE PROGRAMME OUTCOMES TO BLOOM'S REVISED TAXONOMY ..... 107
TABLE 44 - MAPPING OF THE ABILITY TO USE NEWTON'S FIRST LAW IN ROUTINE PROBLEM SOLVING ..... 109
TABLE 45 - STAKEHOLDERS OF DIFFERENT SCENARIOS FOR ALIGNMENT ..... 118
TABLE 46 - NUMBERING OF THE CELLS TO TRANSFORM TWO DIMENSIONS INTO ONE ..... 120
TABLE 47 - LINK BETWEEN ILOS AND THE LOS OF ABET ..... 126
TABLE 48 - LINK BETWEEN ILOS AND THE LOS OF EUR-ACE ..... 126
TABLE 49 - SUMMARY OF METHODS OF ASSESSMENT USED IN THE CASE STUDY ..... 127
TABLE 50 - OVERLAPPING OF RBLOOM FOR SIMPLE PROBLEM SOLVING WITH ITEM CS.02.A03.14A ..... 127
TABLE 51 - OVERLAPPING OF RBLOOM FOR OPEN PROBLEM SOLVING WITH ITEM CS.02.A03.14B ..... 127
TABLE 52 - OVERLAPPING OF RBLOOM FOR PROJECT WITH ITEM CS.02.A02 ..... 128
TABLE 53 - ALIGNMENT OF LO1 WITH REAL ASSESSMENT ..... 129
TABLE 54 - ALIGNMENT OF LO2 WITH REAL ASSESSMENT ..... 129
TABLE 55 - ALIGNMENT OF LO3 WITH REAL ASSESSMENT ..... 129
TABLE 56 - ALIGNMENT OF LO4 WITH REAL ASSESSMENT ..... 130
TABLE 57 - ALIGNMENT OF LO1 WITH STANDARD ASSESSMENT ..... 130
TABLE 58 - ALIGNMENT OF LO2 WITH STANDARD ASSESSMENT ..... 130
TABLE 59 - ALIGNMENT OF LO3 WITH STANDARD ASSESSMENT ..... 130
TABLE 60 - ALIGNMENT OF LO4 WITH STANDARD ASSESSMENT ..... 131
TABLE 61 - MATCHING SCORES OBTAINED FOR EACH ILO WHEN COMPARED WITH STANDARD ASSESSMENT METHODS ..... 131
TABLE 62 - OVERLAPPING OF STANDARD ALOA SIMPLE PROBLEM SOLVING MATRIX WITH REAL ASSESSMENT FROM CASE-STUDY ..... 132
TABLE 63 - OVERLAPPING OF STANDARD ALOA OPEN PROBLEM SOLVING MATRIX WITH REAL ASSESSMENT FROM CASE-STUDY ..... 133
TABLE 64 - OVERLAPPING OF STANDARD ALOA PROJECT MATRIX WITH REAL ASSESSMENT FROM CASE-STUDY ..... 133
TABLE 65 - RELATION BETWEEN THE PROBLEM AND THE ALOA MODEL ..... 142
LIST OF FIGURES

FIGURE 1 - FIELD OF STUDY ..... 2
FIGURE 2 - ASSESSMENT CYCLE AS DESCRIBED BY BROWN ET AL. ..... 15
FIGURE 3 - PURPOSES OF ASSESSMENT ..... 17
FIGURE 4 - SOURCES OF ASSESSMENT ..... 18
FIGURE 5 - CURRENT CHALLENGES IN ASSESSMENT ..... 18
FIGURE 6 - DIFFERENT DIMENSIONS OF RELIABILITY OF ASSESSMENT ..... 19
FIGURE 7 - TYPES OF VALIDITY RELATED WITH CURRENT RESEARCH ..... 20
FIGURE 8 - THE LEARNING CYCLE AS VIEWED BY KOLB ..... 29
FIGURE 9 - GROUPS OF ACTIVITIES RELATED WITH E-ASSESSMENT IMPLEMENTATION [26] ..... 30
FIGURE 10 - THE SAME LO SHOULD BE PRESENT IN ALL THE ACTIVITIES ..... 41
FIGURE 11 - RELATION BETWEEN THE CONCEPTUAL MODEL AND THE RESEARCH QUESTIONS ..... 49
FIGURE 12 - INITIAL REPRESENTATION OF THE PROBLEM ..... 49
FIGURE 13 - SECOND VERSION OF THE MODEL ..... 50
FIGURE 14 - THIRD VERSION OF THE MODEL ..... 51
FIGURE 15 - FOURTH AND FINAL VERSION OF THE MODEL ..... 52
FIGURE 16 - THE ALOA CONCEPTUAL MODEL ..... 57
FIGURE 17 - ALIGNMENT POSSIBILITIES FOR ONE UNIT OR COURSE ..... 107
CHAPTER 1 - INTRODUCTION

1.1 BACKGROUND

Since the second half of the 20th century, the world has been experiencing a rapid transformation in the field of Education, led by the changing Knowledge Society [1]. As Peter Drucker explained in 1996, in this new society access to work is gained only through formal education, not through apprenticeship. Almost two decades have passed and this is already happening in some parts of the world. Education and schooling have become a major concern for society and a priority in national and transnational policies. The strategic document Europe 2020 [2] defines the development of an economy based on knowledge and innovation as a priority, and all educational institutions play an important role in achieving this goal.

Higher Education (HE), Continuing Education (CE) and Vocational Training have been most affected by this transformation, adapting to the labour market's demand for new skills while responding to the needs of an increasing number of students [3]. The global economy has created both the opportunity and the need for the mobility of students and workers, demanding efficient recognition of qualifications and increasing competitiveness in this field. The labour market demands more qualified and up-to-date workers, a trend mirrored in European educational policies [4-6]. All of this generates pressure towards a quality-based approach for all education providers, as Drucker predicted in his article about the knowledge society [1].

One visible effect of this transformation is the shift from a content-based approach in Education to an approach centred on the student and on what he or she has learned and achieved. This transformation has led to different trends in HE, as identified by Brown et al. [7]. One of these trends is the change from structuring education around what should be taught, by defining educational objectives, to structuring it around what students should learn, the Learning Outcomes (LOs). This approach underpins the development and implementation of most European Education policies at international and national levels [8-10]. In Europe, HEIs and CEIs are redefining programmes in terms of Learning Outcomes, harmonizing them with national, international and sector-level qualification frameworks that are also based on Learning Outcomes [11-14]. Several projects and initiatives are working towards the definition of LOs, specific and transversal, that can be used as a common reference [15, 16]. Learning Outcomes are also becoming fundamental for structuring the standards and guidelines for the quality assessment of HE and CE institutions in Europe and worldwide [17-20].

The field of engineering education has also been affected by these trends: learning outcomes are being used in qualification frameworks specific to engineering, such as ABET, EUR-ACE and CDIO. In this context, the assessment of Learning Outcomes becomes a crucial process for the educational system. It should be a major concern of educational institutions to ensure that the assessment of student learning is guided by what students should be learning, i.e. that assessment is aligned with the intended learning outcomes (iLOs).

Another major revolution in our society has been the introduction of Information and Communication Technologies (ICT). The use of ICT applied to Education, e-learning, has been increasing; it creates new opportunities for teaching, learning and assessment and has huge potential as an answer to some of the current challenges of Education. The shift to digital media affects the availability, reusability, accessibility and cost of learning resources, complemented by the communication and networking potential of the Internet, which takes Education to a global level [21] [22, 23]. The application of ICT to education, and in particular to assessment, is a subject of great discussion. Some of the issues raised by the use of e-learning in assessment concern the validity and reliability of the process. This study focuses on this particular issue.

1.2 FIELD OF STUDY AND INTENDED STAKEHOLDERS

The current study looks into the relationship between learning and assessment. The focus of the study is placed at the intersection of three areas, as illustrated by Figure 1: learning outcomes, assessment of students' learning and e-learning. It focuses on Higher Education and, more specifically, on Engineering Education. The question of alignment of e-assessment and education is represented by the area where the three circles overlap.

Figure 1 - Field of study (overlapping areas: Learning Outcomes, e-Learning, Assessment)

The wide use of e-learning is already impacting Education, promoting change and innovation in different aspects, including pedagogy, technology, organization, accessibility and flexibility, among others [24]. It is a complex and multidisciplinary area and, given its impact, it is important that e-learning research be informed by evidence [23]. Current literature reviews in this area indicate that e-learning approaches to assessment lack a pedagogical framework and that most research describes
  • 25. Evaluation of the Application of e-Learning Methodologies to the Education of Engineering 3 implementation studies at course level [25]. The present research intends to contribute to establishing a conceptual model for the implementation of e-assessment in Engineering Education. Assessment is a crucial process of Education and is seen by current trends as part of the learning process and not as a separate event [7]. Assessment of student learning will encourage involvement of students and provides feed-back to students and teachers [26]. It has an important role in validation and certification and is deeply related with quality issues as will be explained below. The current study will be focusing on the specific link between what should be learned and what should be assessed. The purpose of this study is to research the applicability of e-assessment strategies to the field of engineering education. It intends to help in the definition of adequate e-assessment strategies based on what is the stated intended learning. This research study mainly applies to engineering courses that use or intend to use e-assessment strategies. The implementation stage addressed existing courses of Faculdade de Engenharia da Universidade do Porto (FEUP). It focused on the stated learning outcomes of the courses and on the existing assessment and e-assessment strategies as important components of the alignment question. As the study will focus at course level, it will be strongly related with the activities of faculty that are usually the ones that define the iLOs and the assessment tasks. However, the results of the study have impact on other levels, including programme and qualification frameworks. It is expected that other stakeholders including curriculum designers, mobility staff, and quality and accreditation staff will be using the results of this study. In terms of students, they are not considered direct users of the study and tools. Still, the project is based on the learner centred perspective of education and they will benefit from a more efficient and effective educational process. 1.3 BACKGROUND AND MOTIVATION OF THE RESEARCHER Since 1998 I have been working in the field of ICT applied to education. I started working as an editor of multimedia projects in an educational publishing company. Later in 2001, I continued the work in this field as responsible for the e-learning unit of Universidade do Porto. This position gave me the opportunity to contribute to the implementation of an institutional strategy for e-learning. At the same time, it allowed me to build a transversal perspective of the implementation of learning technologies at the institutional course level, including different learning management systems (LMS) and other learning technologies (LTs). This role included giving individual support to teachers in terms of pedagogical and technological issues. It was possible to learn from the different strategies used by the teachers in terms of teaching and assessment and to perceive the perspective of the students. Additionally it included the active participation in European research projects in the field of e-learning which was important to learn and update knowledge and skills in this area, During this period I was a teacher in traditional courses at the university. As I developed blended learning strategies to work with the students I could better understand the perspective of the teachers and learners when dealing with e-learning. 
I was involved in the development, teaching and tutoring of distance e-learning courses, which contributed to improving my knowledge in this area. Finally, I was a student in a Master's programme of the School of Engineering that included an e-learning course with e-assessment tasks. This gave me the perception of the student view of e-learning. This summary of activities shows that the motivation for the field of study comes from both the academic and the professional areas. When dealing with e-learning strategies from each of these perspectives, assessment has always been a component that raises questions and problems. The flexibility of the e-learning environment creates opportunities for experimenting and many interesting
  • 26. Evaluation of the Application of e-Learning Methodologies to the Education of Engineering 4 and creative examples of e-assessment tasks can be found at Universidade do Porto. A recent trend at the institution showed that teachers are using e-learning based assessment for summative purposes. Most of the e-assessment implementations were online exams but the teachers successfully developed other types of tasks [27-29]. All of these represent a very interesting field of research. The application to the field of engineering comes basically from the same background. From the researcher perspective, there are two subject areas that are quite innovative in the use of Learning Technologies: medicine and engineering. Also, both areas have strong civil responsibility and the need to have regulations from professional bodies. Both aspects have impact on the mobility of the students and workers. The affiliation to the School of Engineering as a student and a former invited teacher both contributed to my decision of the scope of application of the research. Finally, I worked closely in the creation and development of the European project VIRQUAL that was related with field of qualification frameworks, learning outcomes and assessment in the context of international mobility [30-33] which developed an interest in the area of transnational Education policies, Qualification Frameworks and Quality Assurance in HE, a field that has gained an increased importance in Higher Education. The combination of these different areas of interest contributed to the definition of the problem and to the research approach that came after the initial research. 1.4 STATEMENT OF THE PROBLEM AND RESEARCH QUESTIONS The research problem is placed in the area of interaction of three fields of study: Assessment, Learning Outcomes and e-Learning. The scope of application will be engineering education. Engineering education has specificities and requisites [34] that may constitute challenges in the implementation of assessment based in Learning Technologies (LT). This study intends to contribute to developing e- learning strategies for assessing EE, helping to deal with current challenges in education. In general terms, the approach chosen was to develop a model that matches specific e-assessment methods to specific LOs in the field of engineering. This means that it might be possible for teachers to define the intended Learning Outcomes (iLO) of an online course and from this definition to have an indication of the e-assessment methods they might consider using. Formally, this problem is defined as: To what extent may e-assessment methods be used to measure the achievement of Learning Outcomes in engineering education? Given this problem, it was necessary to recognize that there are a wide variety of engineering schools, engineering programmes and engineering courses. There are different qualification frameworks that use LOs in the engineering sector. So, the first challenge was how to select the LOs that were going to be used for the purpose of the research. The selection process had to take in consideration the subject area, level and even the nature of the LOs. The same problem existed in relation to the assessment methods. Assessment tasks are usually defined at course level, even though some examples can be found at a higher level. Again, there are a considerable variety of assessment and e-assessment tasks that could be used in this study. 
Another aspect of assessment is that in most cases it is deeply embedded in the structure of the course or unit and highly contextualized.
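To make the intended matching concrete, the sketch below illustrates, in Python, the kind of lookup implied by the approach described above: stating the iLO categories of a course yields an indication of candidate e-assessment methods. It is purely illustrative; both the outcome categories and the suggested methods are hypothetical placeholders and not the conceptual model developed later in this dissertation.

```python
# Purely illustrative sketch of "stated iLO categories -> candidate e-assessment
# methods". Categories and methods below are hypothetical placeholders, not the
# mappings proposed in this work.

CANDIDATE_METHODS = {
    "remember/understand": ["online MCQ test", "short answer questions"],
    "apply/analyse": ["online problem sets", "case studies with e-submission"],
    "evaluate/create": ["e-portfolio", "project report submitted online"],
}

def suggest_methods(ilo_categories):
    """Return a de-duplicated list of candidate e-assessment methods for the
    stated iLO categories, preserving the order in which they first appear."""
    suggestions = []
    for category in ilo_categories:
        for method in CANDIDATE_METHODS.get(category, []):
            if method not in suggestions:
                suggestions.append(method)
    return suggestions

# Example: a course whose stated outcomes emphasise application and creation
print(suggest_methods(["apply/analyse", "evaluate/create"]))
```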
The first two research questions translate this need for a definition of the central concepts of the problem:

RQ1) Which types of Learning Outcomes in the field of Engineering are relevant and should be considered?

RQ2) What are the e-assessment methods that should be considered?

After defining the concepts, it is necessary to deal with the core of the problem, which is the relationship between assessment and LOs. This link between the iLOs and the assessment tasks is part of what is considered the alignment of a course [35, 36]. The first concern with alignment is whether every iLO is assessable. It is a recognized issue in HE that curricula need to include transversal skills and that these are difficult to assess. In engineering, curricula are pressured to include these types of LOs among the technical LOs specific to the sector. This is a known challenge in EE [34] that may also represent a challenge to effective assessment. The last two research questions of this study are related with the alignment question:

RQ3) What type of intended Learning Outcomes can be measured by e-assessment methods?

RQ4) Is it possible to propose specific e-assessment strategies for each type of LO in EE?

1.5 IMPACT EXPECTATIONS OF THE STUDY

Assessment is an important issue in HE that has a high impact on learning [3, 7, 37, 38]. It raises questions about the efficiency, effectiveness and adequacy of different methods and strategies. E-assessment, due to its technological component and its association with distance learning, brings additional controversies, some of them intrinsic to e-learning. As described by Conole and Martin, e-learning is a complex area that may have an impact on a diversity of fields, including the teaching and learning process (T/L), organizational structures, and political and socio-cultural issues, among others [22]. It is expected that the findings of this research project will:
• Help teachers to decide which assessment tasks are more suitable for the LOs stated for a specific course or module. The ALOA model developed includes practical tools that will support the decision process.
• Help teachers and other stakeholders to verify the current alignment of existing courses. This is one of the scenarios of application of the ALOA model.
• Contribute to developing a pedagogical framework for the implementation of aligned e-assessment strategies, based on the definition of Learning Outcomes. The ALOA model was developed as a conceptual and theoretical model that intends to promote the alignment between iLOs and assessment.
• Facilitate accreditation processes and navigation between different qualification frameworks, by providing a common tool for the description of LOs. This was defined in the current study as vertical alignment and is one of the implementation scenarios of the ALOA model.
• Promote mobility and recognition of prior learning, formal or informal, by allowing the comparison of the LOs of previous experiences with the intended ones.
1.6 STRUCTURE OF THE STUDY

This dissertation is divided into seven chapters. This chapter defines the context of the current research study and introduces the main thematic areas that will be explored during the development of the work. After providing a rationale for the work, the chapter presents the background of the researcher and the motivations for conducting this study. Next, the research problem is described, as well as the research questions that guided the project. Finally, the chapter addresses the expected impact of the research and includes a description of the structure of this dissertation, with a brief explanation of the contents of each chapter.

Chapter 2 provides a theoretical background for the problem and a critical analysis of the published work in the field. From the analysis of theory and published work it was possible to define the approach that was followed to address the problem. The ALOA conceptual model was derived from the literature research, so this chapter grounds most of the decisions that were taken during the development of the work.

Chapter 3 describes the different components of the research method, including the theoretical development of the conceptual model and the multiple case studies approach used in the implementation stage. The chapter also explains how the research project dealt with validity, reliability and ethical issues.

In Chapter 4 the conceptual model is described in detail. It includes the description of the main concepts of the model: LOs in engineering and assessment methods. The chapter also includes a definition of the relationships between the concepts that are included in the ALOA model.

Chapter 5 describes the implementation of the model. It starts by identifying and describing some implementation scenarios and the practical tools that were developed. Finally, the chapter describes the implementation of the ALOA model in the case studies that were used to test the applicability of the model.

In Chapter 6 the results of the applicability test are interpreted from the perspective of each case and from a transversal point of view. In this chapter, the results of the implementation stage are analysed from the perspective of the research questions.

Chapter 7 is the final chapter of the dissertation. It starts by presenting the global conclusions of the work from the perspective of the research problem. Then, the results are analysed in terms of their contributions to theory and practice. Finally, it explores future work that could be developed based on the current project.

1.7 PUBLICATIONS RELATED WITH THE RESEARCH STUDY

• "E-assessment of Learning Outcomes in Engineering Education", Falcão, R., Proceedings of WEEF 2012, Buenos Aires, Argentina, October 2012 (accepted)
• "A Conceptual Model for e-Assessing Student Learning in Engineering Education", Falcão, R., Proceedings of ICL2012, Villach, Switzerland, September 2012 (accepted)
• "A Conceptual Model for e-Assessing Student Learning in Engineering Education", Falcão, R., SITE 2012, Austin, Texas, USA, March 2012
• "Assessing Learning Outcomes in Engineering Education: An e-Learning Based Approach", WEE 2011, Lisboa, Portugal, September 2011
• "Assessment of Learning Outcomes through e-Learning", WCCEE 2010, Singapore, October 2010
• "Measuring Impact of e-Learning on Quality and Learning Outcomes – A Proposal for a PhD Project", Falcão, R., Soeiro, A., EDEN 2007, June 2007
CHAPTER 2 - RELATED WORK

2.1 INTRODUCTION

The field of study of this dissertation is defined by the intersection of three areas: assessment, learning outcomes and e-learning. Given this field, the research focused on the area of Higher Education and, in particular, Engineering Education. Given the general field of the research, what was the literature review expected to provide? The literature research was guided by the analysis of the problem and of the research questions: to what extent may e-assessment methods be used to measure the achievement of intended Learning Outcomes in engineering education?

The problem is composed of two dimensions and intends to explore a specific relation between them. The dimensions "Learning Outcomes in Engineering Education" and "e-assessment methods" are the focus of the first two research questions.

RQ1) Which Learning Outcomes in the field of Engineering are relevant and should be considered?

RQ2) Which are the online assessment methods that should be considered?

What is intended with these two questions is a clarification of the dimensions that will be analysed, so that it is possible to obtain a valid, or at least an informed, selection process. Literature research was fundamental to analyse this aspect of the problem.

The first dimension to be analysed was the LOs in EE. Starting with the general approach, it was necessary to clarify certain aspects related with LOs, including terminology, definitions and the theory behind them. Concerning EE, the review examined the role of LOs in EE, how they were implemented and whether there is a trend towards a common set of LOs in engineering courses.

Concerning the second dimension, e-assessment, it was possible to find a wide variety of published work with a corresponding variety of assessment methods. However, most of the published work was
based on individual cases of application. It was difficult to find fundamentals of e-assessment or a comprehensive study of e-assessment methods. Early in the literature research on this subject it was considered necessary to go back to the basics of assessment. Donnan [39] faced a similar situation in his work: educational developers considered that e-assessment or online assessment was not distinct from general assessment. So the analysis again started from the perspective of general assessment, to clarify several issues, and finally specific e-assessment methods were researched.

The third component of the problem is a specific relation between the dimensions, the alignment between LOs in EE and e-assessment methods. That is the core of the problem and the focus of research questions 3 and 4.

RQ3) What type of intended Learning Outcomes can be measured by e-assessment methods?

RQ4) Is it possible to propose specific e-assessment strategies for each type of LO in EE?

In this stage, literature research was centred on finding suggestions on using specific assessment methods to measure or evaluate specific types of Learning Outcomes, in general and in the field of Engineering. An important finding in terms of related work was the book "A Taxonomy for Learning, Teaching, and Assessing" [40], which became a central piece of this research project.

2.1.1 LITERATURE RESEARCH METHOD

The initial stage of the literature research was exploratory, with a view to obtaining a framework for the research problem. The more systematic approach was guided by the analysis of the problem and previous knowledge of the subject. Table 1 summarizes the main search expressions used.

Table 1 - Examples of search expressions used
• RQ1: (Learning Outcomes OR Learning Objective) AND (Engineering; Higher Education; qualification frameworks; ABET; EUR-ACE)
• RQ2: (assessment OR e-assessment OR evaluation) AND (methods; tools; online; CBT (computer based testing); CBA (computer based assessment); CAA (computer assisted assessment); web based)
• RQ3: Learning Outcomes + assessment; Learning Outcomes + assessment methods; Learning Outcomes + evaluation; Bloom + assessment; Bloom + Learning Outcomes + assessment
• RQ4: (assessment OR e-assessment OR portfolios OR multiple choice questions (MCQ) OR essays OR short answer questions (SAQ)) AND engineering; learning outcomes + engineering; learning outcomes + ABET; learning outcomes + EUR-ACE; learning outcomes + Bloom; ABET + Bloom; EUR-ACE + Bloom

The research was conducted using resources available at the library of the School of Engineering, including books, journals and databases of journals and conference proceedings. Additionally, online searches were performed using web tools including Google, Google Scholar and specific websites. Table 2 presents the main sources used during this research project.
Table 2 - Types of sources used and examples
• Books: Taxonomy of Educational Objectives [41]; A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom's Taxonomy of Educational Objectives [40]; Teaching for Quality Learning at University: What the Student Does [42]; Assessing Student Learning in Higher Education [7]; Assessment for Learning in Higher Education [43]; Designing Better Engineering Education Through Assessment [44]; Contemporary Perspectives in E-learning Research: Themes, Methods, and Impact on Practice [24]; The Sage Handbook of E-learning Research [21]; The e-Assessment Handbook [26]
• Journals: Assessment and Evaluation in Higher Education; European Journal of Engineering Education; Journal of Engineering Education; British Journal of Educational Technology; ALT-J, Research in Learning Technology; Studies in Higher Education
• Conference proceedings: CAA; Frontiers in Education; Teaching and Learning in Higher Education; EDUCOM; Computers in Education
• Policy papers: European Commission; Bologna Process; Copenhagen Process; national institutions
• Websites: ABET; EUR-ACE; AHELO; Tuning; JISC; E-learning Papers

2.2 ASSESSMENT

Assessment is an important process in education and is intimately related to how students learn, as pointed out by different authors. Brown, Bull and Pendlebury [7, p.7] state that assessment is at the centre of the learning experience and should be a concern for all those involved in the learning process, including learners, educators and institutions. In their view, assessment defines what students will consider important, how they will spend their time and even how they will see themselves in the future. They consider that to change learning we need to change the way we assess. This perspective is shared by Biggs [42] when he describes the backwash effect of assessment. Biggs defends that students will look strategically at assessment to determine what and how they will learn. Again, the main idea is that assessment is the central driver of the learning process.

Another author, Peter Knight [43, p.11], takes a similar but more dramatic view when he asks how we can make students work without assessment. In the introduction to the book "Assessment for Learning in Higher Education", Knight considers that assessment is a moral activity that states the values of the teachers and institutions. What is assessed and the methods used give a clear indication to the students of what is valued in a course or in a programme. In the view of the author, even though the goals of a
  • 34. Evaluation of the Application of e-Learning Methodologies to the Education of Engineering 12 full programme are stated in documents like the mission statement and the programme goals, it is in fact in the assessment tasks that lays the essence of the programme and of the learning experience. In the same book, David Boud [3] considers that assessment is the aspect of HE where there is more bad practice and the effects of this bad practice has a strong impact on students learning and success. As the author refers, students may avoid bad teaching methods but they cannot avoid being assessed if they want to graduate. Boud, based on the work of other authors considers that as a consequence of bad assessment practices, students may be hurt in their self-esteem and even reject some subjects. Race [45] also explores the negative effects of assessment, in particular exams, in terms of feelings. He describes the worst nightmares students get before or after an exam and how these bad feeling affect learning. Reinforcing this idea, Brown et al. [46] write that even though assessment is an important element of the learning process, new or existing faculty rarely have training to improve their assessment skills. An interesting perspective from the works of these authors is that in most cases, assessing and marking are activities done privately and most of the times teachers only receive feedback from students and not from their peers (other teachers) to validate their decisions. From the perspective of the students, the problem is similar. Most of the times students are assessed only by their teachers and don’t receive feed-back from other students. Race [45] goes even further and considers that one of the principles of assessment is the need for transparency not only to students and staff but also to the employers. In fact, recent calls for accountability of the institutions placed assessment on the main stage of the educational process. Assessment provides important evidence for quality and accreditation, which means that assessment methods, criteria and results won’t be as private as before. Even though some statements of these authors may be consider a bit strong, there are some interesting points made concerning the importance of assessment that may be summarized as follows: • Assessment is one of the main drivers for learning • Has a strong impact on what and how students learn • Affects students self-esteem and confidence • Embeds the values of the teacher and the institution • Is a compulsory activity to obtain a graduation/certification • Provides evidence for quality and accreditation Current trends, as Knight, Boud and other authors refer [3, 7, 43, 47] treat assessment as an activity that is part of the T/L process and that cannot be separated. As Erwin puts it [47] deciding what to teach and assess is one issue, not two. Assessment should be seen as a learning activity, centred on the learner. Brown et al. present an overview of current trends in assessment, focused on practical issues. A summary of these trends is represented in Table 3. It can be said that these changes are somewhat related with the overarching trend of student centred learning. These trends focus on promoting the formative function of assessment, increasing feedback to students and contributing to improve learning. 
It can be said that assessment is becoming more personalized and that students are becoming more involved in the learning process through the use of explicit criteria and learning outcomes, allowing self- and peer-assessment and facilitating the recognition of prior learning. Of the trends referred to in Table 3, three are of particular importance to the present work: the change from objectives to outcomes, from content to competences and from implicit to explicit criteria. As noted by Brown et al., using learning outcomes is useful to clarify the relationship between course
design and assessment. Also, LOs play an important role in the recognition of prior learning, since they separate learning from teaching activities, opening the possibility of assessing what was learned independently of how it was learned.

Table 3 - Trends in assessment (from Brown et al. [7, p.13])
• From written examinations towards coursework
• From tutor-led assessment towards student-led assessment
• From competition towards collaboration
• From product assessment towards process assessment
• From objectives towards outcomes
• From content towards competencies
• From course assessment towards modular assessment
• From advanced levels towards assessed prior learning

The change from content to competences is related with the change to outcomes. As noted by Brown et al., competences are clusters of skills that students are able to use in different situations, and they provide a framework for defining LOs and transferable skills. Finally, the use of explicit criteria in assessment plays an important role in the pedagogical process. When the assessment criteria or marking schemes are explicit, they may provide important information for the learning process. If shared with students, these instruments will give them an indication of what is expected of them. By defining explicit criteria for an assessment task one is also defining the links between the task and the iLOs. Explicit criteria may also have an important role in the reliability of the assessment task, since they may reduce differences between assessors. As noted by Brown et al. [7], there is considerable controversy around the efficiency of the use of implicit and explicit instruments. However, the focus of this work is not on the instruments but on the methods of assessment.

Boud [3] provides a useful description of the evolution of assessment that offers some background on this matter. In the conventional conception, assessment follows learning and aims at finding out how much was learned; it is a quantitative perspective, and the link between the assessment task and learning is not questioned. This conception was followed by educational measurement, which keeps the same principles but intends to be more rational, more efficient and more reliable; it includes ideas and concepts from psychometrics. Nowadays we are still influenced by this conception, as can be seen in the wide use of multiple-choice question exams, a typical instrument of educational measurement. The latest perspective identified by Boud is competency-based and authentic assessment. It resulted from concerns about the validity of assessment, focusing on the link between what was assessed and what students were expected to have learned. Authentic assessment includes the direct assessment of complex performance, through methods such as portfolios, open-ended problems and hands-on lab work. It contrasts with indirect assessment methods like multiple-choice questions that measure, among other things, indirect indicators of performance [48]. This conception of assessment questions the validity of educational measurement approaches and promotes performance-based assessment and the importance of learning outcomes. As Boud notes,
  • 36. Evaluation of the Application of e-Learning Methodologies to the Education of Engineering 14 what is important is to assess if the students are achieving the iLO, independently of how they reached them. Good assessment is, as described by Boud, the one that is linked with the iLO and the one that promotes learning. Erwin [47] also supports that the first step in educational design is to define clearly the learning outcomes. This should be done before deciding teaching and assessment strategies, both at programme and course level. Another author, Race [45, p. 67] provides ten principles of assessment and again, the first ones are related with clearly defining the purpose of assessment and integrating assessment in the course activities and not as a separate event. Race defends the importance of assessment providing feedback to students, in agreement with the perspective of Boud. Race and other authors [46] consider that a key question about assessment is knowing what we are assessing. However, the analysis proposed by these authors is not based on the iLO but on the assessment tasks. For each task it should be clear what is being assessed: the content, the process, the structure, the product, the style, the presentation, etc. The authors recommend that assessment criteria should be clearly stated and then provided and explained to students. They even go further saying that these criteria should be negotiated with the students, to share the ownership of assessment and help them understand the whole assessment process. Another interesting perspective is provided by Brown et al. [46, p.82] when they suggest that the outcome of an assessment task should not be merely a grade but a description of what students know and can do. Again, the link between assessment and LO is highlighted as being valuable in the educational process for the future development of students. Given the impact of assessment in student learning, Boud considers important to reflect on how assessment affects learning, what students learn from assessment. He considers assessment gives a message to students about what they should be learning but the message is not clear and most likely will not be interpreted the same way by teachers and students. For this reason, assessment will most likely have non-intended consequences in student learning. Students will respond strategically to assessment based on past experiences, choosing an approach that will lead them to success. Linn [48, p. 16] shares the same opinion saying that assessment may have intended and unintended effects both on learning and on teaching. As an example, both teachers and students may spend more time teaching or studying concepts that will be explicitly included on the assessment tasks and neglect those that will not be included. This concept is called consequential validity and is approached by Linn and other authors like Messick [49]. This is related with the backwash effect of assessment on learning. As Boud explains, the backwash effect is positive if encourages the intended learning outcomes and is negative when encourages ways of learning that are not desired, like memorizing instead of understanding. Linn [48, p. 19] suggests that to understand the real cognitive complexity of an assessment task one must analyse the task, the familiarity of the student with the task, and ideally the process students follow to solve it. 
An apparently complex task may be addressing lower-level thinking skills if the student is only recalling previous knowledge about the task. To summarize, the following ideas are important when defining assessment:
• Assessment should be seen as part of the teaching and learning process (T/L).
• The first step towards good assessment is to define the iLOs of the course or module.
• It is important to define clearly the assessment tasks and what is being considered for assessment (a minimal sketch of making this link explicit follows this list).
• Assessors have to realize that assessment tasks will have intended and unintended outcomes.
• The real cognitive complexity of an assessment task depends not only on the task but on many other factors, some related with the learner, others with the T/L process.
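As flagged in the list above, one way to keep "what is being considered for assessment" explicit is to record, for each assessment task, which criteria are graded and which iLO each criterion addresses. The snippet below is a minimal, hypothetical sketch of such a record; the field names, weights and example content are illustrative assumptions, not part of any instrument described in this dissertation.

```python
# Hypothetical sketch: an explicit marking scheme that links each criterion of an
# assessment task to the intended learning outcome (iLO) it is meant to evidence.
# Field names, weights and example content are illustrative assumptions only.

marking_scheme = {
    "task": "Lab report on beam deflection (example)",
    "criteria": [
        {"criterion": "Correct application of the deflection formulas", "weight": 0.4, "ilo": "ILO2"},
        {"criterion": "Discussion of measured versus predicted values", "weight": 0.4, "ilo": "ILO3"},
        {"criterion": "Clarity and structure of the written report", "weight": 0.2, "ilo": "ILO5"},
    ],
}

def coverage(scheme):
    """Summarise how much of the task's weight is attached to each iLO."""
    totals = {}
    for c in scheme["criteria"]:
        totals[c["ilo"]] = totals.get(c["ilo"], 0.0) + c["weight"]
    return totals

print(coverage(marking_scheme))  # e.g. {'ILO2': 0.4, 'ILO3': 0.4, 'ILO5': 0.2}
```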
2.2.1 KEY CONCEPTS ABOUT ASSESSMENT

The term assessment, as analysed by Brown et al. [7], may have different interpretations. The origin of the term is Medieval Latin, meaning "sitting beside" to determine tax value (see http://dictionary.reference.com/browse/assess). The general meaning of assessment is to estimate worth, to judge value. In traditional views of education, as described above, judging and determining value is the core function of assessment. However, current trends in education give assessment a new and important role, contributing directly to learning. For the purpose of this work, we will use the definition of assessment proposed by Brown et al. [7, p.11], since it is broad enough to include most assessment tasks: "any procedure used to estimate student learning for whatever purpose".

Brown et al. [7, p.8] describe what can be considered an assessment cycle. It consists of three essential steps: taking a sample of what students do, making inferences and estimating the worth of what was done.

Figure 2 - Assessment cycle as described by Brown et al. (1. Sampling → 2. Making inferences → 3. Estimating value)

The first step, sampling, may include a traditional assessment method like an essay or exam, but may also include solving problems, carrying out a project, performing a procedure, etc. Samples are analysed by the assessors, who draw conclusions about what was achieved by the students compared with what was intended, as described in the LO statements. Finally, the assessor makes an estimate of the value of what was achieved by attributing marks or grades. This research is mostly concerned with the first two steps of the cycle. The third step is concerned with marking and grading and falls outside the goals of the current work.

Woods [50] defines assessment as a judgement of the degree to which the goals have been achieved, based on measurable criteria and on pertinent evidence. This definition places a strong emphasis on the judgemental purpose of assessment. Based on this definition, Woods and other referenced authors present five assessment principles and six good practice recommendations, which are summarized in Table 4. Concerning the purpose of assessment, Brown et al. [46, p.77] analyze existing motivations for assessment and identify different reasons to assess. Brown et al. [7, p.11] presented a similar list of reasons. A synthesis of both lists is presented in Table 5.
Table 4 - Principles and practices on assessment (extracted and adapted from Woods [50])
Assessment principles:
• Assessment is a judgement based on performance, not personalities.
• Assessment is a judgement based on evidence, not feelings.
• Assessment should be done for a purpose, with clearly defined performance conditions; the student should know when he/she is being assessed.
• Assessment is a judgement done in the context of published goals, measurable criteria and pertinent, agreed-upon forms of evidence.
• Assessment should be based on multidimensional evidence.
Principles in practice:
• What is being assessed? Have the goals been expressed unambiguously in observable terms? Who creates the goals? Are the goals explicit and published?
• Are there criteria that relate to the goals? Can each criterion be measured? Who creates the criteria? Are the criteria explicit and published?
• What evidence is consistent with the criteria? Do both the assessor and the student know that this form of evidence is acceptable? Are the goals and the collection of the evidence possible to achieve in the time and with the resources available?
• What is the purpose of the assessment? Under what conditions is the student's performance assessed?
• Who assesses? What type of feedback is given by the assessor? Have both the student and the assessor received training in assessment?

Table 5 - Reasons for assessing students
• Developmental: to provide feedback to learners and improve learning; to motivate learners and help them focus; to diagnose students' knowledge and skills; to consolidate student learning; to diagnose the learning status of the student.
• Judgmental: to rank, classify or grade student achievement; to estimate students' value and to allow them to progress in their studies; to select for future courses or employment; to provide a license to practice.
• Quality of teaching: to give feedback on teaching efficiency and improve teaching; to provide data for quality and accreditation processes.

As stated by Brown et al., the results of assessment are used mainly for developmental and judgemental purposes. The developmental purpose is related with improving student learning. The judgmental purpose is usually concerned with providing a license to proceed to the next level. In fact, in the model presented by these authors, the purpose of an assessment event is placed on a continuum between developmental and judgemental, and it is never purely at one extreme of the continuum. Or at least it shouldn't be, in the opinion of these authors. This perspective is supported by other authors like Boud [3], who distinguishes the purposes of assessment as formative (developmental) and summative (judgemental). The former aims to improve learning and provides feedback to the student; the latter aims at making decisions and judgements and is related with grading and marking. Boud considers that it is not advisable to separate both types of assessment, since students will most likely be more concerned with the summative tasks that determine their grades. It is recommended that both aspects of assessment be approached together to be effective, since all assessment leads to learning. Brown et al. [46] also consider the formative component of assessment of great importance. Students should receive feedback about their performance and accomplishments in time for them to be informed and to improve. Again, they consider that there is not a clear separation between these types of assessment. Even though formative assessment should not contribute to the grade of the course, this is not what happens, due to time and workload constraints. Coursework assignments will provide formative feedback to students but will also, in most cases, contribute to the final grade. Summarizing, in terms of function, assessment tasks can be classified as summative or judgmental and formative or developmental, as described above. Additionally, assessment may have a diagnostic
function when it intends to assess prior knowledge. Assessment results may also be included as indicators in quality and accreditation processes of the institution.

Figure 3 - Purposes of assessment (formative, summative, diagnostic, institutional)

Another issue concerning assessment is the source of assessment, the person who performs the role of assessor. Traditionally, the assessors were the tutors, but current trends in assessment defend the involvement of other sources, including the assessed student, peers or employers (Brown et al. provide a list of sources of assessment). The use of students as assessors, whether in self- or peer-assessment, has many advantages [7, 46]. From the learning perspective, by assessing their own work or the work of others, students may increase their evaluating skills and their meta-cognitive knowledge, which are important for lifelong learning and for professional life. Using self- and peer-assessment also facilitates dealing with time constraints when facing large numbers of students, for instance. Another benefit of using students as a source of assessment is to take advantage of their privileged perspective in situations where the tutor is not able to make an informed judgment, as happens in group work. Peer- and self-assessment may include providing feedback to the students or marking students' work. In the perspective of Brown et al., self- and peer-assessment are more suitable for formative than for summative assessment. These authors consider that using different sources of assessment requires training and clear information about the task and the assessment instrument (marking scheme, grading criteria, etc.). One of the reasons is that these sources of assessment are not always well accepted by students. As assessors, students don't like to assess others and take responsibility for their grades. When being assessed, students prefer receiving feedback or being judged by someone with expertise in the subject area. Both self- and peer-assessment are present when using reflexive practice assessment methods like portfolios.
Figure 4 - Sources of assessment (tutor, self, peer, others)

Boud [3] identifies two factors that are increasing the pressure on assessment and that are present in the current challenges of higher education. One is the trend of larger numbers of students in HE, which increases the assessment load on teachers and institutions. The other factor identified by Boud is the pressure to incorporate frameworks of competences into existing HE provision and the need to deal with the accountability of institutions. Assessment of the transversal learning outcomes or skills included in existing frameworks seems to be a difficult task. Erwin [47] identifies several of these that should be assessed by faculty, given the importance they assume in society:
• Ability to commit oneself through reasoned beliefs and actions
• Work cooperatively
• Independent work
• Accept criticism
• Manage personal stress
• Self-discipline

Brown et al. suggest that cuts in resources and modularization, along with larger numbers of students, increase the pressure on education and assessment. Based on the work of other authors, they suggest several strategies for dealing with this pressure that are not relevant to the current work.

Figure 5 - Current challenges in assessment (large numbers; transferable skills; available resources; personalized, authentic assessment)

A final issue concerning assessment is the question of validity and reliability. The discussion about the validity and reliability of assessment is important because it helps to clarify important issues of the assessment process. Brown et al. [7, p.234] use an interesting metaphor to explain these two concepts. If assessment were a watch, it would be reliable if it were precise, if it measured time consistently. But it would only be valid if it showed the right time. The opposite is also true: a watch may tell the right time at a specific moment but not be consistent, running slower than it should. Also, the observer of
the watch must know how to tell the time using that type of watch. So, an assessment task must be reliable (or consistent), must be valid (or accurate), and the assessor must know how to use the instruments (marking schemes, grading criteria) to achieve this.

Reliability may be described as being related with fairness and consistency between different assessors and within one assessor [7, 26, 46]. Reliability is concerned with different assessors awarding the same marks or grades to the same assessment, or with the same assessor awarding the same marks or grades at different moments. Reliability is strongly related with fairness and replicability. Marking schemes, grading criteria and anonymous marking are all assessment instruments that intend to improve reliability. It is suggested by Brown et al. that even when using explicit criteria and other instruments it is not easy to achieve the intended reliability. As an example, a detailed marking scheme may be difficult to implement and, if so, reliability is compromised. The same authors suggest that some assessment methods are more reliable than others. Reliable methods include MCQs and other methods with a well-defined solution. In the particular case of MCQs, it is common practice to measure the internal consistency of the test by analysing the results. Student performance also affects reliability: the same student may respond to the same task differently at separate moments, for many reasons. Figure 6 summarizes the main issues related with reliability.

Figure 6 - Different dimensions of reliability of assessment (between assessors; within one assessor; intrinsic to the student; intrinsic to the method)
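The internal-consistency practice mentioned above for MCQ tests is usually reported as a single statistic computed from the matrix of item scores. As a minimal sketch, the code below computes Cronbach's alpha (equivalent to KR-20 when items are scored 0/1), one widely used internal-consistency statistic; the choice of statistic and the data shown are illustrative assumptions, not prescribed by the sources discussed here.

```python
# Minimal sketch: Cronbach's alpha (KR-20 for dichotomously scored MCQ items) as
# one common internal-consistency statistic for an objective test. The statistic
# and the example data are illustrative assumptions only.

def cronbach_alpha(scores):
    """scores: one row per student, one 0/1 (or graded) item score per column."""
    n_items = len(scores[0])

    def sample_variance(values):
        mean = sum(values) / len(values)
        return sum((v - mean) ** 2 for v in values) / (len(values) - 1)

    item_variances = [sample_variance([row[i] for row in scores]) for i in range(n_items)]
    total_variance = sample_variance([sum(row) for row in scores])
    return (n_items / (n_items - 1)) * (1 - sum(item_variances) / total_variance)

# Hypothetical results of a 5-item MCQ test taken by 4 students (1 = correct)
responses = [
    [1, 1, 1, 0, 1],
    [1, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [1, 1, 1, 1, 1],
]
print(round(cronbach_alpha(responses), 2))  # values closer to 1 indicate higher consistency
```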
Validity in assessment is concerned with measuring the right thing, matching what is intended to be measured with what is actually measured. For the purpose of this work, the concept of validity of assessment is more relevant than reliability. Validity is concerned with the sampling phase of the assessment cycle, while reliability is closer to inferring and estimating value (see Figure 2). Authors identify different forms of validity [7, 48, 49], but for the purpose of this work only some will be explored. Brown et al. [7] describe face validity as being the first impression of the assessment task: an assessment task should explain in a clear manner the purpose of the assessment and what is expected from the student. Another form of validity is the consequential validity of assessment, referred to above. This is related with a broader view of assessment, how it impacts the teaching and learning processes and the intended and unintended outcomes of an assessment task. The same authors explain the concept of intrinsic validity of assessment, which means that an assessment task is measuring the iLOs of the course or module. The authors point out that to achieve this type of validity, the iLOs must be clearly described, and described in measurable terms. They also identify one risk associated with this type of validity: very detailed descriptions of the iLOs and of the assessment tasks may be unmanageable for the assessors, affecting the reliability of the process. This type of validity is the one with greatest relevance to this research project.

Linn et al. [48] propose eight validity criteria for complex performance-based assessment. The intention was to propose a framework to help decide on the adequacy of new forms of assessment. The criteria are: consequences, fairness, transfer and generalizability, cognitive complexity, content quality, content coverage, meaningfulness, and cost and efficiency. Consequential validity was already addressed and is related with the impact, intended and unintended, of assessment on learning and teaching. Cognitive complexity introduces an interesting perspective to the current work. To truly understand whether an assessment is promoting the use of specific cognitive processes, it is not enough to have a detailed analysis of the specifications of the task. It is also necessary to consider the familiarity of the student with the task and to analyse how he or she solves the problem. This perspective is shared by both Bloom et al. and Anderson et al. [40, 41], who identify the student's background as a problem when trying to use the taxonomy to classify assessment tasks.

Figure 7 - Types of validity related with the current research (intrinsic, face, consequential, cognitive)

Concerning different assessment methods, Brown et al. [7] suggest that the ones related with remembering knowledge or with well-defined solutions are the ones with higher reliability. But, in their view, this is not necessarily true for validity. Given the concepts of reliability and validity explained above, it is easily understood why Brown et al. [7] state that validity and reliability have conflicting needs of effectiveness and efficiency. In Erwin's comments on this issue [47] it is suggested that the validity and reliability of assessment are of particular concern when the focus of assessment is not on the individual student but when issues of accreditation and the quality of an institution come up. With the evaluation processes of institutions, the number of people interested in the results of assessment increases and the institution needs to be accountable for the results. Brown et al. show a different opinion when they state that validity and reliability are crucial features of a fair and effective assessment system. These opinions are not necessarily conflicting, as we may consider that reliability is important when analysing results from a broader perspective, when comparing students or groups of students. This is important not only from the institutional perspective but also to ensure fairness among different students. Validity is more focused on the learner and the learning process. Still, some aspects of validity may be approached from a broader perspective, as in accreditation processes.

2.2.2 OVERVIEW OF METHODS OF ASSESSMENT

Brown et al. [7] start by splitting assessment into two main categories: examinations and coursework. Traditionally, examinations would be written or oral and were typically blind, meaning the student would only know the questions when the examination started. Nowadays, the line dividing examinations from coursework is not so clear. Examinations are not necessarily blind: in some cases students may know the questions in advance or may even take the examination home to solve it; in other cases students are allowed to take their notes into the exam. Traditionally, coursework was made up of essays, problems and reports of practicals. Coursework served mainly, but not exclusively, formative purposes. Changes in HE are affecting this category, and coursework now includes a variety of tasks that are currently being used for summative and formative assessment.
Erwin [47, p.53], based on the work of other authors, identified two broad classes of assessment formats: constructed response and selective response. In the former, students produce something, like a case study or a report, or must perform a task. In the latter, students are presented with several possible answers and must recognize/select the correct one. Brown et al. [46] propose the following list of nineteen assessment methods:
• Activities putting into perspective a topic or issue
• Case studies and simulations
• Critical reviews of articles, viewpoints or opinions
• Critiques
• Dissertations and theses
• Essay plans
• Essays, formal and non-traditional
• Fieldwork, casework and other forms of applied research
• Laboratory reports and notebooks
• Literature searches
• In-tray exercises
• Oral presentations
• Poster exhibitions
• Practical skills and competences
• Projects (individual or group)
• Reviews for specific audiences
• Seen written exams
• Unseen written exams
• Strategic plans

In later work, Brown et al. [7] present a shorter list of assessment methods in a more structured approach that was closer to what was intended for the purpose of the current work:
• Essays
• Problems
• Reports on practicals
• Multiple choice questions (MCQ)
• Short answer questions (SAQ)
• Cases and open problems
• Mini-practicals
• Projects, group projects and dissertations
• Orals
• Presentations
• Poster sessions
• Reflective practice assignments
• Single essay exams

After comparing and analysing both proposals, it was possible to group them into similar activities. The first finding was that the first list does not include any type of questions or exercises that are a common component of exams (MCQs, problems, SAQs) and is mostly centred on coursework. But for the purpose of the current work it was necessary to include the summative perspective. Another issue that resulted from the analysis of both lists was the inclusion of orals or poster
exhibition as methods of assessment. It was considered that these were formats of delivery of other methods: what is delivered could be a report from practical work, a synthesis of an essay or a verbal answer to a SAQ. From this analysis of assessment as presented by different authors, it was decided, for the purpose of the current work, to differentiate between assessment methods and assessment tasks. The main difference is that the methods are the essence of the assessment tasks, independent of the context of implementation, the grading criteria or the media chosen for delivery. The assessment tasks are the assessment methods in practice, as adopted by the teachers in their courses. Using this differentiating principle, it was possible to summarize the findings from the literature and compile a list of six general categories of assessment methods that could be used for answering RQ2. Table 6 presents a summary of both lists, distributed across the identified general categories.

Table 6 - Summary of assessment methods
• MCQ. Brown et al. [7] list: multiple choice questions (MCQ); orals. Brown et al. [46] list: none.
• SAQ. Brown et al. [7] list: short answer questions (SAQ); orals. Brown et al. [46] list: none.
• Essays (scripts). Brown et al. [7] list: essays; single essay exams; dissertations; presentations; poster sessions. Brown et al. [46] list: dissertations and theses; essay plans; essays, formal and non-traditional; activities putting into perspective a topic or issue; critical reviews of articles, viewpoints or opinions; critiques; literature searches.
• Practical work. Brown et al. [7] list: projects; group projects; mini-practicals; presentations; poster sessions; reports on practicals. Brown et al. [46] list: projects (individual or group); fieldwork, casework and other forms of applied research; laboratory reports and notebooks; practical skills and competences; oral presentations; poster exhibitions; in-tray exercises.
• Problems. Brown et al. [7] list: problems; cases and open problems. Brown et al. [46] list: case studies and simulations.
• Reflexive practice. Brown et al. [7] list: reflective practice assignments. Brown et al. [46] list: in-tray exercises.

Brown et al. [7] suggest that different methods may have different applications. As an example, MCQs are more suitable for sampling comprehensive knowledge, while essays are better for assessing understanding, synthesis and evaluation skills. Still, they consider that almost every method could be used for any purpose, although with some sacrifice of validity. Another interesting statement of these authors, which is of great importance for the current work, is that the effectiveness or validity of assessment depends not only on the method but on the specificities of the assessment task. It is necessary to match the purpose of assessment with the iLOs and the assessment tasks, including the methods and the instruments. To achieve this match, Brown et al. propose that the course designer or tutor answer a list of questions related with iLOs, assessment methods and grading schemes/criteria, leading to the detailed definition of the assessment task.

Multiple Choice Questions (MCQ) or objective test questions

As defined by Brown et al. [7], an MCQ consists of a question followed by several alternative answers from which the student has to choose the correct one. This type of question is frequently used in objective tests. Bull and McKenna [51] define objective tests as the ones where students are required