PROVIDING STUDENTS WITH FEEDBACK FOR CONTINUOUS IMPROVEMENT
Stephen J. Elliott, Ph.D.¹, Christine R. Blomeke², and Michael D. Frick³
Abstract ⎯ This paper describes the introduction of a feedback system for undergraduate students in a junior-level automatic identification and data capture class. Students have highlighted that, in general, feedback has been slow from some classes in the curriculum. This issue has been made a continuous improvement action item as part of the department's ABET continuous improvement program. The implemented solution was to provide students with personalized feedback regarding their understanding of the lecture learning objectives prior to undertaking the lab practicals. Student performance was measured through the use of quizzes and exams to indicate whether the new system of feedback contributed to higher scores. The authors compared scores from the Spring 2008 (control) and Fall 2008 (experimental) semesters. The results show that for higher-order learning activities, the personalized feedback contributed to statistically significantly higher scores.

Index Terms ⎯ biometrics, education, continuous improvement

INTRODUCTION
According to Chickering and Gamson (1999), there are seven principles for good practice in undergraduate education. These include student-faculty contact, cooperation amongst students, the encouragement of active learning, providing students with prompt feedback, emphasis on time on task, the communication of high expectations, and the inclusion of a diverse set of talents and ways of learning. With respect to good feedback, Chickering and Gamson state that "in classes, students need frequent opportunities to perform and receive suggestions for improvement" (page 1). This is sometimes a challenge in large lecture courses; in this class there are 80 students per semester, and dialog is limited. Several ways to improve feedback to students have been noted in the literature, especially with the adoption of technology in the classroom such as Audience Response Systems, commonly known as clickers (Caldwell, 2007). Other communication tools available to instructors include e-mail, web pages, forum postings, and instant messenger discussions, typically using educational course management systems such as Blackboard and Moodle. According to Newlin and Wang (2002), instructors can provide personalized, question-by-question feedback to each student in a timely fashion. In doing so, Newlin and Wang identified that, even anecdotally, this feedback provides students with greater feelings of support and increased instructor contact and availability.

CONTINUOUS IMPROVEMENT PROGRAM
Continuous improvement is undertaken in many different guises, from "water-cooler discussions" through to a rigorous and measurable evaluation program. Continuous improvement in an academic environment is done for a number of reasons, as outlined in Temponi (2005). There are several steps to continuous improvement, shown in Figure 1. They include gathering data from various sources (inputs), which can include faculty, student feedback, industrial advisory boards, alumni feedback, employer feedback, internal communications within the university environment, ABET accreditation requirements, and information from conferences, journals, etc., which identify issues for further study.

In this case, the issue was identified through surveys of current students as well as senior exit interviews. Once the issue had been identified, a subgroup of faculty examined how to provide feedback to students. Given the tools available in the classroom to faculty, the authors decided to implement a continuous feedback loop based initially on student input in the classroom, through the use of a clicker. This concept was already well established in the literature as a way of engaging students in such an environment. What was not clear was how to provide resources to students using the data collected in class, as the report functionality in the clicker software was limited.
¹ Stephen J. Elliott, Ph.D., Associate Professor, Purdue University, Department of Industrial Technology, 401 N. Grant Street, West Lafayette, IN 47907-2021, USA, elliott@purdue.edu
² Christine R. Blomeke, Graduate Researcher, Purdue University, Department of Industrial Technology, 401 N. Grant Street, West Lafayette, IN 47907-2021, USA, blomekec@purdue.edu
³ Michael D. Frick, Undergraduate Researcher, Purdue University, Department of Industrial Technology, 401 N. Grant Street, West Lafayette, IN 47907-2021, USA, mfrick@purdue.edu
© 2009 ICECE March 08 - 11, 2009, Buenos Aires, ARGENTINA
VI International Conference on Engineering and Computer Education
FIGURE 1. CONTINUOUS IMPROVEMENT PROCESS

DESCRIPTION OF THE COURSE
The course under study is a junior-level course in automatic identification and data capture, covering technologies such as bar codes, card technologies, radio frequency identification, and biometrics. The course has the following objectives:
• Outline the various forms of AIDC technologies
• Differentiate AIDC technologies based on specific scenarios
• Formulate AIDC solutions and recommendations using statistical knowledge or mathematical ability
• Demonstrate the integration of AIDC equipment
• Critique case studies related to the deployment of AIDC technologies

The course is taught as a combination of face-to-face lectures (2 per week), usually followed by a 2-hour laboratory experience taught by graduate teaching assistants. The lecture component consists of a period of time for attendance, followed by an explanation of the agenda. Students also receive handouts and related information about the course, which includes a list of objectives that the instructor is covering for either the lecture or the module. The laboratory experience typically consists of students providing an overview of case studies or applications of the technology related to their major, followed by one or more hands-on activities, and then a quiz. The course utilizes the Blackboard Course Management System. Each topic is broken up into learning modules, and each learning module has the same outline: PDF PowerPoint lectures, Adobe Presenter summary lectures, case studies that supplement the material, and online video case studies. Students also use the E-instruction classroom response system clickers. Last semester, these were used for attendance purposes. This semester, the clickers were to be used as feedback tools to gain real-time information from the students on their understanding of a particular concept (see below).

MOTIVATION
There were several motivations for the study. The primary one, from the instructor's viewpoint, was whether students were understanding the material. The second was to try to provide students with feedback on their understanding of the objectives. The third was to investigate whether there was any increase in their knowledge based on the previous semester's performance.

Providing feedback to the student is an important component of learning. A recent survey of our department's senior undergraduate students' expectations indicated that feedback was important. When asked whether they received feedback from their instructors regarding their written assignments, 50% of the students agreed, 7.1% disagreed, 35.7% strongly disagreed, and the remainder did not provide a response. When asked about feedback with respect to verbal communication skills, 7.1% strongly agreed, 35.7% agreed, 21.4% were unsure, and 35.7% disagreed. This highlighted that there was some scope to improve feedback to students and, at the same time, provide some empirical evidence to establish whether providing feedback actually impacted their performance.

METHODOLOGY
There were a couple of iterations to the feedback loop, primarily as a result of understanding the technology and how to implement it in a classroom environment. Initially, the instructor outlined the lecture objectives at the start of the class, followed by the instruction; at the end of the class, the students would complete the survey via the use of the clickers. Figure 2 shows the view that the students would see in class.

FIGURE 2. OBJECTIVE QUESTION DISPLAYED TO STUDENTS
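The feedback loop described in the methodology (collect per-objective clicker responses, then re-cover any objective the class did not understand) can be sketched as follows. This is an illustrative sketch, not the authors' CPS software; the data layout and the 70% threshold are assumptions.

```python
# Hypothetical sketch: aggregate per-student clicker responses on lecture
# objectives and flag objectives that fall below an understanding threshold,
# so they can be covered again in the next lecture.
from collections import defaultdict

def flag_objectives(responses, threshold=0.7):
    """responses: list of (student_id, objective_id, understood: bool).
    Returns the set of objective IDs whose share of 'understood'
    answers falls below the threshold."""
    counts = defaultdict(lambda: [0, 0])  # objective -> [understood, total]
    for _student, objective, understood in responses:
        counts[objective][1] += 1
        if understood:
            counts[objective][0] += 1
    return {obj for obj, (ok, total) in counts.items() if ok / total < threshold}

# Example: objective 2 is understood by only half the class.
data = [(1, 1, True), (2, 1, True), (1, 2, True), (2, 2, False)]
print(flag_objectives(data))  # {2}
```

In this sketch, the flagged set is what the instructor would act on when deciding which objectives to revisit.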
This did not appear to be a good use of either the clickers or the instruction time. Completing 4 or 5 questions took almost 5 minutes, which sometimes would cause the class to run over time. Furthermore, there was no additional feedback to the student, just to the instructor on where the students' knowledge was at any one point in time. The second iteration was to give students a list of objectives at the beginning of the class, and then initiate the clicker system, which would run in the background while the slides were being projected. As the clickers were in "student-managed" mode, the students could answer the questions whenever they wanted to during the entire lecture, freeing up instruction time and keeping the students engaged with the material. The data from the students was then downloaded and analyzed; any objective that was not understood would then be covered again in the next lecture. Depending on the topic at hand, sometimes the instructor would break the objectives down into two sets, so he could display the results of the class responses to the students and answer any questions. Upon reflection, it appeared that the feedback was still class-specific, and displayed the cumulative results of the students' understanding. So whilst feedback was now instantaneous, it was generic to the class.

DEVELOPMENT OF PERSONALIZED STUDY GUIDES
The third iteration of the feedback concept used in the class was to give feedback to the individual student. This was non-trivial. The CPS tool does provide some reporting functionality, namely a series of export functions, or reports based on students' understanding, but it was limited to providing students with their answer and the correct response, as opposed to any additional information, such as where to find additional resources so the student could learn more about the particular topic. Using the CPS database, personalized study guides were created and then emailed to the student. These could be used by the student for targeted revision of specific objectives. The lectures were on Wednesday and Friday, with the lab component occurring on Monday or Tuesday. In class on Friday, students were given a survey on their understanding of the objectives of the module. They would need to understand these objectives before lab on Monday or Tuesday of the following week. At the end of the lecture, the instructor uploaded the CPS data into Blackboard, and at the same time exported it to a csv file. This was then imported into a MySQL Community Edition database.

FIGURE 3. FLOWCHART

The database consisted of five tables: 'objective', 'quiz', 'quiz_objective', 'student' and 'student_quiz'. The 'objective', 'quiz', and 'student' tables are all simple tables. The 'objective' table contains an objective ID and the objective itself. Similarly, the 'quiz' table contains a quiz ID along with a quiz name, and the 'student' table contains the student's first and last name. The 'quiz_objective' table links the ID of an objective to a particular question on a quiz (that is, a quiz ID and quiz question), along with a field where additional information may be provided to assist with the solution to the particular question at hand. Lastly, the 'student_quiz' table links the student to a quiz question along with their answer to the question. The report was then run, and the individual reports emailed to the students.

RESULTS

There were two aspects to the results. The first was the students' opinions of this endeavor. The second was to examine whether the adoption of clickers tied to personalized study guides had an impact on the learning of the students.

The first question asked was whether the printouts of the objectives helped the student understand the agenda for the class/module. 44 students responded positively (78%), 5 negatively, and 7 did not respond. The second question asked whether the modules should be posted in Blackboard (the course management system). 91%
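The five-table layout described in the text can be sketched as a schema. The paper names the tables and their roles but not their columns, so the column names below are assumptions; SQLite is used here as a lightweight stand-in for the MySQL Community Edition database the authors used.

```python
# Sketch of the five-table layout described in the text, using Python's
# built-in sqlite3 as a stand-in for MySQL. Column names are assumptions.
import sqlite3

schema = """
CREATE TABLE objective (objective_id INTEGER PRIMARY KEY, objective TEXT);
CREATE TABLE quiz (quiz_id INTEGER PRIMARY KEY, quiz_name TEXT);
CREATE TABLE student (student_id INTEGER PRIMARY KEY,
                      first_name TEXT, last_name TEXT);
-- Links an objective to a particular quiz question, plus a field for
-- additional study material on that question.
CREATE TABLE quiz_objective (
    quiz_id INTEGER REFERENCES quiz,
    question INTEGER,
    objective_id INTEGER REFERENCES objective,
    additional_info TEXT
);
-- Links a student to a quiz question along with their answer.
CREATE TABLE student_quiz (
    student_id INTEGER REFERENCES student,
    quiz_id INTEGER REFERENCES quiz,
    question INTEGER,
    answer TEXT
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(schema)
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
print(tables)  # ['objective', 'quiz', 'quiz_objective', 'student', 'student_quiz']
```

A per-student study guide would then be a join from 'student_quiz' through 'quiz_objective' to 'objective', pulling the 'additional_info' field for each question the student missed.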
of students agreed that they wanted to have the objectives
posted online as well as handed out in class. With respect to
the clickers used in class, the question asked of the students was "does using the clicker in class help you keep focused in class?" 30 students responded favorably (54%), the remainder negatively. When asked whether using the clicker
in class helped retention of knowledge, 25% of the students
thought that it did, 46% did not believe that the clicker
helped them retain the knowledge. When asked whether the personalized study guides helped, 82% of the students responded favorably, 12% responded negatively, and the remainder did not respond.

The second part of the study was to examine whether there was any incremental increase in scores for those modules where a personalized study guide was distributed. In order to evaluate the scores, the preceding semester's scores were used as a baseline.

The pre-test is an evaluation of the students' prior knowledge of the course material. We also use this to establish whether there is any significant difference in knowledge over previous semesters. For the purposes of this article, the authors compared the Spring and Fall semesters of 2008. The scores of the pre-test from one semester to another should not be statistically different, and in this case an ANOVA showed no significant difference, with a p-value of .736.

The second module of the semester is relatively simple, and introduces students to their first bar code symbology – PostNET. For this module, there are only a few objectives, including definitions and some calculations. There was no difference between the two groups (p=0.953).

For the next module, the students were introduced to linear bar codes. There were a number of objectives to be completed, but again the scores across the two semesters were practically identical (p=0.670). Students did receive an individualized study guide for this module after class. The study guide would have been issued prior to the quizzes.

The stacked barcode module involves a higher level of understanding, building on the concepts covered in the previous modules. The content is more challenging, as is the lab activity. The students were questioned on their understanding of the objectives in the class prior to the lab activity using the CPS pad, and then a personalized study guide was issued to the students. The results of an ANOVA showed a statistically significant difference between the two groups (p<0.001). Interestingly, the ranges of the two groups were the same, but there was a 7-point difference in the means.

The matrix barcode module also involved a higher level of understanding of the concepts, although the lab is not as challenging as the stacked symbology lab. Students were again polled during class about their knowledge of the content, and an individualized study guide was generated. Again, there was a statistically significant difference between the two groups, with a p-value of 0.007.

The next module was the midterm; again, the results of the midterm were statistically different between the two groups (p=0.001).

After the exam, an additional module was surveyed. This consisted of basic information about contact memory and card technologies. Again, students were issued a personalized study guide. There was no statistically significant difference in the scores (p=0.222).

CONCLUSIONS

Once the system and objectives had been entered into the system, the time taken to provide feedback to students was significantly reduced. It was interesting that some (about 3) students wanted to get a study guide for all the objectives, regardless of whether they disclosed that they understood all of the objectives. It is interesting to note that for some modules the study guide did not significantly impact student learning, although the mean scores were slightly higher. The use of clickers in the classroom enabled targeted discussion of topics that the group as a whole did not understand. This feedback from the students might not have been forthcoming quite as easily without the use of the clicker. The next step will be to integrate the use of the clicker into the lab practicals. The clicker would be used to start a discussion on the topic in order to gain more interaction in the lab environment. In some cases, the clicker might replace the use of the bubble sheet, again providing more information to the student.

REFERENCES

Caldwell, J. E. (2007). Clickers in the Large Classroom: Current Research and Best-Practice Tips. CBE Life Sciences Education, 6(1), 9-20. doi: 10.1187/cbe.06-12-0205.

Chickering, A. W., & Gamson, Z. F. (1999). Development and Adaptations of the Seven Principles for Good Practice in Undergraduate Education. New Directions for Teaching and Learning, 1999(80), 75-81. doi: 10.1002/tl.8006.

Newlin, M. H., & Wang, A. Y. (2002). Integrating Technology and Pedagogy: Web Instruction and the Seven Principles of Undergraduate Education. Teaching of Psychology, 29(4), 325. doi: 10.1207/S15328023TOP2904_15.

Temponi, C. (2005). Continuous improvement framework: implications for academia. Quality Assurance in Education, 13(1), 17-35.
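The semester-to-semester comparisons in this section rest on one-way ANOVA. As a worked illustration of that computation, the F-statistic can be derived directly from between-group and within-group variance; the score data below is invented for illustration, since the paper reports only p-values.

```python
# Minimal sketch (not the authors' analysis): one-way ANOVA F-statistic
# for comparing quiz scores across two semesters. The scores are made up.

def anova_f(*groups):
    """F = (between-group mean square) / (within-group mean square)."""
    n_total = sum(len(g) for g in groups)
    k = len(groups)
    grand_mean = sum(sum(g) for g in groups) / n_total
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g)
                    for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n_total - k))

spring = [70, 72, 68, 74, 71]  # hypothetical control-semester scores
fall = [78, 80, 76, 82, 79]    # hypothetical experimental-semester scores
print(round(anova_f(spring, fall), 2))  # 32.0
```

The p-value then follows from the F-distribution with (k-1, n-k) degrees of freedom; statistical packages report it directly.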
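As an aside on the module content: one of the calculations the PostNET module covers is the check digit, which is chosen so that the sum of all digits is divisible by 10. A minimal sketch (the function name and sample digits are illustrative, not from the paper):

```python
# PostNET check digit: the digit that makes the total digit sum
# divisible by 10.

def postnet_check_digit(zip_digits):
    return (10 - sum(zip_digits) % 10) % 10

print(postnet_check_digit([5, 5, 5, 5, 5]))  # digit sum 25 -> check digit 5
```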