This document summarizes a presentation on evaluating the impact of a competency-based evaluation course. The presentation covered:
- The importance of competency-based education for evaluation training in Canada.
- An evaluation course designed around selected competencies and featuring an evaluation design project.
- A study finding, based on pre/post surveys, that the course improved students' reflective, situational, and technical evaluation competencies.
- Focus groups indicating that the evaluation design project and instructor feedback contributed most to competency development, and that students came to see evaluation as more complex and contextual.
1. Katya Chudnovsky
Dr. Cheryl Poth
Canadian Evaluation Society Conference, Toronto
June 10, 2013
What constitutes impactful evaluation experiences for building CES competencies within a program evaluation course?
2. Centre for Research in Applied Measurement and Evaluation
Presentation Outline
Background
Importance of competency-based education
Competency-based evaluation course example
Approach
Study rationale and methodology
Competency development: pre- and post-course self-reports
Student experiences: end of course focus group
Findings & Implications
Greatest increases in reflective and situational competencies
Improvement in evaluation-specific technical competencies
Lessons learned for course development
3. Background
Importance of competency-based education
In contrast to our counterparts in the United States, in Canada there are very few university-level programs specifically devoted to evaluation
There is little consistency in how graduate-level evaluation courses are taught in terms of content, instructional activities, & assessments
Evaluation types & approaches
Technical aspects (logic models, evaluation matrices, data collection & analysis)
Less emphasis on soft skills and competencies acquired through experience
If education is not clearly aligned with the new Canadian Competencies, how then can we have confidence that new graduates possess the knowledge and skills to be considered competent evaluators?
4. Background
Doctoral evaluation course at the Department of Educational Psychology, University of Alberta
Learner outcomes focused on selected reflective, situational, technical, and interpersonal competencies
39 in-class hours supplemented by a Learning Management System
Students with diverse backgrounds
Evaluation design as a centrepiece
Includes a sequence of formative tasks to apply knowledge and skills step by step
5. Approach
Rationale for the study
Application of competencies to guide course development & implementation
The emergence of CES competencies
Move towards competency-based assessment
Questions
1. To what extent have students developed the intended
evaluation competencies?
2. From the student perspective, how did course features
contribute to their competency development?
6. Approach
Development of competency: pre- and post-course self-reports
Measured change using a 1–4 rating scale in addition to providing evidence of development
1: Not yet knowledgeable/competent
2: Some knowledge, but not yet competent
3: Knowledgeable with some competency
4: Knowledgeable and competent
Instrument was adapted from the CES competencies and designation process
Post-course rating scale incorporated into the final reflective assignment
21 students completed the pre- and post-course assessments
7. Approach
Student experience: end-of-course focus group
Documented course experiences using an 11-question protocol
Questions gauged the impact of course activities on students’ learning and gathered suggestions for course improvement
E.g.: changes in perceptions of the evaluation process and skills; alignment between readings and the evaluation design; feedback opportunities
Incorporated into the final class; the instructor was not present
14 students participated in two focus groups
8. Approach
Student profile:
PhD and Master’s students in psychology (research and clinical specializations), education, and the social sciences
Strong research expertise and interpersonal skills (clinicians); some experience with community-based research
Limited prior evaluation experience
General interest in evaluation (what is it like?)
Plans for clinical and/or research careers
Only two students intended to become evaluators
9. Findings & Implications
Improvement in reflective competencies
• Greater ability to apply standards & ethical guidelines to an evaluation design
10. Findings & Implications
Improvement in situational competencies
Students recognized evaluation complexity by working with real-life programs
11. Findings & Implications
Improvement in evaluation-specific technical competencies
Evaluation design assignment as a competency development tool
12. Findings & Implications
What students are saying about the course:
Learning from evaluation design and poster:
“Even though the main project was a lot of work I didn't mind it because that's where all the kind of learning came from”
“Those small activities that we had to do from week to week were extremely helpful so that we just had to put it all together… and you could see the overall picture at the end”
Learning from instructor’s feedback:
“The instructor really refocused us and gave us what we needed to move forward sometimes when we were stuck”
“One of the hardest things to ask our professors is how to go about applying [the knowledge] because it is so contextual. I think that the one on ones can help with that.”
Learning from reflection:
“I think it could have been useful to almost keep a reflection notebook. You [have] your five minutes at the end of class you just jot down [what] we talked about today…Reflection piece is important and I think being exposed to those competencies helped me understand what is expected in evaluation.”
13. Findings & Implications
What students are saying about evaluation:
Evaluation as a complex contextual activity
“I had no idea what it [evaluation] was. I would say now …I see it as something that can be very integrated from the very beginning…, that it serves many, many purposes. I didn't realize how many different kinds of evaluation there were and why you might do it in a different way”
“Rigour … of your methods is definitely paramount but I think that watching some people understand more why it's not relevant in certain evaluations was interesting.”
Further training in evaluation
“It would have been nice to learn more about doing the evaluation … to align with the [design] project that we're doing”…
Career applicability
“Definitely, as a researcher in training, having done this process and taking this course is going to look great on my CV and …make us look more employable.” (research psychology student)
“I think there could have been more [done] to help us to understand why this is important to the programs that we are in.” (school psychology student)
14. Findings & Implications
Lessons learned for course development
Given time limitations, a course needs to focus on specific competencies
Prioritize experiential learning over traditional readings/presentations
Development of situational competency: provide guidance/boundaries for students
Build the course around a hands-on activity; evaluation design is a good option
Allow time for individualized feedback, reflection, and sharing ideas with peers
For students with diverse backgrounds, give options (e.g. selecting clients, readings, reporting formats)
An opportunity for a second course (reading course, practicum) to conduct and manage an evaluation
Students needed to see the whole picture of an evaluation
15. Questions? Suggestions?
16. Thank you!
The first author acknowledges the travel grant for this presentation provided by the Faculty of Education, University of Alberta.
Editor's Notes
Katya
Cheryl. Background: 5; Approach: 5; Findings & Implications: 10
Cheryl. Important to highlight: LMS as repository and optional discussion forum (how many have used it?). Students come from a variety of backgrounds: for some the course is obligatory, for some not; different backgrounds and interests. A combination of formal readings, in-class exercises, and hands-on activities. The central piece of the course is the evaluation design: a summative assessment that also consists of a series of structured formative assessment tasks, representing a step-by-step approach to evaluation planning. Students are provided with recommendations on timelines and can receive feedback from the instructor or the TA, who completed the course before. Additional assignments: topic facilitation, and reflection on competency development following completion of the design and its visual representation (poster).
Katya
Katya
Katya
Katya
Katya. Tell us what the colours mean.
Katya. Situational awareness and complexity are the main message here; individualized feedback was needed to make it happen. Generally, students were again confident in their knowledge, and the best-developed competency was identifying stakeholders: this one was explicitly discussed in the course and aligns with UFE (the need to identify primary intended users first). Individual feedback and class discussion demonstrated how an approach can be tailored to specific situations.
Katya. Everyone attained at least some knowledge; students were less confident about using different data sources. Evaluation purpose was learned in class.