This presentation was delivered to the Monitoring and Evaluation Colloquium of Bridge on 12 August 2014. It is based on a paper delivered at the SAMEA 2013 conference by Williams, Marais and Rampa.
1. Evaluation Design of the
Cofimvaba e-Textbook Initiative
Benita Williams, Mario Marais,
Mmamakanye Rampa
2. Project Aims
• The project aims to improve the teaching and learning environment
• It contributes towards
– improved learner results and
– the development of 21st century skills among teachers and learners
• It also aims to document a model for e-textbook rollout in schools, which could be sustainably replicated in other educational districts
3. 12 Component Implementation Model
Merryl Ford, CSIR Meraka Institute
• School ICT Architecture
– Devices
– Wireless LAN
– Storage and Power
• Change Management
– People (District, SMT)
– Technology
– Process
• Content
– Standards
– Conversion
– Creation & Customisation
• Teacher Development
– Training
– Preparation
– In the classroom
• Operations Management
– Logistics
– Support & Maintenance
– Distribution
• Research
– Masters & PhD
– Technology R&D
– ICT4E R&D
• Network
– WiFi Mesh
– Backbone connectivity
– Internet
• Monitoring & Evaluation
– Learners
– Teachers
– School
• Communication
– Marketing strategy
– Social Media Strategy
– Knowledge management
• Community Engagement
– Learners & Parents
– Teachers
– Community
• Stakeholder Management
– District/Circuit officials
– Local leadership
– Provincial
• Project Management
– Financial Management
– Procurement
– Implementation management
4. M&E Approach
• Developmental Evaluation
• Principles of Utilization Focused Evaluation
• Why? The initiative is a constantly evolving, complex and innovative project
5. Framework Part 1
• Key Activities
– Model Development
• Document Theory of Action
& Theory of Change
• Design Choice Review
• Cost Study
– Monitoring & Learning
• Implementation Monitoring
– Evaluation & Learning
• Results Review (Reaction, Learning, Individual Behaviour, Organizational Change at the level of learner, teacher, school, district, province)
6. Framework Part 2
Component: Documentation of theory of action and theory of change
Evaluation question:
1. What was the initial design of each of the 12 components (version 1 of the model), and what significant modifications were required for the e-textbook rollout?
Method:
Participatory workshop
7. Framework Part 2
Component: Design choice review
Evaluation questions:
Programme focus:
2. What were the significant design choices for each of the components, and what was their impact on relevance, sustainability and the achievement of outcomes?
Model focus:
3. Which designs of the programme components, and of the programme overall, are likely to yield sustainable implementation by the DBE/ECDOE?
Methods:
Documentation of decisions in a decision template
Documentation of lessons in a "Learning Brief" template
8. Framework Part 2
Component: Cost study
Evaluation questions:
Programme focus:
4. What was the total cost of implementing the different designs of the programme?
Model focus:
5. What would the total cost be to implement such a programme at district-wide level? (three scenarios, different designs, in comparison to the existing LTSM delivery process)
Methods:
GeSCI TCO tool
Process modelling
Scenario building
Activity-based costing
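As a rough illustration of how scenario building and activity-based costing combine, the sketch below totals per-activity costs for two hypothetical district-wide rollout designs. All activity names, unit costs and volumes are invented placeholders, not figures from the Cofimvaba study or the GeSCI TCO tool.

```python
# Minimal activity-based costing sketch: total cost per scenario is the
# sum of (unit cost x volume) over its activities. Figures are invented.

def scenario_cost(activities):
    """Sum unit_cost * volume over all activities in a scenario."""
    return sum(unit_cost * volume for unit_cost, volume in activities.values())

# Each scenario maps activity -> (unit cost in Rand, volume)
scenarios = {
    "device_per_learner": {
        "device": (2500, 20000),            # one tablet per learner
        "teacher_training": (4000, 800),    # per teacher trained
        "school_connectivity": (15000, 120) # per school connected
    },
    "shared_class_devices": {
        "device": (2500, 5000),             # device sets shared per class
        "teacher_training": (4000, 800),
        "school_connectivity": (15000, 120),
    },
}

for name, activities in scenarios.items():
    print(f"{name}: R{scenario_cost(activities):,.0f}")
```

Comparing the scenario totals against the cost of the existing LTSM delivery process (question 5) would then be a matter of adding that process as a third scenario.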
9. Framework Part 2
Component: Implementation monitoring
Evaluation questions:
Programme focus:
6. Were the programme components, and the programme overall, feasible to implement?
Model focus:
7. What have we learnt about the implementation of this programme that should be transferred to others?
8. Which factors support, inhibit or prevent implementation?
Methods:
Textual analysis of the Twitter feed and WhatsApp group
Post-training trainer feedback forms
Attendance registers
Post-training teacher feedback forms
Quarterly telephonic teacher survey
Quarterly telephonic school survey
Device usage statistics
School functionality checklist
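One crude starting point for the textual analysis of the Twitter and WhatsApp streams is a keyword-frequency count over exported messages. The messages and keywords below are invented examples for illustration only, not project data, and real analysis would of course go well beyond word counts.

```python
# Hedged sketch: count how often monitoring-relevant keywords appear in
# exported chat messages. Messages and keyword list are invented examples.
from collections import Counter
import re

messages = [
    "Tablets delivered to school A, wifi still down",
    "Training went well, teachers asked for more tablet time",
    "Wifi repaired at school A",
]

keywords = {"tablet", "tablets", "wifi", "training"}

counts = Counter(
    word
    for msg in messages
    for word in re.findall(r"[a-z']+", msg.lower())
    if word in keywords
)
print(counts.most_common())
```

Trends in such counts over time (e.g. recurring "wifi" complaints) can flag the support and maintenance issues that question 8 asks about.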
10. Framework Part 2
Component: Results review
Evaluation questions:
Programme focus:
9. Did the programme deliver the anticipated educational outcomes (learning, behaviour change, organizational change) and the necessary intermediate outcomes (reaction) at learner, teacher, school, district and province levels?
10. What were the unanticipated results?
Model focus:
11. What have we learnt about the results of this programme that should be transferred to others?
12. How do the achieved outcomes in this project compare to the outcomes
13. Which factors support, inhibit or prevent the realisation of outcomes?
11. Framework Part 2
Results review methods:
• Post-training teacher feedback forms
• Learner focus groups, Parent focus groups
• Quarterly telephonic district interviews
• Teacher session assessment record (Badge tracker)
• Teacher baseline questionnaire and follow up
• Learner 21st century assessment assignment with sample of learners
• Assessment of learner marks (Matric results, ANAs) in comparison with other
matched schools
• Ethnographic description of classroom observations
• School Functionality checklist
• Impact Story template
• School leavers tracer study
12. Issues
• The initiative is evolving, so the M&E framework needs to be adapted within budgetary and practical constraints
• Operationalizing the evaluation required other component managers to help with the collection and interpretation of data
• Evaluation information collected with some tools may have immediate uses for other components; how best can it be made available?
• Capturing learning in learning templates is useful for team members who are more practice-oriented than research-oriented
• There are challenges in collecting and packaging the information for the purposes of research, evaluation and sharing in the media
13. Benefits
• The benefits of the evaluation approach include providing continuous feedback on the various project components to the various evaluation users, while also collecting information that can later be consolidated into a model.
• There is flexibility to incorporate methods with different ontological foundations (e.g. RCTs and thick ethnographic description) in order to meet the needs of a diverse set of users.
• The framework is now also being adapted for the evaluation of other similar ICT programmes rolled out by the Meraka Institute.