The document discusses the Michael & Susan Dell Foundation's work using data and performance management to improve education outcomes. It outlines key lessons learned, including:
1) Easing data collection burdens on schools is important for successful adoption.
2) Tools should support a shift from compliance-focused to support-focused management to improve outcomes.
3) Extensive stakeholder engagement is needed to design effective tools and ensure uptake.
The foundation is working with districts in South Africa to establish data-driven management through dashboards, routines, and training while easing data collection burdens on schools. The goal is to improve learner performance by empowering districts with insights from education data.
5. Together with the DBE, in 2012 we conducted a diagnostic study of data
at every level of the school system across all 9 provinces
Research across all 9 provinces:
▪ Deep-dive visits (2 weeks) and light-touch visits (1-2 days); over 250 interviews
▪ National-, province-, district-, school-, and class-level interviews
▪ Diverse and representative mix of educational environments:
– Mix of urban and rural schools
– Distribution of wealth quintiles
– Spectrum of PC and internet access
– Range of performance bands
School locations: Limpopo, Gauteng, North West, Mpumalanga, Free State, KwaZulu-Natal, Northern Cape, Eastern Cape, Western Cape
Note: Infrastructure descriptions: Poor IT = fewer than 2 computers, no phones, no internet // Fair IT = 2-5 computers used by staff, possibly internet-connected and with phone lines // Good IT = more than 5 computers, phone lines and internet, printers, copiers, etc.
Below average, Average and Above average are determined using the reported pass rate, with 40-60% considered average; this covers both primary and secondary schools.
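The banding rules in the note above can be expressed as a short sketch. The function names are ours, and the handling of schools that fall between the stated IT definitions (e.g. 2 computers but no phone) is an assumption, since the note leaves those in-between cases implicit:

```python
def it_band(computers, has_phone, has_internet):
    """Classify a school's IT infrastructure per the study's note.
    Poor: fewer than 2 computers, no phones, no internet.
    Good: more than 5 computers, with phone lines and internet.
    Fair: everything in between (assumption: the note's middle band).
    """
    if computers < 2 and not has_phone and not has_internet:
        return "Poor"
    if computers > 5 and has_phone and has_internet:
        return "Good"
    return "Fair"


def performance_band(pass_rate):
    """Band a school's reported pass rate (%): 40-60% counts as Average."""
    if pass_rate < 40:
        return "Below average"
    if pass_rate <= 60:
        return "Average"
    return "Above average"
```

Using these, a school with one shared computer and no connectivity would band as Poor IT, and a 50% pass rate as Average.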
7. Despite collection and recollection of data, schools, circuits and districts
receive little to no insight or feedback on submissions
Data flows upwards to satisfy compliance requirements
[Diagram: school data flows upward through district EMIS (to the district director and SMGD), provincial EMIS (generic reports to province directors) and national EMIS (public queries, national directors); little or no feedback ("?") flows back down to schools, while other data requests reach schools directly.]
8. Schools are overburdened by data requests
Duplication of data requests on schools (EXAMPLE)
Number of survey questions asked by the district per quarter:
▪ Questions asked by district tools: 572
▪ Questions with answers already in SA-SAMS submissions: 245
▪ Questions duplicated across the set of surveys: 144
▪ "Net new" questions: 183 (a 68% reduction)
How does this district analyse 500+ questions per quarter per school?
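The arithmetic above can be reproduced directly: start from every question the district's tools ask, then subtract answers SA-SAMS already holds and questions duplicated across surveys. A minimal sketch using the slide's figures (variable names are ours):

```python
# Quarterly survey-question audit for one district (figures from the slide).
asked_by_district_tools = 572     # questions asked by district tools
answered_in_sa_sams = 245         # answers already captured in SA-SAMS submissions
duplicated_across_surveys = 144   # questions repeated across the set of surveys

# "Net new" questions are what remains after removing both kinds of overlap.
net_new = asked_by_district_tools - answered_in_sa_sams - duplicated_across_surveys
reduction = 1 - net_new / asked_by_district_tools

print(net_new)             # 183
print(f"{reduction:.0%}")  # 68%
```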
9. Outside of planning, few management routines use data to guide
decision making
System-wide difficulty in translating collected data into insights and actions
Data-driven decision cycle: create common focus goals → capture data to measure progress → input data into a shareable system → aggregate and analyse data → create actionable outputs → establish skills and routines
Metrics requested at province level:
▪ Learner demographic and ID info
▪ Learner registration and promotion
▪ Learner daily attendance
▪ Learner subjects and timetable
▪ Learner marks (daily, weekly, quarterly)
▪ Learner grade performance and pass quality
▪ Learner scores for standard provincial assessments
▪ Learner scores for standard national assessments
▪ Learner social needs and discipline record
▪ Educator ID, qualifications and assignment
▪ Educator curriculum coverage
▪ Educator development and training
▪ Educator attendance and leave
▪ Educator teaching and learning results and trends*
▪ School ID and contact information
▪ School infrastructure and facilities
▪ School finances
▪ School LTSM ordering and delivery
▪ School improvement plans
▪ School posts, vacancies and appointments
▪ School support services - transport and feeding
▪ School teaching and learning results and trends**
▪ School management meetings
▪ Circuit and district management visits
▪ Province, district, and/or circuit level goals/targets
12. Districts are a critical point of leverage in the school system
Primary levers for improving learner performance
National DBE:
▪ Strategy, policy and direction
▪ Design and delivery of instructional support material
▪ Educational system resourcing
Provincial DoE (9):
▪ Strategy, policy and direction
▪ Human resources and hiring
▪ District and school resourcing
Districts (86):
▪ Instructional and administrative support and tools
▪ Performance management
▪ Quality assurance and compliance
▪ Professional development
▪ School and community engagement
Schools (~27 000):
▪ Local and operational support
▪ Enrolment and progression
▪ Delivery of facilities, instructional material
Classrooms:
▪ Teaching and learning
Performance management improvements at the district level can put learner performance at the forefront of thought in classrooms and provincial offices alike.
Districts are also historically the most ignored and atrophied part of the delivery chain across South Africa.
13. What will it take to establish 86 data-driven school districts?
1 Metrics: agree on a small set of student outcome metrics across grades R-12, aligned at province, district, and school level
2 Action / Routines: script data-driven management routines for districts to support and review schools' performance
3 Outputs / Dashboard / Analyse: put data in the hands of people who can act on it by creating data dashboards that:
• are visually easy to interpret,
• trigger action where it is needed,
• display analyses for root-cause identification,
• are designed into management routines, and
• are user-tested
4 Input and Collection: streamline and strengthen data collection processes to [1] improve accuracy and [2] reduce duplication & burden
5 Identify and build capabilities and mindsets to implement data-driven action
6 Scale nationally, pioneering first across districts that represent a wide range of district contexts, 'to meet districts where they are'
7 Work in close partnership with government
14. We have selected districts of three different types and have begun
work with them to map processes and design a dashboard
National DBE
▪ Limpopo, Waterberg: "Builder" pioneer. Overstretched district in a rural area with a large number of schools per Circuit Manager; limited technology in place; matric results the only real established indicator.
▪ Free State, Thabo Mofutsanyana: "Architect" pioneer. Similarly rural setting, but with more technology infrastructure and support in place; established practice of using data to manage for results.
▪ Gauteng, Ekudibeng: "Experimenter" pioneer. Functioning dashboard developed in Excel, tracking a full range of indicators; management-for-results change process underway for 2 years.
▪ Other 6 provinces: "Twin" districts, one per province. Shadow the process and reality-check the work to ensure generalizability.
SI and Design: Frog Design, Double Line Partners
Training and coaching organizations: New Leaders Foundation
Consulting and operations team: McKinsey
15. A district management system can support the transition to data-driven
performance through a combination of processes, tools and capabilities
Building KPIs and business processes: 60% | Establishing tools and capabilities: 40%
Business processes and tools will be tailored to meet districts' schooling and technology realities
16. Work has started to create visually appealing dashboards for
different district archetypes
22. Participants felt the workshop was an "eye-opener" and it motivated them to improve current practices
▪ "The pilot would be a good tool for us to use in the district"
▪ "My eyes are open now…we are able to read the statistics and see how they relate"
▪ "I feel empowered to change"
▪ "It shows we have been haphazard in the way we have been doing things…this is the start of change"
▪ "It seemed like child's play at first but now we are empowered"
▪ "You can pick up problems through the data"
▪ "It was not boring, it was like playing and now we are empowered"
Source: Waterberg design workshop
23. We have identified a core set of student and school metrics
to guide school performance support and management PRELIMINARY
Indicators (unit of measure; range; archetype)
Learner outputs
Achievement:
▪ ANA, NSC1 (levels, %; 1-7, 0-100; All)
▪ Common Tests2 (levels, %; 1-7, 0-100; Experimenter)
Progression:
▪ Grades 9, 10, 11 pass rates (%; 0-100; All)
▪ Grade 12/Grade 8 learners (%; 0-100; All)
▪ Matric pass rate (%; 0-100; All)
Inputs
Attendance:
▪ Educator attendance (% of days; 0-100; All)
▪ Learner attendance (% of days; 0-100; All)
Curriculum coverage:
▪ Option 1: Common Tests2 (% of curriculum; 0-100; Experimenter)
▪ Option 2: CAPS syllabus (% of curriculum; 0-100; Ekudibeng)
Resourcing:
▪ LTSM (%3; 0-100; Experimenter, Architect)
▪ Vacant educator positions (%; 0-100; All)
1 Annual National Assessment, National Senior Certificate (Matric) 2 Written test, standardised across the region/province, data with/without SBA 3 100% TBD
SOURCE: Team analysis
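One way a dashboard could encode this preliminary indicator set is as a simple catalog keyed by archetype, so each pioneer district sees only the indicators it tracks. This is an illustrative sketch, not the project's actual data model; the indicator names, the `Indicator` type and the `indicators_for` helper are our assumptions:

```python
from dataclasses import dataclass


@dataclass
class Indicator:
    name: str
    unit: str            # e.g. "%", "% of days", "levels 1-7"
    lo: float
    hi: float
    archetypes: tuple    # which district archetypes track this indicator


# A subset of the slide's preliminary metric set, for illustration.
CORE_INDICATORS = [
    Indicator("ANA/NSC achievement", "levels, %", 0, 100, ("All",)),
    Indicator("Matric pass rate", "%", 0, 100, ("All",)),
    Indicator("Educator attendance", "% of days", 0, 100, ("All",)),
    Indicator("Learner attendance", "% of days", 0, 100, ("All",)),
    Indicator("CAPS syllabus coverage", "% of curriculum", 0, 100, ("Ekudibeng",)),
    Indicator("LTSM delivery", "%", 0, 100, ("Experimenter", "Architect")),
]


def indicators_for(archetype):
    """Select the indicator subset a given district archetype would track."""
    return [i for i in CORE_INDICATORS
            if "All" in i.archetypes or archetype in i.archetypes]
```

For example, an "Architect" district would pick up the four universal indicators plus LTSM delivery, while an "Ekudibeng"-style district would add CAPS syllabus coverage instead.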
24. We are developing a baseline which will track core metrics along
with operational improvements and school feedback
Education outcomes
▪ Description: measures of learner achievement and progression at all stages of the basic education system; measures of enablers that contribute to achievement and progression improvements
▪ Rationale: improvements in achievement and progression are the ultimate goal for DBE and all stakeholders; enablers are the leading indicators for performance in achievement and progression
▪ Time to impact: long-term (achievement and progression); medium-term (enablers)
Operational improvements
▪ Description: measures of the effectiveness of data-driven management processes; measures of the efficiency of data processes
▪ Rationale: operational indicators measure the levers that the project will use to drive improvements in education outcomes
▪ Time to impact: short-term
Feedback from schools
▪ Description: measures of the schools' assessment of the quality, frequency and impact of their interactions with the district
▪ Rationale: schools are in the best position to assess the shift from compliance to support and give districts feedback on their service to schools
▪ Time to impact: medium-term
SOURCE: Team analysis
25. PRELIMINARY
Preliminary timeline for pioneering districts
(= current phase) Beyond 3 months, the timeline is subject to revision as we proceed.
3 months, prepare for implementation (all district types):
▪ KPIs
▪ Map of data collection, storage, and use
▪ Design process changes
▪ Develop dashboard prototype
▪ Assess baseline of outputs and district effectiveness
▪ Plan for implementation & roll-out
6 months, begin implementation:
▪ "Experimenters": deliver online dashboard, linked to existing data systems
▪ "Architects": deliver offline dashboard, drawing from existing data structures
▪ "Builders": create local printed training
▪ All district types: implement change process within pioneer districts; deliver training
12 months (all district types):
▪ Work with district, circuit and schools to embed tools and practices
▪ Develop actual dashboard system
▪ Finalize reporting to province and national
▪ Conduct dashboard and report training
▪ Shift output and analysis capabilities to province
▪ Package materials and prepare for next district rollouts
TBC…, national rollout:
▪ DBE leads roll-out
▪ Forge partnerships with provinces to expand funding base
29. Findings suggest that stakeholders face common challenges across all
phases of a data-driven decision-making cycle
Data-driven decision cycle → research themes:
▪ Create shared focus on measurable goals (Assessment): stated goals and incentives do not always line up, resulting in managers prioritising the items that lead to funding or publicity above all else
▪ Capture data to measure progress against goals: schools are overburdened by reporting requirements, which exist in duplicate and triplicate because districts, provinces and national offices do not share data
▪ Input data into a sharable system (Analysis): there is no single source of accurate data for circuits or districts, as each function sources its own data and EMIS remains outside of core conversations
▪ Aggregate and analyse data: despite its promise, SA-SAMS is not being used as a school management tool, and is primarily used by administrative clerks as an "electronic accountability tool"
▪ Create actionable outputs for various users (Action): district, circuit and school personnel do not receive feedback on submitted data, as most data is collected and passed upwards in compliance, with limited desire to inform ground-level actors of relevant data insights
▪ Establish skills and routines to review data and guide action: outside of planning, few management routines use data to guide decision-making; instead most rely on touch and feel and ad hoc efforts to manage crises
* Detailed view of analyses can be found in appendix and supporting documents
30. Our current work in implementation preparation is guided by six
key questions
1. What are the core guiding metrics for data-driven school performance management, and what is the current baseline of performance and practices?
2. What are the data-driven routines that districts should put in place for supporting schools and managing their performance on a regular (termly) basis to drive lasting improvements in education outcomes?
3. How can data systems (inputs, processes and access) be optimised to ensure that accurate and unique inputs result in relevant information that supports data-driven support and performance management?
4. How can school data best be displayed to result in the most effective action by school officials, politicians and other decision makers to improve R-12 education across all districts?
5. How will we build the capabilities and mindsets needed to drive data-driven school support?
6. How do we plan for a successful rollout to all districts in the country?