Efficacy and how to improve
learner outcomes
EAQUALS Conference
April 25, 2014, Budapest
Dr. Adam Black
Efficacy and Research, Pearson (Professional)
Overview
• Defining Efficacy
• Building a path to Efficacy
• Efficacy Reviews: a framework and
lessons learned
• Efficacy Studies: holistic measures
of impact
• Efficacy Analytics: global trends
• Do an Efficacy Review yourself
• Q&A
Context:
what are we trying to improve
and why now?
Context
"No program can be evaluated properly without a common understanding of what it's supposed to achieve.
An unfortunate consequence of treating purposes casually is a tendency to accept goals that seem important in theory without pausing to consider whether it is possible to achieve them within the time available."
Our Underachieving Colleges
Derek Bok (former President, Harvard University)
"I have been struck by how important measurement is to improving the human condition.
You can achieve incredible progress if you set a clear goal and find a measure that will drive progress toward that goal—in a feedback loop."
Bill Gates, Jan 2013
Context
Why now?
• There is a shared understanding that
high-quality education drives
personal, economic and societal
growth
• Governments, individuals, employers
and institutions recognise the need to
deliver high-quality learning
• New technology makes it increasingly
possible to see what works and what
doesn’t in helping learners to achieve
their goals
What do we mean by efficacy?
A measurable impact on
learner outcomes
efficacy (dictionary definition)
• ability to produce the intended
result
Efficacy (Pearson’s definition)
• make a measurable impact on
learner outcomes
efficiency (dictionary definition)
• achieve maximum productivity
with minimum wasted effort
Improving Efficacy:
how are we going about it?
Pearson’s path to efficacy: three integrated
activities for improving learner outcomes
research → design and development → piloting → full deployment → customer use
Efficacy Analytics: mine data from products
to gain insights into iterative improvements,
learner behaviours and future innovation
Efficacy Reviews: predict
likelihood of impacting learner
outcomes and plan improvements
Efficacy Studies: learn from
long-term holistic studies of
outcomes
An Efficacy Framework:
how likely is it that your
project will successfully
improve learner outcomes?
An Efficacy Framework: likelihood of impact
Each criteria area is given a rating and a rationale summary, and the areas roll up into an overall Efficacy rating:
• Outcomes: intended outcomes; overall design; value for money
• Evidence: comprehensiveness of evidence; quality of evidence; application of evidence
• Planning and implementation: action plan; governance; monitoring and reporting
• Capacity to deliver: internal capacity and culture; user capacity and culture; stakeholder relationships
An Efficacy Framework: an explanation of ratings
Good – requires slight refinement, but on track
Mixed – some aspects are solid, some require attention
Problematic – requires substantial attention; some aspects need urgent rectification
Off-track – requires urgent action and problem solving
Ratings are not grades on performance
Ratings prompt discussions that lead to actions
Ratings prioritise and suggest a timeline
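For readers who want to run a review themselves, here is a minimal sketch, in Python, of how the criteria areas and the four-level ratings above could be recorded and rolled up. The area names, criteria, and rating labels come from the framework; the class names, example ratings, and the "weakest criterion wins" roll-up rule are illustrative assumptions, not Pearson's tooling.

```python
from dataclasses import dataclass
from enum import IntEnum

class Rating(IntEnum):
    """The four rating levels of the framework, ordered from worst to best."""
    OFF_TRACK = 0     # requires urgent action and problem solving
    PROBLEMATIC = 1   # requires substantial attention
    MIXED = 2         # some aspects solid, some require attention
    GOOD = 3          # requires slight refinement, but on track

@dataclass
class Criterion:
    name: str
    rating: Rating
    rationale: str    # short rationale summary, as in the framework table

# Criteria grouped by area, mirroring the framework (the ratings here are invented).
review = {
    "Outcomes": [
        Criterion("Intended outcomes", Rating.MIXED, "Outcomes defined but targets not yet set"),
        Criterion("Overall design", Rating.GOOD, "Design adapted from user feedback"),
        Criterion("Value for money", Rating.PROBLEMATIC, "Benefits not yet evidenced"),
    ],
    # "Evidence", "Planning and implementation", "Capacity to deliver" take the same shape.
}

# One possible roll-up: the weakest criterion sets the area rating, so problem
# areas surface for discussion rather than being averaged away.
area_ratings = {area: min(c.rating for c in crits) for area, crits in review.items()}
overall_efficacy = min(area_ratings.values())
print({a: r.name for a, r in area_ratings.items()}, overall_efficacy.name)
```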
An Efficacy Framework: a deep-dive on outcomes
Overall design
• Is the product designed in a way that will most
effectively help your target group reach their goals?
• Does the design allow you to automatically collect
evidence of your progress?
• Have you adapted the design based on feedback
from users?
• Could the design be used by others?
Value for money
• Do you understand the benefits of your
product/service to your target group? Relative to
other options?
• Is the cost of the product/service competitive,
considering the benefits it would deliver?
Intended outcomes
• Have you identified specific outcomes for your target
group?
• Do you have a way to measure the intended
outcomes?
• Do you have ambitious and measurable targets in
place, and deadlines for achieving them?
• Are your intended outcomes clearly documented and
understood by your team and customers?
Example of green rating:
• All outcomes are specific and
clearly documented.
• People within and outside my
organisation understand the
intended outcomes and can
communicate them clearly.
• Future targets are ambitious
and achievable.
• Outcomes can be regularly
measured against set targets.
• Design is superior to other
options/competitors with
features focused on
delivering outcomes.
• Real-time evidence is
generated.
• The design can be adapted
and developed.
• Others could use this design,
and it has been shared with
them.
• Feedback/research has
identified what benefits the
product/service needs to
deliver to users.
• Feedback and return-on-
investment research shows
that the cost of the
product/service reflects the
benefits.
Example of red rating:
• Outcomes are not documented
or specific.
• People within and outside my
organisation do not
understand the intended
outcomes or communicate
them in the same way.
• Targets do not exist to
measure outcomes against.
• Outcomes are only defined at
a high level.
• No feedback from users
(formal or informal), and
benefits of using the
product/service are unclear
to our team and users.
• Perceptions of value for
money and user experience
are poor.
• The design does not meet
target group expectations
and is difficult to use.
• The design does not reflect
intended outcomes.
• The design does not allow
for the collection of
feedback.
• The design is specific to a
local situation and cannot be
replicated.
An Efficacy Framework: in action
Review of evidence
• Strategy papers
• Customer feedback
• Audits
• Progression research
• Policy briefs
Internal interviews
• Sales
• Strategy
• Marketing
• Planning
• Executive leadership
Customer and
stakeholder interviews
• Government bodies
• Universities
• Potential employers
• Associations
Efficacy workshop
Outputs:
• Assessment of current efficacy
• Actions needed to enhance efficacy
Highly collaborative and focused
on improvement opportunities
An Efficacy Framework: driving improvement
For each framework area the review records an initial rating plus 3-month and 6-month estimates, with a comment on the expected progress:
• Outcomes (intended outcomes, overall design, value for money): after 6 months, outcomes and metrics will be clear and will influence design; value for money will be tested in pilots.
• Evidence (comprehensiveness, quality and application of evidence): after 6 months, the plan to develop the forward evidence base will be finalised and initiated.
• Planning and implementation (action plan, governance, monitoring and reporting): after 6 months, long-term plans and reporting structures will be in place and governance agreed; reporting will be at an early stage.
• Capacity to deliver (Pearson capacity and culture, customer capacity and culture, stakeholder relationships): after 6 months, capacity issues will be clear, pilots delivered, and lessons learned and applied; stakeholder relationship plans will be launched and gathering feedback.
An Efficacy framework: lessons learned
1. You won’t improve a learner outcome you
can’t define clearly!
2. You can’t demonstrate you’re improving a
learner outcome if you’re not measuring it!
3. Appropriate learner outcomes vary by age,
stage, and situation (school, college,
private language school, corporation)
4. To improve learner outcomes, stakeholders
must be aligned to the same goals (tutors,
administrators, education authorities, etc.)
Efficacy Studies:
holistic studies of outcomes
Efficacy Studies: holistic, long-term studies with
specific learners, teachers, and institutions
Efficacy (Learning) Analytics:
learning from Big and Small data
Efficacy Analytics: insights into learning behaviours
• Identify common learner difficulties
• Personalise learning
• Optimise learning by L1
• Research learner behaviours that lead to success (machine learning)
• Improve learner engagement (activity design)
• Predict learners who will fail, for early intervention (predictive algorithms)
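As one illustration of the kind of product-data mining listed above, the sketch below aggregates item-level responses by the learner's first language (L1) to surface common difficulties. The data frame, column names, and values are invented for the example; they do not come from any Pearson dataset.

```python
import pandas as pd

# Hypothetical item-level responses mined from a learning product.
responses = pd.DataFrame({
    "learner_id": [1, 1, 2, 2, 3, 3],
    "l1":         ["es", "es", "tr", "tr", "es", "tr"],
    "item":       ["present perfect", "articles", "present perfect",
                   "articles", "articles", "present perfect"],
    "correct":    [0, 1, 0, 0, 1, 1],
})

# Error rate per item, split by first language: high cells point to common
# difficulties worth addressing through activity design or L1-specific support.
error_rates = (1 - responses.groupby(["l1", "item"])["correct"].mean()).unstack()
print(error_rates.round(2))
```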
Efficacy Analytics: identifying learners at risk
[Figure: net score plotted against responses submitted over the course for two students with the same final net score. The student who will succeed shows a smooth trajectory (fractal dimension 1.60); the student who will fail or not complete shows a noisy trajectory (fractal dimension 1.94). A fractal alert prompts the teacher and learner to intervene.]
Patent awarded 2013
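The slide above gives only the resulting fractal dimensions (about 1.60 for the smooth, successful pattern and 1.94 for the noisy, at-risk one), not how they are computed. As a rough sketch of the idea, the code below estimates a fractal dimension for a learner's running net-score series using Higuchi's method, a common estimator for 1-D series, and raises a flag when the estimate crosses an illustrative threshold. Both the choice of estimator and the threshold are assumptions for illustration, not the patented method.

```python
import numpy as np

def higuchi_fd(series, k_max=10):
    """Estimate the fractal dimension of a 1-D series with Higuchi's method."""
    x = np.asarray(series, dtype=float)
    n = len(x)
    ks, l_k = [], []
    for k in range(1, k_max + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, n, k)          # sub-series X[m], X[m+k], X[m+2k], ...
            if len(idx) < 2:
                continue
            # normalised curve length of this sub-series
            l_mk = np.sum(np.abs(np.diff(x[idx]))) * (n - 1) / ((len(idx) - 1) * k) / k
            lengths.append(l_mk)
        if lengths:
            ks.append(k)
            l_k.append(np.mean(lengths))
    # L(k) ~ k**(-D), so D is the slope of log L(k) against log(1/k)
    slope, _ = np.polyfit(np.log(1.0 / np.array(ks)), np.log(l_k), 1)
    return slope

AT_RISK_THRESHOLD = 1.8   # illustrative cut-off between ~1.60 (smooth) and ~1.94 (noisy)

def check_learner(net_scores):
    """net_scores: running net score after each response submitted by the student."""
    d = higuchi_fd(net_scores)
    return d, d > AT_RISK_THRESHOLD   # (estimate, alert teacher and learner?)
```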
Want to do an
Efficacy Review yourself?
http://efficacy.pearson.com
Efficacy framework: try it yourself
Want more?
Questions and answers…
Contact: adam.black@pearson.com