1. Experiments in Educational Research & Practice
Joseph Jay Williams
josephjaywilliams@gmail.com
www.josephjaywilliams.com
Lytics Lab, Office of the Vice Provost of Online Education, Graduate School of Education, Stanford
2. Experiments in Educational Research & Practice
• Kinds of Experiments
  – Laboratory
  – Design experiment (Brown, 1992)
  – In vivo embedded experiments (Koedinger, Corbett & Perfetti, 2010)
  – Randomized Controlled Trial/Intervention
• Experiments & Internet, Online Education
  – Internet-facilitated Laboratory studies
    • Mechanical Turk (tiny.cc/mechanicalturk)
    • Qualtrics/Rapid authoring tools (cognitivescience.co/qualtrics)
    • “A/B testing” more widespread
  – Iterative Design & Improvement (Amy Collier, Stanford Office of Online Learning)
  – In vivo motivation experiment on Khan Academy (Williams et al., 2013, MOOCshop)
  – Conducting Randomized Trials over the Internet (Paunesku, Greene, Williams)
  – Bridging Studies (Williams et al., 2012; Williams & Poldsam, 2013)
• REPEAT Approach to Experimental Research on Online/Blended Resources
• Identifying a Practical Online Context for your Research Question
  – Partner with platform *early* in research
  – “Customer Development”: Engaging with Products/Teachers/Districts/Curricula Designers (Blank & Reese, 2010)
  – Examples of Blended Learning Resources & MOOCs
  – Examples of resources that support Randomized Trials
    • Go to the source: Empirically test value of instructional prescriptions & professional development (University of Virginia,
  – Interventions extended over time
  – Bridging studies
• Engaging with Practitioners
  – How to communicate value of experiments
  – Consulting
    • Educational companies & publishers (marcy.baughman@pearson.com)
    • Corporate Training & E-Learning (E-learning guild, Devlearn, Bror Saxberg, Will Thalheimer)
    • Medical Education & Health Behavior (Medbiquitous, International Society for Internet Interventions, cognitivescience.co/behaviorchange)
    • Synthesize literature: PSLC Wiki, www.josephjaywilliams.com/education
  – Grant funding (IES, NSF, Gates, Spencer, moocresearch.com)
  – Virtual collaboration on practical resources (www.learnnetwork.net)
• Experiments & …
  – Computer Supported Collaborative Learning (Rose)
  – Cognitive Tutors (Aleven, Ritter, Koedinger)
  – EDM & AIED (Stamper, Gordon, Pardos)
• Conducting Experiments (tiny.cc/conductingexperiments)
7. Integrate Research & Practice
• Randomized assignment
• Experimental Control
• Rich data
• Real-world environment
• Authentic activities
• Practical Challenges
• Generalizable theories
• “in vivo” experiments
• Diverse populations
• Practical improvements
• Disseminate research
• Generate Funding
8. Experiments & Internet, Online Education
– Internet-facilitated Laboratory studies
  • Mechanical Turk (tiny.cc/mechanicalturk)
  • Qualtrics/Rapid authoring tools (cognitivescience.co/qualtrics)
  • “A/B testing” more widespread
– Iterative Design & Improvement (Amy Collier, Stanford Office of Online Learning)
– In vivo motivation experiment on Khan Academy (Williams et al., 2013, MOOCshop)
– Conducting Randomized Trials over the Internet (Paunesku, Greene, Williams)
– Bridging Studies (Williams et al., 2012; Williams & Poldsam, 2013)
9. In-vivo motivation experiment on Khan Academy
• Williams, Paunesku, Haley, & Sohl-Dickstein (MOOCshop, AIED)
• Embed messages in online Khan Academy exercises
• Effect of Growth Mindset beyond encouragement?
10. Implicit beliefs about Intelligence
• On a scale from 1 to 10, how much do you agree that:
  – Your intelligence is something very basic about you that you can’t change very much.
  – No matter how much intelligence you have, you can always change it quite a bit.
• Growth vs. Fixed Mindset (Dweck, 2007)
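The deck does not specify how the two items are scored; as a hypothetical sketch, the fixed-mindset item is typically reverse-scored and averaged with the growth item to yield a single growth-mindset score (function and variable names are illustrative, not from the original instrument):

```python
# Hypothetical scoring of the two implicit-beliefs items (1-10 agreement scale).
# Item 1 is fixed-mindset worded, so it is reverse-scored before averaging.

def growth_mindset_score(item1_fixed: int, item2_growth: int) -> float:
    """Average agreement on a 1-10 scale, reverse-scoring the fixed-mindset item."""
    reversed_item1 = 11 - item1_fixed  # 10 -> 1, 1 -> 10
    return (reversed_item1 + item2_growth) / 2

# A respondent who strongly rejects the fixed statement (2) and strongly
# endorses the growth statement (9) gets a high growth-mindset score.
print(growth_mindset_score(2, 9))  # 9.0
```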
11. Experimentally manipulate added messages
• Practice-as-usual Message (Positive): “Some of these problems are hard. Do your best!”
• Growth Mindset Message: “Remember, the more you practice the smarter you become!”
12. Design
• Practice-as-usual
• Growth Mindset Message
  – “Remember, the more you practice the smarter you become.”
  – “Mistakes help you learn. Think hard to learn from them.”
• Positive Message
  – “Some of these problems are hard. Just do your best.”
  – “This might be a tough problem, but we know you can do it.”
• 50,000+ students per condition
• Dependent measures:
  – Number of problems completed
  – Number correct on first attempt
  – Accuracy
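The design above — uniform random assignment of each learner to a message condition, then per-condition dependent measures — can be sketched as follows. This is an illustrative toy, not Khan Academy's actual pipeline; condition names and log format are assumptions:

```python
# Sketch of randomized assignment and per-condition accuracy tallies.
import random
from collections import defaultdict

CONDITIONS = ["practice-as-usual", "growth-mindset", "positive"]

def assign_condition(rng: random.Random) -> str:
    """Uniform random assignment to one of the three message conditions."""
    return rng.choice(CONDITIONS)

def summarize(logs):
    """logs: list of (condition, problems_attempted, correct_first_try) per learner.
    Returns per-condition accuracy = correct on first attempt / problems attempted."""
    stats = defaultdict(lambda: {"attempted": 0, "correct": 0})
    for cond, attempted, correct in logs:
        stats[cond]["attempted"] += attempted
        stats[cond]["correct"] += correct
    return {c: s["correct"] / s["attempted"]
            for c, s in stats.items() if s["attempted"]}

rng = random.Random(0)  # seeded for reproducible assignment
logs = [(assign_condition(rng), 10, 7), (assign_condition(rng), 8, 5)]
print(summarize(logs))
```

At real scale (50,000+ students per condition) random assignment makes the groups comparable, so condition differences on the dependent measures can be attributed to the messages.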
13. R.E.P.E.A.T. Framework for Online Education Research
• Realistic
• Experimental
• Product
• Evaluated
• Accessible
• Theoretically motivated
• REPEAT – iteratively improve through revision & collaboration
14. Research & Practice in Online Education
• APS symposium: Williams, Saxberg, Means, Mitros. (2013). Online Learning and Psychological Science: Opportunities to integrate research and practice.
• Williams, J. J., Renkl, A., Koedinger, K., Stamper, J. (2013). Online Education: A Unique Opportunity for Cognitive Scientists to Integrate Research and Practice. In M. Knauff, M. Pauen, N. Sebanz, & I. Wachsmuth (Eds.), Proceedings of the 35th Annual Conference of the Cognitive Science Society.
16. Identifying a Practical Online Context for your Research Question
– Partner with platform *early* in research
– “Customer Development”: Engaging with Products/Teachers/Districts/Curricula Designers (Blank & Reese, 2010)
– Examples of Blended Learning Resources & MOOCs
– Examples of resources that support Randomized Trials
  • Go to the source: Empirically test value of instructional prescriptions & professional development (University of Virginia,
– Interventions extended over time
– Bridging studies
17. Engaging with Practitioners
– How to communicate value of experiments
– Consulting
  • Educational companies & publishers (marcy.baughman@pearson.com)
  • Corporate Training & E-Learning (E-learning guild, Devlearn, Bror Saxberg, Will Thalheimer)
  • Medical Education & Health Behavior (Medbiquitous, International Society for Internet Interventions, cognitivescience.co/behaviorchange)
  • Synthesize literature: PSLC Wiki, www.josephjaywilliams.com/education
– Grant funding (IES, NSF, Gates, Spencer, moocresearch.com)
– Virtual collaboration on practical resources (www.learnnetwork.net)
Editor's notes
Reframing Education to teachers – it’s about INVESTIGATION. Evidence-based practice.
*Teacher buy-in, teacher CONTRIBUTION. Teacher modification.
Unnatural things researchers are doing… vs. teachers doing things anyway, trying it out. Researchers are being involved.
“Experiment”. It’s a comparison of instruction… (less the theory, more about its effectiveness).
Personalization.
Bridging studies. It’s on a case-by-case basis. What’s similar, what’s different. Still pretty diverse. Population demographics. Motivational issues – being paid.
Are your assumptions correct? Just having to IMPLEMENT it concretely is valuable, as a DESIGN exercise.
Cons: I AM CHOOSING just the right contexts. Most people start with others.
> Can you adapt your research question to make it easier to run in a Bridging Study?
Interventions: Short & Fat. Long & Skinny.
Features. Heuristics. Guiding principles. Context of how this fits in the larger context.