On Thursday, February 14, from 9:30 a.m. to 12:00 p.m., the Office of Academic Innovation hosted our first Data Showcase: an event for all University of Michigan (U-M) community members to take a tour through the data that power our work.
3. My job, Year 1: Wrangle the data
[Diagram: the data landscape, spanning vendor data (Coursera, edX) and homegrown tools (ART 2.0, ECoach, Problem Roulette, GradeCraft, Tandem, Sage, ViewPoint), plus Michigan Online, revenue, events, process & bandwidth, and collaborators; each source is mapped as data or data about AI.]
4. My job, Year 2: Create an excellent environment to support interdisciplinary research
[Diagram: Coursera and edX feed the Online Learning Data Warehouse (OLDW), alongside the Student Data Warehouse.]
Collaboration between IQ & AI
● Build community awareness of Academic Innovation datasets
● Identify and address blockers to research
● Establish ongoing research partnerships to ensure we’re fulfilling the promise of these innovations
7. ● Data and information have overtaken knowledge and truth in English-language usage.
● Data and information are more synonymous with each other than either is with knowledge or truth.
9. Universities are largely responsible for designing and enabling the IoT
cf. this month’s Academic Innovation offering
But universities have been reluctant to apply an IoT approach to people. Why?
● IoP ≠ IoT because people ≠ things.
● The “Business of Learning” is ill-defined; what exactly are we optimizing?
● Who wants to look like Facebook? Data ownership and rights are evolving and often unclear. (Who “owns” grades?)
10. My IoP projects with Academic Innovation...
Academic Reporting Tools (ART 2.0)
Mission:
● promote deeper knowledge of the University of Michigan’s curricular history within the campus community, and, in so doing,
● support exploration, discovery, and decision making by U-M students, faculty, and staff.
Problem Roulette
Mission:
● provide equal opportunity for all students to acquire competency through practice testing and distributed practice.
Common themes: Access, Transparency
Common approach: Iterative development with community input
14. Community input: ART 2.0 Steering Team 2015-17
● 18 members across
○ 5 colleges
○ Student Life
○ Registrar
○ Center for Research on Learning and Teaching
○ Central Student Government
○ Central IT
● Bi-weekly, one-hour meetings during the Fall and Winter terms
● Team members guide development and serve as communicators with their constituencies
Academic Reporting Tools 2.0
15. Simple design: multiple decks of cards, each with relevant descriptive statistics for every
● course
● instructor
● major
● student
...
(Card image: https://legendsplayingcards.com/)
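The card decks above amount to grouped descriptive statistics: one card per course, instructor, major, or student. A minimal sketch of the idea, using invented records and field names rather than ART 2.0’s actual schema:

```python
# Sketch of ART 2.0-style "cards": one deck per entity type, each card a
# dict of descriptive statistics. Records and field names are hypothetical.
from collections import defaultdict
from statistics import mean, median

enrollments = [
    {"course": "PHYS 140", "instructor": "Smith", "grade": 3.3},
    {"course": "PHYS 140", "instructor": "Smith", "grade": 2.7},
    {"course": "ENGL 125", "instructor": "Jones", "grade": 4.0},
    {"course": "ENGL 125", "instructor": "Jones", "grade": 3.7},
    {"course": "ENGL 125", "instructor": "Lee",   "grade": 3.0},
]

def build_deck(records, entity):
    """Return one 'card' per distinct value of `entity`."""
    grades = defaultdict(list)
    for r in records:
        grades[r[entity]].append(r["grade"])
    return {
        key: {"students": len(g),
              "median_grade": median(g),
              "mean_grade": round(mean(g), 2)}
        for key, g in grades.items()
    }

course_deck = build_deck(enrollments, "course")
# e.g. course_deck["ENGL 125"] -> {'students': 3, 'median_grade': 3.7, 'mean_grade': 3.57}
```

The same `build_deck` call with `entity="instructor"` or `entity="major"` would yield the other decks.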
20. A majority of students on campus have used ART 2.0
Opportunity: Understand impact on student choices and outcomes.
21. Opportunities we’re engaged with
● Cornell implementation
● Connect majors to career outcomes (data sources...)
● Personalization (student cards)
– support exploration for intellectual breadth, disciplinary depth
– simplify (ONE CLICK!!) registration process
– proximity to credentials tool
– support new forms of Official Transcript
● Magnify functionality for faculty, staff, administrators
– advisory group: LSA, CoE, Ross, Ford, Stamps, SEAS, +
– challenge: multiple players in this space
● Institutionalizing the service
– shared ITS-RO-AI
– design & implement effective, sustainable governance
26. A review of learning techniques in the educational psychology literature finds practice testing and distributed practice (learning partitioned into multiple sessions) to be the only two techniques with high utility.
Problem Roulette supports both practice testing and distributed practice modalities.
32. Opportunity for ITS + AI + Colleges
Embrace their roles as centralizing forces for academic information and services by nurturing alliances and supporting communities of practice within and without U-M. Partnerships and long-term governance models for robust services are key.
33. Building the future:
Infrastructure for Innovation
Ben Hayward
Associate Director for Software Development & User Experience
Office of Academic Innovation
hayward@umich.edu
34. Academic Innovation | Definitions & Translation
Application programming interface (API)
● A set of functions and procedures allowing the creation of applications that access the features or data of an operating system, application, or other service.
Data Warehouse (DW)
● In computing, a data warehouse is a system used for reporting and data analysis, and is considered a core component of business intelligence. DWs are central repositories of integrated data from one or more disparate sources. They store current and historical data in a single place and are used for creating analytical reports for workers throughout the enterprise.
Data Integration
● Data integration involves combining data residing in different sources and providing users with a unified view of them. This
process becomes significant in a variety of situations, which include both commercial and scientific domains.
Abstraction
● In software engineering and computer science, abstraction is the process of constructing generalized concepts by retaining the features or attributes common to various concrete objects or systems of study.
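The last definition is easiest to see in code. A minimal illustration of abstraction over data sources, where the classes and their hard-coded grades are hypothetical stand-ins for real services:

```python
# One generalized interface keeps the feature common to several concrete
# systems (here: "can yield grades"). The source classes are illustrative,
# not actual Academic Innovation services.
from abc import ABC, abstractmethod

class GradeSource(ABC):
    """Abstract concept-object: anything that can yield grades."""
    @abstractmethod
    def grades(self) -> list[float]: ...

class CanvasSource(GradeSource):
    def grades(self) -> list[float]:
        return [88.0, 92.5]          # would come from the Canvas API

class MoodleSource(GradeSource):
    def grades(self) -> list[float]:
        return [75.0, 81.0]          # would come from the Moodle API

# Callers depend only on the abstraction, never on a concrete system.
all_grades = [g for src in (CanvasSource(), MoodleSource()) for g in src.grades()]
# all_grades -> [88.0, 92.5, 75.0, 81.0]
```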
35. Academic Innovation | A Model For Problem Solving
OUR PROCESS: four-month iterative cycles
[Diagram: Faculty Partners & Innovators work with Academic Innovation (AI developers, UX designers, behavioral scientists, and data scientists) on product development & iteration across application, data, and research; outside partnerships are sought to grow each product.]
36. Academic Innovation | Applications Transforming Education
Three categories: Personalized Learning at Scale, Technology for Innovative Pedagogy, Tools for Online Learning
● ECoach: personalized messaging to students
● ART: academic data to help make choices
● Sage: resources and reflection for student mental health
● Problem Roulette: practice problems for exam preparation
● ViewPoint: role-playing simulations
● GradeCraft: gameful pedagogy for learning
● Tandem: supporting productive and equitable group work
● Michigan Online: making our elite public research university’s learning experiences accessible at scale
● Online Learning Tools: expanding the capabilities of online learning
37. Academic Innovation | Where is the data?
Sources: Student Information System, edX, Coursera, Canvas, 3rd Party
● Information about our students’ backgrounds, performance, course load, field of study, etc.
● Information about our courses’ instructors, enrollment, assignment structure, grades
● Information about our degrees’ populations, sequencing, pathways
● Information about our students’ study habits, interests
38. Academic Innovation | Data Storage to Data Service
Transforming and mapping data into actionable services for research and
applications.
● UM Institutional Data Service
○ Data Source: Academic Innovation Data Warehouse
● Online Learning Data Service
○ Data Source: Online Learning Data Warehouse
● Unizin Data Service
○ Data Source: Unizin Data Warehouse
● API Network
○ Data Source: Canvas, Qualtrics, Google, Problem Roulette, EECS Autograder, Moodle, etc...
39. Academic Innovation | Data Storage to Data Service
[Diagram: data origins flow into warehouses and then into Data Services via the API Network; Student Record and 3rd Party data feed the AI DW, edX and Coursera feed the Online Learning DW, and Canvas feeds the Unizin DW.]
40. Academic Innovation | Data Service to Data Integrators
[Diagram: Data Services (API Network, AI DW, Unizin DW, Online Learning DW) feed the Data Integrators: Institutional Data, Online Learning Data, Grade Data, and Behavior Data.]
41. Academic Innovation | Data Integration
Don’t assume the source, create the format.
● Grade Data Integrator: A structure for Gradebooks, Grading Schemes,
Assignments, Assignment Categories, Submissions, etc...
○ Services: Unizin DW, Moodle API, Canvas API
● Behavior Data Integrator: A structure for behavioral categories, instances
and affiliated user actions
○ Services: Problem Roulette API, EECS Autograder API, Course.Work API,
Canvas API
● Institutional Data Integrator: A structure for representing terms, degrees, majors, courses and students
○ Services for UM, Cornell
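The “create the format” principle can be sketched as a source-neutral structure plus one adapter per service. The field names and the raw Canvas-like payload below are illustrative assumptions, not the actual Academic Innovation schema:

```python
# A source-neutral Grade Data structure that adapters from Unizin DW,
# Moodle, or Canvas all map into. Field names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Submission:
    student_id: str
    score: float

@dataclass
class Assignment:
    name: str
    category: str
    points_possible: float
    submissions: list[Submission] = field(default_factory=list)

@dataclass
class Gradebook:
    course_id: str
    grading_scheme: str
    assignments: list[Assignment] = field(default_factory=list)

def from_canvas(raw: dict) -> Gradebook:
    """One adapter per service; each emits the same neutral format."""
    return Gradebook(
        course_id=raw["course"]["id"],
        grading_scheme=raw["course"].get("grading_standard", "points"),
        assignments=[
            Assignment(a["name"],
                       a.get("group", "uncategorized"),
                       a["points_possible"],
                       [Submission(s["user_id"], s["score"])
                        for s in a["submissions"]])
            for a in raw["assignments"]
        ],
    )

# a made-up payload in the shape a Canvas adapter might receive
raw = {"course": {"id": "EECS545"},
       "assignments": [{"name": "HW1", "points_possible": 100,
                        "submissions": [{"user_id": "u1", "score": 95.0}]}]}
gradebook = from_canvas(raw)
```

A `from_moodle` or `from_unizin` adapter would return the same `Gradebook` type, so downstream research code never needs to know which service the grades came from.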
45. Infrastructure for Innovation: The Data Ecosystem
[Diagram: data origins (Student Information System and 3rd Party; edX and Coursera; Canvas) feed data services (AI DW and API Network; Online Learning DW; Unizin DW), which feed data integrators (Institutional Data, Behavior Data, Grade Data; Online Learning Data), which power abstracted technologies (Michigan Tailoring System (MTS), Event Tracking, data.ai, Randomization Engine) and, ultimately, the applications for Personalized Learning at Scale, Technology for Innovative Pedagogy, and Tools for Online Learning.]
46. Using data to visualize MOOC
design and pedagogy
Dr. Rebecca M. Quintana
Learning Experience Design Lead
Office of Academic Innovation
Yuanru Tan, Noni Korf
48. MOOCs
One issue for learning design teams is grasping the overall course structure without the aid of a mediational tool.
Quintana, Tan, Gabriele, & Korf, 2018
49. Beads!
We used beads to
represent the structure
of Massive Open Online
Courses (MOOCs) as a
mediational tool with a
MOOC design team.
Quintana, Tan, Gabriele, & Korf, 2018
50. A: Section heading
B: 10-minute lecture video
C: 10-minute interview video
D: Textual guide
E: Reflection activity
F: Course reading
G: External resources
H: Sub-heading
I: Lecture > 10 mins
J: Interview > 10 mins
K: Visual guide
L: Discussion forum
M: Team-work activity
N: Quiz
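Given the legend above, a course’s structure reduces to an ordered string of bead codes. A minimal sketch, with a made-up week of course elements:

```python
# Encode a course outline with the bead legend (A = section heading,
# B = 10-minute lecture video, ..., N = quiz). The sample outline is invented.
LEGEND = {
    "section_heading": "A", "lecture_10min": "B", "interview_10min": "C",
    "textual_guide": "D", "reflection": "E", "reading": "F",
    "external_resource": "G", "subheading": "H", "lecture_long": "I",
    "interview_long": "J", "visual_guide": "K", "discussion_forum": "L",
    "teamwork": "M", "quiz": "N",
}

def encode(outline: list[str]) -> str:
    """Turn an ordered list of element types into a bead string."""
    return "".join(LEGEND[element] for element in outline)

week_one = ["section_heading", "lecture_10min", "reading",
            "discussion_forum", "quiz"]
# encode(week_one) -> "ABFLN"
```

Strung together per week, such codes give the same at-a-glance view of pacing and variety that the physical beads do.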
51. We wanted to provide opportunities for course designers to
examine a familiar phenomenon through an uncommon
medium, provoking curiosity and exploration
Focus group
● Beaded representations of 5 MOOCs
● School of Education MicroMasters courses
● Professor, course designers, managers, builders
How can beaded representations of online course
structure lead to insights that could impact learner
experience?
What might be the value of eliciting insight among
design team members?
53. CCDs
CCDs, AKA “course composition diagrams,” are interactive digital representations that depict the structure of a MOOC (i.e., content types, sequence of elements).
Quintana, Tan, & Korf, 2018 (best paper award, OTL SIG, AERA); Seaton, 2016
55. We wanted to create opportunities for reflection by course
design team members, to offer a better understanding of the
impact of design choices
Online open-ended survey
● CCDs of 10 MOOCs launched in previous year
● Professor, course designers, managers, builders
● Inductive, qualitative analysis
What do course composition diagrams reveal/obscure about the design of a MOOC?
How, if at all, do course composition diagrams allow course design teams to reflect on the impact of their design choices?
56. What do course composition diagrams reveal about the design of a MOOC?
● Bird’s eye view
● Quantitative aspects
● Relational aspects of course elements
● Differences among course elements
Analysis also revealed semantic connections to the visual language of design (e.g., balance, variety, repetition, pattern, rhythm, emphasis, and movement).
Callout: “Easily understood”
What do course composition diagrams obscure about the design of a MOOC?
Callout: “So simple, it ceases to be useful”
Reflection on Design
● Opportunities for comparison
● Congruence with perception
● Confirmation of design choices
● Questioning design choices
57. Characterizing MOOC
Pedagogies
Visual methods are now
part of our set of
tools, which allow us to
understand and
characterize the
underlying pedagogies of
MOOCs
Quintana & Tan, 2019
Swan et al.’s Assessing MOOC Pedagogies framework, applied to Course Composition Diagrams. Each dimension is rated 1 to 5:
● Epistemology: 1 = Objectivist, 5 = Constructivist
● Role of teacher: 1 = Teacher-centered, 5 = Student-centered
● Focus of activities: 1 = Convergent, 5 = Divergent
● Structure: 1 = Less structure, 5 = More structure
● Approach to content: 1 = Concrete, 5 = Abstract
● Feedback: 1 = Infrequent, unclear; 5 = Frequent, constructive
● Cooperative learning: 1 = Unsupported, 5 = Integral
● Accommodation of individual difference: 1 = Unsupported, 5 = Multi-faceted
● Activities/assessments: 1 = Artificial, 5 = Authentic
● User role: 1 = Passive, 5 = Generative
58. Cluster 1: Applied Data Science with Python 1, 3, 4, 5
Cluster 2: Mindware, Model Thinking, Internet History, Intro to Thermodynamics
Cluster 3: Sampling People, AIDS, Cataract Surgery
Cluster 4: Instructional Methods, Graduate Study, Learning for Equity
Cluster 5: Act on Climate, Applied Data Science with Python 2
Cluster 6: Clinical Skills, Successful Negotiation
Cluster 7: Science of Success, Digital Democracy
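One way such clusters could be derived is by measuring distance between courses’ rating vectors on the ten Swan et al. scales and grouping nearby courses. The ratings below are invented and the single-link grouping is an assumption; the slides do not specify the actual clustering method:

```python
# Toy clustering of courses rated on ten 1-5 pedagogy scales: Euclidean
# distance between rating vectors, then simple single-link grouping under
# a threshold. All ratings here are fabricated for illustration.
from math import dist

ratings = {  # ten dimensions, objectivist..passive, each 1-5
    "Model Thinking":   [2, 2, 2, 4, 3, 3, 1, 2, 2, 2],
    "Internet History": [2, 2, 2, 4, 3, 3, 1, 2, 2, 2],
    "Act on Climate":   [5, 5, 4, 2, 3, 4, 5, 3, 5, 5],
}

def group(courses, threshold):
    """Join a course to the first cluster containing a course within
    `threshold` distance; otherwise start a new cluster."""
    clusters = []
    for name, vec in courses.items():
        home = next((c for c in clusters
                     if any(dist(vec, courses[m]) <= threshold for m in c)),
                    None)
        if home is not None:
            home.add(name)
        else:
            clusters.append({name})
    return clusters

found = group(ratings, threshold=3.0)
# the two similar courses group together; Act on Climate stands alone
```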
59. Student mental health at Michigan:
what we know, what we don't know,
and what we can do
Dr. Meghan Duffy
Professor of Ecology and Evolutionary Biology, LS&A
Faculty Innovator in Residence, Office of Academic Innovation
61. Student mental health: what we know
● Many Michigan students have MH diagnoses:
○ Depression (25%), Generalized Anxiety (18%), Social Anxiety (8%),
ADHD (7%), OCD (3%)
● 44% of undergrads & 41% of grad students reported that
mental or emotional difficulties affected their academic
performance in the past 4 weeks
Sources: CAPS College Student Mental Health Survey; Eisenberg et al. 2007
62. Student mental health: what we know
● In Intro STEM courses:
○ 23% of students reported a previous diagnosis of a depressive disorder
and 25% reported a previous diagnosis of an anxiety disorder.
○ First generation and LGBTQ+ students had significantly higher scores on
the PHQ-8 (depression) and GAD-7 (anxiety) screeners.
○ Most students were aware of at least some on campus mental health
resources.
Source: Morgan Rondinelli Honors Thesis
63. Student mental health: what we know
● Recent survey of US economics grad students:
○ 18% currently experience moderate to severe symptoms of anxiety
○ 25% have a mental health diagnosis
○ 11% reported suicidal thoughts on at least several days in the past two weeks
● MH influences performance & increases likelihood of
leaving
Source: Barreira et al. working paper, Healthy Minds Study
64. Student mental health: what we don’t know
● What data are we already collecting that could give us
insights into student mental health and well-being?
65. Student mental health: what we don’t know
● What is the phenology of student well-being? (4Q Project)
Wikipedia: J.hagelüken
66. Student mental health: what we don’t know
● What are some easy changes that could improve
well-being?
UMich College Sleep Disorders Clinic
Dr. Shelley
Hershner
67. Student mental health: what we can do
● Wellness playbook: wellness coaching at scale
○ Model: ECoach’s Exam Playbook
○ Goal: encourage students to:
■ reflect on why wellness is important to them
■ plan for how to improve well-being,
■ connect with resources
69. Partnership opportunities
● Phenology/4Q Project needs:
○ courses/student populations to run in
○ to link with existing data (e.g., Canvas usage), would need data
scientist/analyst
● Small changes: sleep
○ Need instructors!
● Wellness playbook
○ in development, open to input!
● Grad student mental health
○ in planning phase
Interested? Contact: duffymeg@umich.edu
70. Understanding global learners
through billions of lines of
clickstream data
Dr. Christopher Brooks
Research Assistant Professor, School of Information
Director of Learning Analytics & Research
Office of Academic Innovation
brooksch@umich.edu @cab938
71. Motivation
My research is in learning analytics and educational data science
I’m specifically interested in understanding scaled learning experiences, like Massive Open
Online Courses, and global learning populations through a mixture of observational,
experimental, and computational methods
My lab, the educational technology collective (etc.),
is made up of students, postdocs, and
collaborators from a breadth of disciplinary and
scholarly backgrounds
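Clickstream logs of this kind are typically line-delimited events that get aggregated per learner before any analysis. A minimal sketch of that first wrangling step, with an invented event schema (not the actual Coursera or edX format):

```python
# Count events per (learner, event type) from a line-delimited JSON log.
# The field names "user" and "type" are assumptions for illustration.
import json
from collections import Counter

log_lines = [
    '{"user": "u1", "type": "play_video"}',
    '{"user": "u1", "type": "pause_video"}',
    '{"user": "u2", "type": "play_video"}',
    '{"user": "u1", "type": "play_video"}',
]

counts = Counter(
    (event["user"], event["type"])
    for event in map(json.loads, log_lines)
)
# counts[("u1", "play_video")] -> 2
```

At billions of lines the same aggregation would run over a stream or a warehouse query rather than an in-memory list, but the shape of the computation is the same.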
72. Part 1: Scaled Learning
How has the MOOC population changed since the early days of the phenomenon (2012)?
Strong implications for researchers as well as
instructional designers and educational technologists
Used a quantitative approach looking at how discourse
and language are changing in forums
Nia Dowell
(UM Postdoc)
Dowell, N. M., Brooks, C., Kovanović, V., Joksimović, S., & Gašević, D. (2017, April). The Changing Patterns of MOOC Discourse. In Proceedings of the Fourth (2017) ACM Conference on Learning @ Scale (pp. 283-286). ACM.
76. Peer Review and Written Feedback
How do peers review short written works from students of different
socioeconomic groups? Previous work has explored bias in evaluation, we
are interested in bias in qualities of responses.
Heeryung Choi
(PhD Student)
77. Predicting Student Success
There has been an explosion of interest in predicting student success over the last decade, both in MOOCs and in on-campus higher education. It is now a core part of the Learning Analytics (LAK) and Educational Data Mining (EDM) conferences.
Both computationally and educationally interesting!
Lots of different reasons to predict success:
- understanding the determinants of success
- changing outcomes for all/some students
- administratively practical (it scales)
Craig Thompson
(PhD Student, usask)
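A stripped-down version of such a predictor: logistic regression on simple activity features, trained by gradient descent. The data are synthetic and the features far simpler than those used in the published work:

```python
# Toy success predictor: logistic regression on activity counts,
# trained by stochastic gradient descent on an invented dataset.
from math import exp

# (events in week 1, events in week 2) -> completed the course?
X = [(40, 35), (5, 0), (25, 30), (2, 1), (50, 45), (8, 3)]
y = [1, 0, 1, 0, 1, 0]

w, b, lr = [0.0, 0.0], 0.0, 0.01

def predict(x):
    """Probability of completion under the current weights."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 / (1 + exp(-z))

for _ in range(2000):                 # stochastic gradient descent
    for x, target in zip(X, y):
        err = predict(x) - target     # gradient of log loss w.r.t. z
        for i in range(len(w)):
            w[i] -= lr * err * x[i]
        b -= lr * err

accuracy = sum((predict(x) > 0.5) == bool(t) for x, t in zip(X, y)) / len(X)
```

Real studies add many more features, regularization, and a held-out test set; evaluating on the training data, as here, is only for illustration.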
78. C. Brooks, C. Thompson, S. Teasley. (2015) A Time Series Interaction Analysis Method for Building Predictive Models of Learners using Log Data. 5th International Conference on Learning Analytics and Knowledge 2015 (LAK'15).
C. Brooks, C. Thompson, S. Teasley. (2015) Who You Are or What You Do: Comparing the Predictive Power of Demographics vs. Activity Patterns in Massive Open Online Courses (MOOCs). The Second Annual Conference on Learning at Scale 2015 (L@S 2015), Works in Progress track.
79. Frustrations
There are dozens of papers on predictive modeling in MOOCs, and each uses different:
a. Feature engineering methods
b. Training methods
c. Modeling methods and
hyperparameters
d. Training and evaluation data
e. Predictive outcomes
Comparison of features/models/parameters
is impossible. Replication of results is impossible.
Josh Gardner
(Washington)
80. W. Li, C. Brooks, F. Schaub (2019). The Impact of Student Opt-Out on
Educational Predictive Models. 9th International Conference on Learning
Analytics and Knowledge (LAK19). March, 2019. Tempe, AZ.
Educational Predictive Model Biases
Where does bias come from?
- Data collection practices and social inequalities
- Population changes over time
- Opt outs, right to be forgotten, FERPA, PIPEDA, GDPR
Warren Li
(PhD Student,
Michigan)
Florian Schaub
(Faculty,
Michigan)
82. J. Gardner, C. Brooks, R. Baker (2019). Evaluating the Fairness of Predictive Student Models Through Slicing Analysis. 9th International Conference on Learning Analytics and Knowledge (LAK19). March, 2019. Tempe, AZ.
Ryan Baker (Faculty, Penn)
Josh Gardner (PhD Student, Washington)
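Slicing analysis can be sketched as computing one evaluation metric per demographic slice from a single model’s predictions. The scores, labels, and two slices below are fabricated for illustration:

```python
# Per-slice AUC: evaluate one model's scores separately on each demographic
# group. A gap between slices signals unfair predictive performance.
def auc(scores, labels):
    """Probability a random positive outranks a random negative (ties = 0.5)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# fabricated model scores, true outcomes, and slice membership
scores = [0.9, 0.8, 0.7, 0.6, 0.55, 0.4, 0.3, 0.2]
labels = [1,   1,   0,   0,   0,    1,   0,   0]
group  = ["A", "B", "A", "B", "A",  "B", "A", "B"]

by_slice = {
    g: auc([s for s, gg in zip(scores, group) if gg == g],
           [y for y, gg in zip(labels, group) if gg == g])
    for g in ("A", "B")
}
# here slice A is ranked perfectly while slice B is not
```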
83. Personalization and Inclusion
There are several reasons inclusion is interesting to study in MOOCs:
1. The population isn’t as WEIRD (western, educated, industrialized, rich, democratic)
2. Multiple motivations for learning: interest, edutainment, job skills, social integration
3. There is learning beyond the immediate (e.g. higher ed): lifelong learning in a semi-structured environment
4. A/B testing is baked into the platform
Rene Kizilcec
(Cornell)
Kizilcec, R. and Brooks, C. (2017). Diverse Big Data and Randomized Field Experiments in Massive Open Online Courses. In Lang, C., Siemens, G., Wise, A. F., and Gašević, D., editors, The Handbook of Learning Analytics, pages 211–222. Society for Learning Analytics Research (SoLAR), 1st edition.
84. Situational Video Cues and Activity
Based in part on Cheryan et al. (2009) looking at interest in pursuing computer science by female
students.
Pre-registered a set of hypotheses at OSF:
1. Primary: Retention in the female condition will be higher for women, but retention in the
female condition will be no different for men (between conditions)
2. Secondary: (a) completion (b) achievement (c) forum participation and (d) certificate
participation of women will be higher in the female condition
86. Results
No difference in achievement or dropout for the two populations (women and men; n~23k each) when compared across conditions within population.
But there was a difference in discourse amount (though not in prevalence of discourse). (Similar results were found for quantity of interaction (clickstreams).)
C. Brooks, J. Gardner, Kaifeng Chen (2018)
How Gender Cues in Educational Video
Impact Participation and Retention.
Festival of Learning, June, 2018. London
UK. Full Crossover Paper.
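A comparison like the one above can be sketched as a two-proportion z-test on retention counts between conditions within one population. The counts below are invented; only the ~23k-per-group scale comes from the slide:

```python
# Two-proportion z-test: is retention in condition A different from B?
from math import sqrt, erf

def two_proportion_z(success_a, n_a, success_b, n_b):
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # two-sided p-value from the normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# e.g. 11,600 of 23,000 retained vs 11,450 of 23,000 retained (invented)
z, p = two_proportion_z(11_600, 23_000, 11_450, 23_000)
# p > 0.05 here, consistent with a "no difference" finding at this scale
```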
87. In my research group we’ve looked specifically at MOOC trends broadly, predictive models for student
success, and inclusion and personalization.
The data the University of Michigan has on MOOC learners, and the flexibility of our platforms, have
made this a fertile area for understanding global learners
Quick Conclusions
Christopher Brooks, School of Information, University of Michigan
brooksch@umich.edu http://edtech.labs.si.umich.edu
88. What do students value about
learning online, and how can this
impact program design?
Sarah Dysart
Director of Online & Hybrid Degrees
Office of Academic Innovation
sdysart@umich.edu @SarahDysart
89. What learners are we trying to reach?
Underrepresented learners
Career changers/advancers
Non-traditional learners
● Students who delay enrollment by a year or more
● Having dependents other than a spouse
● Being a single parent
● Working full time while enrolled
● Being financially independent
● Attending part time
91. … but what about ...
Synchronous class sessions
Synchronous office hours
On-campus orientations
On-campus engagements/residencies
Field placement requirements
95. Important Factors that Drive Enrollment Decisions
What are the most important factors in your decision about which school to enroll in for an online program? [Selected top three]
(All students)
● Tuition & Fees: 34%
● Reputation of the Program: 13%
● Reputation of the School: 11%
● Home Location of the School: 11%
● Quality of Faculty: 6%
● The School Offers Multiple Study Formats: 6%
● The School Reflects my Values: 6%
● Alumni Achievements: 3%
Magda, A. J., & Aslanian, C. B. (2018). Online college students 2018: Comprehensive data on demands and preferences. Louisville, KY: The Learning House, Inc.
96. We need to better
understand what
online students value.
(and how that differs across groups, and why)
97. Where do we start? (Um, where do we get the data, Sarah?)
99. Expectancy-Value Theory
[Diagram: Expectancy for Success and Subjective Task Value jointly drive Achievement-Related Choices, Engagement, and Persistence.]
Wigfield, A., & Eccles, J. S. (2000). Expectancy–Value Theory of Achievement Motivation. Contemporary Educational Psychology, 25(1), 68–81.
https://doi.org/10.1006/ceps.1999.1015
100. Specifically: Values → Enrollment Choices
Schunk, D. H., Pintrich, P. R., & Meece, J. L. (2014). Motivation in education: theory, research, and applications (4th ed.).
Upper Saddle River, N.J: Pearson/Merrill Prentice Hall.
101. Subjective Task Value
● Interest-Enjoyment Value
● Attainment Value
● Utility Value
Relative Cost
● Financial cost ($$$$$)
● Task Effort Cost
● Outside Effort Costs
● Loss of Valued Alternatives
● Emotional Cost
Flake, J. K., Barron, K. E., Hulleman, C., McCoach, B. D., & Welsh, M. E. (2015). Measuring cost: The forgotten component of expectancy-value theory.
Contemporary Educational Psychology, 41, 232–244. https://doi.org/10.1016/j.cedpsych.2015.03.002
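The expectancy-value relationship behind these slides is often summarized as motivation being proportional to expectancy times net task value. A toy operationalization, with invented weights and scores; the cited papers use validated survey scales, not this arithmetic:

```python
# Toy expectancy-value computation: expectancy x (mean value - mean cost),
# using the value and cost components listed above. All numbers are invented.
def motivation(expectancy, value, cost):
    """All inputs on a 0-1 scale; higher result = stronger motivation."""
    net_value = sum(value.values()) / len(value) - sum(cost.values()) / len(cost)
    return expectancy * net_value

online_ms = motivation(
    expectancy=0.8,
    value={"interest": 0.7, "attainment": 0.9, "utility": 0.8},
    cost={"financial": 0.6, "task_effort": 0.5, "outside_effort": 0.4,
          "alternatives": 0.3, "emotional": 0.2},
)
# online_ms -> 0.8 * (0.8 - 0.4) = 0.32
```

Even this toy form makes the slide’s point concrete: two learners with identical task value can differ sharply in enrollment motivation once their cost components differ.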
102. Learner Populations
Our enrolled students are those for whom certain costs are less of an issue
As we begin to develop program portfolios, we can turn to our learner communities in the open environment to measure the value components associated with various program characteristics (e.g., cost of program, synchronous requirements, on-campus commitments).
We can also leverage our relationships with peers whose program characteristics differ from ours.
103. In short…
We don’t have this data yet, but I think we can get there.
The data can give us a starting point for understanding why motivation to enroll in
programs may differ across demographic groups and subject areas
104. Thank you!
The Team at Academic Innovation
academicinnovation@umich.edu @UMichiganAI