Learning Analytics & the Changing Landscape of Higher Education
1. @NYU_LEARN
Learning Analytics &
the Changing Landscape
of Higher Education
Alyssa Friend Wise, PhD
Associate Professor of Educational Technology and Director, NYU-LEARN
@alywise
2. References 1
E-Listening Analytics
Marbouti, F. & Wise, A. F. (2016). Starburst: A new graphical interface to support productive engagement with
others’ posts in online discussions. Educational Technology Research & Development, 64(1), 87-113.
Wise, A. F., Vytasek, J. M., Hausknecht, S. N. & Zhao, Y. (2016). Developing learning analytics design knowledge
in the “middle space”: The student tuning model and align design framework for learning analytics use. Online
Learning, 20(2), 1-28.
Wise, A. F., Hausknecht, S. N. & Zhao, Y. (2014). Attending to others' posts in asynchronous discussions:
Learners' online "listening" and its relationship to speaking. International Journal of Computer-Supported
Collaborative Learning, 9(2), 185-209.
Wise, A. F., Perera, N., Hsiao, Y., Speer, J. & Marbouti, F. (2012). Microanalytic case studies of individual
participation patterns in an asynchronous online discussion in an undergraduate blended course. Internet and
Higher Education, 15(2), 108–117.
Wise, A. F., Speer, J., Marbouti, F. & Hsiao, Y. (2013). Broadening the notion of participation in online
discussions: Examining patterns in learners' online listening behaviors. Instructional Science, 41(2), 323-343.
Wise, A. F., Zhao, Y., & Hausknecht, S. (2014). Learning analytics for online discussions: Embedded and
extracted approaches. Journal of Learning Analytics, 1(2), 48-71.
MOOC Discussion Analytics
Wise, A. F., & Cui, Y. (2018). Learning communities in the crowd: Characteristics of content related interactions
and social relationships in MOOC discussion forums. Computers & Education, 122, 221-242.
Wise, A. F., Cui, Y., Jin, W. Q. & Vytasek, J. M. (2017). Mining for gold: Identifying content-related MOOC
discussion threads across domains through linguistic modeling. Internet and Higher Education, 32, 11-28.
3. References 2
Analytics Design & Use in Practice
Sarmiento, J.P., Campos, F., & Wise, A. F. (2020). Engaging students as co-designers of learning
analytics. Proceedings of the 10th International Conference on Learning Analytics and Knowledge.
Wise, A. F., & Jung, Y. (2019). Implications of instructor analytics use patterns for the design of actionable
educational data visualizations. Proceedings of the 9th International Conference on Learning Analytics and
Knowledge, 689-696.
Wise, A. F., & Jung, Y. (2019). Teaching with analytics: Towards a situated model of instructional decision-
making. Journal of Learning Analytics, 6(2), 53-69.
Reflection Analytics
Cui, Y., Wise, A. F. & Allen, K. A. (2019). Developing reflection analytics for health professions education:
Aligning critical concepts with data features. Computers in Human Behavior, 100, 305-32.
Jung, Y. & Wise, A. F. (2020). How and how well do dental students reflect?: Viability of multi-dimensional
automated reflection assessment in health professions education. Proceedings of the 10th International
Conference on Learning Analytics and Knowledge, 595-604.
Wise, A. F. & Cui, Y. (2019). Top concept networks of professional education reflections. Proceedings of the 9th
International Conference on Learning Analytics and Knowledge, 260-264.
Wise, A. F. & Reza, S. (2020). Becoming a dentist: Tracing professional identity development through mixed
methods data mining of student reflections. Proceedings of the 15th International Conference of the Learning
Sciences.
4. Learning Analytics (LA):
the use of data science methods
to generate insights into teaching and learning
that lead to direct, impactful action
6. LA can generate insight…
… for Designers into how course activities are used
… for Instructors into where more support is needed
… for Students into their own learning practices
… for Advisors into which students to reach out to
… for Administrators into how the curriculum aligns
Each supports data-informed decision-making
7. Data-informed decisions are more important than ever with the ‘shift to digital’
• Instructors don’t have access to many of the classroom-based cues they usually rely on
• Students often struggle with the self-regulation needed (and don’t have access to model peers)
• There are new opportunities to access information that wasn’t easily available in physical spaces
8. Survey of 200 students this Spring
[Chart: Course Experience Rating, 1-5 scale, for Course (Before COVID), Course (Remote), and Previous Online Courses]
[Chart: Change in Course Experience Rating from Before COVID to Remote, shown separately for Graduate and Undergrad students]
9. Survey of 200 students this Spring
[Chart: Course (Remote) Experience Rating, 1-5 scale, across three phases: COVID Onset, Transition Period, Steady State]
11. Image Credit: UGA College of Agriculture and Environmental Sciences via Flickr (CC BY 2.0), adapted
12. Image Credit: UGA College of Agriculture and Environmental Sciences via Flickr (CC BY 2.0), adapted
17. DATA: THE 3 A’S (Wise, 2019; Hoppe, 2015)
DEMOGRAPHICS (who people are)
PERFORMANCE (how they’ve done)
ACTIVITY (things people do): log-files, physical tracks, self-report
ARTIFACT (things people create): problem answers, written explanations
ASSOCIATION (connections people make): who and what they interact with
19. THE KEY IS TO CONNECT LOW-LEVEL BEHAVIORS WITH HIGH-LEVEL CONSTRUCTS
Pre-class video (Sinha et al., 2014)
20. THE KEY IS TO CONNECT LOW-LEVEL BEHAVIORS WITH HIGH-LEVEL CONSTRUCTS
Pre-class video (Sinha et al., 2014)

| Raw Clicks | Aggregate Feature | Critical Concept | Info Processing |
|---|---|---|---|
| Play, SeekFwd, ScrollFwd, RateFast | Skipping | Disengaged | Low |
| Play, Pause, SeekBw, SeekFwd | Checkback | Searching for specific info | Med |
| Play, Pause, SeekBw | Rewatch | Reviewing content | High |
| Play, Pause, SeekBw, ScrollBw | Clarify Idea | Tussling with content | Very High |
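To make the mapping concrete, here is a minimal Python sketch of rolling raw click events up into the aggregate features in the table. The matching rules are illustrative stand-ins of my own, not the actual clickstream model of Sinha et al. (2014).

```python
# Minimal sketch: map one session's raw video click events to an
# aggregate feature. Rules are illustrative, not the published model.

def classify_session(events):
    """events: ordered click types, e.g. ["Play", "SeekFwd", "RateFast"]."""
    s = set(events)
    if {"SeekFwd", "ScrollFwd"} & s and "SeekBw" not in s:
        return "Skipping"       # forward-only movement -> disengaged
    if {"SeekBw", "SeekFwd"} <= s:
        return "Checkback"      # jumping back and forth -> searching for info
    if "ScrollBw" in s:
        return "Clarify Idea"   # fine-grained backward scrubbing -> tussling
    if "SeekBw" in s:
        return "Rewatch"        # returning to earlier content -> reviewing
    return "Linear Watch"       # fallback: straight play-through

print(classify_session(["Play", "SeekFwd", "ScrollFwd", "RateFast"]))  # Skipping
print(classify_session(["Play", "Pause", "SeekBw"]))                   # Rewatch
```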
25. THEORY GIVES GUIDANCE (Wise & Shaffer, 2015)
• Meaningful variables to construct + include
• Potential confounds, subgroups, or covariates
• Which results to attend to, what they may mean, where they may generalize to
• How to take action based on outcomes
26. ONLINE DISCUSSION LEARNING MODEL
• Social constructivist perspective: online discussions as a forum for learning through conversation
• Students learn as they articulate their ideas, are exposed to the ideas of others, and negotiate differences in perspective
• Focus on how students contribute comments (“speak”), externalizing their ideas by contributing posts, and attend to others’ messages (“listen”), taking in the externalizations of others by accessing existing posts
27. ONLINE DISCUSSION LEARNING METRICS

| Criteria | Metric | Definition |
|---|---|---|
| Listening Breadth | Percent of posts viewed | Number of unique posts a student viewed divided by the total number of posts in the discussion |
| Listening Breadth | Percent of posts read | Number of unique posts a student read divided by the total number of posts in the discussion |
| Listening Depth | Percent of real reads | Number of times a student read a post divided by the total number of times they read and viewed posts |
| Listening Reflectivity | Number of reviews of own / others’ posts | Number of times a student revisited posts they had made / viewed previously in the discussion |
| Conversational Distribution | Posts made / viewed throughout discussion | Dispersion or concentration of posts made / viewed by a student in the discussion space |
| Speaking Quantity | Number of posts | Total number of posts a student contributed to the discussion |
| Speaking Quantity | Average post length | Total number of words posted by a student divided by the number of posts they made to the discussion |
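As an illustration, here is a minimal sketch of computing the breadth and depth metrics from view logs, assuming each log row records how long a student had a post open, with an invented duration threshold separating a quick “view” from a genuine “read” (the threshold value is an assumption, not the published cutoff):

```python
# Sketch: listening breadth/depth metrics from one student's view log.
READ_THRESHOLD = 10  # seconds; illustrative cutoff between a view and a read

def listening_metrics(log, total_posts):
    """log: list of (post_id, seconds_open) events for one student."""
    viewed = {post for post, _ in log}  # any open counts as a view
    read = {post for post, secs in log if secs >= READ_THRESHOLD}
    n_reads = sum(1 for _, secs in log if secs >= READ_THRESHOLD)
    return {
        "pct_posts_viewed": len(viewed) / total_posts,
        "pct_posts_read": len(read) / total_posts,
        "pct_real_reads": n_reads / len(log) if log else 0.0,  # reads / (reads + views)
    }

log = [("p1", 3), ("p2", 25), ("p2", 40), ("p3", 2)]
print(listening_metrics(log, total_posts=10))
# {'pct_posts_viewed': 0.3, 'pct_posts_read': 0.1, 'pct_real_reads': 0.5}
```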
29. ONLINE DISCUSSION LEARNING INSIGHTS
Listening Reflectivity
• Reviewing others’ posts multiple times predicts greater responsiveness
Listening Depth
• A greater % of real reads predicts richer argumentation
(Informed) Listening Breadth
• Reading a greater % of posts, and viewing a greater % of posts than those read, predicts richer argumentation
31. EXTRACTED LISTENING ANALYTICS

| Metric | Your Data (Week X) | Class Average (Week X) | Observations |
|---|---|---|---|
| Range of participation | 4 days | 5 days | |
| # of sessions | 6 | 13 | |
| Average session length | 33 min | 48 min | |
| % of sessions with posts | 67% | 49% | |
| # of posts made | 8 | 12 | |
| Average post length | 386 words | 125 words | |
| % of posts read | 42% | 87% | |
| # of reviews of own posts | 22 | 13 | |
| # of reviews of others’ posts | 8 | 112 | |
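A small sketch of how such a weekly extract might be assembled for one student; the metric names come from the table above, but the data structures and layout are hypothetical, not the actual E-Listening tool:

```python
# Sketch: render one student's weekly metrics next to the class average.
def weekly_report(student, class_avg):
    lines = [f"{'Metric':<28}{'You':>12}{'Class Avg':>12}"]
    for metric, value in student.items():
        lines.append(f"{metric:<28}{value:>12}{class_avg[metric]:>12}")
    return "\n".join(lines)

student = {"# of posts made": 8, "% of posts read": "42%"}
class_avg = {"# of posts made": 12, "% of posts read": "87%"}
print(weekly_report(student, class_avg))
```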
32. IMPACT OVER TIME
“I found that I wanted the challenge of trying to up the % of overall posts
that I reviewed each week. This also meant slowing down my reading since
the data would not record a quick read of the information. The overall result
was that I think I learned more and was able to get a broader sense of
opinion concerning the readings.”
34. Teaching with Analytics
[Model diagram]
Sense-Making
• Area of Curiosity → Question Generation
• Interpret Data: read the data (get oriented / focused attention; find absolute & relative reference points) and explain the pattern (triangulate; contextualize; make attribution)
• Affective processes run throughout
Pedagogical Response
• Wait-and-See, Reflect on Pedagogy, or Take Action (whole-class scaffolding, targeted scaffolding, revise course design)
• Check Impact
Wise, A. F., & Jung, Y. (2019). Teaching with analytics: Towards a situated model of instructional decision-making. Journal of Learning Analytics, 6(2), 53-69.
35. Pedagogical Questions → Data-Based Answers → Educational Actions

| Pedagogical Question | Data-Based Answer | Educational Action |
|---|---|---|
| Are my students preparing? | Student interaction grid | Emphasize important resources |
| Which of my students need help? | Predictive modelling | Offer help in a targeted manner |
| How are my students thinking about STEM? | Concept network examination | Evaluate / update curricular design |
39. Who Are My Students Engaging With?
[Network diagrams comparing two simulated extremes with the actual class]
Always Same People: Avg Degree = 3, Modularity = .81
Always Different People: Avg Degree = 10, Modularity = .14
My Actual Class: Avg Degree = 9, Modularity = .27
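For readers who want to reproduce these two statistics on their own discussion data, a minimal sketch using networkx; the toy edge list (who replied to whom) is invented:

```python
# Sketch: average degree and modularity of a student interaction graph.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities, modularity

G = nx.Graph([("a", "b"), ("b", "c"), ("c", "a"),   # one tight cluster
              ("d", "e"), ("e", "f"), ("f", "d")])  # another tight cluster

avg_degree = sum(d for _, d in G.degree()) / G.number_of_nodes()
communities = greedy_modularity_communities(G)

print(f"Avg Degree = {avg_degree:.1f}")                  # 2.0
print(f"Modularity = {modularity(G, communities):.2f}")  # 0.50: two separate cliques
```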
41. Top Feature Distribution by Category
[Chart: number of top-20 linguistic features per category for content-related vs. non-content-related discussion posts, across five courses: StatMed’13, StatMed’14, StatLearn, YBW, PSY. Feature categories: Course Subject, Learning Process, Question Words, Connectors, Existence/Condition, Course Tasks, Quality/Quantity, Effort/Action, People, Appreciation/Greeting]
Content-related posts are marked by question words + connectors, e.g. “can” “does” “why” “how” “which” “and” “of” “than” “is”
Non-content-related posts are marked by course tasks + people + appreciation/greetings, e.g. “answer” “exam” “course” “lecture” “thank” “good” “I” “my”
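A toy sketch of the underlying idea: score a post by how many cue words from each category it contains. The word lists (taken from the examples above) and the decision rule are illustrative stand-ins for the full linguistic model of Wise, Cui, Jin & Vytasek (2017):

```python
# Sketch: label a discussion post by counting category cue words.
CONTENT_CUES = {"can", "does", "why", "how", "which", "and", "of", "than", "is"}
NONCONTENT_CUES = {"answer", "exam", "course", "lecture", "thank", "good", "i", "my"}

def label_post(text):
    words = text.lower().split()
    content = sum(w in CONTENT_CUES for w in words)
    noncontent = sum(w in NONCONTENT_CUES for w in words)
    return "content-related" if content >= noncontent else "non-content-related"

print(label_post("Why does the estimator stay biased when n is large"))  # content-related
print(label_post("Thank you good lecture I enjoyed my course"))          # non-content-related
```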
42. How am I Facilitating Interaction?
Dr. Wang (Degree = 4.4, Weight = 2.2)
• Responses at all levels
• Coaching and supporting
• Social presence cues
e.g. “That is correct - Nice! So how would you use this to solve the question?”
Dr. Smith (Degree = 2.8, Weight = 1.8)
• Responses only to the top level
• Straightforward answers
• Little social presence
e.g. “A bell shape is not necessary. You could have a bimodal distribution”
Circle diagrams from Brooks, Greer & Gutwin (2014)
43. Which of My Students Will Need Help?
Introductory Calculus as a Challenging Gateway Course
[Predictive model diagram: SAT Math Score, Diagnostic Test (indicates weak algebra skills), High School GPA, and Race / Gender predicting Drop / Fail / Withdraw or Grade < C+; model enhanced with first-week activity data]
But a bigger challenge than building the model is leveraging the information into effective action
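As a hedged illustration of this kind of early-warning model, here is a minimal logistic regression on invented data. The features loosely follow the diagram plus the first-week activity enhancement (the demographic inputs are omitted here); this is a sketch, not the actual NYU model:

```python
# Sketch: predict DFW risk (Drop/Fail/Withdraw or grade < C+) from
# incoming measures plus first-week activity. All data are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

# columns: [SAT math, diagnostic test, HS GPA, first-week logins]
X = np.array([[520, 55, 3.1, 2],
              [700, 90, 3.9, 9],
              [600, 60, 3.4, 1],
              [650, 85, 3.7, 8]])
y = np.array([1, 0, 1, 0])  # 1 = DFW / below C+

model = LogisticRegression(max_iter=1000).fit(X, y)
print(model.predict_proba([[580, 58, 3.2, 3]])[0, 1])  # estimated risk for a new student
```

In practice features on such different scales would be standardized first; the point here is only the shape of the pipeline.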
44. How Are My Students Thinking About…?
Becoming a Dentist
45. How Are My Students Thinking About…?
Becoming a Dentist
[Concept networks from student reflections at three time points: Start of D1, Start of D3, End of D4]
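A minimal sketch, in the spirit of the concept-network analyses above: link concepts that co-occur within the same reflection and weight edges by co-occurrence counts. The concept list and reflections are invented examples, not the study’s coding scheme:

```python
# Sketch: build a weighted concept co-occurrence network from reflections.
from itertools import combinations
from collections import Counter

CONCEPTS = {"patient", "empathy", "skill", "feedback", "identity"}

reflections = [
    "today i built rapport with my patient and practiced empathy",
    "the feedback on my clinical skill shaped my identity as a dentist",
]

edges = Counter()
for text in reflections:
    present = sorted(CONCEPTS & set(text.split()))  # concepts in this reflection
    edges.update(combinations(present, 2))          # one edge per concept pair

for (a, b), weight in edges.items():
    print(f"{a} -- {b} (weight {weight})")
```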
50. Image Credit: Dakotilla via Flickr (CC BY 2.0)
How will analytics change our
relationship with our students and
their relationship to the (active)
learning process?
How can data inform (w/o dictating)
our pedagogical decisions?
How can our pedagogical decisions
generate better data?
What dangers must we be on the
watch for?
Learning analytics is a powerful machinery that
creates new products of data that change our
relationship to teaching and learning
51. Image Credit: Dakotilla via Flickr (CC BY 2.0)
Ownership, Consent & Choice
Agency & Context
Transparency & Accountability
Privacy & Surveillance
Equity, Bias & Fairness
The Right to Be Forgotten
The Creation of New Labels
Learning analytics is a powerful machinery that
creates new products of data that change our
relationship to teaching and learning
52. Learning Analytics at NYU
NYU Learning Analytics is a collaborative effort, focused on
community change that puts people, not data, first
Key Characteristics
• Partnerships between IT, faculty, administrators and students
• User-Centered Design involving stakeholders from the start
• Scalability to serve a large university with 10+ global
campuses and a diverse international student body
• Research to innovate and build a knowledge base for data-
informed teaching and learning in higher education
54. Big thanks to the core LEARN Team
Alyssa Wise
Yeonji Jung, Sameen Reza, JP Sarmiento
Sophia Lu, Eunyoung Jeon, Trang Tran, Sophie Sommer
Fabio Campos, Ofer Chen
Yoav Bergner, Xavier Ochoa, Susana Toro
Yu Wang, Qiujie Li
55. And our amazing partners at NYU-IT
Ben Maddox, Chief Instructional Technology Officer
Jason Korenkiewicz, Director of Instructional Technology Tools & Services
Elizabeth McAlpin, Project Director of Research & Outcomes Assessment
Andrew Brackett, Learning Analytics Specialist
Robert Egan, eLearning Specialist
56. As well as the many members of the larger LEARN community
across NYU who work with us on our diverse set of projects
Arts & Sciences
Selin Kalaycioglu
Lucy Appert
Tyrell Davis
Business
Kristen Sosulski
Ben Bowman
Sean Diaz
Marian Tes
Daniel de Valk
Libraries
Andrew Battista
Denis Rubin
SPS
Victoria Axelrod
Student Success
John Burdick
Dental
Kenneth Allen
57. @NYU_LEARN
Learning Analytics &
the Changing Landscape
of Higher Education
Alyssa Friend Wise, PhD
Associate Professor of Educational Technology and Director, NYU-LEARN
@alywise
Editor's Notes
(1) The model we conceptualized consists of a two-part structure with multiple phases: first sense-making, then pedagogical response.
(2) The sense-making process starts from instructors’ general area of curiosity about their class. This curiosity can then be developed into specific questions, such as where exactly students are in their progress.
(3) Instructors then start interpreting the data. We conceptualize this process as two activities: reading the data and explaining the patterns. When they begin data interpretation, instructors try to identify meaningful patterns by reading the data.
(4) In this activity, instructors get oriented to the overall visualizations or pay focused attention to a specific piece of information.
(5) Instructors may also find and use reference points for comparison. These can be absolute (e.g. do at least 80% of students engage with the provided course materials?) or relative (e.g. does student engagement change over time during the course?).
(6) Instructors then extend the meaning of the patterns they identified by explaining or questioning their implications for the class. We conceptualize three activities through which instructors explain the patterns’ meaning.
(7) First, instructors often try to triangulate the patterns with additional information (e.g. class observation) to confirm their interpretation.
(8) When this supports the interpretation, instructors may use their contextual knowledge of the course and students to explain what the results imply for their class.
(9) When explaining the results, instructors often attributed them to varied sources. Looking at the same data showing low student engagement, some instructors attributed it to their own teaching, others to the students, and still others to the course schedule or curriculum. This process can lead instructors to question the analytics results and hesitate to take action.
(10) In addition to cognitive processing of patterns, data interpretation can provoke instructors’ affective responses such as surprise, disappointment, or joy.
(11) Pedagogical questions can be revised or generated after looking at the data.
Log-files: LMS, classroom response systems (clickers), homework sets, online textbooks
Physical space: card logs, wired classrooms
A great way, when approaching a new situation, to think about what might be collected
Levels of specificity: a learning model, but not so specific to the type of discussion that it can’t transfer; some built-in flexibility for transfer across situations.
High overall student buy-in to the guidelines / metrics; it was difficult to isolate the two as students seemed to think of them together
Students interpreted metrics in terms of the guidelines
Students described using the guidelines and metrics to decide how to participate
Students found goal-setting valuable as motivation to improve; they used multiple strategies, drew on the metrics, and tried to adjust their behaviors
Validation and surprises: emotional reactions. No major “big brother” issues
Involuntary propensity to target the average
High student self-awareness of whether they were meeting goals
Having an audience for the journal mattered
Negotiation and contextualization of analytics - students explained choices, strategies, struggles
Instructor responses seen as supportive, providing guidance to help students move towards goals
Does this challenge agency? Some tensions…
(“One thing I try to do is actually do it – go into the discussions when I had time to really actually think about things as opposed to just, you know, read them, check, read them, check.”) - purposefulness