
Blackboard Learning Analytics Research Update


Blackboard’s data science team conducts large-scale analysis of the relationship between the use of our academic technologies and student impact, in order to inform product design, disseminate effective practices, and advance the base of empirical research in educational technologies.

In this presentation, John Whitmer, Director of Analytics & Research, discusses findings from 2016. Some findings challenge conventional wisdom, while others confirm what we believed to be true.

Archived presentation made to JISC Learning Analytics workgroup on Feb 22, 2017

Published in: Data & Analytics


  1. Learning Analytics Research Findings Update | “Dr. John” Whitmer, Director, Analytics and Research | JISC Remote Presentation | 2.22.2017
  2. 1. Learning Analytics Overview & Bb Data Science; 2. Major Findings in 2016: (1) Variation in LMS “Effectiveness”, (2) Tool Use, (3) Course Categories, (4) Student Perceptions of Data Dashboards (time permitting); 3. Blackboard Analytics Solutions Portfolio; 4. Discussion
  3. Learning Analytics Overview
  4. Educational Technology Assessment Hierarchy: Does it impact student learning? (Learning Analytics) | How many people use it? (Adoption) | Does it work? (SLAs)
  5. What is Learning Analytics? “...measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.” (Learning and Knowledge Analytics Conference, 2011)
  6. Meta-questions driving our Learning Analytics research @ Blackboard: 1. How is student/faculty use of Bb platforms (e.g. Learn, Collab, etc.) related to student achievement? [or satisfaction, or risk, or …] 2. Do these findings apply equally to students ‘at promise’ due to their academic achievement or background characteristics? (e.g. race, class, family education, geography) 3. What data elements, feature sets, and functionality can we create to integrate these findings into Bb products to help faculty improve student achievement?
  7. Main Big Data Sources & Techniques. Techniques: • Simulation: if X, what Y? (“With this Ultra Learning Analytics trigger rule, how many students would be notified?”) • Hypothesis testing: investigate whether a specific relationship holds (“What’s the relationship between time spent in a course and student grade?”) • Data mining: analyze underlying latent patterns in data (“What typical patterns in tool use characterize Bb Learn courses?”) Key Data Sources: • Learn Managed Hosting • Learn SaaS • Collaborate Ultra
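A minimal sketch of the simulation technique described above, in Python. The activity table, its column names, and the 10-minute threshold are illustrative assumptions, not Blackboard's actual schema or trigger rule:

```python
# Apply a hypothetical trigger rule to activity data and count how many
# students would be notified. All values below are invented for illustration.
import pandas as pd

activity = pd.DataFrame({
    "student_id": [101, 102, 103, 104, 105],
    "minutes_week1": [0, 4, 45, 120, 8],
})

THRESHOLD_MINUTES = 10  # hypothetical rule: notify under 10 minutes of activity
would_notify = activity["minutes_week1"] < THRESHOLD_MINUTES
print(f"{would_notify.sum()} of {len(activity)} students would be notified")
```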
  8. Commitment to Privacy & Openness • Analyze data records that are not only stripped of PII, but de-personalized (at both individual & institutional levels) • Share results and openly discuss analysis procedures to inform the broader educational community • Respect territorial jurisdictions and safe harbor provisions
  9. Major Findings in 2016 (and one from 2017)
  10. Relationship: Student Use of Learn vs. Grade
  11. Bb Study: Relationship Between Time in Learn & Grade • The distribution of time spent is highly skewed toward low access • Transforming the data (log transform) can produce normal curves for analysis • Of course, there is huge variation in quality within that time spent (of course materials, of student activity)
  12. Findings: Relationship Between Time in Learn & Grade • Question: what is the relationship between student use of Learn and their course grade? • Investigated at the student-course level (one student, one course) • 1.2M students, 34,519 courses, 788 institutions • Significant, but effect size < 1%
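A synthetic illustration of the pattern in the two slides above: right-skewed time data, a log transform producing a near-normal distribution, and a significant but tiny effect size. The lognormal parameters and grade model are assumptions chosen only to reproduce the shape of the finding:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
minutes = rng.lognormal(mean=3.0, sigma=1.2, size=100_000)  # skewed usage

print(f"skew raw: {stats.skew(minutes):.2f}")          # strongly right-skewed
print(f"skew log: {stats.skew(np.log(minutes)):.2f}")  # ~0, near-normal

# Assumed grade model: weak log-linear signal buried in noise.
grade = 70 + 1.0 * np.log(minutes) + rng.normal(0, 15, size=100_000)
r, p = stats.pearsonr(np.log(minutes), grade)
print(f"r = {r:.3f}, p = {p:.1e}, effect size r^2 = {r**2:.2%}")  # < 1%
```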
  13. Finding: Tool Use & Grade. Tool use and final grade do not have a linear relationship; there is a diminishing marginal effect of tool use on final grade. Interpretations: • Students absent from course activity are at greatest risk of low achievement. • The first time you read/see a PowerPoint presentation, you learn a lot, but the second time you read/see it, you learn less. • Getting from a 90% to a 95% requires more effort than getting from a 60% to a 65%. Log transformation shows a stronger trend.
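A short sketch of the diminishing-returns point: when grade rises steeply with the first units of tool use and then flattens, correlating grade with log(use) shows a stronger trend than raw use. The data here is synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)
use = rng.lognormal(2.5, 1.0, size=50_000)  # hypothetical tool-use minutes
# Assumed grade model with diminishing marginal effect of use.
grade = np.clip(55 + 8 * np.log1p(use) + rng.normal(0, 12, 50_000), 0, 100)

r_raw = np.corrcoef(use, grade)[0, 1]            # linear fit: weaker
r_log = np.corrcoef(np.log1p(use), grade)[0, 1]  # log-transformed: stronger
print(f"raw r = {r_raw:.2f}, log r = {r_log:.2f}")
```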
  14. But strong effect in some courses (n=7,648, 22%)
  15. What makes for a stronger or weaker relationship in some courses? Tools used? Course design? Quality of activity/effort?
  16. Learn Tool Use vs. Grade
  17. Investigation: Grade by Specific Tools Used. Question: what is the relationship between use of Learn and student grade, based on the tool used? Analysis Steps: 1. Filter data for courses with potentially meaningful use (>60 min average, enrollment >10 and <500, gradebook used) 2. Identify most frequently used tools 3. Separate tool use into no use & quartiles 4. Divide students into 3 groups by course grade: • High (80+) • Passing (60-79) • Low/Failing (0-59)
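A pandas sketch of these analysis steps. The DataFrame and its column names (avg_minutes, enrollment, gradebook_used, tool_minutes, grade) are assumptions standing in for the real student-course dataset:

```python
import pandas as pd

def prepare(df: pd.DataFrame) -> pd.DataFrame:
    # Step 1: keep courses with potentially meaningful use
    out = df[(df["avg_minutes"] > 60)
             & df["enrollment"].between(11, 499)
             & df["gradebook_used"]].copy()
    # Step 3: separate tool use into "no use" plus quartiles among users
    used = out["tool_minutes"] > 0
    out["use_level"] = "no use"
    out.loc[used, "use_level"] = pd.qcut(
        out.loc[used, "tool_minutes"], q=4, labels=["Q1", "Q2", "Q3", "Q4"])
    # Step 4: three grade bands matching the slide's cut points
    out["grade_band"] = pd.cut(
        out["grade"], bins=[0, 60, 80, 101], right=False,
        labels=["Low/Failing", "Passing", "High"])
    return out
```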
  18. Finding: MyGrades. At every level, the probability of a higher grade increases with increased use. Causal? Probably not. Good indicator? Absolutely.
  19. Finding: Course contents. More is not always better. Large jump from none to some; then no relationship.
  20. Finding: Assessments/Assignments. Students above the mean have a lower likelihood of achieving a high grade than students below the mean.
  21. Implications • Move beyond LMS use as a proxy for effort (where more is always better), and get at finer-grained learning behaviors that are more useful (e.g. students who are struggling to understand material, students who are not prepared). • Major missing elements from the research: fine-grained understanding of activity over time (e.g. cramming vs. consistent hard work); quality of course materials and course design.
  22. Patterns in Course Design
  23. Research Questions: 1. Are there systematic ways that instructors use LMS tools in their courses that span instructors and institutions? 2. What recommendations can be drawn for faculty, instructional designers, and other academic technology leaders seeking to increase the impact of LMS use at their institution? Methods: 1. Use the same filtered sample of student-course data 2. Calculate relative student time per tool (as % of total course time), for comparison between courses 3. Cluster by patterns in the balance of time spent in each tool (unsupervised machine learning; k-means cluster analysis) 4. Add data as relevant to patterns about enrollment, total time, etc. 5. Make up cool names for each cluster and interpret meaning
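A minimal sketch of the clustering method: represent each course by the share of student time spent in each tool, then group the patterns with k-means. The synthetic data, the four hypothetical tool columns, and the choice of k are all assumptions:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
# rows = courses; columns = total student minutes in content, forums,
# assessments, and gradebook (a hypothetical tool set)
tool_minutes = rng.lognormal(3.0, 1.0, size=(1_000, 4))

# Step 2: relative time per tool, so courses of different sizes are comparable
proportions = tool_minutes / tool_minutes.sum(axis=1, keepdims=True)

# Step 3: unsupervised k-means over the time-balance patterns
km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(proportions)
print(np.bincount(km.labels_))       # course count per cluster
print(km.cluster_centers_.round(2))  # each cluster's typical tool mix
```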
  24. Distribution of Courses by Type
  25. Finding: Discussions with low/high average use. Compare courses with low forum use to courses with forum use >1 hour per student on average.
  26. Summary & Future Directions for DS Research. Summary: • Tremendous variation in use of Learn; most use skewed toward low/very low use. • The importance of time spent in Learn for learning is also tremendously varied (“necessary” and “effective” use of Learn) • Critical to account for this variation to understand the potential importance of Learn activity. Future Directions: • Analyze quality of activity in greater depth (e.g. content of assignments, words in forum posts) to get insights into quality of interactions • Conduct time-series analysis (quantitative methods; design also needed); when someone accesses is more important than if they do. • Create proxies/derived values for behavior (above average, at average, etc.) by tool
  27. 3. Blackboard Analytics Portfolio
  28. Blackboard Analytics – Product Naming: “Blackboard Analytics” refers both to the data warehouse products and to the suite of analytics products.
  29. Blackboard Analytics Product Portfolio. Blackboard Intelligence • Analytics for Learn – LMS data • Student Management – SIS data • Finance, HR, Advancement – ERP data. Blackboard Predict • Predictive analytics and early alerts for retention • Provides data for faculty and advisors about at-risk students • Formerly Blue Canary. X-Ray Learning Analytics • Classroom engagement data for faculty • Activity aggregated into 30+ visualizations • Currently available for Moodlerooms & self-hosted Moodle only. (Each product offers past, current, and future views.)
  30. Blackboard Analytics Solution Portfolio: X-Ray Learning Analytics | Blackboard Analytics | Blackboard Predict | Blackboard Intelligence (Analytics for Learn, Student Management, Finance, HR, Advancement)
  31. Blackboard Analytics – Our Approach & Philosophy. Products that provide insight into the teaching and learning process. Our philosophy: data complements human decision-making. Core competency: learning software and academic data. A team of experts in the analytics field.
  32. Discussion. Thank you! John Whitmer, Ed.D. | john.whitmer@blackboard.com | @johncwhitmer | www.johnwhitmer.info/research
