An adaptive tutoring system for physics using reflective dialogue
Irene-Angelica Chounta1, Bruce M. McLaren1
1Human Computer Interaction Institute, Carnegie Mellon University
Patricia Albacete2, Pamela Jordan2, Sandra Katz2
2Learning Research and Development Center, University of Pittsburgh
https://sites.google.com/site/rimacsite/
Work-in-progress
Rimac: Proposed Approach for Adapting Dialogues
• Rimac is a web-based natural-language tutoring system that engages students in
conceptual discussions after they solve quantitative physics problems;
• Rimac supports student learning of physics concepts (e.g., Albacete et al.,
2015)
[Figure: the Rimac web interface, showing the problem statement and learning material alongside the tutorial dialogue]
• Mixed-design student model combining a regression model (Additive Factors
Model, AFM [Cen et al., 2008]) with a rule-based approach.
• The model is invoked during critical steps of the dialogue (as defined by
domain experts) to assess students' understanding of knowledge components.
• The system relies on the student model's predictions to decide whether a step of
the line of reasoning can be skipped and, if not, how to address that step (e.g.,
provide a prompt, hint, or explanation).
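The mixed design above can be sketched as follows: AFM is a logistic regression over knowledge components (KCs), and a rule-based layer maps its probability estimate to the three labels used in the example. This is a minimal illustration only; the parameter values, threshold values, and KC names are assumptions, not Rimac's actual fitted model.

```python
import math

def afm_probability(theta, kcs, beta, gamma, opportunities):
    """Additive Factors Model estimate of P(correct) for a step that
    exercises the knowledge components in `kcs`:
    logit(p) = theta + sum_k (beta_k + gamma_k * T_k),
    where T_k counts the student's prior practice opportunities on KC k."""
    logit = theta
    for kc in kcs:
        logit += beta[kc] + gamma[kc] * opportunities[kc]
    return 1.0 / (1.0 + math.exp(-logit))

def classify_step(p, known=0.85, not_known=0.4):
    """Rule-based layer: map the regression estimate onto the three labels
    used in the example dialogue (the thresholds here are assumptions)."""
    if p >= known:
        return "Known"
    if p <= not_known:
        return "Not known"
    return "Undecided"

# Illustrative (made-up) parameters for two KCs.
beta = {"newtons-2nd-law": 0.2, "net-force": -0.8}     # KC difficulty
gamma = {"newtons-2nd-law": 0.3, "net-force": 0.1}     # KC learning rate
opportunities = {"newtons-2nd-law": 4, "net-force": 1}  # prior practice

p = afm_probability(theta=0.5, kcs=["newtons-2nd-law"],
                    beta=beta, gamma=gamma, opportunities=opportunities)
print(classify_step(p))  # logit = 0.5 + 0.2 + 0.3*4 = 1.9 -> "Known"
```

A per-student proficiency `theta` and per-KC slopes let the same step be skipped for one student but explained to another.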
Example
- Steps marked green are predicted as Known
- Steps marked orange cannot be confidently predicted (that is, they are
classified as Undecided)
- Steps marked red are predicted as Not known
Strategy for building the Example Dialogue
- State the reflection question (RQ)
- Hint on steps (8), (5), (4), (7) [Undecided]
- Provide explicit information on steps (2), (3) [Not known]
- Skip steps (11), (10), (6) [Known]
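The strategy above is a per-step mapping from predicted label to tutoring action; it can be sketched as below. The step numbers and labels come from the example, while the function and action names are assumptions for illustration.

```python
# Map each predicted label to the dialogue action described in the strategy.
ACTIONS = {
    "Known": "skip",          # skip the step entirely
    "Undecided": "hint",      # prompt or hint on the step
    "Not known": "explain",   # provide explicit information
}

def plan_dialogue(step_labels):
    """Given {step number: predicted label}, return the tutoring action
    for each step of the line of reasoning, in step order."""
    return [(step, ACTIONS[label]) for step, label in sorted(step_labels.items())]

# Labels from the example dialogue above.
labels = {8: "Undecided", 5: "Undecided", 4: "Undecided", 7: "Undecided",
          2: "Not known", 3: "Not known", 11: "Known", 10: "Known", 6: "Known"}
for step, action in plan_dialogue(labels):
    print(step, action)
```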
Goal
We want to understand:
• what characteristics of tutorial dialogues help increase the student's
understanding;
• how tutorial dialogues can be adapted to student characteristics (e.g. prior
knowledge, motivation);
• how to provide feedback to students in order to support their practice;
We study:
• factors that affect tutorial dialogues
We aim to:
• design a dialogue tutoring system for physics in which a student model is used to
guide students through adaptive lines of reasoning.
Motivation
• To a large extent, current tutorial dialogue systems lack the ability to gauge
students’ level of mastery over the curriculum.
• Human tutors do gauge the level of knowledge and understanding of their tutees
to some degree, although they are not very adept at diagnosing the causes of
student errors.
We propose integrating into tutorial dialogue a student model that evaluates the
student's understanding of curriculum elements, and we argue that doing so can
address these differences between human and simulated tutors.
Albacete, Patricia, Pamela Jordan, and Sandra Katz. "Is a dialogue-based tutoring system that emulates helpful co-constructed relations during human tutoring effective?." International Conference on Artificial Intelligence in Education. Springer International Publishing, 2015.
Cen, Hao, Kenneth Koedinger, and Brian Junker. "Comparing two IRT models for conjunctive skills." Intelligent tutoring systems. Springer Berlin/Heidelberg, 2008.
Chi, Min, et al. "An evaluation of pedagogical tutorial tactics for a natural language tutoring system: A reinforcement learning approach." International Journal of Artificial Intelligence in Education 21.1-2 (2011): 83-113.
Chi, Michelene TH, Stephanie A. Siler, and Heisawn Jeong. "Can tutors monitor students' understanding accurately?." Cognition and instruction 22.3 (2004): 363-387.
Di Eugenio, Barbara, Michael Glass, and Michael J. Trolio. "The DIAG experiments: Natural language generation for intelligent tutoring systems." INLG02, The Third International Natural Language Generation Conference. 2002.
Jordan, Pamela, Patricia Albacete, and Sandra Katz. "When Is It Helpful to Restate Student Responses Within a Tutorial Dialogue System?." International Conference on Artificial Intelligence in Education. Springer International Publishing, 2015.
Katz, Sandra, and Patricia L. Albacete. "A tutoring system that simulates the highly interactive nature of human tutoring." Journal of Educational Psychology 105.4 (2013): 1126.
Currently: We are carrying out extensive studies to evaluate the student model and the proposed approach.
We are testing different tutoring strategies, such as:
• Students for whom most steps are known will move on to more challenging questions, receive no help, or even skip specific parts of the dialogue that the model predicts
they have mastered;
• When steps are undecided, students will receive meaningful information, scaffolding, and hints related to the curriculum;
• When most steps are not known, students will be guided to easier tasks or will receive explicit information and instruction.
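The three strategies above select a session-level policy from the distribution of per-step predictions. A minimal sketch follows; the 0.6 majority threshold and the strategy names are assumptions, not values reported by the authors.

```python
from collections import Counter

def select_strategy(step_labels, threshold=0.6):
    """Pick a session-level tutoring strategy from the distribution of
    per-step predictions (the majority threshold is an assumption)."""
    counts = Counter(step_labels.values())
    n = len(step_labels)
    if counts["Known"] / n >= threshold:
        return "advance"    # harder questions; skip mastered parts
    if counts["Not known"] / n >= threshold:
        return "remediate"  # easier tasks; explicit instruction
    return "scaffold"       # hints and scaffolding on unclear steps

print(select_strategy({1: "Known", 2: "Known", 3: "Undecided"}))
```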
To evaluate our approach, we will compare the learning gains of students who use the adaptive, student-model-driven version of the tutor with those of students who use a
control version that is not guided by the student model.
Sample Line of Reasoning
[Figure: the numbered steps of the line of reasoning referenced in the strategy above]