Contextualizing Explainability of Learning-Path Recommendations through Knowledge Graphs and Graph-based MDP
Introduction
Human learning is a complex, multi-dimensional process governed by a wide range of factors describing the learner, the learning content, and the learning environment. Even in settings where learners are subject to comparable conditions, such as classrooms, individual differences between learners influence how they respond to learning materials and activities. This influence becomes more dominant when learning takes place in less formal settings, such as learning on the job or in vocational education and training. In these situations, an effective learning process must consider the individual differences among learners, learning goals, learning settings, environmental conditions, and the involved stakeholders; that is, the differences in the context in which learning happens. In the field of personalized recommendation, context-aware recommender systems (CARS) offer a promising way to bridge this gap by tailoring learning experiences to specific contexts. However, existing CARS and context-representation approaches often fall short of comprehensively integrating complex contextual data into their reasoning. Comprehensive integration of learning context is not limited to algorithmic reasoning or to structural representation, such as in Afreen et al. and Ilkou et al.; it requires a hybrid approach in which a structural representation of the context factors and their connections is complemented by algorithms that reason over the knowledge structure. Moreover, complex CARS face challenges in providing transparent and explainable recommendations to learners, especially those without a technical background, which hinders the effectiveness and acceptance of CARS among learners.
Method, Evaluation and Result
We address these challenges by proposing an approach to CARS design that enhances both context-based learning personalization and the contextualization of explainability. Structurally, we employ knowledge graphs (KGs) to represent contextual learning factors and their interdependencies, capturing the dynamic interplay between contextual variables in complex learning settings. Algorithmically, we propose an approach based on a Markov decision process (MDP) over the KG, with a custom context-aware reward function. This reward function enables the recommender system (RS) to generate contextualized learning paths across different learning settings, where various combinations of context factors have to be considered. It also allows generating contextual explanations of the recommended path, since the learning context factors are integrated into the MDP-based reasoning. To generate user-centric explanations that are accessible and meaningful across diverse educational contexts, our proposed method (CARExKG) further incorporates a dedicated explainability framework. Utilizing large language models (LLMs) and expert input from pedagogy specialists, the explanations are designed to foster trust, reduce resistance to recommendations, and encourage collaborative human-AI decision-making in learning processes. In this sense, CARExKG, as a method for generating context-aware recommendations and explainability through KGs, is designed not only to generate personalized learning paths, but also to provide clear, pedagogically sound explanations for these recommendations. The goal is to support the learner's understanding of the reasoning behind the recommended learning path and the connections between its components, thereby enhancing the learner's decision-making ability and ownership of their educational journey.
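The idea of reasoning over a KG with a context-aware reward can be sketched as follows. This is an illustrative toy, not the authors' implementation: the learning units, context annotations, weights, and the greedy walk (a stand-in for full MDP planning) are all hypothetical.

```python
# Toy knowledge graph: each learning unit points to the units it unlocks.
# All unit names, context factors, and weights are made up for illustration.
KG = {
    "basics":        ["hygiene", "communication"],
    "hygiene":       ["wound_care"],
    "communication": ["dementia_care"],
    "wound_care":    ["goal"],
    "dementia_care": ["goal"],
    "goal":          [],
}

# Hypothetical context annotations per unit (difficulty level, modality).
CONTEXT = {
    "hygiene":       {"difficulty": 1, "modality": "video"},
    "communication": {"difficulty": 2, "modality": "text"},
    "wound_care":    {"difficulty": 3, "modality": "video"},
    "dementia_care": {"difficulty": 2, "modality": "text"},
    "goal":          {"difficulty": 0, "modality": "text"},
}

def reward(unit, learner):
    """Context-aware reward: a base relevance score minus penalties for
    mismatch between the unit's context factors and the learner's context."""
    ctx = CONTEXT.get(unit, {})
    r = 1.0
    r -= 0.3 * abs(ctx.get("difficulty", 0) - learner["level"])
    if ctx.get("modality") != learner["preferred_modality"]:
        r -= 0.2
    return r

def recommend_path(start, goal, learner):
    """Greedy walk over the KG, always taking the successor with the
    highest context-aware reward (a simplification of MDP planning)."""
    path, node = [start], start
    while node != goal and KG[node]:
        node = max(KG[node], key=lambda s: reward(s, learner))
        path.append(node)
    return path

learner = {"level": 1, "preferred_modality": "video"}
print(recommend_path("basics", "goal", learner))
```

Because the reward is computed from explicit context factors, each step of the resulting path can be traced back to the factors that favored it, which is what makes contextual explanations possible.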
To evaluate the proposed system, a set of experiments was designed to measure the effectiveness of the system's three main pillars: the KG, the RS algorithm, and the explainability framework. Quantitative and qualitative evaluation metrics were used to evaluate both the individual parts and the CARExKG-based system as a whole. The overall system was evaluated through a user study with nursing staff in two elderly-care homes. A total of 24 participants were presented with a complex learning scenario constructed to mimic the multi-dimensional challenges they face in their profession. Within an A/B test setting, a learning path was generated by the recommendation algorithm, which the participants explored for three hours. We measured the learners' irritation with the recommended path to reflect the change in their acceptance when the explanation was presented. Our results show that the treatment group accepted the recommendations 37.5% more than the control group. Participants also reported that they were able to "skip" parts of the recommended path because the explanation allowed them to understand that they had already learned similar content, which they had not included in their profiles. This highlights the potential for participants to take informed actions, such as modifying their profiles, to adjust the recommendation and adopt it fully or partially. Our results also show that combining expert and LLM-generated textual explanations is necessary to provide explanations on the contextual and reflection levels, because LLMs alone were unable to reach reflection-level explanations. This underscores the role of human-AI collaboration in the task of generating the explanation itself, not only in deciding about the recommendation.
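The A/B comparison of acceptance boils down to contrasting per-group acceptance rates. A minimal sketch, with entirely invented per-participant values (they do not reproduce the study's reported figure):

```python
# Hypothetical A/B data: 1 = participant accepted the recommended path.
# Control saw the path without explanations; treatment saw it with them.
# These values are made up for illustration only.
control   = [1, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 1]
treatment = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1, 0, 1]

def acceptance_rate(group):
    """Fraction of participants in a group who accepted the path."""
    return sum(group) / len(group)

diff = acceptance_rate(treatment) - acceptance_rate(control)
print(f"treatment exceeds control by {diff:.1%}")
```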
Presentation "Contextualizing Explainability of Learning-Path Recommendations through Knowledge Graphs and Graph-based MDP" held at the 3rd TRR 318 Conference: Contextualizing Explanations, 18 June 2025, Bielefeld, Germany.