Enhancing Recommender Systems Explainability with Knowledge Graphs
Thesis Type | Master |
Thesis Status | Open |
Number of Students | 1 |
Thesis Supervisor | |
Contact | |
Research Field | |
Recommender systems have become part of our daily lives, and because their recommendations influence many of the decisions we make, explainability has emerged as a critical factor in fostering user trust and making those decisions transparent. This thesis explores the use of knowledge graphs to generate interpretable explanations for recommendations. By integrating knowledge graphs with state-of-the-art recommendation algorithms, the thesis investigates how explanations can be derived from the relational structure connecting users, items, and their attributes. The quality, clarity, and relevance of these explanations will be evaluated through empirical studies and user feedback.
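One common way to connect a recommendation back to a user's history is to search for short paths in the knowledge graph that link items the user has already interacted with to the recommended item, and to verbalise those paths as an explanation. The sketch below illustrates this idea on a toy graph; the entities, relations, and the `explain` helper are hypothetical and only meant to make the approach concrete, not a prescribed implementation for the thesis.

```python
# A minimal sketch of path-based explanations over a knowledge graph.
# The graph, entities, and relations below are illustrative only.
import networkx as nx

# Build a tiny knowledge graph linking users, items, and shared attributes.
kg = nx.Graph()
kg.add_edge("user:alice", "movie:Inception", relation="watched")
kg.add_edge("movie:Inception", "person:Nolan", relation="directed_by")
kg.add_edge("movie:Tenet", "person:Nolan", relation="directed_by")
kg.add_edge("movie:Tenet", "genre:SciFi", relation="has_genre")

def explain(user: str, recommended_item: str) -> str:
    """Verbalise the shortest path between a user and a recommended item."""
    path = nx.shortest_path(kg, source=user, target=recommended_item)
    steps = []
    for a, b in zip(path, path[1:]):
        relation = kg.edges[a, b]["relation"]
        steps.append(f"{a} --{relation}--> {b}")
    return "\n".join(steps)

# Example: why was "Tenet" recommended to Alice?
print(explain("user:alice", "movie:Tenet"))
# user:alice --watched--> movie:Inception
# movie:Inception --directed_by--> person:Nolan
# person:Nolan --directed_by--> movie:Tenet
```

In a real system, candidate paths would be extracted from a large-scale graph and ranked or verbalised by the recommendation model itself; evaluating how understandable and convincing such path-based explanations are to users is part of the empirical study described above.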