CauSeR
2021, Proceedings of the 30th ACM International Conference on Information & Knowledge Management
Recommender Systems (RS) tend to recommend more popular items instead of relevant long-tail items. Mitigating such popularity bias is crucial to ensure that less popular but relevant items are part of the recommendation list shown to the user. In this work, we study the phenomenon of popularity bias in session-based RS (SRS) built with deep learning (DL) models. We observe that DL models trained on historical user-item interactions in session logs (which have long-tailed item-click distributions) tend to amplify popularity bias. To understand the source of this amplification, we consider potential sources of bias at two distinct stages of the modeling process: (i) the data-generation stage, where user-item interactions are captured as session logs, and (ii) the DL model training stage. We highlight that the popularity of an item has a causal effect on (i) user-item interactions, via conformity bias, and (ii) item rankings produced by DL models, via a training process biased by class (target-item) imbalance. While most existing approaches in the literature address only one of these effects, we consider a comprehensive causal inference framework that identifies and mitigates the effects at both stages. Through extensive empirical evaluation on simulated and real-world datasets, we show that our approach improves upon several strong baselines from the literature on popularity bias and long-tailed classification. Ablation studies show the advantage of our comprehensive causal analysis in identifying and handling bias at both the data-generation and training stages.

CCS CONCEPTS
• Information systems → Recommender systems.
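The two causal effects identified in the abstract suggest two distinct intervention points. The following is a minimal, hedged sketch of what such two-stage debiasing could look like for a generic session-based recommender that outputs per-item logits; it is not the authors' implementation, and the function names, the `smoothing` and `alpha` parameters, and the choice of PyTorch are all illustrative assumptions: (i) reweight the training loss inversely to item popularity to counter class (target-item) imbalance, and (ii) subtract an estimate of the direct popularity (conformity) effect from the ranking scores at inference.

```python
# Illustrative sketch only (not the CauSeR code): two interventions matching
# the two bias sources described in the abstract, for a generic session-based
# recommender that produces per-item logits.
import torch
import torch.nn.functional as F

def inverse_popularity_weights(item_counts, smoothing=0.5):
    """Per-item loss weights inversely proportional to click popularity.

    `item_counts` is a 1-D tensor of click counts per item; `smoothing`
    tempers the correction (both names are illustrative assumptions).
    """
    weights = 1.0 / (item_counts.float() + 1.0) ** smoothing
    return weights / weights.mean()  # normalize so the loss scale stays stable

def debiased_training_loss(logits, targets, class_weights):
    """Training-stage intervention: reweight cross-entropy so long-tail
    target items are not drowned out by popular ones (class imbalance)."""
    return F.cross_entropy(logits, targets, weight=class_weights)

def counterfactual_scores(logits, item_counts, alpha=0.1):
    """Inference-stage intervention: remove an estimated direct popularity
    (conformity) effect from the scores; `alpha` sets the correction strength."""
    log_pop = torch.log(item_counts.float() + 1.0)
    return logits - alpha * log_pop

# Toy usage: 5 items with a long-tailed click distribution.
item_counts = torch.tensor([1000, 500, 50, 10, 2])
logits = torch.randn(4, 5)            # batch of 4 sessions, 5 candidate items
targets = torch.tensor([2, 3, 0, 4])  # next-clicked item per session

w = inverse_popularity_weights(item_counts)
loss = debiased_training_loss(logits, targets, w)
ranking = counterfactual_scores(logits, item_counts).argsort(dim=1, descending=True)
```

In practice the strength of each correction (here `smoothing` and `alpha`) would be tuned on a validation set; the paper's framework derives its adjustments from an explicit causal graph rather than from heuristics like these.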