Abstract
Recommendation performance usually exhibits a long-tail distribution over users: a small portion of head users enjoy much more accurate recommendation services than the rest. We reveal two sources of this performance heterogeneity problem: the uneven distribution of historical interactions (a natural source) and the biased training of recommender models (a model source). Since addressing this problem should not sacrifice the overall performance, a wise choice is to eliminate the model bias while maintaining the natural heterogeneity. The key to debiased training lies in eliminating the effect of confounders that influence both the user's historical behaviors and the next behavior. Emerging causal recommendation methods achieve this by modeling the causal effect between user behaviors; however, they potentially neglect unobserved confounders (e.g., friend suggestions) that are hard to measure in practice. To address unobserved confounders, we resort to the front-door adjustment (FDA) in causal theory and propose a causal multi-teacher distillation framework (CausalD). FDA requires proper mediators to estimate the causal effects of historical behaviors on the next behavior. To achieve this, we equip CausalD with multiple heterogeneous recommendation models to model the mediator distribution. The causal effect estimated by FDA is then the expectation of the recommendation prediction over the mediator distribution and the prior distribution of historical behaviors, which is technically achieved by a multi-teacher ensemble. For efficient inference, CausalD further distills the multiple teachers into one student model that directly infers the causal effect for making recommendations. We instantiate CausalD on two representative models, DeepFM and DIN, and conduct extensive experiments on three real-world datasets, which validate the superiority of CausalD over state-of-the-art methods. Through in-depth analysis, we find that CausalD largely improves the performance of tail users, reduces the performance heterogeneity, and enhances the overall performance.
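For context, the front-door adjustment referenced in the abstract takes its standard form from causal inference; the notation below, with $H$ for historical behaviors, $M$ for the mediator, and $Y$ for the next behavior, is chosen here for illustration and is not taken from the paper itself:

$$
P(Y \mid do(H=h)) = \sum_{m} P(m \mid h) \sum_{h'} P(Y \mid h', m)\, P(h')
$$

That is, the causal effect is the expectation of the prediction $P(Y \mid h', m)$ over the mediator distribution $P(m \mid h)$ and the prior over histories $P(h')$, matching the abstract's description of the multi-teacher ensemble. The sketch below illustrates how such an ensemble estimate might be distilled into a single student; the module names, toy dimensions, and training details are assumptions for illustration, not the paper's implementation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy sketch (not the paper's code): each heterogeneous teacher maps a
# user-history vector to a mediator representation, then scores items
# given the pair (alternative history H', mediator M).
class Teacher(nn.Module):
    def __init__(self, hist_dim=32, med_dim=16, n_items=100):
        super().__init__()
        self.encode = nn.Linear(hist_dim, med_dim)           # mediator M = f(H)
        self.score = nn.Linear(hist_dim + med_dim, n_items)  # P(Y | H', M)

    def forward(self, hist, hist_prime):
        m = torch.tanh(self.encode(hist))
        return self.score(torch.cat([hist_prime, m], dim=-1))

def fda_ensemble(teachers, hist, hist_prior):
    """Approximate P(Y | do(H)): average teacher predictions over each
    teacher's mediator and over samples H' drawn from the history prior."""
    preds = [F.softmax(t(hist, h_prime), dim=-1)
             for t in teachers for h_prime in hist_prior]
    return torch.stack(preds).mean(dim=0)

# Distillation: the student learns to reproduce the ensemble's causal
# estimate from the raw history alone, giving single-pass inference.
teachers = [Teacher() for _ in range(3)]
student = nn.Linear(32, 100)
opt = torch.optim.Adam(student.parameters(), lr=1e-3)

hist = torch.randn(8, 32)                            # batch of user histories
hist_prior = [torch.randn(8, 32) for _ in range(4)]  # samples from P(H)
target = fda_ensemble(teachers, hist, hist_prior).detach()
loss = F.kl_div(F.log_softmax(student(hist), dim=-1), target,
                reduction="batchmean")
opt.zero_grad(); loss.backward(); opt.step()
```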
| Field | Value |
|---|---|
| Original language | English |
| Pages (from-to) | 459-474 |
| Number of pages | 16 |
| Journal | IEEE Transactions on Knowledge and Data Engineering |
| Volume | 36 |
| Issue number | 2 |
| DOIs | |
| Publication status | Published - Feb 2024 |
| Externally published | Yes |
Keywords
- Causal distillation
- front-door adjustment
- performance heterogeneity
- recommender system
ASJC Scopus subject areas
- Information Systems
- Computer Science Applications
- Computational Theory and Mathematics